
Exploring the Influence of Focal Loss on Transformer Models for Imbalanced Maintenance Data in Industry 4.0

Abstract : Harnessing data from historical maintenance databases may be challenging, as they tend to rely on text data provided by operators. Thus, they often include acronyms, jargon, typos, and other irregularities that complicate the automated analysis of such reports. Furthermore, maintenance datasets may present highly imbalanced distributions: some situations happen more often than others, which hinders the effective application of classic Machine Learning (ML) models. Hence, this paper explores the use of a recent Deep Learning (DL) architecture called the Transformer, which has provided cutting-edge results in Natural Language Processing (NLP). To tackle the class imbalance, a loss function called Focal Loss (FL) is explored. Results suggest that when all the classes are equally important, the FL does not improve classification performance. However, if the objective is to detect the minority class, the FL achieves the best performance, albeit at the cost of degrading detection of the majority class.
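For context, the Focal Loss explored in the abstract down-weights the contribution of well-classified (typically majority-class) examples so that training focuses on hard, rare cases. A minimal sketch of the binary form, FL(p_t) = −α_t (1 − p_t)^γ log(p_t), is shown below; the default values γ = 2 and α = 0.25 are the ones commonly used in the original Focal Loss paper (Lin et al., 2017), not necessarily those used by the authors here:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for one example.

    p: predicted probability of the positive class (0 < p < 1)
    y: true label, 0 or 1
    gamma: focusing parameter; gamma=0 recovers alpha-weighted cross-entropy
    alpha: weight for the positive class
    """
    p_t = p if y == 1 else 1.0 - p          # probability assigned to the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    # The (1 - p_t)^gamma factor shrinks the loss of confident, correct predictions
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy, well-classified example contributes far less than a hard one,
# which is what lets minority-class errors dominate the gradient:
easy = focal_loss(0.95, 1)  # confident and correct -> near-zero loss
hard = focal_loss(0.30, 1)  # poorly classified -> much larger loss
```

In practice one would use a batched, framework-native implementation (e.g. as a drop-in replacement for cross-entropy in the Transformer's training loop); this scalar version only illustrates the weighting mechanism.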
Document type :
Conference papers

https://hal-uphf.archives-ouvertes.fr/hal-03407955
Contributor : Kathleen Torck
Submitted on : Thursday, October 28, 2021 - 5:38:12 PM
Last modification on : Friday, October 29, 2021 - 3:58:23 AM

Identifiers

  • HAL Id : hal-03407955, version 1

Citation

Juan Pablo Usuga Cadavid, Bernard Grabot, Samir Lamouri, Arnaud Fortin. Exploring the Influence of Focal Loss on Transformer Models for Imbalanced Maintenance Data in Industry 4.0. 17th IFAC Symposium on Information Control Problems in Manufacturing (INCOM 2021), Jun 2021, Budapest, Hungary. ⟨hal-03407955⟩
