Bridging the Gap: A Comprehensive Study on Named Entity Recognition in Electronic Domain using Hybrid Statistical and Deep Knowledge Transfer
DOI:
https://doi.org/10.32473/flairs.37.1.135582
Keywords:
deep learning, domain specialized, low domain, statistics, knowledge transfer, hybrid system
Abstract
Training deep neural network models for NLP applications with a small amount of annotated data does not usually achieve high performance. To address this issue, transfer learning, which consists of transferring knowledge from a domain with a large amount of annotated data to a specific domain that lacks annotated data, can be a solution. In this paper, we present a case study on named entity recognition for the electronic domain that relies on several approaches based on statistics, deep learning, and transfer learning. Our evaluations showed a significant improvement in overall performance, with the best results obtained using transfer learning, up to +15% compared to the other approaches.
As Transformer-based models have shown their effectiveness in many NLP tasks in recent years, we also compare the performance of our models to several Transformer-based models in this study.
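To make the transfer-learning setting described in the abstract concrete, the following is a minimal, hypothetical sketch of domain adaptation for NER: a Transformer encoder pretrained on general-domain text is fine-tuned on a small annotated corpus from the target (electronic) domain. The label set, the example sentence, and the model name bert-base-cased are illustrative assumptions and are not taken from the paper.

# Hypothetical sketch: adapt a general-domain pretrained Transformer encoder
# to NER in a low-resource target domain. Labels and data are invented for
# illustration; they are not the authors' dataset or system.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed target-domain entity types in BIO scheme.
labels = ["O", "B-COMPONENT", "I-COMPONENT", "B-DEVICE", "I-DEVICE"]
label2id = {l: i for i, l in enumerate(labels)}
id2label = {i: l for l, i in label2id.items()}

# Start from a general-domain pretrained encoder; the token-classification
# head is randomly initialized and learned on the target-domain data.
model_name = "bert-base-cased"  # assumption: any pretrained encoder could be used
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels), id2label=id2label, label2id=label2id
)

# Tiny invented example of an annotated target-domain sentence.
tokens = ["Replace", "the", "faulty", "capacitor", "on", "the", "motherboard", "."]
word_labels = ["O", "O", "O", "B-COMPONENT", "O", "O", "B-DEVICE", "O"]

# Tokenize pre-split words and align word-level labels to sub-word tokens;
# special tokens get the ignore index -100 so they do not contribute to the loss.
enc = tokenizer(tokens, is_split_into_words=True, return_tensors="pt")
aligned = [
    -100 if wid is None else label2id[word_labels[wid]]
    for wid in enc.word_ids(batch_index=0)
]
enc["labels"] = torch.tensor([aligned])

# One fine-tuning step with a small learning rate, so the pretrained
# general-domain representations are adapted rather than overwritten.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**enc).loss
loss.backward()
optimizer.step()
print(f"training loss on the toy example: {loss.item():.4f}")

In practice such a sketch would loop over the full (small) annotated target-domain corpus for several epochs; the small learning rate is the usual choice so that the transferred knowledge is preserved while the new entity types are learned.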
License
Copyright (c) 2024 Fatiha Sadat, Ghaith Dekhili, Tan Ngoc Le
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.