Bridging the Gap: A Comprehensive Study on Named Entity Recognition in Electronic Domain using Hybrid Statistical and Deep Knowledge Transfer
DOI: https://doi.org/10.32473/flairs.37.1.135582

Keywords: deep learning, domain specialized, low domain, statistics, knowledge transfer, hybrid system

Abstract
Training deep neural network models for NLP applications with a small amount of annotated data does not usually achieve high performance. To address this issue, transfer learning, which consists of transferring knowledge from a domain with a large amount of annotated data to a specific domain that lacks annotated data, can be a solution. In this paper, we present a case study on named entity recognition for the electronic domain that relies on several approaches based on statistics, deep learning, and transfer learning. Our
evaluations showed a significant improvement in overall performance, with the best results obtained using transfer learning, up to +15% compared to the other approaches.
As Transformer-based models have shown their effectiveness on many NLP tasks in recent years, we also compare the performance of our models to several Transformer-based models.
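As a general illustration of the transfer-learning setup described in the abstract (a minimal sketch, not the authors' pipeline), the snippet below fine-tunes a pretrained Transformer for token classification (NER) on a single toy sentence. The model name, tag set, and example data are placeholders standing in for a small annotated electronic-domain corpus; it assumes the Hugging Face `transformers` library and PyTorch.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tags = ["O", "B-COMP", "I-COMP"]                      # hypothetical domain tag set
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(tags))          # pretrained weights carry the transferred knowledge

# One toy annotated sentence standing in for the small target-domain corpus.
words  = ["Replace", "the", "voltage", "regulator"]
labels = [0, 0, 1, 2]                                 # O O B-COMP I-COMP

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level labels to sub-word tokens; special tokens get -100 (ignored by the loss).
token_labels = [-100 if w is None else labels[w] for w in enc.word_ids()]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
out = model(**enc, labels=torch.tensor([token_labels]))
out.loss.backward()                                   # one fine-tuning step on the target domain
optimizer.step()
```

In practice the same loop would run over the full low-resource corpus for several epochs, which is the usual way a general-domain model is adapted to a specialized domain such as electronics.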
License
© Fatiha Sadat, Ghaith Dekhili, Tan Ngoc Le 2024
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International license.