Bridging the Gap: A Comprehensive Study on Named Entity Recognition in Electronic Domain using Hybrid Statistical and Deep Knowledge Transfer
DOI: https://doi.org/10.32473/flairs.37.1.135582

Keywords: deep learning, specialized domain, low-resource domain, statistics, knowledge transfer, hybrid system

Abstract
Training deep neural network models for NLP applications with a small amount of annotated data does not usually achieve high performance. To address this issue, transfer learning, which consists of transferring knowledge from a domain with a large amount of annotated data to a specific domain that lacks annotated data, can be a solution. In this paper, we present a case study on named entity recognition for the electronic domain that relies on several approaches based on statistics, deep learning, and transfer learning. Our evaluations showed a significant improvement in overall performance, with the best results obtained using transfer learning, up to +15% over the other approaches. As Transformer-based models have shown their effectiveness in many NLP tasks in recent years, we also compare the performance of our models to several Transformer-based models.
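The transfer-learning idea described in the abstract (pretrain on a data-rich source domain, then adapt to a low-resource target domain) can be illustrated with a minimal sketch. The model, data, and tag set below are all synthetic placeholders, not the authors' actual system: a tiny BiLSTM tagger is trained on abundant "source" data, then its embeddings and LSTM weights are reused for the target domain while only the classification head is re-initialized.

```python
# Minimal sketch of knowledge transfer for NER (hypothetical toy setup,
# synthetic data; not the paper's actual architecture or corpora).
import torch
import torch.nn as nn

class TinyTagger(nn.Module):
    """A small BiLSTM sequence tagger standing in for an NER model."""
    def __init__(self, vocab_size, num_tags, emb_dim=16, hid=32):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hid, num_tags)

    def forward(self, x):
        h, _ = self.lstm(self.emb(x))
        return self.out(h)

def train(model, X, Y, epochs=20, lr=0.05):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    loss = None
    for _ in range(epochs):
        opt.zero_grad()
        logits = model(X)
        loss = loss_fn(logits.reshape(-1, logits.size(-1)), Y.reshape(-1))
        loss.backward()
        opt.step()
    return float(loss.item())

torch.manual_seed(0)
# Source domain: many annotated sentences (synthetic here).
src_X = torch.randint(0, 50, (64, 8))
src_Y = torch.randint(0, 5, (64, 8))
source_model = TinyTagger(vocab_size=50, num_tags=5)
train(source_model, src_X, src_Y)

# Target domain: only a few annotated sentences. Transfer the learned
# embeddings and LSTM; re-initialize only the tag classifier head.
target_model = TinyTagger(vocab_size=50, num_tags=5)
target_model.load_state_dict(source_model.state_dict())
target_model.out = nn.Linear(64, 5)  # fresh head (2 * hid = 64 inputs)

tgt_X = torch.randint(0, 50, (8, 8))
tgt_Y = torch.randint(0, 5, (8, 8))
final_loss = train(target_model, tgt_X, tgt_Y)
```

In practice the same pattern applies to Transformer-based models: pretrained encoder layers are kept and only a task- or domain-specific head is retrained on the small annotated target set.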
License
Copyright (c) 2024 Fatiha Sadat, Ghaith Dekhili, Tan Ngoc Le
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.