Bridging the Gap: A Comprehensive Study on Named Entity Recognition in Electronic Domain using Hybrid Statistical and Deep Knowledge Transfer
DOI: https://doi.org/10.32473/flairs.37.1.135582

Keywords: deep learning, domain specialized, low domain, statistics, knowledge transfer, hybrid system

Abstract
Training deep neural network models for NLP applications with a small amount of annotated data does not usually achieve high performance. To address this issue, transfer learning, which consists of transferring knowledge from a domain with a large amount of annotated data to a specific domain that lacks annotated data, can be a solution. In this paper, we present a case study on named entity recognition for the electronic domain that relies on several approaches based on statistics, deep learning, and transfer learning. Our evaluations showed a significant improvement in overall performance, with the best results obtained using transfer learning, up to +15% compared to the other approaches. As Transformer-based models have shown their effectiveness in many NLP tasks in recent years, we also compare the performance of our models to several Transformer-based models in this study.
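The transfer-learning setting described in the abstract can be illustrated with a minimal sketch: a token-classification model pretrained on a large general-domain corpus is fine-tuned on a small set of in-domain NER annotations. The checkpoint name, tag set, and toy sentence below are illustrative assumptions, not the authors' actual data or pipeline.

```python
# Minimal transfer-learning sketch for low-resource NER (assumed names throughout).
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          TrainingArguments, Trainer,
                          DataCollatorForTokenClassification)

labels = ["O", "B-COMP", "I-COMP"]   # hypothetical electronic-domain tag set
model_name = "bert-base-cased"       # assumed general-domain checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name,
                                                        num_labels=len(labels))

# Tiny illustrative "annotated" sample standing in for the small in-domain corpus.
raw = Dataset.from_dict({
    "tokens": [["Replace", "the", "capacitor", "C12"]],
    "ner_tags": [[0, 0, 1, 2]],
})

def tokenize_and_align(example):
    enc = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    word_ids = enc.word_ids()
    # Label only the first sub-token of each word; ignore the rest (-100).
    enc["labels"] = [
        example["ner_tags"][w]
        if w is not None and (i == 0 or word_ids[i - 1] != w) else -100
        for i, w in enumerate(word_ids)
    ]
    return enc

train_ds = raw.map(tokenize_and_align, remove_columns=raw.column_names)

# Fine-tune the pretrained model on the small in-domain set.
args = TrainingArguments(output_dir="ner-electronic", num_train_epochs=3,
                         per_device_train_batch_size=8, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds,
        data_collator=DataCollatorForTokenClassification(tokenizer)).train()
```

In practice, the small annotated in-domain corpus would replace the toy sample, and evaluation on a held-out in-domain test set would measure the gain over models trained from scratch on the same data.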