Evaluating Graph Attention Networks as an Alternative to Transformers for ABSA Task in Low-Resource Languages
DOI:
https://doi.org/10.32473/flairs.37.1.135630
Keywords:
Graph Attention Networks, Aspect-Based Sentiment Analysis, Low-resource languages
Abstract
Opinions toward subjects and products hold immense relevance in business, guiding decision-making processes. However, with the increase in user-generated content, manual analysis is unrealistic. Techniques such as Sentiment Analysis are paramount to understanding and quantifying human emotion expressed in text data. Aspect-Based Sentiment Analysis aims to extract aspects from an opinionated text while identifying their underlying sentiment. Graph-based text representations have been shown to benefit this task, as they explicitly represent structural relationships in text. While studies have demonstrated the effectiveness of this representation for Aspect-Based Sentiment Analysis using Graph Neural Networks in English, there is only sparse evidence of improvement using these techniques for low-resource languages such as Portuguese. We develop a straightforward Graph Attention Network model for the Aspect-Based Sentiment Analysis task in Brazilian Portuguese. The proposed approach achieves a Balanced Accuracy score of 0.74, yielding competitive results and ranking third in the ABSAPT competition. Furthermore, by leveraging sparse graph connections, our model is less computationally demanding than a Transformer architecture during both training and inference.
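To illustrate the general idea of a Graph Attention Network applied to aspect-level sentiment classification, the sketch below shows a minimal two-layer GAT classifier over a sparse token graph using PyTorch Geometric. This is not the architecture described in the paper: the layer sizes, number of heads, aspect pooling strategy, and class count are assumptions chosen only to make the example self-contained.

```python
# Illustrative sketch only: a minimal GAT-based aspect sentiment classifier.
# Layer sizes, pooling, and the 3-class output are assumptions, not the
# paper's actual model.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv


class AspectGAT(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim=64, num_classes=3, heads=4):
        super().__init__()
        # Two attention layers operating on the sparse token graph
        # (e.g., a dependency-parse graph of the sentence).
        self.gat1 = GATConv(in_dim, hidden_dim, heads=heads)
        self.gat2 = GATConv(hidden_dim * heads, hidden_dim, heads=1)
        self.classifier = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, aspect_mask):
        # x: [num_tokens, in_dim] token embeddings for one sentence graph
        # edge_index: [2, num_edges] sparse connections between tokens
        # aspect_mask: boolean mask selecting the aspect-term tokens
        h = F.elu(self.gat1(x, edge_index))
        h = F.elu(self.gat2(h, edge_index))
        # Pool only the aspect tokens and predict their sentiment polarity.
        aspect_repr = h[aspect_mask].mean(dim=0)
        return self.classifier(aspect_repr)
```

Because attention is computed only over the edges in `edge_index` rather than over all token pairs, the cost per layer scales with the number of graph edges instead of quadratically with sentence length, which is the efficiency argument made in the abstract.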
License
Copyright (c) 2024 Gabriel Gomes, Alexandre T. Bender, Arthur Cerveira, Larissa A. Freitas, Ulisses B. Corrêa
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.