Utilizing RoBERTa and XLM-RoBERTa pre-trained models for structured sentiment analysis

Nikita Ananda Putri Masaling, Derwin Suhartono

Abstract


The surge in internet usage has amplified the trend of expressing sentiments across various platforms, particularly in e-commerce. Traditional sentiment analysis methods, such as aspect-based sentiment analysis (ABSA) and targeted sentiment analysis, fall short in identifying the relationships among the elements of opinion tuples. Moreover, conventional machine learning approaches often yield inadequate results. To address these limitations, this study introduces an approach that leverages the attention values of pre-trained RoBERTa and XLM-RoBERTa models for structured sentiment analysis. The method predicts all opinion tuples and their relationships jointly, providing a more comprehensive sentiment analysis. The proposed model demonstrates significant improvements over existing techniques, with the XLM-RoBERTa model achieving a notable sentiment graph F1 (SF1) score of 64.6% on the OpeNER EN dataset. Additionally, the RoBERTa model performed satisfactorily on the multi-perspective question answering (MPQA) and DSUnis datasets, with SF1 scores of 25.3% and 29.9%, respectively, surpassing baseline models. These results underscore the potential of the proposed approach to enhance sentiment analysis across diverse datasets, making it applicable to both academic research and practical use in various industries.
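The abstract does not include implementation details, but its core idea, extracting attention values from pre-trained RoBERTa or XLM-RoBERTa, can be sketched with the Hugging Face transformers library. The snippet below is a minimal illustration under that assumption, not the authors' code; the checkpoint names, example sentence, and head-averaging step are all illustrative choices.

import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative only: "xlm-roberta-base" stands in for whichever checkpoint
# the paper actually used; swap in "roberta-base" for the RoBERTa variant.
model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)
model.eval()

sentence = "The hotel staff were friendly but the room was noisy."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions holds one tensor per layer, each of shape
# (batch, num_heads, seq_len, seq_len). Averaging the heads of the last
# layer yields a token-to-token affinity matrix that a downstream
# structured predictor (e.g., a parser over holder/target/expression
# spans) could score opinion tuples against.
affinity = outputs.attentions[-1].mean(dim=1)[0]  # (seq_len, seq_len)
print(affinity.shape)

Span extraction and tuple linking, which the abstract describes as joint prediction, would sit on top of such an attention-derived matrix.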

Keywords


Opinion tuples; RoBERTa; Structured sentiment analysis; XLM-RoBERTa



DOI: http://doi.org/10.11591/ijict.v13i3.pp410-421



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The International Journal of Informatics and Communication Technology (IJ-ICT)
p-ISSN 2252-8776, e-ISSN 2722-2616
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
