Self-attention based sentiment analysis with effective embedding techniques
S. Sivakumar
Published in Inderscience Publishers
Volume: 65
Issue: 1
Pages: 65–77
Many existing sentiment analysis approaches suffer from sparse vector representations when handling large-scale data and do not take the semantics of words into account. Effective word embedding techniques improve sentiment analysis by overcoming both of these problems. The task is particularly challenging when a review is expressed in multiple sentences, since the sentiment must be determined from the sentence as a whole rather than from individual words. To address this, we propose a novel LSTM-based deep learning architecture that applies sentence embedding using the universal sentence encoder together with an attention layer. We evaluate the proposed approach through experiments on the IMDB data set. The experimental results show that the proposed method significantly outperforms other word-embedding-based approaches, with an improvement of 5%, achieving an F1 score of 89.12%. Copyright © 2021 Inderscience Enterprises Ltd.
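The abstract does not specify the exact form of the attention layer, so as a minimal sketch, the attention mechanism over LSTM hidden states can be illustrated as attention-weighted pooling: a score is computed for each timestep, normalized with a softmax, and used to form a weighted sum of the hidden states. The learned attention vector `w` and the dimensions below are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(h, w):
    # h: (timesteps, dim) hidden states, e.g. LSTM outputs over
    #    universal-sentence-encoder sentence embeddings
    # w: (dim,) learned attention vector (hypothetical parameterization)
    scores = h @ w                # one scalar score per timestep
    alpha = softmax(scores)       # attention weights, sum to 1
    context = alpha @ h           # (dim,) weighted sum of hidden states
    return context, alpha

# Toy example: 5 timesteps, hidden size 8
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
w = rng.normal(size=8)
ctx, alpha = attention_pool(h, w)
```

The resulting context vector `ctx` would then feed a dense classification head that outputs the sentiment label; the attention weights `alpha` indicate which sentences contribute most to the prediction.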
About the journal
Journal: International Journal of Computer Applications in Technology
Publisher: Inderscience Publishers