Sentiment Analysis with Contextual Embeddings and Self-Attention
International Symposium on Methodologies for Intelligent Systems (ISMIS), 2020
Abstract
In natural language, the intended meaning of a word or phrase is often implicit and depends on the context. In this work, we propose a simple yet effective method for sentiment analysis using contextual embeddings and a self-attention mechanism. The experimental results for three languages, including morphologically rich Polish and German, show that our model matches or even outperforms state-of-the-art models. In all cases, the superiority of models leveraging contextual embeddings is demonstrated. Finally, this work is intended as a step towards introducing a universal, multilingual sentiment classifier.
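To make the general idea concrete, the sketch below shows one common way to combine contextual embeddings with self-attention for sentiment classification: token embeddings (e.g., from a pretrained transformer) are pooled with an attention layer and passed to a linear classifier. This is a minimal illustration, not the authors' released code; the embedding source, dimensions, and number of sentiment classes are assumptions.

```python
# Minimal sketch: self-attention pooling over contextual token embeddings,
# followed by a linear sentiment classifier. Dimensions are illustrative.
import torch
import torch.nn as nn

class SelfAttentionSentimentClassifier(nn.Module):
    def __init__(self, embed_dim: int = 768, num_classes: int = 3):
        super().__init__()
        # Scores each token; softmax over the sequence gives attention weights.
        self.attention = nn.Linear(embed_dim, 1)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, token_embeddings: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, embed_dim) contextual embeddings
        # mask: (batch, seq_len), 1 for real tokens, 0 for padding
        scores = self.attention(token_embeddings).squeeze(-1)      # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)      # (batch, seq_len, 1)
        sentence_vector = (weights * token_embeddings).sum(dim=1)  # (batch, embed_dim)
        return self.classifier(sentence_vector)                    # (batch, num_classes) logits

# Usage with dummy embeddings standing in for a transformer's output:
model = SelfAttentionSentimentClassifier()
embeddings = torch.randn(2, 10, 768)
mask = torch.ones(2, 10)
logits = model(embeddings, mask)  # shape: (2, 3)
```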
