Overview of bibliographic record no.: 1119520
Pretraining and Fine-Tuning Strategies for Sentiment Analysis of Latvian Tweets
Pretraining and Fine-Tuning Strategies for Sentiment Analysis of Latvian Tweets // Human Language Technologies – The Baltic Perspective / Utka, A. et al. (eds.).
Kaunas: IOS Press, 2020, pp. 55-61. doi:10.3233/FAIA200602
CROSBI ID: 1119520
Title
Pretraining and Fine-Tuning Strategies for Sentiment Analysis of Latvian Tweets
Authors
Thakkar, Gaurish ; Pinnis, Marcis
Type, subtype and category of work
Book chapters, scientific
Book
Human Language Technologies – The Baltic Perspective
Editor(s)
Utka, A. et al.
Publisher
IOS Press
City
Kaunas
Year
2020
Page range
55-61
ISBN
978-1-64368-116-0
Keywords
Sentiment analysis ; word embeddings ; BERT ; Latvian
Abstract
In this paper, we present various pre-training strategies that aid in improving the accuracy of the sentiment classification task. We first pre-train language representation models using these strategies and then fine-tune them on the downstream task. Experimental results on a time-balanced tweet evaluation set show an improvement over the previous technique. We achieve 76% accuracy for sentiment analysis on Latvian tweets, which is a substantial improvement over previous work.
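The two-phase workflow the abstract describes (pre-train a representation model on unlabeled text, then fine-tune it with a classification head on labeled tweets) can be illustrated with a minimal, toy-scale PyTorch sketch. Everything here is an assumption for illustration: the random data, the next-token pretraining objective standing in for masked-LM pretraining, and all dimensions are invented, not the paper's actual BERT-based setup.

```python
# Toy sketch of pretrain-then-fine-tune (NOT the paper's actual models or data).
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM, CLASSES = 50, 16, 3  # toy vocabulary size, embedding dim, sentiment classes

# Phase 1: "pretrain" an encoder on unlabeled token sequences.
# A trivial next-token objective stands in for masked-LM pretraining.
encoder = nn.Embedding(VOCAB, DIM)
lm_head = nn.Linear(DIM, VOCAB)
opt = torch.optim.Adam(list(encoder.parameters()) + list(lm_head.parameters()), lr=1e-2)
unlabeled = torch.randint(0, VOCAB, (64, 8))  # fake unlabeled tweet token ids
for _ in range(20):
    opt.zero_grad()
    logits = lm_head(encoder(unlabeled[:, :-1]))          # predict each next token
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB), unlabeled[:, 1:].reshape(-1))
    loss.backward()
    opt.step()

# Phase 2: fine-tune the pretrained encoder with a fresh sentiment head
# on a small labeled set (the downstream task).
clf_head = nn.Linear(DIM, CLASSES)
opt = torch.optim.Adam(list(encoder.parameters()) + list(clf_head.parameters()), lr=1e-2)
tweets = torch.randint(0, VOCAB, (32, 8))                 # fake labeled tweets
labels = torch.randint(0, CLASSES, (32,))                 # fake sentiment labels
for _ in range(20):
    opt.zero_grad()
    pooled = encoder(tweets).mean(dim=1)                  # mean-pool token embeddings
    loss = nn.functional.cross_entropy(clf_head(pooled), labels)
    loss.backward()
    opt.step()

preds = clf_head(encoder(tweets).mean(dim=1)).argmax(dim=1)
accuracy = (preds == labels).float().mean().item()
```

The key design point the sketch mirrors is that the encoder weights learned in phase 1 are reused (and further updated) in phase 2, while only the task-specific classification head starts from scratch.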
Original language
English
Scientific fields
Information and communication sciences
Indexed in:
- Scopus