Overview of bibliographic record no. 1215801
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations // Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022) / Emerson, Guy ; Schluter, Natalie ; Stanovsky, Gabriel ; Kumar, Ritesh ; Palmer, Alexis ; Schneider, Nathan ; Singh, Siddharth ; Ratan, Shyam (eds.).
Seattle (WA): Association for Computational Linguistics (ACL), 2022, pp. 36-59 doi:10.18653/v1/2022.semeval-1.5 (lecture, international peer review, full paper (in extenso), scientific)
CROSBI ID: 1215801
Title
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations
Authors
Korenčić, Damir ; Grubišić, Ivan
Type, subtype and category of work
Papers in conference proceedings, full paper (in extenso), scientific
Source
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
/ Emerson, Guy ; Schluter, Natalie ; Stanovsky, Gabriel ; Kumar, Ritesh ; Palmer, Alexis ; Schneider, Nathan ; Singh, Siddharth ; Ratan, Shyam (eds.) - Seattle (WA) : Association for Computational Linguistics (ACL), 2022, 36-59
ISBN
978-1-955917-80-3
Conference
The 16th International Workshop on Semantic Evaluation (SemEval-2022)
Place and date
Seattle (WA), United States, 14-15 July 2022
Type of participation
Lecture
Type of peer review
International peer review
Keywords
deep learning ; natural language processing ; semantic representations ; codwoe ; definition modeling ; reverse dictionary
Abstract
What is the relation between a word and its description, or a word and its embedding? Both descriptions and embeddings are semantic representations of words. But what information about the original word remains in these representations? Or, more importantly, which information about a word do these two representations share? Definition Modeling and Reverse Dictionary are two opposite learning tasks that address these questions. The goal of the Definition Modeling task is to investigate the power of the information lying inside a word embedding to express the meaning of the word in a human-understandable way, as a dictionary definition. Conversely, the Reverse Dictionary task explores the ability to predict a word's embedding directly from its definition. In this paper, by tackling these two tasks, we explore the relationship between words and their semantic representations. We present our findings based on the descriptive, exploratory, and predictive data analysis conducted on the CODWOE dataset. We give a detailed overview of the systems that we designed for the Definition Modeling and Reverse Dictionary tasks, which achieved top scores in several subtasks of the SemEval-2022 CODWOE challenge. We hope that our experimental results concerning the predictive models, together with the data analyses we provide, will prove useful in future explorations of word representations and their relationships.
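To make the two task directions concrete, the sketch below shows a minimal reverse-dictionary setup: a small gloss encoder that maps a tokenized definition to a fixed-size word embedding and is trained with a cosine loss. This is an illustrative assumption, not the system described in the paper; the model architecture, hyperparameters, and the random toy data standing in for CODWOE glosses and gold embeddings are all hypothetical.

# A minimal reverse-dictionary sketch in PyTorch (illustrative assumption,
# not the authors' system): encode a tokenized definition with a BiLSTM and
# regress to the target word embedding using a cosine loss.
import torch
import torch.nn as nn

class GlossEncoder(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 256, out_dim: int = 256):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.LSTM(emb_dim, emb_dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * emb_dim, out_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer-encoded definition tokens
        x = self.tok_emb(token_ids)
        _, (h, _) = self.encoder(x)
        h = torch.cat([h[-2], h[-1]], dim=-1)  # concatenate final forward/backward states
        return self.proj(h)                    # predicted word embedding

# Toy usage with random data standing in for CODWOE glosses and gold embeddings.
model = GlossEncoder(vocab_size=1000)
token_ids = torch.randint(1, 1000, (8, 12))    # 8 definitions, 12 tokens each
target = torch.randn(8, 256)                   # gold word embeddings
pred = model(token_ids)
loss = 1.0 - nn.functional.cosine_similarity(pred, target).mean()
loss.backward()

The Definition Modeling direction would invert this mapping, conditioning a text decoder on the word embedding to generate the definition instead.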
Original language
English
Scientific areas
Computer science
WORK AFFILIATIONS
Institutions:
Institut "Ruđer Bošković", Zagreb