Data source: CROSBI

Staying True to Your Word: (How) Can Attention Become Explanation? (CROSBI ID 699680)

Conference paper in proceedings | original scientific paper | international peer review

Tutek, Martin; Šnajder, Jan. Staying True to Your Word: (How) Can Attention Become Explanation? // Proceedings of the 5th Workshop on Representation Learning for NLP, 2020, pp. 131-142. doi: 10.18653/v1/2020.repl4nlp-1.17

Authorship details

Tutek, Martin; Šnajder, Jan

English

Staying True to Your Word: (How) Can Attention Become Explanation?

The attention mechanism has quickly become ubiquitous in NLP. In addition to improving the performance of models, attention has been widely used as a glimpse into the inner workings of NLP models. The latter aspect has in recent years become a common topic of discussion, most notably in the recent work of Jain and Wallace and of Wiegreffe and Pinter. With the shortcomings of using attention weights as a tool of transparency revealed, the attention mechanism has been stuck in limbo, without concrete proof of when and whether it can be used as an explanation. In this paper, we provide an explanation as to why attention has seen rightful critique when used with recurrent networks in sequence classification tasks. We propose a remedy to these issues in the form of a word-level objective, and our findings lend credibility to attention providing faithful interpretations of recurrent models.
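For context on the setting the abstract describes, the following is a minimal sketch in PyTorch of a recurrent sequence classifier with additive attention, the kind of model whose attention weights are debated as explanations. This is illustrative only, not the authors' implementation; the class, parameter names, and hyperparameter values are hypothetical.

import torch
import torch.nn as nn

class AttentiveClassifier(nn.Module):
    # RNN-with-attention sequence classifier; `alpha` holds the per-token
    # attention weights that one would inspect as a putative explanation.
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=150, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.score = nn.Linear(hidden_dim, 1)          # additive attention scorer
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                      # token_ids: (batch, seq_len)
        h, _ = self.rnn(self.embed(token_ids))         # (batch, seq_len, hidden_dim)
        alpha = torch.softmax(self.score(torch.tanh(h)), dim=1)  # (batch, seq_len, 1)
        context = (alpha * h).sum(dim=1)               # attention-weighted summary
        return self.out(context), alpha.squeeze(-1)    # logits, weights to inspect

Reading `alpha` as a saliency map over input tokens is precisely the practice whose faithfulness the paper examines.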

Natural Language Processing; Interpretability; Explainable AI; Recurrent neural networks


Publication details

Pages: 131-142

Year: 2020

Status: published

DOI: 10.18653/v1/2020.repl4nlp-1.17

Host publication details

Proceedings of the 5th Workshop on Representation Learning for NLP

Conference details

Conference: Association for Computational Linguistics

Presentation type: poster

Dates: 05.07.2020-10.07.2020

Location: online

Related research fields

Information and communication sciences
