Bibliographic record overview: 992030

Iterative Recursive Attention Model for Interpretable Sequence Classification


Tutek, Martin; Šnajder, Jan
Iterative Recursive Attention Model for Interpretable Sequence Classification // Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP / Linzen, Tal ; Chrupała, Grzegorz ; Alishahi, Afra (eds.).
Brussels: Association for Computational Linguistics (ACL), 2018. pp. 249-257 (poster, international peer review, full paper (in extenso), scientific)


CROSBI ID: 992030

Title
Iterative Recursive Attention Model for Interpretable Sequence Classification

Authors
Tutek, Martin ; Šnajder, Jan

Type, subtype and category of work
Papers in conference proceedings, full paper (in extenso), scientific

Source
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP / Linzen, Tal ; Chrupała, Grzegorz ; Alishahi, Afra - Brussels : Association for Computational Linguistics (ACL), 2018, 249-257

Conference
EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP

Venue and date
Brussels, Belgium, 1 November 2018

Type of participation
Poster

Type of peer review
International peer review

Keywords
Natural language processing ; Deep learning ; interpretability

Abstract
Natural language processing has greatly benefited from the introduction of the attention mechanism. However, standard attention models are of limited interpretability for tasks that involve a series of inference steps. We describe an iterative recursive attention model, which constructs incremental representations of input data through reusing results of previously computed queries. We train our model on sentiment classification datasets and demonstrate its capacity to identify and combine different aspects of the input in an easily interpretable manner, while obtaining performance close to the state of the art.
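
The abstract describes an attention loop that reuses the result of the previous query when forming the next one. The sketch below shows one way such a loop can be written for sequence classification; it is a minimal illustration under stated assumptions, not the authors' implementation: the BiLSTM encoder, the GRU-cell query update, the mean-pooled initial query, and all dimensions are assumptions made here purely for illustration.

# Minimal sketch of an iterative attention classifier (illustrative only;
# the query-update rule and hyperparameters are assumptions, not the paper's).
import torch
import torch.nn as nn
import torch.nn.functional as F

class IterativeAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=150, n_steps=3, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.attn_scorer = nn.Linear(2 * hid_dim, 2 * hid_dim, bias=False)
        self.query_update = nn.GRUCell(2 * hid_dim, 2 * hid_dim)  # reuses the previous query
        self.classifier = nn.Linear(2 * hid_dim, n_classes)
        self.n_steps = n_steps

    def forward(self, token_ids):
        h, _ = self.encoder(self.embed(token_ids))            # (B, T, 2H) token representations
        query = h.mean(dim=1)                                 # initial query: mean of encodings
        attentions = []
        for _ in range(self.n_steps):
            scores = torch.bmm(self.attn_scorer(h), query.unsqueeze(2)).squeeze(2)  # (B, T)
            alpha = F.softmax(scores, dim=1)                  # attention weights for this step
            summary = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # attended summary, (B, 2H)
            query = self.query_update(summary, query)         # recursively reuse previous query
            attentions.append(alpha)
        return self.classifier(query), attentions             # logits + per-step weights

# Example usage: per-step attention distributions can be inspected for interpretability.
model = IterativeAttentionClassifier(vocab_size=10000)
logits, attn_per_step = model(torch.randint(0, 10000, (4, 20)))  # batch of 4 sequences, length 20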

Original language
English

Scientific fields
Computer science



RELATED TO THIS WORK


Projects:
KK.01.1.1.01.0009 - Advanced methods and technologies in data science and cooperative systems (EC)

Institutions:
Fakultet elektrotehnike i računarstva, Zagreb

Profiles:

Martin Tutek (author)

Jan Šnajder (author)

Links to the full text of the paper:

www.aclweb.org

Cite this publication:

Tutek, Martin; Šnajder, Jan
Iterative Recursive Attention Model for Interpretable Sequence Classification // Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP / Linzen, Tal ; Chrupała, Grzegorz ; Alishahi, Afra (eds.).
Brussels: Association for Computational Linguistics (ACL), 2018. pp. 249-257 (poster, international peer review, full paper (in extenso), scientific)
Tutek, M. & Šnajder, J. (2018) Iterative Recursive Attention Model for Interpretable Sequence Classification. In: Linzen, T., Chrupała, G. & Alishahi, A. (eds.) Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP. Brussels: Association for Computational Linguistics (ACL), pp. 249-257.
@inproceedings{tutek2018iterative, author = {Tutek, Martin and \v{S}najder, Jan}, title = {Iterative Recursive Attention Model for Interpretable Sequence Classification}, booktitle = {Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP}, editor = {Linzen, Tal and Chrupa{\l}a, Grzegorz and Alishahi, Afra}, year = {2018}, pages = {249--257}, publisher = {Association for Computational Linguistics (ACL)}, address = {Brussels, Belgium}, keywords = {Natural language processing, Deep learning, interpretability} }



