Bibliographic record no. 1266322

Extending the recurrent neural network model for improved compositional modelling of text sequences


Tutek, Martin
Extending the recurrent neural network model for improved compositional modelling of text sequences, 2022., doktorska disertacija, Fakultet Elektrotehnike i Računarstva, Zagreb


CROSBI ID: 1266322

Title
Extending the recurrent neural network model for improved compositional modelling of text sequences

Authors
Tutek, Martin

Type, subtype and category of work
Thesis, doctoral dissertation

Faculty
Fakultet Elektrotehnike i Računarstva (Faculty of Electrical Engineering and Computing)

Place
Zagreb

Date
22 July

Year
2022

Pages
93

Supervisor
Šnajder, Jan

Keywords
Word Representations; Multiprototype Representations; Semantic Composition; Interpretability; Recurrent Neural Networks; Natural Language Processing

Abstract
The thesis explores extensions to the recurrent neural network (RNN) model for natural language processing (NLP), aiming to improve its capacity for semantic composition, to investigate the possible benefits of multi-prototype word representations, and to improve its overall interpretability. While RNNs have gained a strong competitor in the form of the Transformer model, both approaches to processing natural language sequences have their own sets of issues. The thesis investigates methods of inducing sparsity in neural networks in order to learn shared sense representations, and also tackles the problem of semantic composition in recurrent networks, introducing a novel approach for building recursive representations of language that is better suited to its hierarchical phrasal structure. The original scientific contributions of the thesis are: 1. An empirical analysis of the convergence of recurrent neural network algorithms on text sequence modelling tasks with respect to different input word representations; 2. An algorithm for learning word representations, serving as input to text processing models, based on contextualized word representations; 3. An extension of the recurrent neural network model for processing text sequences with mechanisms for handling linguistic phenomena such as polysemy, semantic composition, and coreference.
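For orientation only: the baseline model the thesis extends is the standard recurrent network, which composes a word sequence into a single representation one token at a time. The following minimal Elman-style sketch illustrates that baseline; it is not the thesis's actual architecture, and all names in it are illustrative.

```python
import math

def rnn_encode(embeddings, W_x, W_h, b):
    """Compose a sequence of word vectors into one hidden state using the
    plain Elman recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b).
    Illustrative baseline only, not the model proposed in the thesis."""
    h = [0.0] * len(b)  # initial hidden state h_0 = 0
    for x in embeddings:
        h = [
            math.tanh(
                sum(W_x[i][j] * x[j] for j in range(len(x)))    # input term
                + sum(W_h[i][j] * h[j] for j in range(len(h)))  # recurrent term
                + b[i]                                          # bias
            )
            for i in range(len(b))
        ]
    return h

# Toy run: two 2-dimensional "word vectors", 2-dimensional hidden state.
h = rnn_encode(
    [[1.0, 0.0], [0.0, 1.0]],  # word embeddings x_1, x_2
    [[1.0, 0.0], [0.0, 1.0]],  # input weights W_x
    [[0.5, 0.0], [0.0, 0.5]],  # recurrent weights W_h
    [0.0, 0.0],                # bias b
)
```

Because every token is squeezed through the same fixed-size recurrence, this composition is strictly sequential, which is exactly the aspect the thesis's recursive, phrase-structure-aware extensions target.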

Original language
English

Scientific areas
Information and communication sciences



RELATED INFORMATION


Projects:
KK.01.1.1.01.009 - Advanced Methods and Technologies in Data Science and Cooperative Systems (DATACROSS) (Šmuc, Tomislav; Lončarić, Sven; Petrović, Ivan; Jokić, Andrej; Palunko, Ivana) (CroRIS)

Institutions:
Fakultet elektrotehnike i računarstva, Zagreb

Profiles:

Jan Šnajder (supervisor)

Martin Tutek (author)

Links to the full text:

repozitorij.fer.unizg.hr

Cite this publication:

Tutek, Martin
Extending the recurrent neural network model for improved compositional modelling of text sequences, 2022., doktorska disertacija, Fakultet Elektrotehnike i Računarstva, Zagreb
Tutek, M. (2022) 'Extending the recurrent neural network model for improved compositional modelling of text sequences', doktorska disertacija, Fakultet Elektrotehnike i Računarstva, Zagreb.
@phdthesis{phdthesis, author = {Tutek, Martin}, year = {2022}, pages = {93}, keywords = {Word Representations, Multiprototype Representations, Semantic Composition, Interpretability, Recurrent Neural Networks, Natural Language Processing}, title = {Extending the recurrent neural network model for improved compositional modelling of text sequences}, publisherplace = {Zagreb} }



