Reference: "Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network"

Wataru Hinoshita, Hiroaki Arie, Jun Tani, Hiroshi G. Okuno, and Tetsuya Ogata. "Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network." Neural Networks, Vol. 24, No. 4 (12 May 2011), pp. 311–320. doi:10.1016/j.neunet.2010.12.006
@article{hinoshita-et-al-2011,
    abstract = {We show that a Multiple Timescale Recurrent Neural Network ({MTRNN}) can acquire the capabilities to recognize, generate, and correct sentences by self-organizing in a way that mirrors the hierarchical structure of sentences: characters grouped into words, and words into sentences. The model can control which sentence to generate depending on its initial states (generation phase) and the initial states can be calculated from the target sentence (recognition phase). In an experiment, we trained our model over a set of unannotated sentences from an artificial language, represented as sequences of characters. Once trained, the model could recognize and generate grammatical sentences, even if they were not learned. Moreover, we found that our model could correct a few substitution errors in a sentence, and the correction performance was improved by adding the errors to the training sentences in each training iteration with a certain probability. An analysis of the neural activations in our model revealed that the {MTRNN} had self-organized, reflecting the hierarchical linguistic structure by taking advantage of the differences in timescale among its neurons: in particular, neurons that change the fastest represented "characters", those that change more slowly, "words", and those that change the slowest, "sentences".},
    author = {Hinoshita, Wataru and Arie, Hiroaki and Tani, Jun and Okuno, Hiroshi G. and Ogata, Tetsuya},
    day = {12},
    doi = {10.1016/j.neunet.2010.12.006},
    issn = {0893-6080},
    journal = {Neural Networks},
    keywords = {language, model, mtrnn, self-organization},
    month = may,
    number = {4},
    pages = {311--320},
    pmid = {21273043},
    title = {Emergence of hierarchical structure mirroring linguistic composition in a recurrent neural network},
    url = {http://dx.doi.org/10.1016/j.neunet.2010.12.006},
    volume = {24},
    year = {2011}
}

See the CiteULike entry for more info, PDF links, BibTeX, etc.

Hinoshita et al. propose a model of natural language acquisition based on a multiple-timescale recurrent neural network (MTRNN), in which different groups of neurons update their activations at different speeds.
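
The architectural idea is that each unit is a leaky integrator whose time constant sets how quickly its state can change, so small-time-constant units can track character-level detail while large-time-constant units carry slowly varying sentence-level context. Below is a minimal sketch of this style of continuous-time RNN update; the layer sizes, time constants, and random untrained weights are illustrative assumptions, not the trained model from the paper.

import numpy as np

# Minimal sketch of MTRNN-style leaky-integrator dynamics. All numbers
# here are illustrative assumptions, not the paper's configuration.

rng = np.random.default_rng(0)

n_fast, n_slow = 40, 10                     # fast vs. slow context units
n = n_fast + n_slow
tau = np.concatenate([
    np.full(n_fast, 2.0),                   # small tau: fast, "character-level" units
    np.full(n_slow, 70.0),                  # large tau: slow, "sentence-level" units
])

W = rng.normal(scale=0.1, size=(n, n))      # recurrent weights (untrained here)
u = np.zeros(n)                             # internal states

for t in range(100):
    y = np.tanh(u)                          # firing rates
    # Each unit moves toward its summed input at rate 1/tau, so
    # large-tau units integrate over long spans of the sequence
    # while small-tau units change almost step by step.
    u = (1.0 - 1.0 / tau) * u + (1.0 / tau) * (W @ y)

Per the paper's analysis, it is exactly this separation of timescales that lets the trained network self-organize a character/word/sentence hierarchy.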

Hinoshita et al. define self-organization as the emergence of a global, coherent structure in a system through local interactions among its elements, as opposed to through central control.

According to Hinoshita et al., recurrent neural networks are capable of self-organization.

Hinoshita et al. argue that by observing how RNNs self-organize while learning language, we can generate hypotheses about how the human brain might self-organize during language acquisition.