
Natural Language Grammatical Inference with Recurrent Neural Networks. IEEE Transactions on Knowledge and Data Engineering, Vol. 12, No. 1 (January 2000), pp. 126-140, doi:10.1109/69.842255, by Steve Lawrence, C. Lee Giles, and Sandiway Fong.
@article{lawrence-et-al-2000,
    abstract = {This paper examines the inductive inference of a complex grammar with neural networks; specifically, the task considered is that of training a network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or {Government-and-Binding} theory. Neural networks are trained, without the division into learned vs. innate components assumed by Chomsky, in an attempt to produce the same judgments as native speakers on sharply grammatical/ungrammatical data. How a recurrent neural network could possess linguistic capability and the properties of various common recurrent neural network architectures are discussed. The problem exhibits training behavior which is often not present with smaller grammars and training was initially difficult. However, after implementing several techniques aimed at improving the convergence of the gradient descent backpropagation-through-time training algorithm, significant learning was possible. It was found that certain architectures are better able to learn an appropriate grammar. The operation of the networks and their training is analyzed. Finally, the extraction of rules in the form of deterministic finite state automata is investigated.},
    address = {Piscataway, NJ, USA},
    author = {Lawrence, Steve and Giles, C. Lee and Fong, Sandiway},
    doi = {10.1109/69.842255},
    issn = {1041-4347},
    journal = {IEEE Transactions on Knowledge and Data Engineering},
    keywords = {ann, language, learning, model, rnn},
    month = jan,
    number = {1},
    pages = {126--140},
    publisher = {IEEE Educational Activities Department},
    title = {Natural Language Grammatical Inference with Recurrent Neural Networks},
    url = {http://dx.doi.org/10.1109/69.842255},
    volume = {12},
    year = {2000}
}

See the CiteULike entry for more info, PDF links, BibTeX, etc.

Lawrence et al. train different kinds of recurrent neural networks to classify sentences as grammatical or ungrammatical.

Lawrence et al. manage to train ANNs to learn grammar-like structure without giving them any inbuilt representation of grammar. They argue that this shows that Chomsky's assumption that humans must have inborn linguistic capabilities is unnecessary.
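
As a rough illustration of the task, here is a minimal PyTorch sketch, not the paper's actual setup (Lawrence et al. feed part-of-speech tag sequences into Elman, Williams & Zipser, and other recurrent architectures, with several tricks to make backpropagation through time converge; the vocabulary size, dimensions, and toy data below are placeholders):

import torch
import torch.nn as nn

class GrammaticalityRNN(nn.Module):
    """Elman-style RNN: embed token ids, run the recurrence, and map the
    final hidden state to a single grammatical/ungrammatical logit."""
    def __init__(self, vocab_size=64, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, tokens):                 # tokens: (batch, seq_len) ids
        _, h_n = self.rnn(self.embed(tokens))  # h_n: (1, batch, hidden_dim)
        return self.out(h_n[-1]).squeeze(-1)   # one logit per sentence

model = GrammaticalityRNN()
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# One training step on a toy batch (random ids; label 1 = grammatical).
# backward() unrolls the recurrence: backpropagation through time.
x = torch.randint(0, 64, (8, 10))
y = torch.randint(0, 2, (8,)).float()
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

The point of the paper survives even in this sketch: nothing in the network encodes grammatical rules in advance, so any grammar-like discrimination it acquires has to be induced from the labeled examples alone.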