Show Tag: rnn

Select Other Tags

Lawrence et al. train different kinds of recurrent neural networks to classify sentences as grammatical or ungrammatical.

Lawrence et al. manage to train ANNs to learn grammar-like structure without giving them any inbuilt representation of grammar. They argue that this shows that Chomsky's assumption that humans must have innate linguistic capabilities is unnecessary.
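The setup can be pictured as a recurrent network that reads a sentence token by token and outputs a grammaticality judgement. The sketch below is not Lawrence et al.'s actual architecture; it is a minimal, untrained Elman-style RNN in numpy, with an assumed toy part-of-speech vocabulary, just to show the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy vocabulary of part-of-speech tags (hypothetical, for illustration).
vocab = {"<pad>": 0, "noun": 1, "verb": 2, "det": 3, "adj": 4}

H, V = 8, len(vocab)                 # hidden size, vocabulary size
W_xh = rng.normal(0, 0.1, (H, V))   # input-to-hidden weights
W_hh = rng.normal(0, 0.1, (H, H))   # recurrent (hidden-to-hidden) weights
w_out = rng.normal(0, 0.1, H)       # hidden-to-output weights

def judge(token_ids):
    """Elman-style RNN: read the sentence, return P(grammatical)."""
    h = np.zeros(H)
    for t in token_ids:
        x = np.eye(V)[t]                  # one-hot encode the token
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
    logit = w_out @ h
    return 1.0 / (1.0 + np.exp(-logit))   # sigmoid: probability in (0, 1)

p = judge([vocab["det"], vocab["noun"], vocab["verb"]])
print(0.0 < p < 1.0)
```

With training (e.g. backpropagation through time on labelled grammatical/ungrammatical sentences, which this sketch omits), the recurrent weights would come to encode the sequential regularities of the grammar without any explicit grammar rules being built in.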

According to Hinoshita et al., recurrent neural networks are capable of self-organization.

Hinoshita et al. argue that by observing how RNNs learn language, we can learn how the human brain might self-organize to support language learning.