Doctoral Dissertations
Date of Award
5-1999
Degree Type
Dissertation
Degree Name
Doctor of Philosophy
Major
Computer Science
Major Professor
Bruce J. MacLennan
Committee Members
Bethany Dumas, Mike Johnson, Carmen Trammell, Brad Vander Zanden
Abstract
This research project examines Hebbian learning in recurrent neural networks for natural language processing and attempts to interpret language at the level of a two-and-one-half-year-old child. In this project, five neural networks were built to interpret natural language: a Simple Recurrent Network with Hebbian learning, a Jordan network with Hebbian learning and one hidden layer, a Jordan network with Hebbian learning and no hidden layers, a Simple Recurrent Network with backpropagation learning, and a nonrecurrent neural network with backpropagation learning. It is known that Hebbian learning works well when the input vectors are orthogonal, but, as this project shows, it does not perform well in recurrent neural networks for natural language processing even when the input vectors for the individual words are approximately orthogonal. This project shows that, given approximately orthogonal vectors to represent each word in the vocabulary, the input vectors for a given command are not approximately orthogonal, and the internal representations that the neural network builds are similar for different commands. As the data shows, the Hebbian learning neural networks were unable to perform the natural language interpretation task, while the backpropagation neural networks were much more successful. Therefore, Hebbian learning does not work well in recurrent neural networks for natural language processing even when the input vectors for the individual words are approximately orthogonal.
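To make the orthogonality point concrete, the following is a minimal sketch (not from the dissertation; all names and values are illustrative) of the classic outer-product Hebbian associator. It shows that recall is exact when the stored input vectors are mutually orthogonal, but degrades with cross-talk once the inputs share a common component, analogous to command-level vectors built from nearly orthogonal word vectors no longer being orthogonal to one another.

    import numpy as np

    rng = np.random.default_rng(0)

    def hebbian_train(inputs, targets):
        # Outer-product Hebbian rule: accumulate W += y x^T per pattern pair.
        dim_out, dim_in = targets.shape[1], inputs.shape[1]
        W = np.zeros((dim_out, dim_in))
        for x, y in zip(inputs, targets):
            W += np.outer(y, x)
        return W

    def recall(W, x):
        # Linear readout: W x = sum_i (x_i . x) y_i, so recall is clean
        # only when the stored inputs are mutually orthogonal unit vectors.
        return W @ x

    # Case 1: orthogonal unit inputs -- recall reproduces each target exactly.
    X_orth = np.eye(4)                       # four mutually orthogonal inputs
    Y = rng.standard_normal((4, 3))          # arbitrary target patterns
    W = hebbian_train(X_orth, Y)
    print(np.allclose(recall(W, X_orth[0]), Y[0]))   # True: no cross-talk

    # Case 2: overlapping inputs (a stand-in for whole-command vectors):
    # a shared component destroys orthogonality, and recall mixes targets.
    X_overlap = X_orth + 0.5
    W2 = hebbian_train(X_overlap, Y)
    print(np.allclose(recall(W2, X_overlap[0]), Y[0]))  # False: interference

In the second case each retrieved pattern is a weighted blend of all stored targets, which mirrors the abstract's finding that the network's internal representations become similar across different commands.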
Recommended Citation
Barilovits, L. Karlyn Ammons, "Hebbian learning in recurrent neural networks for natural language processing." PhD diss., University of Tennessee, 1999.
https://trace.tennessee.edu/utk_graddiss/8764