Masters Theses

Date of Award

12-1991

Degree Type

Thesis

Degree Name

Master of Science

Major

Engineering Science

Major Professor

Lefteri Tsoukalas

Abstract

The functional link net of Yoh-Han Pao and the high-order neural network of Giles and Maxwell require that the user select the expansion terms to suit the particular data set. For the two-category classification problem, a method of adaptively finding appropriate expansion terms for a one-layer functional link net is presented and discussed. In the training phase, input vectors x are expanded using Hebbian learning in the form of second-order neurons (an outer-product expansion). A new network layer is then created by multiplying the expanded vectors by a matrix obtained from the Karhunen-Loève expansion of those expanded vectors. This removes all correlation from the features of the expanded vectors and may reduce their dimensionality. If the Ho-Kashyap algorithm indicates linear separability, the process stops; otherwise, the current layer is expanded and the above steps are repeated until linear separability is obtained. It is then possible to pass a vector x having symbolic terms through the multilayer network. The result is a polynomial in the components of x. The symbolic portion of each term of the polynomial represents one expansion function of an equivalent one-layer functional link net, and the numerical coefficient of the term is the associated weight.
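A minimal sketch of the iterative construction described above, assuming NumPy, class labels coded as +1/-1, and a hypothetical variance-retention threshold for the Karhunen-Loève reduction; all function names are illustrative and are not taken from the thesis.

```python
import numpy as np

def outer_product_expand(X):
    """Append second-order (outer-product) terms x_i * x_j, i <= j,
    to each vector -- one pass of the second-order neuron expansion."""
    n, d = X.shape
    idx_i, idx_j = np.triu_indices(d)
    return np.hstack([X, X[:, idx_i] * X[:, idx_j]])

def kl_transform(X, var_keep=0.999):
    """Karhunen-Loeve transform: decorrelate the expanded features and
    drop directions carrying negligible variance (dimensionality reduction).
    Returns the transformed data and the layer matrix W."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = np.cumsum(eigvals) / eigvals.sum() <= var_keep
    keep[0] = True                      # always retain the leading direction
    W = eigvecs[:, keep]                # new layer = expanded vectors @ W
    return Xc @ W, W

def ho_kashyap_separable(X, y, n_iter=1000, lr=0.1, tol=1e-6):
    """Ho-Kashyap test: returns True if a separating weight vector is found
    (all margins driven positive) within the iteration budget."""
    Y = np.hstack([X, np.ones((X.shape[0], 1))])   # augment with bias term
    Y = Y * y[:, None]                             # reflect the -1 class
    b = np.ones(Y.shape[0])
    Y_pinv = np.linalg.pinv(Y)
    for _ in range(n_iter):
        a = Y_pinv @ b
        e = Y @ a - b
        if np.all(e > -tol):                       # Ya > 0: linearly separable
            return True
        b = b + lr * (e + np.abs(e))               # margins may only increase
    return False

def build_network(X, y, max_layers=3):
    """Expand and decorrelate layer by layer until the Ho-Kashyap test
    indicates linear separability or the layer budget is exhausted."""
    layers, Z = [], X
    for _ in range(max_layers):
        Z = outer_product_expand(Z)
        Z, W = kl_transform(Z)
        layers.append(W)
        if ho_kashyap_separable(Z, y):
            break
    return layers, Z
```

Passing a symbolic vector through the stored layer matrices (for example with a computer-algebra package) would then yield the polynomial whose terms and coefficients define the equivalent one-layer functional link net, as the abstract notes.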
