Masters Theses

Date of Award

12-1988

Degree Type

Thesis

Degree Name

Master of Science

Major

Computer Science

Major Professor

Bruce MacLennan

Committee Members

David Mutchler, David Straight

Abstract

Neural nets give a user the ability to train a system to accomplish a task. The net learns from training pairs and stores those input-output relationships. It can then generalize, producing an output based on an input's nearness to all of the trained inputs; this works well even when the inputs are noisy or distorted. Neural nets also degrade gracefully as the input deteriorates, that is, as it moves farther and farther from the trained input. This is a study of a back-propagation net. Hidden layers allow the net to store more complicated relationships, which may well be something other than what the user thinks is best or most obvious. These relationships are stored as connection strengths: at each node the net applies an activation function to the weighted inputs and then passes that result on to the next node. This study shows how well the back-propagation net accomplished the task of recognizing alphabetic characters.
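The node computation described in the abstract (an activation function applied to weighted inputs, passed on to the next layer) can be sketched as follows. This is a minimal illustration, not code from the thesis; the sigmoid activation and the example weights are assumptions:

```python
import math

def sigmoid(x):
    # A common activation function; squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    # Each node applies the activation function to the weighted sum of its
    # inputs and passes the result on to the next layer.
    outputs = []
    for node_weights, bias in zip(weights, biases):
        weighted_sum = sum(w * x for w, x in zip(node_weights, inputs)) + bias
        outputs.append(sigmoid(weighted_sum))
    return outputs

# Two inputs feeding a hidden layer of two nodes (illustrative values only).
hidden = layer_forward([1.0, 0.0],
                       weights=[[0.5, -0.3], [0.8, 0.2]],
                       biases=[0.0, 0.1])
```

In a back-propagation net such as the one studied here, the connection strengths (the `weights` above) are the quantities adjusted during training.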
