Masters Theses

Date of Award

8-2019

Degree Type

Thesis

Degree Name

Master of Science

Major

Computer Science

Major Professor

James S. Plank

Committee Members

Mark E. Dean, Garrett S. Rose

Abstract

A deep neural network is a non-spiking artificial neural network that uses multiple structured layers to extract features from its input. A spiking neural network is another type of artificial neural network that more closely mimics biology, using time-dependent pulses to transmit information. Whetstone is a training algorithm for spiking deep neural networks: it modifies the backpropagation algorithm typically used in deep learning by converting the activation function of a deep neural network into the threshold used by a spiking neural network. This work converts a spiking deep neural network trained with Whetstone into a traditional spiking neural network in the TENNLab framework. The conversion decomposes the dot-product operation found in the convolutional layers of the spiking deep neural network into synapse connections between neurons in the traditional spiking neural network, and it redesigns the neuron and synapse structure of the convolutional layer to trade time for space. A new architecture, built from traditional spiking neural networks, is created in the TENNLab framework and behaves identically to the spiking deep neural network trained by Whetstone; this architecture verifies that the converted spiking neural network preserves the behavior of the original. This work can also convert networks to run on other TENNLab architectures, allowing networks for those architectures to be trained with backpropagation through Whetstone and expanding the variety of training techniques available to the TENNLab architectures.
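The core idea of the conversion can be sketched as follows. This is a minimal, illustrative example, not the thesis's implementation: the kernel values, patch, and threshold are hypothetical, and the spiking side is reduced to a single integrate-and-fire step. It shows how one convolutional dot product followed by a hard threshold (the Whetstone-style thresholded activation) can be decomposed into individual weighted synapse connections accumulating into a single thresholded neuron.

```python
import numpy as np

# Hypothetical 3x3 convolution kernel and binary input patch
# (values chosen for illustration only, not taken from the thesis).
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])
patch = np.array([[1, 1, 0],
                  [1, 0, 0],
                  [1, 0, 1]], dtype=float)
theta = 2.0  # assumed firing threshold

# Deep-network view: one dot product, then a hard threshold
# (Whetstone replaces a smooth activation with a step at theta).
dnn_out = 1 if float(np.sum(kernel * patch)) >= theta else 0

# Spiking view: each kernel weight becomes one synapse; each input
# spike is weighted by its synapse and accumulated as the potential
# of a single postsynaptic neuron, which fires if it reaches theta.
potential = 0.0
for i in range(3):
    for j in range(3):
        if patch[i, j] > 0:            # presynaptic neuron spiked
            potential += kernel[i, j]  # add this synapse's weight

spike = 1 if potential >= theta else 0
assert spike == dnn_out  # both views compute the same output
```

The equivalence holds because a dot product with a 0/1 input is exactly a sum of the weights on the synapses whose presynaptic neurons spiked, which is why the decomposition preserves the trained network's behavior.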
