Masters Theses

Date of Award

5-2020

Degree Type

Thesis

Degree Name

Master of Science

Major

Computer Engineering

Major Professor

Amir Sadovnik

Committee Members

Catherine D. Schuman, Bruce J. MacLennan

Abstract

Spiking neural networks (SNNs) have recently gained a lot of attention for use in low-power neuromorphic and edge computing. On their own, SNNs are difficult to train, owing to their lack of a differentiable activation function and their inherent tendency towards chaotic behavior. This work takes a strictly neuroscience-inspired approach to designing and training SNNs. We demonstrate that neuromodulated spike-timing-dependent plasticity (STDP) can be used to create a variety of learning paradigms, including unsupervised learning, semi-supervised learning, and reinforcement learning. To tackle the highly dynamic and potentially chaotic spiking behavior of SNNs both during training and testing, we discuss a variety of neuroscience-inspired homeostatic mechanisms for keeping the network's activity in a healthy range. All of these concepts are brought together in the development of an SNN model that is trained and tested on the MNIST handwritten digits dataset. To achieve this, we also introduce a custom Python package called Ganglion that can be used to rapidly design and test SNN architectures.
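For readers unfamiliar with the core learning rule named in the abstract, the sketch below illustrates the general idea of neuromodulated (reward-gated) pairwise STDP in Python: exponential pre- and post-synaptic traces drive potentiation and depression, and a scalar neuromodulatory signal gates the resulting weight change. This is a generic illustration under assumed parameter values, not the Ganglion API and not the exact rule developed in the thesis; all names here are hypothetical.

```python
import numpy as np

def neuromodulated_stdp(weights, pre_spikes, post_spikes, reward,
                        a_plus=0.01, a_minus=0.012,
                        tau_plus=20.0, tau_minus=20.0, dt=1.0,
                        w_min=0.0, w_max=1.0):
    """Apply reward-modulated pairwise STDP over one spike raster.

    weights:     (n_pre, n_post) synaptic weight matrix
    pre_spikes:  (T, n_pre)  binary spike raster of presynaptic neurons
    post_spikes: (T, n_post) binary spike raster of postsynaptic neurons
    reward:      scalar neuromodulatory signal; positive reinforces the
                 STDP update, negative reverses it, zero freezes weights
    """
    n_pre, n_post = weights.shape
    pre_trace = np.zeros(n_pre)    # low-pass filtered presynaptic activity
    post_trace = np.zeros(n_post)  # low-pass filtered postsynaptic activity

    for t in range(pre_spikes.shape[0]):
        pre = pre_spikes[t].astype(float)
        post = post_spikes[t].astype(float)

        # Decay the eligibility traces, then bump them on spikes.
        pre_trace *= np.exp(-dt / tau_plus)
        post_trace *= np.exp(-dt / tau_minus)
        pre_trace += pre
        post_trace += post

        # Potentiation: presynaptic trace pairs with current post spikes.
        # Depression:   postsynaptic trace pairs with current pre spikes.
        dw = (a_plus * np.outer(pre_trace, post)
              - a_minus * np.outer(pre, post_trace))

        # The neuromodulator gates (and can sign-flip) the plasticity.
        weights = np.clip(weights + reward * dw, w_min, w_max)

    return weights

# Example usage with random rasters; a positive reward strengthens
# causally ordered pre-before-post spike pairs.
rng = np.random.default_rng(0)
w = neuromodulated_stdp(rng.random((5, 3)),
                        rng.random((100, 5)) < 0.1,
                        rng.random((100, 3)) < 0.1,
                        reward=1.0)
```

Setting the reward to zero recovers a purely unsupervised regime, while supplying task-dependent positive or negative rewards yields semi-supervised and reinforcement-style learning, which is the sense in which a single plasticity rule can support multiple learning paradigms.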
