Author(s): Heath, Andrew (2021)
Abstract:
Sparsity constraints reduce the computation time and memory requirements of Artificial Neural Networks (ANNs). Further research has shown that pruning at each epoch based on accuracy yields similar improvements. Research on pruning has brought ANNs closer in line with their biological counterparts. However, the formation and reinforcement of hub neurons, as seen in the brain, has not been fully explored in ANNs. Reinforcing such hub neurons could reduce the time and memory requirements of pruning-based training algorithms. This research investigates the Laplacian centrality of neurons in ANNs and Sparse Neural Networks (trained using SET) during training, showing changes in the distribution of the neurons' Laplacian centrality. We propose an ANN training method, CenSET, that uses Laplacian centrality to guide the pruning of connections during the training of a sparse ANN. We show that this approach does not dramatically decrease accuracy compared to training an ANN with a conventional MLP or SET approach.
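For illustration, the Laplacian centrality the abstract refers to can be sketched as follows: a node's centrality is the relative drop in the graph's Laplacian energy (the sum of squared Laplacian eigenvalues) when that node is removed. This is a minimal sketch, assuming an undirected weighted graph given by an adjacency matrix; the function names and representation are ours, not taken from the thesis or its CenSET implementation.

```python
import numpy as np

def laplacian_energy(A):
    """Laplacian energy E_L(G) = sum of squared Laplacian eigenvalues.
    Since L = D - A is symmetric, this equals trace(L^2) = sum of L's
    squared entries, avoiding an eigendecomposition."""
    L = np.diag(A.sum(axis=1)) - A
    return float((L * L).sum())

def laplacian_centrality(A):
    """For each node v, the relative drop in Laplacian energy when v
    (its row and column) is deleted from the graph."""
    full = laplacian_energy(A)
    n = A.shape[0]
    cent = np.empty(n)
    for v in range(n):
        keep = [i for i in range(n) if i != v]
        cent[v] = (full - laplacian_energy(A[np.ix_(keep, keep)])) / full
    return cent

# Example: a star graph, where the hub (node 0) should score highest.
star = np.array([[0, 1, 1, 1],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0]], dtype=float)
scores = laplacian_centrality(star)
```

In a neural-network setting, the weighted adjacency matrix would be built from connection weights between neurons, and high-centrality neurons would correspond to the hub neurons whose connections a centrality-guided pruner such as CenSET could preferentially retain.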
Document(s):
Bsc_Thesis_Andrew_Heath (1).pdf