University of Twente Student Theses


Understanding the Dynamic Sparse Training Models

Teymurlu, Ibrahim (2023) Understanding the Dynamic Sparse Training Models.

Abstract:Artificial Neural Networks (ANNs) have gained popularity for their strong performance in various fields. However, traditional ANNs consist of densely connected layers, so the number of redundant and weak connections grows with the number of neurons, leading to extensive memory and computation consumption. To address this issue, techniques such as pruning and sparsity have been developed. Pruning removes unnecessary connections that contribute little to the network, while adaptive/dynamic sparsity removes redundant connections during training and allows new connections to grow in their place. In this paper, we aim to bring more understanding to Dynamic Sparse Training models by discussing their relation to mammalian brains and by improving the pruning process and the re-connection of neurons. We introduce the Brain-Mimetic Synapse Adjustment algorithm and successfully assess its classification performance on two datasets, CIFAR-10 and Fashion-MNIST. By the end of this research, we expect to contribute to the understanding of Dynamic Sparse Training models, improve the removal process in sparse models, and address the rewiring of neurons in Dynamic Sparse Training models. This gives us valuable insights into developing more brain-like, efficient, and effective neural networks.
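The prune-and-regrow cycle described in the abstract can be illustrated with a minimal sketch. This is a generic SET-style dynamic-sparsity step (drop the weakest active weights by magnitude, then regrow the same number at random inactive positions), not the thesis's Brain-Mimetic Synapse Adjustment algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def prune_and_regrow(weights, mask, prune_frac=0.3, rng=None):
    """One generic dynamic-sparsity step (illustrative, not the
    thesis's Brain-Mimetic Synapse Adjustment algorithm).

    weights : 2-D float array of layer weights (inactive entries are 0)
    mask    : boolean array of the same shape, True = active connection
    Modifies both arrays in place and returns them.
    """
    rng = np.random.default_rng() if rng is None else rng
    w_flat, m_flat = weights.ravel(), mask.ravel()

    # Prune: zero out the prune_frac weakest active connections.
    active = np.flatnonzero(m_flat)
    n_prune = int(prune_frac * active.size)
    weakest = active[np.argsort(np.abs(w_flat[active]))[:n_prune]]
    m_flat[weakest] = False
    w_flat[weakest] = 0.0

    # Regrow: activate the same number of currently inactive
    # connections at random, keeping overall density constant.
    inactive = np.flatnonzero(~m_flat)
    grown = rng.choice(inactive, size=n_prune, replace=False)
    m_flat[grown] = True
    w_flat[grown] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask
```

In a training loop this step would run periodically (e.g. every few epochs), so the network keeps a fixed connection budget while the mask topology adapts to the data.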
Item Type:Essay (Bachelor)
Faculty:EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject:54 computer science
Programme:Computer Science BSc (56964)
Awards:Best Presentation Award

