University of Twente Student Theses


Accelerating Neural Network Training Using the Neural Tangent Kernel

Sarvanau, K. (2025) Accelerating Neural Network Training Using the Neural Tangent Kernel.

Full text: PDF (5MB)
Abstract: The training of deep neural networks, while foundational to modern artificial intelligence, presents significant computational challenges that limit efficiency and scalability. This thesis investigates the Neural Tangent Kernel (NTK) as a theoretical framework to accelerate this process. We propose a hybrid training strategy that begins with standard Stochastic Gradient Descent (SGD) and transitions to an NTK-based update method once the network enters a stable, "lazy training" regime. A core contribution of this work is the development of data-driven heuristics, based on monitoring either the change in the parameter norm or the stability of the NTK, to dynamically identify the optimal switch point. This methodology is systematically evaluated on two distinct tasks: image classification with the Fashion-MNIST dataset and regression with the Wine Quality dataset. We implement and compare three different NTK update methods against a pure SGD baseline across networks of varying widths. Our results demonstrate that the hybrid approach, particularly a one-shot update derived from the integrated NTK flow, can achieve final performance comparable to full SGD training in a fraction of the time, validating the strategy as a promising method for accelerating neural network convergence.
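The abstract does not reproduce the exact update equations, so the following NumPy sketch only illustrates the standard closed-form solution of the linearized (NTK) gradient flow under squared loss, which is the kind of "one-shot update derived from the integrated NTK flow" the abstract refers to. All names here (ntk_one_shot_update, K_train, K_cross, the linear-kernel demo) are illustrative assumptions, not the author's implementation; consult the full text via the link below for the actual methods.

```python
import numpy as np

def ntk_one_shot_update(f0_train, f0_test, y_train, K_train, K_cross, t, lr):
    """Closed-form outputs of the linearized (NTK) gradient flow under squared loss.

    Solves df/dt = -lr * K (f - y) analytically, so the network's predictions at
    "training time" t are obtained in one shot instead of by further SGD steps.
    K_train is the empirical NTK on the training inputs, K_cross the kernel
    between test and training inputs, and f0_* the outputs at the switch point.
    """
    eigvals, eigvecs = np.linalg.eigh(K_train)            # K is symmetric PSD
    decay = eigvecs @ np.diag(np.exp(-lr * t * eigvals)) @ eigvecs.T
    residual0 = f0_train - y_train
    # Training outputs relax exponentially toward the labels along NTK eigenmodes.
    f_train_t = y_train + decay @ residual0
    # Pseudo-inverse of K (small eigenvalues dropped) for the test-point formula.
    inv_vals = np.where(eigvals > 1e-10, 1.0 / np.maximum(eigvals, 1e-10), 0.0)
    K_pinv = eigvecs @ np.diag(inv_vals) @ eigvecs.T
    f_test_t = f0_test - K_cross @ K_pinv @ (np.eye(len(y_train)) - decay) @ residual0
    return f_train_t, f_test_t


# Tiny synthetic demo: a linear kernel stands in for the empirical NTK.
rng = np.random.default_rng(0)
X_tr, X_te = rng.normal(size=(40, 5)), rng.normal(size=(8, 5))
y_tr = X_tr @ rng.normal(size=5)
K_tr = X_tr @ X_tr.T + 1e-6 * np.eye(40)
K_x = X_te @ X_tr.T
f_tr, f_te = ntk_one_shot_update(np.zeros(40), np.zeros(8), y_tr, K_tr, K_x, t=100.0, lr=1e-2)
print(float(np.mean((f_tr - y_tr) ** 2)))  # training residual shrinks as t grows
```

In the hybrid strategy described above, such a closed-form step would replace further SGD iterations once the switch-point heuristic (parameter-norm change or NTK stability) signals that the network has entered the lazy-training regime.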
Item Type: Essay (Bachelor)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 54 computer science
Programme: Applied Mathematics BSc (56965)
Link to this item: https://purl.utwente.nl/essays/107993