University of Twente Student Theses
Trade-offs in the Non-Convex Stochastic Gradient Method: Generalization, Optimization, and Computational Complexity
Soldatov, Andrii (2025) Trade-offs in the Non-Convex Stochastic Gradient Method: Generalization, Optimization, and Computational Complexity.
PDF (606 kB)
Abstract: This thesis explores the Stochastic Gradient Method (SGM) and the Deterministic Gradient Method (DGM) in non-convex optimization. For SGM, we focus on generalization bounds and computational complexity, showing that the method's performance depends strongly on the smoothness of the objective function and on the constant step size. The generalization bound for non-convex SGM with a constant step size exhibits exponential complexity, indicating the need for more flexible step-size strategies; the time complexity additionally scales with the number of samples and the problem dimensionality. For DGM, we examine the relationship between smoothness and expansiveness, a relationship that has not been established for non-convex functions, and show that DGM's performance is likewise sensitive to the step size. The optimization error for DGM grows exponentially in the number of iterations, and we derive an upper bound on the population risk. Simulations show that for complex non-convex functions, smoothness decreases as dimensionality increases, leading to poorer convergence, larger jumps in the iterates, and less consistent behaviour, illustrating the challenges that higher dimensions pose for performance.
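As a minimal illustrative sketch (not the author's code), the contrast between the two methods lies in their update rules: DGM takes a full-gradient step while SGM takes a single-sample gradient step, both with a constant step size. The sinusoidal least-squares objective, the step size, and all names below are hypothetical choices for illustration only.

```python
import numpy as np

# Hypothetical non-convex test objective: f(w) = (1/n) * sum_i (sin(x_i . w) - y_i)^2
rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

def grad_full(w):
    # Full gradient over all n samples: cost O(n * d) per step (DGM)
    r = np.sin(X @ w) - y
    return (2.0 / n) * (X.T @ (r * np.cos(X @ w)))

def grad_sample(w, i):
    # Gradient of a single sample i: cost O(d) per step (SGM)
    r = np.sin(X[i] @ w) - y[i]
    return 2.0 * r * np.cos(X[i] @ w) * X[i]

eta = 0.05                     # constant step size, as studied in the thesis
w_dgm = np.zeros(d)
w_sgm = np.zeros(d)
for t in range(1000):
    w_dgm = w_dgm - eta * grad_full(w_dgm)                     # DGM update
    w_sgm = w_sgm - eta * grad_sample(w_sgm, rng.integers(n))  # SGM update
```

Per iteration, the full-gradient step costs O(n·d) while the single-sample step costs O(d); this gap is the computational trade-off between the two methods that the abstract refers to.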
Item Type: Essay (Bachelor)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 31 mathematics
Programme: Applied Mathematics BSc (56965)
Link to this item: https://purl.utwente.nl/essays/105173