University of Twente Student Theses

Gradient Descent with Random Minibatches in the Linear Model

Mash'Al, Yazan (2024) Gradient Descent with Random Minibatches in the Linear Model.

Full text:PDF (1MB)
Abstract:This thesis explores the dynamics and convergence properties of gradient descent algorithms when combined with dropout regularization and random minibatch sampling in the context of linear regression. Dropout, a form of regularization, and minibatch gradient descent are both central to building robust machine learning models that generalize well to unseen data. The thesis presents the theoretical properties and practical implications of using these two methods together (see the illustrative sketch below).
Item Type:Essay (Bachelor)
Faculty:EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject:31 mathematics, 54 computer science
Programme:Applied Mathematics BSc (56965)
Link to this item:https://purl.utwente.nl/essays/100665
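The abstract above describes combining dropout regularization with minibatch gradient descent for linear regression. The Python sketch below is not taken from the thesis; it is a minimal illustration of one way the two ideas can be combined, under assumed choices for the keep probability, learning rate, batch size, and synthetic data.

    import numpy as np

    def dropout_minibatch_sgd(X, y, keep_prob=0.5, lr=0.01, batch_size=32, epochs=100, seed=0):
        """Illustrative minibatch gradient descent for linear regression,
        with a Bernoulli dropout mask applied to the features of each minibatch.
        keep_prob is the probability of retaining a feature (an assumed choice)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(epochs):
            perm = rng.permutation(n)
            for start in range(0, n, batch_size):
                idx = perm[start:start + batch_size]
                Xb, yb = X[idx], y[idx]
                # Redraw the dropout mask for every minibatch
                mask = rng.binomial(1, keep_prob, size=d)
                Xd = Xb * mask                         # zero out dropped features
                # Gradient of the mean squared-error loss on the masked minibatch
                grad = Xd.T @ (Xd @ w - yb) / len(idx)
                w -= lr * grad
        return w

    # Toy usage on synthetic data (illustrative only)
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ true_w + 0.1 * rng.normal(size=500)
    print(dropout_minibatch_sgd(X, y))

Because the mask is redrawn for every minibatch, each update mixes two sources of randomness, the sampled batch and the dropout mask; the interaction between these two is the kind of dynamics and convergence behaviour the thesis studies.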