University of Twente Student Theses


The Effect of Higher Order Activation Functions on Infinitely Wide Neural Networks

Heeringa, T.J. (2022) The Effect of Higher Order Activation Functions on Infinitely Wide Neural Networks.

Full text: PDF (1 MB)
Abstract: Machine learning with neural networks is a powerful tool for solving high-dimensional and nonlinear problems. Neural networks can approximate almost any function to arbitrary precision, and they appear not to suffer from the curse of dimensionality. A key goal in applied mathematics and computer science is to understand which neural networks approximate which functions well, and to explain why neural networks do not suffer from the curse of dimensionality. A major step towards that goal is to understand the continuous limit of neural networks. Recent research has proposed several function spaces as candidates for this limit, but the mathematical relationships between these spaces are not fully understood. In this thesis we further the understanding of these candidates and investigate which of the function spaces embed continuously into which. We show that the Barron space with ReLU as activation function is the largest of these spaces, and we derive a novel description of the remainder of a Taylor series in terms of a shallow neural network with the higher order ReLU as activation function. We demonstrate that this shallow neural network does not suffer from the curse of dimensionality. We conclude with an analysis of the continuous limit of a deep neural network using control techniques.
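
As a rough guide to the terminology in the abstract, the following LaTeX sketch records the definitions commonly meant by the higher order ReLU and by a shallow (one-hidden-layer) network of Barron type; these are assumed standard conventions, and the thesis itself may use different notation or normalizations.

% Assumed standard conventions; the thesis may normalize differently.
\begin{align*}
  \sigma_k(x) &= \max(0,x)^k
    && \text{higher order ReLU of order } k \ (\sigma_1 \text{ is the ordinary ReLU}),\\
  f_m(x) &= \sum_{j=1}^{m} a_j\,\sigma_k(\langle w_j, x\rangle + b_j)
    && \text{shallow network of width } m,\\
  f(x) &= \int a\,\sigma_k(\langle w, x\rangle + b)\,\mathrm{d}\mu(a,w,b)
    && \text{infinite-width (Barron-type) representation.}
\end{align*}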
Item Type: Essay (Master)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 02 science and culture in general
Programme: Computer Science MSc (60300)
Link to this item: https://purl.utwente.nl/essays/90545
