University of Twente Student Theses

Distillation and Partial Model Freezing for Continual Learning in Neural Fields

Visser, W.A. (2025) Distillation and Partial Model Freezing for Continual Learning in Neural Fields.

Full text: PDF (9MB)
Abstract:Neural fields are increasingly used as signal representations. Built on neural networks, they offer continuous representations of signals and can serve as solutions to inverse problems. In some use cases, the represented signals are not only spatial but also temporal, and it can then be beneficial to incorporate new measurements into an existing neural field as soon as they become available. This presents a problem: the neural networks that neural fields are based on can forget previously learnt knowledge when trained on new data, a phenomenon known as catastrophic forgetting. In this work, we investigate how continual learning affects neural fields of different architectures. We then demonstrate two methods that mitigate the forgetting caused by continual learning in neural fields. First, we show how the model trained on earlier tasks can be reused through knowledge distillation to prevent performance degradation when learning a subsequent task. Second, we show how, for models using the DINER architecture, part of the model can be frozen, lowering the loss in performance while still allowing the model to learn to represent new measurements.
Item Type:Essay (Master)
Faculty:EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject:54 computer science
Programme:Computer Science MSc (60300)
Link to this item:https://purl.utwente.nl/essays/107275
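
The two methods named in the abstract lend themselves to short illustrations. Below is a minimal sketch of the distillation idea, assuming a PyTorch coordinate MLP; the names (FieldMLP, train_task_with_distillation, lambda_distill) are illustrative, not taken from the thesis. A frozen copy of the model trained on earlier tasks supplies targets at coordinates drawn from the old domain, so no raw measurements from earlier tasks need to be stored.

```python
# Sketch of distillation-based continual learning for a neural field.
# Assumption: a plain ReLU MLP stands in for whatever field architecture
# the thesis uses; the loss structure is the generic distillation recipe.
import copy
import torch
import torch.nn as nn

class FieldMLP(nn.Module):
    """Small coordinate MLP mapping (x, y, t) -> signal value."""
    def __init__(self, in_dim=3, hidden=128, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, coords):
        return self.net(coords)

def train_task_with_distillation(model, coords_new, values_new,
                                 coords_old, lambda_distill=1.0,
                                 steps=1000, lr=1e-4):
    """Fit new measurements while distilling a frozen snapshot of the
    previous model on coordinates from earlier tasks."""
    teacher = copy.deepcopy(model).eval()      # snapshot of old knowledge
    for p in teacher.parameters():
        p.requires_grad_(False)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_new = mse(model(coords_new), values_new)    # fit new task
        with torch.no_grad():
            targets_old = teacher(coords_old)            # teacher predictions
        loss_old = mse(model(coords_old), targets_old)   # distillation term
        (loss_new + lambda_distill * loss_old).backward()
        opt.step()
    return model
```

In this formulation the distillation coordinates can simply be sampled from the spatio-temporal domain covered by the earlier tasks, since only the old model, not the old data, is needed to produce targets.

The partial-freezing method relies on the structure of DINER, which pairs a learnable hash table, with one entry per input sample, with a small shared MLP backbone. The sketch below assumes one plausible split: the shared MLP is frozen and only the table rows indexed by new measurements are updated. The abstract does not specify which part the thesis freezes, so this split is an assumption.

```python
# Sketch of partial freezing for a DINER-style model. Assumption: the
# frozen part is the shared MLP; because each table row only receives
# gradient from the samples that index it, rows belonging to old
# measurements are left untouched and old outputs are preserved.
import torch
import torch.nn as nn

class DinerField(nn.Module):
    """DINER-style field: a learnable table of per-sample latent codes
    feeding a shared MLP decoder."""
    def __init__(self, num_samples, code_dim=2, hidden=64, out_dim=1):
        super().__init__()
        self.table = nn.Embedding(num_samples, code_dim)  # hash-table part
        self.mlp = nn.Sequential(
            nn.Linear(code_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, idx):
        return self.mlp(self.table(idx))

def train_new_measurements(model, new_idx, new_values,
                           steps=1000, lr=1e-3):
    """Fit new measurements with the shared MLP frozen. Rows of the table
    not indexed by new_idx receive zero gradient, so they stay fixed."""
    for p in model.mlp.parameters():
        p.requires_grad_(False)                  # freeze the shared decoder
    opt = torch.optim.Adam(model.table.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = mse(model(new_idx), new_values)   # only new rows get gradient
        loss.backward()
        opt.step()
    return model
```

Under this split, learning new measurements reduces to fitting a handful of new latent codes under a fixed decoder, which is one way to read the abstract's claim that freezing lowers the performance loss on earlier measurements while still allowing new ones to be represented.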