University of Twente Student Theses

Hardware efficient Co-DETR model for mobile applications

Takken, S. (2024) Hardware efficient Co-DETR model for mobile applications.

PDF (3MB)
Abstract: Object detection models are evolving and their range of applications is expanding. They can be used as a tool for sustainability and climate action, including through mobile applications that run object detection models. Mobile devices are resource-constrained and have limited storage, so an object detection model must fit within these limits; however, object detection models can be complex and large. Co-DETR is a state-of-the-art object detection model, but it is currently too large to deploy in mobile applications. This research focuses on the effects of global unstructured magnitude pruning on the hardware efficiency of the Co-DETR model for mobile applications. Object detection models can be compressed to decrease their size, and pruning is one such method: redundant components of a network are removed, at a possible trade-off between average precision and model compression. To the best of my knowledge, however, there is no research available on applying model compression to the Co-DETR model. This research applies global unstructured pruning to the ResNet50 backbone of Co-DINO, focusing on pruning the weights of the convolutional layers. The impact on average inference time, model size and average precision is evaluated over a range of sparsity levels, before re-training at a 20% sparsity level. The results showed a negative impact: an increase in average inference time and model size and a decrease in Average Precision (AP). This research aims to investigate the effects of this pruning strategy on the Co-DETR model for mobile applications and to broaden knowledge of how pruning affects the average inference time, model size and average precision of the Co-DETR model.
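The pruning step described in the abstract can be illustrated with a minimal sketch, assuming PyTorch's torch.nn.utils.prune API and using torchvision's standalone resnet50 only as a stand-in for the actual Co-DINO backbone; the 20% sparsity level is taken from the abstract, and the thesis' own pipeline may differ.

```python
# Minimal sketch of global unstructured magnitude pruning on the
# convolutional layers of a ResNet-50 backbone (stand-in for the
# Co-DINO backbone; assumes PyTorch and torchvision are installed).
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision.models import resnet50

model = resnet50(weights=None)

# Collect the weight tensors of all convolutional layers.
parameters_to_prune = [
    (module, "weight")
    for module in model.modules()
    if isinstance(module, nn.Conv2d)
]

# Remove the 20% of convolutional weights with the smallest L1
# magnitude, ranked globally across all listed layers.
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.20,
)

# Make the pruning permanent: bake the masks into the weights and
# drop the re-parametrisation hooks before measuring model size or
# inference time.
for module, name in parameters_to_prune:
    prune.remove(module, name)
```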
Item Type: Essay (Bachelor)
Faculty: EEMCS: Electrical Engineering, Mathematics and Computer Science
Subject: 54 computer science
Programme: Computer Science BSc (56964)
Link to this item: https://purl.utwente.nl/essays/101945