Degree Type

Thesis

Date of Award

2020

Degree Name

Master of Science

Department

Computer Science

Major

Computer Science

First Advisor

Jin Tian

Abstract

Neural networks can be repurposed via reprogramming to perform new tasks that differ from the tasks they were originally trained for. We introduce a new and improved reprogramming technique that, compared to prior work, achieves better accuracy, scales better, and can be successfully applied to more complex tasks. While the prior literature focuses on potential malicious uses of reprogramming, we argue that reprogramming can be viewed as an efficient training method. Our reprogramming method allows existing pre-trained models to be re-used and easily reprogrammed to perform new tasks. This technique requires far less effort and hyperparameter tuning than training new models from scratch. Therefore, we believe that our improved and scalable reprogramming method has the potential to become a new method for creating machine learning models.
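To illustrate the general idea behind reprogramming (not the specific method contributed by this thesis), the following is a minimal PyTorch-style sketch: a frozen pre-trained classifier is repurposed for a new task by learning only an additive input "program" and using a fixed mapping from source labels to target labels. All class, parameter, and variable names here are illustrative assumptions, not taken from the thesis.

```python
# Sketch of generic neural-network reprogramming: only the input "program"
# is trained; the pre-trained model's weights stay frozen.
import torch
import torch.nn as nn

class Reprogrammer(nn.Module):
    def __init__(self, pretrained, input_shape, num_source_classes, num_target_classes):
        super().__init__()
        self.pretrained = pretrained
        for p in self.pretrained.parameters():   # freeze the original weights
            p.requires_grad = False
        # Learnable perturbation ("program") added to every input
        self.program = nn.Parameter(torch.zeros(*input_shape))
        # Fixed many-to-one mapping from source labels to target labels
        self.register_buffer(
            "label_map",
            torch.arange(num_source_classes) % num_target_classes,
        )
        self.num_target_classes = num_target_classes

    def forward(self, x):
        logits = self.pretrained(x + self.program)           # query the frozen model
        # Aggregate source-class logits into target-class logits
        out = torch.zeros(x.size(0), self.num_target_classes, device=x.device)
        out.index_add_(1, self.label_map, logits)
        return out

# Usage: a stand-in pre-trained model; only `program` receives gradient updates.
pretrained = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
model = Reprogrammer(pretrained, (3, 32, 32), num_source_classes=10, num_target_classes=2)
opt = torch.optim.Adam([model.program], lr=1e-2)
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()
```

Because only the input program is optimized, reprogramming avoids retraining the network itself, which is the source of the reduced effort and hyperparameter tuning noted in the abstract.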

DOI

https://doi.org/10.31274/etd-20200624-122

Copyright Owner

Eliska Kloberdanz

Language

en

File Format

application/pdf

File Size

29 pages
