ORCID

0000-0002-9839-1163

Date of Award

2025

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Complex Systems and Data Science

First Advisor

Nick Cheney

Abstract

Neural networks have demonstrated remarkable capabilities in many domains, yet they struggle with continual learning: the ability to keep learning new information without forgetting previous knowledge. This dissertation investigates how repeatedly resetting and relearning weights in artificial neural networks, a process we term "zapping," can improve their ability to learn continually and transfer knowledge to new domains.

Through three interconnected studies, we first develop OmnImage, a novel dataset containing 1,000 classes of natural images optimized for few-shot learning through evolutionary computation. Using this and other datasets, we then demonstrate that periodically resetting the weights of the last layer during training, combined with a sequential learning procedure we call Alternating Sequential and Batch (ASB) learning, enables models to achieve state-of-the-art performance in continual learning tasks without requiring complex meta-learning approaches. Finally, we conduct a detailed investigation into the mechanisms behind zapping's effectiveness, revealing how it shapes the dynamics of learning and forgetting within neural networks.
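To make the mechanism concrete, the following sketch (an illustration under assumptions, not code from the dissertation) shows one way last-layer "zapping" could look in PyTorch: the output layer's weights are periodically re-initialized mid-training while the rest of the network keeps learning. The reset interval, toy architecture, and synthetic data are all assumptions chosen only to keep the example runnable.

```python
import torch
import torch.nn as nn

# Minimal sketch, NOT the dissertation's implementation: "zapping" is shown
# here as periodic re-initialization of the final layer, forcing the network
# to repeatedly relearn its output mapping on top of its existing features.
# Interval, architecture, and data below are illustrative assumptions.

torch.manual_seed(0)
ZAP_EVERY = 50  # assumed number of training steps between resets

model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),  # toy feature extractor
    nn.Linear(64, 10),             # final layer that gets "zapped"
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def zap(last_layer: nn.Linear) -> None:
    """Reset the last layer's parameters, discarding what it has learned."""
    last_layer.reset_parameters()

for step in range(500):
    if step % ZAP_EVERY == 0:
        zap(model[2])  # reset the output layer, then continue training
    # synthetic batch standing in for real image features and labels
    x = torch.randn(16, 32)
    y = torch.randint(0, 10, (16,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```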

Our findings suggest that repeatedly exposing networks to controlled instances of forgetting during training leads to the development of more robust and adaptable features. This work advances our understanding of continual learning in neural networks and provides practical techniques for improving their ability to learn continuously in real-world applications. The insights and methods developed here have implications for building more flexible artificial intelligence systems capable of sustained learning over time.

Language

en

Number of Pages

136 p.
