Continual learning by surrogate objectives

Supervisors

Suitable for

MSc in Advanced Computer Science

Abstract

A critical challenge in continual learning is catastrophic forgetting (CF), where the acquisition of new information erodes previously learned knowledge. This phenomenon poses a substantial barrier, particularly when updating large models, because retraining from scratch becomes computationally unscalable as data accumulate. CF is primarily attributed to biased gradients arising from shifts in the data sampling distribution over time. This project explores a computationally efficient approach to alleviating forgetting. The core idea is to identify surrogate objective functions that circumvent the need for extensive memory and computational resources during optimization. The student undertaking this project will work on:

  1. Literature review: studying key papers on continual learning to understand the problem and standard experimental settings.
  2. Implementation: implementing the proposed method and conducting a comparative analysis against existing approaches, with a focus on both sample and computation efficiency.
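To make the idea of a surrogate objective concrete, the sketch below shows one simple member of this family: a quadratic penalty that anchors the parameters to the solution of the previous task, so that new-task gradients are debiased toward not overwriting old knowledge. This is an illustrative assumption, not the project's actual method (regularization-based approaches such as EWC use penalties of this shape); all function names and the toy two-task linear-regression setup are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=200):
    """Toy regression task: targets generated by a known weight vector."""
    X = rng.normal(size=(n, 2))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    return X, y

def mse(w, X, y):
    return np.mean((X @ w - y) ** 2)

def train(X, y, w_init, anchor=None, lam=0.0, lr=0.1, steps=500):
    """Gradient descent on MSE, optionally plus a surrogate penalty
    lam * ||w - anchor||^2 that discourages drifting from the old solution."""
    w = w_init.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        if anchor is not None:
            grad += 2 * lam * (w - anchor)  # gradient of the surrogate term
        w -= lr * grad
    return w

# Two tasks with conflicting optimal weights -> sequential training forgets.
Xa, ya = make_task(np.array([1.0, 0.0]))
Xb, yb = make_task(np.array([0.0, 1.0]))

w_a = train(Xa, ya, np.zeros(2))                   # learn task A
w_plain = train(Xb, yb, w_a)                       # then task B, no penalty
w_surr = train(Xb, yb, w_a, anchor=w_a, lam=1.0)   # task B with surrogate term

print("task A error, plain:    ", mse(w_plain, Xa, ya))
print("task A error, surrogate:", mse(w_surr, Xa, ya))
```

Note the trade-off the surrogate introduces: the anchored model forgets far less on task A at some cost on task B, without storing any task-A data or replaying it during the second phase, which is the kind of memory/computation saving the project targets.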