Effect of Noise in NN-Training

Supervisor

Suitable for

Computer Science, Part B
Mathematics and Computer Science, Part C
Computer Science and Philosophy, Part C
Computer Science, Part C

Abstract

One interesting approach to reducing overfitting in neural networks is to add noise to the inputs and activations before performing each gradient step. The key insight is that this noise injection prevents the learnt weights from being too delicately balanced to fit the data exactly; some degree of robustness is necessary to fit noisy data.
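As a minimal illustration of this idea (not part of the project specification), the sketch below trains a toy linear model by gradient descent, injecting fresh Gaussian input noise before every gradient step; the noise scale `sigma` and all other values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + 1 with a little label noise.
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=200)

w, b = 0.0, 0.0
lr, sigma = 0.1, 0.1  # sigma controls the injected input noise (illustrative)

for epoch in range(200):
    # Inject fresh Gaussian noise into the inputs before each gradient step.
    X_noisy = X + sigma * rng.normal(size=X.shape)
    pred = w * X_noisy[:, 0] + b
    err = pred - y
    # Gradients of the mean squared error w.r.t. w and b.
    grad_w = 2.0 * np.mean(err * X_noisy[:, 0])
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)
```

For linear models this input noise acts like a ridge (L2) penalty, slightly shrinking the learnt weight towards zero, which hints at why noise injection can behave like a regulariser.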

Another interesting consequence of noisy training is that recent work suggests learning algorithms that use noise may be better at protecting the privacy of the data. Thus, there may be twin advantages to this approach. This project will involve understanding the background of this topic, performing simulations to understand the behaviour, and hopefully developing new theory.
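To make the privacy connection concrete, the following sketch shows a noisy gradient descent loop in the style of differentially private SGD: each example's gradient is clipped to bound its influence, and Gaussian noise calibrated to that bound is added before the update. The hyperparameters (`clip`, `noise_mult`) are illustrative assumptions, not a calibrated privacy guarantee.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data; the aim is to learn a model whose updates do not
# depend too strongly on any single training example.
X = rng.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 1.0

w, b = 0.0, 0.0
lr, clip, noise_mult = 0.1, 1.0, 0.5  # illustrative hyperparameters

for step in range(300):
    pred = w * X[:, 0] + b
    err = pred - y
    # Per-example gradients of the squared error.
    g = np.stack([2.0 * err * X[:, 0], 2.0 * err], axis=1)
    # Clip each example's gradient norm to bound its sensitivity.
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip)
    # Average, then add Gaussian noise scaled to the clipping bound.
    g_mean = g.mean(axis=0)
    g_mean += (noise_mult * clip / len(X)) * rng.normal(size=2)
    w -= lr * g_mean[0]
    b -= lr * g_mean[1]

print(w, b)
```

The clipping step is what lets the added noise mask any one example's contribution; understanding how much noise is needed, and what it costs in accuracy, is exactly the kind of question the project could investigate.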

This project may involve collaboration with Mr. Alexis Poncet and Dr. Thomas Steinke.