Neural Network Robustness Analysis: Insights, Limitations, and Future Directions
Annelot Bosman (Leiden University)
- 15:00, 10th July 2025, Bill Roscoe LT (112)
Robustness is a key requirement for deploying neural networks in safety-critical applications. During my PhD, I have investigated how to quantify robustness in neural networks using formal verification methods.
In this talk, I will discuss the insights we have gained so far into how robustness behaves across different models and datasets, where our proposed metrics can be applied, and the practical limitations that remain in current verification techniques.
I will also touch upon the potential of warm-starting to make verification more scalable and efficient. Although this idea is theoretically sound, it has not yet been explored in practice for neural network verification. I aim to pursue this direction during my upcoming visit to the University of Oxford, where I will collaborate with Marta Kwiatkowska’s team.