Neural Networks and Formal Languages

Supervisors

Suitable for

MSc in Advanced Computer Science
Mathematics and Computer Science, Part C
Computer Science and Philosophy, Part C
Computer Science, Part C

Abstract

There are several projects available on the theme “Neural Networks and Formal Languages”. Please get in touch for the most up-to-date information.

Neural networks such as RNNs and Transformers can process sequences, and hence they can operate as language recognisers. On the one hand, this makes them applicable to settings where automata and temporal logics are typically employed. On the other hand, it means that the capabilities of these networks can be studied through the lens of formal languages and automata theory. The goal of this area of research is to gain a better understanding of neural networks operating on sequences, improve their capabilities, and broaden their range of applications.
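
As a small illustration of the recogniser viewpoint, the sketch below hand-crafts a second-order (multiplicative) recurrent update that simulates a two-state DFA for the parity language (bit strings with an even number of 1s). This is a minimal sketch of a classical automata-to-RNN construction, not taken from the reference below; all names in the code are illustrative.

```python
import numpy as np

# Second-order (multiplicative) recurrent recogniser simulating a DFA.
# Hidden state: one-hot encoding of the DFA state.
# States: index 0 = "even number of 1s so far", index 1 = "odd number of 1s so far".

# Transition matrices: W[a][q_next, q] = 1 iff reading symbol a in state q leads to q_next.
W = np.array([
    [[1, 0],   # symbol '0': state unchanged (identity)
     [0, 1]],
    [[0, 1],   # symbol '1': states swap (parity flips)
     [1, 0]],
])

def step(h, symbol):
    """One recurrent update: h_next = Heaviside(W[symbol] @ h - 0.5)."""
    pre_activation = W[symbol] @ h
    return (pre_activation > 0.5).astype(float)

def recognise(bits):
    """Return True iff the bit string contains an even number of 1s."""
    h = np.array([1.0, 0.0])      # start in the 'even' state
    for b in bits:
        h = step(h, b)
    return bool(h[0] == 1.0)      # accept iff the final state is 'even'

if __name__ == "__main__":
    for s in ["", "1", "11", "1011", "10101"]:
        print(f"{s!r:9} -> {recognise([int(c) for c in s])}")
```

Because the hidden state stays one-hot, the thresholded update reproduces the DFA transition exactly; expressivity questions of the kind studied in the reference below ask which classes of languages such recurrent updates can or cannot capture.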

There are projects focusing on:

(i) expressivity results,

(ii) learning algorithms,

(iii) benchmarks.

The specific topic will be identified together with the candidate, based on the candidate’s interests and background.

This project sits at the intersection of two fields: automata and formal languages on one side, and neural networks on the other. Applicants interested in exploring this intersection should not be discouraged if their background is mostly in only one of the two fields.

Reference: Nadezda Alexandrovna Knorozova, Alessandro Ronca: On the Expressivity of Recurrent Neural Cascades. AAAI 2024.

Pre-requisites: Some familiarity with automata and formal languages, or with neural networks.