Neural Networks as Dynamical Systems (1 ECTS, MAT/07)

Speaker: Davide Murari - University of Cambridge (UK)
  Monday, November 10, 2025 at 8:30 AM

This seminar course develops the viewpoint that many modern neural networks can be understood as dynamical systems. We begin with a compact review of the mathematical foundations of deep learning, covering the key approximation and stability results. We then show how neural networks can be interpreted as discrete or continuous dynamical systems, and why this perspective is valuable. Through this lens, we study Neural ODEs and continuous normalising flows (CNFs) for generative modelling, and introduce symplectic neural networks to discover and simulate Hamiltonian systems. Brief PyTorch demonstrations accompany the theory.

Background: Students should be familiar with the basic notions of linear algebra and probability. Some exposure to numerical methods for ODEs (e.g., Runge-Kutta methods) is helpful but not required. No prior experience with neural networks or PyTorch is assumed; basic familiarity with Python is helpful.
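To give a flavour of this interpretation, the sketch below (purely illustrative, not taken from the course materials; the names VectorField and ResNetAsEuler are placeholders) shows how a stack of residual blocks x_{k+1} = x_k + h f(x_k) can be read as the explicit Euler discretisation of the ODE x'(t) = f(x(t)) on the interval [0, 1].

```python
# Minimal PyTorch sketch: a residual network viewed as forward Euler
# integration of a learned vector field. Illustrative only.
import torch
import torch.nn as nn


class VectorField(nn.Module):
    """Learned right-hand side f(x) of the ODE x'(t) = f(x(t))."""

    def __init__(self, dim: int, width: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(), nn.Linear(width, dim)
        )

    def forward(self, x):
        return self.net(x)


class ResNetAsEuler(nn.Module):
    """Residual blocks sharing one vector field = explicit Euler with step h = 1/depth."""

    def __init__(self, dim: int, depth: int = 16):
        super().__init__()
        self.f = VectorField(dim)
        self.depth = depth
        self.h = 1.0 / depth  # step size of the discretisation

    def forward(self, x):
        # Each block is one Euler step: x_{k+1} = x_k + h * f(x_k)
        for _ in range(self.depth):
            x = x + self.h * self.f(x)
        return x


if __name__ == "__main__":
    model = ResNetAsEuler(dim=2)
    x0 = torch.randn(8, 2)   # batch of initial conditions
    xT = model(x0)           # approximate flow map at time t = 1
    print(xT.shape)          # torch.Size([8, 2])
```

Neural ODEs replace the fixed Euler loop above with a numerical ODE solver, which is one of the constructions discussed in the course.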

Schedule: 

- 10/11 Aula M 8:30-10:30

- 12/11 Aula Alfa 13:30-15:30

- 13/11 Aula T.05 10:30-12:30

Contacts: Nicola Sansonetto (nicola.sansonetto@univr.it) - Giacomo Albi (giacomo.albi@univr.it)

Programme Director
Giacomo Albi

Publication date
October 31, 2025
