Master's Degree in Data Science (2025-2026)

For the previous year (2024-2025), refer to this page.

Important Info

Material: slides, assignments, and grading are all handled via Google Classroom.
Timetable: Thursday 9:00-12:00 (Aula A5, Via Ariosto), Friday 9:00-10:00 (Aula A5, Via Ariosto).

News

General overview

The course provides a general overview of neural networks as compositions of differentiable blocks that are optimized numerically. We describe common building blocks, including convolutions, self-attention, and batch normalization, with a focus on the image, audio, and graph domains. The course combines rigorous mathematical descriptions with coding sessions in JAX, together with an overview of the current AI landscape.
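To make the idea of "differentiable blocks optimized numerically" concrete, here is a minimal sketch in JAX (a hypothetical toy example, not taken from the course material): a single linear-plus-sigmoid block trained by plain gradient descent, with the gradient computed automatically by `jax.grad`.

```python
import jax
import jax.numpy as jnp

def model(params, x):
    # One differentiable block: linear layer followed by a sigmoid.
    w, b = params
    return jax.nn.sigmoid(x @ w + b)

def loss_fn(params, x, y):
    # Binary cross-entropy, averaged over the batch.
    p = model(params, x)
    return -jnp.mean(y * jnp.log(p + 1e-7) + (1 - y) * jnp.log(1 - p + 1e-7))

# Synthetic data: label is 1 when the features sum to a positive value.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 3))
y = (x.sum(axis=1) > 0).astype(jnp.float32)

params = (jnp.zeros((3,)), jnp.array(0.0))
grad_fn = jax.jit(jax.grad(loss_fn))  # compile the gradient computation

for _ in range(100):
    g = grad_fn(params, x, y)
    # Numerical optimization: one gradient-descent step per iteration.
    params = jax.tree_util.tree_map(lambda p, gp: p - 0.1 * gp, params, g)
```

Deeper models covered in the course are built by composing many such blocks; the gradient of the whole composition is still obtained with a single `jax.grad` call.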

Material

Lab sessions implemented in JAX are shown in blue. Homework assignments and projects (mandatory) are in red. Seminars (optional) are in green.

  Date Content Material
L0 25/09 About the course Slides
L1 25/09 (Historical) introduction Slides
L2 02/10 Preliminaries Slides
L3 09/10 Supervised learning Slides
E1 10/10 Lab 1: logistic regression in JAX Notebook
L4 16/10 Fully-connected models Slides
L5 27/11 Automatic differentiation Slides
L6 23/10 Convolutional layers Slides
  30/10 Lab 2: CNNs in Keras Notebook
L7 06/11 Convolutions beyond images Slides
L8 06/11 Building deeper models Slides
E3 13/11 Lab 3: neural networks in Equinox Notebook 1, Notebook 2
L9 20/11 Transformer models Slides
L10 04/12 Recurrent models Slides
- - End-of-term homework: KV cache compression Notebook
L11 05/12 Reinforcement learning Chapter (draft)

Book: Alice’s Adventures in a Differentiable Wonderland

The course is complemented by a book which expands on most topics covered during the lectures:

  • Buy the book on Amazon (independently published to keep the price low).
  • Download the updated full draft (29/08/2025).

For the full book webpage (with arXiv version and errata list): https://sscardapane.it/alice-book/