Neural Networks for Data Science Applications

Master's Degree in Data Science (2022-2023)

For the previous year (2021-2022), refer to this page.

Important Info

Lectures for the new academic year will be held in person, in accordance with the latest regulations. Streaming does not allow remote participants to interact. The passcode for the recordings is provided on Classroom or on request.
Material (slides), assignments, and grading are all handled via Google Classroom.
Timetable (updated): Tuesday 9-11 (Room A5, with streaming in Room A6, Via Ariosto), Wednesday 8-11 (Room A5, with streaming in Room A6, Via Ariosto).
Streaming link (Zoom): Tuesday, Wednesday

News

General overview

In the course, we first provide a general set of tools and principles to understand deep networks as compositions of differentiable blocks that are optimized numerically.
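As a concrete illustration of this view (not part of the official course material), the minimal TensorFlow sketch below composes two differentiable blocks into one model and performs a single numerical gradient-descent step; all names, shapes, and data are placeholders.

    import tensorflow as tf

    # Two differentiable blocks composed into a single model.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu"),  # block 1
        tf.keras.layers.Dense(1),                      # block 2
    ])

    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
    loss_fn = tf.keras.losses.MeanSquaredError()

    # Toy data, only to make the sketch runnable.
    x = tf.random.normal((32, 4))
    y = tf.random.normal((32, 1))

    # One numerical optimization step: differentiate the loss through
    # the whole composition, then follow the gradient.
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))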

Then, we overview common building blocks, including convolutions, self-attention, and batch normalization, with a focus on the image, audio, and graph domains. Towards the end, we cover the deployment of these models (adversarial robustness, interpretability, Dockerization) and some selected state-of-the-art research topics (continual learning, self-supervised learning, …).
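For reference, this sketch (again illustrative, assuming TensorFlow 2.x) shows how the building blocks named above appear in tf.keras; layer sizes are arbitrary placeholders.

    import tensorflow as tf

    x = tf.random.normal((8, 32, 32, 3))            # a batch of toy images

    # Convolution followed by batch normalization.
    conv = tf.keras.layers.Conv2D(16, kernel_size=3, padding="same")
    bn = tf.keras.layers.BatchNormalization()
    h = bn(conv(x), training=True)                  # shape (8, 32, 32, 16)

    # Self-attention over the flattened spatial positions.
    tokens = tf.reshape(h, (8, -1, 16))             # shape (8, 1024, 16)
    attn = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=16)
    out = attn(tokens, tokens)                      # shape (8, 1024, 16)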

The course combines rigorous mathematical descriptions with many coding sessions in TensorFlow.

Material

In the table below, lectures are labeled L, lab sessions E, homework H, and seminars S. Lab sessions (mandatory) are implemented in TensorFlow; homework and the final project are mandatory; seminars are optional.

ID Date Content Material
L1 20/09 About the course [Slides] [Video]
L2 21-27/09 Preliminaries [Slides] [Video 1/2] [Video 2/2]
L3 28/09, 04/10 Linear models [Slides] [Video 1/2] [Video 2/2]
E1 05/10, 11/10 Lab session: logistic regression [Colab] [Video 1/2] [Video 2/2]
L4 12/10 Fully-connected models [Slides] [Video 1/2] [Video 2/2]
H1 14/11 Homework: Calibrating neural networks [Template] [Video] [Solution]
L5 18/10 Convolutional neural networks [Slides] [Video 1/2] [Video 2/2]
L6 25/10 Building deep neural networks [Slides] [Video 1/4] [Video 2/4] [Video 3/4] [Video 4/4]
E2 02/11 Lab session: convolutional networks [Colab] [Video 1/2] [Video 2/2]
L7 09/11 Graph neural networks [Slides] [Video 1/2] [Video 2/2]
S1 11/11 Seminar (optional): Pack your subgraphs (Beatrice Bevilacqua, Fabrizio Frasca) [Slides] [Video]
E3 16/11 Lab session: graph neural networks [Colab] [Video 1/2] [Video 2/2]
L8 29/11 Attention-based models [Slides] [Video 1/4] [Video 2/4] [Video 3/4] [Video 4/4]
L9 06/12 Modular neural networks [Slides] [Video 1/2] [Video 2/2]
H2 - Homework: Experimenting with modular neural networks [Template] [Instructions]
E4 13/12 Lab session: audio classification with transformers [Colab] [Video 1/2] [Video 2/2]
S2 16/12 Seminar (optional): Integrating Combinatorial Solvers and Discrete Distributions in Neural Models (Pasquale Minervini) [Slides] [Video]
S3 17/01 Seminar (optional): Transformers learn in-context by gradient descent (Johannes von Oswald) [Slides] [Video]
S4 09/06 Seminar (optional): Modular deep learning (Edoardo Ponti) [Slides] [Video]
S5 16/06 Seminar (optional): Data representations in deep generative modelling (Kamil Deja) [Slides] [Video]

Exam

  • 1 homework (5 points), 1 final project (10 points), and an oral examination (15 points), for a total of 30 points.
  • If the homework is not completed during the course, it can be made up as part of the final project. The homework mark may be kept for the entire academic year.
  • Honors (lode) is awarded only to students whose project and oral examination are well above average.
  • Optional exercises and reading materials are provided during the course.

Reading material

The main reference book for the course is Dive into Deep Learning. The corresponding sections of the book are listed at the end of each slide deck.