Neural Networks for Data Science Applications
Master's Degree in Data Science (2021-2022)
Important Info
- Material: slides, assignments, and grading are handled via Google Classroom.
- Timetable: Tuesday, 10 AM-12 PM, and Wednesday, 8-11 AM (see the official timetable).
- In-person attendance: Via Ariosto 25, Room A4 (Tuesday), Room A2 (Wednesday).
- Remote attendance (Zoom): Tuesday and Wednesday.
General overview
In the course, we first provide a general set of tools and principles for understanding deep networks as compositions of differentiable blocks that are optimized numerically.
We then cover common building blocks, including convolutions, self-attention, and batch normalization, with a focus on the image, audio, and graph domains. Towards the end, we discuss the deployment of these models (adversarial robustness, interpretability, Dockerization) and a selection of state-of-the-art research topics (continual learning, self-supervised learning, …).
The course combines rigorous mathematical descriptions with many coding sessions in TensorFlow.
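To make this concrete, here is a minimal, self-contained sketch (not taken from the course material) of a tiny network built as a composition of two differentiable blocks and optimized numerically with TensorFlow's automatic differentiation; all names, shapes, and values are placeholders chosen only for illustration.

```python
import tensorflow as tf

# Two differentiable blocks (affine map + ReLU, then another affine map),
# composed into a tiny network whose parameters are optimized numerically.
w1 = tf.Variable(tf.random.normal((3, 4)))
b1 = tf.Variable(tf.zeros((4,)))
w2 = tf.Variable(tf.random.normal((4, 1)))
b2 = tf.Variable(tf.zeros((1,)))

x = tf.random.normal((8, 3))   # a small batch of vectorial inputs
y = tf.random.normal((8, 1))   # corresponding targets

optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(10):
    with tf.GradientTape() as tape:
        h = tf.nn.relu(x @ w1 + b1)              # first block
        y_hat = h @ w2 + b2                      # second block
        loss = tf.reduce_mean((y_hat - y) ** 2)  # mean squared error
    grads = tape.gradient(loss, [w1, b1, w2, b2])        # reverse-mode autodiff
    optimizer.apply_gradients(zip(grads, [w1, b1, w2, b2]))
```

The lectures and labs develop each of these ingredients (differentiable blocks, autodiff, numerical optimization) in full detail.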
Slides
See also the suggested reading materials at the end of each topic.
| # | Date | Content | Material |
|---|---|---|---|
S1 | 21/09/2021 | About the course | Slides Video |
S2 | 21-22/09/2021 | Preliminaries | Slides Video (part 1) Video (part 2) Video (part 3) Chapter 2 of the book |
S3 | 29/09/2021, 05/10/2021 | Supervised learning on vectorial data | Slides Video (part 1) Video (part 2) |
S4 | 06/10/2021 | Fully-connected neural networks | Slides Video (part 1) Video (part 2) Video (part 3) |
S5 | 20/10/2021 | Convolutional neural networks | Slides Video (part 1) Video (part 2) |
S6 | 26/10/2021 | Tips and tricks for training deep networks | Slides Video (part 1) Video (part 2) |
S7 | 09/11/2021 | Graph neural networks | Slides Video (part 1) Video (part 2) |
S8 | 23-24/11/2021 | Attention-based models | Slides Video (part 1) Video (part 2) Video (part 3) Video (part 4) |
S9 | 30/11/2021 | Multi-task models | Slides Video (part 1) Video (part 2) |
Lab sessions
The lab sessions are implemented in TensorFlow; a separate set of exercises can also be found in the book itself. A small illustrative sketch of the kind of code involved is given after the table.
| # | Date | Content | Prerequisites | Material |
|---|---|---|---|---|
L1 | 28/09/2021 | Autodiff in TensorFlow | S2 (Preliminaries) | Colab Video |
L2 | 12/10/2021 | tf.keras and tf.data | S4 (Fully-connected models) | Colab Video (part 1) Video (part 2) |
L3 | 02/11/2021 | Building deep convolutional neural networks | S6 (deep CNNs) | Colab Video (part 1) Video (part 2) |
L4 | 16/11/2021 | Implementing graph neural networks | S7 (graph NNs) | Colab Video (part 1) Video (part 2) |
L5 | 21/12/2021 | Audio classification with fine-tuning | S9 (multi-task models) | Colab Video (part 1) Video (part 2) Video (part 3) |
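As a flavour of the patterns covered in the labs (here, the tf.keras and tf.data APIs from L2), the sketch below trains a small fully-connected classifier; the random data, layer sizes, and hyperparameters are placeholders and are not taken from the actual lab notebooks.

```python
import tensorflow as tf

# Placeholder data standing in for a real dataset.
features = tf.random.normal((100, 4))
labels = tf.random.uniform((100,), maxval=2, dtype=tf.int32)

# tf.data: an input pipeline that shuffles and batches the examples.
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).shuffle(100).batch(16)

# tf.keras: a small fully-connected classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2),  # logits for two classes
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(dataset, epochs=3)
```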
Homeworks
Mandatory exercises for admission to the exam. An illustrative (non-solution) sketch related to H1 follows the table.
| # | Deadline | Content | Material |
|---|---|---|---|
H1 | 07/11/2021 | Implementing custom activation functions | Instructions (Classroom) Video (instructions) Template Solution Correction (video) |
H2 | See instructions | Experimenting with continual learning | Instructions (Classroom) Video (instructions) |
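The actual specifications for H1 and H2 are on Classroom. Purely to illustrate the general idea behind H1, a custom activation function in TensorFlow can be written as a Keras layer with a trainable parameter; the scaled tanh below is only an example, not the assigned function.

```python
import tensorflow as tf

class ScaledTanh(tf.keras.layers.Layer):
    """Illustrative custom activation: tanh with a trainable scale (not the assigned function)."""

    def build(self, input_shape):
        # A single trainable scalar shared across all units.
        self.alpha = self.add_weight(name="alpha", shape=(), initializer="ones")

    def call(self, inputs):
        return tf.math.tanh(self.alpha * inputs)

# The custom activation can be dropped into a model like any other layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16),
    ScaledTanh(),
    tf.keras.layers.Dense(1),
])
```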
Exercises
These are optional, self-graded exercises that extend and clarify certain aspects of the course or the lab sessions. A short illustrative sketch related to E1 follows the table.
| # | Description | Prerequisites | Material |
|---|---|---|---|
E1 | Implementing momentum and higher-order derivatives | L1 (Autodiff) | Exercise (Classroom) Solution |
E2 | Advanced concepts from tf.keras | L2 (Fully-connected models) | Exercise (Classroom) Solution |
E3 | Caching, residual connections, and Weight & Biases | L3 (Deep CNNs) | Exercise (Classroom) Solution |
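As a reminder of the pattern behind E1 (the official solution is linked above, and the exercise also covers momentum), higher-order derivatives in TensorFlow can be obtained by nesting gradient tapes; the function below is only an example.

```python
import tensorflow as tf

x = tf.Variable(3.0)

# Nested tapes: the outer tape differentiates the gradient computed by the inner one.
with tf.GradientTape() as outer_tape:
    with tf.GradientTape() as inner_tape:
        y = x ** 3                       # y = x^3
    dy_dx = inner_tape.gradient(y, x)    # 3 * x^2 = 27
d2y_dx2 = outer_tape.gradient(dy_dx, x)  # 6 * x  = 18

print(dy_dx.numpy(), d2y_dx2.numpy())    # 27.0 18.0
```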
Exam
- 1 homework (5 points), 1 final project (10 points), and an oral examination (15 points).
- Students who do not complete the homework during the course can make it up as part of the final project.
- Lode (honors) is awarded only to students whose final project and oral examination are well above average.
- Optional exercises and reading materials are provided during the course.
Reading material
The main reference book for the course is Dive into Deep Learning. Each set of slides will mention the corresponding sections in the book.