Thesis Proposals

Below you can find an overview of topics under investigation in our research lab. Many of them intersect, so do not feel restricted to a single area.

⚠️ A note on industrial theses & internships: for information on external theses or internships, please contact me directly.

See a list of past theses supervised (or co-supervised) by me

🔨 Adversarial attacks and interpretability

We explore the robustness and interpretability of novel neural architectures, including graph-based networks and self-supervised models, as well as their behavior in emerging domains such as audio processing and medical applications. This is fundamental for the safe deployment of these methods in many practical scenarios.
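
As a concrete entry point, consider the classic fast gradient sign method (FGSM), which shows how a small, targeted perturbation can fool a trained classifier. The sketch below is illustrative only: it assumes a PyTorch image classifier `model` with inputs in [0, 1].

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=0.03):
    """Fast Gradient Sign Method: perturb x one step along the sign
    of the loss gradient to (hopefully) flip the model's prediction."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()  # keep pixels in a valid range
```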

🎧 Audio analysis and processing

Deep learning for audio (e.g., speech, music, recordings, …) is challenging but highly rewarding. Self-supervised models for audio are opening new research directions, especially for low-resource languages such as Italian.
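
To give a flavor of the preprocessing involved, here is a minimal torchaudio sketch that turns a waveform into the log-mel features most deep audio models consume (the file name is a placeholder):

```python
import torch
import torchaudio

# Load a waveform and compute a log-mel spectrogram, the typical
# front-end representation for deep audio models.
waveform, sample_rate = torchaudio.load("speech_sample.wav")  # placeholder path
mel = torchaudio.transforms.MelSpectrogram(sample_rate=sample_rate, n_mels=80)
log_mel = torch.log(mel(waveform) + 1e-6)  # (channels, n_mels, time_frames)
```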

🤔 Continual learning for neural networks

Continual learning is the ability to learn new information without forgetting what was learned before, and it is a key goal of the recent literature. We explore new algorithms for continual learning under low memory budgets. We are part of the team behind Avalanche, a PyTorch-based library for continual learning.
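
A common baseline under a tight memory budget is experience replay with a small buffer of past examples. Below is a minimal reservoir-sampling buffer in plain PyTorch; it is only an illustration (Avalanche provides ready-made replay strategies):

```python
import random
import torch

class ReservoirBuffer:
    """Fixed-size replay memory filled via reservoir sampling, so every
    example seen so far has the same probability of being stored."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0

    def add(self, x, y):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            idx = random.randrange(self.n_seen)
            if idx < self.capacity:
                self.data[idx] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)
```

During training, batches from the current task are interleaved with batches sampled from the buffer, which mitigates forgetting at the cost of a small, constant memory footprint.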

🖧 Distributed deep networks

Distributing the training and inference of neural networks is fundamental in, e.g., IoT applications. We are especially interested in fully-distributed training of neural models, and in models that can be naturally split across tiers of computation.
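
To make "fully-distributed" concrete: instead of relying on a central server, each node can periodically average its weights with those of its neighbors (gossip averaging). The sketch below simulates one synchronous gossip step in-process; a real deployment would exchange tensors over the network (e.g., via torch.distributed).

```python
import copy
import torch
import torch.nn as nn

def gossip_average(models):
    """One gossip step on a ring topology: each node averages its
    parameters with those of its two neighbors."""
    snapshots = [copy.deepcopy(m.state_dict()) for m in models]
    n = len(models)
    for i, model in enumerate(models):
        neighbors = [snapshots[i], snapshots[(i - 1) % n], snapshots[(i + 1) % n]]
        averaged = {
            key: torch.stack([s[key] for s in neighbors]).mean(dim=0)
            for key in snapshots[i]
        }
        model.load_state_dict(averaged)

# Example: three "nodes" holding copies of the same architecture.
nodes = [nn.Linear(10, 2) for _ in range(3)]
gossip_average(nodes)
```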

💉 Deep learning applied to medical domains

Medical applications are especially well suited to deep learning, thanks to the abundance of high-dimensional data of multiple types (e.g., cell scans, 3D volumes, …). A number of non-trivial challenges and projects are open, ranging from cytology to neurology.

📊 Graph neural models

Can we extend deep networks to handle graph-structured data (e.g., social networks)? Many architectures are now available for this, and we explore them from several angles, including cutting-edge applications and the design of new models and algorithms.
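
The core idea behind most of these architectures is message passing: each node updates its features by aggregating those of its neighbors. Below is a minimal dense sketch in plain PyTorch; libraries such as PyTorch Geometric provide efficient sparse implementations.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution step: average each node's neighborhood
    (including itself), then apply a shared linear transform."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        a = adj + torch.eye(adj.size(0))     # add self-loops
        a = a / a.sum(dim=1, keepdim=True)   # row-normalize
        return torch.relu(self.linear(a @ x))

# 5 nodes with 16 features each, plus a random adjacency matrix.
x, adj = torch.randn(5, 16), (torch.rand(5, 5) > 0.5).float()
out = GCNLayer(16, 32)(x, adj)  # (5, 32)
```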

🔬 Quaternion-valued and hyper-complex neural networks

Hyper-complex neural networks extend deep networks beyond classical real-valued algebra. They are especially suitable for reducing the parameter count of networks in high-dimensional domains. This is a very recent and open-ended research area (see also our PyTorch-based hTorch library).
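
To see where the savings come from, consider a quaternion-valued linear layer: a single weight quaternion acts on the four components of its input through the Hamilton product, using roughly four times fewer parameters than a real-valued layer of the same width. A minimal plain-PyTorch sketch (hTorch offers complete, ready-to-use layers):

```python
import torch
import torch.nn as nn

class QuaternionLinear(nn.Module):
    """Linear layer with quaternion-valued weights. Sizes are per
    quaternion component, so it maps 4*in_features real inputs to
    4*out_features real outputs with only 4 weight matrices (a real
    layer of the same width would need 16)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        shape = (in_features, out_features)
        self.r = nn.Parameter(torch.randn(shape) * 0.1)
        self.i = nn.Parameter(torch.randn(shape) * 0.1)
        self.j = nn.Parameter(torch.randn(shape) * 0.1)
        self.k = nn.Parameter(torch.randn(shape) * 0.1)

    def forward(self, x):
        # Split the input into its four quaternion components.
        xr, xi, xj, xk = x.chunk(4, dim=-1)
        # Hamilton product between input and weight quaternions.
        out_r = xr @ self.r - xi @ self.i - xj @ self.j - xk @ self.k
        out_i = xr @ self.i + xi @ self.r + xj @ self.k - xk @ self.j
        out_j = xr @ self.j - xi @ self.k + xj @ self.r + xk @ self.i
        out_k = xr @ self.k + xi @ self.j - xj @ self.i + xk @ self.r
        return torch.cat([out_r, out_i, out_j, out_k], dim=-1)

layer = QuaternionLinear(8, 16)    # maps 32 real inputs to 64 real outputs
y = layer(torch.randn(2, 32))      # (2, 64)
```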

📈 Self-supervised learning

Can we train networks without manual labels? Self-supervised learning replaces human annotation with auxiliary objectives extracted from the data itself, which can then bootstrap supervised learning. We would like to investigate self-supervised learning for complex tasks (e.g., handwritten manuscripts, speech recognition) and new models for self-supervised learning.
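
A classic example of such an auxiliary objective is rotation prediction: each unlabeled image is rotated by a multiple of 90 degrees, and the network is trained to guess the rotation, so the labels come for free. A minimal sketch, where `model` stands for any 4-way PyTorch classifier:

```python
import torch
import torch.nn.functional as F

def rotation_pretext_batch(images):
    """Rotate each image by 0/90/180/270 degrees and use the
    rotation index as a free (self-supervised) label."""
    rotations = [torch.rot90(images, k, dims=(-2, -1)) for k in range(4)]
    x = torch.cat(rotations)                               # (4B, C, H, W)
    y = torch.arange(4).repeat_interleave(images.size(0))  # (4B,)
    return x, y

def pretext_step(model, optimizer, images):
    """One self-supervised training step on an unlabeled batch."""
    x, y = rotation_pretext_batch(images)
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```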