
MA/MSc Internship for EUGLOH program

Title: Deep learning and tensor decomposition for the analysis of patterns in signals and multimodal imaging. Application to neuropathies

Keywords: machine learning, deep tech, tensor decomposition, time series, neuroimaging

Internship Duration:


Head of the hosting team: Nazim Agoulmine


Address of the host laboratory:
IBISC EA4526
Team Vincent VIGNERON
40 rue du Pelvoux
91020 Courcouronnes France

Supervisor: Vincent VIGNERON
E-mail: vincent.vigneron@univ-evry.fr
Phone: +33663568760




Internship description:

The exponential development of AI and neural networks is renewing the study of time series, from both a fundamental and an applied point of view. For multivariate signals in particular, a tensor can be a more suitable representation than a matrix, because it preserves the structure of the data and therefore avoids a loss of information.
Machine learning on tensor data is classically carried out via linear tensor decompositions, for example CPD/PARAFAC or Tucker [Sid17]. Recently, tensor representations have been integrated into neural networks and have enabled significant advances in deep learning, particularly for images, by reducing the number of parameters to be estimated.
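To illustrate the kind of linear decomposition mentioned above, here is a minimal sketch of a CPD/PARAFAC fit by alternating least squares in NumPy. The helper names (`unfold`, `khatri_rao`, `cp_als`) are ours, written for illustration only; in practice a dedicated library such as TensorLy would be used.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining axes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit a rank-`rank` CPD of a 3-way tensor by alternating least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(n_iter):
        for n in range(3):
            others = [factors[m] for m in range(3) if m != n]
            Z = khatri_rao(others[0], others[1])          # T_(n) ≈ F_n @ Z.T
            factors[n] = np.linalg.lstsq(Z, unfold(T, n).T, rcond=None)[0].T
    return factors

def cp_to_tensor(factors):
    """Rebuild the tensor from its CP factors."""
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Recover a noiseless rank-2 tensor from random initialization
rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((s, 2)) for s in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
T_hat = cp_to_tensor(cp_als(T, rank=2))
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

On an exact low-rank, noiseless tensor such as this one, ALS typically converges to a near-exact reconstruction within a few hundred sweeps.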
To increase the identifiability and interpretability of deep neural models, constraints are added, for example non-negativity, a classic constraint in matrix and tensor learning [Kol08]. In deep learning, variational autoencoders have been interpreted in a non-negative matrix factorization framework, but also as a CPD tensor factorization, and even as a non-negative Tucker model [Mar22]. Autoencoders belong to the family of generative models. They make it possible to discover latent spaces by learning an approximation of the identity mapping x ≈ f(x). Their latent space can be structured in tensor form, which yields very good performance [Pan21]. It has been shown that this offers a compromise, in terms of performance and interpretability, between a simple unconstrained autoencoder and a non-negative Tucker model, for different tasks (segmentation, pattern detection). However, this preliminary work leaves significant room for progress, and the properties of this type of hybrid model are still poorly understood.
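The non-negativity constraint mentioned above can be illustrated in the matrix case. Below is a minimal toy sketch of non-negative matrix factorization (NMF) with Lee-Seung multiplicative updates, written by us for illustration (it is not the internship codebase); the updates preserve non-negativity because every term in them is non-negative.

```python
import numpy as np

def nmf(X, rank, n_iter=500, seed=0, eps=1e-10):
    """Factor X ≈ W @ H with W, H >= 0 via multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # elementwise update, keeps H >= 0
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # elementwise update, keeps W >= 0
    return W, H

# Approximate a non-negative matrix of exact rank 3
rng = np.random.default_rng(1)
X = rng.random((20, 3)) @ rng.random((3, 15))   # non-negative, rank 3
W, H = nmf(X, rank=3)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

Multiplicative updates are slow but simple; they make explicit the kind of constrained factorization that the non-negative Tucker and autoencoder hybrids discussed above generalize to tensors.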

Work program
First of all, we will establish a benchmark of the different approaches. Then we will modify the constraints that structure the tensor decomposition in a hybrid autoencoder/Tucker-decomposition model. We will evaluate and compare the characteristics of several autoencoder architectures. The proposed algorithms will be tested on data from several application fields currently studied in our respective laboratories: power flows on an electricity transmission network; calibration of pollutant sensors; prediction of sports performance; and segmentation of brain tumors. This work could be continued in a PhD thesis (1) by comparing the performance of the representation in the temporal, time-frequency, and time-scale domains, (2) by applying these tensor decompositions to Boltzmann machines (deep belief networks and diffusion models), (3) by studying the influence of the network structure of the underlying phenomenon on the signal representation. Industrial collaborations are possible.
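As a natural baseline for the autoencoder/Tucker comparison in the work program, a Tucker decomposition can be computed with the classical truncated higher-order SVD (HOSVD). This is a minimal NumPy sketch for 3-way tensors; `hosvd` and `tucker_to_tensor` are our own helper names.

```python
import numpy as np

def hosvd(T, ranks):
    """Truncated HOSVD of a 3-way tensor: T ≈ core ×1 U0 ×2 U1 ×3 U2."""
    factors = []
    for n in range(3):
        # Left singular vectors of the mode-n unfolding
        Tn = np.moveaxis(T, n, 0).reshape(T.shape[n], -1)
        U, _, _ = np.linalg.svd(Tn, full_matrices=False)
        factors.append(U[:, :ranks[n]])
    U0, U1, U2 = factors
    # Project onto the factor subspaces to get the core tensor
    core = np.einsum('ijk,ia,jb,kc->abc', T, U0, U1, U2)
    return core, factors

def tucker_to_tensor(core, factors):
    """Rebuild the tensor from its Tucker core and factor matrices."""
    U0, U1, U2 = factors
    return np.einsum('abc,ia,jb,kc->ijk', core, U0, U1, U2)

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))
core, factors = hosvd(T, ranks=(3, 4, 5))   # full ranks: exact reconstruction
T_hat = tucker_to_tensor(core, factors)
```

With full multilinear ranks the reconstruction is exact; truncating the ranks gives the compressed, interpretable latent structure that the hybrid models above impose inside a neural network.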

Techniques used during the internship:

Deep neural network models, variational autoencoders (VAEs) with PyTorch, and statistical tools

Programming skills: Python or C/C++. Prior experience with TensorFlow and PyTorch would be a plus.

Bibliography:

[Kol08] Kolda and Bader, "Tensor Decompositions and Applications", SIAM Review 51(3), 2009, pp. 455-500.
[Sid17] Sidiropoulos et al., "Tensor Decomposition for Signal Processing and Machine Learning", IEEE Transactions on Signal Processing, 2017.
[Pan21] Panagakis et al., "Tensor Methods in Computer Vision and Deep Learning", Proceedings of the IEEE, 2021. https://doi.org/10.1109/JPROC.2021.3074329
[Mar22] Marmoret, "Unsupervised Machine Learning Paradigms for the Representation of Music Similarity and Structure", PhD thesis, IMT Atlantique, 2022.


Possibility of PhD : Yes

Remarks concerning the PhD position: funds provided by the doctoral school (competition, June 2024)

Research field(s) of interest to the hosting team:
Language(s) spoken in the host laboratory: French/English