Conference paper. Year: 2024

Structure-Preserving Transformers for Sequences of SPD Matrices

Abstract

In recent years, Transformer-based self-attention mechanisms have been successfully applied to the analysis of a variety of context-reliant data types, from text to images and beyond, including data from non-Euclidean geometries. In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite (SPD) matrices while preserving their Riemannian geometry throughout the analysis. We apply our method to automatic sleep staging on time series of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
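As a rough illustration of the kind of data this abstract describes, the sketch below (an assumption for illustration, not the authors' code) builds a sequence of SPD covariance matrices from EEG-like epochs and maps each one to a Euclidean tangent space via the Log-Euclidean matrix logarithm, a standard structure-preserving step in Riemannian EEG pipelines. The function names `epoch_to_spd` and `log_map`, and all parameter values, are hypothetical.

```python
import numpy as np
from scipy.linalg import logm

def epoch_to_spd(epoch: np.ndarray, shrinkage: float = 1e-6) -> np.ndarray:
    """Estimate a regularized covariance matrix from one EEG epoch.

    epoch: (n_channels, n_samples) array of band-filtered EEG.
    A small shrinkage toward the identity keeps the estimate
    strictly positive definite even for short epochs.
    """
    cov = np.cov(epoch)  # rows are channels (variables) by default
    n = cov.shape[0]
    cov += shrinkage * (np.trace(cov) / n) * np.eye(n)
    return cov

def log_map(spd: np.ndarray) -> np.ndarray:
    """Log-Euclidean map: send an SPD matrix to the tangent space at
    the identity via the matrix logarithm (real and symmetric here)."""
    return np.real(logm(spd))

# Toy usage: a short sequence of covariance matrices from random "EEG" epochs.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((5, 8, 256))   # 5 epochs, 8 channels, 256 samples
sequence = np.stack([epoch_to_spd(e) for e in epochs])

# Check the SPD property: all eigenvalues strictly positive.
assert all(np.linalg.eigvalsh(m).min() > 0 for m in sequence)

tangent_vectors = np.stack([log_map(m) for m in sequence])
print(tangent_vectors.shape)  # (5, 8, 8): a sequence ready for a sequence model
```

The paper's contribution is an attention mechanism that keeps the matrices on the SPD manifold throughout the network; the log-map shown here is only the conventional baseline way of flattening such data, included to make the input format concrete.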
Main file: EUSIPCO_2024_paper_Mathieu_Seraphim_et_al_.pdf (384.46 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04638595, version 1 (08-07-2024)


Identifiers

HAL Id: hal-04638595

Cite

Mathieu Seraphim, Alexis Lechervy, Florian Yger, Luc Brun, Olivier Etard. Structure-Preserving Transformers for Sequences of SPD Matrices. European Signal Processing Conference (EUSIPCO) 2024, European Association for Signal Processing (EURASIP), Aug 2024, Lyon, France. pp. 1451-1455. ⟨hal-04638595⟩
