Transformers for 1D Signals in Parkinson's Disease Detection from Gait
International Conference on Pattern Recognition (ICPR 2022) - Oral
@inproceedings{nguyen2022transformerssignals,
author = {Nguyen, Duc Minh Dimitri and Miah, Mehdi and Bilodeau, Guillaume-Alexandre and Bouachir, Wassim},
title = {{Transformers} for {1D} {Signals} in {Parkinson's} {Disease} {Detection} from {Gait}},
year = {2022},
booktitle = {International {Conference} on {Pattern} {Recognition} ({ICPR})}
}
Abstract

This paper focuses on the detection of Parkinson's disease based on the analysis of a patient's gait. The growing popularity and success of Transformer networks in natural language processing and image recognition motivated us to develop a novel method for this problem based on automatic feature extraction via Transformers. The use of Transformers on 1D signals is not yet widespread, but we show in this paper that they are effective in extracting relevant features from such signals. As Transformers require a lot of memory, we decoupled temporal and spatial information to make the model smaller. Our architecture uses temporal Transformers, dimension reduction layers, a spatial Transformer, two fully connected layers, and an output layer for the final prediction. Our model outperforms the current state-of-the-art algorithm with 95.2% accuracy in distinguishing a Parkinsonian patient from a healthy one on the Physionet dataset. A key finding of this work is that Transformers yield greater stability in results across runs.
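To make the decoupled temporal/spatial design described in the abstract more concrete, below is a minimal PyTorch sketch of such an architecture. It is not the authors' exact implementation: the hyperparameters (model width, reduced dimension, number of heads and layers, hidden sizes), the per-sensor temporal encoding, the mean-pooling over time, and the assumed input of 18 vertical ground reaction force (VGRF) channels from the Physionet gait recordings are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class GaitTransformerSketch(nn.Module):
    """Illustrative sketch: temporal Transformers per sensor, a dimension
    reduction layer, a spatial Transformer across sensors, then two fully
    connected layers and an output layer. Hyperparameters are assumptions."""

    def __init__(self, n_sensors=18, d_model=64, reduced_dim=16,
                 n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)  # lift each scalar sample to d_model
        temporal_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.temporal_encoder = nn.TransformerEncoder(temporal_layer, num_layers=n_layers)
        self.reduce = nn.Linear(d_model, reduced_dim)  # dimension reduction layer
        spatial_layer = nn.TransformerEncoderLayer(
            d_model=reduced_dim, nhead=4, batch_first=True)
        self.spatial_encoder = nn.TransformerEncoder(spatial_layer, num_layers=1)
        self.fc1 = nn.Linear(n_sensors * reduced_dim, 128)
        self.fc2 = nn.Linear(128, 32)
        self.out = nn.Linear(32, n_classes)

    def forward(self, x):
        # x: (batch, n_sensors, seq_len) 1D gait signals
        b, s, t = x.shape
        x = x.reshape(b * s, t, 1)
        x = self.input_proj(x)            # (b*s, t, d_model)
        x = self.temporal_encoder(x)      # temporal attention within each sensor signal
        x = x.mean(dim=1)                 # pool over time -> (b*s, d_model)
        x = self.reduce(x)                # (b*s, reduced_dim)
        x = x.reshape(b, s, -1)           # regroup sensors: (b, n_sensors, reduced_dim)
        x = self.spatial_encoder(x)       # attention across sensors
        x = x.reshape(b, -1)
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        return self.out(x)                # logits: healthy vs. Parkinsonian


if __name__ == "__main__":
    model = GaitTransformerSketch()
    vgrf = torch.randn(8, 18, 100)  # 8 gait segments, 18 assumed VGRF channels, 100 time steps
    print(model(vgrf).shape)        # torch.Size([8, 2])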