A Low-Complexity Combined Encoder-LSTM-Attention Networks for EEG-based Depression Detection
Blog Article
Despite the high performance of existing state-of-the-art deep learning models for depression detection using electroencephalography (EEG), they incur a heavy computational burden. In this paper, we propose an efficient model consisting of a cascade of an encoder, long short-term memory (LSTM), and attention mechanism networks. The encoder compresses the data into a lower-dimensional latent space. The LSTM models the temporal variations in brain rhythms. The attention mechanism addresses the information loss caused by compressing a sequence into a single fixed representation in sequence-to-sequence models, and efficiently leverages parallelism.
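To make the cascade concrete, here is a minimal PyTorch sketch of an encoder-LSTM-attention pipeline of the kind described above. All layer sizes, the EEG channel count, and the window length are hypothetical placeholders, not the paper's actual configuration.

```python
# Hypothetical sketch of an encoder -> LSTM -> attention cascade for EEG windows.
import torch
import torch.nn as nn


class EncoderLSTMAttention(nn.Module):
    def __init__(self, n_channels=19, latent_dim=8, hidden_dim=16, n_classes=2):
        super().__init__()
        # Encoder: compresses each EEG time step into a lower-dimensional latent space.
        self.encoder = nn.Sequential(
            nn.Linear(n_channels, latent_dim),
            nn.ReLU(),
        )
        # LSTM: models temporal variations (brain rhythms) in the latent sequence.
        self.lstm = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        # Additive attention: scores every time step so the classifier is not
        # limited to the LSTM's final compressed state.
        self.attn_score = nn.Linear(hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, x):
        # x: (batch, time, channels) raw EEG window
        z = self.encoder(x)                                   # (batch, time, latent_dim)
        h, _ = self.lstm(z)                                   # (batch, time, hidden_dim)
        weights = torch.softmax(self.attn_score(h), dim=1)    # (batch, time, 1)
        context = (weights * h).sum(dim=1)                    # (batch, hidden_dim)
        return self.classifier(context)                       # (batch, n_classes)


if __name__ == "__main__":
    model = EncoderLSTMAttention()
    dummy = torch.randn(4, 256, 19)   # 4 windows, 256 samples, 19 channels
    print(model(dummy).shape)         # torch.Size([4, 2])
```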
Compared with recent state-of-the-art models, our proposed depression detection model shows better performance and efficiency, with a validation accuracy of 99.57% in the subject-dependent experiment and a testing accuracy of 84.93% in the subject-independent experiment, using a total of 4,355 parameters. The proposed model achieves a 99.65% reduction in complexity compared with state-of-the-art EEG-based depression detection models.
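Parameter budgets like this can be verified directly by summing trainable tensors. The snippet below builds on the hypothetical sketch above; the 4,355 figure is the paper's reported total for the authors' model, not the count produced by this illustrative sketch.

```python
# Count a PyTorch model's trainable parameters (applied to the sketch above).
def count_parameters(model: nn.Module) -> int:
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

print(f"{count_parameters(EncoderLSTMAttention()):,} trainable parameters")
```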
The results of this study indicate the effectiveness of the proposed model design and the usefulness of the combined encoder, LSTM, and attention modules. These modules mitigate the computational load, which is vital for future research on multi-tasking mental health monitoring using AI-enabled EEG wearables.