BCI for Mobile Devices: Time–Frequency and Representation Learning Analysis of Mobile Gesture Tasks


Yılmaz Ç. M., Ulu A., Demirbaş G.

2025 IEEE International Conference on Signals and Systems (ICSigSys), Adiwerna, Indonesia, 06 November 2025, pp. 27-33, (Full Text Paper)

  • Publication Type: Conference Paper / Full Text Paper
  • DOI: 10.1109/icsigsys67277.2025.11269155
  • City of Publication: Adiwerna
  • Country of Publication: Indonesia
  • Pages: pp. 27-33
  • Affiliated with Karadeniz Technical University: Yes

Abstract

Mobile devices are central to daily life, yet interaction with them remains largely limited to touch and voice, which can be impractical in restricted-mobility scenarios. Brain–computer interfaces (BCIs) offer a promising alternative by enabling device control through neural activity. Although motor imagery (MI) BCIs have been studied extensively, MI signals for mobile-specific gestures remain relatively underexplored. This study addresses that gap by investigating the effectiveness of combining time–frequency transformations with deep learning to classify MI of mobile gestures. Using the MI-BMPI dataset, which comprises EEG recordings of participants imagining tapping and swiping and is among the few publicly available datasets of its kind, we applied the Short-Time Fourier Transform, Continuous Wavelet Transform, Stockwell Transform, and Hilbert–Huang Transform as inputs to architectures including ResNeSt-50d, HRNet-W18, ConvNeXt-Base, DeiT-Base, Swin Transformer-Tiny, and RegNetX-002. In classifying the tapping and swiping MI tasks, performance across subjects ranged from 0.61 to 0.947 when averaged over five random seeds, while subject-specific models achieved results between 0.708 and 0.975. These promising outcomes are consistent with the other evaluation metrics. Overall, this work provides the first systematic evaluation of mobile-gesture MI-BCIs, demonstrating that time–frequency representations coupled with deep learning enable robust EEG-based mobile interaction and laying the groundwork for future neuroadaptive technologies.
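To make the pipeline described above concrete, the sketch below shows the first stage: turning a single-channel EEG epoch into a time–frequency image with the Short-Time Fourier Transform, one of the four transforms the paper evaluates. The sampling rate, epoch length, window size, and synthetic signal are illustrative assumptions, not parameters taken from the MI-BMPI dataset or the paper itself.

```python
import numpy as np
from scipy.signal import stft

# Assumed acquisition parameters (not from the paper): 250 Hz sampling,
# 4-second motor-imagery epoch.
fs = 250
t = np.arange(0, 4.0, 1 / fs)

# Synthetic stand-in for one EEG channel: a 10 Hz mu-band oscillation
# plus Gaussian noise.
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# STFT with a 1-second window and 50% overlap; the magnitude of the
# complex result is a frequency-by-time image.
f, tau, Z = stft(x, fs=fs, nperseg=fs, noverlap=fs // 2)
spectrogram = np.abs(Z)

print(spectrogram.shape)  # (frequency bins, time frames)
```

In the study's setting, such per-channel images would then be scaled, stacked, and fed to an image backbone such as ResNeSt-50d or Swin Transformer-Tiny; the CWT, Stockwell, and Hilbert–Huang transforms produce analogous 2-D representations from the same epochs.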