Enhancement of SSVEPs Classification in BCI-based Wearable Instrumentation Through Machine Learning Techniques

Apicella, Andrea; Arpaia, Pasquale; Benedetto, Egidio De; Donato, Nicola; Duraccio, Luigi; Giugliano, Salvatore; Prevete, Roberto
2022

Abstract

This work addresses the adoption of Machine Learning classifiers and Convolutional Neural Networks to improve the performance of highly wearable, single-channel instrumentation for Brain-Computer Interfaces. The proposed measurement system is based on the classification of Steady-State Visually Evoked Potentials (SSVEPs). In particular, Head-Mounted Displays for Augmented Reality are used to generate and display the flickering stimuli for SSVEP elicitation. Four experiments were conducted, each employing a different Head-Mounted Display. For each experiment, two different algorithms were applied and compared with state-of-the-art techniques. Furthermore, the impact of different Augmented Reality technologies on the elicitation and classification of SSVEPs was also explored. The experimental metrological characterization demonstrates (i) that the proposed Machine Learning-based processing strategies provide a significant enhancement of the SSVEP classification accuracy with respect to the state of the art, and (ii) that choosing an adequate Head-Mounted Display is crucial for obtaining acceptable performance. Finally, it is also shown that the adoption of inter-subjective validation strategies such as Leave-One-Subject-Out Cross Validation successfully leads to an increase in the inter-individual 1-σ reproducibility; this, in turn, points toward an easier development of ready-to-use systems.
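
The abstract cites Leave-One-Subject-Out Cross Validation as the inter-subjective validation strategy. As a minimal illustrative sketch (not the authors' implementation), assuming pre-extracted single-channel SSVEP feature vectors X, class labels y, and a per-trial array of subject identifiers, a LOSO loop could be written with scikit-learn as follows; the RBF SVM is only a placeholder classifier, and all names here are hypothetical.

import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

def loso_accuracy(X, y, subjects):
    # Hold out one subject at a time: train on the remaining subjects,
    # test on the held-out one, and collect per-subject accuracies.
    logo = LeaveOneGroupOut()
    scores = []
    for train_idx, test_idx in logo.split(X, y, groups=subjects):
        clf = SVC(kernel="rbf")  # placeholder classifier, not the paper's model
        clf.fit(X[train_idx], y[train_idx])
        scores.append(clf.score(X[test_idx], y[test_idx]))
    # Mean accuracy and 1-sigma spread across held-out subjects
    # (the inter-individual reproducibility figure discussed in the abstract).
    return float(np.mean(scores)), float(np.std(scores))
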
Enhancement of SSVEPs Classification in BCI-based Wearable Instrumentation Through Machine Learning Techniques / Apicella, Andrea; Arpaia, Pasquale; Benedetto, Egidio De; Donato, Nicola; Duraccio, Luigi; Giugliano, Salvatore; Prevete, Roberto. - In: IEEE SENSORS JOURNAL. - ISSN 1530-437X. - 22:9(2022), pp. 9087-9094. [10.1109/JSEN.2022.3161743]
Files in this product:

File: Enhancement_of_SSVEPs_Classification_in_BCI-Based_Wearable_Instrumentation_Through_Machine_Learning_Techniques.pdf
Access: authorized users only
Type: Publisher's version (PDF)
License: Publisher's copyright
Size: 1.45 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/890226
Citations
  • PMC: ND
  • Scopus: 34
  • Web of Science: 19