🪧 ✨ Python scripts for a Brain-Computer Interface (BCI) leveraging motor imagery to control a 2D arrow. The beta-burst feature methodology is described by Papadopoulos et al.
The GUI is powered by Pygame. Data stream collection relies on PyACQ and LSL.
Developed by Ludovic DARMET from the DANC lab and COPHY, under the supervision of Jimmy Bonaiuto and Jérémie Mattout.
👩💻 Installation
1. Clone the repository:

   ```bash
   git clone https://github.com/ludovicdmt/online_BCI.git
   cd online_BCI
   ```

2. Create and activate the Conda environment:

   ```bash
   conda env create -f BCIMI.yml
   conda activate BCIMI
   ```

3. Install the module in editable mode:

   ```bash
   pip install -e .
   ```

4. Install PyACQ:

   ```bash
   pip install git+https://github.com/pyacq/pyacq.git
   ```

5. Install PyACQ_ext:

   ```bash
   pip install git+https://gitlab.com/ludovicdmt/pyacq-ext-mi.git
   ```

Note: Thanks to the editable-mode installation, changes to the code are reflected immediately without reinstalling.
🗜️ To run the BCI, simply launch the GUI:

```bash
cd ${INSTALL_PATH}/GUI
python GUI_control.py
```
During calibration, a PyLSL marker stream is created so that event markers can be synchronized with the EEG stream for recording.
To adjust the trial duration, inter-trial interval, and other experimental parameters, please refer to the config file.
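As an illustration only, such a config file typically exposes keys like the following (assuming a YAML layout); the actual key names and values in this repository's config file may differ:

```yaml
# Hypothetical parameter names -- check the repository's config file
# for the real keys and defaults.
trial_duration: 4.0        # seconds of motor imagery per trial
inter_trial_interval: 2.0  # rest between trials, in seconds
n_trials_per_class: 20     # calibration trials for each imagery class
```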
After data collection, click on *Online classification* to start asynchronous decoding of the motor imagery.
🆘 If you encounter any issues while using this code, feel free to open a new issue on the issues page. I'll get back to you promptly, as I'm keen to keep improving it. 🚀