Commit 'prettier'
MohammadJavadD committed Dec 12, 2024
1 parent 1dbc8b4 commit 026b2ee
Showing 4 changed files with 32 additions and 15 deletions.
7 changes: 3 additions & 4 deletions .github/workflows/pre-commit.yaml
@@ -7,10 +7,9 @@ concurrency:
on: [pull_request]

jobs:

pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: pre-commit/[email protected]
16 changes: 7 additions & 9 deletions .pre-commit-config.yaml
@@ -8,7 +8,7 @@ ci:
autoupdate_branch: "master"
autoupdate_commit_msg: "[pre-commit.ci] pre-commit autoupdate"
autoupdate_schedule: quarterly
skip: [ ]
skip: []
submodules: false

repos:
@@ -29,22 +29,20 @@ repos:
- id: check-case-conflict
- id: forbid-new-submodules
- id: pretty-format-json
args: [ "--autofix", "--no-sort-keys", "--indent=4" ]
args: ["--autofix", "--no-sort-keys", "--indent=4"]

- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.4.8
hooks:
- id: ruff
args: [ "--ignore=E402,E501,F401", "--fix" ] #, --exit-non-zero-on-fix, ]
args: ["--ignore=E402,E501,F401", "--fix"] #, --exit-non-zero-on-fix, ]
- id: ruff
name: ruff lint data notebooks
args: [ "--fix", "--preview", "--select=NPY201" ]
args: ["--fix", "--preview", "--select=NPY201"]
- id: ruff
args: [ "check", "--select", "I", "--fix"]
args: ["check", "--select", "I", "--fix"]
- id: ruff-format
types_or: [ python, pyi, jupyter ]


types_or: [python, pyi, jupyter]

- repo: https://github.com/codespell-project/codespell
rev: v2.3.0
@@ -54,7 +52,7 @@ repos:
- --ignore-words-list=metadat,splitted,meaned,wil,whats,additionals,alle,alot,bund,currenty,datas,farenheit,falsy,fo,haa,hass,iif,incomfort,ines,ist,nam,nd,pres,pullrequests,resset,rime,ser,serie,te,technik,ue,unsecure,withing,zar,MAPE,mape
- --skip="./.*,*.csv,*.json,*.ambr"
- --quiet-level=2
exclude_types: [ csv, json ]
exclude_types: [csv, json]
exclude: ^tests/|generated/^.github

- repo: https://github.com/asottile/blacken-docs
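For context, the hooks configured above are normally applied locally before committing; a minimal sketch using the standard `pre-commit` CLI (not part of this commit) looks like this:

```bash
# Minimal sketch: run the repository's configured hooks locally.
# Assumes the standard pre-commit package from PyPI; hook versions come from .pre-commit-config.yaml.
pip install pre-commit
pre-commit install          # register the git hook so checks run on each commit
pre-commit run --all-files  # apply every configured hook (ruff, codespell, etc.) to the whole repo
```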
20 changes: 20 additions & 0 deletions README.md
@@ -1,28 +1,39 @@
# About

Code for ["General-Purpose Brain Foundation Models for Time-Series Neuroimaging Data"](https://openreview.net/forum?id=HwDQH0r37I)

# Getting Started

## 0. Install the requirements

To install the requirements, run the following command:

```bash
pip install -r requirements.txt
```

## 1. Download and preprocess the data

Download the NMT data from [here](https://ilabel.ai/datasets/Nust-Millitary-Hospital-TUKl-NMT-EEG-Dataset/NMT-Scalp-EEG.zip) and extract it to the `data` folder, or use the following commands:

```bash
wget https://ilabel.ai/datasets/Nust-Millitary-Hospital-TUKl-NMT-EEG-Dataset/NMT-Scalp-EEG.zip

unzip NMT-Scalp-EEG.zip -d data
```

Alternatively, you can download it from Google Drive with `gdown`:

```bash
gdown 'https://drive.google.com/uc?id=1jD_AcmfoaIfkOiO5lSU4J6IxHZtalnTk'

unzip NMT.zip -d data/NMT/
```

## 2. Preprocess the data

To preprocess the data, run the following command:

```bash
python ./data/preprocess.py \
--dataset nmt \
@@ -31,10 +42,13 @@ python ./data/preprocess.py \
--exp_path ./data/NMT/NMT_dl/ \
--nmt_raw_path ./data/NMT/nmt_scalp_eeg_dataset/
```

This preprocesses the data and saves it as `.arrow` files in the `data/NMT/nmt_dl/` folder.
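To sanity-check the output, you can list the generated files; the layout below is an assumption based on the paths in `bfm/configs/bfm-t5-base-nmt.yaml`, so adjust it if your `--exp_path` differs:

```bash
# Hypothetical sanity check: the preprocessed Arrow files should appear under the experiment path.
# The train/val layout is assumed from the training config; adjust if your --exp_path differs.
ls ./data/NMT/NMT_dl/nmt_dl/train/ ./data/NMT/NMT_dl/nmt_dl/val/
```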

## 3. Train the model

To train the model, run the following command:

```bash
accelerate launch bfm/train/train.py \
--config bfm/configs/bfm-t5-base-nmt.yaml \
@@ -49,21 +63,27 @@ accelerate launch bfm/train/train.py \
--n-gpus 4 \
--max-steps 2000
```

This will train the model on the NMT dataset using the T5-base model. You can modify the config file to use a different model or dataset.
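For example, one way to try a different setup is to copy the provided config and launch the same command against the copy; the file name below is hypothetical:

```bash
# Hypothetical example: start from a copy of the provided config, then edit its model/dataset fields.
cp bfm/configs/bfm-t5-base-nmt.yaml bfm/configs/my-experiment.yaml
# After editing my-experiment.yaml, launch training exactly as above with the new config:
accelerate launch bfm/train/train.py \
    --config bfm/configs/my-experiment.yaml \
    --n-gpus 4 \
    --max-steps 2000
```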

## 4. Evaluate the model

To evaluate the model, run the following command:

```bash
CUDA_VISIBLE_DEVICES=0 python bfm/evaluate/evaluate.py \
--config_path "bfm/configs/bfm-inference.yaml" \
--directory_path "./bfm/Experiments/bfm-base_nmt" \
--seed 2024 \
--device "cuda"
```

**Note:** You can also use `data/download_moabb_datasets.py` to download the MOABB datasets and `data/preprocess_moabb.py` to preprocess them before evaluating the model on them.
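A minimal sketch of that workflow (the scripts are invoked here without flags; check each script for its actual arguments):

```bash
# Assumed invocation of the helper scripts mentioned above; any required flags are not shown here.
python data/download_moabb_datasets.py   # fetch the MOABB datasets
python data/preprocess_moabb.py          # preprocess them for evaluation
# Then evaluate as in step 4, pointing --directory_path at your trained experiment folder.
```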

# Citation

If you find this code useful, please consider citing our paper:

```
@inproceedings{
bayazi2024generalpurpose,
4 changes: 2 additions & 2 deletions bfm/configs/bfm-t5-base-nmt.yaml
@@ -1,7 +1,7 @@
training_data_paths:
- ./data/NMT/NMT_dl/nmt_dl/train/
validation_data_paths:
- ./data/NMT/NMT_dl/nmt_dl/val/
dataset: "nmt"
project_dir: "./bfm/"
experiment_name: "test_experiment_v0"
