LoRA and DoRA PEFT support for Fine-Tuning TimesFM #104

Merged 24 commits on Aug 6, 2024. Changes shown from 17 commits.

Commits (24)
71d9802
add parameter efficient finetuning pipeline
tanmayshishodia Jul 16, 2024
ee7462b
Merge branch 'master' into feature/lora
tanmayshishodia Jul 16, 2024
b6ebd80
revert test env name
tanmayshishodia Jul 16, 2024
77b004b
Merge branch 'feature/lora' of github.com:tanmayshishodia/timesfm int…
tanmayshishodia Jul 16, 2024
461c2cd
update checkpoint dir name
tanmayshishodia Jul 16, 2024
c34896b
update adapter init file docstring
tanmayshishodia Jul 16, 2024
c8aaf31
gitignore all pycache dirs
tanmayshishodia Jul 16, 2024
845661d
update usage tutorial
tanmayshishodia Jul 16, 2024
e3fb45c
gitignore jax egg info
tanmayshishodia Jul 16, 2024
2174a8c
add src init file for poetry package
tanmayshishodia Jul 16, 2024
39665af
change import style
tanmayshishodia Jul 16, 2024
a59979d
add example dora.sh file
tanmayshishodia Jul 16, 2024
5ae8c7d
update lora/dora intermediate var names
tanmayshishodia Jul 17, 2024
d4d4afd
add pytest framework
tanmayshishodia Jul 17, 2024
5901805
add bash scripts for running diff FT strategies
tanmayshishodia Jul 17, 2024
a908448
add docstrings in adapter utils
tanmayshishodia Jul 17, 2024
18da73a
remove helper and fix early stopping logic
tanmayshishodia Jul 18, 2024
807ddfd
add poetry packages
tanmayshishodia Aug 3, 2024
f15daba
Merge branch 'master' into feature/lora
tanmayshishodia Aug 3, 2024
d72ff83
keep only a single bash script
tanmayshishodia Aug 4, 2024
6517590
update poetry lock
tanmayshishodia Aug 4, 2024
0ccc10f
update pytest poetry
tanmayshishodia Aug 4, 2024
e5be6bd
add new line EOF
tanmayshishodia Aug 4, 2024
55f71de
Create PEFT README.md
tanmayshishodia Aug 4, 2024
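One of the commits above (18da73a, "remove helper and fix early stopping logic") touches the early-stopping behavior that finetune.py exposes via --early-stop-patience. A patience-based early stopper typically looks like the following sketch (an illustration only, not the PR's actual implementation; the class name is hypothetical):

```python
class EarlyStopper:
    """Stop training after `patience` epochs without validation improvement."""

    def __init__(self, patience: int = 10, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_epochs = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # improvement: record it and reset counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1   # no improvement this epoch
        return self.bad_epochs >= self.patience
```

With --early-stop-patience=10 (as in the scripts below), training halts after 10 consecutive epochs without a new best validation loss.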
7 changes: 6 additions & 1 deletion .gitignore
@@ -1,3 +1,8 @@
 .venv/
 dist/
-__pycache__/
+**/__pycache__/
+checkpoints/
+wandb/
+datasets/
+results/
+timesfm_jax.egg-info/
3 changes: 3 additions & 0 deletions environment.yml
@@ -16,3 +16,6 @@ dependencies:
 - jax[cuda12]==0.4.26
 - einshape
 - scikit-learn
+- typer
+- wandb
+- pytest
3 changes: 3 additions & 0 deletions environment_cpu.yml
@@ -16,3 +16,6 @@ dependencies:
 - jax[cpu]==0.4.26
 - einshape
 - scikit-learn
+- typer
+- wandb
+- pytest
26 changes: 26 additions & 0 deletions peft/dora.sh
@@ -0,0 +1,26 @@
#!/bin/bash

export TF_CPP_MIN_LOG_LEVEL=2 XLA_PYTHON_CLIENT_PREALLOCATE=false

python3 finetune.py \
--model-name="google/timesfm-1.0-200m" \
--backend="gpu" \
--horizon-len=128 \
--context-len=512 \
--freq="15min" \
--data-path="../datasets/ETT-small/ETTm1.csv" \
--num-epochs=100 \
--learning-rate=1e-3 \
--adam-epsilon=1e-7 \
--adam-clip-threshold=1e2 \
--early-stop-patience=10 \
--datetime-col="date" \
--boundaries=34560 46080 57600 \
--use-lora \
--lora-rank=1 \
--lora-target-modules="all" \
--use-dora \
--cos-initial-decay-value=1e-4 \
--cos-decay-steps=40000 \
--cos-final-decay-value=1e-5 \
--ema-decay=0.9999
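The flags above (--use-lora, --lora-rank, --use-dora, --lora-target-modules) configure the parameter-efficient adapters this PR adds. As a rough sketch of what LoRA and DoRA compute (not the PR's JAX code; function names and the numpy formulation are illustrative): LoRA adds a low-rank update A@B to a frozen weight, and DoRA additionally decomposes the weight into a per-column magnitude and a direction, retraining only the direction with LoRA while keeping the magnitudes as separate learnable scalars.

```python
import numpy as np

def lora_delta(rank: int, d_in: int, d_out: int,
               alpha: float = 1.0, seed: int = 0) -> np.ndarray:
    """Low-rank LoRA update dW = (alpha / rank) * A @ B.

    A (d_in x r) is Gaussian-initialized and B (r x d_out) starts at zero,
    so the adapter is a no-op at initialization.
    """
    rng = np.random.default_rng(seed)
    A = rng.normal(scale=0.02, size=(d_in, rank))
    B = np.zeros((rank, d_out))
    return (alpha / rank) * (A @ B)

def dora_merge(W: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """DoRA-style merge: keep per-column magnitudes, update the direction.

    m = column norms of the pretrained W; merged = m * normalize(W + delta).
    """
    m = np.linalg.norm(W, axis=0, keepdims=True)   # magnitudes (trainable in DoRA)
    V = W + delta                                   # LoRA-updated direction
    return m * (V / np.linalg.norm(V, axis=0, keepdims=True))
```

At initialization (B = 0) the merged weight equals the pretrained weight, which is why fine-tuning can start from the base model's behavior.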
22 changes: 22 additions & 0 deletions peft/fft.sh
@@ -0,0 +1,22 @@
#!/bin/bash

Collaborator commented:

I feel like different .sh scripts are not needed; one script with command-line options is good enough. Or we could not check in these scripts and instead include example usage in the header comment of finetune.py.

Contributor (author) replied:

Okay, I will keep only one of them. Running python3 finetune.py --help gives the following output, which is self-explanatory. Perhaps I can add it to the README?
[screenshot of --help output]

export TF_CPP_MIN_LOG_LEVEL=2 XLA_PYTHON_CLIENT_PREALLOCATE=false

python3 finetune.py \
--model-name="google/timesfm-1.0-200m" \
--backend="gpu" \
--horizon-len=128 \
--context-len=512 \
--freq="15min" \
--data-path="../datasets/ETT-small/ETTm1.csv" \
--num-epochs=1 \
--learning-rate=1e-3 \
--adam-epsilon=1e-7 \
--adam-clip-threshold=1e2 \
--early-stop-patience=10 \
--datetime-col="date" \
--boundaries=34560 46080 57600 \
--cos-initial-decay-value=1e-4 \
--cos-decay-steps=40000 \
--cos-final-decay-value=1e-5 \
--ema-decay=0.9999
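Both scripts pass --cos-initial-decay-value, --cos-decay-steps, and --cos-final-decay-value, which suggest a cosine learning-rate decay from an initial value down to a floor. A minimal sketch of such a schedule (an assumption about the intended shape, not the PR's code; the function name is illustrative):

```python
import math

def cosine_decay_lr(step: int,
                    initial: float = 1e-4,
                    final: float = 1e-5,
                    decay_steps: int = 40000) -> float:
    """Cosine-annealed LR from `initial` down to `final` over `decay_steps`."""
    t = min(step, decay_steps) / decay_steps          # progress in [0, 1]
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * t))  # 1 at t=0, 0 at t=1
    return final + (initial - final) * cos_factor
```

With the defaults matching the script's flags, the rate starts at 1e-4, passes 5.5e-5 at the halfway point, and settles at 1e-5 from step 40000 onward; --ema-decay=0.9999 would separately control an exponential moving average of the model weights.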