I'm using Hugging Face's AutoTrain to fine-tune the Mistral 7B model with no code. Training runs on Kaggle's two free T4 GPUs, uses the OpenAssistant-Guanaco dataset, and relies on QLoRA to keep total training time to only about 2 hours.
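For context, below is a minimal sketch of the kind of QLoRA setup that AutoTrain automates behind the scenes, written directly with transformers, peft, and trl instead of the no-code CLI. The model and dataset IDs, LoRA rank, learning rate, and batch sizes are illustrative assumptions rather than the exact settings AutoTrain picks, and some argument names vary between trl versions (newer releases move `dataset_text_field` and `max_seq_length` into `SFTConfig`).

```python
# Hypothetical sketch of QLoRA fine-tuning of Mistral 7B on OpenAssistant-Guanaco.
# Hyperparameters are assumptions for illustration, not the repo's exact config.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig
from trl import SFTTrainer

model_id = "mistralai/Mistral-7B-v0.1"  # assumed base model checkpoint
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")

# 4-bit NF4 quantization (the "Q" in QLoRA) keeps the frozen 7B base model
# small enough to fit on Kaggle's 16 GB T4 GPUs.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# LoRA adapters: only these small low-rank matrices are trained;
# the quantized base weights stay frozen.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",   # Guanaco samples are plain "text" records
    max_seq_length=512,
    args=TrainingArguments(
        output_dir="mistral7b-guanaco-qlora",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        num_train_epochs=1,
        fp16=True,
        logging_steps=25,
    ),
)
trainer.train()
```

AutoTrain wraps roughly this workflow behind a single command, which is why a short Kaggle session with two T4s is enough to produce a usable adapter.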