Feature request
Add support for DoRA, which improves on LoRA by decomposing the pretrained weight into magnitude and direction, so the magnitude can be fine-tuned separately from the low-rank directional update.
Motivation
Improved perplexity compared to LoRA.
Your contribution
Sebastian Raschka has published demo code: https://github.com/rasbt/dora-from-scratch
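For context on what is being requested, here is a minimal from-scratch sketch of the DoRA idea (illustrative only, not the linked demo code): the merged weight W0 + BA is split into a unit-norm direction and a separately learnable magnitude vector, so the LoRA factors tune direction while the magnitude is tuned on its own. The class name and the rank/alpha defaults are assumptions for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DoRALinear(nn.Module):
    """Sketch of a DoRA-adapted linear layer: a frozen base weight plus a
    low-rank (LoRA) update provides the direction, and a learnable
    magnitude vector rescales it, so magnitude and direction are tuned
    independently."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        out_features, in_features = base.weight.shape
        # Frozen pretrained weight W0 and (optional) bias.
        self.weight = nn.Parameter(base.weight.detach().clone(), requires_grad=False)
        self.bias = base.bias
        # LoRA factors: delta_W = B @ A, zero at initialization.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank
        # Learnable magnitude, initialized to the norm of each row of W0
        # (one magnitude per output feature, in the spirit of weight normalization).
        self.magnitude = nn.Parameter(self.weight.norm(p=2, dim=1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        merged = self.weight + self.scaling * (self.lora_B @ self.lora_A)  # W0 + BA
        norm = merged.norm(p=2, dim=1, keepdim=True)                       # per-row norms
        directional = merged / norm                                        # unit-norm directions
        adapted = self.magnitude.unsqueeze(1) * directional                # rescale by m
        return F.linear(x, adapted, self.bias)
```

Wrapping each target nn.Linear with a layer like this and training only lora_A, lora_B, and magnitude keeps the LoRA-style parameter count plus one extra scalar per output feature.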
cc @younesbelkada @pacman100
Hi @RonanKMcGovern! Thanks for the feature request! There is already ongoing work from @BenjaminBossan to add DoRA to PEFT: huggingface/peft#1474
Closing as there is a PR underway.
OK thank you @RonanKMcGovern !
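For anyone following huggingface/peft#1474, DoRA is expected to be exposed as an option on LoraConfig. The sketch below assumes the flag is named use_dora and uses an example model name and target modules, so treat it as illustrative rather than the confirmed final API.

```python
# Hypothetical usage once DoRA support lands in PEFT (flag name assumed).
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")  # example model

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # example target modules
    use_dora=True,  # assumed flag enabling the DoRA decomposition on top of LoRA
)
model = get_peft_model(model, config)
model.print_trainable_parameters()
```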