[Main Issue] Accelerator and Plugin refactor #10416
Great summary!
As best as I can see right now:
Yes, definitely really excited about this. I am quite eager to finally see the accelerators marked as a stable API targeting v1.6.
Here are some that I found; some are also mentioned in the doc.

After 2) Methods:
Property:

After 3)
These will either make use of the accelerator or be moved to it.
Now that the precision plugin has moved, I will take a look at this TODO here:
Closing this issue in favor of the smaller linked issues for the pending tasks.
Proposed refactoring or deprecation
Motivation
The Accelerator is not a stable API yet; we can improve the Accelerator-related logic and move toward a stable Accelerator version for 1.6.
Pitch
Steps
- Move the precision plugin into the `training_type_plugin`
- Call `training_type_plugin` collective functions directly instead of going through the Accelerator (#9677; see the sketch after this list)

More details in: Accelerator Refactor Proposal
[updating]
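A minimal sketch of the #9677 change, not the actual Lightning source (class and method names are simplified assumptions): collectives such as `barrier` and `broadcast` used to be routed through the Accelerator, which only delegated to the training type plugin, so callers can use the plugin directly and the pass-through layer can be dropped.

```python
# Hedged sketch of the #9677 refactor; simplified, not the real Lightning classes.

class TrainingTypePlugin:
    """Owns the distributed logic, e.g. for DDP."""

    def barrier(self, name=None):
        # A real DDP plugin would call torch.distributed.barrier() here
        pass

    def broadcast(self, obj, src=0):
        # A real DDP plugin would use torch.distributed broadcast helpers here
        return obj


class Accelerator:
    def __init__(self, training_type_plugin):
        self.training_type_plugin = training_type_plugin

    # Pre-refactor: pure pass-through methods that add no accelerator logic
    def barrier(self, name=None):
        self.training_type_plugin.barrier(name=name)

    def broadcast(self, obj, src=0):
        return self.training_type_plugin.broadcast(obj, src=src)


plugin = TrainingTypePlugin()
accelerator = Accelerator(plugin)

# Before: every collective went through the accelerator's thin wrapper
accelerator.barrier()

# After #9677: call the training type plugin directly, dropping the indirection
plugin.barrier()
```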
FAQ
Will this be a lot of breaking changes?
Not many user-facing API changes come from steps 1-4 (unless we find other existing bugs during the refactor); the only breaking change will be for custom plugins.
Steps 5 and 6 are still at the RFC stage and may include breaking changes that impact user-facing APIs.
How does this impact LightningLite?
It should be helpful for LightningLite too; some function refactoring/simplification could happen for LightningLite as well. (@awaelchli, any suggestions about this part?)
Follow up TODOs:
Check whether the trainer is FITTING before setting up optimizers ("2/n Move Precision Plugin into strategy - move optimizer related logics" #10596 (comment)); a sketch of this guard follows this list.
Send a PR to https://github.com/ray-project/ray_lightning to update their plugins
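A minimal sketch of the first TODO, assuming the 1.5-era internals (`trainer.state.fn` and `TrainerFn.FITTING` exist in `pytorch_lightning.trainer.states`; the plugin class and hook body here are hypothetical): the point is simply to guard optimizer setup so it only runs when fitting.

```python
# Hedged sketch; the plugin class is hypothetical, and the hook body is
# elided rather than copied from the real source.
from pytorch_lightning.trainer.states import TrainerFn


class MyTrainingTypePlugin:  # hypothetical plugin, for illustration only
    def setup_optimizers(self, trainer):
        # Optimizer setup is only meaningful when fitting; validate/test/
        # predict runs should skip it entirely.
        if trainer.state.fn != TrainerFn.FITTING:
            return
        # ... create optimizers and LR schedulers from the LightningModule ...
```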
cc @justusschock @awaelchli @akihironitta @tchaton @Borda @kaushikb11 @ananthsub