Proposal: Move all of the rank_zero_only utilities into their own module
Proposed refactor
The `rank_zero_*` utilities are currently split between `distributed.py` and `warnings.py`:
distributed.py
https://github.com/PyTorchLightning/pytorch-lightning/blob/8c07d8bf905e395bbd2142b5df7b185b8e936c41/pytorch_lightning/utilities/distributed.py#L47-L95
warnings.py
https://github.com/PyTorchLightning/pytorch-lightning/blob/8c07d8bf905e395bbd2142b5df7b185b8e936c41/pytorch_lightning/utilities/warnings.py#L23-L51
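For context, the utilities in question are small. A simplified sketch of the two core pieces (the names `rank_zero_only` and `rank_zero_warn` are real Lightning identifiers, but the bodies below are stripped-down stand-ins, not the actual implementation):

```python
import os
import warnings
from functools import wraps
from typing import Any, Callable, Optional, TypeVar

T = TypeVar("T")

def rank_zero_only(fn: Callable[..., T]) -> Callable[..., Optional[T]]:
    """Call ``fn`` only on the process with global rank 0."""
    @wraps(fn)
    def wrapped(*args: Any, **kwargs: Any) -> Optional[T]:
        if getattr(rank_zero_only, "rank", 0) == 0:
            return fn(*args, **kwargs)
        return None  # no-op on every other rank
    return wrapped

# Distributed launchers (e.g. torchrun) export RANK/LOCAL_RANK per process.
rank_zero_only.rank = int(os.environ.get("RANK", os.environ.get("LOCAL_RANK", 0)))

@rank_zero_only
def rank_zero_warn(message: str, **kwargs: Any) -> None:
    """Emit ``message`` as a warning, but only on rank 0."""
    warnings.warn(message, **kwargs)
```

Because `rank_zero_only` lives in `distributed.py` while the warn/deprecation wrappers live in `warnings.py`, the two files depend on each other.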
Motivation
It is currently impossible to deprecate something from within `distributed.py` using the standard Lightning conventions without creating a circular import. Creating a separate module solves this and makes the dependency structure clearer.
I'm facing this issue in https://github.com/PyTorchLightning/pytorch-lightning/pull/11745/files
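The cycle can be reproduced in miniature. In the sketch below the module and function names loosely mirror the Lightning layout, but the bodies are hypothetical stand-ins written only to demonstrate the import failure:

```python
import sys
import tempfile
from pathlib import Path

# Build a throwaway package with two modules that import from each other,
# mirroring the distributed.py <-> warnings.py dependency described above.
pkg = Path(tempfile.mkdtemp()) / "utils_demo"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "distributed.py").write_text(
    # distributed.py wants the deprecation helper from the warnings module...
    "from utils_demo.warnings_mod import rank_zero_deprecation\n"
    "def rank_zero_only(fn):\n"
    "    return fn\n"
)
(pkg / "warnings_mod.py").write_text(
    # ...but the warnings module needs rank_zero_only from distributed.py.
    "from utils_demo.distributed import rank_zero_only\n"
    "def rank_zero_deprecation(msg):\n"
    "    pass\n"
)
sys.path.insert(0, str(pkg.parent))

try:
    import utils_demo.distributed  # noqa: F401
    cycle_detected = False
except ImportError:
    # "cannot import name 'rank_zero_only' from partially initialized module"
    cycle_detected = True
```

Moving the decorators into a leaf module with no intra-package imports breaks the cycle: both files can then import from it freely.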
Pitch
Create a new module, `utilities/rank_zero.py`, which will house all of these decorators.
Alternatives
Move rank_zero_only utilities from warnings.py back into distributed.py
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging Pytorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @rohitgr7