Deprecate process_position from the Trainer constructor #8968
Labels: deprecation, design, feature, good first issue, help wanted, let's do it!, refactor
🚀 Feature
Remove the process_position argument from the Trainer constructor
Motivation
We are auditing the Lightning components and APIs to assess opportunities for improvement. The Trainer today has over 50 constructor arguments. This number grows with each feature release and makes the Trainer cluttered. It also hurts the Trainer's extensibility: a number of arguments passed to the Trainer exist only to customize other utilities. Plumbing arguments through the Trainer creates an undesirable coupling: when the underlying components change, the framework is forced to make breaking API changes in at least two places.

Example: #8062 — weights_summary implementation changes (or deprecate weights_summary off the constructor in favor of a callback: Deprecate summarize() off LightningModule #8478)
Example: #8780 — log_gpu_memory accepting min_max is hyper-specific to nvidia-smi and isn't applicable for torch.cuda.memory stats
Upcoming examples:
Pitch
Deprecate process_position off the Trainer constructor in v1.5. In version 1.7, remove process_position from the Trainer entirely. To customize this, users can still construct the ProgressBar callback object and pass it to the Trainer.
Alternatives
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning
Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.