The dataloader, train_dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument
#662
For anyone who gets this warning message during training:
```
/home/PiperTTS/.venv/lib/python3.10/site-packages/pytorch_lightning/trainer/connectors/data_connector.py:224: PossibleUserWarning: The dataloader, train_dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 16 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.
  rank_zero_warn(
```
it can be resolved by editing the lightning.py file found here:
within this class:
change this default
to anything above 2; otherwise it will still give you the warning.
I have 16 cores available, but as an initial test I only increased it to 4, and the warning no longer appears during training.
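A rough sketch of the idea, since the original code snippets didn't survive here: raise the `num_workers` default above 2, but don't exceed the machine's CPU count. The helper name and the cap of 4 below are illustrative (matching the test above), not Piper's actual code.

```python
import os

def pick_num_workers(cap: int = 4) -> int:
    """Pick a num_workers value above Lightning's warning threshold.

    Hypothetical helper: starts conservatively at the given cap (4, as in
    the initial test above) and never exceeds the machine's CPU count.
    """
    cpus = os.cpu_count() or 1  # os.cpu_count() can return None
    return min(cap, cpus)

# The chosen value would then be passed to the DataLoader, e.g.:
#   DataLoader(dataset, batch_size=32, num_workers=pick_num_workers())
```

On a 16-core machine this returns 4; raising the cap toward the reported CPU count (16 here) may improve throughput further, at the cost of more memory per worker.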