🐛 Bug

In 0.7.2, even if we don't set a sampler, pytorch_lightning will not add a DistributedSampler for us.

To Reproduce

The reason: in PyTorch, if we don't set a sampler, PyTorch adds a default one for us. In PyTorch's dataloader.py:
```python
if sampler is None:  # give default samplers
    if self._dataset_kind == _DatasetKind.Iterable:
        # See NOTE [ Custom Samplers and IterableDataset ]
        sampler = _InfiniteConstantSampler()
    else:  # map-style
        if shuffle:
            sampler = RandomSampler(dataset)
        else:
            sampler = SequentialSampler(dataset)
```
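This default means a DataLoader constructed without an explicit sampler still ends up with a non-None `.sampler` attribute, which can be verified directly. Below is a minimal check of that behavior; the dataset and variable names are only illustrative:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.sampler import SequentialSampler

# A toy map-style dataset; no sampler is passed to the DataLoader.
ds = TensorDataset(torch.arange(10).float().unsqueeze(1))
loader = DataLoader(ds, batch_size=2)

# PyTorch has already attached a default sampler, so an `is None` check
# on it can never be true here.
print(loader.sampler is None)                         # False
print(isinstance(loader.sampler, SequentialSampler))  # True
```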
But in pytorch_lightning we check whether the sampler is None to decide whether to add one, in data_loading.py's function auto_add_sampler:
```python
no_sampler_added = dataloader.sampler is None
```
Because PyTorch has already given us a default sampler, which is not None, pytorch_lightning will not automatically add the DistributedSampler.
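Since `dataloader.sampler` is never None for a plain DataLoader, one possible direction for a fix is to treat PyTorch's default samplers as "no sampler set" and only then inject a DistributedSampler. The sketch below is illustrative only: the helper names `needs_distributed_sampler` and `add_distributed_sampler` are not pytorch_lightning APIs, and the real auto_add_sampler has to carry over more DataLoader arguments than shown here.

```python
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler
from torch.utils.data.sampler import RandomSampler, SequentialSampler

def needs_distributed_sampler(dataloader: DataLoader) -> bool:
    # The sampler is never None, so detect PyTorch's defaults instead.
    return isinstance(dataloader.sampler, (RandomSampler, SequentialSampler))

def add_distributed_sampler(dataloader: DataLoader, num_replicas: int, rank: int) -> DataLoader:
    # Leave user-supplied custom samplers untouched.
    if not needs_distributed_sampler(dataloader):
        return dataloader
    sampler = DistributedSampler(dataloader.dataset, num_replicas=num_replicas, rank=rank)
    return DataLoader(
        dataloader.dataset,
        batch_size=dataloader.batch_size,
        sampler=sampler,
        num_workers=dataloader.num_workers,
        collate_fn=dataloader.collate_fn,
        pin_memory=dataloader.pin_memory,
        drop_last=dataloader.drop_last,
    )
```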