I found that after transformers/src/transformers/trainer.py, line 1569 in ccb92be, the output of

logger.info(f"{type(train_dataloader)}, {type(train_dataloader.sampler)}, {type(train_dataloader.batch_sampler)}")

is

<class 'accelerate.data_loader.DataLoaderShard'>, <class 'torch.utils.data.sampler.SequentialSampler'>, <class 'accelerate.data_loader.BatchSamplerShard'>

The train_dataloader arguments are

{'batch_size': 4, 'collate_fn': <function default_data_collator at 0x7f404cf33520>, 'num_workers': 0, 'pin_memory': True, 'sampler': <torch.utils.data.sampler.RandomSampler object at 0x7f404cbd26e0>, 'drop_last': False, 'worker_init_fn': <function seed_worker at 0x7f4061da8820>}

Why did the sampler change from RandomSampler to SequentialSampler? The sampler should stay the same.
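The behavior can be reproduced with a plain-Python sketch. The classes below are toy stand-ins for the relevant branch of torch.utils.data.DataLoader's constructor, not the real torch or accelerate code, and the re-wrapping step is only a hypothesis about what accelerate does: when a loader is rebuilt with only a batch_sampler, the .sampler attribute falls back to a default SequentialSampler even though iteration is driven entirely by the batch_sampler.

```python
import random

# Toy stand-ins for torch.utils.data samplers -- NOT the real classes.
class SequentialSampler:
    def __init__(self, data):
        self.data = data
    def __iter__(self):
        return iter(range(len(self.data)))

class RandomSampler:
    def __init__(self, data):
        self.data = data
    def __iter__(self):
        idx = list(range(len(self.data)))
        random.shuffle(idx)
        return iter(idx)

class BatchSampler:
    def __init__(self, sampler, batch_size, drop_last):
        self.sampler, self.batch_size, self.drop_last = sampler, batch_size, drop_last
    def __iter__(self):
        batch = []
        for i in self.sampler:
            batch.append(i)
            if len(batch) == self.batch_size:
                yield batch
                batch = []
        if batch and not self.drop_last:
            yield batch

class DataLoader:
    """Mimics the suspected branch of torch's DataLoader.__init__."""
    def __init__(self, dataset, sampler=None, batch_sampler=None,
                 batch_size=1, drop_last=False):
        if sampler is None:
            # The default is assigned even when a batch_sampler is supplied;
            # the attribute is then cosmetic and never used for iteration.
            sampler = SequentialSampler(dataset)
        if batch_sampler is None:
            batch_sampler = BatchSampler(sampler, batch_size, drop_last)
        self.sampler = sampler
        self.batch_sampler = batch_sampler
    def __iter__(self):
        return iter(self.batch_sampler)

dataset = list(range(10))

# Trainer builds the original loader with a RandomSampler...
inner = DataLoader(dataset, sampler=RandomSampler(dataset), batch_size=4)
# ...then (per the hypothesis) the loader is re-wrapped, passing only batch_sampler.
outer = DataLoader(dataset, batch_sampler=inner.batch_sampler)

print(type(outer.sampler).__name__)                # SequentialSampler (unused default)
print(type(outer.batch_sampler.sampler).__name__)  # RandomSampler (drives iteration)
```

In the real accelerate case there is one extra BatchSamplerShard layer around the BatchSampler, which is why the original RandomSampler only shows up at train_dataloader.batch_sampler.batch_sampler.sampler.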
cc @muellerzr @pacman100
It might be related to https://github.com/huggingface/accelerate/blob/69e4c3c54da3201eda288b500d138761e7a5221c/src/accelerate/data_loader.py#L709

I am checking train_dataloader.batch_sampler.batch_sampler: train_dataloader.batch_sampler.batch_sampler.sampler is torch.utils.data.sampler.RandomSampler.
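To confirm which sampler actually drives iteration, one can walk the wrapper chain programmatically. Here is a hypothetical debugging helper (unwrap_sampler is not part of transformers or accelerate) that follows .batch_sampler / .sampler attributes down to the innermost sampler, demonstrated on stand-in objects shaped like DataLoaderShard -> BatchSamplerShard -> BatchSampler -> RandomSampler:

```python
def unwrap_sampler(obj):
    """Follow .batch_sampler / .sampler attributes to the innermost sampler.

    Hypothetical debugging helper -- not part of transformers or accelerate.
    Prefers .batch_sampler over .sampler, so the cosmetic SequentialSampler
    default on the loader itself is skipped.
    """
    seen = set()
    while id(obj) not in seen:
        seen.add(id(obj))
        inner = getattr(obj, "batch_sampler", None) or getattr(obj, "sampler", None)
        if inner is None:
            break
        obj = inner
    return obj

# Stand-in objects mimicking the nesting reported in this issue.
class _Wrapper:
    def __init__(self, **attrs):
        self.__dict__.update(attrs)

class RandomSampler: pass
class SequentialSampler: pass

inner = RandomSampler()
loader = _Wrapper(                                   # like DataLoaderShard
    sampler=SequentialSampler(),                     # cosmetic default
    batch_sampler=_Wrapper(                          # like BatchSamplerShard
        batch_sampler=_Wrapper(sampler=inner)        # like BatchSampler
    ),
)

print(type(unwrap_sampler(loader)).__name__)  # RandomSampler
```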
System Info

transformers version: 4.32.1

Who can help?

No response

Information

Tasks
examples folder (such as GLUE/SQuAD, ...)