Fix pytorch dataloader batchsize issue (#1219)
* fix dataloader batchsize issue

Signed-off-by: Kaihui-intel <[email protected]>

* fetch dataloader batchsize

Signed-off-by: Kaihui-intel <[email protected]>

---------

Signed-off-by: Kaihui-intel <[email protected]>
Kaihui-intel authored Sep 6, 2023
1 parent d7b608b commit 6a98d0b
Showing 1 changed file with 3 additions and 2 deletions.
5 changes: 3 additions & 2 deletions neural_compressor/adaptor/pytorch.py
@@ -917,12 +917,13 @@ def model_calibration(self, q_model, dataloader, iterations=1, conf=None, calib_
                     dataloader.batch(1)
                     self.calib_func(q_model, dataloader, calib_sampling_size, conf)
             else:  # pragma: no cover
-                if hasattr(dataloader, "batch_size") and calib_sampling_size % dataloader.batch_size != 0:
+                dataloader_batch_size = getattr(dataloader, "batch_size") or getattr(dataloader, "total_batch_size")
+                if hasattr(dataloader, "batch_size") and calib_sampling_size % dataloader_batch_size != 0:
                     logger.warning(
                         "Please note that calibration sampling size {} "
                         "isn't divisible exactly by batch size {}. "
                         "So the real sampling size is {}.".format(
-                            calib_sampling_size, dataloader.batch_size, dataloader.batch_size * iterations
+                            calib_sampling_size, dataloader_batch_size, dataloader_batch_size * iterations
                         )
                     )

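For context, here is a minimal runnable sketch, not part of the commit, of the situation the new fallback handles: PyTorch's `DataLoader` reports `batch_size` as `None` when a `batch_sampler` drives batching, and some distributed wrappers expose the effective size as `total_batch_size` instead. The `ShardedDataLoader` class and the numbers below are illustrative assumptions, not neural_compressor code.

```python
class ShardedDataLoader:
    """Hypothetical dataloader: the per-process batch_size is None,
    but the wrapper reports an effective total_batch_size."""

    batch_size = None      # e.g. None because a batch_sampler drives batching
    total_batch_size = 8   # effective batch size reported by the wrapper


dataloader = ShardedDataLoader()

# Same resolution logic as the patched line: use batch_size when truthy,
# otherwise fall back to total_batch_size.
dataloader_batch_size = getattr(dataloader, "batch_size") or getattr(dataloader, "total_batch_size")

calib_sampling_size = 100
iterations = -(-calib_sampling_size // dataloader_batch_size)  # ceil(100 / 8) = 13

if calib_sampling_size % dataloader_batch_size != 0:
    # Mirrors the warning in the diff: 100 isn't divisible by 8,
    # so the real sampling size becomes 8 * 13 = 104.
    print(
        "calibration sampling size {} isn't divisible exactly by batch size {}; "
        "real sampling size is {}".format(
            calib_sampling_size, dataloader_batch_size, dataloader_batch_size * iterations
        )
    )
```

The `or` fallback works because an unset `batch_size` is `None`, which is falsy, so a populated `total_batch_size` wins; with the old code, `calib_sampling_size % dataloader.batch_size` would raise a `TypeError` on such a dataloader.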
