Commit
Fixed mismatch absorb layers due to tracing and named modules (#963)
Co-authored-by: chen, suyue <[email protected]>
maktukmak and chensuyue authored Jun 14, 2023
1 parent 85c6a0f commit bccc89f
Showing 1 changed file with 8 additions and 0 deletions.
8 changes: 8 additions & 0 deletions neural_compressor/adaptor/torch_utils/smooth_quant.py
@@ -722,6 +722,14 @@ def transform(self, alpha=0.5, folding=False, percentile=99.999, op_types=['Line
         save_input_output = True
 
         input_maxes = self._calibrate(self.absorb_to_layer, calib_iter, save_input_output)
+
+        # Check whether input_maxes matches self.absorb_to_layer
+        # (self._get_all_layer_names uses the layer tree instead of the traced forward path)
+        if not folding:
+            diff_modules = set(self.absorb_to_layer.keys()).difference(input_maxes.keys())
+            for d in diff_modules:
+                del self.absorb_to_layer[d]
+
         if alpha == 'auto':
             self.alpha_per_layer = self._auto_tune_alpha(input_maxes, **auto_alpha_args)  ## save the alpha

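Why the added check matters: the absorb-to-layer map is built from named modules in the module tree, while calibration statistics (input_maxes) only cover layers reached by the traced forward path, so the two key sets can disagree. Below is a minimal standalone sketch of the same set-difference pruning; the layer names and max values are hypothetical and this is not the library's code.

    # Sketch of the pruning step added in this commit (illustrative names/values only).
    absorb_to_layer = {
        "decoder.layer0.ln": ["decoder.layer0.fc1"],
        "decoder.layer1.ln": ["decoder.layer1.fc1"],  # hypothetical module never hit by tracing
    }
    input_maxes = {
        "decoder.layer0.ln": [0.7, 1.3],  # per-channel calibration maxima (illustrative)
    }

    # Drop absorb entries that have no calibration statistics, mirroring the diff above.
    diff_modules = set(absorb_to_layer.keys()).difference(input_maxes.keys())
    for d in diff_modules:
        del absorb_to_layer[d]

    print(absorb_to_layer)  # only the traced "decoder.layer0.ln" entry remains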
