
fix a small type error on bf16+pp (#3441)
Co-authored-by: Olatunji Ruwase <[email protected]>
Co-authored-by: Logan Adams <[email protected]>
3 people authored May 9, 2023
1 parent f1fab90 commit 195563a
1 changed file with 1 addition and 1 deletion:
deepspeed/runtime/pipe/engine.py
@@ -243,7 +243,7 @@ def _exec_reduce_grads(self):
         self._force_grad_boundary = True
         if self.pipeline_enable_backward_allreduce:
             if self.bfloat16_enabled():
-                if self.zero_optimization_stage() < ZeroStageEnum().gradients:
+                if self.zero_optimization_stage() < ZeroStageEnum.gradients:
                     self._bf16_reduce_grads()
                 else:
                     raise NotImplementedError("PP+BF16 only work for ZeRO Stage 1")
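For context: the old expression instantiated the enum, `ZeroStageEnum().gradients`, but calling an Enum class with no arguments raises a TypeError, because calling an enum class is a value lookup (e.g. `ZeroStageEnum(2)`). The fix accesses the member as a class attribute instead. Below is a minimal sketch of the difference, using a hypothetical stand-in enum rather than DeepSpeed's actual `ZeroStageEnum` definition:

```python
from enum import IntEnum


# Hypothetical stand-in for DeepSpeed's ZeroStageEnum, assumed here to be
# an IntEnum mapping ZeRO stage names to stage numbers.
class ZeroStageEnumSketch(IntEnum):
    disabled = 0
    optimizer_states = 1
    gradients = 2
    weights = 3


# New code path: enum members are class attributes, no instantiation needed.
assert ZeroStageEnumSketch.gradients == 2

# Old code path: calling an Enum class performs a value lookup, so a bare
# call with no argument raises TypeError before `.gradients` is reached.
try:
    ZeroStageEnumSketch().gradients
except TypeError as err:
    print(f"ZeroStageEnumSketch() fails: {err}")

# IntEnum members compare with plain ints, so the patched guard
# `zero_optimization_stage() < ZeroStageEnum.gradients` behaves as intended.
print(1 < ZeroStageEnumSketch.gradients)  # True for ZeRO stage 1
```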
