
Fix skipped gradient update of the last step when using gradient accumulation. #29561

Closed

Conversation

@richardSHkim commented Mar 9, 2024

What does this PR do?

This PR fixes the skipped gradient update on the last step of an epoch when using gradient accumulation in trainer.py.

I'm not sure what the comment 'step is always smaller than gradient_accumulation_steps' means, but the condition steps_in_epoch <= args.gradient_accumulation_steps appears to always be False.
As a result, the gradient update for the last iteration of an epoch is skipped when gradient_accumulation_steps > 1.

This also changes the total number of optimizer iterations (one fewer iteration per epoch).

The effect might be negligible, since the skipped gradients are carried over into the next iteration, but I believe the gradient update behavior should be the same with or without gradient accumulation when the total batch size is the same.
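
To make the issue concrete, here is a minimal sketch with a toy model and dataloader (these placeholders are not from trainer.py; the extra last-step check mirrors what this PR proposes):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy model and data, just to make the loop runnable (not from trainer.py).
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
dataloader = DataLoader(TensorDataset(torch.randn(10, 4), torch.randn(10, 1)), batch_size=1)

gradient_accumulation_steps = 4
steps_in_epoch = len(dataloader)  # 10 here, not a multiple of 4

for step, (x, y) in enumerate(dataloader):
    loss = nn.functional.mse_loss(model(x), y) / gradient_accumulation_steps
    loss.backward()

    # Without the last-step check, updates fire only at steps 4 and 8, and the
    # gradients accumulated on steps 9-10 never produce an update in this epoch.
    at_accumulation_boundary = (step + 1) % gradient_accumulation_steps == 0
    at_last_step_of_epoch = (step + 1) == steps_in_epoch

    if at_accumulation_boundary or at_last_step_of_epoch:
        optimizer.step()
        optimizer.zero_grad()
```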

I have tested this with an example script, and the comparison results are here.

The accelerator.accumulate() context manager can now handle the last step of the data loader, so I'm trying to use it instead.
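
A rough sketch of the accelerate pattern I have in mind, reusing the same kind of toy placeholders as above:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

# Same toy setup as above, just to show the accelerate pattern end to end.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
dataloader = DataLoader(TensorDataset(torch.randn(10, 4), torch.randn(10, 1)), batch_size=1)

accelerator = Accelerator(gradient_accumulation_steps=4)
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

for x, y in dataloader:
    # Inside accumulate(), gradient syncing and the effective optimizer step
    # happen only on accumulation boundaries; with a prepared dataloader, the
    # final batch of the epoch also triggers an update even when the epoch
    # length is not a multiple of gradient_accumulation_steps.
    with accelerator.accumulate(model):
        loss = nn.functional.mse_loss(model(x), y)
        accelerator.backward(loss)
        optimizer.step()
        optimizer.zero_grad()
```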

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@muellerzr @pacman100

@richardSHkim richardSHkim marked this pull request as draft March 9, 2024 19:31
@richardSHkim richardSHkim marked this pull request as ready for review March 9, 2024 19:32
@richardSHkim richardSHkim marked this pull request as draft March 10, 2024 02:00
@richardSHkim richardSHkim marked this pull request as ready for review March 10, 2024 05:49
@richardSHkim richardSHkim marked this pull request as draft March 10, 2024 08:18
@richardSHkim (Author) commented Mar 11, 2024

I found that this implementation was intentional (see PR #24415), so I'm closing this PR.
