
fix the grad_acc issue at epoch boundaries #24415

Merged · 3 commits merged into main from smangrul/grad-acc-fix on Jun 23, 2023
Conversation

pacman100 (Contributor)

What does this PR do?

Should solve the gradient accumulation issue at epoch boundaries when using Accelerate (seen in #23935 (comment)). Requires huggingface/accelerate#1624.

Fixes # (issue)
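
For context, a minimal sketch of the failure mode this PR addresses (illustrative only, not the Trainer's actual code; all names here are placeholders): when the number of batches in an epoch is not a multiple of gradient_accumulation_steps, the leftover micro-batches at the epoch boundary never trigger an optimizer step, so their gradients are either dropped or bleed into the next epoch. The usual guard is to also step on the epoch's last batch:

```python
import torch
from torch import nn

# Toy setup; all names are illustrative placeholders, not Trainer internals.
model = nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batches = [torch.randn(2, 8) for _ in range(10)]  # 10 batches, not divisible by 4
gradient_accumulation_steps = 4

for epoch in range(2):
    for step, batch in enumerate(batches):
        loss = model(batch).mean() / gradient_accumulation_steps
        loss.backward()
        # Step on the epoch's final batch as well, so the leftover
        # accumulated gradients are applied instead of leaking into
        # the next epoch.
        if (step + 1) % gradient_accumulation_steps == 0 or step + 1 == len(batches):
            optimizer.step()
            optimizer.zero_grad()
```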

@HuggingFaceDocBuilderDev commented Jun 22, 2023

The documentation is not available anymore as the PR was closed or merged.

Co-authored-by: sumpster
@nivibilla

I believe this PR also fixes #24245

@muellerzr (Contributor) left a comment:

Thanks!

@amyeroberts (Collaborator) left a comment:

Thanks for fixing!

Happy with the changes. My only concern is things breaking, as this relies on a feature that's only just been merged on accelerate's main branch. Is the next release of accelerate going to come before the next release of transformers?

@pacman100 @muellerzr You've been handling the switch to using accelerate in the trainer really well, and I can see our CI runs install accelerate from source, so I'm happy if this follows the pattern you've been using.

@muellerzr (Contributor)

@amyeroberts we coordinate Accelerate releases to be a day or two before transformers, so there shouldn't be an issue there :)

(Though @pacman100 we should do the version check like we've done before with these fixes 😬 )
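
For readers unfamiliar with the pattern: a minimal sketch of the kind of version guard being discussed, under the assumption that the fix lands in a future accelerate release (the helper name and the "0.21.0" threshold below are placeholders, not the PR's actual code):

```python
# Sketch of a version guard; the helper name and the minimum-version
# threshold are assumptions for illustration, not the PR's actual code.
import importlib.metadata

from packaging import version

ACCELERATE_MIN_VERSION = "0.21.0"  # placeholder for the release containing accelerate#1624

def accelerate_supports_epoch_boundary_fix() -> bool:
    try:
        installed = importlib.metadata.version("accelerate")
    except importlib.metadata.PackageNotFoundError:
        return False
    return version.parse(installed) >= version.parse(ACCELERATE_MIN_VERSION)

# Take the new code path only when the installed accelerate is recent
# enough; otherwise keep the previous behavior.
if accelerate_supports_epoch_boundary_fix():
    ...  # new behavior relying on accelerate#1624
else:
    ...  # previous behavior
```

Once the pinned accelerate version in transformers moves past that release, the guard can be dropped, as @sgugger notes below.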

@pacman100 mentioned this pull request on Jun 22, 2023
@sgugger (Collaborator) left a comment:

Thanks for the work on this! There need to be a couple of protections before this is merged (which we can remove at the next release of Accelerate, when we move the pinned version up).

(Two review threads on src/transformers/trainer.py, both resolved; one marked outdated.)
@Oxi84 commented Jun 22, 2023

Hello,

How do I install this? I expected that a PR means some kind of transformers update; in that case there should be an install link such as git+https://github.com/huggingface/transformers@de9255de27abfcae4a1f816b904915f0b1e23cd9

@pacman100 (Contributor, Author)

Hello @Oxi84, you can install this once it gets merged via `pip install git+https://github.com/huggingface/transformers` and `pip install git+https://github.com/huggingface/accelerate`.

@pacman100 merged commit c036c81 into main on Jun 23, 2023
@pacman100 deleted the smangrul/grad-acc-fix branch on Jun 23, 2023 at 12:13
@philpax commented Jun 23, 2023

Just completed a training run with this PR and can confirm that the issue didn't occur. Thanks for the fix!
