Add option to use flash attention #222
Conversation
Signed-off-by: Walter Hugo Lopez Pinaya <[email protected]>
@ericspod After adding the memory-efficient attention option to the model, I started to get the following error when running the unit tests:

It looks like a problem with the dependency and TorchScript. How would you suggest dealing with this issue?
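For context, "flash" or memory-efficient attention computes exactly the same output as standard scaled dot-product attention, but streams over the keys/values in chunks with an online softmax, so the full attention-score matrix is never materialized. Below is a minimal NumPy sketch of that idea; it is purely illustrative and independent of the actual implementation in this PR (which relies on an external dependency rather than hand-rolled code). All function names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def standard_attention(q, k, v):
    # materializes the full (seq_q x seq_k) score matrix
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def chunked_attention(q, k, v, chunk=2):
    # same result, but keys/values are processed in chunks using an
    # online softmax: only O(chunk) scores live in memory at a time
    d = q.shape[-1]
    m = np.full(q.shape[0], -np.inf)            # running row-wise max
    l = np.zeros(q.shape[0])                    # running softmax denominator
    acc = np.zeros((q.shape[0], v.shape[-1]))   # running weighted sum of values
    for s in range(0, k.shape[0], chunk):
        kc, vc = k[s:s + chunk], v[s:s + chunk]
        scores = q @ kc.T / np.sqrt(d)
        m_new = np.maximum(m, scores.max(axis=-1))
        scale = np.exp(m - m_new)               # rescale previous partial sums
        p = np.exp(scores - m_new[:, None])
        l = l * scale + p.sum(axis=-1)
        acc = acc * scale[:, None] + p @ vc
        m = m_new
    return acc / l[:, None]
```

The TorchScript failure discussed here is a common pattern: scripting traces every reachable code path, so even an optional import or a call guarded by a runtime flag can break compilation if the dependency's ops are not scriptable.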
Where do you call it?
…tion-to-the-diffusion-unet

# Conflicts:
#	generative/networks/nets/diffusion_model_unet.py
#	requirements-dev.txt
Signed-off-by: Walter Hugo Lopez Pinaya <[email protected]>
Fixes #210