Replies: 1 comment
-
You can benchmark it. We provide some benchmarking utilities here: https://github.com/huggingface/diffusers/tree/main/benchmarks
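If you don't want to pull in the repo's utilities, a minimal stdlib timing harness is enough for a rough comparison. This is only a sketch: the `benchmark` helper and the commented-out pipeline call are illustrative, not part of the diffusers benchmarking API.

```python
import time
from statistics import mean


def benchmark(fn, warmup=3, runs=10):
    """Time a callable: discard warmup runs, return mean seconds per run."""
    for _ in range(warmup):  # warmup runs absorb one-time costs (compilation, caches)
        fn()
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return mean(times)


# Hypothetical usage with a diffusers pipeline (names are illustrative):
# pipe = StableDiffusionPipeline.from_pretrained("...").to("cuda")
# secs = benchmark(lambda: pipe("a photo of a cat", num_inference_steps=20))
```

Run the same callable on both PyTorch versions (same model, prompt, steps, and device) and compare the mean times; on GPU you would also want to synchronize the device inside the timed region before reading the clock.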
-
Today, PyTorch released version 2.2, which includes FlashAttention-2 for scaled_dot_product_attention.
Does it help speed up diffusers/Stable Diffusion?
https://pytorch.org/blog/pytorch2-2/?utm_content=280360799&utm_medium=social&utm_source=linkedin&hss_channel=lcp-78618366