Question about Kernel Fusion in XLA #18344
Unanswered
southfreebird asked this question in General
I have a question about kernel fusion in XLA. In a simple model, I've implemented two flax.linen.LayerNorm layers. During compilation, I noticed that two separate fused kernels were generated, which results in four large tensors being copied through HBM. This doesn't appear to be optimal for my use case, so I'd like to ask whether there is a flag or parameter that would let me change this behavior.
Code
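The collapsed code block above is not reproduced here; the following is a minimal sketch of what such a model might look like, assuming a toy module (`TwoLayerNorms`), illustrative shapes, and the standard JAX way of inspecting the optimized HLO. Each `fusion` computation in the printed text corresponds to one generated kernel.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class TwoLayerNorms(nn.Module):
    """Toy model with two consecutive LayerNorm layers (illustrative)."""

    @nn.compact
    def __call__(self, x):
        x = nn.LayerNorm()(x)  # compiled into one fused kernel
        x = nn.LayerNorm()(x)  # compiled into a second fused kernel
        return x

model = TwoLayerNorms()
x = jnp.ones((8, 4096))  # illustrative batch and feature sizes
params = model.init(jax.random.PRNGKey(0), x)

# Lower and compile the model, then print the optimized HLO.
# Each `fusion` computation in the output is one kernel; the tensors
# flowing between the two fusions are the HBM round-trips in question.
compiled = jax.jit(model.apply).lower(params, x).compile()
print(compiled.as_text())
```

Printing the optimized HLO this way makes it easy to check whether any compiler option you try actually changes the fusion decisions.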
Your assistance in resolving this matter would be greatly appreciated. Thank you.