In this issue I found that FA3 doesn't support the L40. In my case, I'm running an FP8 model with vLLM, and I'm curious whether Flash-Attention 2 uses FP8 for inference on the L40 (48 GB). A quick check of what my setup reports is shown below.
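Here is a minimal sketch (not from this issue) for checking why FA3 is unavailable on this card: it reads the GPU's compute capability with PyTorch. FlashAttention-3 currently targets Hopper-class GPUs (SM 9.0), while the L40 is Ada Lovelace (SM 8.9); on pre-Hopper GPUs vLLM falls back to FlashAttention-2, which runs attention in FP16/BF16 rather than FP8. The printed messages are illustrative, not vLLM output.

```python
import torch

# Sketch: inspect the GPU's compute capability to see whether FA3 could apply.
# FlashAttention-3 (and its FP8 attention path) targets Hopper (SM 9.0+);
# the L40 reports SM 8.9 (Ada Lovelace).
major, minor = torch.cuda.get_device_capability()
print(f"GPU: {torch.cuda.get_device_name()} (compute capability {major}.{minor})")

if (major, minor) >= (9, 0):
    print("Hopper-class GPU: FlashAttention-3 may be usable here.")
else:
    print("Pre-Hopper GPU (e.g. L40, SM 8.9): FlashAttention-3 is not supported; "
          "FlashAttention-2 would be used instead, running in FP16/BF16.")
```

Note that even with an FP8-quantized model, the FP8 savings on a pre-Hopper card would come from the weight/GEMM path rather than from the attention kernel itself.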