Add fusion script for segment anything v2 (#22167)
### Description

* Add MultiHeadAttention fusion for SAM2.
* Add LayerNormalization fusion for the NCHW format by inserting a Transpose from NCHW to NHWC before the layer normalization, and another Transpose after it to convert NHWC back to NCHW. Hopefully those extra Transpose nodes will be removed later when prefer_nhwc is enabled. (A sketch of the pattern follows this section.)
* Add a condition that the input must be 3D when fusing SkipLayerNormalization. (See the rank-check sketch below.)
* Update convert_to_onnx.py with `--optimize` and `--use_gpu` options to output optimized ONNX models for the CPU/CUDA execution providers.
* Add a `--dtype fp16|fp32` option to convert_to_onnx.py to support converting the optimized model to float16. (See the optimization sketch below.)
* Update the demo to use the optimized ONNX models.

### Motivation and Context

To support optimization of the SAM2 models exported in #22119 for the CPU/CUDA execution providers.
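As an illustration of the NCHW LayerNormalization pattern, here is a minimal sketch built with the `onnx.helper` API. It is not the actual fusion code in this commit; the node and tensor names are made up.

```python
# Minimal sketch of the Transpose -> LayerNormalization -> Transpose pattern.
# Node and tensor names are illustrative, not taken from the real fusion.
from onnx import helper

def wrap_layernorm_nchw(scale_name: str, bias_name: str):
    # NCHW -> NHWC, so the channel axis becomes the last axis.
    pre = helper.make_node(
        "Transpose",
        inputs=["nchw_input"],
        outputs=["nhwc_input"],
        perm=[0, 2, 3, 1],
        name="Transpose_before_layernorm",
    )
    # LayerNormalization normalizes over the last axis (axis=-1 by default).
    norm = helper.make_node(
        "LayerNormalization",
        inputs=["nhwc_input", scale_name, bias_name],
        outputs=["nhwc_output"],
        name="LayerNormalization_nhwc",
    )
    # NHWC -> NCHW, restoring the layout that downstream nodes expect.
    post = helper.make_node(
        "Transpose",
        inputs=["nhwc_output"],
        outputs=["nchw_output"],
        perm=[0, 3, 1, 2],
        name="Transpose_after_layernorm",
    )
    return [pre, norm, post]
```

When prefer_nhwc is enabled later, back-to-back Transpose pairs like these can cancel out, which is why the description expects them to be removed.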
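The SkipLayerNormalization condition amounts to a rank check. A hypothetical sketch of the idea (the helper name is invented, not the optimizer's API):

```python
def can_fuse_skip_layer_norm(input_shape) -> bool:
    # SkipLayerNormalization assumes a 3D (batch, sequence, hidden) input,
    # so skip the fusion when the shape is unknown or has a different rank.
    return input_shape is not None and len(input_shape) == 3
```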
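For the new convert_to_onnx.py options, a hedged sketch of what an `--optimize --use_gpu --dtype fp16` path could look like with the onnxruntime transformers optimizer. The file names are placeholders, and this is not the script's actual code:

```python
from onnxruntime.transformers import optimizer

# Fuse attention/layer-norm patterns; use_gpu selects CUDA-friendly fusions.
# model_type selection is omitted here; this is a generic sketch.
opt_model = optimizer.optimize_model(
    "sam2_image_encoder.onnx",  # placeholder input path
    use_gpu=True,
)
# --dtype fp16: convert weights and ops to float16, keeping graph
# inputs/outputs in float32 for compatibility.
opt_model.convert_float_to_float16(keep_io_types=True)
opt_model.save_model_to_file("sam2_image_encoder_fp16.onnx")
```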
Showing 10 changed files with 1,085 additions and 106 deletions.
onnxruntime/python/tools/transformers/fusion_attention_sam2.py: 534 additions & 0 deletions (large diff not rendered by default)