
CoreML: Add ML Program Concat #21423

Closed · wants to merge 11 commits
Conversation

@vraspar (Contributor) commented on Jul 20, 2024

Description

Add ML Program support for the Concat operator to the CoreML EP.

Motivation and Context

Support priority models
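
As a rough illustration of the target op (not code from this PR), the sketch below uses coremltools' MIL builder to create a tiny ML Program whose only operation is `concat`, i.e. the MIL op that ONNX `Concat` maps onto; the coremltools usage here is an assumption for illustration only.

```python
# Hedged sketch, not part of this PR: a minimal ML Program containing the MIL
# "concat" op, built with coremltools' MIL builder (pip install coremltools).
from coremltools.converters.mil import Builder as mb

@mb.program(input_specs=[mb.TensorSpec(shape=(1, 2, 4)),
                         mb.TensorSpec(shape=(1, 3, 4))])
def concat_prog(x, y):
    # Concatenate the two inputs along axis 1, mirroring ONNX Concat semantics.
    return mb.concat(values=(x, y), axis=1)

# Printing the program shows the MIL (ML Program) text, including the concat op.
print(concat_prog)
```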

@vraspar (Contributor, Author) commented:

The Concat op needs the same changes as ConvTranspose. The model.mm changes are the same as in #21416.

@vraspar requested review from skottmckay and edgchen1 on Jul 20, 2024 at 00:33
@vraspar changed the title from "Vraspar/mlprogram concat" to "CoreML: Add ML Program Concat" on Jul 20, 2024
@vraspar marked this pull request as ready for review on Jul 22, 2024 at 17:43
skottmckay added a commit that referenced this pull request Jul 24, 2024
- Add Concat (#21423)
- Add DepthToSpace (#21426)
- Add LeakyRelu (#21453)
- Add test scripts (#21427)
- Add ability to set coreml flags from python (#21434)

Also updated partitioning utils to support dropping constant initializers from a ComputeCapability's inputs. We copy these into the CoreML model, so we don't need the originals; if they remain as inputs, ORT can't free them because they appear to be in use.

Misc changes
- Fix SkipLayerNormFusion incorrectly setting `modified`, which causes unnecessary loops of the L2 transformers
@skottmckay (Contributor) commented:
Included in #21472

@skottmckay closed this on Jul 25, 2024
skottmckay added a commit that referenced this pull request Jul 25, 2024
…#21472)

### Description
Add these changes in one PR to simplify check-in:
- Add Concat (#21423)
- Add DepthToSpace (#21426)
- Add LeakyRelu (#21453)
- Add test scripts (#21427)
- Add ability to set coreml flags from python (#21434)


Other changes
- Updated partitioning utils to support dropping constant initializers from a ComputeCapability's inputs.
  - The list of inputs to the CoreML model was unexpectedly long because of this.
  - We copy constant initializers into the CoreML model, so we don't need the originals; if they remain as inputs, ORT can't free them because they appear to be in use. (See the sketch after this list.)
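
To make the initializer-dropping idea concrete, here is a minimal Python sketch of the concept (an assumption for illustration, not the actual C++ partitioning-utils change): given an ONNX model and the input names a fused subgraph would expose, it filters out names that are constant initializers, since the EP keeps its own copy of those weights. The helper name `drop_constant_initializer_inputs` is hypothetical.

```python
# Hypothetical sketch of the idea, not ORT's partitioning code: inputs of a
# fused subgraph that are constant initializers can be dropped, because the
# CoreML model already holds a copy and keeping them pins the originals in ORT.
import onnx

def drop_constant_initializer_inputs(model_path: str, subgraph_inputs: list[str]) -> list[str]:
    graph = onnx.load(model_path).graph
    initializer_names = {init.name for init in graph.initializer}
    # Keep only true runtime inputs; constant weights are handled by the EP.
    return [name for name in subgraph_inputs if name not in initializer_names]
```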
