
Implement CustomOp Output Type Inference function #19906

Merged
merged 3 commits into main on Mar 18, 2024

Conversation

yuslepukhin (Member) commented Mar 13, 2024

Description

This change addresses the following issues with the current CustomOp output type inference:

  • The function does not take optional inputs into account. When an input is absent, inference is silently aborted and no output type is inferred (P1 customer issue).
  • For multi-kernel custom ops, the output type is inferred from the last kernel definition in the sequence; no attempt is made to match a kernel definition against the actual input types.
  • Inference is aborted when variadic inputs/outputs are detected and the generated input/output names fail to resolve type constraints. This is not immediately clear from the code, because the custom op schema is not available within the inference function.
  • There is no error reporting.

Motivation and Context

Most custom ops lack their own type and shape inference function, because the ability to register one was only recently introduced. For that reason, it is important to fix the default inference.
This change is inspired by a customer issue.

This is a follow-up change. The commits address the following issues:
- When an optional input is absent, the inference is aborted.
- No error reporting.
- The output type is inferred from the last KernelDef, and no KernelDef selection is made based on the argument types provided.
@yuslepukhin marked this pull request as ready for review March 14, 2024 18:31
@yuslepukhin requested a review from souptc March 14, 2024 18:32
@yuslepukhin merged commit a033df8 into main Mar 18, 2024
95 checks passed
@yuslepukhin deleted the yuslepukhin/rework_customop_typeinference branch March 18, 2024 17:28
TedThemistokleous pushed a commit to TedThemistokleous/onnxruntime that referenced this pull request May 7, 2024
This is a follow up on:
- microsoft#15184
- cbourjau/ort-custom-op#11
- microsoft/onnxruntime-extensions#451