
Support the merge of decoder without/with past for encoder-decoder models in the ONNX export #926

Merged

Conversation

fxmarty (Contributor) commented Mar 27, 2023

As per title, will need #924 to be merged first.

Next PR: support this in ORTModel as well.
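For context, "merging" here means exporting a single ONNX decoder that contains both the without-past and with-past computation paths, selected at runtime by a boolean `use_cache_branch` input. A minimal sketch of how the input feed for such a merged decoder might be assembled at generation time (the helper and all input names other than `use_cache_branch` are illustrative, not Optimum's actual API):

```python
import numpy as np

def build_merged_decoder_feeds(decoder_inputs, past_key_values=None):
    """Assemble inputs for a merged ONNX decoder (illustrative sketch).

    The merged graph contains both the no-past and with-past branches and
    picks one based on the boolean `use_cache_branch` input.
    """
    feeds = dict(decoder_inputs)
    use_cache = past_key_values is not None
    # A 1-element boolean tensor selects the branch inside the merged graph.
    feeds["use_cache_branch"] = np.array([use_cache], dtype=bool)
    if use_cache:
        feeds.update(past_key_values)  # e.g. past_key_values.0.decoder.key, ...
    return feeds

# First generation step: no past key/values, so the no-cache branch is taken.
first = build_merged_decoder_feeds({"input_ids": np.array([[0]])})
# Later steps reuse cached key/values and take the cache branch.
later = build_merged_decoder_feeds(
    {"input_ids": np.array([[42]])},
    past_key_values={"past_key_values.0.decoder.key": np.zeros((1, 8, 1, 64))},
)
```

This is the inference-side counterpart of what the export needs to produce: one graph, two branches, one boolean switch.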

HuggingFaceDocBuilderDev commented Mar 27, 2023

The documentation is not available anymore as the PR was closed or merged.

michaelbenayoun (Member) left a comment

LGTM

Comment on lines +624 to +627

```python
if self.is_merged is True and self.use_cache_branch is True:
    reference_model_inputs["use_cache_branch"] = DummyInputGenerator.constant_tensor(shape=[1], value=True)
elif self.is_merged is True and self.use_cache_branch is False:
    reference_model_inputs["use_cache_branch"] = DummyInputGenerator.constant_tensor(shape=[1], value=False)
```

Suggested change

```diff
-if self.is_merged is True and self.use_cache_branch is True:
-    reference_model_inputs["use_cache_branch"] = DummyInputGenerator.constant_tensor(shape=[1], value=True)
-elif self.is_merged is True and self.use_cache_branch is False:
-    reference_model_inputs["use_cache_branch"] = DummyInputGenerator.constant_tensor(shape=[1], value=False)
+if self.is_merged:
+    reference_model_inputs["use_cache_branch"] = DummyInputGenerator.constant_tensor(shape=[1], value=self.use_cache_branch)
```

fxmarty (Contributor, Author) commented Mar 27, 2023

edit: actually this is less explicit
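One reason the explicit `is True`/`is False` pair may have been kept: the two forms diverge when `self.use_cache_branch` is neither `True` nor `False` (e.g. `None`), since the original adds no input at all while the collapsed form would pass the non-boolean value through. A quick stand-alone check, with `constant_tensor` stubbed in place of `DummyInputGenerator.constant_tensor`:

```python
# Stub standing in for DummyInputGenerator.constant_tensor, for comparison only.
def constant_tensor(shape, value):
    return {"shape": shape, "value": value}

def original(is_merged, use_cache_branch):
    """The explicit if/elif pair from the PR."""
    inputs = {}
    if is_merged is True and use_cache_branch is True:
        inputs["use_cache_branch"] = constant_tensor(shape=[1], value=True)
    elif is_merged is True and use_cache_branch is False:
        inputs["use_cache_branch"] = constant_tensor(shape=[1], value=False)
    return inputs

def suggested(is_merged, use_cache_branch):
    """The reviewer's collapsed one-branch form."""
    inputs = {}
    if is_merged:
        inputs["use_cache_branch"] = constant_tensor(shape=[1], value=use_cache_branch)
    return inputs

# Equivalent whenever the flag is a real boolean...
for merged in (True, False):
    for cache in (True, False):
        assert original(merged, cache) == suggested(merged, cache)

# ...but not when the flag is None: original adds nothing, suggested adds value=None.
assert original(True, None) != suggested(True, None)
```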

optimum/exporters/onnx/base.py — review thread resolved (outdated)
optimum/exporters/onnx/base.py — review thread resolved
@fxmarty force-pushed the support-encoder-decoder-merge-in-onnx-export branch from 9005998 to 8cede08 on March 28, 2023 at 08:20
@fxmarty force-pushed the support-encoder-decoder-merge-in-onnx-export branch from 8cede08 to afb4220 on March 28, 2023 at 08:21