[Bug] Converter doesn't convert Whisper model to ONNX #65

Closed
DimQ1 opened this issue Apr 1, 2023 · 1 comment
Labels: bug (Something isn't working)

Comments

DimQ1 commented Apr 1, 2023

Describe the bug

The converter fails to convert the Whisper model to ONNX. It does not work for non-*.en models (e.g. openai/whisper-tiny, as opposed to whisper-tiny.en).

How to reproduce
python ./scripts/convert.py --model_id openai/whisper-tiny --from_hub --quantize --task speech2seq-lm-with-past

Result:

Merging decoders
Traceback (most recent call last):
  File "D:\Users\Dimq1\source\OpenAI\transformers.js\scripts\convert.py", line 301, in <module>
    main()
  File "D:\Users\Dimq1\source\OpenAI\transformers.js\scripts\convert.py", line 293, in main
    merge_decoders(
  File "C:\Users\Dimq1\AppData\Local\Programs\Python\Python310\lib\site-packages\optimum\onnx\graph_transformations.py", line 135, in merge_decoders
    _unify_onnx_outputs(decoder, decoder_with_past)
  File "C:\Users\Dimq1\AppData\Local\Programs\Python\Python310\lib\site-packages\optimum\onnx\transformations_utils.py", line 147, in _unify_onnx_outputs
    _check_num_outputs(model1, model2)
  File "C:\Users\Dimq1\AppData\Local\Programs\Python\Python310\lib\site-packages\optimum\onnx\transformations_utils.py", line 136, in _check_num_outputs
    raise ValueError(
ValueError: Two model protos need to have the same outputs. But one has 18 outputs while the other has 10 outputs.

Expected behavior

The model is converted to ONNX (and quantized) without errors.

DimQ1 added the bug label on Apr 1, 2023
DimQ1 changed the title from "[Bug] Title goes here." to "[Bug] Converter doesn't convert Whisper model to ONNX" on Apr 1, 2023

xenova (Collaborator) commented Apr 1, 2023

Hi! Yes, this was a bug in optimum, but it's fixed now (dev branch). See here for how to fix it: #63 (comment)

You must also use the latest conversion script (which passes strict=False when merging the decoders).
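
For reference, a minimal sketch of that merging step with the strict flag, assuming optimum's dev branch; only strict=False is confirmed in this thread, and the file names and the save_path keyword are illustrative assumptions, not copied from convert.py:

# Sketch of the decoder-merging step performed by the conversion script.
# The import path matches the traceback above; file names are placeholders.
from optimum.onnx.graph_transformations import merge_decoders

merge_decoders(
    "onnx/decoder_model.onnx",            # decoder exported without past key/values
    "onnx/decoder_with_past_model.onnx",  # decoder exported with past key/values
    save_path="onnx/decoder_model_merged.onnx",  # assumed keyword for the output file
    strict=False,  # tolerate the output-count mismatch instead of raising ValueError
)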

Alternatively, you can just use the model I already converted: https://huggingface.co/Xenova/transformers.js/tree/main/quantized/openai/whisper-tiny
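
If you'd rather pull those pre-converted files down locally, here is a hedged sketch using huggingface_hub; the repo and folder come from the link above, and the download approach is just one way to fetch them:

# Sketch: download only the quantized whisper-tiny files from the
# Xenova/transformers.js repo. Requires `pip install huggingface_hub`.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Xenova/transformers.js",
    allow_patterns=["quantized/openai/whisper-tiny/*"],  # skip the rest of the repo
)
print(local_dir)  # local cache folder containing quantized/openai/whisper-tiny/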

DimQ1 closed this as completed on Apr 3, 2023