Whisper v3 #560
Comments
We just need to wait for OpenAI to release it on Hugging Face, and then it will be easy to implement.
Wouldn't you just replace "large-v2" with "large-v3"?
large-v3 can be downloaded here: https://openaipublic.azureedge.net/main/whisper/models/e5b1a55b89c1367dacf97e3e19bfd829a01529dbfdeefa8caeb59b3f1b81dadb/large-v3.pt
Has anyone explored whether the large-v3 model can be downloaded dynamically, as is done for the rest of the models (instead of requiring a pre-download)? If possible, please point out which file or files in the repository would need to be modified to achieve this.
For use in whisperX, I think the large-v3 model needs to be converted to the "faster-whisper" format. It looks straightforward, but it is perhaps best done by the original authors, as something always goes wrong and requires some debugging.
There is already an open pull request in faster-whisper to support this (SYSTRAN/faster-whisper#548), including the model converted to the CTranslate2 format. Note that the mel feature size also increased from 80 to 128.
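That feature-size change is the part most likely to break downstream code: the mel spectrogram input grows from 80 to 128 bins. A minimal sketch of the shape difference (the 3000-frame figure is Whisper's fixed 30-second window, an assumption added here for illustration, not something stated in the thread):

```python
# Input feature shapes for Whisper checkpoints: large-v3 uses 128 mel bins,
# earlier checkpoints use 80. 3000 frames corresponds to Whisper's fixed
# 30-second audio window (assumed here for illustration).
N_FRAMES = 3000

def input_shape(model_name: str) -> tuple:
    n_mels = 128 if model_name == "large-v3" else 80
    return (n_mels, N_FRAMES)

print(input_shape("large-v2"))  # (80, 3000)
print(input_shape("large-v3"))  # (128, 3000)
```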
Looks like it has been added to Hugging Face: https://huggingface.co/openai/whisper-large-v3
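Once it is on the Hub, plain transformers can load it directly. A sketch, assuming a recent transformers release; the pipeline call downloads the ~3 GB checkpoint on first use, so only the setup is defined here:

```python
# Sketch: loading the Hub checkpoint with the transformers pipeline API.
# Assumes a recent transformers release (older tokenizer code errors on
# the new language token, as noted in this thread).
from transformers import pipeline

MODEL_ID = "openai/whisper-large-v3"

def build_asr():
    # First call downloads the checkpoint (~3 GB) from the Hub.
    return pipeline("automatic-speech-recognition", model=MODEL_ID)

# Usage (hypothetical audio file): text = build_asr()("audio.mp3")["text"]
```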
I tried to run the above and got the following error:
File "/ext3/miniconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2065, in _from_pretrained
Has anyone else seen this?
OpenAI added a new language, so there is a new token.
I got the Hugging Face large-v3 working by upgrading the transformers package; apparently there is new tokenization code (sigh). However, I don't think there is a new version of faster-whisper yet. When there is, can we just get it with a pip install whisperx --upgrade type of command, or must we upgrade the faster_whisper package manually ourselves?
There are already 2 PRs in faster-whisper, but the official maintainer is no longer providing direct support, so we have to wait for another one: https://github.com/guillaumekln/faster-whisper/pulls
My use case (if anyone has any insight on whether I can manually update
Wait for the update.
CTranslate2 (the fast inference engine used by faster-whisper) has been successfully updated to support Whisper large-v3.
@MahmoudAshraf97 Because when I just tried it, I first encountered this problem: #444
@s-h-a-d-o-w I thoroughly tested it before submitting this PR, upgrade
large-v3 now works for me, but I did have to do a force-reinstall in addition to the upgrade.
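Putting the upgrade steps reported in this thread together, one possible sequence looks like the following; the git URL for whisperX is an assumption, so adjust it to however you originally installed the package:

```shell
# Sketch of the upgrade path reported above (assumes pip and network access):
pip install --upgrade transformers                      # new tokenization code
pip install --upgrade --force-reinstall faster-whisper  # force-reinstall was needed
# whisperX itself (install source is an assumption -- adjust as needed):
# pip install --upgrade git+https://github.com/m-bain/whisperX.git
```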
Dear friend, could I ask how you install and use it after downloading the large-v3.pt?
You don't need to download it; you can just refer to it as the "large-v3" model the same way you do for "medium", "large", or "large-v2" once the code is updated. It should download in the background.
How do I use the new large-v3 model with whisperX?
Do I just delete the current large model and download it again, or is an update needed?
Thanks