Hi, thanks for your work!
I ran into a problem when I use 8 as the batch size. This is the error information:
Traceback (most recent call last):
  File "/remote-home/share/jzhan/jzhan/WhisperASR2.py", line 85, in <module>
    res_list = model.transcribe(audio_data_list, language="zh")
  File "/remote-home/share/jzhan/jzhan/batch_whisper/transcribe.py", line 75, in transcribe
    return batch_transcribe(model=model,
  File "/remote-home/share/jzhan/jzhan/batch_whisper/transcribe.py", line 477, in batch_transcribe
    results: List[DecodingResult] = decode_with_fallback(torch.stack(batch_segments))
  File "/remote-home/share/jzhan/jzhan/batch_whisper/transcribe.py", line 385, in decode_with_fallback
    decode_result = model.decode(segment, options)
  File "/remote-home/jzhan/miniconda3/envs/whisper/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/remote-home/share/jzhan/jzhan/batch_whisper/decoding.py", line 860, in decode
    result = DecodingTask(model, options).run(mel)
  File "/remote-home/jzhan/miniconda3/envs/whisper/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/remote-home/share/jzhan/jzhan/batch_whisper/decoding.py", line 772, in run
    tokens, sum_logprobs, no_speech_probs = self._main_loop(audio_features, tokens)
  File "/remote-home/share/jzhan/jzhan/batch_whisper/decoding.py", line 692, in _main_loop
    probs_at_sot.append(logits[:, self.sot_index[i]].float().softmax(dim=-1))
IndexError: index 224 is out of bounds for dimension 1 with size 3
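For anyone debugging this: the error says `logits` has only 3 positions along dimension 1 at the point where `self.sot_index[i]` (224) is used to index it, so the stored SOT position no longer fits the tensor actually produced at that step. The sketch below is only an illustration of that mismatch with made-up shapes (batch size, vocab size, and the cause being a shrunken token dimension are my assumptions, not confirmed from the fork's code); it reproduces the same class of IndexError using NumPy:

```python
import numpy as np

# Illustrative shapes (assumptions, not taken from the real model):
# suppose a later decoding step only produces logits for a few recent
# token positions, while sot_index still records the SOT position in
# the full original prompt.
batch, vocab = 6, 51865
logits = np.zeros((batch, 3, vocab))  # token dimension is now 3
sot_index = 224                       # SOT position in the original prompt

try:
    probs_at_sot = logits[:, sot_index]  # same indexing pattern as _main_loop
except IndexError as e:
    # the stored index no longer fits the shrunken token dimension
    print(e)
```

If that diagnosis is right, the fix would be to only read `logits[:, self.sot_index[i]]` on the step where the full prompt is still present, or to adjust the index for the truncated sequence.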
I'm getting the exact same issue. It seems to happen with longer files, but I can't pin it down.
Sometimes I can do 18 files, but other times I get this error, "IndexError: index 224 is out of bounds for dimension 1 with size 3", with only 6 files.
Running on Windows 10 with an RTX 3060 (12 GB VRAM).
I'm continuing to test to find the threshold where it breaks. I'll post updates.
This is an awesome fork, probably the most useful I've seen so far, not sure why it's not getting more attention!