Issue:
I fine-tuned llava_mistral 1.6 with LoRA. The model loads and generates normally when trained for epoch=1, but it produces no output when trained for epoch=10. Has anyone encountered the same issue? How can I troubleshoot and resolve it?
Command:

Log:
```
/root/miniconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:392: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
  warnings.warn(
/root/miniconda3/envs/llava/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:397: UserWarning: `do_sample` is set to `False`. However, `top_p` is set to `None` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
  warnings.warn(
The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
Setting `pad_token_id` to `eos_token_id`:2 for open-end generation.
```
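These warnings come from `transformers` and are usually harmless, but they can be silenced by making the generation call explicit. Below is a minimal sketch of such a call; the `tokenizer`, `model`, and `prompt` names are placeholders (not from the report above), and it only illustrates the text-side arguments, omitting the image inputs that LLaVA's generation path also requires:

```python
# Hypothetical sketch: pass explicit generation arguments so transformers
# does not warn about unused sampling flags or a missing attention mask.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],  # silences the attention-mask warning
    do_sample=False,                          # greedy decoding; leave temperature/top_p unset
    max_new_tokens=256,
    pad_token_id=tokenizer.eos_token_id,      # explicit, instead of the implicit fallback
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```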
Screenshots:

epoch=10:
![screenshot-20241116-110518](https://github.com/user-attachments/assets/865ab513-e54d-4f11-bfff-deed11a028ef)

epoch=1:
![screenshot-20241116-110552](https://github.com/user-attachments/assets/73a805aa-1108-49fc-876c-92dbdbb97853)
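A common cause of empty output after many LoRA epochs is an over-fitted adapter that emits the EOS token immediately. One way to check this (a sketch, assuming an already loaded `model`, `tokenizer`, and `inputs` as above; not something from the original report) is to decode without skipping special tokens and inspect the raw generated IDs:

```python
# Hypothetical diagnostic: look at the raw generated token IDs to see
# whether the fine-tuned model produces only EOS/pad tokens ("no output").
output_ids = model.generate(**inputs, max_new_tokens=32)
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]  # tokens after the prompt

print("raw ids:", new_tokens.tolist())
print("decoded:", tokenizer.decode(new_tokens, skip_special_tokens=False))
```

If the raw IDs are only `eos_token_id` (2, matching the log line above), the epoch=10 adapter has collapsed to end-of-sequence; reducing the number of epochs or the LoRA learning rate is a common remedy.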