@lucasjinreal Thanks for the feedback! I think you are right — with those settings it should print "from 2048 to 1024".
Can you print out the actual max_position_embeddings inside the function? I suspect Hugging Face may overwrite it at runtime (i.e., not use the 1024 you pass in).
@DachengLi1 I just exposed max_position_embeddings as a parameter in the monkey patch, so I'm not sure what happened. But if it was overwritten, does that mean my actual training context is 4096 (not the 2048 I expected)?
My minimal length was set to 1024, though.
Hi, I can't make sense of the printed message.
I set:
ratio=2, max_position_embeddings=1024
Since my GPU cannot fit even the minimal 2048, I thought this would expand the context from 1024 to 2048.
But I got a print like this:
Condensing Positional embeddings from 4096 to 2048
which I don't understand.
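One way to see how that message could arise: the condense trick divides position indices by `ratio`, so a model whose trained range is `max_position_embeddings` can cover `ratio * max_position_embeddings` tokens, and the log line presumably reports that product. A minimal sketch (the helper names and the exact log format here are assumptions, not the patch's actual code):

```python
def condensed_position_ids(seq_len: int, ratio: int) -> list[float]:
    # Condensed/interpolated positions: dividing each index by `ratio`
    # keeps ratio * max_position_embeddings tokens inside the range
    # the model was trained on. (Hypothetical helper for illustration.)
    return [i / ratio for i in range(seq_len)]

def condense_message(ratio: int, max_position_embeddings: int) -> str:
    # Assumed format of the log line in question: it reports the
    # effective (expanded) length followed by the trained length.
    return (f"Condensing Positional embeddings from "
            f"{ratio * max_position_embeddings} to {max_position_embeddings}")

print(condense_message(2, 1024))  # what the settings above should print
print(condense_message(2, 2048))  # what appears if HF resets the value to 2048
```

If Hugging Face silently resets max_position_embeddings to the model's default of 2048, then with ratio=2 this would print "from 4096 to 2048" — matching the observed message.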