You're right, the comment doesn't match the code. After a quick glance at the LoRA paper, I don't see an explicit mention of how LoRA should be initialized for embedding layers. When checking the reference implementation by Microsoft, they do, however, use the same scheme as we do:
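What that shared scheme looks like in practice is sketched below (a minimal, self-contained example; the names lora_A and lora_B and the shapes are illustrative, not the actual loralib or peft internals). For embedding layers, the A factor is zeroed and B gets a Gaussian init, so the adapter contributes exactly zero at the start of training:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative shapes (hypothetical, not taken from peft or loralib)
num_embeddings, embedding_dim, r = 10, 8, 4

# LoRA factors for an embedding layer, initialized per the scheme
# discussed above: A zeroed, B drawn from a normal distribution.
lora_A = nn.Parameter(torch.empty(r, num_embeddings))
lora_B = nn.Parameter(torch.empty(embedding_dim, r))
nn.init.zeros_(lora_A)
nn.init.normal_(lora_B)

# The LoRA delta for a batch of token ids: look up rows of A, project with B.
token_ids = torch.tensor([1, 2, 3])
after_A = F.embedding(token_ids, lora_A.T)  # (3, r), all zeros since A is zero
delta = after_A @ lora_B.T                  # (3, embedding_dim), all zeros
assert torch.all(delta == 0)                # adapter is a no-op at init
```

Either factor can be the zero one; as long as one of the two starts at zero, the initial LoRA update is a no-op, which is the property the initialization is after.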
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
System Info
The comment and the code are contradictory. Can anyone explain this to me?
Who can help?
@BenjaminBossan
Information
Tasks
An officially supported task in the examples folder
Reproduction
# initialize a the same way as the default for nn.linear and b to zero
nn.init.zeros_(self.lora_embedding_A[adapter_name])
nn.init.normal_(self.lora_embedding_B[adapter_name])
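For contrast, here is a sketch of the Linear case, which is what the copied comment actually describes: there, A gets the nn.Linear-style Kaiming init and B is the zeroed factor (again a minimal sketch with made-up shapes, not the actual peft code):

```python
import math
import torch
import torch.nn as nn

# Illustrative shapes (hypothetical)
in_features, out_features, r = 16, 8, 4

# Linear LoRA factors: A initialized like nn.Linear's default, B zeroed.
lora_A = nn.Parameter(torch.empty(r, in_features))
lora_B = nn.Parameter(torch.empty(out_features, r))
nn.init.kaiming_uniform_(lora_A, a=math.sqrt(5))  # nn.Linear's default scheme
nn.init.zeros_(lora_B)

# The delta B(Ax) is still zero at init, this time because B is zero.
x = torch.randn(2, in_features)
delta = (x @ lora_A.T) @ lora_B.T  # (2, out_features), all zeros
assert torch.all(delta == 0)
```

So the comment appears to have been copied from the Linear code path, where it is accurate, into the embedding code path, where the roles of A and B are swapped.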
Expected behavior
nil.