Hello --

Really enjoyed the paper. One clarifying question: you add the `target_embedding` to the query point embedding here: https://github.com/ZrrSkywalker/Personalize-SAM/blob/main/per_segment_anything/modeling/transformer.py#L94

but you don't fine-tune the model. Do you have an intuition for why that works? Is it basically that the `TwoWayAttentionBlock` is now computing attention based on the "average" of the similarity between `points <-> image embeddings` and `target_embedding <-> image embeddings`?

Thanks!
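For what it's worth, the "average of similarities" intuition can be checked with a tiny NumPy sketch: because the dot product is linear in the query, adding `target_emb` to the point query makes the pre-softmax attention logits the *sum* of the point's and the target's similarities to each image token. (This is a simplified illustration, not the Personalize-SAM code; the names `point_emb`, `target_emb`, and `image_emb` are made up, and a learned query/key projection would preserve the same linearity.)

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                  # embedding dimension
point_emb = rng.normal(size=d)         # query point embedding
target_emb = rng.normal(size=d)        # injected target embedding
image_emb = rng.normal(size=(16, d))   # 16 image-token keys

# Attention logits for the combined query...
combined_logits = image_emb @ (point_emb + target_emb)

# ...decompose into the two separate similarity scores,
# since q -> k.q is linear in q:
separate_sum = image_emb @ point_emb + image_emb @ target_emb

assert np.allclose(combined_logits, separate_sum)
```

So before the softmax the logits are literally point↔image similarity plus target↔image similarity (a sum rather than an average, up to scaling), which may be why the frozen attention weights still do something sensible with the modified query.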