Hello, if I want a global attention map, how should I do that? Your demo only shows the attention map for a single grid cell.
Hello, have you solved this yet? I found that the visualizer shows the attention map of one attention head in one transformer layer: attn_map[i] has shape (1, 12, 129, 129), and grid_index is an int from 0 to 128. In the function in visualizer.py, attention_map is a vector of shape (129,). If grid_index is 0, does that give the CLS token's attention, and can it be treated as a global attention map? I also want a global attention map but haven't found a reasonable way to get one. Any advice would be appreciated.
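(Not from the maintainers, just a sketch of one possible approach.) Assuming attn is the (1, 12, 129, 129) tensor described above, where index 0 is the CLS token and indices 1–128 are the image patches, a rough "global" map can be built either from the CLS token's attention row or by averaging the attention received by each patch over all query positions. The function name global_attention_map and the grid_h/grid_w arguments below are made up for illustration; the actual patch-grid shape depends on the model and input size.

```python
import numpy as np

def global_attention_map(attn, grid_h=8, grid_w=16, use_cls=True):
    """Collapse a (1, num_heads, 129, 129) attention tensor into a 2D map.

    Assumes token 0 is CLS and tokens 1..128 are patches laid out on a
    grid_h x grid_w grid (adjust to your model's actual patch grid).
    """
    attn = attn[0].mean(axis=0)                  # average over heads -> (129, 129)
    if use_cls:
        # Row 0: how strongly the CLS token attends to every patch.
        patch_attn = attn[0, 1:]                 # drop the CLS->CLS entry -> (128,)
    else:
        # Alternative: attention each patch receives, averaged over all queries.
        patch_attn = attn[:, 1:].mean(axis=0)    # -> (128,)
    patch_attn = patch_attn / (patch_attn.max() + 1e-8)  # normalize for display
    return patch_attn.reshape(grid_h, grid_w)
```

The resulting 2D array can be upsampled to the input resolution and overlaid on the image. For a map that accounts for all layers rather than a single one, attention rollout (multiplying the per-layer attention matrices) is a common alternative, but that is beyond what the visualizer exposes directly.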