
Attention Grounding for GQA #67

Closed
pkhdipraja opened this issue Aug 13, 2020 · 3 comments

Comments

@pkhdipraja

Hello, I am interested in getting grounding results for GQA, but this doesn't seem to be supported at the moment. Is there a plan to support it in the future, or could you give a pointer on how to extend the current implementation to support it? (I am particularly interested in getting results for the MCAN model.)

Thank you.

@MIL-VLG
Collaborator

MIL-VLG commented Aug 13, 2020

If you mean the attention map visualization in the MCAN paper, we currently have no plan to add this function to openvqa, as we think it is not a generic functionality applicable to other VQA models. Some other third-party repos like this may be helpful for your requirements.

@pkhdipraja
Author

Not for the visualization; the attention weights themselves are useful for evaluating grounding on GQA for attention-based models.
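For anyone landing here with the same need: one generic way to get attention weights out of a PyTorch model without modifying openvqa's forward passes is a forward hook. This is only a sketch under assumptions; `ToyAttention` is a hypothetical stand-in for MCAN's actual attention units (whose module names in openvqa differ), and the `att_map` attribute is something you would have to add or locate in the real code.

```python
import torch
import torch.nn as nn

class ToyAttention(nn.Module):
    """Hypothetical single-head attention standing in for an MCAN attention unit."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):
        q, k, v = self.q(x), self.k(x), self.v(x)
        # scaled dot-product attention over the region features
        att = torch.softmax(q @ k.transpose(-2, -1) / x.shape[-1] ** 0.5, dim=-1)
        self.att_map = att.detach()  # stash the weights so a hook can read them
        return att @ v

captured = {}

def save_attention(module, inputs, output):
    # called after every forward pass; copies out the attention map
    captured["att"] = module.att_map

model = ToyAttention(dim=8)
model.register_forward_hook(save_attention)

x = torch.randn(1, 5, 8)  # (batch, num_objects, feature_dim)
_ = model(x)
# captured["att"] now holds a (1, 5, 5) attention map over the 5 regions,
# which could be scored against GQA's grounding annotations
```

For the real model you would iterate over `model.named_modules()`, match the attention submodules by name, and register the same hook on each.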

@MIL-VLG
Collaborator

MIL-VLG commented Aug 20, 2020

Got it. The additional annotation files needed to evaluate grounding performance are not provided for the test set, so this result can only be evaluated online.

@MIL-VLG MIL-VLG closed this as completed Aug 24, 2020