
[fix] VisualBERT returns attention tuple #1036 #1052

Open · wants to merge 1 commit into main

Conversation


@abhinav-bohra commented Aug 22, 2021

PROBLEM: The default value of output_attentions in the forward() definition of BertEncoderJit (in mmf/modules/hf_layers.py) is False. So even if the user/developer specifies output_attentions = True in the config, the default False takes precedence, and VisualBERT returns an empty tuple for attentions.

FIX: Change the default of output_attentions in BertEncoderJit's forward() definition to None, and fall back to self.output_attentions when no value is passed (i.e., when it is None). output_attentions then takes the value of self.output_attentions, which was initialized from the config when the BertEncoderJit class was instantiated.

The issue with output_hidden_states was the same, and it was fixed in a similar way.
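A minimal sketch of the pattern described above (method bodies abbreviated; the real class in mmf/modules/hf_layers.py has more arguments):

from typing import Optional
from torch import Tensor, nn

class BertEncoderJit(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Flags are initialized from the config at instantiation time
        self.output_attentions = config.output_attentions
        self.output_hidden_states = config.output_hidden_states

    def forward(
        self,
        hidden_states: Tensor,
        attention_mask: Optional[Tensor] = None,
        # Before the fix these defaulted to False, silently overriding the config
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
    ):
        # Fall back to the config-derived values when no explicit flag is passed
        if output_attentions is None:
            output_attentions = self.output_attentions
        if output_hidden_states is None:
            output_hidden_states = self.output_hidden_states
        ...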

Tested locally.

@facebook-github-bot (Contributor)

Hi @abhinav-bohra!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@facebook-github-bot added the CLA Signed label Aug 22, 2021
@facebook-github-bot (Contributor)

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!


@apsdehal (Contributor) left a comment


Thanks for making these changes and contributing to MMF. Can you write a simple test to verify that this actually works? The test should go in tests/models/test_visual_bert.py.
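For reference, a hypothetical sketch of such a test (import path, constructor arguments, forward signature, and return values are assumptions; the real fixtures in tests/models/test_visual_bert.py may differ):

import unittest
import torch
from transformers import BertConfig
from mmf.models.visual_bert import VisualBERTBase  # assumed import path

class TestVisualBertAttentions(unittest.TestCase):
    def test_output_attentions_not_empty(self):
        # Enable attentions through the config, the path the reported bug broke
        config = BertConfig.from_pretrained("bert-base-uncased", output_attentions=True)
        model = VisualBERTBase(config, output_attentions=True)
        model.eval()
        input_ids = torch.randint(0, config.vocab_size, (1, 8))
        with torch.no_grad():
            # Assumed to return (sequence_output, pooled_output, attentions)
            _, _, attentions = model(input_ids)
        # Before the fix this came back as an empty tuple even with the flag set
        self.assertGreater(len(attentions), 0)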


@abhinav-bohra (Author) commented Aug 30, 2021

> Thanks for making these changes and contributing to MMF. Can you write a simple test to verify that this actually works? The test should go in tests/models/test_visual_bert.py.

Thanks for your feedback, @apsdehal. I will add the tests.

I had one doubt: there are three places where the user can set output_attentions, namely:

  1. In config
  2. During instantiation of the model
  3. During the forward pass

For example:

from transformers import BertConfig
from mmf.models.visual_bert import VisualBERTBase  # assumed import path

config = BertConfig.from_pretrained('bert-base-uncased', output_attentions=True)  # 1. in config
model = VisualBERTBase(config, output_attentions=False)                           # 2. at instantiation
y, z, attn = model.forward(x, output_attentions=False)  # 3. in the forward pass; x is an input batch

Should the priority be: forward > model > config?
That is, even if output_attentions is set to True in the config but passed as False to forward, we should not return the attention list.

Or should it be an OR of all three, i.e., setting it to True anywhere returns the attention list?
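If the priority route is chosen, one illustrative way to resolve the three sources (a sketch of the forward > model > config chain, not part of this PR; resolve_flag is a hypothetical helper):

from typing import Optional

def resolve_flag(
    forward_value: Optional[bool],
    instance_value: Optional[bool],
    config_value: bool,
) -> bool:
    # An explicit forward() argument wins, then the instantiation-time
    # setting, then the config default.
    if forward_value is not None:
        return forward_value
    if instance_value is not None:
        return instance_value
    return config_value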
