
Commit

Merge pull request #1 from abhinav-bohra/1036-fixes-visualbert-attention
[fix] VisualBERT returns attention tuple facebookresearch#1036
abhinav-bohra authored Aug 22, 2021
2 parents 92c167d + f68fc15 commit 9aa9e22
Showing 1 changed file with 8 additions and 2 deletions.
mmf/modules/hf_layers.py:
```diff
@@ -280,11 +280,17 @@ def forward(
     attention_mask: Optional[Tensor],
     encoder_hidden_states: Optional[Tensor] = None,
     encoder_attention_mask: Optional[Tensor] = None,
-    output_attentions: bool = False,
-    output_hidden_states: bool = False,
+    output_attentions: bool = None,
+    output_hidden_states: bool = None,
     return_dict: bool = False,
     head_mask: Optional[Tensor] = None,
 ) -> Tuple[Tensor]:
+
+    if output_attentions is None:
+        output_attentions = self.output_attentions
+    if output_hidden_states is None:
+        output_hidden_states = self.output_hidden_states
+
     all_hidden_states = ()
     all_attentions = ()
     for i, layer_module in enumerate(self.layer):
```
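The patch follows a common pattern: instead of hard-coding `False` as the default, the signature uses `None` to mean "the caller expressed no preference", and the method falls back to the instance-level setting (typically populated from the model config). A minimal self-contained sketch of that pattern, with a hypothetical `Encoder` class standing in for the actual MMF layer:

```python
from typing import Optional, Tuple


class Encoder:
    """Hypothetical stand-in for the patched encoder, illustrating the
    None-means-use-config pattern from the diff above."""

    def __init__(
        self,
        output_attentions: bool = False,
        output_hidden_states: bool = False,
    ) -> None:
        # Instance-level defaults, e.g. taken from a model config.
        self.output_attentions = output_attentions
        self.output_hidden_states = output_hidden_states

    def forward(
        self,
        output_attentions: Optional[bool] = None,
        output_hidden_states: Optional[bool] = None,
    ) -> Tuple[bool, bool]:
        # None means the caller passed nothing: fall back to the config.
        if output_attentions is None:
            output_attentions = self.output_attentions
        if output_hidden_states is None:
            output_hidden_states = self.output_hidden_states
        return output_attentions, output_hidden_states


enc = Encoder(output_attentions=True)
print(enc.forward())                          # falls back to config
print(enc.forward(output_attentions=False))   # explicit caller override wins
```

With the old `bool = False` defaults, a model configured to return attentions would silently get them suppressed whenever the caller omitted the argument; the `None` sentinel distinguishes "omitted" from "explicitly disabled".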
