The following error occurs when training with machine_translation.py:
Traceback (most recent call last):
  File "machine_translation.py", line 336, in <module>
    train()
  File "machine_translation.py", line 251, in train
    max_length=args.max_length)
  File "machine_translation.py", line 208, in seq_to_seq_net
    decoder_size)
  File "machine_translation.py", line 184, in lstm_decoder_with_attention
    context = simple_attention(encoder_vec, encoder_proj, hidden_mem)
  File "machine_translation.py", line 161, in simple_attention
    x=attention_weights)
TypeError: sequence_softmax() got an unexpected keyword argument 'x'
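This looks like a version mismatch between the script and the installed PaddlePaddle: the script passes the attention weights as `x=`, while the installed release's `fluid.layers.sequence_softmax` apparently names its first parameter differently (some releases use `input=` instead of `x=`). A minimal workaround sketch, assuming only the keyword name differs, is to pass the tensor positionally in `simple_attention` so the call is valid under either name (the helper name below is hypothetical, just for illustration):

import paddle.fluid as fluid  # assumed; machine_translation.py already imports fluid

def softmax_over_sequence(attention_weights):
    # sequence_softmax's first parameter is named `x` in some PaddlePaddle
    # releases and `input` in others; a positional call works with both.
    return fluid.layers.sequence_softmax(attention_weights)

Alternatively, installing the PaddlePaddle version that the book release of machine_translation.py targets should avoid the mismatch entirely.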