This repository has been archived by the owner on Jul 7, 2023. It is now read-only.
I'm trying to use the checkpoint for the translate_ende_wmt32k problem and then train it on my own data.

Link to the Google Colab page:
https://colab.research.google.com/github/tensorflow/tensor2tensor/blob/master/tensor2tensor/notebooks/hello_t2t.ipynb

I have two questions:

1. It seems the checkpoint is problematic. In the example in the Colab above, the translation output is:

   INFO:tensorflow:Greedy Decoding
   Inputs: The animal didn't cross the street because it was too tired
   Outputs: Das Tier überquerte die Straße nicht, weil es zu müde war, weil es zu müde war.

   For some reason it repeats the second part of the sentence twice. I also tried other sentences on my machine, and it happened again. I understand the checkpoint isn't supposed to give perfect translations, but is it supposed to behave like this? Should I train it further on the WMT data?

2. I saw that other issues (for example, "Unable to reproduce WMT En2De results" #317) recommend using the checkpoints gs://tensor2tensor-checkpoints/transformer_ende_test/model.ckpt-3817425 or gs://tensor2tensor-checkpoints/transformer_ende_test/averaged.ckpt-0, but they don't seem to be there anymore. Can they be found elsewhere, or were they deleted for some reason?

Thanks a lot!
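As a quick sanity check, independent of tensor2tensor, the repeated clause in greedy-decoded outputs like the one above can be flagged with a few lines of Python. This is just an illustrative sketch (the function name and the `min_words` threshold are my own choices, not part of the library):

```python
import re

def repeated_tail_len(sentence, min_words=3):
    """Length (in words) of the longest word sequence that ends the
    sentence twice in a row, or 0 if there is no such repetition."""
    # Tokenize into bare words (Unicode-aware), dropping punctuation,
    # so "war," and "war." compare equal.
    words = re.findall(r"\w+", sentence)
    # Try the longest possible repeated tail first, down to min_words.
    for n in range(len(words) // 2, min_words - 1, -1):
        if words[-n:] == words[-2 * n:-n]:
            return n
    return 0

output = ("Das Tier überquerte die Straße nicht, "
          "weil es zu müde war, weil es zu müde war.")
print(repeated_tail_len(output))  # -> 5: "weil es zu müde war" occurs twice
```

Running this over a batch of decoded sentences gives a rough count of how often the checkpoint produces this particular failure mode, which may help decide whether further training on the WMT data is worthwhile.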