Suppresstransformers
"Models won't be available" message, update Umshini environments
#87
Add this suggestion to a batch that can be applied as a single commit.
This suggestion is invalid because no changes were made to the code.
Suggestions cannot be applied while the pull request is closed.
Suggestions cannot be applied while viewing a subset of changes.
Only one suggestion per line can be applied in a batch.
Add this suggestion to a batch that can be applied as a single commit.
Applying suggestions on deleted lines is not supported.
You must change the existing code in this line in order to create a valid suggestion.
Outdated suggestions cannot be applied.
This suggestion has been applied or marked resolved.
Suggestions cannot be applied from pending reviews.
Suggestions cannot be applied on multi-line comments.
Suggestions cannot be applied while the pull request is queued to merge.
Suggestion cannot be applied right now. Please check back later.
This message previously printed every single time you imported chatarena, because the init.py imported the hf backend, which in turn tried to import transformers. I have now suppressed that error message, as people who are using that backend will be able to tell that they do not have pytorch or tensorflow when they try to use a given model.
For the umshini environments I have updated all of the prompts for the moderator judging, added info about max characters per response in the opening message (4000 char by default), and added print statements
VIOLATION: True.
so it is more transparent how these environments are judged. I tested many different prompts for the debate environment judge, and have it in a place where it does successfully penalize very short responses, responses where the agent simply agrees with the opponent, or cases where it tries to talk to the moderator directly.