Suppress transformers "Models won't be available" message, update Umshini environments #87

Conversation

elliottower
Contributor

This message previously printed every single time you imported chatarena, because `__init__.py` imported the HF backend, which in turn tried to import transformers. I have now suppressed that message, since people who actually use that backend will still be able to tell that they do not have PyTorch or TensorFlow when they try to use a given model.
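A minimal sketch of one way the advisory could be silenced (this is an illustration, not the exact diff in this PR; it assumes transformers emits the message through its standard `transformers` logger):

```python
import logging

# Raise the transformers logger threshold before importing it, so the
# import-time "Models won't be available" advisory is not printed.
logging.getLogger("transformers").setLevel(logging.ERROR)

try:
    import transformers  # noqa: F401
    HAS_TRANSFORMERS = True
except ImportError:
    # transformers is an optional dependency; the HF backend will
    # surface a clear error only when someone actually tries to use it.
    HAS_TRANSFORMERS = False
```

Users who do want the HF backend still get a hard failure (with a clear missing-framework message) the moment they try to load a model.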

For the Umshini environments I have updated all of the prompts for moderator judging, added info about the maximum characters per response to the opening message (4000 characters by default), and added print statements such as `VIOLATION: True.` so it is more transparent how these environments are judged. I tested many different prompts for the debate environment judge, and it now successfully penalizes very short responses, responses where the agent simply agrees with its opponent, and cases where an agent tries to talk to the moderator directly.
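A hypothetical sketch of how the printed `VIOLATION: True.` marker could be consumed downstream (the helper name and the exact response text are assumptions, not part of this PR):

```python
def check_violation(judge_response: str) -> bool:
    """Return True if the moderator's judgment contains the
    'VIOLATION: True' marker (case-insensitive, hypothetical parser)."""
    return "violation: true" in judge_response.lower()

# Example judge outputs (illustrative text, not actual model responses):
check_violation("The response simply agrees with the opponent. VIOLATION: True.")   # -> True
check_violation("The response engages with the topic at length. VIOLATION: False.")  # -> False
```

Keeping the marker on a fixed, machine-checkable format is what makes the judging transparent: anyone reading the environment logs can see exactly why a turn was penalized.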

@elliottower elliottower merged commit c63d208 into Farama-Foundation:main Nov 17, 2023
13 checks passed