# MultiAgentObjectCollectorEnv

First, install CleanRL: `cd cleanrl` and then `poetry install`.

To run the PPO trainer, from the root folder run:

`cd cleanrl`

and then `python ..\ppotrainer.py --env-id oc-v1`
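For context, CleanRL-style trainer scripts typically select the environment through a command-line `--env-id` flag like the one above. A minimal sketch of that pattern, assuming standard `argparse` handling (this is illustrative only, not the actual contents of `ppotrainer.py`):

```python
import argparse

def parse_args(argv=None):
    """Hypothetical sketch of a CleanRL-style CLI: the trainer reads the
    environment id from a --env-id flag and trains on that environment."""
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--env-id",
        type=str,
        default="oc-v1",  # the MultiAgentObjectCollectorEnv id used above
        help="id of the registered environment to train on",
    )
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args(["--env-id", "oc-v1"])
    print(args.env_id)  # prints: oc-v1
```

Passing a different registered environment id to `--env-id` would train the same PPO loop on that environment instead.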