MultiAgentObjectCollectorEnv

Installation
First, install CleanRL with Poetry (run from the repository root):

    cd cleanrl
    poetry install

Running the PPO trainer
From the repository root, change into the cleanrl folder and launch the trainer:

    cd cleanrl
    python ..\ppotrainer.py --env-id oc-v1