
MultiAgentObjectCollectorEnv

First, install CleanRL: from the repository root, run

cd cleanrl
poetry install

To run the PPO trainer (again starting from the repository root):

cd cleanrl
python ..\ppotrainer.py --env-id oc-v1
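The steps above can be collected into one copy-pasteable sequence. This is a sketch assuming Poetry is already installed and you start from a fresh clone; the backslash path separator follows the original, Windows-style command, and `poetry run` is an assumption added here so the script executes inside the Poetry-managed environment (the original just invokes `python`):

```shell
# From the repository root of MultiAgentObjectCollectorEnv
cd cleanrl

# Install CleanRL's dependencies into a Poetry-managed virtualenv
# (assumes Poetry itself is already installed)
poetry install

# Launch the PPO trainer, which lives one level up at the repo root,
# on the object-collector environment registered as "oc-v1".
# "poetry run" is an assumption; plain "python" works if the
# environment is already activated (e.g. via "poetry shell").
poetry run python ..\ppotrainer.py --env-id oc-v1
```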
