Repository for managing the work on object retrieval with the R1 robot.

Dependencies:
- yarp
- icub-main
- icub-contrib
- cer
- yarp-devices-llm
- tour-guide-robot
- MDETR requirements
- YOLOv8 requirements
Build your Docker container using the provided Dockerfiles in the docker folder, or just pull the images:

```shell
docker pull colombraf/r1images:r1ObjectRetrieval     # to work with the actual R1 robot
docker pull colombraf/r1images:r1ObjectRetrievalSim  # to work in a simulated environment
```
- On Linux:

```shell
git clone https://github.com/hsp-iit/r1-object-retrieval
cd r1-object-retrieval && mkdir build && cd build && cmake .. && make -j11
export PYTHONPATH=$PYTHONPATH:${R1_OBR_SOURCE_DIR}
export YARP_DATA_DIRS=$YARP_DATA_DIRS:${R1_OBR_BUILD_DIR}/share/R1_OBJECT_RETRIEVAL
export PATH=$PATH:${R1_OBR_BUILD_DIR}/bin
```
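The export lines above assume that `R1_OBR_SOURCE_DIR` and `R1_OBR_BUILD_DIR` are already defined. A minimal sketch of setting them up, assuming the repository was cloned into your home directory (adjust the paths to your actual setup):

```shell
# Assumed clone location: change this if you cloned the repo elsewhere
export R1_OBR_SOURCE_DIR=$HOME/r1-object-retrieval
# Build directory created by the cmake step above
export R1_OBR_BUILD_DIR=$R1_OBR_SOURCE_DIR/build

# Same exports as in the build instructions, now with the variables defined
export PYTHONPATH=$PYTHONPATH:${R1_OBR_SOURCE_DIR}
export YARP_DATA_DIRS=$YARP_DATA_DIRS:${R1_OBR_BUILD_DIR}/share/R1_OBJECT_RETRIEVAL
export PATH=$PATH:${R1_OBR_BUILD_DIR}/bin
```

To make the setup persistent across shells, append these lines to your `~/.bashrc`.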
```shell
cd docker/docker_r1ObjectRetrievalSim
./start_docker_obj_retr_sim.sh
```

- Inside the docker container:

```shell
./start_sim.sh
```

This will launch a yarpserver instance, the Gazebo simulation, a yarprun instance (/console), yarplogger and yarpmanager.
```shell
cd docker/docker_r1ObjectRetrieval
./start_docker_obj_retr.sh
```

- Inside the docker container:

```shell
yarp run --server /console --log
yarplogger --start
yarpmanager
```
Below is a list of the main modules in this repo, with a brief description of their purpose:
Module | Description |
---|---|
nextLocPlanner | Planner that defines the next location the robot should inspect |
lookForObject | Module that manages the motions the robot should perform once it arrives in a location to search for the requested object (tilt head and turn) |
goAndFindIt | FSM that invokes the two previous modules to navigate the robot in the map and search for objects |
approachObject | Plug-in module for a positive search outcome: the robot navigates closer to the found object |
disappointedPose | Plug-in module for a negative search outcome: the robot assumes a predefined pose |
yarpMdetr / yarpYolo | Object Detection modules: detect objects in an input image |
r1Obr-orchestrator | Orchestrator (FSM) of the whole object retrieval application: manages the search, the speech interaction, the integration of the sensor network, and other possible actions that the robot could perform |
sensorNetworkReceiver | Interaction module between the r1Obr-orchestrator and the Sensor Network |
look_and_point | A simple module that takes the positive output of the search and feeds it as input to the handPointing module (cer repo) |
micActivation | Module that starts/stops the audioRecorder device when a joystick button is pressed/released |
The interfaces folder contains the Thrift files used to generate the following classes:
- r1OrchestratorRPC: implementation of the RPC calls of the r1Obr-orchestrator module
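As a purely illustrative sketch, a Thrift service definition for such a class might look like the fragment below; the method names here are hypothetical, and the actual RPC calls are defined in the Thrift files in the interfaces folder:

```thrift
// Hypothetical sketch: the real service definition lives in interfaces/
service r1OrchestratorRPC
{
    // start searching for the named object
    bool searchObject(1: string objectName);
    // abort the current search
    bool stopSearch();
}
```

YARP's Thrift tooling generates a C++ class from such a file, which the module implements to serve its RPC port.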
In the app folder you can find, divided by context:
- a scripts folder containing the definition of the YARP applications, and possibly other useful scripts, for the corresponding module
- a conf folder containing the configuration files for the corresponding module