This assignment is a ROS exercise about a finite state machine that represents the behavior of a pet-like robot. The robot moves in a 3D simulation in Gazebo and is equipped with an RGB camera; using that camera, it is able to recognize a ball that moves around the environment following the user's commands.
Map of the environment:
This is a graph of the system architecture (obtained via `rqt_graph`):
- The node `state_miro` contains the state machine described below.
- The node `cmd_generator` sends commands to the ball's action server (`reaching_goal`) and moves the ball in space or makes it disappear; the commands can be given by the user or generated randomly.
- The node `image_feature` receives the image from the camera and, if it detects a ball, sends a message to the `state_miro` node; when the state machine changes state, this node also sends `cmd_vel` messages to the robot, depending on the distance from the ball.
- The node `reaching_goal2` is the robot's action server.
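As a concrete picture of this wiring, here is a minimal sketch of the publisher/subscriber link between `image_feature` and `state_miro`; the topic name `ball_state` and the `Bool` message type are assumptions for illustration, since the actual message is defined in the package sources:

```python
#!/usr/bin/env python
# Hedged sketch of the image_feature -> state_miro link. The topic name
# 'ball_state' and the Bool message type are assumptions for illustration.
import rospy
from std_msgs.msg import Bool

rospy.init_node('wiring_sketch')

# image_feature side: publish whether the ball is visible in the current frame.
ball_pub = rospy.Publisher('ball_state', Bool, queue_size=1)

# state_miro side: subscribe and react to the detection messages.
def on_ball(msg):
    if msg.data:
        rospy.loginfo('ball detected: the NORMAL -> PLAY transition can fire')

rospy.Subscriber('ball_state', Bool, on_ball)
ball_pub.publish(Bool(data=True))
rospy.spin()
```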
This is a graph of the possible states (obtained via `smach_viewer`):
The possible states are:
- `SLEEP`: the robot is sleeping, so it won't respond to any command. After spending a while in any other state, the robot reaches location [-6, -6] even if not commanded to do so; after some time, it goes back to the `NORMAL` state.
- `NORMAL`: the default state; the robot moves randomly around the map until it detects the ball in its line of sight, at which point it switches to the `PLAY` state.
- `PLAY`: the robot enters this state from `NORMAL` after seeing the ball. It chases the ball until it stops moving, then turns its head 45° to the left and to the right, and keeps staring at the ball until it moves again. If the ball disappears, the robot goes back to the `NORMAL` state.
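For reference, this is a minimal sketch of how such a three-state machine looks in Smach; the outcome names and the placeholder bodies are illustrative only, the real logic lives in `scripts/state_machine_miro_ext.py`:

```python
#!/usr/bin/env python
# Minimal Smach sketch of the SLEEP/NORMAL/PLAY machine. Outcome names and
# the placeholder bodies are assumptions; see state_machine_miro_ext.py.
import rospy
import smach

class Sleep(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['wake_up'])
    def execute(self, userdata):
        rospy.sleep(5.0)          # placeholder: go to [-6, -6] and rest
        return 'wake_up'

class Normal(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['ball_seen', 'tired'])
    def execute(self, userdata):
        rospy.sleep(2.0)          # placeholder: wander randomly
        return 'tired'            # or 'ball_seen' when the ball is detected

class Play(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['ball_lost'])
    def execute(self, userdata):
        rospy.sleep(2.0)          # placeholder: chase the ball, move the head
        return 'ball_lost'

if __name__ == '__main__':
    rospy.init_node('state_miro')
    sm = smach.StateMachine(outcomes=['done'])
    with sm:
        smach.StateMachine.add('SLEEP', Sleep(),
                               transitions={'wake_up': 'NORMAL'})
        smach.StateMachine.add('NORMAL', Normal(),
                               transitions={'ball_seen': 'PLAY',
                                            'tired': 'SLEEP'})
        smach.StateMachine.add('PLAY', Play(),
                               transitions={'ball_lost': 'NORMAL'})
    sm.execute()
```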
The only package present is `exp_assignment2`, which contains all the executable files.
In particular, we have:
- `src` folder:
  - `moveclient.cpp`: contains the code to rotate the robot's neck by a certain angle.
- `scripts` folder:
  - `state_machine_miro_ext.py`: the state machine of the robot.
  - `ball_c.py` and `ball_c_random.py`: responsible for generating the commands for the ball; the first receives input from the user, the second generates them randomly. The node is called `cmd_generator`.
  - `go_to_point_action.py`: contains the code for the action server of the robot; it reads the goal from the `reaching_goal2/goal` topic, where the `state_miro` node writes a coordinate. The node is called `reaching_goal2`.
  - `go_to_point_ball.py`: contains the code for the action server of the ball; it reads the goal from the `reaching_goal/goal` topic, where the `cmd_generator` node writes a coordinate. The node is called `reaching_goal`.
  - `robot_following3.py`: contains the code of the `image_feature` node, which performs blob detection on the camera image (see the sketch after this list) and, if the robot is in the `PLAY` state, also sends `cmd_vel` commands to the robot.
- `launch` folder:
  - `display.launch`: launches RViz to check the model of the robot.
  - `gazebo_world2.launch`: launches the simulation where commands are received from the user.
  - `gazebo_world2_random.launch`: launches the simulation where commands are generated randomly.
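As mentioned above, `robot_following3.py` detects the ball as a color blob. A minimal sketch of that kind of detection is shown here; the camera topic name and the green HSV thresholds are assumptions to be tuned against the actual simulation:

```python
#!/usr/bin/env python
# Hedged sketch of colour-blob detection as done in robot_following3.py.
# The camera topic and the green HSV range are assumptions; tune as needed.
import rospy
import cv2
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([50, 100, 100]), np.array([70, 255, 255]))
    # [-2] keeps this working on both OpenCV 3 (3 return values) and 4 (2).
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        c = max(contours, key=cv2.contourArea)
        (x, _), radius = cv2.minEnclosingCircle(c)
        if radius > 10:   # ignore noise blobs
            rospy.loginfo('ball seen at x=%.0f px, radius=%.0f px', x, radius)

rospy.init_node('image_feature')
rospy.Subscriber('camera1/image_raw', Image, on_image)
rospy.spin()
```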
This is a ROS package, so you need to clone this repository into the `src` folder of a ROS workspace (here assumed to be named `my_ros`) and build it from the workspace root:

```
cd ~/my_ros/src
git clone
cd ~/my_ros
catkin_make --pkg exp_assignment2
```
Some packages are needed:

- `smach_viewer`
- `cv_bridge`
- `actionlib`
- `actionlib_msgs`
- `image_transport`
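On a standard ROS installation these can usually be obtained through apt; the distro name below (`melodic`) is an assumption, adjust it to your installation:

```
sudo apt-get install ros-melodic-smach-viewer ros-melodic-cv-bridge \
    ros-melodic-actionlib ros-melodic-actionlib-msgs ros-melodic-image-transport
```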
To easily run the simulation I created a launch file, which can be used this way:

```
source ~/my_ros/devel/setup.bash
roslaunch exp_assignment2 gazebo_world2_random.launch
```
This runs the random simulation; the state of the robot is printed on the shell, together with information about the commands received and any change of state.
If you want to interact with the simulation by giving direct commands, run:

```
source ~/my_ros/devel/setup.bash
roslaunch exp_assignment2 gazebo_world2.launch
```

and then open a new shell and run:

```
source ~/my_ros/devel/setup.bash
rosrun exp_assignment2 ball_c.py
```
A command-line interface will let you send commands to move the ball or make it disappear.
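Under the hood, `ball_c.py` turns each user command into a goal for the `reaching_goal` action server. A hedged sketch of such a client follows; the `Planning` action type, its `target_pose` field, and the use of a negative z to hide the ball are assumptions about the package's action definition:

```python
#!/usr/bin/env python
# Hedged sketch of a client for the ball's action server. The Planning action
# and its target_pose field are assumptions about exp_assignment2's .action file.
import rospy
import actionlib
from exp_assignment2.msg import PlanningAction, PlanningGoal

rospy.init_node('cmd_generator')
client = actionlib.SimpleActionClient('reaching_goal', PlanningAction)
client.wait_for_server()

goal = PlanningGoal()
goal.target_pose.pose.position.x = 2.0
goal.target_pose.pose.position.y = 3.0
goal.target_pose.pose.position.z = 0.5   # a negative z could hide the ball (assumption)
client.send_goal(goal)
client.wait_for_result()
```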
In both cases, an additional window will open, showing the output image from the robot's camera:
The system features a finite state machine built with the Smach package and a simulation done in Gazebo. The robot uses a "navigation by following" approach, based on color blob recognition, to detect a green ball in the image provided by the camera. The robot itself features the camera, two actuated joints (the wheels) and an additional one, the neck, which can be rotated around the roll axis.
The system can either be controlled by the user or behave fully randomly.
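The "navigation by following" part can be pictured as a simple proportional controller on the blob position and size; the sketch below is illustrative only, with assumed gains, image width and desired radius:

```python
#!/usr/bin/env python
# Illustrative proportional "follow the ball" law: rotate to centre the blob,
# drive forward until its apparent radius matches a target. Gains, the image
# width and the desired radius are assumptions, not the package's values.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('follow_sketch')
vel_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

def follow_ball(blob_x, blob_radius, image_width=640, desired_radius=100):
    twist = Twist()
    twist.angular.z = -0.002 * (blob_x - image_width / 2)      # centre the blob
    twist.linear.x = 0.005 * (desired_radius - blob_radius)    # close the distance
    vel_pub.publish(twist)
```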
This is the robot:
Sometimes the ball's movement makes the robot roll over and end up in a position where it cannot move anymore and has to be flipped back manually by the user. Other times, during the generation of random movements, several action requests overlap and the action server does not behave as planned; adding time delays did not solve the problem. This is not a big issue in practice, since the movement is supposed to be random, but it is still undesired behavior.
A future improvement could be to add a velocity control for when the ball moves very close to the robot, and to find a way to avoid the overlap of action requests. Regarding the velocity control, the robot itself is a bit slow; the control could be improved by accelerating slowly at the beginning and increasing the speed once the robot is moving, to avoid a large initial acceleration that flips the robot over. I added a tail that could prevent the flipping and improve the robot's stability while accelerating, but I haven't done much testing to see whether it is effective.
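A simple way to implement the proposed gentle acceleration is to rate-limit the commanded linear speed; a minimal sketch, where the step size is an illustrative value:

```python
# Minimal sketch of a rate-limited (ramped) velocity command: the speed may
# change by at most max_step per control cycle, so the robot accelerates
# gradually instead of pitching over. max_step is an illustrative value.
def ramp(current, target, max_step=0.05):
    if target > current + max_step:
        return current + max_step
    if target < current - max_step:
        return current - max_step
    return target
```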
The documentation is accessible at `./doc/html/index.html`.
Riccardo Lastrico - 4070551
Email: [email protected]