Baxter placement in Unity scene #2
Hi,
We are trying to control our Baxter robot with an Oculus HMD and Oculus Touch controllers.
We successfully installed ros_reality_bridge (no_ein branch) on the Baxter workstation, and our Kinect is correctly calibrated, as you can see in this screenshot.
But on the Windows computer, when we launch a scene, the point cloud of Baxter appears "emerging" from the ground, so we cannot position ourselves as if we were Baxter.
We can open and close the grippers by pressing the triggers, but we cannot control the arms, perhaps because we cannot position ourselves as Baxter.
Thanks in advance for your help.
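As a quick ROS-side sanity check (a sketch, not part of the original report; it assumes the standard baxter_interface Python SDK is installed on the workstation), the arms and grippers can be commanded directly, which separates a robot/SDK problem from the Unity placement problem described above:

```python
#!/usr/bin/env python
# Hypothetical sanity check: confirm the Baxter arms and grippers respond to
# direct ROS commands, independently of Unity / ros_reality_bridge.
# If this works, the arm-control problem is likely the Unity-side placement/TF,
# not the robot connection. Assumes the standard baxter_interface SDK.
import rospy
import baxter_interface

rospy.init_node('baxter_direct_check')

# Enable the robot (required before any motion command).
baxter_interface.RobotEnable().enable()

# Move the left arm to its neutral pose -- if this succeeds, joint control
# over ROS is working and the failure is upstream in the teleop pipeline.
left_arm = baxter_interface.Limb('left')
left_arm.move_to_neutral()

# Exercise the left gripper, mirroring what the Touch triggers already do.
left_gripper = baxter_interface.Gripper('left')
left_gripper.calibrate()
left_gripper.close()
rospy.sleep(1.0)
left_gripper.open()
```

If the arm moves here, joint control over ROS is fine and the issue is most likely the Kinect/Baxter transform not being applied in the Unity scene.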
Comments
At @stevensu1838, you can definitely do this with a simulated Baxter. At @agutif, it looks like the Kinect transformation is not being read appropriately in Unity. Does the robot itself appear?
@dwhit, no. In the Unity scene I can only see the point cloud generated by the Kinect and the cameras of the hands.
@agutif Were you getting any errors in the Unity scene when running this? Can you also confirm, on the ROS side, that your /ros_unity topic is streaming the TF data correctly?
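A minimal sketch for the two checks asked about above: whether anything is arriving on /ros_unity, and whether the Kinect-to-base transform is resolvable on the ROS side. The frame name 'camera_link' is an assumption (it depends on the Kinect driver and calibration); substitute whatever frame your setup actually publishes. rospy.AnyMsg is used so the topic can be counted without knowing its message type.

```python
#!/usr/bin/env python
# Sketch: 1) count messages arriving on /ros_unity, 2) look up the
# Kinect -> base transform via TF. 'camera_link' is an assumed frame name.
import rospy
import tf

def on_ros_unity(_msg):
    on_ros_unity.count += 1
on_ros_unity.count = 0

rospy.init_node('ros_unity_tf_check')

# Subscribe without assuming the message type, just to count incoming messages.
rospy.Subscriber('/ros_unity', rospy.AnyMsg, on_ros_unity)

listener = tf.TransformListener()
rospy.sleep(3.0)  # give the subscriber and the TF buffer time to fill

print('/ros_unity messages received in ~3 s: %d' % on_ros_unity.count)

try:
    listener.waitForTransform('base', 'camera_link', rospy.Time(0), rospy.Duration(2.0))
    trans, rot = listener.lookupTransform('base', 'camera_link', rospy.Time(0))
    print('base -> camera_link translation: %s rotation: %s' % (trans, rot))
except tf.Exception as e:
    print('Kinect transform not available: %s' % e)
```

If the message count is zero or the transform lookup fails, the placement problem is on the ROS/calibration side; if both succeed, the transform is probably not being applied to the point cloud in the Unity scene.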