
Generate models with hands #29

Open
traversaro opened this issue Aug 20, 2016 · 23 comments
Comments

@traversaro
Member

No description provided.

@traversaro
Member Author

@Yeshasvitvs check the README in this repo and the discussion in robotology/community#137.

@traversaro
Member Author

I want to separate the problem of adding hands to the model (for which we can keep this issue) from the problem of adding eyes to the model, which is tracked in the new issue #37.

traversaro changed the title from "Generate models with hand and eyes" to "Generate models with hand" on Dec 23, 2016
traversaro changed the title from "Generate models with hand" to "Generate models with hands" on Dec 23, 2016
@traversaro
Member Author

Brief recap: the iCub URDF models are currently generated using two possible workflows, as described in the README of this repository (https://github.com/robotology-playground/icub-model-generator#icub-model-generator). Neither workflow supports exporting a model of the hands, for the following reasons:

  • The dh workflow extracts the parameters from the iDyn model, which does not include models for the fingers.
  • The simplified CAD model prepared by the mechanical team and used in the simmechanics workflow includes the hand and all the fingers as a single fixed body.

However, some information about the hands is currently available.

@claudiofantacci is also working on getting a reliable model of the hands.

@yeshasvitirupachuri
Member

yeshasvitirupachuri commented Dec 23, 2016

@traversaro Thank you!

@traversaro
Member Author

My two cents: for the time being I think the most reasonable solution is to isolate the hands from the VisLab model [2]; the resulting hand models can then easily be added to all the generated models, even automatically (see the sketch after the references below).
The tricky part is to identify the l_hand_dh_frame and r_hand_dh_frame frames, as defined in [1], in the VisLab model [2], to make sure that the transformations of the fingers with respect to the hand and the rest of the arm are consistent.

[1] : http://wiki.icub.org/wiki/ICub_Model_naming_conventions
[2] : https://github.com/vislab-tecnico-lisboa/icub-moveit/tree/master/icub_description
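
Not an existing tool, just to make the idea concrete: a minimal Python sketch of the attachment step, assuming the hand has already been isolated into its own URDF. The file names, the hand root link name (l_hand_base), and the placeholder origin are hypothetical.

```python
# Minimal sketch (not actual generator code): attach a hand-only URDF to a
# generated iCub URDF with a fixed joint at l_hand_dh_frame. File names and
# the hand root link name ("l_hand_base") are hypothetical placeholders.
import xml.etree.ElementTree as ET

full_model = ET.parse("iCub.urdf")            # model produced by the generator
hand_model = ET.parse("l_hand_vislab.urdf")   # hand isolated from the VisLab model
robot = full_model.getroot()

# Copy every link and joint of the hand subtree into the full model.
for element in hand_model.getroot():
    if element.tag in ("link", "joint"):
        robot.append(element)

# Rigidly attach the hand root link to the l_hand_dh_frame defined in [1].
# The origin must express the pose of the hand root in l_hand_dh_frame:
# this is exactly the transformation that has to be identified.
joint = ET.SubElement(robot, "joint", name="l_hand_dh_frame_to_hand", type="fixed")
ET.SubElement(joint, "parent", link="l_hand_dh_frame")
ET.SubElement(joint, "child", link="l_hand_base")
ET.SubElement(joint, "origin", xyz="0 0 0", rpy="0 0 0")  # placeholder pose

full_model.write("iCub_with_hands.urdf")
```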

@traversaro
Member Author

@claudiofantacci
Collaborator

The differences between the two workflows and the related tooling are not 100% clear to me.
First things first: do we want to use/support URDF, SDF or both?
As of now, my understanding is to use SDF.

Here is what I know about the hand.
The models we have are simplified CAD models. Simplified in the sense that they do not correspond 1:1 to the Creo CAD: their meshes have fewer vertices, and they look roughly like cylinders for the phalanges and spheres for the fingertips. Even though these simplifications are applied, they have the correct frame poses coming from the DH parameters.

As of now, I'm quite sure that the DH parameters are:

  • wrong/very imprecise for the whole thumb (maybe excluding just the position of the very first frame)
  • imprecise for the index
  • missing/unimplemented for the ring and little fingers

In the superimpose-hand repo I use the simplified CAD versions, and they work reasonably well. Having said that, we are in the process of re-deriving the DH parameters, and possibly of having the CAD files updated.
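
As a reference when re-deriving and cross-checking those parameters, the standard DH convention maps each (a, d, alpha, theta) tuple to a homogeneous transform. A minimal sketch follows; the sample values are purely illustrative and are not actual iCub finger parameters.

```python
# Minimal sketch: homogeneous transform for one standard DH link (a, d, alpha, theta).
# Useful to cross-check finger frame poses against the CAD; the sample values
# below are dummy values, not actual iCub finger parameters.
import numpy as np

def dh_transform(a, d, alpha, theta):
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ ct, -st * ca,  st * sa, a * ct],
        [ st,  ct * ca, -ct * sa, a * st],
        [0.0,       sa,       ca,      d],
        [0.0,      0.0,      0.0,    1.0],
    ])

# Pose of the last frame with respect to the first: chain the per-link transforms.
links = [(0.01, 0.0, np.pi / 2, 0.0), (0.02, 0.0, 0.0, 0.1)]  # dummy values
pose = np.eye(4)
for a, d, alpha, theta in links:
    pose = pose @ dh_transform(a, d, alpha, theta)
print(pose)
```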

@traversaro
Member Author

traversaro commented Dec 23, 2016

First things first: do we want to use/support URDF, SDF or both?
As of now, my understanding is to use SDF.

Some of our software, including this generator, and most ROS-based software only support URDF, so for now we need to support both.
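
One possible way to support both formats is to keep URDF as the source and derive the SDF from it. Below is a minimal sketch assuming Gazebo classic's gz CLI is available (gz sdf -p prints a URDF converted to SDF); the file names are placeholders.

```python
# Minimal sketch: keep URDF as the source format and derive the SDF from it,
# assuming Gazebo classic's `gz` CLI is installed (`gz sdf -p` prints the
# URDF converted to SDF). File names are placeholders.
import subprocess

def urdf_to_sdf(urdf_path, sdf_path):
    sdf = subprocess.run(
        ["gz", "sdf", "-p", urdf_path],
        check=True, capture_output=True, text=True,
    ).stdout
    with open(sdf_path, "w") as f:
        f.write(sdf)

urdf_to_sdf("iCub.urdf", "iCub.sdf")
```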

Having said that, we are in the process of re-deriving the DH parameters, and possibly of having the CAD files updated.

Great! By "CAD files updated" do you mean having a shrinkwrap for each link of the hand? Is there any discussion about this on iCub Facility's internal Redmine?

@traversaro
Member Author

Sorry @claudiofantacci, I accidentally deleted your message. :(
However, we can talk about this when we are back from vacation!

@claudiofantacci
Collaborator

🙈
😂
No worries! We will discuss it when we are back from vacation!

@vicentepedro

Hi @traversaro and @claudiofantacci

Any news about this issue?

Thanks in advance,
Pedro Vicente

@claudiofantacci
Collaborator

Hi @vicentepedro, not at this very moment, but we have a student who will start working on this in the coming months.
Keep in touch!

@vicentepedro

Hi @traversaro

Any news on generating the iCub eyes and hands automatically from the CAD?

We are interested in using these models in PyBullet, but we didn't find the cameras' reference frames in the model to create the virtual cameras in the simulator.

We are testing the following repos:
https://github.com/diegoferigo/icub-model-pybullet from @diegoferigo
and
https://github.com/robotology-playground/pybullet-robot-envs

If this is not the right place to put the question, feel free to move it ;)
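
For reference, here is a minimal sketch of how a virtual camera can be placed at a named link/frame of a URDF loaded in PyBullet. The model path, the frame name (l_eye), and the choice of camera axes are assumptions that depend on the specific model being used.

```python
# Minimal sketch: look up a named link of a URDF loaded in PyBullet and render
# a virtual camera from its pose. "model.urdf" and the frame name "l_eye" are
# placeholders; the actual names depend on the model being used.
import pybullet as p

client = p.connect(p.DIRECT)
robot = p.loadURDF("model.urdf", useFixedBase=True)

# Build a map from link name to link index (PyBullet indexes links by joint).
link_index = {}
for i in range(p.getNumJoints(robot)):
    link_index[p.getJointInfo(robot, i)[12].decode()] = i

eye = link_index["l_eye"]  # placeholder frame name
pos, orn = p.getLinkState(robot, eye)[4:6]  # URDF link frame pose in world

# Point the camera along the frame's local x axis (the axis choice is an assumption).
rot = p.getMatrixFromQuaternion(orn)
forward = [rot[0], rot[3], rot[6]]  # local x axis in world coordinates
up = [rot[2], rot[5], rot[8]]       # local z axis in world coordinates
target = [pos[i] + forward[i] for i in range(3)]

view = p.computeViewMatrix(pos, target, up)
proj = p.computeProjectionMatrixFOV(fov=60, aspect=4 / 3, nearVal=0.01, farVal=10)
width, height, rgb, depth, seg = p.getCameraImage(320, 240, view, proj)
```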

@traversaro
Member Author

traversaro commented Apr 8, 2021

Hi @vicentepedro, I would say the most official model with eyes + hands (even if not completely automatically generated from CAD) is iCubGazeboV2_5_visuomanip. It includes movable eyes and actuated hands, and you can see it in use in the icub-gazebo-grasping-sandbox. We had problems in the past using this model in PyBullet, but the workaround is relatively easy (see robotology/icub-models#12).

If you find a way to use that model in PyBullet, feel free to report your success, for example in robotology/community's "Show and Tell" category. If instead you run into any issue, feel free to open a new issue in https://github.com/robotology/icub-models, which is the public-facing repo for iCub models (we should actually move all the issues from this repo to that one).

@vicentepedro

Thanks @traversaro

@Tiago-N will try that model instead, and we will report how it goes.

@diegoferigo
Member

We are testing the following repos:
https://github.com/diegoferigo/icub-model-pybullet from @diegoferigo
and
https://github.com/robotology-playground/pybullet-robot-envs

The diegoferigo/icub-model-pybullet repository was an early experiment and it is no longer maintained. For the applications we were interested in, we switched to Ignition Gazebo instead of PyBullet. I'm going to archive the repository.

As for robotology-playground/pybullet-robot-envs, I'm not sure if it's still actively maintained. The main developer recently left IIT, and I don't know whether there is an internal plan to continue its development.

@traversaro
Member Author

As for robotology-playground/pybullet-robot-envs, I'm not sure if it's still actively maintained. The main developer recently left IIT, and I don't know whether there is an internal plan to continue its development.

Probably on this @xEnVrE may know something.

@traversaro
Member Author

@Tiago-N will try that model instead, and we will report how it goes.

Ok, pay attention to the issue in robotology/icub-models#12: it is quite critical, but it should be easy to work around.

@xEnVrE

xEnVrE commented Apr 8, 2021

As for robotology-playground/pybullet-robot-envs, I'm not sure if it's still actively maintained. The main developer recently left IIT, and I don't know whether there is an internal plan to continue its development.

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for that repository.

@traversaro
Member Author

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for this repository.

And just to clarify, with "this repository", you mean https://github.com/robotology-playground/pybullet-robot-envs, right?

@xEnVrE

xEnVrE commented Apr 8, 2021

Probably on this @xEnVrE may know something.

As far as I know, at the moment there are no planned activities for this repository.

And just to clarify, with "this repository", you mean https://github.com/robotology-playground/pybullet-robot-envs, right?

Sorry for not being clear. Yes, I meant that repository. I think we can also ask @fbottarel for information on that repository.

@fbottarel

I was never involved in https://github.com/robotology-playground/pybullet-robot-envs, and as far as I know no one uses or maintains it right now.

@S-Dafarra
Contributor

Let me try to revive this issue. Lately we have been working on the teleoperation pipeline using iCub3. It would be useful to have an iCubGazeboV3 robot with hands.
