Alternative IMU orientations #37
Comments
Where do you see that assumption? I've tested it in a lot of different initial orientations without problems. The only thing that's buggy is the initialization code, with all the Euler angles and stuff, but it's supposed to work for all initial orientations. Also see #36.
It does work for all starting orientations of an ENU IMU, where the acceleration vector points along +ve z when the filter should output (0, 0, 0) for orientation. For an NED IMU, when the sensor is oriented (0, 0, 0), acceleration points along -ve z, since +ve z points towards the ground. However, imu_filter_madgwick determines that the sensor is upside down, because it does not know that imu_link is not ENU aligned. I've confirmed this with our NED Microstrain IMU: processing the sensor's acceleration and gyro through imu_filter_madgwick when the sensor is upright yields a (pi, 0, 0) orientation, whereas the IMU's internal EKF outputs (0, 0, 0), because it is an NED sensor. So in essence, adding an 'imu_reference_frame' that IS ENU aligned will transform incoming data into ENU, specifically so that acceleration points along +ve z when the sensor is at (0, 0, 0). This will also correct the issue of unorthodox mounting positions present on some of our robots, where an ENU IMU is mounted sideways.
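The NED-to-ENU axis remap under discussion can be sketched in a few lines (illustrative only, not code from imu_filter_madgwick): NED (x = north, y = east, z = down) maps to ENU (x = east, y = north, z = up) by swapping x and y and negating z, applied identically to accelerometer, gyro, and magnetometer vectors.

```python
def ned_to_enu(v):
    """Remap a 3-vector from NED axes to ENU axes: (x, y, z) -> (y, x, -z)."""
    x, y, z = v
    return (y, x, -z)

# An upright NED sensor reports the gravity reaction along -z;
# after the remap it points along +z, as an ENU-assuming filter expects.
accel_ned = (0.0, 0.0, -9.81)
accel_enu = ned_to_enu(accel_ned)  # -> (0.0, 0.0, 9.81)
```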
Ah, I see. So the problem you want to solve is that you want the output of […].

I don't think I like the proposed solution, however. In my mind, there is no such thing as an "ENU sensor" or an "NED sensor"; the device just provides accelerations, angular velocities and magnetic field readings, and it's up to the fusion stage to merge them into an orientation, so that's where the decision between ENU and NED is made, regardless of what's printed on the IMU housing. REP 103 mandates the use of ENU, and I think it's important to stick with this convention in ROS (or rather, to stick with any convention, as long as there's exactly one). However, REP 103 offers a way out: […]
This is similar to what ROS camera drivers (e.g. openni) are doing; their main tf frame (e.g., […]) […]. So what I think we should do is: […]
Your internal EKF should probably simply publish its messages in the […] frame.

paulbovbel wrote: […]

That should be handled by a proper transform from […].
Okay, so disregarding the ENU/NED discussion for now, although I do agree it's important and would like to deal with IMUs in a REP in the near future.

The reality is that IMUs may come in one flavour or the other, and as long as an NED device publishes with a default frame_id of imu_link_ned, putting the data into ENU is just a transform away. But this is just in terms of the IMU. If the IMU is mounted sideways, imu_link is now neither ENU nor NED in the geographic scheme of things. This is okay; not all frames can be geographically ENU/NED (a silly example: wheels), as long as there's a context (e.g. base_link) up the tree that is.

I'm aiming to use the output of imu_filter_madgwick to replace the internal EKF on the UM6. If my NED IMU publishes data in frame /imu_link_ned, simply plugging it into imu_filter_madgwick will not work. I first need to transform the IMU data into an ENU-referenced frame. This may be imu_link, or, if the IMU is mounted sideways, it may be base_link instead. See the gist here for my test setups; sorry it's not highlighted, but if you click edit you can turn on XML formatting.

At the end of the day, the driver should not be listening to TF and transforming the data into a geographically correct ENU/NED format. The driver should dump the data from the device in a consistent form (ENU or NED), which is what happens in the wild today. The robot integrator sets up the contextual information (i.e. the URDF). The receivers of the IMU data can then make informed decisions about how to process it. In the case of imu_filter_madgwick, we can already assert that incoming data has to be ENU.

To that end, my proposal is that instead of users having to add an intermediary node to transform the data to ENU, they could instead specify an ENU reference frame (e.g. base_link) that imu_filter_madgwick can use to pre-process the data.
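The pre-processing step proposed above amounts to rotating each incoming vector by the static mounting rotation between the IMU frame and the ENU-aligned reference frame. A minimal sketch with plain quaternion math (the mounting rotation here is hypothetical; ROS orders quaternions as (x, y, z, w)):

```python
import numpy as np

def quat_mult(q1, q2):
    """Hamilton product; quaternions given as (x, y, z, w), the ROS convention."""
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return np.array([
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
    ])

def rotate_vec(q, v):
    """Rotate vector v by unit quaternion q (v' = q * v * q^-1)."""
    qv = np.array([v[0], v[1], v[2], 0.0])
    q_conj = np.array([-q[0], -q[1], -q[2], q[3]])
    return quat_mult(quat_mult(q, qv), q_conj)[:3]

# Hypothetical mounting: IMU rolled 180 degrees about x relative to base_link,
# i.e. q_base_imu = (1, 0, 0, 0) in (x, y, z, w).
q_base_imu = np.array([1.0, 0.0, 0.0, 0.0])
accel_imu = np.array([0.0, 0.0, -9.81])         # gravity reaction, sensor upside down
accel_base = rotate_vec(q_base_imu, accel_imu)  # -> ~(0, 0, +9.81), 'up' again
```

The same rotation would be applied to the gyro and magnetometer vectors before they reach the filter.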
Actually, alternatively, it may be easiest just to write an imu_transformer node/nodelet combo; would you be willing to add this to imu_tools? This would keep the Madgwick filter code from getting too complicated. The reasoning for all this is that any nodelet running in the same manager as the filter has to be GPL licensed, so this seems like the most appropriate home.
I believe I'm still missing your point. On our robot, we have an IMU that's mounted "upside down" (i.e., the gravity vector goes to -z as reported by the IMU driver; it's an "NED" IMU if you go by the axes printed on it). Also, its x axis points to the back of the robot. This is reflected in the URDF, as can be seen here (it's a bit hard to see, but the IMU z axis points down and the base_footprint z axis points up). The URDF is read by robot_state_publisher, which publishes the TF transform […].

Maybe here's the part you're missing: robot_pose_ekf transforms the orientations from the imu topic into the base_footprint frame, so it doesn't matter in which frame they are given, as long as the base_footprint --> imu tf is correct. (BTW, robot_pose_ekf only transforms the orientations, not the covariance matrices, which is probably a bug.)

If we were to mount the IMU in any other way, all we would have to do is make sure that the URDF still properly reflects the base_footprint --> imu tf. Theoretically, this would even work if we mounted the IMU on the wheels, although that would be a terrible idea. :-)

If you want to have a closer look, it's all in the calvin_bringup package. The top-level launch file is calvin.launch, and the only other ones of interest for you are imu.launch and ekf.launch.
+1 for using a separate node/nodelet. However, @chadrockey already has an imu_transformer node, and I'd rather use that (or ask him to also provide a nodelet) than duplicate functionality between imu_tools and imu_pipeline.
Oh crap. I'll have to read up a bit more on this, and see whether we can re-license imu_filter_madgwick under the LGPL if all authors agree, and whether that would actually help at all (see #28). Alternatively, if your tf2 PR mentioned earlier goes through, then the transformation would be literally one line of code, so we could skip the transformer altogether and simply expect the client to transform the data into the desired frame (like robot_pose_ekf already does, and like many point cloud processing nodes do).
TF is probably the best place for this. I don't really believe nodelets are necessary for the total bandwidth of IMU data, given the headaches they cause in reliability and debugging of live systems, especially with the GPL issues. Typically it is sufficient to turn Nagle's algorithm off to achieve low-latency, high-rate IMU data.

I did, in fact, try to get imu_filter_madgwick relicensed, but the discussion fell a little flat and I didn't have a lot of motivation other than that it was a nice filter. Nowadays I think that porting the Apache v2 sensor fusion from Android would be a good choice: it has a good tuning for cellphone-quality sensors, supports accelerometers, gyros, and magnetometers, and is pretty well tested by gamers, possibly by Project Tango, and by others all over the world. It also runs in Android in various modes, with or without certain sensors. Maybe give it a shot?
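On the Nagle's-algorithm point: in roscpp, a subscriber can request this per connection via `ros::TransportHints().tcpNoDelay()`. At the raw-socket level it is a single option, sketched here in Python:

```python
import socket

# Disable Nagle's algorithm so small, frequent IMU messages are sent
# immediately instead of being coalesced into larger packets,
# trading a little bandwidth efficiency for lower latency.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
sock.close()
```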
@chadrockey, I will look into porting the Apache-licensed code in the near future; it would definitely be a big help. My immediate goal, though, is to get imu_filter_madgwick working with our Husky robot so that we can do an Indigo release.

@mintar, LGPL would definitely help; it would allow loading the Madgwick nodelet in a nodelet manager along with BSD code. I'll be honest, I'm not really sure what the disconnect is. My adventures with IMUs over the past few weeks have led me to find a lot of inconsistencies. From what I've found for the UM6, the data needs to be transformed into an ENU frame before hitting imu_filter_madgwick. Although we have an ENU mode baked into the driver, this doesn't save me because the IMU is mounted sideways.

To that end, would you be willing to help out with an imu_tools release including an imu_transformer nodelet? I'm hesitant to pull on imu_pipeline since it's currently organized as one top-level package. It looks like there's interest in changing that (ros-perception/imu_pipeline#3 (comment)), but that would be a pretty big change for Indigo at this late point. @chadrockey, would you be interested in pulling an updated transformer into imu_pipeline for Jade?
I don't have a lot of interest in imu_pipeline these days, but I'd be glad to give up or share maintainership.
@chadrockey I'd be happy to take over or share maintainership if you're willing.
Hi @paulbovbel, I'd like to know: before feeding data into imu_filter_madgwick, is it enough to transform from body-NED w.r.t. NED to body-ENU w.r.t. NED, or do I need the full transform from body-NED w.r.t. NED to body-ENU w.r.t. ENU? Thanks!
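One way to see why the full conversion is needed (an illustrative sketch with rotation matrices, not code from the filter): the NED/ENU axis remap is its own inverse, and converting an orientation means conjugating by it, which changes both the world reference frame and the body frame. A consequence is that yaw flips sign, consistent with heading being clockwise-from-north in NED but counter-clockwise-from-east in ENU.

```python
import numpy as np

# Axis remap between NED (x=N, y=E, z=down) and ENU (x=E, y=N, z=up).
# Note R_NE is its own inverse: applying it twice restores the axes.
R_NE = np.array([[0.0, 1.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 0.0, -1.0]])

def ned_orientation_to_enu(R_ned):
    """body-NED w.r.t. world-NED  ->  body-ENU w.r.t. world-ENU.

    Conjugating by R_NE remaps BOTH the world frame and the body frame,
    i.e. the 'full transform' option in the question above.
    """
    return R_NE @ R_ned @ R_NE

def Rz(a):
    """Rotation by angle a about the local z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Sanity checks: an upright, north-facing sensor is the identity in both
# conventions, and a heading rotation about 'down' in NED becomes the
# opposite-signed rotation about 'up' in ENU.
identity_ok = np.allclose(ned_orientation_to_enu(np.eye(3)), np.eye(3))
yaw_flips = np.allclose(ned_orientation_to_enu(Rz(0.3)), Rz(-0.3))
```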
Currently, imu_filter_madgwick assumes all incoming data is in a frame where the +ve z axis points 'up'. This does not work for NED IMUs, since these have the +ve z axis pointing 'down' towards the ground when the IMU is upright.
Working on a pull request to transform the IMU data into a parametrized imu_reference_frame (e.g. base_link) before filtering. Documentation will have to be added noting that this frame has to be ENU for the filter to obtain the correct orientation estimate.