
Device or resource busy while slamming #10332

Closed
AlaricLcrs opened this issue Mar 23, 2022 · 10 comments

Comments

@AlaricLcrs

| Camera Model | d435i |
| Operating System & Version | Ubuntu 18.04 |
| Platform | Jetson |
| Segment | Robot |

Issue Description

Hello!

I'm currently working on a walking robot with a D435i camera connected to a Jetson board. I'm trying to implement an autonomous navigation module, and to do so I need to SLAM the room the robot is in. However, when I try this with the opensource_tracking.launch file included in the ROS SDK, I get a loop of "error : Device or resource busy, number : 16" messages and the camera freezes in RViz/rtabmap. Before the robot starts walking it seems to work correctly, though the FPS seems rather low and I get "Failed to meet update rate" messages in my terminal. Do you know how I could fix this issue?

Thanks!

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 23, 2022

Hi @AlaricLcrs There have been past reports of the Failed to meet update rate! message occurring when using the opensource_tracking.launch launch file, so I would not strongly suspect a hardware problem with the camera, especially as it is able to publish data (though at a slow rate).

There is very little documentation available on this error. When ROS does generate this error though, it sometimes provides this advice: "Try decreasing the rate, limiting sensor output frequency, or limiting the number of sensors". I believe that 'sensor output frequency' is referring to the D435i's IMU frequency for gyro and accel. The launch log in the terminal should state the frequency that the IMU topics are being published at, just before the 'RealSense Node is Up' line near the end of the log.
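As a quick way to check those IMU publish rates while the wrapper is running, `rostopic hz` can be pointed at the gyro and accel topics. The topic names below assume the wrapper's default `camera` namespace; adjust them if your launch file remaps it:

```shell
# Measure the actual publish rate of the IMU topics.
# (Default realsense2_camera topic names; these are assumptions --
# check `rostopic list` for the names on your system.)
rostopic hz /camera/gyro/sample
rostopic hz /camera/accel/sample
```

These commands need a running ROS master and the camera node up, so they are shown here only as a fragment.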


In regard to the error : Device or resource busy, number : 16 error, I have seen a couple of past issues involving this error on Jetson Nano, where the cause was suspected to be insufficient power being provided to the camera.
IntelRealSense/realsense-ros#1164 is a case where a RealSense user performed a very detailed analysis of this problem.

Intel themselves strongly recommend enabling the barrel jack connector on Nano for additional power if the particular Nano model being used has one and point to the instructions at the link below for doing so.

https://www.jetsonhacks.com/2019/04/10/jetson-nano-use-more-power/

Is your Jetson model a Nano, please?


Also, in the RealSense ROS wrapper the gyro and accel topics are disabled by default. So does it make a difference if you add the following instructions onto the end of your opensource_tracking.launch roslaunch instruction?

enable_gyro:=true enable_accel:=true

For example:

roslaunch realsense2_camera opensource_tracking.launch enable_gyro:=true enable_accel:=true

@AlaricLcrs
Author

Thank you for your answer!

I have tried running roslaunch realsense2_camera opensource_tracking.launch enable_gyro:=true enable_accel:=true but I'm not noticing anything different; I'm still getting the "Failed to meet update rate!" messages.
Regarding the "Device or resource busy" messages, I am not using a Jetson Nano but a Jetson NX. I think you might be right that it's related to insufficient power, though it might also be a CPU issue. I've attached a screenshot of htop, which shows that both cores are at full use (they're only at 2% when RViz isn't running). Do you have any ideas for fixing these performance issues?

Thanks!

[Screenshot, 2022-03-23: htop output showing both CPU cores at full utilization]

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 23, 2022

opensource_tracking.launch is setting align_depth to True.

https://github.com/IntelRealSense/realsense-ros/blob/development/realsense2_camera/launch/opensource_tracking.launch#L5

Alignment is a processing-intensive operation. If you are using Jetson (which has an Nvidia graphics GPU) then you can enable CUDA support for acceleration of alignment by offloading the processing of alignment from the CPU onto the GPU.

If you built librealsense and the ROS wrapper separately then CUDA support can be enabled either by building librealsense from packages or building from source code with the -DBUILD_WITH_CUDA=true flag included in the CMake build instruction.

If librealsense and the ROS wrapper are built together from packages with the ROS wrapper's Method 1 instructions then CUDA support is not included in the build.

#2670 provides a chart with estimates illustrating the potential reduction in CPU processing burden when performing alignment with CUDA support enabled.

Enabling alignment (and also pointclouds) on Jetson devices specifically in the ROS wrapper is also known to create problems such as slowdown that do not occur when the same configuration is used on non-Jetson computers with Ubuntu such as desktop / laptop PCs, or when align or pointcloud are not enabled on Jetson.

@AlaricLcrs
Author

AlaricLcrs commented Mar 24, 2022

Hello!

Thank you for your answer! I'm not getting the "Device or resource busy" messages anymore, so it really was a CPU problem; thank you for solving that (with the CUDA flag)! Now I would like the robot to be able to SLAM without being connected to a screen. I've tried unplugging the HDMI cable and making the robot travel across the room, but it doesn't seem to record anything while unplugged. Do you know if it would be possible to record without being connected to a screen? Thanks.
Also, I've retried SLAM with rtabmap now that the workspace has been built with CUDA, but unfortunately the 3D map in the software never seems to load, as you can see in the attached screenshot. Also, the FPS is much lower than in RViz. Do you know what might cause all that?

Thanks!

[Screenshot, 2022-03-24: rtabmap window with the 3D map failing to load]

@MartyG-RealSense
Collaborator

If you plan on using the librealsense SDK "headless" without a display then you can build the SDK without graphics support by using the build flags -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=false

This means that when the SDK is built, it will not include OpenGL support and graphics-based tools and examples such as the RealSense Viewer will not be included in the build. Plain-text tools and examples such as rs-hello-realsense and rs-data-collect will be included in the build though.
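As a minimal sketch of how those flags would sit in a build command (shown as a dry run that only prints the cmake invocation, since a real build needs the librealsense source tree and a configured build directory):

```shell
# Headless build flags from the advice above; this script only prints
# the cmake command rather than running it.
HEADLESS_FLAGS="-DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=false"
echo "cmake .. ${HEADLESS_FLAGS} -DCMAKE_BUILD_TYPE=release"
```

In an actual build you would run the printed command from a `build` directory inside the librealsense source tree, followed by `make` and `sudo make install`.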

Rather than having a single computer on the robot without a display attached, it would be more typical to also have a 'host' computer and connect to the robot's computer remotely over a wi-fi / WLAN wireless connection.

IntelRealSense/realsense-ros#2052 is relevant to this kind of remote connection and also to your problem with not being able to record without a screen (as the RealSense ROS user in that case experiences the same issue).
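For illustration, a standard ROS 1 network setup between the robot and a host looks something like the snippet below. The IP addresses are placeholders for your own network; the idea is that both machines point at the same ROS master, so RViz or rosbag can run on the host while the robot roams with no display attached:

```shell
# On the Jetson (robot), assuming the ROS master runs on the robot.
# 192.168.1.10 (robot) and 192.168.1.20 (host) are placeholder addresses.
export ROS_MASTER_URI=http://192.168.1.10:11311   # where the master lives
export ROS_IP=192.168.1.10                        # this machine's own address

# On the host PC you would set, in its own shell:
#   export ROS_MASTER_URI=http://192.168.1.10:11311
#   export ROS_IP=192.168.1.20
# after which the host's RViz / rosbag can subscribe to the robot's topics.
```

Both machines must be able to resolve each other's addresses; setting ROS_IP explicitly avoids hostname-resolution surprises on headless setups.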

Other RealSense ROS users have experienced a low FPS when using RTABMAP. introlab/rtabmap_ros#358 should be a helpful reference.

@MartyG-RealSense
Collaborator

Hi @AlaricLcrs Do you require further assistance with this case, please? Thanks!

@AlaricLcrs
Author

Yes, sorry I haven't replied. I don't think I've used the flags -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=false the right way. When am I supposed to use them?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Apr 1, 2022

If you were building the librealsense SDK for Jetson from source code after patching the Linux kernel then you would use them within a CMake build instruction, as described in the instructions linked to below.

https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md#building-from-source-using-native-backend

For example:

cmake .. -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=false -DCMAKE_BUILD_TYPE=release -DFORCE_RSUSB_BACKEND=false -DBUILD_WITH_CUDA=true && make -j$(($(nproc)-1)) && sudo make install


Alternatively, if you were building from source code on Jetson with the RSUSB backend method (which is not dependent on Linux or kernel versions and does not require patching) then the Jetson instructions at #6964 (comment) may be helpful.

For example:

cmake ../ -DFORCE_RSUSB_BACKEND=ON -DBUILD_PYTHON_BINDINGS:bool=true -DPYTHON_EXECUTABLE=/usr/bin/python3.6 -DCMAKE_BUILD_TYPE=release -DBUILD_EXAMPLES=true -DBUILD_GRAPHICAL_EXAMPLES=false -DBUILD_WITH_CUDA:bool=true

@MartyG-RealSense
Collaborator

Hi @AlaricLcrs Do you require further assistance with this case, please? Thanks!

@MartyG-RealSense
Collaborator

Case closed due to no further comments received.
