D435 and T265 for Navigation
I am using a D435 and a T265 on an Nvidia Jetson TX2 to navigate a Turtlebot2.
- If both the D435 and T265 are connected while the TX2 boots, the T265 is never detected. I have to disconnect and reconnect the T265, and it works fine after that. Any solutions for this issue?
- I am creating a 2D map of the environment using the occupancy package (https://github.com/IntelRealSense/realsense-ros/tree/occupancy-mapping/occupancy). What is the best way to save the created map? I am using the following command:
rosrun map_server map_saver map:=/occupancy -f my_map_1
- In order to use wheel odometry with the T265, do I need to use the following code (https://github.com/schmidtp1/librealsense/blob/wheel-odometry-python-sample/wrappers/python/examples/t265_wheel_odometry/t265_wheel_odometry.py)?
- Once I have the map, what is the best way to localize and navigate in the environment?
- One way is to use D435 depth frames, convert them to a laser scan using depthimage_to_laserscan, and then use amcl. But how can I make use of the T265 for better navigation?
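For background on that first approach: depthimage_to_laserscan takes a horizontal band of the depth image and converts each column into a range reading along that column's ray. A rough numpy sketch of the idea (this illustrates the geometry only, not the package's actual code; fx and cx are hypothetical camera intrinsics):

```python
import numpy as np

def depth_row_to_scan(depth_row_m, fx, cx):
    """Convert one row of a depth image (in metres) into laser-scan style
    (angle, range) pairs, mimicking what depthimage_to_laserscan does."""
    u = np.arange(depth_row_m.size)      # pixel column indices
    x = (u - cx) * depth_row_m / fx      # lateral offset of each ray
    z = depth_row_m                      # forward distance along optical axis
    angles = np.arctan2(x, z)            # ray angle relative to the optical axis
    ranges = np.hypot(x, z)              # Euclidean range along the ray
    return angles, ranges

# Toy example: a flat wall 2 m in front of a camera with fx=600, cx=320.
row = np.full(640, 2.0)
angles, ranges = depth_row_to_scan(row, fx=600.0, cx=320.0)
print(round(float(ranges[320]), 3))  # centre ray hits the wall head-on -> 2.0
```

The flat wall gives a 2.0 m range on the centre ray and longer ranges toward the image edges, which is the shape a planar laser scan of a wall would have.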
-
If the D435 launches before the T265 then it can prevent the T265 from being detected correctly. A number of people encountered this issue, and it is discussed in the link below.
https://github.com/IntelRealSense/realsense-ros/issues/774
As indicated in the discussion, code was added to the RealSense ROS wrapper to help address the problem. The wrapper also later added the option to launch devices by USB port ID instead of serial number with the usb_port_id instruction.
https://github.com/IntelRealSense/realsense-ros#launch-parameters
I also recommend using the launch file designed for running the T265 and a depth camera together, if you are not already launching this way:
roslaunch realsense2_camera rs_d400_and_t265.launch
Regarding the question about the best way to save the 2D occupancy map, I recommend posting a question at the RealSense ROS GitHub forum. You can do so by visiting the link below and clicking the New Issue button.
https://github.com/IntelRealSense/realsense-ros/issues
As far as I know, the wheel odometry Python sample program that you linked to is the most recent official example and the best solution currently available. You may also be interested in new wheel odometry calibration information that was recently added to the T265 documentation.
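For reference, the pattern that the linked sample follows can be sketched as below. This is an illustrative sketch only: the calibration content and velocity value are placeholders, and the device portion needs pyrealsense2 plus a connected T265, so it is defined but not called here.

```python
def calib_to_bytes(text):
    # load_wheel_odometery_config() expects the wheel-odometry calibration
    # JSON as a list of byte values, so convert each character.
    return [ord(c) for c in text]

def send_wheel_velocity(calib_text, vx):
    # Connect to a T265 and send one wheel-odometry sample.
    # Requires pyrealsense2 and an attached T265, so it is not called here.
    import pyrealsense2 as rs
    pipe = rs.pipeline()
    profile = rs.config().resolve(pipe)
    odom = profile.get_device().as_tm2().first_pose_sensor().as_wheel_odometer()
    odom.load_wheel_odometery_config(calib_to_bytes(calib_text))
    pipe.start()
    try:
        v = rs.vector()                    # linear velocity, metres/second
        v.x = vx
        odom.send_wheel_odometry(0, 0, v)  # sensor id 0, frame number 0
    finally:
        pipe.stop()

print(calib_to_bytes("{}"))  # -> [123, 125]
```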
The link below provides some good tips on improving accuracy once the T265 is in motion.
https://github.com/IntelRealSense/librealsense/issues/3970
Intel has published an excellent guide to using the T265 and a 400 Series camera together.
https://dev.intelrealsense.com/docs/depth-and-tracking-cameras-alignment
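At its core, that alignment is a chain of rigid transforms: a point from the depth camera goes through the fixed extrinsic between the two cameras, then through the T265's live pose (a rotation quaternion plus a translation). A small numpy sketch of the math; the extrinsic H, pose q, and translation t below are made-up illustrative numbers, not values from a real device:

```python
import numpy as np

def quat_to_mat(x, y, z, w):
    """Rotation matrix from a unit quaternion, components in the
    x, y, z, w order that librealsense pose data uses."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def point_to_world(p_depth, H_t265_depth, q, t):
    """Map a 3D point from the depth camera frame into the T265 world frame:
    first the fixed camera-to-camera extrinsic, then the live pose."""
    p_t265 = H_t265_depth[:3, :3] @ p_depth + H_t265_depth[:3, 3]
    return quat_to_mat(*q) @ p_t265 + t

# Identity extrinsic and a 90-degree yaw pose, purely illustrative numbers.
H = np.eye(4)
q = (0.0, np.sin(np.pi/4), 0.0, np.cos(np.pi/4))  # 90 degrees about y
t = np.array([0.0, 0.0, 1.0])
p = point_to_world(np.array([0.0, 0.0, -1.0]), H, q, t)
print(np.round(p, 3))  # world-frame result: (-1, 0, 1)
```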
-
Thanks for the reply!
Are the code samples for demo in the video (https://www.youtube.com/watch?v=62vm0_RZ1nU&feature=youtu.be&t=1036) released?
-
I hope that the link below will be helpful to you.
https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md#examples-and-tools
-
Could you make a launch file for the T265 / D400 / L515, please, to run all three of them?
Do you provide any support for running the librealsense ROS wrapper on a Jetson, within a Docker container? Shall I create a separate post for that purpose?
The ROS2 wrapper works, but the ROS1 wrapper throws many messages after starting the node:
INFO] [1591601602.923007659]: RealSense Node Is Up!
08/06 00:33:22,965 ERROR [546442297728] (synthetic-stream.cpp:48) Exception was thrown during user processing callback!
08/06 00:33:22,974 WARNING [547130171776] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
08/06 00:33:22,994 ERROR [546442297728] (synthetic-stream.cpp:48) Exception was thrown during user processing callback!
08/06 00:33:23,026 ERROR [546442297728] (synthetic-stream.cpp:48) Exception was thrown during user processing callback!
08/06 00:33:23,035 WARNING [547130171776] (messenger-libusb.cpp:42) control_transfer returned error, index: 768, error: No data available, number: 61
08/06 00:33:23,059 ERROR [546442297728] (synthetic-stream.cpp:48) Exception was thrown during user processing callback!
-
Hi Andre. Can you re-post your message on the RealSense ROS GitHub forum please, as cases involving the T265 or L515 are not handled on this forum? You can do so by visiting the link below and clicking on the New Issue button. Thanks!
-
Hi MartyG, Thank you for following up.
Which issues are covered at this forum? D435? D435i? They throw the messages listed above. The T265 seems to be working; the L515 seems to throw the same messages as the D series. I will try to post to GitHub; however, in the past, I believe I posted there and haven't got any response.
As you can see, no one responded there: https://github.com/IntelRealSense/realsense-ros/issues/1287
We bought a bunch of RealSense devices, including a T265, D435, D435i, and L515, and none of them except the T265 work from an Nvidia Jetson Xavier. Is it likely that Intel will help with getting them running with the ROS wrapper via GitHub at all?
-
All camera issues not related to T265 and L515 can be handled on this forum. This includes all 400 Series cameras and SR300 / SR305.
I am on both this forum and the two GitHubs (the main one and the ROS one). Someone else is responsible for T265 and L515 cases though, which is why I was not able to take up your linked ROS GitHub case because it mentioned L515 and T265. The developer of the RealSense ROS wrapper is based at the ROS GitHub though.
I recommend closing the linked case and re-posting it with the middle comment about L515 / T265 left out, so that I can take on that case tomorrow if it only mentions D435 / D435i. Please post the L515 issue as a separate case so that the person who handles L515 can give it their attention. Thanks for your patience!
-
Thanks very much, your edit was a good approach. I will make a note to look at it tomorrow (I am off-duty now).
There is a tutorial for using two 400 Series cameras together here:
https://www.intelrealsense.com/how-to-multiple-camera-setup-with-ros/
You can also launch more than one camera with rs_multiple_devices.launch
https://github.com/IntelRealSense/realsense-ros#work-with-multiple-cameras
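rs_multiple_devices.launch tells the cameras apart by serial number. If you are unsure of the serials, a small pyrealsense2 sketch like the one below can list them (this assumes pyrealsense2 is installed; it returns an empty list when the library or the cameras are absent):

```python
def list_realsense_serials():
    # Return (name, serial) for each connected RealSense device, or an
    # empty list if pyrealsense2 is missing or nothing is plugged in.
    try:
        import pyrealsense2 as rs
    except ImportError:
        return []
    return [(dev.get_info(rs.camera_info.name),
             dev.get_info(rs.camera_info.serial_number))
            for dev in rs.context().query_devices()]

print(list_realsense_serials())  # [] on a machine with no cameras attached
```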
A 400 Series and T265 can also launch together with roslaunch realsense2_camera rs_d400_and_t265.launch
T265 on its own can be launched with rs_t265.launch
https://github.com/IntelRealSense/realsense-ros#using-t265
It looks as though L515 can use roslaunch realsense2_camera rs_camera.launch filters:=pointcloud
https://github.com/IntelRealSense/realsense-ros/issues/1246
-
There are issues implementing ros2-foxy-librealsense2.
The instructions at https://github.com/intel/ros2_intel_realsense/tree/refactor#install-ros2_intel_realsense-from-source won't work.
The command sudo apt-get install ros-foxy-librealsense2 cannot find the apt package.
Building from source returns multiple errors, e.g.
CMake Error at /opt/ros/foxy/share/std_msgs/cmake/ament_cmake_export_dependencies-extras.cmake:43 (get_target_property):
get_target_property called with incorrect number of arguments
Call Stack (most recent call first):
/opt/ros/foxy/share/std_msgs/cmake/std_msgsConfig.cmake:41 (include)
CMakeLists.txt:20 (find_package)
-
Hi Andre. I cannot provide any further information here, as this particular case is now being handled by another RealSense support team member on the ROS GitHub.
-
On the ROS GitHub where?
https://github.com/IntelRealSense/realsense-ros/issues/1287 ?
Or in some other thread?
Thanks
-
Yes, on #1287, where you posted the Foxy question a little earlier. The RealSense team member RealSenseSupport is handling your case.
https://github.com/IntelRealSense/realsense-ros/issues/1287#issuecomment-660253774
-
Can you help out with running calibration on the D435 / D435i? Self-calibration (https://www.google.com/amp/s/www.intelrealsense.com/self-calibration-for-depth-cameras/amp/)?
Dynamic calibration?
For the latter, I cannot find packages in the added AWS librealsense repository for the Arm architecture.
-
Hi Andre I have responded to your self calibration question on the GitHub.
https://github.com/IntelRealSense/realsense-ros/issues/1287#issuecomment-661858975
The top of page 16 of the user guide for the Dynamic Calibrator explains how you can install through a local file as well as the AWS method described on page 15.
-
Thank you for your response!
At the moment I am trying to find out how to run the RealSense Viewer remotely in order to approach self-calibration; it wouldn't run over X forwarding.
Regarding the dynamic calibration, I was looking for where to download the deb files. Maybe there is some URL where they can be downloaded, specifically for arm64?
-
My interpretation of the instructions for the 2.11.0.0 version (July 2020) of the Dynamic Calibrator tool is that it needs to be installed using the Linux installation method for Ubuntu 16.04 / 18.04. This would suggest that the AWS method needs to be used, since the local file is described as being in AMD64 format (i.e. not Arm architecture).
Did you follow the instructions for Linux installation from page 14 of the guide please, completing the prerequisites before trying to access the AWS address on page 15?
-
I checked the instructions very carefully but there is no further guidance about Arm. Can you ask about it on the main GitHub please by creating a case there? Thanks!
-
Thank you for following up!
I am being asked if it is possible to implement a finger / gesture / hand tracker like the one illustrated in the article https://www.intelrealsense.com/hand-tracking-overview/
with the use of D400 devices.
Could you share any insights regarding implementing something like what is described in the article with a D400 on Linux, please?
-
Finger and hand tracking is not strongly supported on the 400 Series cameras by existing programs. Nuitrack SDK can do hand gesture recognition, and they also have a new Nuitrack AI product in development that can do finger tracking.
Intel published a seminar about body joint tracking for RealSense. It requires you to use multiple cameras to train a dataset, though, before that data can be used for tracking with a single camera.