We have a D435i camera on our 2D mobile robot, used to improve the orientation estimation of the system. Data is streamed over a USB-C (3.1) cable within the ROS2 (Humble) framework, so we are using the realsense-ros wrapper.
As a starting point, we are trying to achieve the best possible result using only the gyroscope, before fusing it with the wheel encoders.
We noticed that the orientation estimate differs depending on which direction the robot turns: specifically, performance is worse when the robot turns clockwise than when it turns counterclockwise. This result does not seem to depend on
- the estimation method used (e.g., simple integration or a Kalman filter);
- whether or not the IMU is calibrated.
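For context, the "simple integration" baseline we mention is essentially the following minimal sketch (Python, with simulated data; the `bias` and `scale` correction terms are illustrative assumptions, not our actual calibration values). A direction-dependent error like the one we see could, in principle, come from an uncorrected bias or an asymmetric scale factor, since both affect one turning direction more than the other:

```python
import numpy as np

def integrate_yaw(gyro_z, dt, bias=0.0, scale=1.0):
    """Integrate z-axis gyro samples (rad/s) into a yaw angle (rad).

    `bias` and `scale` are hypothetical per-device correction terms:
    an uncorrected bias drifts the estimate in one direction, and a
    scale error grows with the magnitude of the turn rate.
    """
    return float(np.sum((np.asarray(gyro_z) - bias) * scale) * dt)

# Simulated data: 2 s of a constant 0.5 rad/s turn sampled at 200 Hz,
# corrupted by a small constant bias (as measured while stationary).
dt = 1.0 / 200.0
true_rate, bias = 0.5, 0.01
samples = np.full(400, true_rate + bias)

yaw_raw = integrate_yaw(samples, dt)             # drifts by bias * 2 s
yaw_corrected = integrate_yaw(samples, dt, bias=bias)
print(yaw_raw, yaw_corrected)
```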
We are probably missing something. Do you have any suggestions? Feel free to ask if you need more details about our setup.
- This behavior has been observed on 2 of the 3 cameras we have tried so far.
- All cameras run the latest firmware.