
"RuntimeError: Frame didn't arrive within 5000"

34 comments

  • MartyG

    Hi Sohampadhye1998  Your code appears to be correct.  Does RuntimeError: Frame didn't arrive within 5000 still occur if you use 15 FPS instead of 30, please?

  • Sohampadhye1998

    Thank you MartyG for your quick reply!!

    But unfortunately I'm getting the same error. Please help me out.

    Attaching a screenshot for reference.

  • Sohampadhye1998

    When I connect the camera to the laptop instead of the Raspberry Pi 4, I get continuous frames without any error.

    The error occurs only with the Raspberry Pi 4. realsense-viewer works fine on the Raspberry Pi.

  • MartyG

    It can be difficult to diagnose a problem in a script when the code appears to be correct and it runs on a laptop but not a Pi.

     

    Does the problem still occur if you use pipeline.start() instead of pipeline.start(config) so that the script ignores the config instruction and automatically applies the default stream configuration instead?
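
    For reference, the default-configuration version would look like this (a minimal sketch, assuming pyrealsense2 is installed and a camera is attached; it cannot run without the hardware):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
# No config object passed: the SDK applies the camera's
# default stream profiles automatically.
pipeline.start()
try:
    # Raises RuntimeError("Frame didn't arrive within 5000")
    # if no frameset is delivered in time.
    frames = pipeline.wait_for_frames()
    print("Received a frameset of", frames.size(), "frames")
finally:
    pipeline.stop()
```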

  • Sohampadhye1998

    same error

  • Sohampadhye1998

    If you need any more info about the setup, please ask!

    Is there a live chat option? It would be a great help for me!

  • Sohampadhye1998

    MartyG

  • MartyG

    There is not a live chat option for this support forum.  Support is provided online in text tickets like this one.

     

    Running a RealSense program on Raspberry Pi is more likely to result in problems than running it on another computer / computing device such as a desktop or laptop PC.

     

    Could you try running the opencv_viewer_example.py program at the link below, which displays the color and depth streams. If this program works, it would suggest an issue specific to your script on the Pi, even though the code looks fine and works on the laptop.

    https://github.com/IntelRealSense/librealsense/blob/master/wrappers/python/examples/opencv_viewer_example.py

  • Sohampadhye1998

    Hello MartyG

    PFA image

     

    same error while running the sample code

  • Sohampadhye1998

    Hello MartyG

    If I run the code for only the depth stream, it works fine for me on the Raspberry Pi.

    PFA image

  • MartyG

    The D455 camera model has the ability to stream RGB from the left infrared sensor instead of the RGB sensor.  In your own script, if you change rs.stream.color in the config line to rs.stream.infrared then does it successfully stream a BGR8 image?
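
    As a sketch, the changed config line might look like this (the stream index 1 for the left infrared sensor and the 640x480 @ 30 FPS profile are assumptions, not confirmed settings for your setup):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Request a BGR8 image from the left infrared sensor (index 1)
# instead of the RGB sensor.
config.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    ir_frame = frames.get_infrared_frame(1)
    print("Got infrared frame:", bool(ir_frame))
finally:
    pipeline.stop()
```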

  • Sohampadhye1998

    Hello MartyG   

    Tried rs.stream.infrared

    still not getting the frames

    PFA code and error snip

  • MartyG

    We can at least conclude from this test that the camera's RGB sensor is likely not the cause of the problem, since the RGB-from-infrared stream comes from the left infrared sensor rather than the RGB sensor, and it still did not work. You also mentioned that the RealSense Viewer works fine. As a pyrealsense2 script is able to show the depth stream, the Python wrapper installation is also likely okay.

     

    You could try resetting the camera automatically when your script launches by adding the following code before the pipeline start line to see whether it makes a difference.

     

    ctx = rs.context()
    devices = ctx.query_devices()
    for dev in devices:
        dev.hardware_reset()
  • Sohampadhye1998

    Hello MartyG

    Tried but not working.

    Also I'm trying to get the IMU data but same problem with the IMU data also.

    PFA image.

  • Sohampadhye1998

    Hello MartyG

    I have written the following code to extract IMU data. It runs fine on my laptop but not on the Raspberry Pi; it throws the same "Frame didn't arrive within 5000" error.

    import pyrealsense2 as rs
    import numpy as np
    import pandas as pd

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.accel, rs.format.motion_xyz32f, 200)
    config.enable_stream(rs.stream.gyro, rs.format.motion_xyz32f, 200)


    pipeline.start(config)
    # Create empty lists to store the sensor data
    accel_list = []
    gyro_list = []
    try:
        while True:
            frames = pipeline.wait_for_frames()

            accel_frame = frames.first_or_default(rs.stream.accel)
            gyro_frame = frames.first_or_default(rs.stream.gyro)

            if accel_frame and gyro_frame:
                accel_data = accel_frame.as_motion_frame().get_motion_data()
                gyro_data = gyro_frame.as_motion_frame().get_motion_data()

                accel = np.array([accel_data.x, accel_data.y, accel_data.z])
                gyro = np.array([gyro_data.x, gyro_data.y, gyro_data.z])

                # Append the filtered measurements to the lists
                accel_list.append(accel.flatten())
                gyro_list.append(gyro.flatten())

                print("Accelerometer:", accel)
                print("Gyroscope:", gyro)

    except KeyboardInterrupt:
        # Stop the pipeline after collecting the desired data
        pipeline.stop()

        # Create DataFrames from the lists
        accel_df = pd.DataFrame(accel_list, columns=["AccelX", "AccelY", "AccelZ"])
        gyro_df = pd.DataFrame(gyro_list, columns=["GyroX", "GyroY", "GyroZ"])

        # Specify the output Excel file path
        output_file = 'sensor_data_without_kalman.xlsx'

        # Write the DataFrames to Excel
        with pd.ExcelWriter(output_file) as writer:
            accel_df.to_excel(writer, sheet_name='Accelerometer', index=False)
            gyro_df.to_excel(writer, sheet_name='Gyroscope', index=False)

        print(f"Data has been successfully written to {output_file}.")

     

  • MartyG

    Hi Sohampadhye1998  There are known problems with accessing the IMU on Raspberry Pi (depth and color work normally).  An example of this issue is discussed at the link below.  There is not a fix for it, unfortunately.

    https://github.com/IntelRealSense/librealsense/issues/11089

  • Sohampadhye1998

    Hello MartyG

     

    Thank you for your reply.

    1. Which edge device would you suggest for the RealSense D455 so that I can get both color and depth frames?

    2. Is it possible to fix this IMU frames bug by changing the edge device?

  • MartyG

    The best recommendation that I can give for a single-board computing device comparable to Raspberry Pi's price that can stream both depth and color and that the RealSense IMU will work with is Nvidia Jetson Nano.

    https://developer.nvidia.com/embedded/jetson-nano-developer-kit

  • Sohampadhye1998

    Thank you MartyG for your suggestion.

    1. Is there any other board with more computational power that the D455 is compatible with?

    2. I will get all the IMU readings on the Jetson Nano with the code provided above, right?

    Because the code to read the IMU works fine on the laptop.

     

  • MartyG

    There are a range of Jetson board models, such as Xavier and Orin.  The link below lists the models.

    https://developer.nvidia.com/buy-jetson

     

    Another brand of RealSense-compatible boards is Up Board.

    https://up-board.org/boards/

  • Sohampadhye1998

    Hello MartyG

    Thank you for your help.

    Please answer my second question:

    2. I will get all the IMU readings on the Jetson Nano with the code provided above, right?

    Because the code to read the IMU works fine on the laptop.

  • MartyG

    In theory, code that works on a laptop should work on a single-board computer, but in practice there are sometimes problems on the single-board computer. Problems are more likely if the board uses Arm CPU architecture, like the Raspberry Pi and Jetson. The Jetson boards use software called JetPack that has been known to cause issues with the RealSense IMU if a certain JetPack version is not installed for a certain Ubuntu OS version.

     

    Up Boards use Intel CPUs, which have x86/x64 architecture like most Intel-based laptops.

  • Sohampadhye1998

    Okay MartyG

    I'll go with the Jetson board.

    Thank you for your extended support.

  • MartyG

    You are very welcome!

     

    Based on past cases involving IMU problems on Jetson, the best chance of the IMU working on Jetson Nano may be to use the Ubuntu 18.04 OS and either JetPack 4.5.1 or 4.6.

     

    A single-board computer that uses x86/x64 architecture instead of Arm, like the Up Boards, is likely to have the least amount of complication.

     

    Other than Up, another brand of Intel-based board that some RealSense users have used for years is LattePanda.

    https://www.lattepanda.com/

  • Sohampadhye1998

    Thank you MartyG
    I'm a beginner in this field. I need smooth integration of the IMU and RGB camera with as few difficulties as possible. Can you please provide a link or material on installing Ubuntu 18.04 and either JetPack 4.5.1 or 4.6?

  • MartyG

    I cannot make a recommendation for a specific product.  If you are a beginner though then I would recommend an x86/x64 board instead of an Arm one, as the setup of the camera, operating system and the RealSense SDK software for it should be very similar to your laptop.

     

    If you are confident to install JetPack - see the link below - then the installation process of the RealSense SDK on a Jetson board such as Nano will be very similar to installation on an x86/x64 board. 

    https://docs.nvidia.com/jetson/jetpack/install-jetpack/index.html

     

    Installation of JetPack is simplified by use of the Nvidia SDK Manager program.

    https://developer.nvidia.com/sdk-manager

     

    If you do not mind using the Nvidia SDK Manager then the Nano is likely to have a performance advantage over other boards of a similar price-point.  

     

    If you choose an Up Board then the Up Squared model should provide an advantage, as it is capable of running two RealSense cameras attached to the same board.

  • Sohampadhye1998

    Thank you for your suggestion. 

  • Sohampadhye1998

    hello MartyG

    Can I install Ubuntu 20.04 on Jetson Nano and get the IMU readings?

    Because Python 3 doesn't support Ubuntu 18.04 (your earlier suggestion).

  • MartyG

    You should be able to build the librealsense SDK for the Nano on Ubuntu 20.04 if you build it from source code, as there are no librealsense packages available for Arm64 on 20.04.

     

    You should ideally use JetPack version 5.0.2 and RealSense SDK (librealsense) version 2.54.1 which has official support for JetPack 5.0.2.  The 4.x versions of JetPack were designed for Ubuntu 18.04 whilst the 5.x versions are designed for 20.04.

     

    If you need to use a librealsense version earlier than 2.54.1 then it is possible to achieve a working IMU with JetPack 5 if you either don't apply a kernel patch before installing librealsense, or you instead build from source using the RSUSB backend installation method which bypasses the kernel and so is not dependent on Linux versions or kernel versions and does not require patching.
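
    A sketch of the RSUSB-backend build from source (FORCE_RSUSB_BACKEND and BUILD_PYTHON_BINDINGS are documented librealsense CMake options; the paths and job count are assumptions):

```shell
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense
mkdir build && cd build
# FORCE_RSUSB_BACKEND bypasses the kernel video modules,
# so no kernel patching is required.
cmake .. -DFORCE_RSUSB_BACKEND=ON -DBUILD_PYTHON_BINDINGS=ON -DCMAKE_BUILD_TYPE=Release
make -j4
sudo make install
```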

  • Sohampadhye1998

    Hello MartyG

    Can you please assist me in extracting the data from a .bag file?

    I have RGB images and IMU data in a .bag file.
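
    A sketch of reading color and IMU frames back from a recorded file with pyrealsense2 ("recording.bag" is a placeholder path; adjust for your own file):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Play back from the recorded file instead of a live camera.
rs.config.enable_device_from_file(config, "recording.bag", repeat_playback=False)
profile = pipeline.start(config)
# Read as fast as possible instead of at the recorded frame rate.
profile.get_device().as_playback().set_real_time(False)

try:
    while True:
        frames = pipeline.wait_for_frames()
        color = frames.get_color_frame()
        accel = frames.first_or_default(rs.stream.accel)
        if color:
            print("color frame", color.get_frame_number())
        if accel:
            m = accel.as_motion_frame().get_motion_data()
            print("accel", m.x, m.y, m.z)
except RuntimeError:
    # wait_for_frames raises RuntimeError when playback reaches the end
    pass
finally:
    pipeline.stop()
```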

     

