Frame Rate does not match exposure on depth camera and cannot get complete exposure at the frame rate
I have a scenario where I am trying to see whether an IR LED has flickered on. This flicker can be as short as 6ms. When I set the frame rate of the depth camera, I can only set the exposure up to a specific value less than the frame period. For example, when I set the frame rate to 30fps, the frame period is (1/30) * 1000000 = 33333 microseconds, but the camera only allows me to set the exposure up to 32200 microseconds before dropping the frame rate on my behalf to 15fps. To be clear: an exposure of 32201 causes my frame rate to drop to 15fps even though I set it to 30fps, so the allowed exposure falls 1133 microseconds short of the frame period. I am pretty sure I am getting about 6ms of flicker from my IR LED, whose pulse duration is 6000 microseconds, but I am in fact not detecting the flicker in my image frame. I experience this at every frame rate (90, 60, 30, 15) with the corresponding maximum allowed exposure. I have seen this on both the D455 and the D415, which confirms the issue exists with both global and rolling shutter depth sensors. Is there an inherent limitation in the hardware or software that will not allow me to get complete exposure at the frame rate?
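For reference, this is roughly how I am configuring the stream; a minimal pyrealsense2 sketch (the 848x480 resolution and left-imager stream index are just what I happen to use):

```python
import time
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Left IR imager at 30fps; frame period = 33333 us
config.enable_stream(rs.stream.infrared, 1, 848, 480, rs.format.y8, 30)
profile = pipeline.start(config)

sensor = profile.get_device().first_depth_sensor()
sensor.set_option(rs.option.enable_auto_exposure, 0)
sensor.set_option(rs.option.exposure, 32200)   # OK: stays at 30fps
# sensor.set_option(rs.option.exposure, 32201) # drops to 15fps

# Crude FPS measurement to observe the drop
n, t0 = 0, time.time()
while time.time() - t0 < 5.0:
    pipeline.wait_for_frames()
    n += 1
print("measured FPS: %.1f" % (n / (time.time() - t0)))
```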
-
Official comment
Hello Jsadural,
I think you'll need to sync your IR pulsing to the D435/D455 depth camera's (global shutter) exposure time. Recently there was a topic discussing an external trigger application, and I believe some of that info is helpful to your case. Long story short, you can do either of the following two things:
1) If your IR pulsing is fixed at 6ms, you can turn it on around 26~27ms AFTER the trigger (rising edge), assuming the D435/D455 is running at 30FPS. Or,
2) Use Pin #6 of the AUX connector on the D4xx camera. It carries the burst of PWM pulses that controls the D435/D455 laser projector, and the burst interval represents the exposure time, so you can use the first pulse to trigger your IR lighting. A one-shot timer circuit will be useful (T=36us~80us). (NOTE: you will need to make sure the minimum exposure is no less than 6ms.)
Regarding the D415, since it is a rolling shutter sensor, a short exposure will mostly be unable to catch the 6ms IR pulse because the timing is not matched. So for the D415, you'll have to maximize the exposure or keep your IR lighting on all the time.
The D4xx camera firmware limits the max exposure time under auto exposure. For the D435/D455, the max exposure is around 28.5ms (not 33.3ms) under 30FPS. For the D415, the max exposure is around 31.96ms under 30FPS.
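For reference, the exposure control's advertised range can be queried through the SDK; a minimal pyrealsense2 sketch (note the static range reported here does not reflect the tighter FPS-dependent cap the firmware applies, discussed above):

```python
import pyrealsense2 as rs

ctx = rs.context()
device = ctx.query_devices()[0]        # first connected camera
sensor = device.first_depth_sensor()   # the stereo module

rng = sensor.get_option_range(rs.option.exposure)
print("exposure (us): min=%g max=%g step=%g default=%g"
      % (rng.min, rng.max, rng.step, rng.default))
```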
-
Hi Jsadural. The maths of how the exposure value can affect FPS are described in the link below.
https://github.com/IntelRealSense/librealsense/issues/1957#issuecomment-400715579
As mentioned in the link, a constant FPS can be enforced by one of the following methods, sketched below:
1. Having auto-exposure enabled and disabling an RGB setting called Auto-Exposure Priority. This does not suit your particular application if you need to change exposure manually.
2. Setting the exposure manually to a value that does not exceed the upper bound.
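A minimal pyrealsense2 sketch of both methods ("RGB Camera" is the name the D4xx color sensor reports):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 848, 480, rs.format.rgb8, 30)
profile = pipeline.start(config)

for sensor in profile.get_device().query_sensors():
    if sensor.get_info(rs.camera_info.name) == "RGB Camera":
        # Method 1: keep auto-exposure on, but forbid it from
        # lowering FPS to buy itself more exposure time
        sensor.set_option(rs.option.enable_auto_exposure, 1)
        sensor.set_option(rs.option.auto_exposure_priority, 0)
        # Method 2 alternative: manual exposure under the bound, e.g.
        # sensor.set_option(rs.option.enable_auto_exposure, 0)
        # sensor.set_option(rs.option.exposure, 32200)
```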
-
If you are trying to detect a flickering IR source on the RGB image of your RealSense camera, then the IR cut filter on the RGB sensor may be blocking the IR wavelengths from view. You could try seeing whether it is observable on the infrared image, as the IR sensor has no IR cut filter and can observe both visible and near-infrared wavelengths.
https://github.com/IntelRealSense/librealsense/issues/1070#issuecomment-366405803
Alternatively, if you have a smartphone with a front camera then you could try the IR source finding trick in the link below.
https://github.com/IntelRealSense/librealsense/issues/8354#issuecomment-786268540
-
Thank you for your post, but it does not directly address the issue I am experiencing. I am trying to detect the flickering IR LED with the "IR camera only." I am unable to set the exposure to match the frame period exactly and have to use a magic number in order to keep my desired frame rate:
"30fps, or (1/30) * 1000000 = 33333", but I am only able to set the magic number "32200", because if I set "32201" then I go down to 15fps.
As you can see, my exposure setting falls 1133 microseconds short of the frame period. The actual exposure in practice is even less, as I miss 6000-microsecond flickers completely. Is there some inherent hardware issue, or is this solvable? I am experiencing this with both rolling and global shutters on the IR camera (D415 and D455).
-
Could you provide further information about the flickering IR LED that you are trying to detect, please? It sounds as though you are using the camera to observe an LED light component in the room that the camera is in. Is that correct?
I tested at 33333 manual exposure in infrared mode and confirmed that FPS drops to 15. Enabling the HDR Enabled option, though, allowed streaming with this exposure value at 30 FPS, and it worked consistently through multiple consecutive tests. In the RealSense Viewer, this option is under the Controls section of the Stereo Module options.
I should highlight though that the image flashed at a rapid rate when HDR mode was enabled. So HDR mode may not be suitable if such flashing is problematic for you.
I also tested this exposure value successfully with HDR enabled in the 256x144 at 300 FPS mode that the D435 / D435i supports, though FPS was halved to about 150 FPS with 33333 exposure.
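For anyone who wants to reproduce that test through the SDK rather than the Viewer, a rough sketch (assuming a librealsense build recent enough to expose rs.option.hdr_enabled, i.e. 2.39+ with suitable firmware):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 848, 480, rs.format.y8, 30)
profile = pipeline.start(config)
sensor = profile.get_device().first_depth_sensor()

sensor.set_option(rs.option.enable_auto_exposure, 0)
sensor.set_option(rs.option.exposure, 33333)  # alone, this forced 15 FPS
sensor.set_option(rs.option.hdr_enabled, 1)   # with HDR on, 30 FPS held in my test
```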
-
You are correct: it is an IR LED light flashing in the room, completely independent of the camera. From what I understand, HDR is the combination of short- and long-exposure images. I don't know what's going on under the hood with short and long exposures at 30fps: whether multiple frames are taken during a 1/30 sec window, or frames are taken every 1/30 sec alternating between short and long exposures. In either case, it does not work for my application. I assume that when the LED flashes it will be the brightest pixel, which I threshold against a running average over previous frames, and it is also a very short pulse. So the HDR flashing will clobber my running average of the peak threshold, and any unexposed parts of the 1/30 sec window may miss the flash. My main requirement is exposure over the whole frame window, or as close to the whole window as possible, with no unexposed sections > 3ms. I have not measured the unexposed portions with an oscilloscope, but in live tests a known 6ms IR LED flash gets completely missed at times using the settings I stated above. Maybe there is something at a lower level than the API that I can fiddle with to make sure I get as complete an exposure as I can with respect to the frame rate. Ideally I'm trying to run 90fps on the IR camera.
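To make the detection approach concrete, this is roughly what my per-frame logic looks like (a sketch; the margin value is arbitrary, and I deliberately exclude flash frames from the running average so a detection does not clobber the baseline):

```python
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 848, 480, rs.format.y8, 90)
pipeline.start(config)

peak_avg = None
MARGIN = 30  # arbitrary headroom above the baseline peak, in 8-bit intensity units

while True:
    frames = pipeline.wait_for_frames()
    ir = np.asanyarray(frames.get_infrared_frame(1).get_data())
    peak = int(ir.max())
    if peak_avg is None:
        peak_avg = float(peak)          # seed the baseline
    elif peak > peak_avg + MARGIN:
        print("flash candidate, peak =", peak)
    else:
        # fold only non-flash frames into the running average
        peak_avg = 0.95 * peak_avg + 0.05 * peak
```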
-
What you are trying to detect reminds me of an astronomical phenomenon called a fast radio burst that lasts only several milliseconds. It is detected by a spike in energy.
https://en.m.wikipedia.org/wiki/Fast_radio_burst#Features
In January 2018, Intel demonstrated a camera sync method where capture was started and stopped by a commercial camera flash that caused a spike in camera data.
So rather than looking for visible flashes, I wonder if you could look for evidence of flashes in the numeric values of the camera stream.
Alternatively, in regard to going lower than the usual high-level API, you can access the camera hardware with the low-level API.
https://github.com/IntelRealSense/librealsense/blob/master/doc/api_arch.md#low-level-device-api
https://support.intelrealsense.com/hc/en-us/community/posts/360043865234/comments/360011267814
-
So my scenario of detecting a flashing LED is really just an analogy for measuring a short-pulse IR laser being shot at a wall that the camera is pointed at. The laser has a specified pulse duration of 6ms. As of now, I am missing about 40% of my shots because the exposure does not extend to the full frame. On first look at the low-level API, I am not in need of low latency. Do you know off the bat if there is some inherent physical specification of the RealSense hardware that is causing this limitation? The fact that the camera does not have a linear drop in FPS with an incremental increase in shutter time, regardless of global vs rolling shutter, leaves me confused about what is really going on. On even the cheapest RGB webcams, increasing the rolling shutter exposure to the point where the FPS drops by even 1fps (30fps down to 29fps), I am guaranteed 100% exposure over the duration of the frame and can see 100% of visible 6ms laser pulses. I am really trying to solve this problem on this platform because I do in fact want to take advantage of the alignment between streams. Is there someone on the RealSense hardware team who can chime in?
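For what it's worth, here is my back-of-the-envelope model of the miss rate, assuming the pulse phase is uniformly random relative to the frame and one unexposed gap per frame period; it comes out roughly consistent with the ~40% I observe:

```python
def overlap_fraction(pulse_ms, gap_ms, period_ms):
    """Fraction of pulses that touch the unexposed gap at all,
    for a uniformly random pulse start phase."""
    return min(1.0, (pulse_ms + gap_ms) / period_ms)

def worst_case_exposed_ms(pulse_ms, gap_ms):
    """Minimum portion of the pulse that lands in exposed time."""
    return max(0.0, pulse_ms - gap_ms)

# D455 @ 30fps, ~4.8ms gap (from the max-exposure figure quoted above):
print(overlap_fraction(6.0, 4.8, 1000 / 30))  # ~0.32: roughly a third of shots attenuated
print(worst_case_exposed_ms(6.0, 4.8))        # 1.2ms: worst case, a very dim spot
```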
-
I will tag Intel Support team member @... into this discussion
Hi @... Regarding the project scenario described by Jsadural in the comment above, do you have any suggestions? Thanks!
-
The laser is at 780nm wavelength with a pulse duration of 6ms. A long 40ms pulse at 780nm is detected visually and algorithmically with no problem, whether directed at the camera or at the wall. When I visually inspect the frames I see the laser very clearly, and it is the brightest pixel when detected (i.e. the pulse happens to land while the sensor is exposed). The problem is that sometimes I get partial exposure and see it very dim, or no exposure at all when I am really unlucky and the full duration of the pulse falls in the "gap" of unexposed frame time.
-
Hello Dan Nie,
Thank you for your response, but in my scenario I do not have the luxury of knowing when the pulse is going to happen. Imagine someone shooting a laser at a wall; it's the luck of the draw when it will happen relative to any trigger timing. My goal is to detect whether the 6ms IR pulse happened or not while it is triggered by something completely independent of the camera system. From your comments:
"For D435/D455, the max exposure is around 28.5ms (not 33.3ms under 30FPS). For D415, the max exposure is around 31.96ms under 30FPS."
33.3 - 28.5 = 4.8ms for the D455 and 33.3 - 31.96 = 1.34ms for the D415, so in the worst case you get 1.2ms and 4.66ms of exposure respectively out of the 6ms IR pulse, which will result in an extremely dim spot rather than the brightest pixels, and that is what I am observing. To be honest, the D415's spot looks just as dim as the D455's even though it is spec'ed to have more exposure over the frame. My big question is:
"Is it physically possible to get a more complete exposure time over the whole frame with any RealSense depth camera or is this a hard limitation?"
-
Hi Jsadural,
I am sorry; since all D400 series cameras on the market are locked due to safety concerns, users are unable to maximize the exposure to the full frame time by changing the register settings of the CMOS sensor. Also, the FW only supports certain fixed frame rates: 90FPS, 60FPS, 30FPS, 15FPS and 6FPS.
For your case, I think under 30FPS the D415 sounds better than the D435/D455 for detecting the IR pulse if you max out the exposure (32200 = 32ms under 30FPS; gap 1.3ms vs 4.8ms). However, since the D415's sensitivity at 780nm is lower than the D435/D455's, you will need to increase the gain on the D415. But under 90FPS, both the D435/D455 and the D415 have a max exposure of 9.99ms, so the gap will be 1.12ms; with its higher sensitivity, the D435/D455 should be better there.
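A minimal pyrealsense2 sketch of that 90FPS configuration (the gain value is just an example; tune it for your scene):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 848, 480, rs.format.y8, 90)
profile = pipeline.start(config)
sensor = profile.get_device().first_depth_sensor()

sensor.set_option(rs.option.enable_auto_exposure, 0)
sensor.set_option(rs.option.exposure, 9900)  # just under the ~9.99ms cap at 90FPS
sensor.set_option(rs.option.gain, 128)       # example value; raise to compensate
```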
Additionally, and I'm not sure if it fits your application, but if you don't care about visible light you could try an IR long-pass filter (e.g. NIR-75N or NIR-70N) for the 780nm IR. Once you attach it to the D4xx camera, the auto exposure will go to max under normal conditions. If the IR pulse is still dim, try increasing the gain.
-
Hello Dan,
Thank you for that information; it was exactly what I was looking for. I took your suggestion and ran at 90fps with a shutter setting of 9900 and was able to detect my pulse very dependably. Thank you a ton for that. With regards to your suggestion of an IR filter, in my actual application I do in fact utilize the aligned RGB along with the IR stream. Does RGB come with the same FPS constraints and the same magic numbers for exposure? I will look to process both RGB and depth streams at max FPS with minimum gap requirements, but the RGB and depth streams have different max FPS. I do recall getting into a state where my aligned RGB stream was set at 15fps while my IR stream was set at 60fps, and some periodic freezing of the IR frame would occur, as opposed to when both streams were set at 30fps and everything was smooth. Is there a constraint that they must run at matching FPS in order to operate correctly? What do you suggest is the best way to go about processing the aligned RGB and IR streams in parallel?
-
Hi Jsadural,
Sorry for the late reply. That's great news; I am very glad to hear it.
Regarding the IR filter, I actually meant applying it to both IR cameras, not the RGB camera; sorry for not describing that clearly.
Anyway, regarding the question about the RGB vs Depth constraint: if USB bandwidth is not a concern, you can run any combination of them. However, many users want them synced, so they are supposed to run at the same FPS.
In the current FW (i.e. 5.12.11), by default, the D415/D435/D435i has both Depth and RGB configured in sync mode (NOTE: the D455 is not done correctly yet at this moment; it will be fixed soon). The RGB cam is the master and the Depth cam is the slave, so it's better to turn on RGB first and then Depth. In your case, the bad part is that the RGB cam doesn't support 90FPS, so you will probably need to try 60FPS. Please note that running different FPS between RGB and Depth may cause issues like image freezing.
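A sketch of running both streams at a matching 60FPS and aligning depth to color (assuming pyrealsense2; rs.align does the per-pixel mapping):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Matching FPS on both streams; RGB tops out below 90FPS, so use 60
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 60)
config.enable_stream(rs.stream.color, 848, 480, rs.format.rgb8, 60)
pipeline.start(config)

align = rs.align(rs.stream.color)   # map depth pixels onto the color frame

while True:
    frames = pipeline.wait_for_frames()
    aligned = align.process(frames)
    depth = aligned.get_depth_frame()
    color = aligned.get_color_frame()
    # ... process the spatially aligned pair ...
```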
Thanks
-
Hello Dan,
I did notice that the RGB acted as the master and the IR as the slave as I varied the frame rates. I did see an option to make the IR the master, but it did not do what I was hoping for, which was to decorrelate the two streams. As a future feature request, it would be great to have the IR stream run as fast as it can, generating the depth stream at the same rate, with a function to overlay any RGB frame over any point cloud as you see fit: for example, take the current RGB frame and overlay it over the most recent or filtered point cloud. This would really help to "flatten" point cloud surfaces such as walls while still being very responsive. It seems like a hardware constraint, which is unfortunate. Since the device only really works properly at matching FPS with RGB as the master, what is the paradigm for when you would want to use poll_for_frame? Is there a good reason why the gap in exposure for a given FPS even exists? And finally, can you please give me a list or table of all the different frame rates and the corresponding max exposure settings with gaps? It would be great if a chart existed in some doc.
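For context, my understanding of poll_for_frames is as the non-blocking counterpart of wait_for_frames, roughly like this (process() and do_other_work() are placeholders):

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()

while True:
    frames = pipeline.poll_for_frames()  # returns immediately
    if frames:                           # an empty result is falsy
        process(frames)                  # placeholder: handle the new frameset
    else:
        do_other_work()                  # placeholder: UI, logging, etc.
```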
-
Hi Jsadural,
Sorry for the late reply. Unfortunately, we don't have an existing document with the exposure timing data you are looking for, so I had to go back and check a group of associated registers one by one and manually calculate the max exposure for each frame rate. Since it was quite time consuming, I have only collected the data you are most likely to care about first; if you are interested in some other specific format, please let me know and I will try to do it for you. Here is the data:
[D455] FW 5.12.12.100
RGB 640x360 90FPS: max exp=9.99ms, gap=1.12ms
RGB 848x480 60FPS: max exp=9.99ms, gap=6.67ms
RGB 848x480 30FPS: max exp=30.01ms, gap=3.33ms
RGB 848x480 15FPS: max exp=59.99ms, gap=6.67ms
Depth 848x480 90FPS: max exp=9.94ms, gap=1.17ms
Depth 848x480 60FPS: max exp=15.36ms, gap=1.3ms
Depth 848x480 30FPS: max exp=31.97ms*, gap=1.36ms (*Sorry, the earlier "28.5ms" figure was wrong; the calculation mixed two different FW versions.)
Depth 848x480 15FPS: max exp=64.9ms, gap=1.76ms
Hope the data is helpful.
Thanks