get accurate frame sampling time #538

Open

@shockjiang

Preliminary Checks

  • This issue is not a duplicate. Before opening a new issue, please search existing issues.
  • This issue is not a question, bug report, or anything other than a feature request directly related to this project.

Proposal

According to the documentation here: https://www.stereolabs.com/docs/api/classsl_1_1Camera.html#a3cd31c58aba33727f35aeae28244c82d, the timestamp for each frame indicates the end of that frame's readout:

  • the camera reads each line of pixels and transmits it to the PC immediately, instead of reading the whole frame into the camera's memory and then transmitting it to the PC
  • the time cost of the last bit of the last line of the frame is ignored.

Is the above description right?

In some cases we really need an accurate time, which could be the midpoint between the start and end of the frame's sampling (readout); that would be an accurate timestamp for the frame.
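To make the request concrete, here is a minimal sketch of the arithmetic being asked for, assuming (hypothetically) that the SDK reports an end-of-readout timestamp and that the rolling-shutter readout duration is known or estimated; neither value's availability is confirmed by the ZED API:

```python
def mid_readout_timestamp_ns(readout_end_ns: int, readout_duration_ns: int) -> int:
    """Estimate the mid-frame sampling time from an end-of-readout timestamp.

    Assumes the reported timestamp marks the end of the sensor readout and
    that the readout duration (the rolling-shutter sweep) is known.
    """
    return readout_end_ns - readout_duration_ns // 2

# Example: a frame stamped at 1_000_000_000 ns with a 16 ms readout
# would have sampled its middle line roughly 8 ms earlier.
t_mid = mid_readout_timestamp_ns(1_000_000_000, 16_000_000)
```

The helper name and both parameters are illustrative only; the point is that the correction is a constant half-readout shift once the readout duration is known.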

Use-Case

Real-time multi-sensor fusion for navigation.

Anything else?

No response

@Myzhar
Member

Myzhar commented Feb 25, 2023

Hi @shockjiang
Please note that this is more a question than a feature request.
The best place for questions is the Stereolabs community forum: community.stereolabs.com

However, what you write is not correct.
The CMOS sensors are rolling shutter sensors and they are read line-by-line by the ISP on the camera, but they are transmitted to the host device as two complete synchronized frames.

The timestamp corresponds to the host system clock at the exact moment when the frames are ready in the buffer of the USB3 controller.

@shockjiang
Author

Thank you for your response.

  • Left and right images are sampled and transmitted synchronously (per line).
  • Still, I wonder: for the left image, how can I get the accurate timestamp at which the CMOS sensor's middle line sampled the world?

@Myzhar
Member

Myzhar commented Feb 27, 2023

Left and right images are sampled and transmitted synchronously (per line).

No, they are sampled per line but transmitted to the host per image.

Still, I wonder: for the left image, how can I get the accurate timestamp at which the CMOS sensor's middle line sampled the world?

This is not possible. The HW does not allow precisely retrieving this kind of information.

You can estimate the mean latency by pointing the camera at a monitor showing a stopwatch.
Here's an example of how to do that:
https://www.youtube.com/watch?v=jmVeFdKxZDc
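As a rough sketch of what the stopwatch test yields: each frame of the recorded stopwatch gives a (displayed time, frame timestamp) pair, and averaging the differences estimates the mean capture-to-host latency. The function name and the sample readings below are made up for illustration:

```python
def mean_latency_ns(samples):
    """Estimate mean capture-to-host latency from stopwatch readings.

    Each sample is a (displayed_ns, frame_timestamp_ns) pair: the time shown
    on the stopwatch in the frame, and the timestamp the frame received.
    """
    deltas = [frame_ts - displayed for displayed, frame_ts in samples]
    return sum(deltas) // len(deltas)

# Three hypothetical readings spanning ~35-45 ms of latency:
readings = [
    (0, 40_000_000),
    (100_000_000, 145_000_000),
    (200_000_000, 235_000_000),
]
latency = mean_latency_ns(readings)  # ~40 ms
```

Subtracting this constant from each frame timestamp is a crude but common correction when the sensor itself exposes no exposure-time signal.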

@shockjiang
Author

For a GPS + VO fusion application, if the app cannot get the precise time at which the camera "takes" the photo, the fusion performance may be greatly affected. Am I right? If so, there is a need for a feature that provides an accurate timestamp.

What's more, is there an API to set the time on the camera's OS?

@Myzhar
Member

Myzhar commented Feb 28, 2023

For a GPS + VO fusion application, if the app cannot get the precise time at which the camera "takes" the photo, the fusion performance may be greatly affected. Am I right? If so, there is a need for a feature that provides an accurate timestamp.

The method currently used to retrieve the frame timestamp is as precise as possible.

What's more, is there an API to set the time on the camera's OS?

Can you explain better? The question is not clear.

@shockjiang
Author

For a GPS + VO fusion application, if the app cannot get the precise time at which the camera "takes" the photo, the fusion performance may be greatly affected. Am I right? If so, there is a need for a feature that provides an accurate timestamp.

The method currently used to retrieve the frame timestamp is as precise as possible.

I cannot agree. A delay of about 30-50 ms, when the GPS sample rate is only ~10 Hz (100 ms), seems quite long. If you check FLIR or Basler cameras, you will find that they even allow callback functions on starting and finishing exposure.

What's more, is there an API to set the time on the camera's OS?

Can you explain better? The question is not clear.

Does your camera hold the same "system time" as Linux? If so, how can we set its time?

@Myzhar
Member

Myzhar commented Feb 28, 2023

In my first reply, I wrote this:

The timestamp corresponds to the host system clock at the exact moment when the frames are ready in the buffer of the USB3 controller.

This is how the timestamp is set for the ZED. The internal ISP does not allow retrieving the timestamp in a more precise way.
I think this also replies to your latest question.

If you use the GPS at a rate of 10 Hz, and the ZED latency is 30-50 msec, then it is easy to assign each frame to the correct GPS datum. It would only be a problem if the latency were higher than the GPS period.
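The assignment described here boils down to nearest-timestamp matching; a minimal sketch, assuming timestamps are in a common clock and the GPS list is sorted ascending (the function name is illustrative, not part of any SDK):

```python
import bisect

def match_frame_to_gps(frame_ts, gps_timestamps):
    """Return the index of the GPS sample closest in time to frame_ts.

    gps_timestamps must be sorted ascending. With a 10 Hz GPS (100 ms
    period) and a 30-50 ms frame latency, the nearest sample is still
    unambiguous, which is the point made above.
    """
    i = bisect.bisect_left(gps_timestamps, frame_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(gps_timestamps)]
    return min(candidates, key=lambda j: abs(gps_timestamps[j] - frame_ts))

# GPS fixes every 100 ms; a frame stamped at 140 ms (true capture ~100 ms
# after latency) still matches the fix at 100 ms.
gps = [0, 100, 200, 300]
idx = match_frame_to_gps(140, gps)  # -> 1 (the 100 ms fix)
```

Subtracting a measured mean latency from `frame_ts` before matching would tighten the association further, but even without it the nearest sample stays correct as long as latency stays below half the GPS period.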

@shockjiang
Author

Thank you so much @Myzhar
I plan to buy a ZED2 for my project, and I hope ZED will be able to provide better timestamps in the future.

@Myzhar
Member

Myzhar commented Feb 28, 2023

It's my pleasure. Do not hesitate to write an email to [email protected] if you need help.
