Timestamp and subframe size

Anything that doesn't fit into any of the other forums
javierdeelias
Posts: 8
Joined: Fri Sep 29, 2023 4:16 pm

Timestamp and subframe size

#1

Post by javierdeelias »

Hello,
I am carrying out some tests of the accuracy of the timestamp using light flashes synchronized with a GPS/PPS source, and a local stratum-1 time server (also GPS/PPS disciplined) to synchronize the PC. The recordings are made with an ASI183 camera in .ser format. The results are very good (less than 1 ms error) when the subframe size is small, but I observe delays of 3 ms at 640×480, and larger delays for bigger subframes. Does this make sense?
Regards
Javier
javierdeelias
Posts: 8
Joined: Fri Sep 29, 2023 4:16 pm

Re: Timestamp and subframe size

#2

Post by javierdeelias »

Hello,

I forgot to mention that I am testing with MODE16.

I understand that one possible explanation is the readout time, which will produce a delay in the timestamp with respect to the true capture start time. But I would like to confirm that SharpCap does not apply any correction for readout time.

The delays I see are consistent with the maximum FPS figures that ZWO quotes for this camera at each subframe size.

Regards,
Javier
admin
Site Admin
Posts: 13350
Joined: Sat Feb 11, 2017 3:52 pm
Location: Vale of the White Horse, UK
Contact:

Re: Timestamp and subframe size

#3

Post by admin »

Hi Javier,

For the majority of cameras, SharpCap bases the timestamps on the time that it receives the frame data from the camera SDK; that gives the value used for the end-of-frame timestamp, and the start-of-frame timestamp is calculated by simply subtracting the known exposure from the end-of-frame timestamp.
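In other words, the calculation is just a subtraction. A minimal sketch (the function name and example values are mine, not SharpCap's actual code):

```python
from datetime import datetime, timedelta, timezone

def start_of_frame(end_of_frame: datetime, exposure_s: float) -> datetime:
    """Back-calculate the start-of-frame timestamp from the end-of-frame
    timestamp and the known exposure length, as described above."""
    return end_of_frame - timedelta(seconds=exposure_s)

# Hypothetical example: frame data received at 12:00:00.500 UTC, 100 ms exposure
end = datetime(2023, 9, 29, 12, 0, 0, 500000, tzinfo=timezone.utc)
print(start_of_frame(end, 0.1))  # 12:00:00.400000+00:00
```

Any delay between the true end of exposure and the moment the frame data arrives shifts both timestamps late by the same amount.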

That means that the SharpCap time will be late due to

* Any time taken by the camera between finishing the exposure and starting to transfer data to the PC
* The time taken to transfer the data from the camera to the PC
* Any time taken on the PC re-assembling the frame from the transferred data and passing it on to SharpCap.

It's hard to give a precise value for these delays, as so much of the procedure is hidden from observation; the delays may also depend on camera settings like exposure, USB speed settings, frame size, etc. As an example, most cameras use a progressive scan exposure, meaning that the rows of the image are not all exposed at the same time. In theory a camera might be able to start transferring data to the PC from the top of the image, where the exposure has finished, while the rows further down the image are still exposing, but who knows if that actually happens or not.

As an exception to the above, there are certain cameras (some QHY, Moravian models) that have built in GPS, giving precise timestamps to the start/end of frame capture.

One way to get a handle on the likely time to transfer a frame to the PC is to note that the frame transfer time often limits the frame rate on cameras, so if you set a very short exposure (<1ms) and you get 200fps at a particular ROI then the frame transfer time may well be around 5ms. It's worth knowing though that the frame rate can also be limited by the speed of the camera's internal systems, so that is not 100% guaranteed.
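That estimate can be written out as a one-liner; this is just the frame-period arithmetic from the paragraph above, not anything SharpCap measures:

```python
def transfer_time_estimate_ms(fps_at_short_exposure: float) -> float:
    """Rough estimate of per-frame transfer time: with a very short
    exposure the frame rate is limited mainly by readout/transfer,
    so the frame period approximates the transfer time."""
    return 1000.0 / fps_at_short_exposure

print(transfer_time_estimate_ms(200))  # 200 fps -> 5.0 ms per frame
```

Remember the caveat above: if the camera's internal electronics are the real bottleneck, the true transfer time could be shorter than this figure.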

Hope this info is useful,

Robin
Jean-Francois
Posts: 402
Joined: Sun Oct 13, 2019 10:52 am
Location: Germany

Re: Timestamp and subframe size

#4

Post by Jean-Francois »

Hello Javier,

For the GPS flashing ... have a look at this topic:
viewtopic.php?p=41414#p41414

Regards,
Jean-Francois