Hello,
I'm trying to understand the difference between the timestamps in cloud_node/points and those in os1_node/lidar_packets (i.e., the raw UDP packets).
When I plot delta_t [= UDP - cloud_node] against UDP time, I would expect delta_t to be constant, i.e., the original UDP time and the converted cloud_node time should increase at the same rate. Instead, I see a uniformly increasing step function: UDP time outpaces cloud_node time in a step-wise manner.
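For reference, here's roughly how I'm computing delta_t from a recorded bag. This is a minimal sketch: the bag filename is hypothetical, and I'm assuming the OS1 packet layout where the first azimuth block's 8-byte little-endian nanosecond timestamp sits at the start of the PacketMsg `buf` field. The index-based alignment of packets to clouds is crude and only illustrative.

```python
#!/usr/bin/env python
# Sketch: plot delta_t = (embedded UDP timestamp) - (cloud header stamp).
import bisect
import struct

import matplotlib.pyplot as plt
import rosbag

pkt_rec, pkt_udp = [], []  # bag record time, embedded UDP timestamp [ns]
cld_rec, cld_hdr = [], []  # bag record time, PointCloud2 header.stamp [ns]

with rosbag.Bag('os1.bag') as bag:  # hypothetical bag file
    for topic, msg, t in bag.read_messages(
            topics=['/os1_node/lidar_packets', '/cloud_node/points']):
        if topic == '/os1_node/lidar_packets':
            pkt_rec.append(t.to_nsec())
            # assumed OS1 packet format: first 8 bytes of buf are the
            # first azimuth block's timestamp, little-endian uint64, ns
            pkt_udp.append(struct.unpack_from('<Q', msg.buf, 0)[0])
        else:
            cld_rec.append(t.to_nsec())
            cld_hdr.append(msg.header.stamp.to_nsec())

# for each cloud, take the embedded timestamp of the most recent packet
# recorded before it (crude alignment, illustrative only)
x, delta = [], []
for rec, hdr in zip(cld_rec, cld_hdr):
    i = bisect.bisect_right(pkt_rec, rec) - 1
    if i >= 0:
        x.append(pkt_udp[i])
        delta.append(pkt_udp[i] - hdr)

plt.plot(x, delta, '.')
plt.xlabel('UDP timestamp [ns]')
plt.ylabel('delta_t = UDP - cloud_node [ns]')
plt.show()
```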
Can someone explain what's going on?
I assume this behavior holds regardless of the sensor's timing mechanism, e.g., internal OSC vs. PTP vs. GPS. Is that true?
Thanks!