
Create a hardware-based timer observation timestamp calculator #791

Open
2 tasks
jlblancoc opened this issue Jun 29, 2018 · 3 comments
jlblancoc commented Jun 29, 2018

Purpose:

  • Use in the Hokuyo driver, but probably in others too

Motivation:

  • Hardware XTAL oscillators typically have an accuracy of 20-50 ppm, so over time a drift accumulates between the sensor clock and the PC clock.
  • The goal: design an algorithm that dynamically estimates the drift and resets the reference computer timestamp to keep it bounded.

Example numbers:

  • I have experimentally measured clock drifts of 6.06 µs/s and 83.33 µs/s in timestamps from two PCs collecting data from GPS, where I had 1 µs-accurate timing based on the GPS data.
  • Let's say that we can tolerate 100 milliseconds of accumulated drift, and we are in the worst case of a relative drift of, say, 100 µs/s (imagine the PC has a drift of +50 µs/s and the lidar -50 µs/s).
  • We could only run for (100e-3 / 100e-6) / 60 ≈ 16.7 minutes (!!) in a row.
  • Even assuming a much smaller drift of 10 µs/s, we could run for less than 3 hours.

So, we need a better mechanism for solid sensor drivers.

To-do:

  • Design the algorithm (!)
  • Test with a real sensor over 1-2 days.
@jlblancoc jlblancoc added this to the Release 1.5.7 milestone Jun 29, 2018
@jlblancoc jlblancoc removed this from the Release 1.5.7 milestone Feb 14, 2019
@jolting
Member

jolting commented Jun 23, 2019

I think this could be combined with #481. Maybe something similar to the ROS2 TimeSource.
https://github.com/ros2/rclcpp/blob/0723a0a6fc2f46edcbeebf2933c7929822177d56/rclcpp/src/rclcpp/time_source.cpp

@jlblancoc
Member Author

Hmm... thanks for the pointer! But I think that this ROS2 class is actually solving a different problem, isn't it?
The problem that I wanted to highlight here was the need to correct a hardware-based timestamp (using an arbitrary t=0 time origin on sensor boot), in steps, whenever it could be estimated / guessed that the shift between the two clocks (local on the PC, remote on the sensor) is getting too large.
Probably NTP and such protocols are closer to this idea, although this mrpt feature should be kept as simple as possible...

If TimeSource does that, please let me know!

@jolting
Member

jolting commented Jul 2, 2019

Perhaps it's slightly different.

"It is possible that the user may have access to an out of band time source which can provide better performance than the default source the /clock topic. It might be possible that for their use case a more advanced algorithm would be needed to propagate the simulated time with adequate precision or latency with restricted bandwidth or connectivity. The user will be able to switch out the time source for the instance of their Time object as well as have the ability to override the default for the process."
-https://design.ros2.org/articles/clock_and_time.html

Here's a concrete example I was thinking of: a system based on ROS and DJI hardware.
DJI has a hardware sync pulse which can be used to trigger some external sensor. https://developer.dji.com/onboard-sdk/documentation/guides/component-guide-hardware-sync.html
So DJI has a way of getting the reference time into ROS when that sensor fired, but the time reference uses the FC (flight controller) clock.

Perhaps it's easier just to use DJI's clock across the entirety of the application, since DJI has gone through the effort of synchronizing all the data for you. All the algorithms should probably use the DJI clock timestamps. All the telemetry you're receiving from DJI is going to use their clock, so constantly converting between FC time and system time is just burdensome. For some applications, system time is unnecessary if you're using that external time source.

Unfortunately, ROS loves to timestamp everything with ros::Time::now(), not DJI time, and ROS1 probably won't get TimeSources. That will improve in ROS2 once they have a working implementation of the TimeSource concept. Algorithms in ROS almost exclusively use header.stamp, so it should be fairly easy to retrofit nodes to use TimeSource, but it seems like it's still in its infancy and I think there is still a lot of work to do.

DJI's example of how to use FC time in ROS is rather basic and doesn't use any fancy time source.
https://github.com/dji-sdk/Onboard-SDK-ROS/blob/f1e68c05b1a25328c1058b2ae2f7a1fcb68644df/dji_sdk_demo/src/demo_time_sync.cpp

Here's how they publish the FC time of the trigger.
https://github.com/dji-sdk/Onboard-SDK-ROS/blob/f1e68c05b1a25328c1058b2ae2f7a1fcb68644df/dji_sdk/src/modules/dji_sdk_node_publisher.cpp#L697

Here's also ROS's TimeReference message.
http://docs.ros.org/melodic/api/sensor_msgs/html/msg/TimeReference.html

I wonder how something like std::chrono::time_point<MySensorClock> would work. MySensorClock wouldn't have a now() method, so it would be a pseudo-clock.
https://en.cppreference.com/w/cpp/chrono/local_t
