I’m building something that interacts with the Handy, but I’m having trouble understanding the explanation behind the getServerTime endpoint.
I understand the first bit, that I first need to get an average of the round-trip delay (RTD), but then it mentions Ts_est = Ts + RTD/2, saying Ts is the “received server time value”, and from there until the end I’m lost.
I was wondering if anyone could provide a quick example, in pseudocode or any language, of how to do all that math, or at least another explanation of what each value means, because I don’t understand the definitions.
You need to figure out two things:
- The delay between your PC and the Handy API server (ping, roundtrip, whatever)
- The time offset between your PC clock and the server’s clock
The server sends its timestamp in the response; since its clock and yours are not necessarily in sync, the effective offset can end up several seconds larger than the pure communication delay, or even negative.
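To make the formula concrete, here is the math for a single request with made-up numbers (everything in milliseconds; the values are purely illustrative):

```ts
// All numbers hypothetical, in milliseconds.
const tSent = 1_000_000;      // your PC's clock when the request went out
const tReceived = 1_000_100;  // your PC's clock when the response came back
const serverTime = 1_000_560; // Ts: the server's clock value from the response

const rtd = tReceived - tSent;              // roundtrip delay = 100
const serverTimeEst = serverTime + rtd / 2; // Ts_est = 1_000_610: the server's clock "right now"
const offset = serverTimeEst - tReceived;   // 510: the server's clock is ~510 ms ahead of yours
```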
ScriptPlayer implementation for reference:
In short:
To estimate what the time at the server is right now, take the serverTime from your response and add half of your roundtrip time (= time when you received the answer minus time when you sent the request).
For a better result, do this 10-30 times and take the average.
(In my code there are 2 warmup requests because the first one might take a bit longer, which slightly skews the average)
The difference between that estimated server time and your local time when the response arrived is your offset. So every time you send a message that requires a serverTime, you take the current time of your PC and add that offset.
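Here’s a rough end-to-end sketch of that procedure in TypeScript. The endpoint URL is my guess at the v2 servertime route, so check it against the API docs; error handling is omitted:

```ts
// The URL below is an assumption based on the v2 API; double-check it against the Handy docs.
const SERVER_TIME_URL = "https://www.handyfeeling.com/api/handy/v2/servertime";

// One measurement: request the server time, estimate the server's clock
// "right now", and return the difference to the local clock.
async function measureOffsetOnce(): Promise<number> {
  const tSent = Date.now();                                   // local time when the request goes out
  const res = await fetch(SERVER_TIME_URL);
  const tReceived = Date.now();                               // local time when the answer arrives
  const body = (await res.json()) as { serverTime: number };  // response body: { "serverTime": <ms> }

  const rtd = tReceived - tSent;                              // roundtrip delay
  const serverTimeEst = body.serverTime + rtd / 2;            // Ts_est = Ts + RTD/2
  return serverTimeEst - tReceived;                           // offset between server clock and local clock
}

// Repeat a number of times and average, with a couple of warmup requests
// because the first call is often slower and would skew the average.
async function calibrateOffset(samples = 30, warmup = 2): Promise<number> {
  for (let i = 0; i < warmup; i++) await measureOffsetOnce();

  let sum = 0;
  for (let i = 0; i < samples; i++) sum += await measureOffsetOnce();
  return sum / samples;
}

// Usage: whenever a command needs a serverTime, send local time + offset.
calibrateOffset().then(offset => {
  const estimatedServerTimeNow = Date.now() + offset;
  console.log("offset:", offset, "ms; estimated server time now:", estimatedServerTimeNow);
});
```

A plain average matches the description above; if you want to refine it, you could discard samples with unusually large roundtrips, since those make the RTD/2 assumption less accurate.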
Also, if you want another example, the HandyConnection script in my Handy Unity SDK on GitHub demonstrates how to do the synchronization - might be helpful if you’re a C# programmer.