[ENet-discuss] Time synchronization client/server/client communication.

bill k billiumk at gmail.com
Thu Mar 11 10:05:11 PST 2010


So I updated the algorithm and it seems to be working pretty well.

Whenever either device detects an incoming event whose packet time is
either far in the future or far in the past, a time-sync sequence is
started. "Far" is measured as gametime - (packet_time + one-way latency).

These time-sync sequences happen more often than I'd like, but it seems
to be the only way to stay consistently in sync.

Down the road, instead of using a dedicated time-sync packet, I think I'll
build a flag into the packet ID code: if a packet is already queued for
delivery and is stamped with the current game time, I can just flag it as a
time-sync response as well :)
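
Roughly what I have in mind, in C (the mask values and the struct are
hypothetical, not taken from the real packet code):

    /* Hypothetical packet layout with one bit of the ID reserved as a
       "this also answers a time-sync request" flag. */

    #define PACKET_ID_MASK        0x7F   /* low bits: normal packet type  */
    #define PACKET_FLAG_TIME_SYNC 0x80   /* high bit: time-sync response  */

    typedef struct {
        unsigned char id;          /* packet type plus flag bits          */
        unsigned int  game_time;   /* stamped with the current game time  */
        /* ... payload ... */
    } GamePacket;

    /* If a packet stamped with the current game time is already queued,
       mark it so the other side treats it as the time-sync response too. */
    void mark_as_time_sync_response(GamePacket *p)
    {
        p->id |= PACKET_FLAG_TIME_SYNC;
    }

    int is_time_sync_response(const GamePacket *p)
    {
        return (p->id & PACKET_FLAG_TIME_SYNC) != 0;
    }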

Cheers!
--
Bill Kouretsos
tel \ 647.477.3817
littleguygames.com


On Thu, Mar 11, 2010 at 8:39 AM, bill k <billiumk at gmail.com> wrote:

> Thanks Alexander!
>
> I ended up with pretty much the same approach. Instead of dividing by 2 to
> get the average, I used a variable divisor: depending on the nature of the
> discrepancy, I add or subtract 0.1 for large differences and 0.05 for small
> ones. It converges in about 5 seconds to within a tolerance of < 100 ms.
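
(In C, the adjustment I'm describing above is roughly this shape; the
thresholds and step sizes below are just examples, not the exact numbers
from my code:)

    static double divisor = 2.0;   /* starts as a plain average of the RTT */

    /* lt1 = local time the request went out, lt2 = local time the reply
       arrived, st = the server's stamp.  Returns how far our clock looks off. */
    long estimate_offset(long lt1, long lt2, long st)
    {
        long latency = (long)((lt2 - lt1) / divisor);  /* one-way estimate */
        long offset  = st - (lt1 + latency);

        /* Nudge the divisor toward whatever shrinks the remaining error:
           a big step for a big discrepancy, a small step for a small one. */
        if (offset > 100)        divisor -= 0.1;
        else if (offset > 10)    divisor -= 0.05;
        else if (offset < -100)  divisor += 0.1;
        else if (offset < -10)   divisor += 0.05;

        if (divisor < 1.0)       divisor = 1.0;  /* keep the estimate sane */

        return offset;
    }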
>
> Right now I'm working on the second part of this: even though the clocks
> sync, over time one device's clock seems to run faster than the other's. It
> seems the actual system time is reported faster or slower depending on
> which device I'm using :(
>
> It's an iPhone game, and my test devices are an iPhone 3GS (pretty powerful)
> and an iPod touch 2nd gen (kinda powerful?).
>
> Does framerate maybe have something to do with it? Should I limit the faster
> device to a lower framerate? I'll post a proper solution if/when I've got it.
>
> cheers!
>
> --
> Bill Kouretsos
> tel \ 647.477.3817
> littleguygames.com
>
>
> On Wed, Mar 10, 2010 at 10:47 PM, Alexander Shyrokov <sj at sjcomp.com> wrote:
>
>> Hi Bill,
>>
>>
>>> The client sends a time-request packet to the server.
>>> The server stamps it and responds.
>>> The client takes the last send time and the current time, figures out the
>>> *average* latency, and adjusts its time to predict where the server clock
>>> is right now.
>>> The client always seems to be a little bit off, though, and I think it's
>>> because the sending and receiving speeds / packet delays are different
>>> (the latency isn't symmetric). That would mean the client is /always/
>>> over- or under-compensating.
>>>
>> Use your procedure to get the initial time difference between server time
>> and local time, as well as the half latency. The next time you get a
>> response, try to predict what the server time should be using the
>> information from your previous response. Obviously it will be off, but the
>> direction of the error will tell you which way to shift your prediction.
>>
>> lt - local time; st - server time; lt1 - local time the request was sent;
>> lt2 - local time the response was received; hl - half latency; dif - clock
>> offset.
>>
>> 1) hl  = (lt2 - lt1) / 2
>>    dif = st - (lt1 + hl)
>>
>> 2) predicted_st = lt1 + dif + hl
>>    hl = hl + (st - predicted_st)
>>
>> ... repeat
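
(Transcribing Alexander's two steps into C for my own reference; this is my
reading of them, untested, with names matching his notation:)

    static long hl;            /* estimated half (one-way) latency        */
    static long dif;           /* estimated server-minus-local offset     */
    static int  have_estimate; /* 0 until the first exchange completes    */

    /* lt1 = local time the request was sent, lt2 = local time the reply
       arrived, st = server time stamped in the reply. */
    void on_time_response(long lt1, long lt2, long st)
    {
        if (!have_estimate) {
            /* step 1: assume the latency is symmetric on the first pass */
            hl  = (lt2 - lt1) / 2;
            dif = st - (lt1 + hl);
            have_estimate = 1;
        } else {
            /* step 2: predict the stamp, then fold the error back into hl */
            long predicted_st = lt1 + dif + hl;
            hl += st - predicted_st;
        }
    }

    /* Best guess of the server clock at any local time lt. */
    long server_time(long lt)
    {
        return lt + dif;
    }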
>>
>> You have two variables, dif and hl, that you estimate from your timings. In
>> my example I assume dif was estimated properly the first time around; in
>> practice you need to keep track of how far off your prediction is and see
>> whether you need to redo the whole process.
>>
>> I have not tried this, just my 2c. Let us know how you end up solving the
>> problem.
>>
>> Blue skies,
>> Alexander
>>