[ENet-discuss] Enet's Timer Precision Needs (timeBeginPeriod)

Mike Kasprzak mike at sykhronics.com
Sun Oct 21 16:58:32 PDT 2012


(Hey Sven)

Okay, then let me give a practical example so I don't get a "you
should always use 1ms" response. ;)

To meet Intel's Power Certification requirements, an application must
idle (not run, but idle) with a timer period of 15.6 ms. Basically, if
you never call "timeBeginPeriod", you pass. timeBeginPeriod changes
the timer resolution system-wide, for every running application on
Windows, and the lowest period anyone has requested always wins. Lower
timer periods across all of Windows increase power drain, and that is
why Intel cares. If you're the main in-focus application it's fine,
but if you're minimized, it's "not neighborly" to force the whole
system onto a lower timer period when nothing needs it. At least,
that's the best way I can think of to explain the requirement.

So the practical use: a game wants to set it to 1 ms, to work around
broken vsync and to get better ping times, but while the game is
minimized it doesn't need such a low period.
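
For what it's worth, this is roughly how I'd handle it in my own code.
Untested sketch; the handler name and where you hook it (say, off
WM_ACTIVATE) are just illustrative:

    #include <windows.h>
    #pragma comment(lib, "winmm.lib")

    static BOOL highResolutionActive = FALSE;

    /* Hypothetical handler: call this when the app gains/loses focus. */
    void OnAppActivate (BOOL isActive)
    {
        if (isActive && ! highResolutionActive)
        {
            timeBeginPeriod (1);   /* ask for 1 ms ticks while we're focused */
            highResolutionActive = TRUE;
        }
        else if (! isActive && highResolutionActive)
        {
            timeEndPeriod (1);     /* back to the ~15.6 ms default when minimized */
            highResolutionActive = FALSE;
        }
    }

(Every timeBeginPeriod has to be paired with a matching timeEndPeriod,
which is why I track the state.)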

I found calls to "timeBeginPeriod" in SDL and enet. Again, it's right
there in the win32 implementation of enet_initialize.
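
For reference, the win32 enet_initialize is essentially just WinSock
startup plus that one call. Paraphrasing from memory rather than
quoting the file verbatim, it looks roughly like this:

    /* Roughly what win32.c's enet_initialize does (paraphrased, not verbatim) */
    int
    enet_initialize (void)
    {
        WORD versionRequested = MAKEWORD (1, 1);
        WSADATA wsaData;

        if (WSAStartup (versionRequested, & wsaData))
           return -1;

        if (LOBYTE (wsaData.wVersion) != 1 ||
            HIBYTE (wsaData.wVersion) != 1)
        {
           WSACleanup ();
           return -1;
        }

        timeBeginPeriod (1);    /* <-- the call in question */

        return 0;
    }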

Now sure, I can just go right into win32.c and remove it. Easy.
That's not the issue.

The issue is: if I do that, can anyone think of a reason enet will break on me?

And with that in mind, should enet be doing this all the time anyway?
SDL gives you the option of building the library without timer
support, which is one way of disabling it.

I'm totally cool with it staying as the default and all, but should
there be a "proper" way of disabling it besides editing the source?
Then it becomes my responsibility to set the timer period on Windows
as I gain/lose application focus (or not at all). I'm just hoping
there's not some reason enet will suddenly start losing data because
my clock only updates every 15 ms. :)

Thanks,

Mike Kasprzak
Sykhronics Entertainment
www.sykhronics.com


On Sun, Oct 21, 2012 at 6:57 PM, Sven Bergström <fuzzyspoon at gmail.com> wrote:
> MIKE, what are you doing on my list!
>
> So, I think it's also an implementation detail of the game itself, much like
> Ruud suggests.
> Some games won't need the finer timer precision, some will.
>
> Though I can't claim to understand the full extent of what this changes in
> ENet's internals, from digging around in the code it doesn't appear to
> change enough to cause problems.
>
> I'd wait for Lee to weigh in, though.
>
> Sven
>
>
> On Sun, Oct 21, 2012 at 7:49 PM, Ruud van Gaal <ruud at racer.nl> wrote:
>>
>> I would say the timer accuracy has an influence on ping time, where it
>> helps to get millisecond accuracy instead of just 15 ms.
>> Resend timeouts are large enough that they probably aren't influenced much.
>>
>> Still, any game quickly ends up using 1 ms timing, if only for accurate fps timing.
>>
>> I think ENet will run, just a bit more jerkily than it needs to.
>>
>> Ruud
>>
>>
>> On Sun, Oct 21, 2012 at 6:06 PM, Mike Kasprzak <mike at sykhronics.com>
>> wrote:
>>>
>>> Hi,
>>>
>>> I have a bit of an overly technical enet implementation question, so
>>> apologies in advance.
>>>
>>> Long story short, I've been going through and doing some work to make
>>> an older game of mine pass Intel's power use certification tests. To
>>> do that, you run Intel's Power Checker tool alongside your app/game:
>>>
>>>
>>> http://software.intel.com/en-us/software-assessment?&paneltab[tabs-28-main]=pid-252-28#pid-252-28
>>>
>>> The older game is a totally single-player game that doesn't use
>>> networking at all. That said, I was investigating one of my failure
>>> points (timer precision) by running my new game's client and server,
>>> where I noticed that even the server (with no graphics at all) was also
>>> using a 1 ms precision. A bunch of digging later, I found a
>>> "timeBeginPeriod (1);" call inside the win32 implementation of
>>> enet_initialize.
>>>
>>> My question is: How important is timer precision to enet's overall
>>> implementation?
>>>
>>> I realize that's a vague question, so let me clarify. Leaving the timer
>>> period at (or raising it back to) its default means calls to timeGetTime()
>>> simply won't return a new number until the timer period has passed. So, if
>>> the Windows default of 15 ms is in effect, no matter how many times you
>>> call it during those 15 ms, the result won't change. For reference,
>>> timeGetTime is used inside enet_time_get and enet_time_set, so that's how
>>> it affects enet.
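>>>
>>> A quick way to see the quantization is a standalone spin loop (rough
>>> sketch, not from my actual code; link with winmm.lib):
>>>
>>>     #include <windows.h>
>>>     #include <stdio.h>
>>>
>>>     int main (void)
>>>     {
>>>         DWORD first = timeGetTime ();
>>>         DWORD now;
>>>
>>>         /* Spin until timeGetTime ticks over to a new value. */
>>>         do { now = timeGetTime (); } while (now == first);
>>>
>>>         /* Prints ~15-16 at the default period, ~1 after timeBeginPeriod(1). */
>>>         printf ("tick advanced by %lu ms\n", (unsigned long)(now - first));
>>>         return 0;
>>>     }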
>>>
>>> So a better question: What would a lower precision timer break in enet?
>>>
>>> Like, is the time used in any way that may delay packets? Or is it
>>> purely statistical, like it will just ruin our frame of reference on
>>> when a packet was sent/received?
>>>
>>> Thanks,
>>>
>>> Mike Kasprzak
>>> Sykhronics Entertainment
>>> www.sykhronics.com

