[ENet-discuss] Enet's Timer Precision Needs (timeBeginPeriod)

Mike Kasprzak mike at sykhronics.com
Sun Oct 21 09:06:22 PDT 2012


Hi,

I have a bit of an overly technical enet implementation question, so
apologies in advance.

Long story short, I've been going through and doing some work to make
an older game of mine pass Intel's power use certification tests. To
do that, you run Intel's Power Checker tool alongside your app/game:

http://software.intel.com/en-us/software-assessment?&paneltab[tabs-28-main]=pid-252-28#pid-252-28

The older game is a completely single-player game that doesn't use
networking at all. That said, while investigating one of my fail
points (timer precision), I ran my new game's client and server and
noticed that even the server (which has no graphics at all) was
using a 1 ms timer precision. A bunch of digging later, I found a
"timeBeginPeriod (1);" call inside the Win32 implementation of
enet_initialize.
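
For context, the relevant Win32 setup looks roughly like the sketch
below. This is paraphrased from memory rather than copied from the
enet source, with the Winsock parts elided:

    /* Rough sketch of enet's win32 init/deinit timer calls (not verbatim source). */
    #include <windows.h>
    #include <mmsystem.h>   /* timeBeginPeriod / timeEndPeriod; link with winmm.lib */

    int
    enet_initialize (void)
    {
        /* ... Winsock startup elided ... */

        /* Request 1 ms timer resolution for the whole process. */
        timeBeginPeriod (1);

        return 0;
    }

    void
    enet_deinitialize (void)
    {
        /* Give back the 1 ms request, restoring the default period. */
        timeEndPeriod (1);

        /* ... Winsock cleanup elided ... */
    }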

My question is: How important is timer precision to enet's overall
implementation?

I realize that's a vague question, so let me clarify. Leaving the
timer period at the Windows default (i.e. not raising the resolution
with timeBeginPeriod) means calls to timeGetTime() simply won't return
a new value until that period has elapsed. So, with the Windows
default of roughly 15 ms, no matter how many times you call
timeGetTime() during those 15 ms, the result won't change. For
reference, timeGetTime is used inside enet_time_get and enet_time_set,
so that's how the timer period affects enet.
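
For anyone following along, those two functions are (roughly) thin
wrappers over timeGetTime(). Again a paraphrased sketch rather than
the actual source, with enet_uint32 stubbed out so it stands alone:

    /* Paraphrased sketch of enet's Win32 time functions (not verbatim source). */
    #include <windows.h>
    #include <mmsystem.h>              /* timeGetTime; link with winmm.lib */

    typedef unsigned int enet_uint32;  /* stand-in for the typedef in enet/types.h */

    static enet_uint32 timeBase = 0;

    /* Milliseconds elapsed relative to the current time base. */
    enet_uint32
    enet_time_get (void)
    {
        return (enet_uint32) timeGetTime () - timeBase;
    }

    /* Shift the time base so enet_time_get() currently reads newTimeBase. */
    void
    enet_time_set (enet_uint32 newTimeBase)
    {
        timeBase = (enet_uint32) timeGetTime () - newTimeBase;
    }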

So, a better question: what would a lower-precision timer actually
break in enet?

For example, is the time used in any way that could delay packets? Or
is it purely statistical, in the sense that it would just skew our
frame of reference for when a packet was sent or received?
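
To make that frame-of-reference concern concrete, here's the tiny
stand-alone test I've been using (my own sketch, not enet code) to see
how coarsely timeGetTime() advances when the period is left at the
default:

    /* Stand-alone test (not enet code): watch how big the jumps in
       timeGetTime() are under the current timer period. */
    #include <stdio.h>
    #include <windows.h>
    #include <mmsystem.h>   /* link with winmm.lib */

    int
    main (void)
    {
        DWORD previous = timeGetTime ();
        int   observed = 0;

        /* Uncomment to compare against a 1 ms period:
           timeBeginPeriod (1); */

        while (observed < 10)
        {
            DWORD now = timeGetTime ();

            if (now != previous)
            {
                /* Prints jumps of ~15-16 ms at the default period,
                   1 ms after timeBeginPeriod (1). */
                printf ("advanced by %lu ms\n", (unsigned long) (now - previous));
                previous = now;
                ++observed;
            }
        }

        return 0;
    }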

Thanks,

Mike Kasprzak
Sykhronics Entertainment
www.sykhronics.com

