On the goodness of gettimeofday()

Discussion of chess software programming and technical issues.

Moderator: Ras

User avatar
sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

On the goodness of gettimeofday()

Post by sje »

I wrote a little C++ snippet that loops a thousand times printing out the current time with microsecond resolution. The value for the wall clock is taken from gettimeofday(). In each case, the resolution is better than one millisecond. Sample output:

On a dual 1.133 GHz Pentium 3 running RedHat 9 Linux:

Code:

2008.04.22 00:07:12.006847
2008.04.22 00:07:12.006915
2008.04.22 00:07:12.006982
2008.04.22 00:07:12.007049
2008.04.22 00:07:12.007117
2008.04.22 00:07:12.007184
2008.04.22 00:07:12.007251
2008.04.22 00:07:12.007321
2008.04.22 00:07:12.007388
2008.04.22 00:07:12.007456
On a 3 GHz Pentium 4 running Ubuntu Linux:

Code:

2008.04.22 00:04:14.880565
2008.04.22 00:04:14.880598
2008.04.22 00:04:14.880633
2008.04.22 00:04:14.880668
2008.04.22 00:04:14.880705
2008.04.22 00:04:14.880740
2008.04.22 00:04:14.880775
2008.04.22 00:04:14.880808
2008.04.22 00:04:14.880838
2008.04.22 00:04:14.880875
On a dual 2.66 GHz Xeon running OpenBSD:

Code:

2008.04.22 00:05:40.695686
2008.04.22 00:05:40.695702
2008.04.22 00:05:40.695720
2008.04.22 00:05:40.695734
2008.04.22 00:05:40.695753
2008.04.22 00:05:40.695773
2008.04.22 00:05:40.695792
2008.04.22 00:05:40.695811
2008.04.22 00:05:40.695831
2008.04.22 00:05:40.695850
cyberfish

Re: On the goodness of gettimeofday()

Post by cyberfish »

Are you sure it's not the I/O taking the time?

What happens if you print every 10 iterations? (Perhaps storing the timevals in an array and printing them out later?)
User avatar
sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

Re: On the goodness of gettimeofday()

Post by sje »

Of course the I/O (here, it's just the O) is taking time. The reason behind the demo is to show the timing resolution available. The output is line buffered, so there is an additional blocking system call for each record.

All modern CPUs and chipsets support high resolution timing to one microsecond or better, even if the OS in use doesn't expose it. On a modern Mac, mach_absolute_time() gives access to a nanosecond-resolution timer. Regular gettimeofday() has to settle for microsecond resolution, as that's the limit of its specification.
cyberfish

Re: On the goodness of gettimeofday()

Post by cyberfish »

Could be wrong, but I thought I read somewhere that high resolution timing is not reliable on anything except real-time OSes (and things like CPU frequency scaling make it even worse).

What I meant in the last post is that it may be even more precise than what you have shown, were it not for the blocking calls. That is why I suggested calling gettimeofday() in succession (with no output between calls).
User avatar
sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

Re: On the goodness of gettimeofday()

Post by sje »

Well, I already know that it's good for 1 usec on my machines. Perhaps you could write your own version for your machines.

(I discarded the source for my tests a while back.)
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: On the goodness of gettimeofday()

Post by bob »

sje wrote:Well, I already know that it's good for 1 usec on my machines. Perhaps you could write your own version for your machines.

(I discarded the source for my tests a while back.)
Note that what you are verifying is simply that the timer value is a monotonically increasing value that changes in microsecond increments, but it doesn't say a thing about how accurate this actually is. The operating system disables interrupts frequently for more than a microsecond...
cyberfish

Re: On the goodness of gettimeofday()

Post by cyberfish »

In other words, precision but (possibly) not accuracy.
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: On the goodness of gettimeofday()

Post by bob »

cyberfish wrote:in other words precision but (possibly) not accuracy.
correct...
User avatar
sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

Re: On the goodness of gettimeofday()

Post by sje »

Well, actually I do know that the microsecond clock is good to one microsecond, although that is not necessarily proven from the sample output.
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: On the goodness of gettimeofday()

Post by bob »

sje wrote:Well, actually I do know that the microsecond clock is good to one microsecond, although that is not necessarily proven from the sample output.
How do you solve the interrupt issue? When any interrupt occurs, the rest are disabled until they are explicitly enabled, and it doesn't take much to miss clock ticks. That's why things like NTP were developed, to correct the time slip that occurs naturally because of this...

I have _never_ seen a computer that could maintain a clock to even 1 second per day, which is millisecond accuracy...