[Pythonmac-SIG] time.clock() versus time.time() significant digits

David M. Cooke cookedm at physics.mcmaster.ca
Sun Sep 19 19:34:49 CEST 2004


On Sun, Sep 19, 2004 at 09:33:01AM -0700, Kevin Altis wrote:
> I was looking at someone else's code today that had some time.clock() 
> calls used to figure out how long an operation took. I was surprised to 
> see on the Mac that there were only two significant digits after the 
> decimal point compared to what I saw on Windows. Normally, I use 
> time.time() for calculating time taken on an operation and my numbers 
> on both systems seemed to have the same resolution. Anyway, I opened up 
> the shell on the Mac and indeed it appears that time.time() is using 
> more significant digits. Am I just misinterpreting the results here or 
> is this a bug? I'm using the stock Panther Python 2.3.

No, it's a Windows vs. Unix thing (and here, Mac is a Unix). The two
calls have different uses: time.time() returns system time (since
the Epoch), and time.clock() returns CPU time since the start of the
process (more or less).
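
For instance, a quick session in the interpreter makes the difference
visible (a rough sketch; the actual numbers will vary from machine to
machine):

    import time

    t0 = time.time()    # wall-clock seconds since the Epoch
    c0 = time.clock()   # CPU time used by this process so far (on Unix)
    sum([i * i for i in range(200000)])   # burn a little CPU
    print time.time() - t0   # elapsed wall-clock time, many digits
    print time.clock() - c0  # elapsed CPU time, coarse steps on Unix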

On Unix, time() will usually have a resolution of milliseconds. The
clock() call will have a resolution of 1/os.sysconf('SC_CLK_TCK')
seconds, which under Darwin and Linux works out to 0.01 s.
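
You can check the tick rate on your own machine with something like
this (assuming a Unix where os.sysconf knows about 'SC_CLK_TCK'):

    import os

    ticks = os.sysconf('SC_CLK_TCK')   # clock ticks per second, typically 100
    print ticks, 1.0 / ticks           # 100 ticks/s -> 0.01 s resolution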

The usual advice for timing is to use clock() on Windows, time() on Unix.
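
If you want a single timer name that does the right thing on both
platforms, you can pick it once at import time, which is essentially
what the timeit module's default timer does; a sketch:

    import sys, time

    if sys.platform == 'win32':
        timer = time.clock   # high-resolution wall-clock timer on Windows
    else:
        timer = time.time    # better resolution than clock() on Unix

    start = timer()
    # ... the operation you want to time ...
    elapsed = timer() - start
    print elapsed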

-- 
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke                      http://arbutus.physics.mcmaster.ca/dmc/
|cookedm at physics.mcmaster.ca

