time.clock() or time.time()

Shane Hathaway shane at hathawaymix.org
Tue Aug 2 12:28:30 EDT 2005


peterbe at gmail.com wrote:
> What's the difference between time.clock() and time.time()
> (and please don't say clock() is the CPU clock and time() is the actual
> time because that doesn't help me at all :)
> 
> I'm trying to benchmark some function calls for a Zope project, and when I
> use t0=time.clock(); foo(); print time.clock()-t0
> I get much smaller values than when I use time.time() (most of them
> 0.0 but some 0.01)
> When I use time.time() I get values like 0.0133562088013,
> 0.00669002532959 etc.
> To me it looks like time.time() gives a better measure (at least from a
> practical, statistical point of view).

time.time() is very likely what you want.
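
For what it's worth, here is a minimal sketch of the kind of timing you
describe, using time.time() for wall-clock measurement.  foo() is just a
hypothetical stand-in for the Zope call you are benchmarking, and the
snippet assumes Python 2-era syntax to match yours:

    import time

    def foo():
        # hypothetical stand-in for the function being benchmarked
        sum(i * i for i in xrange(100000))

    t0 = time.time()
    foo()
    print "elapsed: %.6f seconds" % (time.time() - t0)  # wall-clock time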

time.time() measures real time, while time.clock() measures the time the 
CPU dedicates to your program.  For example, if your function spends 2 
seconds doing pure I/O and 3 seconds doing pure computation, you should 
expect time.time() to show that 5 seconds elapsed, and time.clock() to 
show that the CPU was consumed by your program for 3 seconds.
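
A rough illustration of the difference (just a sketch, assuming a
Unix-like system where time.clock() reports CPU time; on Windows it
actually measures wall time since the first call):

    import time

    def mostly_waiting():
        time.sleep(2)              # sleeping consumes no CPU time
        for i in xrange(3000000):  # a bit of pure computation
            i * i

    wall0 = time.time()
    cpu0 = time.clock()
    mostly_waiting()
    print "wall clock:", time.time() - wall0   # about 2 seconds plus the loop
    print "CPU time:  ", time.clock() - cpu0   # roughly just the loop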

Shane


