[docs] 100-fold inaccuracy in time.clock() function on MacOS

Bill Havens havens at cs.sfu.ca
Sun Oct 3 00:39:08 CEST 2010


Dear docs,

I am teaching Python to undergrads here at SFU. I discovered that the time.clock() function yields the processor time in seconds on Windows machines, but on MacOS 10.6.4 it produces the time divided by 100. So a full minute of elapsed real time yields 0.60 seconds. This appears to be repeatable.
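For reference, here is a minimal sketch of the kind of comparison involved (not the exact code from class; time.time() serves as the wall-clock reference, and the one-minute sleep is just an illustrative interval):

    import time

    wall_start = time.time()    # wall-clock reference
    clock_start = time.clock()  # the function in question

    time.sleep(60)              # let roughly one minute of real time pass

    wall_elapsed = time.time() - wall_start
    clock_elapsed = time.clock() - clock_start

    print("time.time()  elapsed: %.2f s" % wall_elapsed)
    print("time.clock() elapsed: %.2f s" % clock_elapsed)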

Surely the underlying C function is the same in Darwin as in other Unix implementations. Have you seen this issue before? Perhaps the docs for this function could be updated if this is expected behaviour.

regards, Bill Havens


________________________________________________________
Dr. William S. (Bill) Havens				http://www.cs.sfu.ca/~havens
Professor Emeritus
School of Computing Science			http://www.cs.sfu.ca/
Simon Fraser University				http://www.sfu.ca/
Burnaby, British Columbia, Canada  V5A 1S6
phone: +1.604.518.1624   fax: +1.604.733.0711   


