time.time() under load between two machines

kwharrigan at yahoo.com
Fri Jul 22 09:48:39 EDT 2005


I am working on some code using Python in a distributed system.  A
particular message is sent from one machine (with a timestamp logged at
send time), and when the message is received on another machine, a
second timestamp is taken.  I am seeing negative latencies under
intense CPU load.  The machines are synchronized via NTP, so their
clocks should agree closely.  Example:

machine1
--------

sendTime = time.time()
<Call to send message>

machine2
--------

<Got a message>
recvTime = time.time()

Latency = recvTime - sendTime is NEGATIVE
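
To make the setup concrete, here is a minimal, self-contained sketch of
what the two sides do.  The hostname, port, and message format are
simplified placeholders, not my actual code; the real system carries
the timestamp inside a larger message.

# sender (machine1) -- placeholder host/port, not the real setup
import socket
import struct
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sendTime = time.time()                      # timestamp just before sending
sock.sendto(struct.pack("!d", sendTime), ("machine2", 9999))

# receiver (machine2)
import socket
import struct
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9999))
data, addr = sock.recvfrom(1024)
recvTime = time.time()                      # timestamp right after receipt
sendTime = struct.unpack("!d", data)[0]
latency = recvTime - sendTime               # this occasionally comes out negative
print("latency = %f seconds" % latency)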

Is it possible that CPU load is affecting the accuracy of the
time.time() call itself, or that the difference in CPU load between the
two machines is causing this inaccuracy in the delta?  Any help would
be appreciated.
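
In case it matters, the kind of check I could run to separate clock
offset from real latency is an NTP-style round trip, timestamping a
request/reply pair on both ends.  Again, the host, port, and payload
here are placeholders rather than the actual code:

# machine1: send a probe, timestamp both directions
import socket
import struct
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
t1 = time.time()                            # client send time
sock.sendto(struct.pack("!d", t1), ("machine2", 9998))
data, addr = sock.recvfrom(1024)
t4 = time.time()                            # client receive time
t2, t3 = struct.unpack("!dd", data)         # server receive/send times
offset = ((t2 - t1) + (t3 - t4)) / 2.0      # estimated clock offset of machine2
rtt = (t4 - t1) - (t3 - t2)                 # round trip minus server hold time
print("offset = %f s, rtt = %f s" % (offset, rtt))

# machine2: echo back its own receive/send timestamps
import socket
import struct
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9998))
data, addr = sock.recvfrom(1024)
t2 = time.time()                            # server receive time
t3 = time.time()                            # server send time
sock.sendto(struct.pack("!dd", t2, t3), addr)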

Kyle Harrigan



