a soft real-time system using python

Jeff Epler jepler at unpythonic.net
Wed Jul 31 14:43:52 EDT 2002


On Wed, Jul 31, 2002 at 12:33:36PM -0400, anton wilson wrote:
> > A realtime system places an upper bound on all operations, but the very
> > nature of Python makes this impractical if not impossible.
> >
> > An interaction between Python's handling of the GIL and the platform's
> > pthread library is going to be the least of your problems if you
> > actually intend to deliver a system which gives realtime guarantees on
> > Python programs.
> >
> > Jeff
> 
> 
> If the soft-real-time system we use makes simple function calls and only uses 
> simple datatypes and possibly tuples, what are some of the potential pitfalls 
> we could run into? I understand the deallocation issue and the garbage 
> collection issue. (Does garbage collection have to be enabled?) Can we 
> somehow get an idea of the worst-case time delays in Python using only simple 
> function calls and assignments with small tuples and simple datatypes?

I've already warned you of one.  Python has two facilities for cleaning
up objects which are no longer referenced: reference counting and
cyclical garbage collection.

Reference counting can cause unbounded delays even for a simple
assignment.  In the second statement of
    x = [unique() for i in range(1000*1000)]
    x = 3
each unique() that was in x is cleaned up by reference counting at the
time of the second assignment, and there's no easy way to place a limit
on the size of containers like this.
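
If you want to see that cost directly, a rough sketch along these lines
will do (here unique is just a stand-in class for whatever cheap object
you like; the timing will obviously vary from machine to machine):

    import time

    class unique:
        pass                     # stand-in for the unique() used above

    x = [unique() for i in range(1000*1000)]

    t0 = time.time()
    x = 3                        # refcounts of a million objects drop to zero here
    print(time.time() - t0)      # the "simple" assignment pays for all the deallocations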

A similar problem rears its head with
    x = [unique() for i in range(1000*1000)]
    x.append(x) # create a circular reference
    x = 3
The need to free a million objects remains, but now it happens whenever
the cyclic garbage collector "feels like" running.  The gc module
provides some functions for modifying the behavior of the cyclic garbage
collector.

(This latter case is handled by the cyclic garbage collector instead of
refcounting because the refcount of x never actually falls to 0 -- it
holds a reference to itself.)
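
If you want some say in *when* that work happens, gc.disable(),
gc.collect() and gc.set_threshold() are the relevant calls.  A rough
sketch (whether this is acceptable depends entirely on your latency
budget -- the collection itself is no cheaper, you just pick the moment):

    import gc

    gc.disable()    # no automatic cyclic collection during the critical section
    # ... time-critical work that may create and drop reference cycles ...
    gc.collect()    # pay the (still potentially lengthy) collection cost here,
                    # at a point of your own choosing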

Sorting is another lengthy operation.  It usually takes around O(n lg n)
comparisons to complete, where n is the length of the list being sorted,
and I don't see any attempt to periodically release/reacquire the GIL in
the sort code.  Of course, the comparison routines themselves may enter
the bytecode loop (where a thread switch can happen), but if the contents
are all hard-to-compare fundamental types the sort can still hold the GIL
for quite a while.  One fairly pathological list would be
    x = [[[]] * 1000 + [random.random()] for i in range(1000)]
(It takes ~4.5 seconds to sort this 1000-element list on my machine;
each element is made up of 1000 empty lists followed by a random float.)
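
A measurement in that spirit is easy to reproduce; roughly (your numbers
will differ from mine):

    import random, time

    x = [[[]] * 1000 + [random.random()] for i in range(1000)]

    t0 = time.time()
    x.sort()                 # each comparison wades through 1000 equal empty lists
    print(time.time() - t0)  # all of that time is spent holding the GIL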

Of course, even the lowly
    64**64**64L
can bring the interpreter to a standstill on a single bytecode.
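
You don't even need something that extreme: the interpreter only
considers switching threads between bytecodes, so any long-integer
arithmetic done as a single expression behaves the same way.  A tamer
sketch (the exponent is arbitrary -- pick whatever makes the delay
visible on your machine):

    import time

    base, exp = 64, 1000000      # big, though nowhere near 64**64**64
    t0 = time.time()
    y = base ** exp              # a single BINARY_POWER bytecode
    print(time.time() - t0)      # no other Python thread can run until it finishes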

IMO, even "soft" realtime based on Python threads is not going to fly.
These items are just off the top of my head; I'm sure there are more,
and more still in any extension module you'd like to include.

Jeff
