Thread limits?

Quasimodo KILLzdramboSPAMMERS at zdnetmail.com
Sun Aug 20 12:47:04 EDT 2000


Tim,

Thanks for the input.  I'm aware each PC is going to have a fairly small
limit, and to work around that I'm planning on distributing the load across
multiple machines.

If I understand your suggestion correctly, you're suggesting I spawn a few
threads (10, say), wait for them to complete their tasks and be destroyed,
then spawn more from the Queue as the current ones finish?

Sounds reasonable - I'll have to try it.  My only concern is that I'm trying
to get this as close to real-time monitoring as possible (the overhead is
huge, so I settled on per-minute scans).  I can't trigger on a change event
since the source updates in real time (hundreds of times per minute).  Also,
I need to scan approximately 5,000 different items, each with its own
potential source.  So I might get into a situation where, if each thread
takes 10 seconds to process an item, 10 threads will only get 60 items done
per minute, where the maximum requirement is 5,000 per minute (the practical
max is probably around 1,000).
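
Doing the back-of-the-envelope math (the 10 seconds per item is just an
assumed figure for illustration):

    SECONDS_PER_ITEM = 10      # assumed average time to process one item
    TARGET_PER_MINUTE = 1000   # my practical target (hard max is 5,000)

    # Each thread finishes 60 / SECONDS_PER_ITEM = 6 items a minute, so:
    threads_needed = TARGET_PER_MINUTE * SECONDS_PER_ITEM / 60.0
    print threads_needed       # ~167 for 1,000/min; ~833 for 5,000/min

So 100-200 threads per machine might just reach the practical target, but
the hard max clearly needs the load spread across several machines.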

I'm guessing this is going to come down to trial and error, combining both
elements (i.e. running as many threads as the machine can handle, all fed
from a queue) - something like the sketch below.
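
Here's roughly what I have in mind, building on your loop (scan_item,
NUM_WORKERS and the None shutdown signal are just my own placeholder names,
not anything from the library):

    import threading
    import Queue

    NUM_WORKERS = 100              # tune per machine by trial and error
    work_queue = Queue.Queue()

    def scan_item(item):
        # placeholder for the real work: fetch the source, update the DB
        pass

    def worker():
        while 1:
            item = work_queue.get()   # sleeps until the queue is non-empty
            if item is None:          # shutdown signal (put one per worker)
                break
            scan_item(item)

    # Spin off a fixed pool of long-lived threads, per your suggestion.
    for i in range(NUM_WORKERS):
        threading.Thread(target=worker).start()

    # Then each per-minute scanning pass just feeds the items in:
    #     for item in items_to_scan:
    #         work_queue.put(item)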

Thanks again,

Jose



"Tim Peters" <tim_one at email.msn.com> wrote in message
news:LNBBLJKPBEHFEDALKOLCAENDHAAA.tim_one at email.msn.com...
> [Quasimodo]
> > Pretty new to Python, and was messing about with multi-threading
> > (also new).
> >
> > Is there a theoretical limit to the maximum number of threads that can
> > be spawned?
>
> No, but any particular platform will have a rather small *practical*
> limit!
>
> > My machine (450MHz PIII, 320Mb RAM, Win98SE) tends to do its dead-ant
> > impersonation at around 1,000 threads created.  Each thread at the mo
> > just sleeps for 60 seconds, but eventually each will be updating a
> > database every minute and running continuously, leading me to think
> > I'll be lucky running 100-200 on this machine.
>
> If you think you need 1,000 threads, you're solving the wrong problem or
> solving the right one in a doomed way <wink>.  Use the Queue (std library)
> module to enter requests for work.  Spin off a small number of threads,
> each structured like so:
>
>     while 1:
>         request = your_queue_object.get()  # sleeps until queue is non-empty
>         request.deal_with_it()             # do whatever work it asks for
>
> > Just wondering if anyone's got some input on any limits.
>
> Ain't limits you're after, it's a sensible approach <wink>.
>
> if-you-had-a-thousand-things-to-do-at-once-you'd-start-thrashing-too-ly
>     y'rs  - tim
