[IPython-dev] finding the number of available engines

MinRK benjaminrk at gmail.com
Fri Jan 21 15:38:29 EST 2011


On Fri, Jan 21, 2011 at 11:38, Satrajit Ghosh <satra at mit.edu> wrote:

> thanks min and brian.
>
> we'll probably put this feature aside for the time being then.
>
> two questions:
>
> 1. fernando had mentioned that there was an effort to allow creation of
> engines on the fly with shwazik (i'm sure i'm spelling her name wrong). any
> update on the status of this? and was it going to work with 0.10.x, or was it
> primarily intended for 0.11.x?
>

Fernando and Soizic's work on starting engines was only for zmq.  It
shouldn't be terribly difficult to start new engines in either one,
depending on who you want to be starting them.  What conditions do you
want to trigger new engines, and in what environment (PBS, etc.)?  Since
starting an engine is just a shell script, starting a new engine on the
fly amounts to executing the same script, via subprocess.Popen or some
such.
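
For illustration, here is a minimal sketch of that idea.  The `ipengine`
command is the standard engine launcher; any extra arguments for furl
files or profiles are assumptions about your particular setup:

import subprocess

def start_engines(n=1, extra_args=None):
    """Launch n new ipengine processes.

    Assumes `ipengine` is on the PATH and can find the controller's
    connection information (e.g. the default furl file or profile).
    """
    extra_args = extra_args or []
    procs = []
    for _ in range(n):
        # the same shell command a cluster script would run, just
        # invoked directly via Popen
        procs.append(subprocess.Popen(['ipengine'] + extra_args))
    return procs

You could call start_engines(2) from whatever code decides it needs more
workers, or swap the Popen call for the equivalent qsub command if the
engines should go through PBS.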


>
> 2. when is the ETA for a beta version of parallel operation with 0.11.x? we
> would love to move over to 0.11 but are currently waiting on the parallel
> stuff.
>

Hopefully it will be fairly soon.  I was very busy with exams until
recently, and Fernando is busy with work and travel now.  If you want to
play with the current state, it's at
https://github.com/ipython/ipython/tree/newparallel.  I would call it
'alpha', because we have yet to do a final feature/API freeze, but the
`apply` core at the client level is pretty static.
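
If you do try it, client-level usage centers on `apply`.  Here is a
minimal sketch, assuming the import path ends up as IPython.parallel in
0.11 (names on the newparallel branch may still differ):

from IPython.parallel import Client

rc = Client()    # connect using the default profile's connection files
view = rc[:]     # a direct view on all currently registered engines

def double(x):
    return 2 * x

ar = view.apply_async(double, 21)  # run double(21) on every engine, non-blocking
print ar.get()                     # one result per engine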

-MinRK


>
> cheers,
>
> satra
>
>
>
> On Fri, Jan 21, 2011 at 1:08 PM, Brian Granger <ellisonbg at gmail.com> wrote:
>
>> Satra,
>>
>> > i've run into a related problem. i'm trying to run code on an engine that
>> > can itself spawn new tasks, but i can't seem to execute the function below
>> > on an engine. it just doesn't return if you call this through the
>> > taskclient in blocking mode.
>>
>> This use case of recursive tasks is not supported by the current
>> Twisted-based code.  The reason has to do with subtle issues about how
>> the Twisted event loop is run in the engines versus the client.  For
>> the client, we run the Twisted event loop in a thread, which is not
>> compatible with the engine, which runs it in the main thread.  This is
>> probably why Min is seeing the odd GIL/thread-related crash.
>>
>> > it seems to block in the request for the task client and multiengine
>> > client.
>>
>> Yes, this is what you would observe in the current Twisted version of
>> the code.  The new zeromq stuff Min has written doesn't have this
>> limitation.
>>
>> Sorry about the hassle.
>>
>> Cheers,
>>
>> Brian
>>
>> > cheers,
>> >
>> > satra
>> >
>> > -----
>> > # imports assumed by this snippet; the home of ConnectionRefusedError
>> > # is a guess (Twisted's error module)
>> > import sys
>> > from warnings import warn
>> > from twisted.internet.error import ConnectionRefusedError
>> >
>> > def get_ipy_clients():
>> >     """Get ipython clients
>> >
>> >     Returns client, taskclient and multiengineclient
>> >     """
>> >     ipyclient = None
>> >     taskclient = None
>> >     mecclient = None
>> >     try:
>> >         name = 'IPython.kernel.client'
>> >         __import__(name)
>> >         ipyclient = sys.modules[name]
>> >     except ImportError:
>> >         warn("Ipython kernel not found.  Parallel execution will be" \
>> >                  "unavailable", ImportWarning)
>> >     if ipyclient:
>> >         try:
>> >             taskclient = ipyclient.TaskClient()
>> >             mecclient = ipyclient.MultiEngineClient()
>> >         except Exception, e:
>> >             if isinstance(e, ConnectionRefusedError):
>> >                 warn("No clients found, running serially for now.")
>> >             if isinstance(e, ValueError):
>> >                 warn("Ipython kernel not installed")
>> >     return ipyclient, taskclient, mecclient
>> > ------
>> >
>> >
>> >
>> > On Wed, Jan 19, 2011 at 2:42 PM, MinRK <benjaminrk at gmail.com> wrote:
>> >>
>> >> mec.queue_status() returns a list of the form:
>> >> [
>> >>   (0, { 'pending' : "execute('a=5')", 'queue' : [ job1, job2, ... ] }),
>> >>   (1, { 'pending' : 'None', 'queue' : [] }),
>> >>   ...
>> >> ]
>> >>
>> >> In this case, engine 1 is idle.  I don't know why 1's pending is 'None'
>> >> instead of None; that seems to be a bug.
>> >> So you can see the idle engines with something like:
>> >> def idle_engines(mec):
>> >>     """return list of engine_ids corresponding to idle engines."""
>> >>     qs = mec.queue_status()
>> >>     engines = []
>> >>     for e_id, status in qs:
>> >>         if status['queue']:
>> >>             continue
>> >>         if not status['pending'] or status['pending'] == 'None':
>> >>             engines.append(e_id)
>> >>     return engines
>> >> That returns a list of the engine_ids that are idle; its length is of
>> >> course the number of idle engines.
>> >> -MinRK
>> >>
>> >>
>> >> On Wed, Jan 19, 2011 at 10:39, Satrajit Ghosh <satra at mit.edu> wrote:
>> >>>
>> >>> hi brian and min,
>> >>>
>> >>> i would like to do something like this:
>> >>>
>> >>> if num_engines_available() > 2:
>> >>>     do_x
>> >>> else:
>> >>>     do_y
>> >>>
>> >>> in the 0.10.1 series, is there an easy way to query how many idle
>> >>> engines are available?
>> >>>
>> >>> cheers,
>> >>>
>> >>> satra
>> >>>
>> >>>
>> >>>
>> >>
>> >
>> >
>> >
>> >
>>
>>
>>
>> --
>> Brian E. Granger, Ph.D.
>> Assistant Professor of Physics
>> Cal Poly State University, San Luis Obispo
>> bgranger at calpoly.edu
>> ellisonbg at gmail.com
>>
>
>