From fperez.net at gmail.com Fri Sep 4 04:31:08 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 01:31:08 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
Message-ID:

Hi all,

I know I should have been hard at work on continuing the branch review work I have in an open tab of Brian's push, but I couldn't resist. Please bear with me; this is a bit technical but, I hope, very interesting in the long run for us...

This part of Ars Technica's excellent review of Snow Leopard:

http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/13

shows how Apple tackled the problem of providing civilized primitives to express parallelism in applications, and a mechanism to make useful decisions based on this information. The idea is to combine a kernel-level dispatcher (GCD), a beautiful extension to the C language (yes, they extended C!) in the form of anonymous code blocks, and an API to inform GCD of your parallelism breakup easily, so GCD can use your intentions at runtime efficiently. It's really a remarkably simple, yet effective (IMHO) combination of tools to tackle a thorny problem.

In any case, what does all this have to do with us? For a long time we've wondered how to provide the easiest, simplest APIs that appear natural to the user, that are trivially easy to convert into serial execution mode (with a simple global switch for debugging, NOT changing any actual code), and that can permit execution via ipython. A while ago I hacked something via 'with' and context managers that was so horrible and brittle (it involved stack manipulations, manual source introspection and exception injection) that I realized it could never really fly for production work.
But this article on GCD got me trying my 'with' approach again, and I realized that syntactically it felt quite nice: I could write python versions of the code examples in that review, yet the whole 'with' mess killed it for me. And then it hit me that decorators could be abused just a little bit to get the same job done [1]! While this may be somewhat of an abuse, it does NOT involve source introspection or stack manipulations, so in principle it's 100% kosher, robust python. A little weird the first time you see it, but bear with me.

The code below shows an implementation of a simple for loop directly and via a decorator. Both versions do the same thing, but the point is that by providing such decorators, we can *trivially* provide a GCD-style API for users to express their parallelism and have execution chunks handled by ipython remotely. It's obvious that such decorators can also be used to dispatch code to Cython, to a GPU, to a CorePy-based optimizer, to a profiler, etc. I think this could be a useful idea in more than one context, and it certainly feels to me like one of the missing API/usability pieces we've struggled with for the ipython distributed machinery.

Cheers,

f

[1] What clicked in my head was tying the 'with' mess to how the Sage notebook uses the @interact decorator to immediately call the decorated function rather than decorating it and returning it. This immediate-consumption (ab)use of a decorator is what I'm using.

### CODE example

# Consider a simple pair of 'loop body' and 'loop summary' functions:

def do_work(data, i):
    return data[i]/2

def summarize(results, count):
    return sum(results[:count])

# And some 'dataset' (here just a list of 10 numbers)
count = 10
data = [3.0*j for j in range(count)]

# That we'll process.  This is our processing loop, implemented as a regular
# serial function that preallocates storage and then goes to work.
def loop_serial():
    results = [None]*count

    for i in range(count):
        results[i] = do_work(data, i)

    return summarize(results, count)

# The same thing can be done with a decorator:

def for_each(iterable):
    """This decorator-based loop does a normal serial run.
    But in principle it could be doing the dispatch remotely,
    or into a thread pool, etc.
    """
    def call(func):
        map(func, iterable)
    return call

# This is the actual code of the decorator-based loop:
def loop_deco():
    results = [None]*count

    @for_each(range(count))
    def loop(i):
        results[i] = do_work(data, i)

    return summarize(results, count)

# Test
assert loop_serial() == loop_deco()
print 'OK'

From edreamleo at gmail.com Fri Sep 4 07:27:17 2009
From: edreamleo at gmail.com (Edward K. Ream)
Date: Fri, 4 Sep 2009 06:27:17 -0500
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 3:31 AM, Fernando Perez wrote:
>
> This part of Ars Technica's excellent review of Snow Leopard:
>
> http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/13
>
> shows how Apple tackled the problem of providing civilized primitives
> to express parallelism in applications and a mechanism to make useful
> decisions based on this information.

Many thanks for bringing this to our attention.

Edward
--------------------------------------------------------------------
Edward K. Ream email: edreamleo at gmail.com
Leo: http://webpages.charter.net/edreamleo/front.html
--------------------------------------------------------------------
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From edreamleo at gmail.com Fri Sep 4 08:01:51 2009
From: edreamleo at gmail.com (Edward K.
Ream)
Date: Fri, 4 Sep 2009 07:01:51 -0500
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 3:31 AM, Fernando Perez wrote:
>
> The code below shows an implementation of a simple for loop directly
> and via a decorator.  Both versions do the same thing, but the point
> is that by providing such decorators, we can *trivially* provide a
> GCD-style API for users to express their parallelism and have
> execution chunks handled by ipython remotely.

Fascinating. Let me ask some basic questions to see if I understand.

1. Both loops do: results = [None]*count

Is synchronization needed to update this array?

2. You call the decorator @for_each. Would @parallel be more descriptive?

3. The docstring for for_each is:

    """This decorator-based loop does a normal serial run.
    But in principle it could be doing the dispatch remotely,
    or into a thread pool, etc.
    """

So you are thinking that a call might be something like:

    def call(func):
        for i in iterable:
            << create thread >>

And these threads would place their computations in the global results?

Edward
--------------------------------------------------------------------
Edward K. Ream email: edreamleo at gmail.com
Leo: http://webpages.charter.net/edreamleo/front.html
--------------------------------------------------------------------
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From dsdale24 at gmail.com Fri Sep 4 08:05:47 2009
From: dsdale24 at gmail.com (Darren Dale)
Date: Fri, 4 Sep 2009 08:05:47 -0400
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

Hi Fernando,

On Fri, Sep 4, 2009 at 4:31 AM, Fernando Perez wrote:
> # This is the actual code of the decorator-based loop:
> def loop_deco():
>     results = [None]*count
>
>     @for_each(range(count))
>     def loop(i):
>         results[i] = do_work(data, i)
>
>     return summarize(results, count)

If I have understood correctly, this example passes around a lot more data than is necessary.
for_each passes the whole data list to do_work, which picks off the bit it's looking for. I guess this doesn't seem like an intuitive interface to me, plus it's expensive. What is being parallelized is:

results = [None]*len(data)
for i, d in enumerate(data):
    results[i] = do_work(d)

Why not use a simple function instead of a decorator:

results = for_each(data, do_work)

?

Darren

From fperez.net at gmail.com Fri Sep 4 13:56:14 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 10:56:14 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 5:01 AM, Edward K. Ream wrote:
> On Fri, Sep 4, 2009 at 3:31 AM, Fernando Perez wrote:
>>
>> The code below shows an implementation of a simple for loop directly
>> and via a decorator.  Both versions do the same thing, but the point
>> is that by providing such decorators, we can *trivially* provide a
>> GCD-style API for users to express their parallelism and have
>> execution chunks handled by ipython remotely.
>
> Fascinating.  Let me ask some basic questions to see if I understand.
>
> 1.  Both loops do:  results = [None]*count
>
> Is synchronization needed to update this array?

This trivial example was written for the case where every update was 100% independent of every other, so there is no locking/synchronization involved. I was just transliterating the original example in the ArsTechnica review as a proof of concept, so I kept it as close to the original as possible.

> 2. You call the decorator @for_each.  Would @parallel be more descriptive?

I called it for_each just to keep the syntactic parallels:

for i in range(count):
    results[i] = do_work(data, i)

--->

@for_each(range(count))
def loop(i):
    results[i] = do_work(data, i)

But a library that exposes a set of such decorators would probably use different, more precise names.
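[Editor's note: as a hypothetical illustration of what such a differently-named, genuinely dispatching decorator could look like, here is a sketch in modern Python 3 syntax. The name parallel_for_each and the use of concurrent.futures are illustrative assumptions, not code from this thread.]

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_for_each(iterable):
    """Immediate-consumption decorator: run the decorated block once
    per item of iterable, via a thread pool instead of serially."""
    def call(func):
        with ThreadPoolExecutor() as pool:
            # list() forces all submitted calls to complete before returning
            list(pool.map(func, iterable))
        # Nothing useful to return: the block mutates the enclosing scope.
    return call

count = 10
data = [3.0 * j for j in range(count)]
results = [None] * count

@parallel_for_each(range(count))
def loop(i):
    # Each call writes to a distinct slot, so no locking is needed here.
    results[i] = data[i] / 2

total = sum(results)
```

The syntax at the call site is unchanged from the serial for_each; only the decorator's internals decide where the work runs, which is the GCD-style separation being discussed.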
In this case I was just trying to illustrate the syntax similarities.

> 3. The docstring for for_each is:
>
>     """This decorator-based loop does a normal serial run.
>     But in principle it could be doing the dispatch remotely,
>     or into a thread pool, etc.
>     """
>
> So you are thinking that a call might be something like:
>
>     def call(func):
>         for i in iterable:
>             << create thread >>
>
> And these threads would place their computations in the global results?

Indeed, that's how the Apple GCD library is meant to be used. In our case, ipython might ship the decorated function and its closure to remote engines, so instead of << create thread >> it would be more like << send to remote engine >>, but that's the basic idea.

This is actually something very simple, already in the language, but I think that it can be a useful pattern to expose and use more widely. The basic (trivial, honestly, but I'm pretty dense with these things so it took me a while to swallow it) insight is that in Python, the only scoping construct we have is 'def', whereas 'with', loops and almost anything else that starts an indented block with a ':' is NOT a scoping construct. So while 'with' was introduced as a replacement for anonymous blocks (see PEP 340, http://www.python.org/dev/peps/pep-0340/), it doesn't really provide scoping, a key attribute of true blocks. But 'def' creates a scope, nicely wrapping its surroundings via a closure, and the @deco syntax gives you immediate access to manipulate that scope as you see fit. So implementing lots of things that control the execution of local pieces of code becomes quite simple and natural.
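[Editor's note: the scoping point above can be demonstrated in a few lines: 'def' plus an immediate-consumption decorator gives an immediately-executed block that reads and mutates the enclosing function's variables through its closure. A minimal sketch; run_now is an illustrative name, not an API from this thread.]

```python
def run_now(func):
    # Immediate-consumption decorator: call the block right away and
    # discard the function object (the decorated name is never used).
    func()

def outer():
    results = []
    threshold = 2

    @run_now
    def block():
        # 'results' and 'threshold' come from outer()'s scope via the
        # closure; no arguments need to be passed in.
        for i in range(5):
            if i >= threshold:
                results.append(i * 10)

    # By the time control reaches here, block() has already executed.
    return results

out = outer()
```

A 'with' block or a bare for loop could not be handed off like this, because neither creates a function object to ship around; only 'def' does.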
Doing things that involve source transformations (like deferring to Cython for compilation) will still be ugly, because functions don't carry their source, as the idea of having func.__source__ permanently stored never quite flew:

http://mail.python.org/pipermail/python-dev/2004-December/050190.html

However, you do have access in the decorator to the full function object, which includes its bytecode and other niceties to play tricks with. I'll have to play a little bit more with examples of this idea to see if it's really useful or just my tired brain deluding itself :)

Many thanks for the feedback and interest!

Cheers,

f

From fperez.net at gmail.com Fri Sep 4 14:01:19 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 11:01:19 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

Hi Darren,

On Fri, Sep 4, 2009 at 5:05 AM, Darren Dale wrote:
> On Fri, Sep 4, 2009 at 4:31 AM, Fernando Perez wrote:
>> # This is the actual code of the decorator-based loop:
>> def loop_deco():
>>     results = [None]*count
>>
>>     @for_each(range(count))
>>     def loop(i):
>>         results[i] = do_work(data, i)
>>
>>     return summarize(results, count)
>
> If I have understood correctly, this example passes around a lot more
> data than is necessary. for_each passes the whole data list to
> do_work, which picks off the bit it's looking for. I guess this doesn't
> seem like an intuitive interface to me, plus it's expensive. What is
> being parallelized is:
>
> results = [None]*len(data)
> for i, d in enumerate(data):
>     results[i] = do_work(d)
>
> Why not use a simple function instead of a decorator:
>
> results = for_each(data, do_work)
>
> ?

Indeed, in this case 'data' is passed around, but I wrote it that way only to keep it as close as possible to the syntax of the original example I started from in the Ars review of Snow Leopard and the Grand Central Dispatch API.
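[Editor's note: the functional form Darren proposes is easy to sketch; in modern Python it is essentially a result-collecting map. Illustrative code, not from the original message.]

```python
def for_each(data, func):
    # Serial reference implementation of Darren's proposed API:
    # apply func to each item and collect the results in order.
    # A parallel version could farm each func(d) call out to a worker.
    return [func(d) for d in data]

def do_work(d):
    # Takes the item itself rather than (data, i), so only the needed
    # datum would ever have to be shipped to a remote worker.
    return d / 2

data = [3.0 * j for j in range(10)]
results = for_each(data, do_work)
```

This trades the inline-block syntax for a conventional call, which is exactly the design tension debated in the rest of the thread.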
In practice, you'd obviously implement things differently. But my main point was not the parallelization of a loop, but rather the basic idea of using a decorator to swap the execution context of a bit of code for another one, be it a thread, a remote ipython engine, a GPU, a tracing utility, a profiler, a cython JIT engine or anything else. Perhaps I chose my example a little poorly to get that point across; sorry if that was the case. It would be good to come up with more obviously useful and unambiguous examples of this, and I'd love it if we generate some interesting discussion here. I'll continue playing with this idea in my copious spare time, until Brian's patience with my lack of code review in the last few days runs out ;)

Does that help clarify my intent? Thanks for the interest and feedback!

Cheers,

f

From fperez.net at gmail.com Fri Sep 4 14:41:48 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 11:41:48 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 11:01 AM, Fernando Perez wrote:

> But my main point was not about the parallelization of a loop, but
> rather about the basic idea of using a decorator to swap the execution
> context of a bit of code for another one, be it a thread, a remote
> ipython engine, a GPU, a tracing utility, a profiler, a cython JIT
> engine or anything else.  Perhaps I chose my example a little poorly
> to get that point across, sorry if that was the case.  It would be
> good to come up with more obviously useful and unambiguous examples of
> this, I'd love it if we generate some interesting discussion here.
> I'll continue playing with this idea in my copious spare time, until
> Brian's patience with my lack of code review in the last few days runs
> out ;)

Here's another trivial example: suppose you'd like to trace some code.
Again, starting from the simple loop from before:

def loop_serial():
    results = [None]*count

    for i in range(count):
        results[i] = do_work(data, i)

    return summarize(results, count)

you can then use this decorator:

def traced(func):
    import trace
    t = trace.Trace()
    t.runfunc(func)

and a 2-line change of code:

def loop_traced():
    results = [None]*count

    @traced  ### NEW
    def func():  ### NEW, the name is irrelevant
        for i in range(count):
            results[i] = do_work(data, i)

    return summarize(results, count)

gives on execution:

In [12]: run contexts.py
 --- modulename: contexts, funcname: func
contexts.py(64):     for i in range(count):
contexts.py(65):         @traced
 --- modulename: contexts, funcname: do_work
contexts.py(10):     return data[i]/2
contexts.py(64):     for i in range(count):
contexts.py(65):         @traced

... etc.

This shows how trivial, small decorators can be used to control code execution. For example, if you are a fan of Robert's fabulous line_profiler (http://packages.python.org/line_profiler/), using this trivial trick you can profile arbitrarily small chunks of code inline:

def profiled(func):
    import line_profiler
    prof = line_profiler.LineProfiler()
    f = prof(func)
    f()
    prof.print_stats()
    prof.disable()

def loop_profiled():
    results = [None]*count

    @profiled  # NEW
    def block():  # NEW
        for i in range(count):
            results[i] = do_work(data, i)

    return summarize(results, count)

When run, you get:

In [3]: run contexts.py
Timer unit: 1e-06 s

File: contexts.py
Function: block at line 82
Total time: 1.6e-05 s

Line #      Hits         Time  Per Hit   % Time  Line Contents
==============================================================
    82                                           @profiled
    83                                           def block():
    84         5            7      1.4     43.8      for i in range(count):
    85         4            9      2.2     56.2          results[i] = do_work(data, i)

Do these examples illustrate the idea better?
Cheers,

f

From dsdale24 at gmail.com Fri Sep 4 15:17:21 2009
From: dsdale24 at gmail.com (Darren Dale)
Date: Fri, 4 Sep 2009 15:17:21 -0400
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 2:41 PM, Fernando Perez wrote:

> Here's another trivial example: suppose you'd like to trace some code.
[... full quote of the traced and profiled examples trimmed ...]
> Do these examples illustrate the idea better?

Your intent was clear, but the implementation still leaves me wondering what is gained by using the @decorator syntax. Maybe I have missed something, please bear with me.

With @decorator, you are still passing a function to another function to modify its execution; it's just a different syntax that achieves the same result, isn't it? For example, using your for_each, for_each(range(count))(loop) yields the same result without having to redefine loop each time you call loop_deco.
In your above examples, without the @decorator syntax, func/block (let's call it func since they are identical) can be defined once outside loop_profiled or loop_traced, and then the execution of that func can be temporarily modified using traced(func) and profiled(func) as you have written them. My point is that the decorator syntax doesn't yield you anything that you didn't already have, and it even seems more limiting, because @decorator is just a syntactic nicety focused on function and class definitions, so we don't have to do things like:

def func():
    ...
func = modify(func)

Since @decorator rebinds the modified func to func, you have to keep redefining it within the various contexts, which seems to defeat the purpose.

Darren

From fperez.net at gmail.com Fri Sep 4 15:30:30 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 12:30:30 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 12:17 PM, Darren Dale wrote:
> Your intent was clear, but the implementation still leaves me
> wondering what is gained by using the @decorator syntax. Maybe I have
> missed something, please bear with me.
>
> With @decorator, you are still passing a function to another function
> to modify its execution; it's just a different syntax that achieves the
> same result, isn't it? For example, using your for_each,
> for_each(range(count))(loop) yields the same result without having to
> redefine loop each time you call loop_deco.

I think the only point to take home is that these decorators *execute* the decorated function right away, and you can ignore it.
Let's look only at the loop body:

    for i in range(count):
        results[i] = do_work(data, i)

What this simple pattern allows us to do is to (say) profile this chunk of code right where it is, without having to move it out to another separate function (where you'd need to create and pass arguments to provide local variables, for example), by doing two things:

1. Prepend two lines of code:

    @profiled
    def block():

2. Indent by one level the code you wanted to profile, leaving you with:

    @profiled
    def block():
        for i in range(count):
            results[i] = do_work(data, i)

That's it. All local variables remain available, you don't need to pass arguments around, you never actually call block() yourself, nada. For example, if you wanted the profiling to happen one level deeper instead, you can do this:

    for i in range(count):
        @profiled
        def block():
            results[i] = do_work(data, i)

You've just swapped indentation levels, you didn't need to worry about passing variables around, and now the profiling happens per iteration of the loop rather than at the top. This may be a silly example, but it tries to show how the point of this little exercise is to switch scoped execution control with the least intrusive amount of work possible.

I admit though: this is so trivial, and already in the language, that maybe I am indeed happy about nothing :) So I really appreciate the feedback, especially the critical kind! If this is pointless, might as well kill it before wasting anyone's time with it, so please *do* be critical of this.
Thanks again,

f

From dsdale24 at gmail.com Fri Sep 4 16:18:53 2009
From: dsdale24 at gmail.com (Darren Dale)
Date: Fri, 4 Sep 2009 16:18:53 -0400
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 3:30 PM, Fernando Perez wrote:
> I think the only point to take home is that these decorators *execute*
> the decorated function right away, and you can ignore it.
[...]
> What this simple pattern allows us to do is to (say) profile this
> chunk of code right where it is, without having to move it out to
> another separate function (where you'd need to create and pass
> arguments to provide local variables for example),

Ok, I didn't appreciate that you defined func where you did in order to get access to those local variables. And while I prefer the decorator syntax, it is still identical to "def block(): ...; profiled(block)", right?

> by doing two things:
>
> 1. Prepend two lines of code:
>
>     @profiled
>     def block():
>
> 2. Indent by one level the code you wanted to profile, leaving you with:
>
>     @profiled
>     def block():
>         for i in range(count):
>             results[i] = do_work(data, i)
>
> That's it.  All local variables remain available, you don't need to
> pass arguments around, you never actually call block() yourself, nada.

Ok, or:

    def block():
        for i in range(count):
            results[i] = do_work(data, i)
    profiled(block)

> For example, if you wanted the profiling to happen one level deeper
> instead, you can do this instead:
>
>     for i in range(count):
>         @profiled
>         def block():
>             results[i] = do_work(data, i)
>
> You've just swapped indentation levels, you didn't need to worry about
> passing variables around, and now the profiling happens per-call of
> the loop rather than at the top.

    for i in range(count):
        def block():
            results[i] = do_work(data, i)
        profiled(block)

> I admit though: this is so trivial, and already in the language, that
> maybe I am indeed happy about nothing :)  So I really appreciate the
> feedback, especially critical one!  If this is pointless, might as
> well kill it before wasting anyone's time with it, so please *do* be
> critical of this.

One other potential problem that just occurred to me: it is a non-standard use of the decorator syntax. When I first saw it I thought "wait, where does he call loop?" It happened inside the decorator, but the standard use is to rebind the returned value to loop (or block, or whatever). PEP318 doesn't mention this pattern you have implemented, and I didn't see it at http://wiki.python.org/moin/PythonDecoratorLibrary either (I did not do an exhaustive search). In this case, since the decorator syntax has a very well established purpose, I think the "profiled(block)" alternative is much more obvious.

Darren

From edreamleo at gmail.com Fri Sep 4 16:53:01 2009
From: edreamleo at gmail.com (Edward K.
Ream) Date: Fri, 4 Sep 2009 15:53:01 -0500 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: On Fri, Sep 4, 2009 at 2:30 PM, Fernando Perez wrote: > > I admit though: this is so trivial, and already in the language, that > maybe I am indeed happy about nothing :) So I really appreciate the > feedback, especially critical one! If this is pointless, might as > well kill it before wasting anyone's time with it, so please *do* be > critical of this. > Imo, it is a capital mistake to second-guess or prematurely criticize excitement and invention. There are plenty of inventions that can be simulated by other constructs, but that doesn't make the inventions useless. Using decorators rather than functions is a change of view, and it is most unwise to underestimate the potential of a change in view. For example, Einstein did not invent the Lorentz transformation, he "merely" created a new point of view in which the transformations were something other than a mathematical hack to make c be constant in all reference frames. Also, it is in no way an abuse of decorators to use them in unexpected, unusual, creative ways, provided only that you are not relying on some undocumented accidental feature. Edward -------------------------------------------------------------------- Edward K. Ream email: edreamleo at gmail.com Leo: http://webpages.charter.net/edreamleo/front.html -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From fperez.net at gmail.com Fri Sep 4 19:54:22 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 16:54:22 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 1:18 PM, Darren Dale wrote:
> Ok, I didn't appreciate that you defined func where you did in order
> to get access to those local variables. And while I prefer the
> decorator syntax, it is still identical to "def block(): ...;
> profiled(block)", right?

Yes, absolutely. This is an unconventional, but 100% 'compliant' use of @ syntax.

> Ok, or:
>
>     def block():
>         for i in range(count):
>             results[i] = do_work(data, i)
>     profiled(block)

Certainly, it's still the normal python equivalence:

@foo
def bar(): pass

==

def bar(): pass
bar = foo(bar)

> One other potential problem that just occurred to me: it is a
> non-standard use of the decorator syntax. When I first saw it I
> thought "wait, where does he call loop?" It happened inside the
> decorator, but the standard use is to rebind the returned value to
> loop (or block, or whatever). PEP318 doesn't mention this pattern you
> have implemented, and I didn't see it at
> http://wiki.python.org/moin/PythonDecoratorLibrary either (I did not
> do an exhaustive search). In this case, since the decorator syntax has
> a very well established purpose, I think the "profiled(block)"
> alternative is much more obvious.

As I mentioned before, it's a non-standard use of @, but I can't claim to have invented it: the Apple GCD and ^{} block discussion forced me to rethink the problem, and I remembered having seen this pattern in Sage's @interact implementation. But it is indeed an unusual and somewhat surprising use of @decorators, I do admit that. However, I actually find it quite clear and I like how it reads inline immediately.
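[Editor's note: the equivalence stated above can be checked directly. A tiny self-contained example; shout is an illustrative decorator, not from the thread.]

```python
def shout(func):
    # A conventional decorator: returns a wrapped function.
    def wrapper():
        return func().upper()
    return wrapper

# Decorator syntax...
@shout
def greet():
    return "hi"

# ...is just sugar for define-then-rebind:
def greet2():
    return "hi"
greet2 = shout(greet2)

same = greet() == greet2()  # both give "HI"
```

The immediate-consumption pattern is the degenerate case where the decorator calls func itself and returns nothing, so the rebinding is irrelevant.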
If this pattern of "inline decorators" becomes popular for certain uses, the surprise factor will fade. The reason I like it as-is is that it immediately declares the actions that will affect the block below, much like 'with' declares the context manager up-front (no surprise, since I arrived at this while thinking about 'with'-based implementations of this idea). But you are certainly welcome to use the post-block implementation if you find it clearer; after all, the resulting execution is 100% the same. I suppose time and usage will tell us which pattern ends up working better in long-term practice.

Cheers,

f

From fperez.net at gmail.com Fri Sep 4 20:00:10 2009
From: fperez.net at gmail.com (Fernando Perez)
Date: Fri, 4 Sep 2009 17:00:10 -0700
Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control
In-Reply-To:
References:
Message-ID:

On Fri, Sep 4, 2009 at 1:53 PM, Edward K. Ream wrote:
> Imo, it is a capital mistake to second-guess or prematurely criticize
> excitement and invention.  There are plenty of inventions that can be
> simulated by other constructs, but that doesn't make the inventions
> useless.  Using decorators rather than functions is a change of view, and it
> is most unwise to underestimate the potential of a change in view.
>
> For example, Einstein did not invent the Lorentz transformation, he "merely"
> created a new point of view in which the transformations were something
> other than a mathematical hack to make c be constant in all reference
> frames.
>
> Also, it is in no way an abuse of decorators to use them in unexpected,
> unusual, creative ways, provided only that you are not relying on some
> undocumented accidental feature.

Many thanks for the kind words, which to a physicist sound particularly nice ;)

Though I do genuinely appreciate solid, critical feedback like Darren's: an idea solidifies from contact with intelligent, if respectful, criticism.
Continuing with the history of physics theme, we can thank Einstein not only for his 1905 paper on the photoelectric effect, which helped give birth to quantum mechanics, but much more importantly for his heated and passionate debates with Bohr et al over the Copenhagen interpretation of QM, culminating in the 1935 EPR paper, which gave QM its most solid conceptual foundation. By relentlessly attacking QM with intelligence and creativity, Einstein spurred a debate that helped clarify many ideas that were not well stated at the time (though QM to this day remains a surprisingly subtle theory for all that).

I think it's great that we can have precisely this kind of debate here, where an idea is challenged to make it better, yet the challenge is friendly, encouraging and respectful enough that it doesn't stifle creativity or nip potentially good ideas in the bud.

I think both you and Darren have provided me today with the best kind of feedback in this spirit, for which I am very grateful. All the best, f

From dsdale24 at gmail.com Fri Sep 4 21:12:03 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Fri, 4 Sep 2009 21:12:03 -0400 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID:

On Fri, Sep 4, 2009 at 8:00 PM, Fernando Perez wrote:
> On Fri, Sep 4, 2009 at 1:53 PM, Edward K. Ream wrote:
>> Imo, it is a capital mistake to second-guess or prematurely criticize excitement and invention. There are plenty of inventions that can be simulated by other constructs, but that doesn't make the inventions useless. Using decorators rather than functions is a change of view, and it is most unwise to underestimate the potential of a change in view.
>> For example, Einstein did not invent the Lorentz transformation, he "merely" created a new point of view in which the transformations were something other than a mathematical hack to make c be constant in all reference frames.
>>
>> Also, it is in no way an abuse of decorators to use them in unexpected, unusual, creative ways, provided only that you are not relying on some undocumented accidental feature.
>
> Many thanks for the kind words, which to a physicist sound particularly nice ;)
>
> Though I do genuinely appreciate solid, critical feedback like Darren's: an idea solidifies from contact with intelligent, if respectful, criticism. Continuing with the history of physics theme, we can thank Einstein not only for his 1905 paper on the photoelectric effect, which helped give birth to quantum mechanics, but much more importantly for his heated and passionate debates with Bohr et al over the Copenhagen interpretation of QM, culminating in the 1935 EPR paper, which gave QM its most solid conceptual foundation. By relentlessly attacking QM with intelligence and creativity, Einstein spurred a debate that helped clarify many ideas that were not well stated at the time (though QM to this day remains a surprisingly subtle theory for all that).
>
> I think it's great that we can have precisely this kind of debate here, where an idea is challenged to make it better, yet the challenge is friendly, encouraging and respectful enough that it doesn't stifle creativity or nip potentially good ideas in the bud.
>
> I think both you and Darren have provided me today with the best kind of feedback in this spirit, for which I am very grateful.

Coincidentally, Fernando and I were just commenting off list this morning about how much we appreciate this kind of discussion. All my comments here were meant to be respectful and constructive, to help firm up ideas.
Darren

From dsdale24 at gmail.com Sat Sep 5 10:51:53 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Sat, 5 Sep 2009 10:51:53 -0400 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID:

On Fri, Sep 4, 2009 at 9:12 PM, Darren Dale wrote:
> On Fri, Sep 4, 2009 at 8:00 PM, Fernando Perez wrote:
>> On Fri, Sep 4, 2009 at 1:53 PM, Edward K. Ream wrote:
>>> Imo, it is a capital mistake to second-guess or prematurely criticize excitement and invention. There are plenty of inventions that can be simulated by other constructs, but that doesn't make the inventions useless. Using decorators rather than functions is a change of view, and it is most unwise to underestimate the potential of a change in view.
>>>
>>> For example, Einstein did not invent the Lorentz transformation, he "merely" created a new point of view in which the transformations were something other than a mathematical hack to make c be constant in all reference frames.
>>>
>>> Also, it is in no way an abuse of decorators to use them in unexpected, unusual, creative ways, provided only that you are not relying on some undocumented accidental feature.
>>
>> Many thanks for the kind words, which to a physicist sound particularly nice ;)
>>
>> Though I do genuinely appreciate solid, critical feedback like Darren's: an idea solidifies from contact with intelligent, if respectful, criticism. Continuing with the history of physics theme, we can thank Einstein not only for his 1905 paper on the photoelectric effect, which helped give birth to quantum mechanics, but much more importantly for his heated and passionate debates with Bohr et al over the Copenhagen interpretation of QM, culminating in the 1935 EPR paper, which gave QM its most solid conceptual foundation.
>> By relentlessly attacking QM with intelligence and creativity, Einstein spurred a debate that helped clarify many ideas that were not well stated at the time (though QM to this day remains a surprisingly subtle theory for all that).
>>
>> I think it's great that we can have precisely this kind of debate here, where an idea is challenged to make it better, yet the challenge is friendly, encouraging and respectful enough that it doesn't stifle creativity or nip potentially good ideas in the bud.
>>
>> I think both you and Darren have provided me today with the best kind of feedback in this spirit, for which I am very grateful.
>
> Coincidentally, Fernando and I were just commenting off list this morning about how much we appreciate this kind of discussion. All my comments here were meant to be respectful and constructive, to help firm up ideas.

I slept on this last night, and this morning when I reread your original post I had to laugh at myself for taking so long and such a winding route to understanding and processing all the points you raised. So thanks for being patient with me. You guys can continue discussing Einstein, I'll sit here and scratch my stomach.

Do you think there is the possibility of building on this mechanism? I mean, you have a working demonstration using features that are already built into the language. Do you think it would be possible/worth looking into adding syntax to the python language that provided anonymous code blocks without having to define functions and decorate them (in the general sense, using either @deco or deco(foo))? I'm thinking of something similar to the "with" syntax that creates a scope and allows you to manipulate it. Is there any chance that the GIL issue could be addressed using such a mechanism?

Darren

From edreamleo at gmail.com Sat Sep 5 17:52:02 2009 From: edreamleo at gmail.com (Edward K.
Ream) Date: Sat, 5 Sep 2009 16:52:02 -0500 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID:

On Fri, Sep 4, 2009 at 3:53 PM, Edward K. Ream wrote:

> Also, it is in no way an abuse of decorators to use them in unexpected, unusual, creative ways, provided only that you are not relying on some undocumented accidental feature.

Inspired by this thread, I decided to deepen my understanding of decorators. To state my conclusion first, to truly understand decorators it is a good idea to completely ignore pep 318 and all related tutorials :-) Indeed, everything you need to know (everything there *is* to know) about decorators is in the Reference Guide: http://docs.python.org/reference/compound_stmts.html#function-definitions

Specifically, the reference guide has only this to say about decorators:

QQQ
A function definition may be wrapped by one or more *decorator* expressions. Decorator expressions are evaluated when the function is defined, in the scope that contains the function definition. The result must be a callable, which is invoked with the function object as the only argument. The returned value is bound to the function name instead of the function object. Multiple decorators are applied in nested fashion. For example, the following code:

    @f1(arg)
    @f2
    def func(): pass

is equivalent to:

    def func(): pass
    func = f1(arg)(f2(func))
QQQ

Imo, this is a rare example where the most concise explanation is also the clearest and best. It is best because it does not deal with the blah blah blah of expectations. It implicitly says that one is free to use decorators in *any* way, subject only to the constraint that the decorator expression evaluates to a callable.
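A quick sanity check of the nested application the reference describes; `f1` and `f2` below are throwaway decorators invented for illustration, recording the order in which they are applied:

```python
applied = []

def f1(arg):
    # Evaluating the decorator expression f1(arg) produces the actual
    # decorator, 'deco', which is what gets applied to the function.
    def deco(obj):
        applied.append('f1(%s)' % arg)
        return obj
    return deco

def f2(func):
    applied.append('f2')
    return func

@f1('x')
@f2
def func():
    pass

# Innermost decorator first: f2 runs, then the callable from f1('x').
assert applied == ['f2', 'f1(x)']

# The manual spelling applies them in the same order:
applied = []

def func2():
    pass

func2 = f1('x')(f2(func2))
assert applied == ['f2', 'f1(x)']
```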
Failure of the decorator to evaluate to a callable of *some* kind is the only way to "abuse" a decorator, and the compiler will not allow such abuse :) In particular, there is no requirement that the callable be in *any* way related to func! The simplicity of decorators renders them neither useless nor uninteresting. Unlike tutorials, the reference does not tell how to implement, say, @trace. We are left with a sense of possibility. Edward ------------------------------------------------------------------- Edward K. Ream email: edreamleo at gmail.com Leo: http://webpages.charter.net/edreamleo/front.html -------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Sun Sep 6 02:22:34 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Sat, 5 Sep 2009 23:22:34 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: On Sat, Sep 5, 2009 at 7:51 AM, Darren Dale wrote: > I slept on this last night, and this morning when I reread your > original post I had to laugh at myself for taking so long and such a > winding route to understanding and processing all the points you > raised. So thanks for being patient with me. You guys can continue > discussing Einstein, I'll sit here and scratch my stomach. Sorry for laughing, but after scipy I stayed in SoCal for a few days with the family, and visited the San Diego zoo and wild animal park, and saw lots of primates. So the image of you scratching your stomach while more evolved behavior takes place next door is rather humorous ;) But don't beat yourself up: while there's nothing complex about what we're talking about, it is a slightly unusual usage of the language, so it's natural to do a double-take with it. 
I have the benefit of having worried about this problem for a long time, but it took me *many* tries to understand how to fit the pieces together. And I had the advantage of lots of help along the way:

- the first 'click' was a conversation with Eric Jones at Berkeley in late 2007, where he pointed out how 'with' could really be used for execution management, something they do a lot of with Enthought's context library (BlockCanvas, I think?)

- in March 2008, William Stein implemented @interact for Sage at the sprint at Enthought, using the 'call and consume' approach to the decorated function. On the flight back from that, I implemented for ipython the trick using 'with', which worked but was so nasty that I never really pursued it.

- in September 2008 at Scipy'08 I had a long talk about the problem with Alex Martelli on whether extending the context manager protocol with an __execute__ method to control the actual execution of the code would be feasible. This conversation was very enlightening, even though it made it fairly clear that the 'with' approach was probably doomed in the long run. Alex pointed out very clearly a few of the key issues regarding scoping that helped me a lot.

- then at SciPy'09 I had a talk with Peter Norvig again about the same problem, so I got the whole thing back in my head.

- and finally, John Siracusa's review at Ars Technica about Apple's work with anonymous blocks and Grand Central Dispatch made the whole thing click.

As you can see, if you're slow for taking a day to put it together, there's simply no hope for me: it took me almost 2 years, and I needed the help of some of the very brightest people in the python world to push me along. They are the ones who did all the thinking and deserve the credit, I was just thick enough never to understand the ideas until now!

> Do you think there is the possibility of building on this mechanism?
> I mean, you have a working demonstration using features that are already built into the language. Do you think it would be possible/worth looking into adding syntax to the python language that provided anonymous code blocks without having to define functions and decorate them (in the general sense, using either @deco or deco(foo))? I'm thinking of something similar to the "with" syntax that creates a scope and allows you to manipulate it. Is there any chance that the GIL issue could be addressed using such a mechanism?

I think there is, but we should first explore it more deeply. The reason I'm happy is that I see lots of potential here, and if we find solid, important uses, it will be *much* easier then to make a case to Guido and python-dev for a core language change in the future. Guido is careful with syntax (fortunately!), but he does listen if a tried and tested usage is presented, where the cost of new syntax is offset by genuine benefits.

But fortunately we can get almost everything we want today, even if with ugly syntax. I say almost because with Python 2.x there is at least one real annoyance: the inability to rebind non-local (but not global) names in an inner scope. This was fixed with the 'nonlocal' keyword in 3.0, but for 2.x the following won't work:

    def execute(func):
        return func()

    def simple(n):
        s = 0.0

        @execute
        def block():
            for i in range(n):
                s += i**2

        return s

because you get an unbound local error:

    In [13]: run simple
    [...]
    /home/fperez/research/code/contexts/simple.py in block()
         15     def block():
         16         for i in range(n):
    ---> 17             s += i**2
         18
         19     return s

    UnboundLocalError: local variable 's' referenced before assignment
    WARNING: Failure executing file:

In Python 3, this works great:

    In [17]: !python3.1 simple.py
    285.0

with the only change being the addition of nonlocal s to the block() definition. I guess this will be a motivation to move to 3.x if this idea turns out to be really useful...
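For the record, a minimal self-contained sketch of that Python 3 version (the same `execute` decorator, with the `nonlocal s` line added inside `block()`; the file layout is assumed, not copied from the original simple.py):

```python
def execute(func):
    # Immediate-consumption decorator: call the block right away and
    # bind its return value (None here) to the block's name.
    return func()

def simple(n):
    s = 0.0

    @execute
    def block():
        nonlocal s  # Python 3 only: rebind 's' from the enclosing scope
        for i in range(n):
            s += i ** 2

    return s

print(simple(10))  # sum of i**2 for i in 0..9 -> 285.0
```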
In any case, we'll need to see if real-world uses of this trick really pay off, develop some good libraries of decorators that pre-package good functionality, and then we can consider syntax extensions to the language itself. But for now, I think it is still a little premature to look in that direction, until experience shows us better what all the various problems and patterns of use need to be. Regards, f From fperez.net at gmail.com Sun Sep 6 02:26:00 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Sat, 5 Sep 2009 23:26:00 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: Oops, I forgot to address this: On Sat, Sep 5, 2009 at 11:22 PM, Fernando Perez wrote: >> scope and allows you to manipulate it. Is there any chance that the >> GIL issue could be addressed using such a mechanism? Certainly, but only a part of it and iff something like the unladen swallow project is successful. That's what Apple's GCD is, after all: they have no GIL in raw C/Objective C/C++, so they expose the GCD dispatch machinery for users to easily express any coarse parallelism they can find in their codes. In fact, if the gil went away, Snow Leopard and children would become a *really* nice environment to work in, since a set of GCD bindings would make it possible to write python code whose threads get the dynamic dispatching of GCD. I might even get a Mac! :) Cheers, f From fperez.net at gmail.com Sun Sep 6 02:36:50 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Sat, 5 Sep 2009 23:36:50 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: On Sat, Sep 5, 2009 at 2:52 PM, Edward K. Ream wrote: > On Fri, Sep 4, 2009 at 3:53 PM, Edward K. 
Ream wrote:

>> Also, it is in no way an abuse of decorators to use them in unexpected, unusual, creative ways, provided only that you are not relying on some undocumented accidental feature.
>
> Inspired by this thread, I decided to deepen my understanding of decorators. To state my conclusion first, to truly understand decorators it is a good idea to completely ignore pep 318 and all related tutorials :-)

Almost all :) I think Matthew Brett's tutorial (disclaimer: he's a good friend and colleague) is actually quite nice and to the point: https://cirl.berkeley.edu/mb312/data_docs/decorating_for_dummies.html though it does have the same misconception that just about every other document about decorators has, namely

"""a function, that takes a function as input, and ***returns a function*** """

The part between the asterisks above is not correct, and this is a subtle but critical point here. As you correctly cite in the ref guide:

> Decorator expressions are evaluated when the function is defined, in the scope that contains the function definition. The result must be a callable,

The result of the *decorator expression*, that is, *the line that starts with '@'*, must be a callable. But the result of evaluating *that* on the function afterwards need not be a callable at all, as we can easily see:

    In [18]: def funnydeco(func):
       ....:     return 'Hi, I am a decorator...'
       ....:

    In [19]: @funnydeco
       ....: def f(x):
       ....:     return x+1
       ....:

    In [20]: f(10)
    ---------------------------------------------------------------------------
    TypeError                                 Traceback (most recent call last)
    /home/fperez/research/code/contexts/simple.py in ()
    [...]
    TypeError: 'str' object is not callable

    In [21]: f
    Out[21]: 'Hi, I am a decorator...'

This means that our 'inline decorators' cannot be chained, since they are 'greedy': they consume the function they are meant to be applied to.
They can return the value of the called function though, which can be very useful, as seen here:

    def execute(func):
        return func()

    def simple2(n):
        @execute
        def s():
            c = 0.0
            for i in range(n):
                c += i**2
            return c
        return s

By returning a value in the block and later using the name of the block, we can feed back locals to the surrounding scope. This is the hack that 'nonlocal' in 3.x makes obsolete, but for now we'll have to make do with mutables or this trick.

> Imo, this is a rare example where the most concise explanation is also the clearest and best. It is best because it does not deal with the blah blah blah of expectations. It implicitly says that one is free to use decorators in *any* way, subject only to the constraint that the decorator expression evaluates to a callable. Failure of the decorator to evaluate to a callable of *some* kind is the only way to "abuse" a decorator, and the compiler will not allow such abuse :) In particular, there is no requirement that the callable be in *any* way related to func!

Yes, and that's what we're taking advantage of here. We'll see what good uses we can find as we work with the idea. Cheers, f

From prabhu at aero.iitb.ac.in Sun Sep 6 03:14:46 2009 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Sun, 06 Sep 2009 12:44:46 +0530 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: <4AA36166.4070604@aero.iitb.ac.in>

On 09/06/09 11:52, Fernando Perez wrote:
> But don't beat yourself up: while there's nothing complex about what we're talking about, it is a slightly unusual usage of the language, so it's natural to do a double-take with it. I have the benefit of having worried about this problem for a long time, but it took me *many* tries to understand how to fit the pieces together. And I had the advantage of lots of help along the way:

[...]
> - and finally, John Siracusa's review at Ars Technica about Apple's work with anonymous blocks and Grand Central Dispatch made the whole thing click.
>
> As you can see, if you're slow for taking a day to put it together, there's simply no hope for me: it took me almost 2 years, and I needed the help of some of the very brightest people in the python world to push me along. They are the ones who did all the thinking and deserve the credit, I was just thick enough never to understand the ideas until now!

Thanks for the interesting links and thread. Just FYI, last year I had occasion to solve, relatively elegantly, a set of pretty sticky problems for mayavi2 using decorators and generators. See for example:

https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/show.py
https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/animator.py

They allow us to do relatively simple but neat things very elegantly. Until you use decorators you often don't realize how convenient they can be. In particular, the pattern used in animator.py shows how convenient the combination of a UI dispatch mechanism plus a generator is. All a programmer needs is to inject a yield suitably and the rest is automatic.

The Kamaelia project (http://www.kamaelia.org) is also very interesting for its use of generators, microprocesses, components and, very specifically, concurrency. They have a very neat model for exactly this, and a nice though slightly elaborate tutorial showing how you can build their core library from these ideas. If you have the time it is very interesting.

My humble contribution to spending two more hours of your time.
;-)

cheers, prabhu

From dsdale24 at gmail.com Sun Sep 6 09:06:10 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Sun, 6 Sep 2009 09:06:10 -0400 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <4AA36166.4070604@aero.iitb.ac.in> References: <4AA36166.4070604@aero.iitb.ac.in> Message-ID:

Hi Prabhu,

On Sun, Sep 6, 2009 at 3:14 AM, Prabhu Ramachandran wrote:
> On 09/06/09 11:52, Fernando Perez wrote:
>> But don't beat yourself up: while there's nothing complex about what we're talking about, it is a slightly unusual usage of the language, so it's natural to do a double-take with it. I have the benefit of having worried about this problem for a long time, but it took me *many* tries to understand how to fit the pieces together. And I had the advantage of lots of help along the way:
>
> [...]
>
>> - and finally, John Siracusa's review at Ars Technica about Apple's work with anonymous blocks and Grand Central Dispatch made the whole thing click.
>>
>> As you can see, if you're slow for taking a day to put it together, there's simply no hope for me: it took me almost 2 years, and I needed the help of some of the very brightest people in the python world to push me along. They are the ones who did all the thinking and deserve the credit, I was just thick enough never to understand the ideas until now!
>
> Thanks for the interesting links and thread. Just FYI, last year I had occasion to solve, relatively elegantly, a set of pretty sticky problems for mayavi2 using decorators and generators. See for example:
>
> https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/show.py
> https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/animator.py
>
> They allow us to do relatively simple but neat things very elegantly. Until you use decorators you often don't realize how convenient they can be.
> In particular, the pattern used in animator.py shows how convenient the combination of a UI dispatch mechanism plus a generator is. All a programmer needs is to inject a yield suitably and the rest is automatic.
>
> The Kamaelia project (http://www.kamaelia.org) is also very interesting for its use of generators, microprocesses, components and, very specifically, concurrency. They have a very neat model for exactly this, and a nice though slightly elaborate tutorial showing how you can build their core library from these ideas. If you have the time it is very interesting.
>
> My humble contribution to spending two more hours of your time. ;-)

Thanks for the pointer. The Pipeline and Graphline remind me of LabVIEW, or rather of how I would prefer to develop LabVIEW-like applications in python.

I had to look in the older document structure to find a discussion on concurrency; here it is for anyone who is interested: http://www.kamaelia.org/Docs/Axon/Axon.Microprocess.html

From dsdale24 at gmail.com Sun Sep 6 09:31:04 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Sun, 6 Sep 2009 09:31:04 -0400 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: <4AA36166.4070604@aero.iitb.ac.in> Message-ID:

On Sun, Sep 6, 2009 at 9:06 AM, Darren Dale wrote:
> Hi Prabhu,
>
> On Sun, Sep 6, 2009 at 3:14 AM, Prabhu Ramachandran wrote:
>> On 09/06/09 11:52, Fernando Perez wrote:
>>> But don't beat yourself up: while there's nothing complex about what we're talking about, it is a slightly unusual usage of the language, so it's natural to do a double-take with it. I have the benefit of having worried about this problem for a long time, but it took me *many* tries to understand how to fit the pieces together. And I had the advantage of lots of help along the way:
>>
>> [...]
>>> - and finally, John Siracusa's review at Ars Technica about Apple's work with anonymous blocks and Grand Central Dispatch made the whole thing click.
>>>
>>> As you can see, if you're slow for taking a day to put it together, there's simply no hope for me: it took me almost 2 years, and I needed the help of some of the very brightest people in the python world to push me along. They are the ones who did all the thinking and deserve the credit, I was just thick enough never to understand the ideas until now!
>>
>> Thanks for the interesting links and thread. Just FYI, last year I had occasion to solve, relatively elegantly, a set of pretty sticky problems for mayavi2 using decorators and generators. See for example:
>>
>> https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/show.py
>> https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/animator.py
>>
>> They allow us to do relatively simple but neat things very elegantly. Until you use decorators you often don't realize how convenient they can be. In particular, the pattern used in animator.py shows how convenient the combination of a UI dispatch mechanism plus a generator is. All a programmer needs is to inject a yield suitably and the rest is automatic.
>>
>> The Kamaelia project (http://www.kamaelia.org) is also very interesting for its use of generators, microprocesses, components and, very specifically, concurrency. They have a very neat model for exactly this, and a nice though slightly elaborate tutorial showing how you can build their core library from these ideas. If you have the time it is very interesting.
>>
>> My humble contribution to spending two more hours of your time. ;-)
>
> Thanks for the pointer. The Pipeline and Graphline remind me of LabVIEW, or rather of how I would prefer to develop LabVIEW-like applications in python.
>
> I had to look in the older document structure to find a discussion on concurrency; here it is for anyone who is interested:
> http://www.kamaelia.org/Docs/Axon/Axon.Microprocess.html

They talk a lot about microprocesses, and in this pdf (1) they mention multi-core computers, but I think their terminology unfortunately confuses processes with threads. From the looks of their trunk, they are using the threading package for "concurrency". They are aware of the GIL, however (2).

(1) http://www.kamaelia.org/t/TN-LightTechnicalIntroToKamaelia.pdf
(2) http://groups.google.com/group/kamaelia/browse_thread/thread/23b203b183a0c3e4

From prabhu at aero.iitb.ac.in Sun Sep 6 10:31:19 2009 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Sun, 06 Sep 2009 20:01:19 +0530 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: <4AA36166.4070604@aero.iitb.ac.in> Message-ID: <4AA3C7B7.2030800@aero.iitb.ac.in>

Darren Dale wrote:
> On Sun, Sep 6, 2009 at 9:06 AM, Darren Dale wrote:
>>> The Kamaelia project (http://www.kamaelia.org) is also very interesting for its use of generators, microprocesses, components and, very specifically, concurrency. They have a very neat model for exactly this, and a nice though slightly elaborate tutorial showing how you can build their core library from these ideas. If you have the time it is very interesting.
>>>
>>> My humble contribution to spending two more hours of your time. ;-)
>> Thanks for the pointer. The Pipeline and Graphline remind me of LabVIEW, or rather of how I would prefer to develop LabVIEW-like applications in python.

Indeed. The only problem when I looked at the architecture a while ago was that it wasn't suitable for VTK pipelines.
I don't recall the details of why I felt that way now, though.

>> I had to look in the older document structure to find a discussion on concurrency; here it is for anyone who is interested:
>> http://www.kamaelia.org/Docs/Axon/Axon.Microprocess.html
>
> They talk a lot about microprocesses, and in this pdf (1) they mention multi-core computers, but I think their terminology unfortunately confuses processes with threads. From the looks of their trunk, they are using the threading package for "concurrency". They are aware of the GIL, however (2).

I had looked at this a long while ago when trying to come up with a nice way to build pipelines in general. At the time they didn't have anything for actually using multiple processors or threads, but the ideas were very interesting to me, especially their use of generators.

cheers, prabhu

From gokhansever at gmail.com Sun Sep 6 14:56:00 2009 From: gokhansever at gmail.com (Gökhan Sever) Date: Sun, 6 Sep 2009 13:56:00 -0500 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> Message-ID: <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com>

On Mon, Aug 31, 2009 at 5:43 PM, Brian Granger wrote:

> Hello all,
>
> This email is being sent out to the lists of users+devs who regularly use IPython's "pylab" mode or "-wthread", "-qthread", "-gthread", etc. threaded shells. As of today, in IPython's trunk, we have a completely new implementation of our GUI event loop integration that dramatically improves the stability of using the TERMINAL BASED IPython with GUI applications. This does not affect attempts to embed IPython into GUI applications.
>
> At this point, we need developers to begin to try out the new stuff and adapt their projects to use the new capabilities.
> Here are some things you will get:
>
> * Stability and robustness have been improved greatly.
> * KeyboardInterrupts should work on all platforms reliably.
> * No more command line flags - instead everything can be activated/de-activated/switched at runtime. This should allow projects like matplotlib to enable reliable backend switching. See the new %gui magic for more information on this.
> * We have a new developer module for working with these features (IPython.lib.inputhook).
> * Unless someone complains very loudly *and* steps up to the plate to maintain them, the old threaded shells will be removed in the next release of IPython.
>
> Here are some starting points for documentation on the new features:
>
> http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/docs/source/interactive/reference.txt#L1375
> http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/IPython/lib/inputhook.py
> http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/IPython/core/magic.py#L3542
>
> Please let us know if you have questions - we are more than willing to help you get started with all of this.
>
> Cheers,
>
> Brian
>
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-dev

Two questions on this change:

1-) No more auto explicit numpy.core and matplotlib.pyplot load into the visible namespace, right? This was a handy functionality to make quick tests, without a need for imports. I am sure this could be remedied by putting proper statements somewhere into the config file; however, with the switches removed, how do I instantiate IPython telling it that I want matplotlib and numpy functionality in my shell?

2-) No thread options gone, each time we issue a plot directive, a show() must be explicitly stated.
I will miss this lazy way of programming :)

3-) What are the visible ad[dis]vantages of these changes to a simple user :) Those who don't integrate IPython into a GUI application and don't change backends very frequently.

Thanks -- Gökhan -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Tue Sep 8 04:15:18 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 01:15:18 -0700 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> Message-ID: Hey Gokhan, On Sun, Sep 6, 2009 at 11:56 AM, Gökhan Sever wrote: > > 1-) No more auto explicit numpy.core and matplotlib.pyplot load into the > visible namespace, right? This was a handy functionality to make quick tests, > without a need for imports. I am sure this could be remedied by putting proper > statements somewhere into the config file; however, with the switches removed, > how do I instantiate IPython telling it that I want matplotlib and numpy > functionality in my shell? > > 2-) No thread options gone, each time we issue a plot directive, a show() > must be explicitly stated. I will miss this lazy way of programming :) No, when the dust settles, we'll have an interactive -pylab mode that will work just like today, except without the mysterious Ctrl-C-related crashes that are so easy to induce today with Wx. It's just that we are in the middle of major changes, and not all the pieces have landed yet. > 3-) What are the visible ad[dis]vantages of these changes to a simple user > :) Those who don't integrate IPython into a GUI application and changing > backends very frequently. - That Ctrl-C will actually do something sensible, without potentially exploding in your face depending on the timing of your input regarding what the GUI was doing.
- That tab-completion with Mayavi running should be less potentially problematic. There may still be attribute access bugs in the VTK wrapping, but at least bugs in python's readline module that are thread-related (we've been hit by that in the past) won't affect us anymore.

- That you will be able to switch between Wx/GTK/Qt at runtime. I can imagine this being very useful for certain testing patterns.

Cheers, f From hans_meine at gmx.net Tue Sep 8 05:15:07 2009 From: hans_meine at gmx.net (Hans Meine) Date: Tue, 8 Sep 2009 11:15:07 +0200 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control Message-ID: <200909081115.07578.hans_meine@gmx.net> On Sunday 06 September 2009 09:14:46 Prabhu Ramachandran wrote: > Thanks for the interesting links and thread. Just FYI, last year I had > occasion to solve, relatively elegantly, a set of pretty sticky problems > for mayavi2 using decorators and generators. [...] > They allow us to do relatively simple but neat things very elegantly. Seconded. > The Kamaelia project (http://www.kamaelia.org) is also very interesting > for its use of generators, microprocesses, components and very > specifically concurrency. Let me throw kaa into the pot, which is a set of libraries for media programming, but kaa.base contains really interesting coroutine-stuff! Let me quote http://doc.freevo.org/2.0/Kaa: > The kaa framework includes a mainloop facility with an API for signals and > callbacks, timers, process and thread management, file descriptor monitoring > (with INotify support), inter-process communication, as well as a rich, > practically magical API for asynchronous programming (see > http://doc.freevo.org/2.0/SourceDoc/Async) Maybe the threaded decorator is similar to what Fernando has in mind: (I thought I mentioned kaa here before, but I guess I did not give any code examples..)

@kaa.threaded()
def do_blocking_task():
    [...]
    return 42

@kaa.coroutine()
def do_something_else():
    try:
        result = yield do_blocking_task()
    except:
        print "Exception raised in thread"
    print "Thread returned", result

Have a nice day, Hans From gael.varoquaux at normalesup.org Tue Sep 8 08:26:07 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 8 Sep 2009 14:26:07 +0200 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: <20090908122607.GB25829@phare.normalesup.org> On Sat, Sep 05, 2009 at 11:36:50PM -0700, Fernando Perez wrote: > though it does have the same misconception that just about every other > document about decorators has, namely > """a function, that takes a function as input, and ***returns a > function*** """ > The part between ** above is not correct, and this is a subtle but > critical point here. Actually, I'd like to jump in here. This is slightly off-topic, but I believe of interest to this mailing list. I have recently been revisiting my decoration code, to fight a common mistake I had been making, and it was partly due to the heavy use of a simplified pattern for decorating that underlies the quote above.

The pattern
------------

________________________________________________________________________________
def with_print(func):
    """ Decorate a function to print its arguments.
    """
    def my_func(*args, **kwargs):
        print args, kwargs
        return func(*args, **kwargs)
    return my_func

@with_print
def f(x):
    print 'f called'
________________________________________________________________________________

The nice thing about this pattern is that it is quite easy to type, and to read.

Why it is harmful
------------------

The decorated function is actually the function 'my_func', with a reference to the original function 'func', a part of the scope of the decorator 'with_print', and thus in the closure of the with_print function. The problem is that we have a closure here.
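The failure is easy to demonstrate in a self-contained snippet (Python 3 syntax here, unlike the thread's Python 2 examples; the print call is elided):

```python
import pickle

def with_print(func):
    """The closure-based pattern above, with the print call elided."""
    def my_func(*args, **kwargs):
        return func(*args, **kwargs)
    return my_func

@with_print
def f(x):
    return 2 * x

def g(x):
    """An undecorated module-level function, for comparison."""
    return 2 * x

# g pickles by reference to its import path, and survives a round trip:
print(pickle.loads(pickle.dumps(g))(3))  # -> 6

# f is really with_print.<locals>.my_func, which has no import path:
try:
    pickle.dumps(f)
    print("f pickled")
except Exception:
    print("f did not pickle")
```

The round trip for the decorated function succeeds once the decorator is rewritten with functools.wraps, as discussed below.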
Thus we have a variable that is hard to get to (the undecorated function), and the decorated function is not picklable (which is more and more important to me, e.g. for parallel computing).

Solutions
-----------

Avoiding the closure
.....................

Use objects as a scope, rather than a closure:

________________________________________________________________________________
class WithPrint(object):

    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print args, kwargs
        return self.func(*args, **kwargs)
________________________________________________________________________________

This solution is not enough: the following code won't pickle:

________________________________________________________________________________
@WithPrint
def g(x):
    print 'g called'
________________________________________________________________________________

The reason this won't pickle is that we have a name collision: the code above expands to:

________________________________________________________________________________
def g(x):
    print 'g called'

g = WithPrint(g)
________________________________________________________________________________

and trying to pickle raises the following PicklingError: Can't pickle <function g ...>: it's not the same object as __main__.g

If we do:

________________________________________________________________________________
def g(x):
    print 'g called'

h = WithPrint(g)
________________________________________________________________________________

we can pickle h, hurray!

Using functools.wraps
......................

However, Python comes with the answer in the standard library: functools.wraps does the name unmangling. Thus the following code produces a picklable f:

________________________________________________________________________________
from functools import wraps

def with_print(func):
    """ Decorate a function to print its arguments.
    """
    @wraps(func)
    def my_func(*args, **kwargs):
        print args, kwargs
        return func(*args, **kwargs)
    return my_func

@with_print
def f(x):
    print 'f called'
________________________________________________________________________________

The pickling works simply because using functools.wraps resets the .func_name attribute of f to have a well-defined import path. Thus pickling works, simply by storing the import path, as all pickling of functions does. Notice that there is only a one-line difference with the original code! I actually tend to use a combination of both solutions (an object, using functools.wraps), to keep a reference to the undecorated function.

Take home messages
-------------------

- Decorators can be more clever than you think, and may not return objects as simple as you think
- Think about pickling, or you'll get bitten at some point

and most important:

- Use functools.wraps

Sorry for going off-topic, Gaël -------------- next part -------------- A non-text attachment was scrubbed... Name: tmp.py Type: text/x-python Size: 1323 bytes Desc: not available URL: From matthieu.brucher at gmail.com Tue Sep 8 08:34:50 2009 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Tue, 8 Sep 2009 14:34:50 +0200 Subject: [IPython-dev] Before a patch for LSF support In-Reply-To: References: <6ce0ac130908120258y66547ee5x77b4453beae5fecc@mail.gmail.com> <6ce0ac130908120917m6dd5a68fq67bd22fade7ce47@mail.gmail.com> <6ce0ac130908130110y7402edfbs18480f22c1fd4092@mail.gmail.com> Message-ID: > Excellent ! I saw a mail of Fernando on this. I will need some time to > help you with this. > > Meanwhile, I will try to use ipcluster from a node to another LSF > node. I have still some issues with the IP address: when I'm starting > ipcluster from a compute node, the furl points to 127.0.0.1 instead of > the name of the host or at least its public IP address. Hi, I'm trying to get to this issue again.
I've started from a remote node, but I still have the issue with the Connection refused, and this is the moment where the LSF job actually crashes (it is running before it tries to connect). Cheers, Matthieu -- Information System Engineer, Ph.D. Website: http://matthieu-brucher.developpez.com/ Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn: http://www.linkedin.com/in/matthieubrucher From dsdale24 at gmail.com Tue Sep 8 10:00:32 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Tue, 8 Sep 2009 10:00:32 -0400 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <200909081115.07578.hans_meine@gmx.net> References: <200909081115.07578.hans_meine@gmx.net> Message-ID: On Tue, Sep 8, 2009 at 5:15 AM, Hans Meine wrote: > On Sunday 06 September 2009 09:14:46 Prabhu Ramachandran wrote: >> Thanks for the interesting links and thread. Just FYI, last year I had >> occasion to solve, relatively elegantly, a set of pretty sticky problems >> for mayavi2 using decorators and generators. [...] >> They allow us to do relatively simple but neat things very elegantly. > > Seconded. > >> The Kamaelia project (http://www.kamaelia.org) is also very interesting >> for its use of generators, microprocesses, components and very >> specifically concurrency. > > Let me throw kaa into the pot, which is a set of libraries for media > programming, but kaa.base contains really interesting coroutine-stuff!
> Let me quote http://doc.freevo.org/2.0/Kaa: >> The kaa framework includes a mainloop facility with an API for signals and >> callbacks, timers, process and thread management, file descriptor monitoring >> (with INotify support), inter-process communication, as well as a rich, >> practically magical API for asynchronous programming (see >> http://doc.freevo.org/2.0/SourceDoc/Async) > > Maybe the threaded decorator is similar to what Fernando has in mind: > (I thought I mentioned kaa here before, but I guess I did not give any code > examples..)
>
> @kaa.threaded()
> def do_blocking_task():
>     [...]
>     return 42

I think the pattern Fernando introduced would either throw away the return value or it would rebind it to do_blocking_task, depending on how kaa.threaded is implemented. Darren From gokhansever at gmail.com Tue Sep 8 11:22:58 2009 From: gokhansever at gmail.com (=?UTF-8?Q?G=C3=B6khan_Sever?=) Date: Tue, 8 Sep 2009 10:22:58 -0500 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> Message-ID: <49d6b3500909080822l61bf535ekb4e662e024587e9b@mail.gmail.com> On Tue, Sep 8, 2009 at 3:15 AM, Fernando Perez wrote: > Hey Gokhan, > > On Sun, Sep 6, 2009 at 11:56 AM, Gökhan Sever > wrote: > > > > 1-) No more auto explicit numpy.core and matplotlib.pyplot load into the > > visible namespace, right? This was a handy functionality to make quick > tests, > > without a need for imports. I am sure this could be remedied by putting > proper > > statements somewhere into the config file; however, with the switches > removed, > > how do I instantiate IPython telling it that I want matplotlib and numpy > > functionality in my shell? > > > > 2-) No thread options gone, each time we issue a plot directive, a show() > > must be explicitly stated.
I will miss this lazy way of programming :) > > No, when the dust settles, we'll have an interactive -pylab mode that > will work just like today, except without the mysterious > Ctrl-C-related crashes that are so easy to induce today with Wx. It's > just that we are in the middle of major changes, and not all the > pieces have landed yet. > > > 3-) What are the visible ad[dis]vantages of these changes to a simple > user > > :) Those who don't integrate IPython into a GUI application and changing > > backends very frequently. > > - That Ctrl-C will actually do something sensible, without potentially > exploding in your face depending on the timing of your input regarding > what the GUI was doing. > > - That tab-completion with Mayavi running should be less potentially > problematic. There may still be attribute access bugs in the VTK > wrapping, but at least bugs in python's readline module that are > thread-related (we've been hit by that in the past) won't affect us > anymore. > > - That you will be able to switch between Wx/GTK/Qt at runtime. I can > imagine this being very useful for certain testing patterns. > > Cheers, > > f > Thanks for the explanations Fernando. I am happy to hear that pylab will be back soon :) In the following test, when I issue a plt.show() I can't get access to the IPython shell unless I close the plot window. Moreover, if I call plt.show() without a plot command beforehand, I have to kill the shell and restart the session again. Seems like a bug or something going wrong with my system. This is Fedora 11.

$ ipython
Python 2.6 (r26:66714, Jun 8 2009, 16:07:26)
Type "copyright", "credits" or "license" for more information.
IPython 0.11.bzr.r1205 -- An enhanced Interactive Python.

I[1]: import matplotlib.pyplot as plt
I[2]: %gui qt
I[3]: plt.plot(range(10))
O[3]: []
I[4]: plt.show()

-- Gökhan -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ellisonbg.net at gmail.com Tue Sep 8 12:42:01 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 09:42:01 -0700 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> Message-ID: <6ce0ac130909080942n49e79393vc5f01c468b14c68f@mail.gmail.com> > Two questions on this change: > > 1-) No more auto explicit numpy.core and matplotlib.pyplot load into the > visible namespace, right? This was a handy functionality to make quick tests, > without a need for imports. I am sure this could be remedied by putting proper > statements somewhere into the config file; however, with the switches removed, > how do I instantiate IPython telling it that I want matplotlib and numpy > functionality in my shell? > > As Fernando mentions, something like the old -pylab switch will be there. My own preference is to make it a magic like:

In [1]: %pylab

To emphasize that it is not something you *have* to choose at startup. But we will definitely have something like this. > 2-) No thread options gone, each time we issue a plot directive, a show() > must be explicitly stated. I will miss this lazy way of programming :) > > This will be fixed as matplotlib adds code that takes advantage of the new capabilities. Currently, that matplotlib works with this new approach is almost chance. But, once matplotlib uses this new stuff, we can make sure that everything works exactly as everyone wants and is used to. > 3-) What are the visible ad[dis]vantages of these changes to a simple user > :) Those who don't integrate IPython into a GUI application and changing > backends very frequently. > > These changes don't affect IPython being embedded into a GUI, only the terminal based IPython's integration with GUI event loops.
But, the main differences to a simple user are:

* Don't have to remember to do -pylab when you start IPython to do plotting. At any point after starting IPython, you can enable this.
* Control-C works robustly on all platforms and all GUI toolkits - stability, yah! Some of our users couldn't use certain GUI toolkits on certain platforms (Fernando was in this situation with wx+linux) because of these problems.
* As matplotlib integrates this stuff, they will be able to offer richer and more stable APIs for interactive plotting.

Cheers, Brian > Thanks > -- > Gökhan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg.net at gmail.com Tue Sep 8 12:46:56 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 09:46:56 -0700 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: <49d6b3500909080822l61bf535ekb4e662e024587e9b@mail.gmail.com> References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> <49d6b3500909080822l61bf535ekb4e662e024587e9b@mail.gmail.com> Message-ID: <6ce0ac130909080946n54267c70j461fa188ce6ac952@mail.gmail.com> > In the following test, when I issue a plt.show() I can't get access to the > IPython shell unless I close the plot window. Moreover, if I call plt.show() > without a plot command beforehand, I have to kill the shell and restart the > session again. > > Seems like a bug or something going wrong with my system. > > This is Fedora 11. > > Can you retry using %gui -a qt? The -a flag tells IPython to create an application object, and then show should work as expected. Again, this is an issue with the event loop hacks that matplotlib currently does. The current show function in matplotlib looks to see if a qt app has been created; if not, it creates one and starts the event loop (which blocks). This used to work because IPython used to monkey patch the wx/qt/gtk event loops to be no-ops.
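What "monkey patching the event loops to be no-ops" amounts to can be pictured with a toy stand-in for the GUI application class (FakeApp here is purely illustrative, not a real toolkit class; IPython patched the actual wx/qt/gtk entry points):

```python
class FakeApp:
    """Toy stand-in for a GUI application object (illustrative only)."""
    def exec_(self):
        # A real toolkit would block here until the last window closes.
        raise RuntimeError("event loop would block")

def make_eventloop_noop(app_cls):
    """Replace the blocking event-loop entry point with a no-op,
    roughly what the old threaded shells did to wx/qt/gtk."""
    app_cls.exec_ = lambda self: 0  # return immediately instead of blocking

make_eventloop_noop(FakeApp)
print(FakeApp().exec_())  # -> 0: show() returns at once, shell stays usable
```

With the patch in place, matplotlib's call into the event loop returns immediately, which is why the old threaded shells kept the prompt responsive after show().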
These hacks are based on the older threading approach and once matplotlib updates their code, all of this will "just work." Could you bring these issues up on the matplotlib list? Cheers, Brian

> $ ipython
> Python 2.6 (r26:66714, Jun 8 2009, 16:07:26)
> Type "copyright", "credits" or "license" for more information.
> IPython 0.11.bzr.r1205 -- An enhanced Interactive Python.
>
> I[1]: import matplotlib.pyplot as plt
> I[2]: %gui qt
> I[3]: plt.plot(range(10))
> O[3]: []
> I[4]: plt.show()
>
> -- > Gökhan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg.net at gmail.com Tue Sep 8 12:49:37 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 09:49:37 -0700 Subject: [IPython-dev] Before a patch for LSF support In-Reply-To: References: <6ce0ac130908120258y66547ee5x77b4453beae5fecc@mail.gmail.com> <6ce0ac130908120917m6dd5a68fq67bd22fade7ce47@mail.gmail.com> <6ce0ac130908130110y7402edfbs18480f22c1fd4092@mail.gmail.com> Message-ID: <6ce0ac130909080949wd076e5fh946febd26729ffc7@mail.gmail.com> On Tue, Sep 8, 2009 at 5:34 AM, Matthieu Brucher wrote: > > Excellent ! I saw a mail of Fernando on this. I will need some time to > > help you with this. > > > > Meanwhile, I will try to use ipcluster from a node to another LSF > > node. I have still some issues with the IP address: when I'm starting > > ipcluster from a compute node, the furl points to 127.0.0.1 instead of > > the name of the host or at least its public IP address. > > Hi, > > I'm trying to get to this issue again. I've started from a remote > node, but I still have the issue with the Connection refused, and this > is the moment where the LSF job actually crashes (it is running before > it tries to connect). > > Matthieu, Sorry, I am a bit foggy on this one. Can you refresh my memory and describe exactly which iteration you are trying? Brian > Cheers, > > Matthieu > -- > Information System Engineer, Ph.D.
> Website: http://matthieu-brucher.developpez.com/ > Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 > LinkedIn: http://www.linkedin.com/in/matthieubrucher > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg.net at gmail.com Tue Sep 8 13:23:16 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 10:23:16 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> Hi, Sorry I missed this one. This is a very nice idea! With respect to using decorators like this, I like the "you can't do that....oh, wait, yes you can" feel of it. Very creative. I don't have time to read through the entire thread, but I did skim most of it and have some comments. One of the big issues we have run into with the parallel computing stuff in IPython is that it is very tough to avoid typing code in strings. Then you send the string to a different execution context and it does exec in its namespace. Here is an example:

mec.push(dict(a=10))
mec.execute('b = 2*a')

Fernando's original work on using "with" for stuff like this was to try to get something that allowed you to just type your code:

a = 10
with remote_engine:
    b = 2*a

While this is much nicer, you do need some code to grab the value of a from the top-level and push it over to the execution context. I guess you could also pass it to the context, though, like remote_engine(dict(a=10)). With Fernando's new idea, this example would read:

@remote(dict(a=10))
def foo():
    b = 2*a

While I wish we could get rid of the 2 line header (@remote...def foo), this is a pretty nice way of abstracting this.
No code in strings, and a simple way of passing in variables. My only complaint is that it is a bit unexpected that this actually declares and *calls* the function! But renaming remote to call_remote or something would help. I am going to start working on the parallel stuff in about 2 weeks, and I will revisit this then. It shouldn't be too difficult to implement some nice things using this pattern. Cheers, Brian On Fri, Sep 4, 2009 at 1:31 AM, Fernando Perez wrote: > Hi all, > > I know I should have been hard at work on continuing the branch review > work I have in an open tab of Brian's push, but I couldn't resist. > Please bear with me, this is a bit technical but, I hope, very > interesting in the long run for us... > > This part of Ars Technica's excellent review of Snow Leopard: > > http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/13 > > shows how Apple tackled the problem of providing civilized primitives > to express parallelism in applications and a mechanism to make useful > decisions based on this information. The idea is to combine a > kernel-level dispatcher (GCD), a beautiful extension to the C language > (yes, they extended C!) in the form of anonymous code blocks, and an > API to inform GCD of your parallelism breakup easily, so GCD can use > your intentions at runtime efficiently. It's really a remarkably > simple, yet effective (IMHO) combination of tools to tackle a thorny > problem. > > In any case, what does all this have to do with us? For a long time > we've wondered about how to provide the easiest, simplest APIs that > appear natural to the user, that are easy to convert into serial > execution mode trivially (with a simple global switch for debugging, > NOT changing any actual code everywhere), and that can permit > execution via ipython.
A while ago I hacked something via 'with' and > context managers, that was so horrible and brittle (it involved stack > manipulations, manual source introspection and exception injection) > that I realized that could never really fly for production work. > > But this article on GCD got me trying my 'with' approach again, and I > realized that syntactically it felt quite nice, I could write python > versions of the code examples in that review, yet the whole 'with' > mess killed it for me. And then it hit me that decorators could be > abused just a little bit to get the same job done [1]! While this may > be somewhat of an abuse, it does NOT involve source introspection or > stack manipulations, so in principle it's 100% kosher, robust python. > A little weird the first time you see it, but bear with me. > > The code below shows an implementation of a simple for loop directly > and via a decorator. Both versions do the same thing, but the point > is that by providing such decorators, we can *trivially* provide a > GCD-style API for users to express their parallelism and have > execution chunks handled by ipython remotely. > > It's obvious that such decorators can also be used to dispatch code to > Cython, to a GPU, to a CorePy-based optimizer, to a profiler, etc. I > think this could be a useful idea in more than one context, and it > certainly feels to me like one of the missing API/usability pieces > we've struggled with for the ipython distributed machinery. > > Cheers, > > f > > [1] What clicked in my head was tying the 'with' mess to how the Sage > notebook uses the @interact decorator to immediately call the > decorated function rather than decorating it and returning it. This > immediate-consumption (ab)use of a decorator is what I'm using. 
> ### CODE example
>
> # Consider a simple pair of 'loop body' and 'loop summary' functions:
> def do_work(data, i):
>     return data[i]/2
>
> def summarize(results, count):
>     return sum(results[:count])
>
> # And some 'dataset' (here just a list of 10 numbers)
> count = 10
> data = [3.0*j for j in range(count)]
>
> # That we'll process. This is our processing loop, implemented as a regular
> # serial function that preallocates storage and then goes to work.
> def loop_serial():
>     results = [None]*count
>
>     for i in range(count):
>         results[i] = do_work(data, i)
>
>     return summarize(results, count)
>
> # The same thing can be done with a decorator:
> def for_each(iterable):
>     """This decorator-based loop does a normal serial run.
>     But in principle it could be doing the dispatch remotely, or into a thread
>     pool, etc.
>     """
>     def call(func):
>         map(func, iterable)
>
>     return call
>
> # This is the actual code of the decorator-based loop:
> def loop_deco():
>     results = [None]*count
>
>     @for_each(range(count))
>     def loop(i):
>         results[i] = do_work(data, i)
>
>     return summarize(results, count)
>
> # Test
> assert loop_serial() == loop_deco()
> print 'OK'
>
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-dev

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From fperez.net at gmail.com Tue Sep 8 13:33:15 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 10:33:15 -0700 Subject: [IPython-dev] New GUI integration in IPython In-Reply-To: <6ce0ac130909080942n49e79393vc5f01c468b14c68f@mail.gmail.com> References: <6ce0ac130908311543o3cfc3ddbm9862d9523743fc00@mail.gmail.com> <49d6b3500909061156x7482c277s306d33d42fb0ea7b@mail.gmail.com> <6ce0ac130909080942n49e79393vc5f01c468b14c68f@mail.gmail.com> Message-ID: On Tue, Sep 8, 2009 at 9:42 AM, Brian Granger wrote: > As Fernando mentions, something like the old -pylab switch with be there. > My own preference is to make it a magic like: > > In [1]: %pylab > > To emphasize that it is not something you *have* to choose at startup.? But > we will definitely have something like this. Yes, as you point out this is another *major* win I forgot to stress in my reply. I can't remember how many times in the past I've been bitten by this (long running session started without -pylab, now I need to plot something, don't want to lose my session... argh!). I just think (I made a similar comment in my branch review) that having a command-line flag also is a good thing for some of these things, because then people have the convenience of aliasing 'pylab==ipython -pylab', for example. Since pylab is such a common use and we have a lot of existing documentation out there with that pattern, it's probably worth preserving the convenience flag. But the key point here, as you point out, is how this becomes all just one more run-time function, instead of the nasty special-case startup hack we had before. 
Cheers, f From fperez.net at gmail.com Tue Sep 8 13:46:43 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 10:46:43 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <4AA36166.4070604@aero.iitb.ac.in> References: <4AA36166.4070604@aero.iitb.ac.in> Message-ID: On Sun, Sep 6, 2009 at 12:14 AM, Prabhu Ramachandran wrote: > Thanks for the interesting links and thread. Just FYI, last year I had > occasion to solve, relatively elegantly, a set of pretty sticky problems for > mayavi2 using decorators and generators. See for example: > > https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/show.py > https://svn.enthought.com/enthought/browser/Mayavi/trunk/enthought/mayavi/tools/animator.py > Very nice code, thanks for the pointers. I particularly like how you handled (at the inevitable cost of some extra code in your decorators) the slight annoyance that typical decorators whose arguments are optional need to be called in the no-arg case as

@foo()
def bar():
    ...

so the first call to foo() resolves out. Your code permits instead the cleaner-looking (for the user at least):

@foo(x, y)  # with args
def bar():
    ...

@foo        # foo without args
def baz():
    ...

> They allow us to do relatively simple but neat things very elegantly. Until > you use decorators you often don't realize how convenient they can be. In > particular the pattern used in animator.py shows how convenient the > combination of a UI dispatch mechanism plus a generator is. All a > programmer needs is to inject a yield suitably and the rest is automatic. > > The Kamaelia project (http://www.kamaelia.org) is also very interesting for > its use of generators, microprocesses, components and very specifically > concurrency. They have a very neat model for exactly this and a nice though > slightly elaborate tutorial showing how you can build their core library > from these ideas.
> If you have the time it is very interesting.

I remember reading about Kamaelia a while ago, and thinking that it felt like a really well laid out and intelligently done project. I should probably revisit it, thanks for the reminder.

> My humble contribution to spending two more hours of your time. ;-)

Well, I didn't bite this time :) It really was critical that I push on reviewing Brian's excellent recent work to avoid bottlenecking him any further, so my Labor Day holiday was productively spent on ipython yesterday (and sphinx exercises, for a paper). I'm glad I got it done, because I missed a trip to the Monterey aquarium with my family and the pictures they brought back were really nice, so at least I didn't stay home for nothing :)

Cheers, f From fperez.net at gmail.com Tue Sep 8 13:50:24 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 10:50:24 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <200909081115.07578.hans_meine@gmx.net> References: <200909081115.07578.hans_meine@gmx.net> Message-ID: Hi Hans! On Tue, Sep 8, 2009 at 2:15 AM, Hans Meine wrote:

> On Sunday 06 September 2009 09:14:46 Prabhu Ramachandran wrote:
> Let me throw kaa into the pot, which is a set of libraries for media
> programming, but kaa.base contains really interesting coroutine-stuff!
> Let me quote http://doc.freevo.org/2.0/Kaa:
>> The kaa framework includes a mainloop facility with an API for signals and
>> callbacks, timers, process and thread management, file descriptor monitoring
>> (with INotify support), inter-process communication, as well as a rich,
>> practically magical API for asynchronous programming (see
>> http://doc.freevo.org/2.0/SourceDoc/Async)

Thanks a lot for the pointer to that! I think this little thread did indeed prove to be a productive exercise.
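Earlier in this thread, Fernando praised Prabhu's handling of decorators whose arguments are optional, i.e. decorators usable both as a bare @foo and as a called @foo(...). The pattern can be sketched in a few lines; the names tag and label, and the returned (label, value) pairs, are purely illustrative, not code from mayavi2:

```python
import functools

def tag(arg=None, label="default"):
    """Decorator usable both as @tag and as @tag(label=...).

    Used bare, `arg` receives the decorated function itself; used with
    a call, `arg` is None (or a positional label) and we return the
    real decorator instead.
    """
    def make(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return (label, func(*args, **kwargs))
        return wrapper
    if callable(arg):        # bare @tag: arg is the decorated function
        return make(arg)
    if arg is not None:      # @tag("some-label") with a positional label
        label = arg
    return make              # @tag() / @tag(label=...): decorate the next def

@tag
def bar():
    return 1

@tag("custom")
def baz():
    return 2

print(bar())   # ('default', 1)
print(baz())   # ('custom', 2)
```

The cost Fernando mentions is visible here: the decorator body must inspect its first argument to decide whether it was handed a function or a configuration value.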
As we dig out of Brian's major foundational rework, seeing what others have done with this type of problem, with the benefit of more understanding and patterns in our heads to think about it, will be very productive.

> Maybe the threaded decorator is similar to what Fernando has in mind:
> (I thought I mentioned kaa here before, but I guess I did not give any code
> examples..)
>
> @kaa.threaded()
> def do_blocking_task():
>     [...]
>     return 42

Similar, with the caveat Darren pointed out of immediate consumption. Best regards, and thanks again for pitching in! f From gokhansever at gmail.com Tue Sep 8 15:45:56 2009 From: gokhansever at gmail.com (=?UTF-8?Q?G=C3=B6khan_Sever?=) Date: Tue, 8 Sep 2009 14:45:56 -0500 Subject: [IPython-dev] Testing matplotlib on IPython trunk Message-ID: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> Hello, The thread switches will be gone by the release of the new IPython. I am assuming that some extra work needs to be done on both sides in preparation for the new release. See the following test cases:

### This one locks up IPython unless the figure window is killed. If you do an additional plt.show() while no figure is up, then you get a complete lock-up of the shell.

I[1]: import matplotlib.pyplot as plt
I[2]: %gui qt
I[3]: plt.plot(range(10))
O[3]: []
I[4]: plt.show()

### The following cannot resolve that issue

I[5]: %gui #disable event loops
I[6]: %gui -a qt
O[6]:
I[7]: plt.plot(range(10))
O[7]: []
I[8]: plt.show()

### In a new IPython, these lines work --no locking after plt.show(). "-a" makes the difference.
I[1]: import matplotlib.pyplot as plt
I[2]: %gui -a qt
O[2]:
I[3]: plt.plot(range(10))
O[3]: []
I[4]: plt.show()

================================================================================
Platform   : Linux-2.6.29.6-217.2.3.fc11.i686.PAE-i686-with-fedora-11-Leonidas
Python     : ('CPython', 'tags/r26', '66714')
IPython    : 0.11.bzr.r1205
NumPy      : 1.4.0.dev
Matplotlib : 1.0.svn
================================================================================

-- Gökhan

From fperez.net at gmail.com Tue Sep 8 16:25:09 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 13:25:09 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <20090908122607.GB25829@phare.normalesup.org> References: <20090908122607.GB25829@phare.normalesup.org> Message-ID: On Tue, Sep 8, 2009 at 5:26 AM, Gael Varoquaux wrote:

> Take home messages
> -------------------
>
>   - Decorators can be more clever than you think, and may not return
>     objects as simple as you think
>   - Think about pickling, or you'll get bitten at some point
>
> and most important:
>
>   - Use functools.wraps
>
> Sorry for going off-topic,

Not at all, it's great! Thanks for this nice writeup, it's clear and informative. I have to admit that I've never used functools.wraps, because I had become used to Michele Simionato's excellent decorator module. In fact, in ipython we ship internally a copy of it for use in our testing machinery, as decorators_msim.py. Perhaps it's time we use it system-wide and put it in externals, especially since he continues to improve it so much. I just had a look and he is still making releases, it's now on pypi, and has a fantastically well documented page: http://pypi.python.org/pypi/decorator Hell, I just saw that in fact he already has non-"inlined" versions of @trace, @blocking, @async, etc!
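For reference, a small sketch of the metadata preservation being discussed here; note it uses modern functools.wraps behavior (the __wrapped__ attribute that lets inspect.signature recover the original signature arrived in later Python versions than this thread), and trace/add are illustrative names:

```python
import functools
import inspect

def trace(func):
    """Print the name of the wrapped function on each call."""
    @functools.wraps(func)       # copies __name__ and __doc__, sets __wrapped__
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

@trace
def add(x, y):
    "Add two numbers."
    return x + y

print(add.__name__)              # 'add', not 'wrapper'
print(add.__doc__)               # 'Add two numbers.'
print(inspect.signature(add))    # (x, y), recovered through __wrapped__
```

Without the @functools.wraps line, introspection (such as IPython's foo? mechanism) would see the generic wrapper instead of the decorated function's own name, docstring, and signature.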
One reason I kept using Michele's code was because functools is 2.5-only, and up until now we were keeping 2.4 compatibility. But it's nice to see (I just tested it) that functools preserves enough of the decorated function data for ipython's foo? introspection to work nicely (correct name, docstring, signature, etc). Still, Michele's module is a great little resource in this discussion, I think. Cheers, f From fperez.net at gmail.com Tue Sep 8 16:34:33 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 13:34:33 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> References: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> Message-ID: On Tue, Sep 8, 2009 at 10:23 AM, Brian Granger wrote:

> My only complaint is that it is a bit unexpected that this actually declares
> and *calls* the function!

Yes, it is more than a bit surprising at first :) In fact, I remember that bothered me about @interact when I first saw it, and that was partly why I tried to make things work using 'with'. But it's now clear to me that we need a *real* scope for these ideas, and for now, 'def' is the only way to get a scope that we have, so that's what we'll be using. For reference, the stuff I was trying to implement from the Ars review is below in the original Objective C code, plus my own 'pythonization' of it, first in serial mode, then using 'with', then with @decos. This is the page in the review where John Siracusa explains the new Block syntax that Apple introduced to C: http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/10 I think it's really cool, I do hope it makes its way into the language itself.
Cheers, f

### Code

from __future__ import with_statement
"""
http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/13

This is Objective C code, taken from the link above:

- (IBAction)analyzeDocument:(NSButton *)sender
{
    NSDictionary *stats = [myDoc analyze];
    [myModel setDict:stats];
    [myStatsView setNeedsDisplay:YES];
    [stats release];
}

- (IBAction)analyzeDocument:(NSButton *)sender
{
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        NSDictionary *stats = [myDoc analyze];
        dispatch_async(dispatch_get_main_queue(), ^{
            [myModel setDict:stats];
            [myStatsView setNeedsDisplay:YES];
            [stats release];
        });
    });
}
"""

def analyzeDocument():
    """A Python version of the serial version above"""
    stats = myDoc.analyze()
    myModel.setDict(stats)
    myStatsView.setNeedsDisplay(YES)
    stats.release()

def analyzeDocument():
    """A hypothetical version using 'with'.

    This type of hack is not only ugly, but extremely brittle (to actually
    run, it has to do all kinds of nasty stack manipulations). It is meant
    only to illustrate what the syntax could look like.
    """
    with dispatch_async(dispatch_get_global_queue(0, 0)):
        stats = myDoc.analyze()
        with dispatch_async(dispatch_get_main_queue()):
            myModel.setDict(stats)
            myStatsView.setNeedsDisplay(YES)
            stats.release()

def analyzeDocument():
    """A decorator-based version.

    This could in principle work just fine today. All one would need to do
    would be to write Python decorators for Apple GCD. As long as the code
    being run released the GIL, it would work fine.
    """
    @dispatch_async(dispatch_get_global_queue(0, 0))
    def outer():
        stats = myDoc.analyze()
        @dispatch_async(dispatch_get_main_queue())
        def inner():
            myModel.setDict(stats)
            myStatsView.setNeedsDisplay(YES)
            stats.release()

From fperez.net at gmail.com Tue Sep 8 16:45:55 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 13:45:55 -0700 Subject: [IPython-dev] Testing matplotlib on IPython trunk In-Reply-To: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> References: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> Message-ID: Hey Gokhan, thanks for the summary. On Tue, Sep 8, 2009 at 12:45 PM, Gökhan Sever wrote:

> ### In a new IPython, these lines work --no locking after plt.show() "-a"
> makes the difference.
>
> I[1]: import matplotlib.pyplot as plt
>
> I[2]: %gui -a qt
> O[2]:
>
> I[3]: plt.plot(range(10))
> O[3]: []
>
> I[4]: plt.show()

If you do plt.ion() right after you import it, then you don't need to do 'show' explicitly anymore. Basically what today's '-pylab' does is:

- a bunch of imports
- the equivalent of %gui, but uglier and at startup
- do plt.ion() for you
- patch %run a little so it does ioff() before starting up and ion() at the end.

As you can see, even now with trunk in the state of upheaval it is in, you can get almost all of this back with this snippet.
This is pretty much what we'll make available built-in when the dust settles (with the 'import *' being optional, as they are today):

%gui -a qt
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.pylab as pylab
import matplotlib.mlab as mlab
from numpy import *
from matplotlib.pyplot import *
plt.ion()

### END CODE

Cheers, f From ellisonbg.net at gmail.com Tue Sep 8 17:51:59 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 14:51:59 -0700 Subject: [IPython-dev] Testing matplotlib on IPython trunk In-Reply-To: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> References: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> Message-ID: <6ce0ac130909081451x49196ab3ud2010244fce1008a@mail.gmail.com> You also may need to do: plt.interactive(True) Cheers, Brian On Tue, Sep 8, 2009 at 12:45 PM, Gökhan Sever wrote:

> Hello,
>
> The thread switches will be gone by the release of the new IPython. I am
> assuming that some extra work needs to be done on both sides in preparation
> to the new release. See the following test cases:
>
> ### This one locks the IPython unless the figure window is killed. If you
> do an additional plt.show() without a figure is up then you get a complete
> lock-up of the shell.
>
> I[1]: import matplotlib.pyplot as plt
>
> I[2]: %gui qt
>
> I[3]: plt.plot(range(10))
> O[3]: []
>
> I[4]: plt.show()
>
> ### The following cannot resolve that issue
>
> I[5]: %gui #disable event loops
>
> I[6]: %gui -a qt
> O[6]:
>
> I[7]: plt.plot(range(10))
> O[7]: []
>
> I[8]: plt.show()
>
> ### In a new IPython, these lines work --no locking after plt.show() "-a"
> makes the difference.
>
> I[1]: import matplotlib.pyplot as plt
>
> I[2]: %gui -a qt
> O[2]:
>
> I[3]: plt.plot(range(10))
> O[3]: []
>
> I[4]: plt.show()
>
> ================================================================================
> Platform   : Linux-2.6.29.6-217.2.3.fc11.i686.PAE-i686-with-fedora-11-Leonidas
> Python     : ('CPython', 'tags/r26', '66714')
> IPython    : 0.11.bzr.r1205
> NumPy      : 1.4.0.dev
> Matplotlib : 1.0.svn
> ================================================================================
>
> --
> Gökhan
>
> _______________________________________________
> IPython-dev mailing list
> IPython-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-dev

From ellisonbg.net at gmail.com Tue Sep 8 23:09:25 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 20:09:25 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> Message-ID: <6ce0ac130909082009m913c5dbi86d13eb78adb7017@mail.gmail.com> On Tue, Sep 8, 2009 at 1:34 PM, Fernando Perez wrote:

> On Tue, Sep 8, 2009 at 10:23 AM, Brian Granger wrote:
> > My only complaint is that is a bit unexpected that this actually declares
> > and *calls* the function!
>
> Yes, it is more than a bit surprising at first :) In fact, I remember
> that bothered me about @interact when I first saw it, and that was
> partly why I tried to make things work using 'with'. But it's now
> clear to me that we need a *real* scope for these ideas, and for now,
> 'def' is the only way to get a scope we have, so that's what we'll be
> using.
>
> For reference, the stuff I was trying to implement from the Ars review
> is below in the original Objective C code, plus my own 'pythonization'
> of it, first in serial mode, then using 'with', then with @decos.
> This is the page in the review where John Siracusa explains the new
> Block syntax that Apple introduced to C:
>
> http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/10
>
> I think it's really cool, I do hope it makes its way into the language
> itself.

Yes, this is a very nice article. It is very nice to see Apple bringing the "iPod" style of thinking (stepping back and really thinking about something) to parallelism and languages.

> def analyzeDocument():
>     """A decorator-based version.
>
>     This could in principle work just fine today. All one would need to do
>     would be to write Python decorators for Apple GCD. As long as the code
>     being run released the GIL, it would work fine. """
>
>     @dispatch_async(dispatch_get_global_queue(0, 0))
>     def outer():
>         stats = myDoc.analyze()
>         @dispatch_async(dispatch_get_main_queue())
>         def inner():
>             myModel.setDict(stats)
>             myStatsView.setNeedsDisplay(YES)
>             stats.release()

OK, just a few comments before I lose a week or two thinking about this....

* While multiprocessing is interesting, by sticking with the fork model of getting variables to child execution contexts, it really misses the greater possibilities that proper scoping gives you. With these GCD-style constructs you can do arbitrarily deep nestings of scopes and carefully control what variables from the parent scope are seen by child scopes. This is exactly like Cilk and is a fantastic way of doing things.

* Damn the GIL! If we didn't have the GIL, we could write a simple version of dispatch_async that used a simple thread pool and we would be off to the races.

* But, we can implement this on top of IPython.kernel and get many of the same things. The only real downside is the performance hit of using processes rather than threads.

Cheers, Brian
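The "simple version of dispatch_async that used a simple thread pool" Brian mentions can in fact be sketched in a few lines as an immediate-consumption decorator; the names dispatch_async and _global_queue below only mimic GCD's vocabulary and are not Apple's API:

```python
from concurrent.futures import ThreadPoolExecutor

# A stand-in for a GCD dispatch queue: just a small thread pool.
_global_queue = ThreadPoolExecutor(max_workers=4)

def dispatch_async(pool):
    """Immediate-consumption decorator: submit the decorated block
    to `pool` the moment it is defined, returning a Future."""
    def submit(block):
        return pool.submit(block)
    return submit

results = []

@dispatch_async(_global_queue)
def work():                # starts running in a worker thread right here
    results.append(sum(i * i for i in range(10)))

# 'work' is now a Future, not a function; wait for the block to finish.
work.result()
print(results)             # [285]
```

This shows both points made in the thread: the def provides a real scope whose variables (here, results) the block closes over, and the decorated name is consumed immediately rather than left as a callable.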
URL: From ellisonbg.net at gmail.com Tue Sep 8 23:13:45 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 8 Sep 2009 20:13:45 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909082009m913c5dbi86d13eb78adb7017@mail.gmail.com> References: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> <6ce0ac130909082009m913c5dbi86d13eb78adb7017@mail.gmail.com> Message-ID: <6ce0ac130909082013u368ab0b4tafab44541244c6d8@mail.gmail.com> Another link that is interesting when you start to think about bytecode transformations: http://www.voidspace.org.uk/python/articles/code_blocks.shtml Cheers, Brian On Tue, Sep 8, 2009 at 8:09 PM, Brian Granger wrote: > > > On Tue, Sep 8, 2009 at 1:34 PM, Fernando Perez wrote: > >> On Tue, Sep 8, 2009 at 10:23 AM, Brian Granger >> wrote: >> > My only complaint is that is a bit unexpected that this actually >> declares >> > and *calls* the function! >> >> Yes, it is more than a bit surprising at first :) In fact, I remember >> that bothered me about @interact when I first saw it, and that was >> partly why I tried to make things work using 'with'. But it's now >> clear to me that we need a *real* scope for these ideas, and for now, >> 'def' is the only way to get a scope we have, so that's what we'll be >> using. >> >> For reference, the stuff I was trying to implement from the Ars review >> is below in the original Objective C code, plus my own 'pythonization' >> of it, first in serial mode, then using 'with', then with @decos. >> >> This is the page in the review where John Siracusa explains the new >> Block syntax that Apple introduced to C: >> >> http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/10 >> >> I think it's really cool, I do hope it makes its way into the language >> itself. >> >> > Yes, this is a very nice article. 
It is very nice to see Apple bringing > the "iPod" > style of thinking (stepping back and really thinking about something) to > parallelism and languages. > > def analyzeDocument(): >> """A decorator-based version. >> >> This could in principle work just fine today. All one would need to do >> would be to write Python decorators for Apple GCD. As long as the code >> being run released the GIL, it would work fine. """ >> >> @dispatch_async(dispatch_get_global_queue(0, 0)) >> def outer(): >> stats = myDoc.analyze() >> @dispatch_async(dispatch_get_main_queue()) >> def inner(): >> myModel.setDict(stats) >> myStatsView.setNeedsDisplay(YES) >> stats.release() >> > > OK, just a few comments before I loose a week or two thinking about > this.... > > * While multiprocessing is interesting, by sticking with the fork model > of getting variables to child execution contexts, it really misses the > greater > possibilities that proper scoping gives you. With these GCD style > constructs > you can do arbitrarily deep nestings of scopes and carefully control what > variables > from the parent scope are seen by children scopes. This is exactly like > Cilk and > is a fantastic way of doing things. > > * Damn the GIL! If we didn't have the GIL, we could write a simple version > of > dispatch_async that used a simple thread pool we would be off to the races. > > * But, we can implement this on top of IPython.kernel and get many of the > same > things. The only real downside is the performance hit of using processes > rather > than threads. > > Cheers, > > Brian > > -------------- next part -------------- An HTML attachment was scrubbed... 
From fperez.net at gmail.com Wed Sep 9 00:30:08 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 8 Sep 2009 21:30:08 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909082013u368ab0b4tafab44541244c6d8@mail.gmail.com> References: <6ce0ac130909081023u30ecbbffp36a4d6af3607b1a7@mail.gmail.com> <6ce0ac130909082009m913c5dbi86d13eb78adb7017@mail.gmail.com> <6ce0ac130909082013u368ab0b4tafab44541244c6d8@mail.gmail.com> Message-ID: On Tue, Sep 8, 2009 at 8:13 PM, Brian Granger wrote:

> Another link that is interesting when you start to think about bytecode
> transformations:
>
> http://www.voidspace.org.uk/python/articles/code_blocks.shtml

Ah, very nice! Thanks for that link; this thread is really turning out to be very useful and informative, and Michael Foord's article is a very nice one. It's worth noting that Michael's clever bytecode hack isn't needed, as I mentioned before, in python 3.x, because what he's doing is precisely what the new 'nonlocal' keyword provides. But for 2.x we either use some of the approaches I mentioned above, or this kind of trickery. Michael's hack unfortunately has some limitations, as best I can see: you can't capture enclosing scope easily (without much deeper modifications of the bytecode, which are doable but which I do NOT want to get into).
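The 'nonlocal' point can be shown with a tiny Python 3 sketch: an inner block can rebind a variable of the enclosing scope with no bytecode tricks at all (the function names here echo the example discussed in this thread, nothing more):

```python
def simple(n):
    s = 0.0
    def block():
        nonlocal s          # rebind s in the enclosing simple() scope
        for i in range(n):
            s += i ** 2     # without 'nonlocal' this would raise UnboundLocalError
    block()
    return s

print(simple(10))           # 285.0
```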
If you try to put a closure in the block, python refuses the bare 'exec':

In [12]: run simple
------------------------------------------------------------
SyntaxError: unqualified exec is not allowed in function 'simple' it contains a nested function with free variables (simple.py, line 39)
WARNING: Failure executing file:

where I'd written:

def simple(n):
    s = 0.0
    def block():
        for i in range(n):
            s += i**2
    exec AnonymousCodeBlock(block)
    return s

But I think we have a reasonable path forward for actually useful tool building (aside from neat machinations and hackery):

- in python 2.x, all of this works, but some slightly ugly solutions must be used to return information from the wrapped functions.
- in python 3.x, using nonlocal solves the above.
- if all of this leads to useful libraries, it's plausible to consider it for later inclusion in the language with better syntactic support, once we understand more angles of the problem and have proven its utility.

Cheers, f From gael.varoquaux at normalesup.org Wed Sep 9 03:23:24 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Wed, 9 Sep 2009 09:23:24 +0200 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: <20090908122607.GB25829@phare.normalesup.org> Message-ID: <20090909072324.GC11255@phare.normalesup.org> On Tue, Sep 08, 2009 at 01:25:09PM -0700, Fernando Perez wrote:

> I have to admit that I've never used functools.wraps, because I had
> become used to Michele Simionato's excellent decorator module. In
> fact, in ipython we ship internally a copy of it for use in our
> testing machinery, as decorators_msim.py. Perhaps it's time we use it
> system-wide and put it in externals, especially since he continues to
> improve it so much. I just had a look and he is still making
> releases, it's now on pypi, and has a fantastically well documented
> page:
> http://pypi.python.org/pypi/decorator

Good point.
I knew about it, and had been frowning on using it because it is not in the standard library. I guess I am wrong. I should point out that it really uses functools.partial (which is what wraps relies on) to do all the heavy lifting, unless you are stuck on 2.4. Gaël From matthieu.brucher at gmail.com Wed Sep 9 10:53:05 2009 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 9 Sep 2009 16:53:05 +0200 Subject: [IPython-dev] Before a patch for LSF support In-Reply-To: <6ce0ac130909080949wd076e5fh946febd26729ffc7@mail.gmail.com> References: <6ce0ac130908120258y66547ee5x77b4453beae5fecc@mail.gmail.com> <6ce0ac130908120917m6dd5a68fq67bd22fade7ce47@mail.gmail.com> <6ce0ac130908130110y7402edfbs18480f22c1fd4092@mail.gmail.com> <6ce0ac130909080949wd076e5fh946febd26729ffc7@mail.gmail.com> Message-ID:

>> Hi,
>>
>> I'm trying to get to this issue again. I've started from a remote
>> node, but I still have the issue with the Connection refused, and this
>> is the moment where the LSF job actually crashes (it is running before
>> it tries to connect).
>
> Matthieu,
>
> Sorry, I am a bit foggy on this one.

No problem, I know the feeling (especially one month after).

> Can you refresh my memory and describe
> exactly which iteration you are trying.

I'm trying to get ipython working with LSF. I did manage to submit the job and let it run on the nodes. The first issue I face is that $HOME is not the same location on the LSF nodes as on the main computer. This is an issue that should be tackled with the refactoring of ipython (adding the possibility of setting the path for the controller and a different path for the engines, if I understood ipython correctly). Meanwhile, I've tried to launch ipcluster from an LSF node (I have SSH access too, for debugging purposes). So in this case, I don't have a $HOME problem. This time, it's the ipcontroller-mec.furl file that has 127.0.0.1 as the controller IP.
I've tried to replace it with the node name, but I also get a Connection Refused error in the engine logs when I try to create a MultiEngineClient. I hope I summed up my issue properly. Cheers, Matthieu -- Information System Engineer, Ph.D. Website: http://matthieu-brucher.developpez.com/ Blogs: http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn: http://www.linkedin.com/in/matthieubrucher From fperez.net at gmail.com Thu Sep 10 15:04:49 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Sep 2009 12:04:49 -0700 Subject: [IPython-dev] A question mostly for R. Kern: Apache license in argparse... Message-ID: Hi Robert, I just saw your post on the scipy list about the apache license, and I realized that ipython does have one apache licensed component: argparse. If my memory serves me right, you've been one of the users/proponents of argparse in ipython (I think for 0.10 we caught up with upstream when you pointed it out). Is there anything you'd recommend we do/change/document regarding our shipping of argparse in ipython? I would like to allow people to unobtrusively include ipython in GPL projects if they so desire, and I hadn't realized this could be an issue. Thanks for any feedback, and sorry in advance for continuing to make you our pseudo-lawyer-in-residence :) Cheers, f From robert.kern at gmail.com Thu Sep 10 15:57:04 2009 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 10 Sep 2009 14:57:04 -0500 Subject: [IPython-dev] A question mostly for R. Kern: Apache license in argparse... In-Reply-To: References: Message-ID: On 2009-09-10 14:04 PM, Fernando Perez wrote:

> Hi Robert,
>
> I just saw your post on the scipy list about the apache license, and I
> realized that ipython does have one apache licensed component:
> argparse.
>
> If my memory serves me right, you've been one of the users/proponents
> of argparse in ipython (I think for 0.10 we caught up with upstream
> when you pointed it out).
> Is there anything you'd recommend we
> do/change/document regarding our shipping of argparse in ipython? I
> would like to allow people to unobtrusively include ipython in GPL
> projects if they so desire, and I hadn't realized this could be an
> issue.

Ah, right. It's worth asking Steven Bethard if he will give a copy to you under the BSD license such that IPython can remain GPLv2 compatible. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From fperez.net at gmail.com Fri Sep 11 02:56:29 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 10 Sep 2009 23:56:29 -0700 Subject: [IPython-dev] A question mostly for R. Kern: Apache license in argparse... In-Reply-To: References: Message-ID: Hi, On Thu, Sep 10, 2009 at 12:57 PM, Robert Kern wrote:

> Ah, right. It's worth asking Steven Bethard if he will give a copy to you
> under the BSD license such that IPython can remain GPLv2 compatible.

OK, I'll go ahead and do that, thanks. Cheers, f From fperez.net at gmail.com Fri Sep 11 15:11:07 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Sep 2009 12:11:07 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: Hi all, On Fri, Sep 4, 2009 at 1:31 AM, Fernando Perez wrote:

> This part of Ars Technica's excellent review of Snow Leopard:
>
> http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/13
>
> shows how Apple tackled the problem of providing civilized primitives
> to express parallelism in applications and a mechanism to make useful
> decisions based on this information. The idea is to combine a
> kernel-level dispatcher (GCD), a beautiful extension to the C language
> (yes, they extended C!)
> in the form of anonymous code blocks, and an
> API to inform GCD of your parallelism breakup easily, so GCD can use
> your intentions at runtime efficiently. It's really a remarkably
> simple, yet effective (IMHO) combination of tools to tackle a thorny
> problem.

Wow. Apple just open sourced Grand Central Dispatch! http://libdispatch.macosforge.org/ http://www.macresearch.org/grand-central-now-open-all Quoting from the latter: """ Of course, this is also very interesting for scientific developers. It may be possible to parallelize code in the not too distant future using Grand Central Dispatch, and run that code not only on Macs, but also on clusters and supercomputers. """ I think we've been saying that here for the last few days :) And we could even do it in python, were it not for the GIL (we can still do a few things, just not everything). We'll see how long before python bindings are available for libdispatch, at least on the Mac, so you can more easily organize code that already can run in a thread. This will be interesting to watch... Cheers, f From fperez.net at gmail.com Fri Sep 11 15:44:01 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Sep 2009 12:44:01 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: On Fri, Sep 11, 2009 at 12:11 PM, Fernando Perez wrote:

> http://www.macresearch.org/grand-central-now-open-all

And for those still listening (yes, you two over there on the right), here's a nice followup with an example that compares a simple image blur done with OpenMP and with GCD: http://www.macresearch.org/cocoa-scientists-xxxi-all-aboard-grand-central It's unfortunate he didn't have 8 or 16 cores to run it on, as getting 'linear speedup' with N=2 is a bit of a joke, but other than that the post is a clear and informative example.
Cheers, f From fperez.net at gmail.com Fri Sep 11 15:47:04 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Fri, 11 Sep 2009 12:47:04 -0700 Subject: [IPython-dev] A question mostly for R. Kern: Apache license in argparse... In-Reply-To: References: Message-ID: Hi all, On Thu, Sep 10, 2009 at 12:57 PM, Robert Kern wrote: > > Ah, right. It's worth asking Steven Bethard if he will give a copy to you under > the BSD license such that IPython can remain GPLv2 compatible. I just wanted to let everyone know that Robert and I contacted Steven, and he kindly agreed to offer argparse under the BSD license terms as well, so that we have no problem with its continued use. I'll be updating the code over the next few days, once Steven makes the changes upstream. These changes will be applied both to trunk and to the 0.10 branch, so that anyone building off 0.10 can be free of any licensing worries if they wish to include IPython in a GPL project. Many thanks to Steven for doing this, on behalf of the IPython users and developers! Best, f From ellisonbg.net at gmail.com Fri Sep 11 16:46:14 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Sep 2009 13:46:14 -0700 Subject: [IPython-dev] A question mostly for R. Kern: Apache license in argparse... In-Reply-To: References: Message-ID: <6ce0ac130909111346s7fa85a10v735260a11bbb9c20@mail.gmail.com> Fantastic, especially so as I have started to use argparse extensively in my refactoring this summer. Thanks everyone, especially Steven! Cheers, Brian On Fri, Sep 11, 2009 at 12:47 PM, Fernando Perez wrote: > Hi all, > > On Thu, Sep 10, 2009 at 12:57 PM, Robert Kern > wrote: > > > > Ah, right. It's worth asking Steven Bethard if he will give a copy to you > under > > the BSD license such that IPython can remain GPLv2 compatible. 
> > I just wanted to let everyone know that Robert and I contacted Steven, > and he kindly agreed to offer argparse under the BSD license terms as > well, so that we have no problem with its continued use. > > I'll be updating the code over the next few days, once Steven makes > the changes upstream. These changes will be applied both to trunk and > to the 0.10 branch, so that anyone building off 0.10 can be free of > any licensing worries if they wish to include IPython in a GPL > project. > > Many thanks to Steven for doing this, on behalf of the IPython users > and developers! > > Best, > > f > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg.net at gmail.com Fri Sep 11 19:36:07 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Sep 2009 16:36:07 -0700 Subject: [IPython-dev] Chasing my tail with the "with" statement Message-ID: <6ce0ac130909111636k30316052q44d5912ad06a6bfb@mail.gmail.com> Hi, I am converting some of the traps in IPython to use the with statement. The idea is that things like sys.excepthook, sys.displayhook, etc. are not always set by IPython, but only when user code is actually being run. But, I am running into a problem with sys.displayhook with the results of a magic function. The prefilter machinery converts: %alias -> get_ipython().magic("alias") The end of the get_ipython().magic function looks like this: with nested(self.builtin_trap, self.display_trap): result = fn(magic_args) return result The idea is that we use "with" to enable the display_trap, call the magic and then return the result. The builtin_trap works fine, but the display_trap doesn't work. The problem is that sys.displayhook is only called when something is returned.
But by then ("return result") the "with" block has ended and the display_trap is deactivated. There are other places that I can enable the display_trap, but we lose the nice feature of only having the trap set when it is needed. Any ideas of how we can get around having to leave display_trap set all the time? Cheers, Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: From dsdale24 at gmail.com Fri Sep 11 20:03:29 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Fri, 11 Sep 2009 20:03:29 -0400 Subject: [IPython-dev] Chasing my tail with the "with" statement In-Reply-To: <6ce0ac130909111636k30316052q44d5912ad06a6bfb@mail.gmail.com> References: <6ce0ac130909111636k30316052q44d5912ad06a6bfb@mail.gmail.com> Message-ID: On Fri, Sep 11, 2009 at 7:36 PM, Brian Granger wrote: > Hi, > > I am converting some of the traps in IPython to use the with statement. The > idea is that > things like sys.excepthook, sys.displayhook, etc. are not always set by > IPython, > but only when user code is actually being run. But, I am running into a > problem > with sys.displayhook with the results of a magic function. > > The prefilter machinery converts: > > %alias -> get_ipython().magic("alias") > > The end of the get_ipython().magic function looks like this: > >     with nested(self.builtin_trap, self.display_trap): >         result = fn(magic_args) >     return result > > The idea is that we use "with" to enable the display_trap, call the > magic and then return the result. The builtin_trap works fine, but > the display_trap doesn't work. The problem is that sys.displayhook > is only called when something is returned. But by then ("return result") > the "with" block has ended and the display_trap is deactivated. > > There are other places that I can enable the display_trap, but we > lose the nice feature of only having the trap set when it is needed.
> > Any ideas of how we can get around having to leave display_trap set > all the time? Can you do: with nested(self.builtin_trap, self.display_trap): return fn(magic_args) ? From ellisonbg.net at gmail.com Fri Sep 11 20:05:58 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 11 Sep 2009 17:05:58 -0700 Subject: [IPython-dev] Chasing my tail with the "with" statement In-Reply-To: <6ce0ac130909111636k30316052q44d5912ad06a6bfb@mail.gmail.com> References: <6ce0ac130909111636k30316052q44d5912ad06a6bfb@mail.gmail.com> Message-ID: <6ce0ac130909111705x5da9bf7dyee8d95b6101fd9ff@mail.gmail.com> Never mind this, I figured out it was a slightly different issue that was causing the problem. I *do* activate the display_trap elsewhere, but my display trap context manager wasn't fully reentrant so the trap was being turned off in the wrong place because of a recursive call to __enter__: with trap: # do something with trap: # do something else # Oops, trap is off now! # All subsequent code doesn't have the trap set. Moral of the story: if your context managers can be entered recursively, make sure you have logic that prevents this type of thing. It is easy logic to add though. Darren asks: Can you do: > > with nested(self.builtin_trap, self.display_trap): > return fn(magic_args) > Maybe so, I am not sure what the ordering of the return vs __exit__ call is in this case. I will have to look this up. Cheers, Brian On Fri, Sep 11, 2009 at 4:36 PM, Brian Granger wrote: > Hi, > > I am converting some of the traps in IPython to use the with statement. > The idea is that > things like sys.excepthook, sys.displayhook, etc. are not always set by > IPython, > but only when user code is actually being run. But, I am running into a > problem > with sys.displayhook with the results of a magic function.
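The reentrancy fix Brian describes can be sketched with a nesting counter, so that only the outermost __exit__ deactivates the trap. `DisplayTrap` and its attributes here are illustrative stand-ins, not IPython's actual classes:

```python
class DisplayTrap:
    """Reentrant trap sketch: a nesting counter ensures that only the
    outermost exit deactivates, avoiding the bug where a recursive
    __enter__/__exit__ turned the trap off too early."""
    def __init__(self):
        self._level = 0
        self.active = False  # stand-in for "sys.displayhook is replaced"

    def __enter__(self):
        if self._level == 0:
            self.active = True   # activate only on the outermost entry
        self._level += 1
        return self

    def __exit__(self, etype, evalue, tb):
        self._level -= 1
        if self._level == 0:
            self.active = False  # deactivate only on the outermost exit
        return False

trap = DisplayTrap()
with trap:
    with trap:           # recursive entry
        assert trap.active
    assert trap.active   # still on: the inner exit did not deactivate
assert not trap.active   # off only after the outermost exit
```

The same counter idea makes the manager safe no matter how deeply the magic machinery re-enters it.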
> > The prefilter machinery converts: > > %alias -> get_ipython().magic("alias") > > The end of the get_ipython().magic function looks like this: > > with nested(self.builtin_trap, self.display_trap): > result = fn(magic_args) > return result > > This idea is that we use "with" to enable the display_trap, call the > magic and then return the result. The builtin_trap works fine, but > the display_trap doesn't work. The problem is that sys.displaytrap > is only called when something is returned. But by then ("return result") > the "with" block has ended and the display_trap is deactivated. > > There are other places that I can enable the display_trap, but we > loose the nice feature of only having the trap set when it is needed. > > Any ideas of how we can get around having to leave display_trap set > all the time? > > Cheers, > > Brian > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Thu Sep 17 03:30:23 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 17 Sep 2009 00:30:23 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: Hi folks, I'm sure you are all tired of this discussion, but to wrap things up, today I presented a summary of this thread to our local Berkeley Py4Science group (https://cirl.berkeley.edu/view/Py4Science). I made some notes which are now online: https://cirl.berkeley.edu/fperez/static/decorators/index.html Hopefully this can serve as a reference of these ideas in a compact location. Thanks again to all who participated, please let me know if I failed to acknowledge anything! Cheers, f ps - There's a chance I may move them at some point because I want to redo the whole site using sphinx (I'm using reST sources but rest2web and I want to just use sphinx for everything to simplify), but I ran out of time today. 
But there's a link at https://cirl.berkeley.edu/fperez/py4science that I'll update whenever I move them. From ellisonbg.net at gmail.com Thu Sep 17 10:12:46 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 17 Sep 2009 07:12:46 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: Message-ID: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> Very nice! One question: where does the undecorated attribute come from? (I am paying attention!). Cheers, Brian On Thu, Sep 17, 2009 at 12:30 AM, Fernando Perez wrote: > Hi folks, > > I'm sure you are all tired of this discussion, but to wrap things up, > today I presented a summary of this thread to our local Berkeley > Py4Science group (https://cirl.berkeley.edu/view/Py4Science). I made > some notes which are now online: > > https://cirl.berkeley.edu/fperez/static/decorators/index.html > > Hopefully this can serve as a reference of these ideas in a compact > location. Thanks again to all who participated, please let me know if > I failed to acknowledge anything! > > Cheers, > > f > > > ps - There's a chance I may move them at some point because I want to > redo the whole site using sphinx (I'm using reST sources but rest2web > and I want to just use sphinx for everything to simplify), but I ran > out of time today. But there's a link at > > https://cirl.berkeley.edu/fperez/py4science > > that I'll update whenever I move them. > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From hans_meine at gmx.net Thu Sep 17 11:57:10 2009 From: hans_meine at gmx.net (Hans Meine) Date: Thu, 17 Sep 2009 17:57:10 +0200 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> Message-ID: <200909171757.12664.hans_meine@gmx.net> On Thursday 17 September 2009 16:12:46 Brian Granger wrote: > where does the undecorated attribute come from? (I am paying attention!). This will have to be set by the decorator, i.e. by Michele Simionato's decorator module. Ciao, Hans From ellisonbg.net at gmail.com Thu Sep 17 14:03:46 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 17 Sep 2009 11:03:46 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <200909171757.12664.hans_meine@gmx.net> References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> <200909171757.12664.hans_meine@gmx.net> Message-ID: <6ce0ac130909171103q881f265h6e4745f09273f02@mail.gmail.com> On Thu, Sep 17, 2009 at 8:57 AM, Hans Meine wrote: > On Thursday 17 September 2009 16:12:46 Brian Granger wrote: > > where does the undecorated attribute come from? (I am paying attention!). > > This will have to be set by the decorator, i.e. by Michele Simionato's > decorator module. > > Yes, but where does the decorator module get the source code? Cheers, Brian > Ciao, > Hans > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From fperez.net at gmail.com Thu Sep 17 14:36:36 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 17 Sep 2009 11:36:36 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909171103q881f265h6e4745f09273f02@mail.gmail.com> References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> <200909171757.12664.hans_meine@gmx.net> <6ce0ac130909171103q881f265h6e4745f09273f02@mail.gmail.com> Message-ID: 2009/9/17 Brian Granger : > Yes, but where does the decorator module get the source code? It doesn't, IPython does that. But by keeping a reference to the real, original undecorated function (this reference is normally lost), then IPython's introspection can do its job correctly and finds the sources. Does that help? Cheers, f From fperez.net at gmail.com Thu Sep 17 14:40:50 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 17 Sep 2009 11:40:50 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> Message-ID: 2009/9/17 Brian Granger : > Very nice! Thanks, I hope it's more useful in the long run in this form. I *really* like sphinx, it was actually quite painless to go through the thread and my local codes to put this together, and the final result is very clean. By the way, I also liked very much this format for giving a presentation. It's a lot less pain to write than making slides, more reusable in the long term, and I think it worked well for the audience. It was easy to highlight points in the code, it's nicely highlighted, I can keep lots of links handy, etc. One could even imagine writing a book about scientific computing for python using these tools. Crazy, I know. 
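The mechanism Fernando describes — keeping a reference to the real, original function so that introspection can look past the wrapper — takes only a couple of lines. The decorator and the `undecorated` attribute below are a simplified stand-in for what Michele Simionato's decorator module does, not its actual code:

```python
import functools

def marked(fn):
    """Toy decorator that keeps a reference to the original function on
    the wrapper (the `undecorated` attribute from the thread), so a tool
    like IPython's source introspection can recover the real function."""
    @functools.wraps(fn)  # preserve __name__, __doc__, etc.
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    wrapper.undecorated = fn  # the reference that is normally lost
    return wrapper

@marked
def f(x):
    return 2 * x

# An introspection tool would do something like:
original = getattr(f, 'undecorated', f)
print(original is f.undecorated, f.__name__)  # True f
```

Given `original`, `inspect.getsource(original)` can then find the true source whenever the function was defined in a file, which is exactly the limitation discussed later in the thread.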
Cheers, f From ellisonbg.net at gmail.com Thu Sep 17 17:00:56 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Thu, 17 Sep 2009 14:00:56 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> <200909171757.12664.hans_meine@gmx.net> <6ce0ac130909171103q881f265h6e4745f09273f02@mail.gmail.com> Message-ID: <6ce0ac130909171400g3fc11ca2v588a83207a461aab@mail.gmail.com> > Yes, but where does the decorator module get the source code? > > It doesn't, IPython does that. But by keeping a reference to the > real, original undecorated function (this reference is normally lost), > then IPython's introspection can do its job correctly and finds the > sources. > > OK, this makes sense. What are the limitations of this approach? That is, are there circumstances where it is not possible to get the source code from the undecorated reference? Cheers, Brian > Does that help? > > Cheers, > > f > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Sun Sep 20 02:18:20 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Sat, 19 Sep 2009 23:18:20 -0700 Subject: [IPython-dev] Testing ipython with more robustness... Message-ID: Hey, [ This is mostly for anyone interested in how we'll test IPython in the future with more robustness, after a very long talk Brian and I had on Friday...] I'm fried and need to sleep, and I'll be out all day tomorrow. But if you have a chance, have a look at this.
Put it somewhere in your pythonpath (I symlinked it) and try nosetests -vvs ptest --with-doctest there's a deliberately broken test, you can then study it with nosetests -vvs ptest --with-doctest --pdb-failures The fix is trivial, at the end of test.py change x, y = 2, 2 to x, y = 1, 2 The point is: - these are real parametric tests - they use normal unittest (try python test.py in the test/ directory) - they work with nose (after I monkeypatched it, I reported the bug to nose) - they can be debugged interactively!!! - the ipython docstrings decorators are also there - there's also a decorator to make a normal unittest out of any function, nose-like but without depending on nose always (they also work with nose). I think we've cracked it. I don't have time to clean it for commit tonight, but I wanted you to have this right away. It's been a hard but productive 2 days, though doing this meant I didn't finish the 'language spec'. I'm just too concerned with testing. Let me know what you think... Cheers, f -------------- next part -------------- A non-text attachment was scrubbed... Name: ptest.tgz Type: application/x-gzip Size: 4223 bytes Desc: not available URL: From fperez.net at gmail.com Sun Sep 20 14:16:49 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Sun, 20 Sep 2009 11:16:49 -0700 Subject: [IPython-dev] Musings: syntax for high-level expression of parallel (and other) execution control In-Reply-To: <6ce0ac130909171400g3fc11ca2v588a83207a461aab@mail.gmail.com> References: <6ce0ac130909170712g65a1f86dw3609a766a20f49c@mail.gmail.com> <200909171757.12664.hans_meine@gmx.net> <6ce0ac130909171103q881f265h6e4745f09273f02@mail.gmail.com> <6ce0ac130909171400g3fc11ca2v588a83207a461aab@mail.gmail.com> Message-ID: On Thu, Sep 17, 2009 at 2:00 PM, Brian Granger wrote: > OK, this makes sense.? What are the limitations of this approach? > That is, are the circumstances where it is not possible to get the > source code from the undecorated reference. 
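For readers who can't grab the ptest.tgz attachment, the "real parametric tests" idea from Fernando's message can be sketched with plain unittest: build one named test method per parameter set, so each case passes, fails, or drops into the debugger individually. This is only an illustration of the technique under that assumption, not the actual code in the attachment:

```python
import unittest

def make_case(x, y):
    """Build one test method closed over the parameters (x, y)."""
    def test(self):
        self.assertEqual(x, y)
    return test

class TestParams(unittest.TestCase):
    pass

# Attach one named test per parameter pair.  Unlike a single test with a
# for loop inside, each pair is reported separately, and a deliberately
# broken pair (e.g. (2, 1)) would fail on its own without hiding the rest.
for i, (x, y) in enumerate([(1, 1), (2, 2), (3, 3)]):
    setattr(TestParams, 'test_pair_%d' % i, make_case(x, y))

if __name__ == '__main__':
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestParams)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Because the generated tests are ordinary unittest methods, they also run under nose unchanged, which is the dual-stack property the message is after.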
In case others are curious (I replied to Brian offline). The limitations are similar to what ipython normally encounters, namely an object being available in-memory but not in source form. This happens: - for interactively defined functions - for installations where you may have the .pyc files but lost or never had the .py ones - in odd cases, if an installation is built in one place and moved, I've seen this happen. It used to be a common problem in the past with Fedora packages built in rpm chroot environments, but I've seen it less lately, I don't know why. There may be others, these are the ones I'm aware of. Cheers, f From gokhansever at gmail.com Tue Sep 22 00:16:54 2009 From: gokhansever at gmail.com (=?UTF-8?Q?G=C3=B6khan_Sever?=) Date: Mon, 21 Sep 2009 23:16:54 -0500 Subject: [IPython-dev] Testing matplotlib on IPython trunk In-Reply-To: References: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> Message-ID: <49d6b3500909212116q5fac242dhb4dfa0b02da89eb4@mail.gmail.com> On Tue, Sep 8, 2009 at 3:45 PM, Fernando Perez wrote: > Hey Gokhan, > > thanks for the summary. > > On Tue, Sep 8, 2009 at 12:45 PM, G?khan Sever > wrote: > > ### In a new IPython, these lines work --no locking after plt.show() "-a" > > makes the difference. > > > > I[1]: import matplotlib.pyplot as plt > > > > I[2]: %gui -a qt > > O[2]: > > > > I[3]: plt.plot(range(10)) > > O[3]: [] > > > > I[4]: plt.show() > > If you do > > plt.ion() > > right after you import it, then you don't need to do 'show' > explicitely anymore. Basically what today's '-pylab' does is: > > - a bunch of imports > - the equivalent of %gui, but uglier and at startup > - do plt.ion() for you > - patch %run a little so it does ioff() before starting up and ion() at the > end. > > As you can see, even now with trunk in the state of upheaval it is, > you can get almost all of this back with this snippet. 
This is pretty > much what we'll make available built-in when the dust settles (with > the 'import *' being optional, as they are today): > > It's a very late reply but I am wondering how to make these appear in the Ipy dev loaded into the session but not visible to a whos listing? Thanks. > %gui -a qt > > import numpy as np > import matplotlib.pyplot as plt > import matplotlib.pylab as pylab > import matplotlib.mlab as mlab > > from numpy import * > from matplotlib.pyplot import * > > plt.ion() > > > ### END CODE > > Cheers, > > f > -- G?khan -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Tue Sep 22 01:18:59 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 21 Sep 2009 22:18:59 -0700 Subject: [IPython-dev] Testing matplotlib on IPython trunk In-Reply-To: <49d6b3500909212116q5fac242dhb4dfa0b02da89eb4@mail.gmail.com> References: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> <49d6b3500909212116q5fac242dhb4dfa0b02da89eb4@mail.gmail.com> Message-ID: 2009/9/21 G?khan Sever : > > It's a very late reply but I am wondering how to make these appear in the Ipy dev loaded into the session but not visible to a whos listing? > I don't think that's supported quite right now. IPython does one special thing to support a clean %whos listing: right before opening up the user mainloop, it checks all keys in the user namespace, and later on when %whos is run, those variables that were initially present are not displayed. So for now if you do this interactively, you will unfortunately pollute %whos. This is one thing we'll need to make sure works nicely again when the dust settles. 
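The bookkeeping Fernando describes — snapshot the namespace before the mainloop starts, then have %whos report only names added afterwards — can be sketched in a few lines. The function names here are hypothetical, not IPython's internals:

```python
def snapshot(ns):
    """Record the names present right before handing control to the user."""
    return set(ns)

def user_names(ns, baseline):
    """Names a %whos-style listing would show: everything not in the
    startup snapshot."""
    return sorted(set(ns) - baseline)

ns = {'np': object(), 'plt': object()}  # pre-loaded, hidden from the listing
baseline = snapshot(ns)
ns['data'] = [1, 2, 3]                  # defined "interactively" afterwards
print(user_names(ns, baseline))         # ['data']
```

The limitation in the message follows directly: anything imported interactively lands after the snapshot, so it shows up in the listing like any other user variable.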
Cheers, f From gokhansever at gmail.com Tue Sep 22 01:43:26 2009 From: gokhansever at gmail.com (=?UTF-8?Q?G=C3=B6khan_Sever?=) Date: Tue, 22 Sep 2009 00:43:26 -0500 Subject: [IPython-dev] Testing matplotlib on IPython trunk In-Reply-To: References: <49d6b3500909081245r763f6292nf75b689f2c7d6ead@mail.gmail.com> <49d6b3500909212116q5fac242dhb4dfa0b02da89eb4@mail.gmail.com> Message-ID: <49d6b3500909212243x1357afc3t190817f39b5b055c@mail.gmail.com> Thanks Fernando for the quick response. Today this is the 3rd time I am hitting an unsupported feature in Python land. 1-) No attribute docstrings 2-) Look at this question: http://stackoverflow.com/questions/1458203/reading-a-float-from-string and 3rd is this. However I think I influenced two guys on our campus to take a look at Python. One uses Matlab-Simulink and C for collision-detection system design; the other uses C to design a small scale embedded acquisition system for UAV platforms. He uses an ARM Cortex A8 processor powered Gumstix board. Xubuntu 9.04 runs on it. I saw Python 2.6.2 installed, however I am not sure how easy it would be to bring the rest of the scipy stack onto that machine. Besides, tomorrow there is going to be a Matlab seminar here http://www.mathworks.com/company/events/seminars/seminar39323.html It is about as long as an advanced SciPy tutorial. Many similar subjects I see there: *Speeding Up MATLAB Applications: Tips and Tricks for Writing Efficient Code *Topics include: - Understanding preallocation and vectorization - Addressing bottlenecks - Efficient indexing and manipulations - JIT - Interpreter - Mex *Brief Introduction to Parallel Computing with MATLAB * - Task parallel applications for faster processing - Data parallel applications for handling large data sets -
Scheduling your programs to run I hope I will not kick out from the session by keep commenting oh that is possible in Python, oh this is too :) On Tue, Sep 22, 2009 at 12:18 AM, Fernando Perez wrote: > 2009/9/21 G?khan Sever : > > > > It's a very late reply but I am wondering how to make these appear in the > Ipy dev loaded into the session but not visible to a whos listing? > > > > I don't think that's supported quite right now. IPython does one > special thing to support a clean %whos listing: right before opening > up the user mainloop, it checks all keys in the user namespace, and > later on when %whos is run, those variables that were initially > present are not displayed. So for now if you do this interactively, > you will unfortunately pollute %whos. > > This is one thing we'll need to make sure works nicely again when the > dust settles. > > Cheers, > > f > -- G?khan -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Wed Sep 23 01:41:30 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 22 Sep 2009 22:41:30 -0700 Subject: [IPython-dev] IPython/readline on snow leopard Message-ID: Hi folks, I don't have any machine with Snow Leopard yet, but a colleague just pointed out to me that this blog post: http://blog.zacharyvoase.com/post/174280299 helped him sort out some IPython/readline annoyances when he recently upgraded. I'm posting it here hoping it may be of use to some of you as well, please let us know of any problems or successes with the upgrade. Thanks to Satra for the tip! Cheers, f From ellisonbg.net at gmail.com Mon Sep 28 18:01:06 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Mon, 28 Sep 2009 15:01:06 -0700 Subject: [IPython-dev] Summer's work in merged Message-ID: <6ce0ac130909281501m1b133d93sd853505e0b50fa9c@mail.gmail.com> Hi, I just merged all of my work from this summer into trunk. This represents a massive number of changes to the IPython core. 
Here are some highlights: * Completely new config system and files. * New extension API. * No more ipmaker, shell.py, threaded shells, or ipapi. * Component, Traitlets, Application = new foundation for moving forward * IPython still works! I tried *extremely* hard to make sure that as I refactored things the user's experience would not change. And considering the number of changes that have taken place, I think I was successful. * BUT, I am sure there are many things that are broken. Somethings that I know are broken: all extensions, the GUIs in frontend and gui. There will be many other things as well. * Refactored prefilter system. * Robust GUI support through PyOS_Inputhook (See the new %gui magic). * Documentation updates. For more information, check out the What's new section of the documentation in the source at: docs/source/whatsnew There is also a nice description of the new config system in: docs/source/config We will get these things up on the IPython website soon for folks to look at. If you an IPython developer, the main thing to look at is the Component class and how we are using it in InteractiveShell, PrefilterManager and AliasManager. Our goal is to make *everything* in IPython a component. This is the main abstraction for making IPython more testable, more hackable and more loosely coupled. Cheers, Brian -------------- next part -------------- An HTML attachment was scrubbed... URL: From dsdale24 at gmail.com Wed Sep 30 12:35:25 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Wed, 30 Sep 2009 12:35:25 -0400 Subject: [IPython-dev] ip.core.ipapi.get Message-ID: I think ip.core.ipapi.get() is behaving differently in the trunk than it did before the refactor. 
Here is the new implementation: def get(): """Get the most recently created InteractiveShell instance.""" from IPython.core.iplib import InteractiveShell insts = InteractiveShell.get_instances() most_recent = insts[0] for inst in insts[1:]: if inst.created > most_recent.created: most_recent = inst return most_recent If I call get from the python prompt, instead of Ipython, I used to get None, but now I get an error because insts is an empty list so insts[0] raises an IndexError. Perhaps: def get(): """Get the most recently created InteractiveShell instance.""" from IPython.core.iplib import InteractiveShell insts = InteractiveShell.get_instances() if not insts: return None most_recent = insts[0] for inst in insts[1:]: if inst.created > most_recent.created: most_recent = inst return most_recent Darren From ellisonbg.net at gmail.com Wed Sep 30 13:21:07 2009 From: ellisonbg.net at gmail.com (Brian Granger) Date: Wed, 30 Sep 2009 10:21:07 -0700 Subject: [IPython-dev] ip.core.ipapi.get In-Reply-To: References: Message-ID: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com> This function is essentially deprecated. This is a bug though that you are getting an exception. But, in the mean time please call the following function: In [1]: ip = get_ipython() This function is always available inside IPython and returns basically the same thing as get used to. For the curious, the problem with the old ipapi.get is that it assumed that there was always only ONE ipython and it returned that one. The new get_ipython function is smart: it doesn't assume there is only 1 ipython, and it always returns the right one. Cheers, Brian On Wed, Sep 30, 2009 at 9:35 AM, Darren Dale wrote: > I think ip.core.ipapi.get() is behaving differently in the trunk than > it did before the refactor. 
Here is the new implementation: > > def get(): > """Get the most recently created InteractiveShell instance.""" > from IPython.core.iplib import InteractiveShell > insts = InteractiveShell.get_instances() > most_recent = insts[0] > for inst in insts[1:]: > if inst.created > most_recent.created: > most_recent = inst > return most_recent > > If I call get from the python prompt, instead of Ipython, I used to > get None, but now I get an error because insts is an empty list so > insts[0] raises an IndexError. Perhaps: > > def get(): > """Get the most recently created InteractiveShell instance.""" > from IPython.core.iplib import InteractiveShell > insts = InteractiveShell.get_instances() > if not insts: > return None > most_recent = insts[0] > for inst in insts[1:]: > if inst.created > most_recent.created: > most_recent = inst > return most_recent > > > Darren > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dsdale24 at gmail.com Wed Sep 30 13:43:41 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Wed, 30 Sep 2009 13:43:41 -0400 Subject: [IPython-dev] ip.core.ipapi.get In-Reply-To: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com> References: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com> Message-ID: My use case is to check in a package's __init__.py to see if it is being imported within an ipython session, and if so to load a custom completer. I don't think get_ipython will work: import IPython as ip ip.InteractiveShell.get_ipython() ------------------------------------------------------------ Traceback (most recent call last): File "", line 1, in TypeError: unbound method get_ipython() must be called with InteractiveShell instance as first argument (got nothing instead) Am I using it incorrectly? 
On Wed, Sep 30, 2009 at 1:21 PM, Brian Granger wrote: > This function is essentially deprecated. This is a bug though that > you are getting an exception. But, in the mean time please call the > following function: > > In [1]: ip = get_ipython() > > This function is always available inside IPython and returns basically the > same > thing as get used to. > > For the curious, the problem with the old ipapi.get is that it assumed that > there > was always only ONE ipython and it returned that one. The new get_ipython > function is smart: it doesn't assume there is only 1 ipython, and it always > returns > the right one. > > Cheers, > > Brian > > On Wed, Sep 30, 2009 at 9:35 AM, Darren Dale wrote: >> >> I think ip.core.ipapi.get() is behaving differently in the trunk than >> it did before the refactor. Here is the new implementation: >> >> def get(): >>     """Get the most recently created InteractiveShell instance.""" >>     from IPython.core.iplib import InteractiveShell >>     insts = InteractiveShell.get_instances() >>     most_recent = insts[0] >>     for inst in insts[1:]: >>         if inst.created > most_recent.created: >>             most_recent = inst >>     return most_recent >> >> If I call get from the python prompt, instead of Ipython, I used to >> get None, but now I get an error because insts is an empty list so >> insts[0] raises an IndexError. Perhaps: >> >> def get(): >>     """Get the most recently created InteractiveShell instance.""" >>     from IPython.core.iplib import InteractiveShell >>     insts = InteractiveShell.get_instances() >>     if not insts: >>         return None >>     most_recent = insts[0] >>     for inst in insts[1:]: >>         if inst.created > most_recent.created: >>             most_recent = inst >>     return most_recent >> >> >> Darren From robert.kern at gmail.com Wed Sep 30 14:02:24 2009 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 30 Sep 2009 13:02:24 -0500 Subject: [IPython-dev] ip.core.ipapi.get In-Reply-To: References: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com> Message-ID: On 2009-09-30 12:43 PM, Darren Dale wrote: > My use case is to check in a package's __init__.py to see if it is > being imported within an ipython session, and if so to load a custom > completer. Eww. How about making a macro that imports your package into the namespace and sets up the completer? Causing side effects on import like this causes more headaches than convenience. If you are adding completers for particular types, you may want to follow the approach I took for the HasTraits completer: https://bugs.launchpad.net/ipython/+bug/416174 -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From dsdale24 at gmail.com Wed Sep 30 14:25:09 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Wed, 30 Sep 2009 14:25:09 -0400 Subject: [IPython-dev] ip.core.ipapi.get In-Reply-To: References: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com> Message-ID: On Wed, Sep 30, 2009 at 2:02 PM, Robert Kern wrote: > On 2009-09-30 12:43 PM, Darren Dale wrote: >> My use case is to check in a package's __init__.py to see if it is >> being imported within an ipython session, and if so to load a custom >> completer. > > Eww. > > How about making a macro that imports your package into the namespace and sets > up the completer? Causing side effects on import like this causes more headaches > than convenience. > > If you are adding completers for particular types, you may want to follow the > approach I took for the HasTraits completer: > >
> https://bugs.launchpad.net/ipython/+bug/416174

I'll have a look, thanks for the link. I used the original traits
completer as a template for the h5py completer I mentioned above.

Darren

From carmstr3 at illinois.edu  Wed Sep 30 14:27:45 2009
From: carmstr3 at illinois.edu (carmstr3 at illinois.edu)
Date: Wed, 30 Sep 2009 13:27:45 -0500 (CDT)
Subject: [IPython-dev] IPython and amd64 Python
Message-ID: <20090930132745.BTC36205@expms3.cites.uiuc.edu>

Hello,

I installed Python 2.6.2 using the amd64 installer. I am running
Windows 7 Professional Edition (got it early through MSDNAA :P).

Next, I installed IPython from source using: 'python setupegg.py install'

Then, I added the Python26\Scripts directory to my PATH. However,
nothing is placed into my start menu, and I am unable to start IPython
from a command line. I am given the error:

'Cannot find Python executable C:\Python26\python.exe'

I clearly have the python executable, in that exact place, and that
directory is also on my PATH. Is ipython.exe looking for the 32-bit
python? Is there a way I can solve this?

-Chandler

PS. Sorry if this is a repeat message. I subscribed and sent the
message to ipython-user last night, and again this morning, but I've
never seen the message come back from the mailing list. ipython-user
doesn't seem to be responding to me... I'm trying to send this to
ipython-dev because a) it is sort of related to that; b) I can't seem
to post to ipython-user; and c) just to make sure there isn't an issue
on my end (i.e., if this works, then it is an issue with ipython-user?)
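[Editor's aside: the guarded get() Darren proposes earlier in this thread can
be condensed with max() and a key function. A minimal standalone sketch --
FakeShell is a stand-in for InteractiveShell, introduced here only so the
logic can run outside IPython:]

```python
from operator import attrgetter


class FakeShell(object):
    """Stand-in for InteractiveShell, purely for illustration."""
    def __init__(self, created):
        self.created = created


def get_most_recent(insts):
    """Return the instance with the largest .created, or None if empty."""
    if not insts:
        return None
    return max(insts, key=attrgetter('created'))


shells = [FakeShell(1), FakeShell(3), FakeShell(2)]
print(get_most_recent(shells).created)  # -> 3
print(get_most_recent([]))              # -> None
```

[The key= form avoids the manual loop-and-compare entirely; the explicit
`if not insts` guard preserves the return-None-outside-IPython behavior
that the original ipapi.get had.]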
From ellisonbg.net at gmail.com  Wed Sep 30 17:04:35 2009
From: ellisonbg.net at gmail.com (Brian Granger)
Date: Wed, 30 Sep 2009 14:04:35 -0700
Subject: [IPython-dev] ip.core.ipapi.get
In-Reply-To:
References: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com>
Message-ID: <6ce0ac130909301404q2cc66f2ev7ab459ab0d41c053@mail.gmail.com>

Some of the APIs have changed a little bit, but here is an overview of
how you want to accomplish this:

In your package, create a module for an ipython extension:

mypackage/
    __init__.py
    # all your other modules and packages, and then...
    myextension.py

In that file, create a load_in_ipython function that takes a single
argument:

def load_in_ipython(ip):
    # ip is the same thing as get_ipython()
    # do whatever you need to with it
    # This is where you will do the types of things Robert was referring to

Then to activate your extension, just put it in your new-style config
file:

# .ipythondir/ipython_config.py
c = get_config()
c.Global.extensions = ['mypackage.myextension']

Upon starting up, IPython will import your extension and call
load_in_ipython, passing itself as the argument.

Here are the docs on the new config system:

http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/docs/source/config/overview.txt
http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/docs/source/config/ipython.txt

(No HTML up yet, sorry)

Cheers,

Brian

On Wed, Sep 30, 2009 at 10:43 AM, Darren Dale wrote:
> My use case is to check in a package's __init__.py to see if it is
> being imported within an ipython session, and if so to load a custom
> completer.
> I don't think get_ipython will work:
>
> import IPython as ip
>
> ip.InteractiveShell.get_ipython()
> ------------------------------------------------------------
> Traceback (most recent call last):
>   File "", line 1, in
> TypeError: unbound method get_ipython() must be called with
> InteractiveShell instance as first argument (got nothing instead)
>
> Am I using it incorrectly?
>
>
> On Wed, Sep 30, 2009 at 1:21 PM, Brian Granger wrote:
> > This function is essentially deprecated. This is a bug, though, that
> > you are getting an exception. But in the meantime, please call the
> > following function:
> >
> > In [1]: ip = get_ipython()
> >
> > This function is always available inside IPython and returns
> > basically the same thing as get used to.
> >
> > For the curious, the problem with the old ipapi.get is that it
> > assumed that there was always only ONE ipython and it returned that
> > one. The new get_ipython function is smart: it doesn't assume there
> > is only 1 ipython, and it always returns the right one.
> >
> > Cheers,
> >
> > Brian
> >
> > On Wed, Sep 30, 2009 at 9:35 AM, Darren Dale wrote:
> >>
> >> I think ip.core.ipapi.get() is behaving differently in the trunk than
> >> it did before the refactor. Here is the new implementation:
> >>
> >> def get():
> >>     """Get the most recently created InteractiveShell instance."""
> >>     from IPython.core.iplib import InteractiveShell
> >>     insts = InteractiveShell.get_instances()
> >>     most_recent = insts[0]
> >>     for inst in insts[1:]:
> >>         if inst.created > most_recent.created:
> >>             most_recent = inst
> >>     return most_recent
> >>
> >> If I call get from the python prompt instead of IPython, I used to
> >> get None, but now I get an error because insts is an empty list, so
> >> insts[0] raises an IndexError.
> >> Perhaps:
> >>
> >> def get():
> >>     """Get the most recently created InteractiveShell instance."""
> >>     from IPython.core.iplib import InteractiveShell
> >>     insts = InteractiveShell.get_instances()
> >>     if not insts:
> >>         return None
> >>     most_recent = insts[0]
> >>     for inst in insts[1:]:
> >>         if inst.created > most_recent.created:
> >>             most_recent = inst
> >>     return most_recent
> >>
> >>
> >> Darren

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From dsdale24 at gmail.com  Wed Sep 30 17:07:56 2009
From: dsdale24 at gmail.com (Darren Dale)
Date: Wed, 30 Sep 2009 17:07:56 -0400
Subject: [IPython-dev] ip.core.ipapi.get
In-Reply-To: <6ce0ac130909301404q2cc66f2ev7ab459ab0d41c053@mail.gmail.com>
References: <6ce0ac130909301021l7a5785abk115bd239945855f2@mail.gmail.com>
 <6ce0ac130909301404q2cc66f2ev7ab459ab0d41c053@mail.gmail.com>
Message-ID:

Thanks Brian!

On Wed, Sep 30, 2009 at 5:04 PM, Brian Granger wrote:
> Some of the APIs have changed a little bit, but here is an overview of
> how you want to accomplish this:
>
> In your package, create a module for an ipython extension:
>
> mypackage/
>     __init__.py
>     # all your other modules and packages, and then...
>     myextension.py
>
> In that file, create a load_in_ipython function that takes a single
> argument:
>
> def load_in_ipython(ip):
>     # ip is the same thing as get_ipython()
>     # do whatever you need to with it
>     # This is where you will do the types of things Robert was referring to
>
> Then to activate your extension, just put it in your new-style config
> file:
>
> # .ipythondir/ipython_config.py
> c = get_config()
> c.Global.extensions = ['mypackage.myextension']
>
> Upon starting up, IPython will import your extension and call
> load_in_ipython, passing itself as the argument.
>
> Here are the docs on the new config system:
>
> http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/docs/source/config/overview.txt
> http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/docs/source/config/ipython.txt
>
> (No HTML up yet, sorry)
>
> Cheers,
>
> Brian
>
> On Wed, Sep 30, 2009 at 10:43 AM, Darren Dale wrote:
>>
>> My use case is to check in a package's __init__.py to see if it is
>> being imported within an ipython session, and if so to load a custom
>> completer. I don't think get_ipython will work:
>>
>> import IPython as ip
>>
>> ip.InteractiveShell.get_ipython()
>> ------------------------------------------------------------
>> Traceback (most recent call last):
>>   File "", line 1, in
>> TypeError: unbound method get_ipython() must be called with
>> InteractiveShell instance as first argument (got nothing instead)
>>
>> Am I using it incorrectly?
>>
>>
>> On Wed, Sep 30, 2009 at 1:21 PM, Brian Granger wrote:
>> > This function is essentially deprecated. This is a bug, though, that
>> > you are getting an exception. But in the meantime, please call the
>> > following function:
>> >
>> > In [1]: ip = get_ipython()
>> >
>> > This function is always available inside IPython and returns
>> > basically the same thing as get used to.
>> >
>> > For the curious, the problem with the old ipapi.get is that it
>> > assumed that there was always only ONE ipython and it returned that
>> > one. The new get_ipython function is smart: it doesn't assume there
>> > is only 1 ipython, and it always returns the right one.
>> >
>> > Cheers,
>> >
>> > Brian
>> >
>> > On Wed, Sep 30, 2009 at 9:35 AM, Darren Dale wrote:
>> >>
>> >> I think ip.core.ipapi.get() is behaving differently in the trunk than
>> >> it did before the refactor. Here is the new implementation:
>> >>
>> >> def get():
>> >>     """Get the most recently created InteractiveShell instance."""
>> >>     from IPython.core.iplib import InteractiveShell
>> >>     insts = InteractiveShell.get_instances()
>> >>     most_recent = insts[0]
>> >>     for inst in insts[1:]:
>> >>         if inst.created > most_recent.created:
>> >>             most_recent = inst
>> >>     return most_recent
>> >>
>> >> If I call get from the python prompt instead of IPython, I used to
>> >> get None, but now I get an error because insts is an empty list, so
>> >> insts[0] raises an IndexError. Perhaps:
>> >>
>> >> def get():
>> >>     """Get the most recently created InteractiveShell instance."""
>> >>     from IPython.core.iplib import InteractiveShell
>> >>     insts = InteractiveShell.get_instances()
>> >>     if not insts:
>> >>         return None
>> >>     most_recent = insts[0]
>> >>     for inst in insts[1:]:
>> >>         if inst.created > most_recent.created:
>> >>             most_recent = inst
>> >>     return most_recent
>> >>
>> >>
>> >> Darren

--
"In our description of nature, the purpose is not to disclose the real
essence of the phenomena but only to track down, so far as it is
possible, relations between the manifold aspects of our experience"
  - Niels Bohr

"It is a bad habit of physicists to take their most successful
abstractions to be real properties of our world."
  - N. David Mermin

"Once we have granted that any physical theory is essentially only a
model for the world of experience, we must renounce all hope of finding
anything like the correct theory ... simply because the totality of
experience is never accessible to us."
  - Hugh Everett III
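[Editor's postscript: Brian's extension recipe and Robert's completer
suggestion combine into a small module. Below is a sketch of what a
mypackage/myextension.py might look like; StubShell, h5_completer, and the
re_key argument are illustrative assumptions introduced so the wiring can
be exercised outside a live IPython session -- they are not verbatim
IPython API, so check the hook signature against the docs Brian links:]

```python
class StubShell(object):
    """Stand-in for the shell object IPython passes to load_in_ipython.
    A real session would supply the InteractiveShell itself."""
    def __init__(self):
        self.hooks = {}

    def set_hook(self, name, func, **kw):
        # Record the registration instead of installing a real hook.
        self.hooks[name] = (func, kw)


def h5_completer(shell, event):
    """Hypothetical completer: suggest dataset names after an 'h5ls'
    command. Real code would inspect event for the partial text typed."""
    return ['data/raw', 'data/processed']


def load_in_ipython(ip):
    """Entry point IPython calls when loading 'mypackage.myextension'.
    Registering the completer here, rather than in __init__.py, avoids
    the import-time side effects Robert warned about."""
    ip.set_hook('complete_command', h5_completer, re_key='h5ls')


# Exercise the wiring with the stub in place of a real shell:
ip = StubShell()
load_in_ipython(ip)
print('complete_command' in ip.hooks)  # -> True
```

[With c.Global.extensions = ['mypackage.myextension'] in the config file,
IPython would call load_in_ipython with the real shell instead of the stub.]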