From ncoghlan at gmail.com Sun May 1 14:24:57 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun May 1 14:25:07 2005 Subject: [Python-Dev] Keyword for block statements In-Reply-To: References: <5.1.1.6.0.20050429213620.0322cec0@mail.telecommunity.com> Message-ID: <4274CA99.6010006@gmail.com> Ka-Ping Yee wrote: > The programmer who writes the function used to introduce a block > can hardly be relied upon to explain the language semantics. We > don't expect the docstring of every class to repeat an explanation > of Python classes, for example. The language reference manual is > for that; it's a different level of documentation. Would 'suite' work as the keyword? Calling these things 'suite' statements would match the Python grammar, give an obvious visual indicator through the use of a keyword, reduce any confusion resulting from the differences between Python suites and Ruby blocks (since the names would now be different), and avoid confusion due to the multiple meanings of the word 'block'. And really, what PEP 340 creates is the ability to have user-defined suites to complement the standard control structures. Anyway, here's the examples from the PEP using 'suite' as the keyword: suite synchronized(myLock): # Code here executes with myLock held. The lock is # guaranteed to be released when the block is left (even # if by an uncaught exception). suite opening("/etc/passwd") as f: for line in f: print line.rstrip() suite transactional(db): # Perform database operation inside transaction suite auto_retry(3, IOError): f = urllib.urlopen("http://python.org/peps/pep-0340.html") print f.read() suite synchronized_opening("/etc/passwd", myLock) as f: for line in f: print line.rstrip() Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at iinet.net.au Sun May 1 15:02:50 2005 From: ncoghlan at iinet.net.au (Nick Coghlan) Date: Sun May 1 15:02:55 2005 Subject: [Python-Dev] PEP 340: Else clause for block statements Message-ID: <4274D37A.4020007@iinet.net.au> As yet, I don't have a particularly firm opinion on whether or not block statements should support an 'else:' clause. And there are obviously a great many other questions to be answered about how block statements might work that are more important than this one. Still, I've been tinkering with some ideas for how to approach this, and thought I'd write them up for everyone else's consideration. Option 0: No else clause allowed. Figured I should mention this, since it is Guido's last reported inclination, and my total lack of use cases for the other options below suggests this is the best idea for an initial implementation. Option 1: mimic try, for, while semantics An 'else' clause on a block statement behaves like the else clause on for and while loops, and on try/except statements - the clause is executed only if the managed suite completes 'normally' (i.e. it is not terminated early due to an exception, a break statement or a return statement) Option 2: mimic if semantics An 'else' clause on a block statement behaves vaguely like the else clause on an if statement - the clause is executed only if the first suite is never entered, but no exception occurs (i.e. StopIteration is raised by the first call to next). Option 3: iterator-controlled semantics The iterator is given the ability to control whether or not the else clause is executed (e.g. 
via an attribute of StopIteration), probably using option 1 above as the default behaviour. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From aahz at pythoncraft.com Sun May 1 15:43:34 2005 From: aahz at pythoncraft.com (Aahz) Date: Sun May 1 15:43:37 2005 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <4274D37A.4020007@iinet.net.au> References: <4274D37A.4020007@iinet.net.au> Message-ID: <20050501134334.GA24100@panix.com> On Sun, May 01, 2005, Nick Coghlan wrote: > > Option 0: > No else clause allowed. Figured I should mention this, since it is > Guido's last reported inclination, and my total lack of use cases for the > other options below suggests this is the best idea for an initial > implementation. +1 > Option 1: mimic try, for, while semantics > An 'else' clause on a block statement behaves like the else clause on > for and while loops, and on try/except statements - the clause is executed > only if the managed suite completes 'normally' (i.e. it is not terminated > early due to an exception, a break statement or a return statement) +0 > Option 2: mimic if semantics > An 'else' clause on a block statement behaves vaguely like the else > clause on an if statement - the clause is executed only if the first suite > is never entered, but no exception occurs (i.e. StopIteration is raised by > the first call to next). -0 > Option 3: iterator-controlled semantics > The iterator is given the ability to control whether or not the else > clause is executed (e.g. via an attribute of StopIteration), probably using > option 1 above as the default behaviour. -1 Did you deliberately sort the options this way? ;-) I'm mainly responding to deliver my vote against option 3; I don't care much about the other possibilities. -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ "It's 106 miles to Chicago. We have a full tank of gas, a half-pack of cigarettes, it's dark, and we're wearing sunglasses." "Hit it." From gvanrossum at gmail.com Mon May 2 02:25:47 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Sun, 1 May 2005 17:25:47 -0700 Subject: [Python-Dev] Keyword for block statements In-Reply-To: <4274CA99.6010006@gmail.com> References: <5.1.1.6.0.20050429213620.0322cec0@mail.telecommunity.com> <4274CA99.6010006@gmail.com> Message-ID: [Nick Coghlan] > Would 'suite' work as the keyword? > > Calling these things 'suite' statements would match the Python grammar, Actually that's an argument *against* -- too confusing to have two things we call suite. > give an > obvious visual indicator through the use of a keyword, reduce any confusion > resulting from the differences between Python suites and Ruby blocks (since the > names would now be different), There's no need for that; they are close enough most of the time any way. > and avoid confusion due to the multiple meanings > of the word 'block'. Actually, in Python that's always called a suite, not a block. (Though the reference manual defines "code block" as a compilation unit.) > And really, what PEP 340 creates is the ability to have user-defined suites to > complement the standard control structures. Give that suite and block are so close in "intuitive" meaning, if there were no convincing argument for either, I still like "block" much better -- perhaps because suite is the technical term used all over the grammar. 
-- --Guido van Rossum (home page: http://www.python.org/~guido/) From gvanrossum at gmail.com Mon May 2 02:44:16 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Sun, 1 May 2005 17:44:16 -0700 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <4274D37A.4020007@iinet.net.au> References: <4274D37A.4020007@iinet.net.au> Message-ID: [Nick Coghlan] > As yet, I don't have a particularly firm opinion on whether or not block > statements should support an 'else:' clause. And there are obviously a great > many other questions to be answered about how block statements might work that > are more important than this one. > > Still, I've been tinkering with some ideas for how to approach this, and thought > I'd write them up for everyone else's consideration. > > Option 0: > No else clause allowed. Figured I should mention this, since it is Guido's > last reported inclination, and my total lack of use cases for the other options > below suggests this is the best idea for an initial implementation. The more I think about it the more this makes the most sense because it is the easiest to understand. > Option 1: mimic try, for, while semantics > An 'else' clause on a block statement behaves like the else clause on for > and while loops, and on try/except statements - the clause is executed only if > the managed suite completes 'normally' (i.e. it is not terminated early due to > an exception, a break statement or a return statement) You'd have to define this very carefully. Because break is implemented by passing StopIteration to the __next__ or __exit__ method (depending on which alternative API we end up picking), and StopIteration is also how these methods signal that the loop is over, it's a little tricky. Assuming we go with __exit__ to pass an exception to the iterator/generator, we could define that the else clause is executed when the __next__ method raises StopIteration -- this would imply exhaustion of the iterator from natural causes. This has the advantage of matching the behavior of a for loop. > Option 2: mimic if semantics > An 'else' clause on a block statement behaves vaguely like the else clause on > an if statement - the clause is executed only if the first suite is never > entered, but no exception occurs (i.e. StopIteration is raised by the first call > to next). Strange because it's different from the behavior of a for loop, and the block-statement doesn't feel like an if-statement at all. But I could actually imagine a use case: when acquiring a lock with a time-out, the else-clause could be executed when the acquisition times out. block locking(myLock, timeout=30): ...code executed with lock held... else: ...code executed if lock not acquired... But I'm not convinced that this shouldn't be handled with a try/except around it all; the use case doesn't appear all that common, and it scares me that when the lock isn't aquired, this happens entirely silently when there is no else-clause. > Option 3: iterator-controlled semantics > The iterator is given the ability to control whether or not the else clause > is executed (e.g. via an attribute of StopIteration), probably using option 1 > above as the default behaviour. A slightly cleaner version would be to have a separate subclass of StopIteration for this purpose. But I see serious problems with explaining when the else-clause is executed, because it's too dynamic. 
It does solve one problem with option 2 though: if there's no else-clause, and ElseIteration is raised, that could become an error rather than being ignored silently. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From greg.ewing at canterbury.ac.nz Mon May 2 05:02:03 2005 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Mon, 02 May 2005 15:02:03 +1200 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <4274D37A.4020007@iinet.net.au> References: <4274D37A.4020007@iinet.net.au> Message-ID: <4275982B.5060402@canterbury.ac.nz> Nick Coghlan wrote: > Option 1: mimic try, for, while semantics > An 'else' clause on a block statement behaves like the else clause on > for and while loops, and on try/except statements - the clause is > executed only if the managed suite completes 'normally' (i.e. it is not > terminated early due to an exception, a break statement or a return > statement) I've always thought that was a particularly unintuitive use of the word 'else', and I'm not sure I'd like it to be extended to any new statements. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing at canterbury.ac.nz +--------------------------------------+ From gvanrossum at gmail.com Mon May 2 05:42:35 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Sun, 1 May 2005 20:42:35 -0700 Subject: [Python-Dev] PEP 340 - possible new name for block-statement In-Reply-To: <5.1.1.6.0.20050429212046.032abd70@mail.telecommunity.com> References: <4271F71B.8010000@gmail.com> <20050429163854.GB14920@panix.com> <5.1.1.6.0.20050429130113.033208b0@mail.telecommunity.com> <5.1.1.6.0.20050429134751.03099cb0@mail.telecommunity.com> <5.1.1.6.0.20050429170733.031a74e0@mail.telecommunity.com> <20050429224300.GA9425@panix.com> <5.1.1.6.0.20050429212046.032abd70@mail.telecommunity.com> Message-ID: [Phillip] > By the way, I notice PEP 340 has two outstanding items with my name on > them; let me see if I can help eliminate one real quick. > > Tracebacks: it occurs to me that I may have unintentionally given the > impression that I need to pass in an arbitrary traceback, when in fact I > only need to pass in the current sys.exc_info(). I've updated the PEP (tying a couple of loose ends and making the promised change to the new API); I've decided to change the signature of __exit__() to be a triple matching the return value of sys.exc_info(), IOW the same as the "signature" of the raise-statement. There are still a few loose ends left, including the alternative API that you've proposed (which I'm not super keen on, to be sure, but which is still open for consideration). -- --Guido van Rossum (home page: http://www.python.org/~guido/) From rijishvr at rediffmail.com Mon May 2 09:46:12 2005 From: rijishvr at rediffmail.com (rijish valoorthodi rajan) Date: 2 May 2005 07:46:12 -0000 Subject: [Python-Dev] (no subject) Message-ID: <20050502074612.15892.qmail@webmail36.rediffmail.com> hello all I am a member of a team dedicated to make a client server database application and our main concern is the speed with which the system performs. we are very new to python. but after reading a lot of documents and consulting some experts we decided to work it out using PYTHON. we plan to make 2 programs one running in the client systems and one that run in server. 
can any one please help me by telling the thins that i should take care of while designing the project and what tools and what style we should adopt to make the program optimised. regards -------------- next part -------------- An HTML attachment was scrubbed... URL: http://mail.python.org/pipermail/python-dev/attachments/20050502/b690e7ba/attachment.htm From ajm at flonidan.dk Mon May 2 12:03:06 2005 From: ajm at flonidan.dk (Anders J. Munch) Date: Mon, 2 May 2005 12:03:06 +0200 Subject: [Python-Dev] PEP 340: Else clause for block statements Message-ID: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> GvR wrote: > [Nick Coghlan] > > Option 2: mimic if semantics > > An 'else' clause on a block statement behaves vaguely like the else clause on > > an if statement - the clause is executed only if the first suite is never > > entered, but no exception occurs (i.e. StopIteration is raised by the first call > > to next). > > Strange because it's different from the behavior of a for loop, and > the block-statement doesn't feel like an if-statement at all. But I > could actually imagine a use case: when acquiring a lock with a > time-out, the else-clause could be executed when the acquisition times > out. > > block locking(myLock, timeout=30): > ...code executed with lock held... > else: > ...code executed if lock not acquired... A file-closing block function has the same need, as does any block function that manages a resource, whose acquisition might fail. A surrounding try/except doesn't quite cut it; the problem is that the try-clause will then also cover the suite. Example: try: in opening('file1') as f1: ... in opening('file2') as f2: ... except IOError: print "file1 not available, I'll try again later" How do I tell try/except that I really only meant to trap opening('file1'), but opening 'file2' is not supposed to fail so I want any exception from that propagated? Better if I could write: in opening('file1') as f1: ... in opening('file2') as f2: ... else: print "file1 not available, I'll try again later" or even in opening('file1') as f1: ... in opening('file2') as f2: ... except IOError: print "file1 not available, I'll try again later" I rather like this version, because it is patently clear what should happen if there is no except-clause: The exception propagates normally. - Anders From john at hazen.net Mon May 2 12:32:53 2005 From: john at hazen.net (John Hazen) Date: Mon, 2 May 2005 03:32:53 -0700 Subject: [Python-Dev] (no subject) In-Reply-To: <20050502074612.15892.qmail@webmail36.rediffmail.com> References: <20050502074612.15892.qmail@webmail36.rediffmail.com> Message-ID: <20050502103253.GH7085@gate2.hazen.net> Hi Rijish- The python-dev list is for developers *of* python, not for people developing *with* python. I'd recommend you post this on the python-list, but I see you already have. You'll find they can be very helpful, if you show that you've done some research, and ask a specific question. A subject line always helps, too. Good luck with your project! -John * rijish valoorthodi rajan [2005-05-02 00:29]: > > hello all > I am a member of a team dedicated to make a client server database application and our main concern is the speed with which the system performs. we are very new to python. but after reading a lot of documents and consulting some experts we decided to work it out using PYTHON. we plan to make 2 programs one running in the client systems and one that run in server. 
can any one please help me by telling the thins that i should take care of while designing the project and what tools and what style we should adopt to make the program optimised. > > regards From shane at hathawaymix.org Mon May 2 15:46:31 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Mon, 02 May 2005 07:46:31 -0600 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> Message-ID: <42762F37.9000605@hathawaymix.org> Anders J. Munch wrote: > in opening('file1') as f1: > ... > in opening('file2') as f2: > ... > except IOError: > print "file1 not available, I'll try again later" > > I rather like this version, because it is patently clear what should > happen if there is no except-clause: The exception propagates > normally. My eyes would expect the exception handler to also catch IOErrors generated inside the block statement body. My eyes would be deceiving me, of course, but Python isn't currently so subtle and it probably shouldn't be. You could also do this with a suitable iterator. def opening_or_skipping(fn): try: f = open(fn) except IOError: print "file1 not available, I'll try again later" else: try: yield f finally: f.close() Shane From skip at pobox.com Mon May 2 15:46:31 2005 From: skip at pobox.com (Skip Montanaro) Date: Mon, 2 May 2005 08:46:31 -0500 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> Message-ID: <17014.12087.543854.75179@montanaro.dyndns.org> Anders> How do I tell try/except that I really only meant to trap Anders> opening('file1'), but opening 'file2' is not supposed to fail so Anders> I want any exception from that propagated? Better if I could Anders> write: Anders> in opening('file1') as f1: Anders> ... Anders> in opening('file2') as f2: Anders> ... Anders> else: Anders> print "file1 not available, I'll try again later" -1. This has the opposite meaning of the else clause in while/for statements. Anders> or even Anders> in opening('file1') as f1: Anders> ... Anders> in opening('file2') as f2: Anders> ... Anders> except IOError: Anders> print "file1 not available, I'll try again later" Not keen on this either, maybe just because the "in" clause isn't a "try" clause. Skip From exarkun at divmod.com Mon May 2 16:02:49 2005 From: exarkun at divmod.com (Jp Calderone) Date: Mon, 02 May 2005 14:02:49 GMT Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <42762F37.9000605@hathawaymix.org> Message-ID: <20050502140249.15422.1339448410.divmod.quotient.13812@ohm> On Mon, 02 May 2005 07:46:31 -0600, Shane Hathaway wrote: >Anders J. Munch wrote: >> in opening('file1') as f1: >> ... >> in opening('file2') as f2: >> ... >> except IOError: >> print "file1 not available, I'll try again later" >> >> I rather like this version, because it is patently clear what should >> happen if there is no except-clause: The exception propagates >> normally. > >My eyes would expect the exception handler to also catch IOErrors >generated inside the block statement body. My eyes would be deceiving >me, of course, but Python isn't currently so subtle and it probably >shouldn't be. > >You could also do this with a suitable iterator. 
> > def opening_or_skipping(fn): > try: > f = open(fn) > except IOError: > print "file1 not available, I'll try again later" > else: > try: > yield f > finally: > f.close() I don't think this version is really of much use. It requires that you implement a different iterator for each kind of error handling you want to do. Avoiding multiple different implementations is supposed to be one of the main selling points of this feature. Jp From lcaamano at gmail.com Mon May 2 16:18:07 2005 From: lcaamano at gmail.com (Luis P Caamano) Date: Mon, 2 May 2005 10:18:07 -0400 Subject: [Python-Dev] PEP 340 - possible new name for block-statement In-Reply-To: <20050430005315.C9A7D1E400B@bag.python.org> References: <20050430005315.C9A7D1E400B@bag.python.org> Message-ID: On 4/29/05, Reinhold Birkenfeld wrote: > Date: Sat, 30 Apr 2005 00:53:12 +0200 > From: Reinhold Birkenfeld > Subject: [Python-Dev] Re: PEP 340 - possible new name for > block-statement > To: python-dev at python.org > Message-ID: > Content-Type: text/plain; charset=ISO-8859-1 > > > FWIW, the first association when seeing > > block something: > > is with the verb "to block", and not with the noun, which is most displeasing. > > Reinhold > Which is the reason I thought of "bracket" instead. Although it's also a noun and a verb, the verb doesn't imply "stop" like block does. However, because one of the main features of python is that it's easy to read, adding "with" to it makes it very clear as in "bracket_with". Ugly at first, but that's just a matter of familiarity. You never notice that your ugly friend is really that ugly anymore, right? bracket_with foo(arg1, arg2) as f: BLOCK seems very explicit to me. However, I do prefer no keyword at all and that would be my first choice, but if we have to choose a keyword, "block" has that "stop" connotation that will certainly confuse more than a few but I doubt people would go with "bracket_with." I certainly hope no-keyword is possible. -- Luis P Caamano Atlanta, GA USA From gvanrossum at gmail.com Mon May 2 16:57:19 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Mon, 2 May 2005 07:57:19 -0700 Subject: [Python-Dev] PEP 340: Else clause for block statements In-Reply-To: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> References: <9B1795C95533CA46A83BA1EAD4B01030031EE8@flonidanmail.flonidan.net> Message-ID: [Guido, presenting a use case] > > block locking(myLock, timeout=30): > > ...code executed with lock held... > > else: > > ...code executed if lock not acquired... [Anders Munch] > A file-closing block function has the same need, as does any block > function that manages a resource, whose acquisition might fail. > > A surrounding try/except doesn't quite cut it; the problem is that the > try-clause will then also cover the suite. > > Example: > > try: > in opening('file1') as f1: > ... > in opening('file2') as f2: > ... > except IOError: > print "file1 not available, I'll try again later" I thought of this and several other solutions overnight and didn't like any. Finally I realized that this use case is better covered by letting the generator return an error value: def opening_w_err(filename): try: f = open(filename) except IOError, err: yield None, err else: try: yield f, None finally: f.close() The user can then write: block opening_w_err(filename) as f, err: if f: ...code using f... else: ...error handling code using err... 
Besides, in many cases it's totally acceptable to put a try/except block around the entire block-statement, if the exception it catches is specific enough (like IOError). For example: try: block opening(filename) as f: ...code using f... except IOError, err: ...error handling code using err... So I'm more than ever in favor of keeping the block-statement simple, i.e. without any additional clauses. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From walter at livinglogic.de Mon May 2 18:06:58 2005 From: walter at livinglogic.de (Walter Dörwald) Date: Mon, 02 May 2005 18:06:58 +0200 Subject: [Python-Dev] Generating nested data structures with blocks Message-ID: <42765022.1080900@livinglogic.de> Reading PEP 340, it seems to me that blocks could be used for generating nested data structures: def blist(list): def enter(parent=None): if parent: parent.append(self) yield self x = blist() block x.enter() as x: x.append(1) block blist().enter(x) as x: x.append(2) x.append(3) print x this should print [1, [2], 3] For this to work, the scope of the block variable has to end with the end of the block. Currently the PEP leaves this unspecified. Bye, Walter Dörwald From walter at livinglogic.de Mon May 2 18:40:06 2005 From: walter at livinglogic.de (Walter Dörwald) Date: Mon, 02 May 2005 18:40:06 +0200 Subject: [Python-Dev] Generating nested data structures with blocks In-Reply-To: <42765022.1080900@livinglogic.de> References: <42765022.1080900@livinglogic.de> Message-ID: <427657E6.9080007@livinglogic.de> Walter Dörwald wrote: > [...] > def blist(list): > def enter(parent=None): Of course this was meant to be: class blist(list): def enter(self, parent=None): Bye, Walter Dörwald From gvanrossum at gmail.com Tue May 3 02:55:56 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Mon, 2 May 2005 17:55:56 -0700 Subject: [Python-Dev] PEP 340 -- loose ends Message-ID: These are the loose ends on the PEP (apart from filling in some missing sections): 1. Decide on a keyword to use, if any. 2. Decide on the else clause. 3. Decide on Phillip Eby's proposal to have a different API for blocks, so you would have to use a @decorator to turn a generator into something usable in a block. Here are my strawman decisions, to the extent that I'm clear on them: 1. I still can't decide on keyword vs. no keyword, but if we're going to have a keyword, I haven't seen a better proposal than block. So it's either block or nothing. I'll sleep on this. Feel free to start an all-out flame war on this in c.l.py. ;-) 2. No else clause; the use case is really weak and there are too many possible semantics. It's not clear whether to generalize from for/else, or if/else, or what else. 3. I'm leaning against Phillip's proposal; IMO it adds more complexity for very little benefit. Unless there's more discussion on any of these, I'll probably finish up the PEP and post it to c.l.py in a few days. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From pje at telecommunity.com Tue May 3 03:39:19 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Mon, 02 May 2005 21:39:19 -0400 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: Message-ID: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> At 05:55 PM 5/2/05 -0700, Guido van Rossum wrote: >3. I'm leaning against Phillip's proposal; IMO it adds more complexity >for very little benefit. 
Little benefit, I'll agree with, even though there are EIBTI and TOOWTDI benefits as well as Errors Should Never Pass Silently. But the only added implementation complexity is the decorator -- balanced against the removal of the need for a 'next()' builtin. I also believe that the approach actually *reduces* pedagogical complexity by not allowing any blurring between the concept of an iterator and the concept of a block template. Since I'm not sure if anybody besides you is aware of what I proposed, I'll attempt to recap here, and then step to allow discussion. If there's no community support, I'll let it die a natural death, because it's ultimately a "purity" question rather than a practical one, though I think that other people who teach Python programming should weigh in on this. Specifically, I propose that PEP 340 *not* allow the use of "normal" iterators. Instead, the __next__ and __exit__ methods would be an unrelated protocol. This would eliminate the need for a 'next()' builtin, and avoid any confusion between today's iterators and a template function for use with blocks. Because today's generators were also not written with blocks in mind, it would also be necessary to use a @decorator to declare that a generator is in fact a block template. Possibly something like: @blocktemplate def retry(times): for i in xrange(times): try: yield except StopIteration: return except: continue else: return raise My argument is that this is both Explicit (i.e., better than implicit) and One Obvious Way (because using existing iterators just Another Way to do a "for" loop). It also doesn't allow Errors (using an iterator with no special semantics) to Pass Silently. Of course, since Practicality Beats Purity, I could give this all up. But I don't think the Implementation is Hard to Explain, as it should be just as easy as Guido's proposal. Instead of a 'next()' builtin, one would instead implement a 'blocktemplate' decorator (or whatever it's to be called). The same __next__/__exit__/next methods have to be implemented as in Guido's proposal. Really, the only thing that changes is that you get a TypeError when a template function returns an iterator instead of a block template, and you have to use the decorator on your generators to explicitly label them safe for use with blocks. (Hand-crafted block templates still just implement __next__ and __exit__, in the same way as they would under Guido's proposal, so no real change there.) Guido may also have other reasons to take a different direction that he may not have expressed; e.g. maybe in Py3K there'll be no "for", just "iter(x) as y:"? Or...? I don't claim to have any special smarts about this, but other people (including Guido) have previously expressed reservations about the near-blending of iteration and block control that PEP 340 allows. So, I've thrown out this proposal as an attempt to address those reservations. YMMV. From pje at telecommunity.com Tue May 3 03:43:02 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Mon, 02 May 2005 21:43:02 -0400 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> References: Message-ID: <5.1.1.6.0.20050502214241.02c3a890@mail.telecommunity.com> At 09:39 PM 5/2/05 -0400, Phillip J. Eby wrote: >attempt to recap here, and then step to allow discussion. If there's no Argh. That was supposed to be, "step aside to allow discussion". 
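(A rough sketch of how such a 'blocktemplate' decorator might be implemented on top of today's generators, assuming the __next__/__exit__ protocol from the PEP; the wrapper class and its details are purely illustrative and not part of Phillip's proposal:)

    class _BlockTemplate(object):
        # Exposes only the block protocol. There is no __iter__, so
        # passing the object to iter() or a for loop raises TypeError.
        def __init__(self, gen):
            self._gen = gen
        def __next__(self, arg=None):
            # A pre-PEP-340 generator cannot accept a value, so a
            # non-None argument is simply ignored in this sketch.
            return self._gen.next()
        def __exit__(self, type=None, value=None, traceback=None):
            # Without interpreter support the exception cannot be
            # re-raised inside the generator; re-raise it in the caller.
            if type is not None:
                raise type, value, traceback
            raise StopIteration

    def blocktemplate(genfunc):
        def wrapper(*args, **kwds):
            return _BlockTemplate(genfunc(*args, **kwds))
        return wrapper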
From tdelaney at avaya.com Tue May 3 04:14:36 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Tue, 3 May 2005 12:14:36 +1000 Subject: [Python-Dev] PEP 340 -- loose ends Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com> Phillip J. Eby wrote: > Specifically, I propose that PEP 340 *not* allow the use of "normal" > iterators. Instead, the __next__ and __exit__ methods would be an > unrelated protocol. This would eliminate the need for a 'next()' > builtin, > and avoid any confusion between today's iterators and a template > function > for use with blocks. PEP 340 does not address "normal" iterators very well, but a properly-constructed iterator will behave correctly. The PEP though is very generator-focussed. The issues I see for "normal" iterators (and that need to be addressed/stated in the PEP) are: 1. No automatic handling of parameters passed to __next__ and __exit__. In a generator, these will raise at the yield-statement or -expression. A "normal" iterator will have to take care of this manually. This could be an argument to only allow generator-iterators to be used with PEP 340 semantics (i.e. continue , block), but I don't think it's a very compelling one. Although perhaps the initial implementation could be restricted to generator-iterators. So if a for-loop used `continue ` it would have a check (at the start of the for loop) that the iterator is a generator-iterator. Likewise, a block-statement would always include this check. As another option, it might be worthwhile creating a base iterator type with "correct" semantics. Tim Delaney From gvanrossum at gmail.com Tue May 3 04:33:08 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Mon, 2 May 2005 19:33:08 -0700 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com> References: <338366A6D2E2CA4C9DAEAE652E12A1DE721275@au3010avexu1.global.avaya.com> Message-ID: [Delaney, Timothy] > PEP 340 does not address "normal" iterators very well, but a > properly-constructed iterator will behave correctly. This is by design. > The PEP though is very generator-focussed. Disagree. The PEP describes most everything (e.g. the block statement semantics) in terms of iterators, and then describes how the new APIs behave for generators. > The issues I see for "normal" > iterators (and that need to be addressed/stated in the PEP) are: > > 1. No automatic handling of parameters passed to __next__ and __exit__. > In a generator, these will raise at the yield-statement or -expression. > A "normal" iterator will have to take care of this manually. Not sure what you mean by this. If __next__() is defined, it is passed the parameter; if only next() is defined, a parameter (except None) is an error. That seems exactly right. Also, if __exit__() isn't defined, the exception is raised, which is a very sensible default behavior (and also what will happen to a generator that doesn't catch the exception). > This could be an argument to only allow generator-iterators to be used > with PEP 340 semantics (i.e. continue , block), but I don't think > it's a very compelling one. Neither do I. :-) > Although perhaps the initial implementation could be restricted to > generator-iterators. So if a for-loop used `continue ` it would > have a check (at the start of the for loop) that the iterator is a > generator-iterator. Likewise, a block-statement would always include > this check. 
But what would this buy you except an arbitrary restriction? > As another option, it might be worthwhile creating a base iterator type > with "correct" semantics. Well, what would the "correct" semantics be? What would passing a parameter to a list iterator's __next__() method mean? -- --Guido van Rossum (home page: http://www.python.org/~guido/) From tdelaney at avaya.com Tue May 3 04:53:03 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Tue, 3 May 2005 12:53:03 +1000 Subject: [Python-Dev] PEP 340 -- loose ends Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com> Guido van Rossum wrote: > [Delaney, Timothy] >> PEP 340 does not address "normal" iterators very well, but a >> properly-constructed iterator will behave correctly. > > This is by design. Yep - I agree. >> The PEP though is very generator-focussed. > > Disagree. The PEP describes most everything (e.g. the block statement > semantics) in terms of iterators, and then describes how the new APIs > behave for generators. Again, agree. What I meant is that there are no examples of how to actually implement the correct semantics for a normal iterator. Doing it right is non-trivial, especially with the __next__ and __exit__ interaction (see below). >> The issues I see for "normal" >> iterators (and that need to be addressed/stated in the PEP) are: >> >> 1. No automatic handling of parameters passed to __next__ and >> __exit__. In a generator, these will raise at the yield >> -statement or -expression. A "normal" iterator will have >> to take care of this manually. > > Not sure what you mean by this. If __next__() is defined, it is passed > the parameter; if only next() is defined, a parameter (except None) is > an error. That seems exactly right. Also, if __exit__() isn't defined, > the exception is raised, which is a very sensible default behavior > (and also what will happen to a generator that doesn't catch the > exception). What I meant is how the iterator is meant to handle the parameters passed to each method. PEP 340 deals with this by stating that exceptions will be raised at the next yield-statement or -expression. I think we need an example though of how this would translate to a "normal" iterator. Something along the lines of:: class iterator (object): def next (self): return self.__next__() def __next__(self, arg=None): value = None if isinstance(arg, ContinueIteration): value = arg.value elif arg is not None: raise arg if value is None: raise StopIteration return value def __exit__(self, type=None, value=None, traceback=None): if (type is None) and (value is None) and (traceback is None): type, value, traceback = sys.exc_info() if type is not None: try: raise type, value, traceback except type, exc: return self.__next__(exc) return self.__next__() >> As another option, it might be worthwhile creating a base iterator type >> with "correct" semantics. > Well, what would the "correct" semantics be? What would passing a > parameter to a list iterator's __next__() method mean? Sorry - I meant for user-defined iterators. And the correct semantics would be something like the example above I think. Except that I think most of it would need to be in a separate method (e.g. _next) for base classes to call - then things would change to be something like:: class iterator (object): ... 
def _next (self, arg): if isinstance(arg, ContinueIteration): return arg.value elif arg is not None: raise arg def __next__(self, arg=None): value = self._next(arg) if value is None: raise StopIteration return value ... Finally, I think there is another loose end that hasn't been addressed:: When __next__() is called with an argument that is not None, the yield-expression that it resumes will return the value attribute of the argument. If it resumes a yield-statement, the value is ignored (or should this be considered an error?). When the *initial* call to __next__() receives an argument that is not None, the generator's execution is started normally; the argument's value attribute is ignored (or should this be considered an error?). When __next__() is called without an argument or with None as argument, and a yield-expression is resumed, the yield-expression returns None. My opinion is that each of these should be an error. Tim Delaney From gvanrossum at gmail.com Tue May 3 06:05:38 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Mon, 2 May 2005 21:05:38 -0700 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com> References: <338366A6D2E2CA4C9DAEAE652E12A1DE721276@au3010avexu1.global.avaya.com> Message-ID: [Delaney, Timothy] > What I meant is that there are no examples of how to > actually implement the correct semantics for a normal iterator. Doing it > right is non-trivial, especially with the __next__ and __exit__ > interaction (see below). Depends on what you mean by right. Ignoring the argument to __next__() and not implementing __exit__() seems totally "right" to me. [...] > What I meant is how the iterator is meant to handle the parameters > passed to each method. PEP 340 deals with this by stating that > exceptions will be raised at the next yield-statement or -expression. I > think we need an example though of how this would translate to a > "normal" iterator. Something along the lines of:: > > class iterator (object): > > def next (self): > return self.__next__() > > def __next__(self, arg=None): > value = None > > if isinstance(arg, ContinueIteration): Oops. Read the most recent version of the PEP again. __next__() doesn't take an exception argument, it only takes a value. Maybe this removes your concern? > value = arg.value > elif arg is not None: > raise arg > > if value is None: > raise StopIteration > > return value That's a very strange iterator; it immediately terminates unless you call __next__() with a non-None argument, then it returns the argument value. I'm having a hard time understanding what you meant to say. Also note that the very *first* call to __next__() is not supposed to have an argument. The argument (normally) only comes from "continue EXPR" and that can only be reached after the first call to __next__(). This is exactly right for generators -- the first __next__() call there "starts" the generator at the top of its function body, executing until the first yield is reached. > def __exit__(self, type=None, value=None, traceback=None): > if (type is None) and (value is None) and (traceback is None): > type, value, traceback = sys.exc_info() You shouldn't need to check for traceback is None. Also, even though the PEP suggests that you can do this, I don't see a use case for it -- the translation of a block-statement never calls __exit__() without an exception. 
> if type is not None: > try: > raise type, value, traceback > except type, exc: > return self.__next__(exc) > > return self.__next__() Ah, here we see the other misconception (caused by not reading the most recent version of the PEP). __exit__() shouldn't call __next__() -- it should just raise the exception passed in unless it has something special to do. Let me clarify all this with an example showing how you could write "synchronized()" as an iterator instead of a generator. class synchronized: def __init__(self, lock): self.lock = lock self.state = 0 def __next__(self, arg=None): # ignores arg if self.state: assert self.state == 1 self.lock.release() self.state += 1 raise StopIteration else: self.lock.acquire() self.state += 1 return None def __exit__(self, type, value=None, traceback=None): assert self.state in (0, 1, 2) if self.state == 1: self.lock.release() raise type, value, traceback > >> As another option, it might be worthwhile creating a base iterator type > >> with "correct" semantics. > > > Well, what would the "correct" semantics be? What would passing a > > parameter to a list iterator's __next__() method mean? > > Sorry - I meant for user-defined iterators. And the correct semantics > would be something like the example above I think. Except that I think > most of it would need to be in a separate method (e.g. _next) for base > classes to call - then things would change to be something like:: > > class iterator (object): > ... > > def _next (self, arg): > if isinstance(arg, ContinueIteration): > return arg.value > elif arg is not None: > raise arg > > def __next__(self, arg=None): > value = self._next(arg) > > if value is None: > raise StopIteration > > return value > > ... I think this is all based on a misunderstanding of the PEP. Also, you really don't need to implement __exit__() unless you have some cleanup to do -- the default behavior of the block translation only calls it if defined, and otherwise simply raises the exception. > Finally, I think there is another loose end that hasn't been addressed:: > > When __next__() is called with an argument that is not None, the > yield-expression that it resumes will return the value attribute > of the argument. If it resumes a yield-statement, the value is > ignored (or should this be considered an error?). When the > *initial* call to __next__() receives an argument that is not > None, the generator's execution is started normally; the > argument's value attribute is ignored (or should this be > considered an error?). When __next__() is called without an > argument or with None as argument, and a yield-expression is > resumed, the yield-expression returns None. Good catch. > My opinion is that each of these should be an error. Personally, I think not using the value passed into __next__() should not be an error; that's about the same as not using the value returned by a function you call. There are all sorts of reasons for doing that. In a very early version of Python, the result of an expression that wasn't used would be printed unless it was None (it still does this at the interactive prompt); this was universally hated. I agree that calling the initial __next__() of a generator with a non-None argument should be considered an error; this is likely caused by some kind of logic error; it can never happen when the generator is called by a block statement. I'll update the PEP to reflect this. 
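(For comparison with the hand-written synchronized class above: the generator form of the same template -- essentially the PEP's own example, relying on the PEP's proposed extension allowing a bare yield inside try/finally -- is roughly:)

    def synchronized(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

The class version simply spells out by hand the state machine the generator gets for free: the first __next__() call runs up to the yield (acquiring the lock), and the second __next__() call, or an __exit__() call, runs the finally clause (releasing it).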
-- --Guido van Rossum (home page: http://www.python.org/~guido/) From kbk at shore.net Tue May 3 06:25:58 2005 From: kbk at shore.net (Kurt B. Kaiser) Date: Tue, 3 May 2005 00:25:58 -0400 (EDT) Subject: [Python-Dev] Weekly Python Patch/Bug Summary Message-ID: <200505030426.j434PwB1027617@bayview.thirdcreek.com> Patch / Bug Summary ___________________ Patches : 322 open ( +6) / 2832 closed ( +1) / 3154 total ( +7) Bugs : 920 open (+12) / 4952 closed (+11) / 5872 total (+23) RFE : 186 open ( +8) / 156 closed ( +3) / 342 total (+11) New / Reopened Patches ______________________ Info Associated with Merge to AST (2005-01-07) http://python.org/sf/1097671 reopened by kbk Minimal cleanup of run.py (2005-04-26) http://python.org/sf/1190163 opened by Michiel de Hoon socketmodule.c's recvfrom on OSF/1 4.0 (2005-04-27) http://python.org/sf/1191065 opened by Marcel Martin Simplify logic in random.py (2005-04-28) CLOSED http://python.org/sf/1191489 opened by Raymond Hettinger wrong offsets in bpprint() (2005-04-28) http://python.org/sf/1191700 opened by Pechkinzzz about shift key (2005-04-28) http://python.org/sf/1191726 opened by yang debugger ``condition`` and ``ignore`` exception handling (2005-04-29) http://python.org/sf/1192590 opened by Jeremy Jones Use GCC4 ELF symbol visibility (2005-04-30) http://python.org/sf/1192789 opened by James Henstridge Patches Closed ______________ type conversion methods and subclasses (2005-01-25) http://python.org/sf/1109424 closed by bcannon Don't assume all exceptions are SyntaxError's (2005-04-24) http://python.org/sf/1189210 closed by bcannon Automatically build fpectl module from setup.py (2005-04-19) http://python.org/sf/1185529 closed by mwh Simplify logic in random.py (2005-04-28) http://python.org/sf/1191489 closed by rhettinger New / Reopened Bugs ___________________ [AST] assignment to None allowed (2005-04-25) CLOSED http://python.org/sf/1190010 opened by Brett Cannon [AST] distinct code objects not created (2005-04-25) http://python.org/sf/1190011 opened by Brett Cannon ``from sys import stdin,`` doesn't raise a SyntaxError (2005-04-25) http://python.org/sf/1190012 opened by Brett Cannon 3.29 site is confusing re site-packages on Windows (2005-04-26) http://python.org/sf/1190204 opened by Kent Johnson 6.9 First sentence is confusing (2005-04-26) CLOSED http://python.org/sf/1190451 opened by Nicolas Grilly os.waitpid docs don't specify return value for WNOHANG (2005-04-26) http://python.org/sf/1190563 opened by jls SimpleHTTPServer sends wrong c-length and locks up client (2005-04-26) http://python.org/sf/1190580 opened by Alexander Schremmer calendar._firstweekday is too hard-wired (2005-04-26) http://python.org/sf/1190596 opened by Tres Seaver dir() docs show incorrect output (2005-04-26) CLOSED http://python.org/sf/1190599 opened by Martin Chase bz2 RuntimeError when decompressing file (2005-04-27) http://python.org/sf/1191043 opened by Chris AtLee Warning ``error`` filter action is ignored. 
(2005-04-27) CLOSED http://python.org/sf/1191104 opened by Ivan Vilata i Balaguer [AST] Failing tests (2005-04-27) http://python.org/sf/1191458 opened by Brett Cannon 'clear -1' in pdb (2005-04-29) http://python.org/sf/1192315 opened by Pechkinzzz doctest's ELLIPSIS and multiline statements (2005-04-29) CLOSED http://python.org/sf/1192554 opened by S?bastien Boisg?rault docstring error (2005-04-29) CLOSED http://python.org/sf/1192777 opened by Christopher Smith Notation (2005-04-30) http://python.org/sf/1193001 opened by Mythril Python and Turkish Locale (2005-04-30) http://python.org/sf/1193061 opened by S.?ağlar Onur Embedded python thread crashes (2005-04-30) http://python.org/sf/1193099 opened by ugodiggi Strange os.path.exists() results with invalid chars (2005-04-30) http://python.org/sf/1193180 opened by Daniele Varrazzo MACOSX_DEPLOYMENT_TARGET checked incorrectly (2005-04-30) http://python.org/sf/1193190 opened by Bob Ippolito os.path.expanduser documentation wrt. empty $HOME (2005-05-02) http://python.org/sf/1193849 opened by Wummel calendar.weekheader not found in __all__ (2005-05-03) http://python.org/sf/1193890 opened by George Yoshida Weakref types documentation bugs (2005-05-02) http://python.org/sf/1193966 opened by Barry A. Warsaw bz2.BZ2File doesn't handle modes correctly (2005-05-02) http://python.org/sf/1194181 opened by Bob Ippolito Error in section 4.2 of Python Tutorial (2005-05-03) http://python.org/sf/1194209 opened by Andrina Kelly Bugs Closed ___________ [ast branch] fatal error when compiling test_bool.py (2005-03-19) http://python.org/sf/1166714 closed by bcannon [AST] assert failure on ``eval("u'\Ufffffffe'")`` (2005-04-19) http://python.org/sf/1186345 closed by bcannon "Atuple containing default argument values ..." (2005-04-25) http://python.org/sf/1189819 closed by rhettinger [AST] assignment to None allowed (2005-04-25) http://python.org/sf/1190010 closed by bcannon 6.9 First sentence is confusing (2005-04-26) http://python.org/sf/1190451 closed by rhettinger Python 2.4 Not Recognized by Any Programs (2005-04-23) http://python.org/sf/1188637 closed by tjreedy Variable.__init__ uses self.set(), blocking specialization (2005-04-07) http://python.org/sf/1178872 closed by tjreedy Compiler generates relative filenames (2001-04-11) http://python.org/sf/415492 closed by tjreedy smtplib crashes Windows Kernal. (2003-06-09) http://python.org/sf/751612 closed by tjreedy AssertionError from urllib.retrieve / httplib (2003-06-15) http://python.org/sf/755080 closed by tjreedy dir() docs show incorrect output (2005-04-26) http://python.org/sf/1190599 closed by mwh Warning ``error`` filter action is ignored. (2005-04-27) http://python.org/sf/1191104 closed by vsajip doctest's ELLIPSIS and multiline statements (2005-04-29) http://python.org/sf/1192554 closed by boisgerault docstring error (2005-04-29) http://python.org/sf/1192777 closed by bcannon New / Reopened RFE __________________ The array module and the buffer interface (2005-04-25) http://python.org/sf/1190033 opened by Josiah Carlson logging module '.' 
behavior (2005-04-26) http://python.org/sf/1190689 opened by Christopher Dunn Add 'before' and 'after' methods to Strings (2005-04-26) CLOSED http://python.org/sf/1190701 opened by Christopher Dunn cStringIO has reset(), but StringIO does not (2005-04-27) CLOSED http://python.org/sf/1191420 opened by Christopher Dunn logging module root logger name (2005-04-26) http://python.org/sf/1190689 reopened by cxdunn slice indices different than integers (2005-04-28) CLOSED http://python.org/sf/1191697 opened by Sebastien de Menten make slices pickable (2005-04-28) http://python.org/sf/1191699 opened by Sebastien de Menten asynchronous Subprocess (2005-04-28) http://python.org/sf/1191964 opened by Josiah Carlson "replace" function should accept lists. (2005-04-17) http://python.org/sf/1184678 reopened by poromenos 'str'.translate(None) => identity translation (2005-04-30) http://python.org/sf/1193128 opened by Bengt Richter add server.shutdown() method and daemon arg to SocketServer (2005-05-02) http://python.org/sf/1193577 opened by paul rubin Expat Parser to supply document locator in incremental parse (2005-05-02) http://python.org/sf/1193610 opened by GaryD RFE Closed __________ Add 'before' and 'after' methods to Strings (2005-04-26) http://python.org/sf/1190701 closed by rhettinger cStringIO has reset(), but StringIO does not (2005-04-27) http://python.org/sf/1191420 closed by rhettinger logging module root logger name (2005-04-27) http://python.org/sf/1190689 closed by vsajip logging module documentation (2003-01-16) http://python.org/sf/668905 closed by vsajip slice indices different than integers (2005-04-28) http://python.org/sf/1191697 closed by mwh From tdelaney at avaya.com Tue May 3 06:59:42 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Tue, 3 May 2005 14:59:42 +1000 Subject: [Python-Dev] PEP 340 -- loose ends Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721277@au3010avexu1.global.avaya.com> Guido van Rossum wrote: > Oops. Read the most recent version of the PEP again. __next__() > doesn't take an exception argument, it only takes a value. Maybe this > removes your concern? Actually, I misinterpreted it, assuming that the value passed in was an exception instance because the previous versions worked that way. This has been going on too long ;) > Ah, here we see the other misconception (caused by not reading the > most recent version of the PEP). __exit__() shouldn't call __next__() > -- it should just raise the exception passed in unless it has > something special to do. Ah - I think this needs to be explained better. In particular, in the specification of the __next__ and __exit__ methods it should state what exceptions are expected to be raised under what circumstances - in particular, that __exit__ is expected to raise the passed in exception or StopIteration. This is only explained in the Generator Exception Handling specification, but it's applicable to all iterators. >> Finally, I think there is another loose end that hasn't been >> addressed:: >> >> When __next__() is called with an argument that is not None, the >> yield-expression that it resumes will return the value attribute >> of the argument. If it resumes a yield-statement, the value is >> ignored (or should this be considered an error?). When the >> *initial* call to __next__() receives an argument that is not >> None, the generator's execution is started normally; the >> argument's value attribute is ignored (or should this be >> considered an error?). 
When __next__() is called without an >> argument or with None as argument, and a yield-expression is >> resumed, the yield-expression returns None. > > Good catch. > >> My opinion is that each of these should be an error. > > Personally, I think not using the value passed into __next__() should > not be an error; that's about the same as not using the value returned > by a function you call. Now that I understand that the parameter to __next__ is not an exception, I agree. > I agree that calling the initial __next__() of a generator with a > non-None argument should be considered an error; this is likely caused > by some kind of logic error; it can never happen when the generator is > called by a block statement. Cheers. Tim Delaney From ncoghlan at gmail.com Tue May 3 11:28:40 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 03 May 2005 19:28:40 +1000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> Message-ID: <42774448.8080306@gmail.com> Phillip J. Eby wrote: > Specifically, I propose that PEP 340 *not* allow the use of "normal" > iterators. Instead, the __next__ and __exit__ methods would be an > unrelated protocol. This would eliminate the need for a 'next()' builtin, > and avoid any confusion between today's iterators and a template function > for use with blocks. I would extend this to say that invoking the blocktemplate decorator should eliminate the conventional iteration interface, preventing the following problematically silent bug: for l in synchronized(mylock): # This lock is not released promptly! break > My argument is that this is both Explicit (i.e., better than implicit) and > One Obvious Way (because using existing iterators just Another Way to do a > "for" loop). It also doesn't allow Errors (using an iterator with no > special semantics) to Pass Silently. While I agree these are advantages, a bigger issue for me would be the one above: keeping a block template which expects prompt finalisation from being inadvertently used in a conventional for loop which won't finalise on early termination of the loop. I'd also suggest that the blocktemplate decorator accept any iterator, not just generators. > Of course, since Practicality Beats Purity, I could give this all up. But > I don't think the Implementation is Hard to Explain, as it should be just > as easy as Guido's proposal. I think it would be marginally easier to explain, since the confusion between iterators and block templates would be less of a distraction. > Really, the only thing that changes is that you get a > TypeError when a template function returns an iterator instead of a block > template, and you have to use the decorator on your generators to > explicitly label them safe for use with blocks. I'd add raising a TypeError when a block template is passed to the iter() builtin to the list of differences from the current incarnation of the PEP. As for Phillip, I think using different API's is a good way to more clearly emphasise the difference in purpose between conventional for loops and the new block statement, but I'm also a little concerned about incorrectly passing a block template to a for loop. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Tue May 3 11:33:35 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 03 May 2005 19:33:35 +1000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: Message-ID: <4277456F.7090402@gmail.com> Guido van Rossum wrote: > 1. I still can't decide on keyword vs. no keyword, but if we're going > to have a keyword, I haven't seen a better proposal than block. So > it's either block or nothing. I'll sleep on this. Feel free to start > an all-out flame war on this in c.l.py. ;-) I quite like 'block', but can live with no keyword (since it then becomes a practical equivalent to user-defined statements). > 2. No else clause; the use case is really weak and there are too many > possible semantics. It's not clear whether to generalize from > for/else, or if/else, or what else. Agreed. The order I posted my list of semantic options was the order I thought of them, but I ended up agreeing with the votes Aahz posted. > 3. I'm leaning against Phillip's proposal; IMO it adds more complexity > for very little benefit. See my response to Phillip. I think there could be an advantage to it if it means that "for l in synchronized(lock)" raises an immediate error instead of silently doing the wrong thing. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From pierre.barbier at cirad.fr Tue May 3 13:32:18 2005 From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille) Date: Tue, 03 May 2005 13:32:18 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <4277456F.7090402@gmail.com> References: <4277456F.7090402@gmail.com> Message-ID: <42776142.9010006@cirad.fr> Nick Coghlan a ?crit : >>3. I'm leaning against Phillip's proposal; IMO it adds more complexity >>for very little benefit. > > > See my response to Phillip. I think there could be an advantage to it if it > means that "for l in synchronized(lock)" raises an immediate error instead of > silently doing the wrong thing. First, I really think this PEP is needed for Python. But this is express exactly my main concern about it ! As far as I understand it, iterator-for-blocks and iterator-for-loops are two different beasts. Even if iterator-for-loops can be used within a block without damage, the use of iterator-for-block in a loop can lead to completely unpredictable result (and result really hard to find since they'll possibly involve race conditions or dead locks). To try being as clear as possible, I would say the iterator-for-loops are simplified iterator-for-blocks. IOW, if I were to put them in a class inheritance hierarchy (I don't say we should put them into one ;) ) iterator-for-block would be the base class of iterator-for-loop. Thus, as for-loops require an iterator-for-loop, they would raise an error if used with an iterator-for-block. But as blocks require an iterator-for-blocks they will allow iterator-for-loops too ! 
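Just to illustrate the shape of the hierarchy I mean (only an illustration, not a real implementation; the class names are invented):

    class BlockIterator(object):
        # usable only with the block statement
        def __next__(self, arg=None):
            raise NotImplementedError
        def __exit__(self, exc):
            raise exc

    class LoopIterator(BlockIterator):
        # additionally usable with a for loop
        def __iter__(self):
            return self

A for loop would accept only the second kind, while the block statement would accept both.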
Cheers, Pierre -- Pierre Barbier de Reuille INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP Botanique et Bio-informatique de l'Architecture des Plantes TA40/PSII, Boulevard de la Lironde 34398 MONTPELLIER CEDEX 5, France tel : (33) 4 67 61 65 77 fax : (33) 4 67 61 56 68 From m.u.k.2 at gawab.com Tue May 3 13:42:11 2005 From: m.u.k.2 at gawab.com (m.u.k) Date: Tue, 3 May 2005 11:42:11 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError Message-ID: Greetings, Currently Py_FatalError only dumps the error to stderr and calls abort(). When doing quirky things with the interpreter, it's so annoying that process just terminates. Are there any reason why we still dont have a simple callback to hook Py_FatalError. PS. If the answer is "because no one needs/implemented...", I can volunteer. Best regards. From ncoghlan at gmail.com Tue May 3 14:24:17 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 03 May 2005 22:24:17 +1000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <20050503063140.A7719@familjen.svensson.org> References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> <42774448.8080306@gmail.com> <20050503063140.A7719@familjen.svensson.org> Message-ID: <42776D71.6080303@gmail.com> Paul Svensson wrote: > On Tue, 3 May 2005, Nick Coghlan wrote: > >> I'd also suggest that the blocktemplate decorator accept any iterator, >> not just >> generators. > > > So you want decorators on classes now ? A decorator is just a function - it doesn't *need* to be used with decorator syntax. I just think the following code should work for any iterator: block blocktemplate(itr): # Do stuff Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Tue May 3 15:07:07 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 03 May 2005 23:07:07 +1000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42776142.9010006@cirad.fr> References: <4277456F.7090402@gmail.com> <42776142.9010006@cirad.fr> Message-ID: <4277777B.3000300@gmail.com> Pierre Barbier de Reuille wrote: > Even if iterator-for-loops can be used within a block without damage, > the use of iterator-for-block in a loop can lead to completely > unpredictable result (and result really hard to find since they'll > possibly involve race conditions or dead locks). I had a longish post written before I realised I'd completely misunderstood your comment. You were actually agreeing with me, so most of my post was totally beside the point. Anyway, to summarise the argument in favour of separate API's for iterators and block templates, the first code example below is a harmless quirk (albeit an irritating violation of TOOWTDI). The second and third examples are potentially serious bugs: block range(10) as i: # Just a silly way to write "for i in range(10)" for f in opening(name): # When f gets closed is Python implementation dependent for lock in synchronized(mylock): # When lock gets released is Python implementation dependent Cheers, Nick. P.S. Dear lord, synchronized is an aggravating name for that function. 
I keep wanting to spell it with a second letter 's', like any civilised person ;) -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From eric.nieuwland at xs4all.nl Tue May 3 15:08:08 2005 From: eric.nieuwland at xs4all.nl (Eric Nieuwland) Date: Tue, 3 May 2005 15:08:08 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42776142.9010006@cirad.fr> References: <4277456F.7090402@gmail.com> <42776142.9010006@cirad.fr> Message-ID: <2382e9f0bf39f58c4770f1b37085f716@xs4all.nl> I've been away for a while and just read through the PEP 340 discussion with growing amazement. Pierre Barbier de Reuille wrote: > As far as I understand it, > iterator-for-blocks and iterator-for-loops are two different beasts. Right! > To try being as clear as possible, I would say the iterator-for-loops > are simplified iterator-for-blocks. IOW, if I were to put them in a > class inheritance hierarchy (I don't say we should put them into one ;) > ) iterator-for-block would be the base class of iterator-for-loop. > Thus, > as for-loops require an iterator-for-loop, they would raise an error if > used with an iterator-for-block. But as blocks require an > iterator-for-blocks they will allow iterator-for-loops too ! IMHO It is more like round holes and square pegs (or the other way around). What PEP 340 seems to be trying to achieve is a generic mechanism to define templates with holes/place holders for blocks of code. That gives two nouns ('template' and 'code block') that both qualify as indicators of reusable items. We can use standard functions as reusable code blocks. Wouldn't a template then be just a function that takes other functions ar arguments? All information transfer between the template and its arguments is via the parameter list/returned values. What am I missing? --eric From mwh at python.net Tue May 3 16:35:35 2005 From: mwh at python.net (Michael Hudson) Date: Tue, 03 May 2005 15:35:35 +0100 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42776D71.6080303@gmail.com> (Nick Coghlan's message of "Tue, 03 May 2005 22:24:17 +1000") References: <5.1.1.6.0.20050502210840.031fb9b0@mail.telecommunity.com> <42774448.8080306@gmail.com> <20050503063140.A7719@familjen.svensson.org> <42776D71.6080303@gmail.com> Message-ID: <2mhdhkz8aw.fsf@starship.python.net> Nick Coghlan writes: > Paul Svensson wrote: >> On Tue, 3 May 2005, Nick Coghlan wrote: >> >>> I'd also suggest that the blocktemplate decorator accept any iterator, >>> not just >>> generators. >> >> >> So you want decorators on classes now ? > > A decorator is just a function - it doesn't *need* to be used with decorator > syntax. I just think the following code should work for any iterator: > > block blocktemplate(itr): > # Do stuff But in @blocktemplate def foo(...): ... blocktemplate isn't passed an iterator, it's passed a callable that returns an iterator. Cheers, mwh -- . <- the point your article -> . |------------------------- a long way ------------------------| -- Christophe Rhodes, ucam.chat From tom-python-dev at rothamel.us Tue May 3 17:05:10 2005 From: tom-python-dev at rothamel.us (Tom Rothamel) Date: Tue, 3 May 2005 11:05:10 -0400 Subject: [Python-Dev] PEP 340: Breaking out. Message-ID: <20050503150510.GA13595@onegeek.org> I have a question/suggestion about PEP 340. 
As I read the PEP right now, the code: while True: block synchronized(v1): if v1.field: break time.sleep(1) Will never break out of the enclosing while loop. This is because the break breaks the while loop that the block statement is translated into, instead of breaking the outer True statement. Am I understanding this right, or am I misunderstanding this? If I am understanding this right, I would suggest allowing some way of having the iterator call continue or break in the enclosing context. (Perhaps by enclosing the entire translation of block in a try-except construct, which catches Stop and Continue exceptions raised by the generator and re-raises them in the outer context.) I hope this helps. -- Tom Rothamel ----------------------------------- http://www.rothamel.us/~tom/ From pierre.barbier at cirad.fr Tue May 3 17:25:09 2005 From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille) Date: Tue, 03 May 2005 17:25:09 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <20050503150510.GA13595@onegeek.org> References: <20050503150510.GA13595@onegeek.org> Message-ID: <427797D5.8030207@cirad.fr> Tom Rothamel a ?crit : > I have a question/suggestion about PEP 340. > > As I read the PEP right now, the code: > > > while True: > > block synchronized(v1): > if v1.field: > break > > time.sleep(1) > > > Will never break out of the enclosing while loop. This is because the > break breaks the while loop that the block statement is translated > into, instead of breaking the outer True statement. Well, that's exactly what it is intended to do and what I would expect it to do ! break/continue affect only the inner-most loop. > > Am I understanding this right, or am I misunderstanding this? > > If I am understanding this right, I would suggest allowing some way of > having the iterator call continue or break in the enclosing > context. (Perhaps by enclosing the entire translation of block in a > try-except construct, which catches Stop and Continue exceptions > raised by the generator and re-raises them in the outer context.) > > I hope this helps. > I don't want it like that ! This would differ with the break/continue used in other loops. If you need to break from many loops, enclose them into a function and return from it ! Pierre -- Pierre Barbier de Reuille INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP Botanique et Bio-informatique de l'Architecture des Plantes TA40/PSII, Boulevard de la Lironde 34398 MONTPELLIER CEDEX 5, France tel : (33) 4 67 61 65 77 fax : (33) 4 67 61 56 68 From skip at pobox.com Tue May 3 17:30:53 2005 From: skip at pobox.com (Skip Montanaro) Date: Tue, 3 May 2005 10:30:53 -0500 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <427797D5.8030207@cirad.fr> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> Message-ID: <17015.39213.522060.873605@montanaro.dyndns.org> >>>>> "Pierre" == Pierre Barbier de Reuille writes: Pierre> Tom Rothamel a ?crit : >> I have a question/suggestion about PEP 340. >> >> As I read the PEP right now, the code: >> >> while True: >> block synchronized(v1): >> if v1.field: >> break >> time.sleep(1) >> >> Will never break out of the enclosing while loop. Pierre> Well, that's exactly what it is intended to do and what I would Pierre> expect it to do ! break/continue affect only the inner-most Pierre> loop. Yeah, but "block synchronized(v1)" doesn't look like a loop. I think this might be a common stumbling block for people using this construct. 
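After all, as far as the compiler is concerned the block statement *is* a loop. Ignoring all of the exception plumbing, my (rough) understanding is that

    block EXPR1 as VAR1:
        BLOCK1

expands into something like

    _itr = EXPR1
    while True:
        try:
            VAR1 = _itr.__next__()
        except StopIteration:
            break
        BLOCK1          # a 'break' here leaves this hidden while loop,
                        # not the 'while True:' the user actually wrote

(_itr is just a hidden temporary; this is a simplification of the PEP's expansion, not a quote from it.)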
Skip From python at rcn.com Tue May 3 17:41:52 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 11:41:52 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <2mhdhkz8aw.fsf@starship.python.net> Message-ID: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> I just made a first reading of the PEP and want to clarify my understanding of how it fits with existing concepts. Is it correct to say that "continue" parallel's its current meaning and returns control upwards (?outwards) to the block iterator that called it? Likewise, is it correct that "yield" is anti-parallel to the current meaning? Inside a generator, it returns control upwards to the caller. But inside a block-iterator, it pushes control downwards (?inwards) to the block it controls. Is the distinction between block iterators and generators similar to the Gang-of-Four's distinction between external and internal iterators? Are there some good use cases that do not involve resource locking? IIRC, that same use case was listed a prime motivating example for decorators (i.e. @syncronized). TOOWTDI suggests that a single use case should not be used to justify multiple, orthogonal control structures. It would be great if we could point to some code in the standard library or in a major Python application that would be better (cleaner, faster, or clearer) if re-written using blocks and block-iterators. I've scanned through the code base looking for some places to apply the idea and have come up empty handed. This could mean that I've not yet grasped the essence of what makes the idea useful or it may have other implications such as apps needing to be designed from the ground-up with block iterators in mind. Raymond From pierre.barbier at cirad.fr Tue May 3 17:43:36 2005 From: pierre.barbier at cirad.fr (Pierre Barbier de Reuille) Date: Tue, 03 May 2005 17:43:36 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <17015.39213.522060.873605@montanaro.dyndns.org> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> Message-ID: <42779C28.5000207@cirad.fr> Skip Montanaro a ?crit : > [...] > > Yeah, but "block synchronized(v1)" doesn't look like a loop. I think this > might be a common stumbling block for people using this construct. > > Skip > Well, this can be a problem, because indeed the black-statement introduce a new loop construct in Python. That's why I advocated some time ago against the introduction of a new name. IMHO, the for-loop syntax can be really used instead of blocks as its behavior if exactly the one of a for-loop if the iterator is an iterator-for-for and the current for-loop cannot be used with iterator-for-blocks. The main problem with this syntax is the use of the blocks for things that are not loops (like the synchronize object)! And they are, indeed, quite common ! (or they will be :) ). 
Pierre -- Pierre Barbier de Reuille INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP Botanique et Bio-informatique de l'Architecture des Plantes TA40/PSII, Boulevard de la Lironde 34398 MONTPELLIER CEDEX 5, France tel : (33) 4 67 61 65 77 fax : (33) 4 67 61 56 68 From ldlandis at gmail.com Tue May 3 17:55:12 2005 From: ldlandis at gmail.com (LD "Gus" Landis) Date: Tue, 3 May 2005 10:55:12 -0500 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> References: <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> Message-ID: Hi, Sounds like a useful requirement to have for new features in 2.x, IMO. that is... "demonstrated need". If the feature implies that the app needs to be designed from the ground up to *really* take advantage of the feature, then, maybe leave it for Guido's sabbatical (e.g. Python 3000). On 5/3/05, Raymond Hettinger wrote: > It would be great if we could point to some code in the standard library > or in a major Python application that would be better (cleaner, faster, > or clearer) if re-written using blocks and block-iterators. I've > scanned through the code base looking for some places to apply the idea > and have come up empty handed. This could mean that I've not yet > grasped the essence of what makes the idea useful or it may have other > implications such as apps needing to be designed from the ground-up with > block iterators in mind. > > Raymond -- LD Landis - N0YRQ - from the St Paul side of Minneapolis From gvanrossum at gmail.com Tue May 3 18:15:42 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 09:15:42 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: Message-ID: > Currently Py_FatalError only dumps the error to stderr and calls abort(). > When doing quirky things with the interpreter, it's so annoying that process > just terminates. Are there any reason why we still dont have a simple > callback to hook Py_FatalError. > > PS. If the answer is "because no one needs/implemented...", I can volunteer. Your efforts would be better directed towards fixing the causes of the fatal errors. I see no need to hook Py_FatalError, but since it's open source, you are of course free to patch your own copy if your urge is truly irresistible. Or I guess you could run Python under supervision of gdb and trap it that way. But tell me, what do you want the process to do instead of terminating? Py_FatalError is used in situations where raising an exception is impossible or would do more harm than good. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From jcarlson at uci.edu Tue May 3 18:14:28 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Tue, 03 May 2005 09:14:28 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: Message-ID: <20050503090452.648C.JCARLSON@uci.edu> "m.u.k" wrote: > Currently Py_FatalError only dumps the error to stderr and calls abort(). > When doing quirky things with the interpreter, it's so annoying that process > just terminates. Are there any reason why we still dont have a simple > callback to hook Py_FatalError. > > PS. If the answer is "because no one needs/implemented...", I can volunteer. In looking at the use of Py_FatalError in the Python Sources (it's a 10 meg tarball that is well worth the download), it looks as though its use shows a Fatal error (hence the name). 
Things like "Inconsistant interned string state" or "Immortal interned string died" or "Can't initialize type", etc. Essentially, those errors generally signify "the internal state of python is messed up", whether that be by C extension, or even a bug in Python. The crucial observation is that many of them have ambiguous possible recoveries. How do you come back from "Can't initialize type", or even 'gc couldn't allocate "__del__"'? When you have individual solutions to some subset of the uses of Py_FatalError, then it would make sense to offer those solutions as a replacement to Py_FatalError use in those situations (also showing that the errors are not actually fatal), rather than to ask for a hook to hook all (by definition) fatal errors. - Josiah From gvanrossum at gmail.com Tue May 3 18:53:18 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 09:53:18 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> References: <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> Message-ID: [Raymond Hettinger] > I just made a first reading of the PEP and want to clarify my > understanding of how it fits with existing concepts. Thanks! Now is about the right time -- all the loose ends are being solidified (in my mind any way). > Is it correct to say that "continue" parallel's its current meaning and > returns control upwards (?outwards) to the block iterator that called > it? I have a hard time using directions as metaphors (maybe because on some hardware, stacks grow down) unless you mean "up in the source code" which doesn't make a lot of sense either in this context. But yes, continue does what you expect it to do in a loop. Of course, in a resource allocation block, continue and break are pretty much the same (just as they are in any loop that you know has only one iteration). > Likewise, is it correct that "yield" is anti-parallel to the current > meaning? Inside a generator, it returns control upwards to the caller. > But inside a block-iterator, it pushes control downwards (?inwards) to > the block it controls. I have a hard time visualizing the difference. They feel the same to me, and the implementation (from the generator's POV) is identical: yield suspends the current frame, returning to the previous frame from the call to next() or __next__(), and the suspended frame can be resumed by calling next() / __next__() again. > Is the distinction between block iterators and generators similar to the > Gang-of-Four's distinction between external and internal iterators? I looked it up in the book (p. 260), and I think generators have a duality to them that makes the distinction useless, or at least relative to your POV. With a classic for-loop driven by a generator, the author of the for-loop thinks of it as an external iterator -- you ask for the next item using the (implicit) call to next(). But the author of the generator thinks of it as an internal iterator -- the for loop resumes only when the generator feels like it. > Are there some good use cases that do not involve resource locking? > IIRC, that same use case was listed a prime motivating example for > decorators (i.e. @syncronized). TOOWTDI suggests that a single use case > should not be used to justify multiple, orthogonal control structures. Decorators don't need @synchronized as a motivating use case; there are plenty of other use cases. 
Anyway, @synchronized was mostly a demonstration toy; whole method calls are rarely the right granularity of locking. (BTW in the latest version of PEP 340 I've renamed synchronized to locking; many people complained about the strange Javaesque term.) Look at the examples in the PEP (version 1.16) for more use cases. > It would be great if we could point to some code in the standard library > or in a major Python application that would be better (cleaner, faster, > or clearer) if re-written using blocks and block-iterators. I've > scanned through the code base looking for some places to apply the idea > and have come up empty handed. This could mean that I've not yet > grasped the essence of what makes the idea useful or it may have other > implications such as apps needing to be designed from the ground-up with > block iterators in mind. I presume you mentally discarded the resource allocation use cases where the try/finally statement was the outermost statement in the function body, since those would be helped by @synchronized; but look more closely at Queue, and you'll find that the two such methods use different locks! Also the use case for closing a file upon leaving a block, while clearly a resource allocation use case, doesn't work well with a decorator. I just came across another use case that is fairly common in the standard library: redirecting sys.stdout. This is just a beauty (in fact I'll add it to the PEP): def saving_stdout(f): save_stdout = sys.stdout try: sys.stdout = f yield finally: sys.stdout = save_stdout -- --Guido van Rossum (home page: http://www.python.org/~guido/) From gvanrossum at gmail.com Tue May 3 19:13:53 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 10:13:53 -0700 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <17015.39213.522060.873605@montanaro.dyndns.org> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> Message-ID: [Skip Montanaro] > Yeah, but "block synchronized(v1)" doesn't look like a loop. I think this > might be a common stumbling block for people using this construct. How many try/finally statements have you written inside a loop? In my experience this is extreeeemely rare. I found no occurrences in the standard library. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From pje at telecommunity.com Tue May 3 19:20:38 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 13:20:38 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> Message-ID: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com> At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote: >I just came across another use case that is fairly common in the >standard library: redirecting sys.stdout. This is just a beauty (in >fact I'll add it to the PEP): > >def saving_stdout(f): Very nice; may I suggest 'redirecting_stdout' as the name instead? This and other examples from the PEP still have a certain awkwardness of phrasing in their names. A lot of them seem to cry out for a "with" prefix, although maybe that's part of the heritage of PEP 310. But Lisp has functions like 'with-open-file', so I don't think that it's *all* a PEP 310 influence on the examples. 
It also seems to me that it would be nice if locks, files, sockets and similar resources would implement the block-template protocol; then one could simply say: block self.__lock: ... or: open("foo") as f: ... And not need any special wrappers. Of course, this could only work for files if the block-template protocol were distinct from the normal iteration protocol. From aahz at pythoncraft.com Tue May 3 19:31:32 2005 From: aahz at pythoncraft.com (Aahz) Date: Tue, 3 May 2005 10:31:32 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com> References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com> Message-ID: <20050503173131.GA1375@panix.com> On Tue, May 03, 2005, Phillip J. Eby wrote: > At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote: >> >>I just came across another use case that is fairly common in the >>standard library: redirecting sys.stdout. This is just a beauty (in >>fact I'll add it to the PEP): >> >>def saving_stdout(f): > > Very nice; may I suggest 'redirecting_stdout' as the name instead? You may; I'd nitpick that to either "redirect_stdout" or "redirected_stdout". "redirecting_stdout" is slightly longer and doesn't have quite the right flavor to my eye. I might even go for "make_stdout" or "using_stdout"; that relies on people understanding that a block means temporary usage. > This and other examples from the PEP still have a certain awkwardness > of phrasing in their names. A lot of them seem to cry out for a > "with" prefix, although maybe that's part of the heritage of PEP 310. > But Lisp has functions like 'with-open-file', so I don't think that > it's *all* a PEP 310 influence on the examples. Yes, that's why I've been pushing for "with". -- Aahz (aahz at pythoncraft.com) <*> http://www.pythoncraft.com/ "It's 106 miles to Chicago. We have a full tank of gas, a half-pack of cigarettes, it's dark, and we're wearing sunglasses." "Hit it." From foom at fuhm.net Tue May 3 19:38:27 2005 From: foom at fuhm.net (James Y Knight) Date: Tue, 3 May 2005 13:38:27 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> Message-ID: <8dffe7626eaf9a89812c2828bcf96efe@fuhm.net> On May 3, 2005, at 12:53 PM, Guido van Rossum wrote: > def saving_stdout(f): > save_stdout = sys.stdout > try: > sys.stdout = f > yield > finally: > sys.stdout = save_stdout I hope you aren't going to be using that in any threaded program. That's one really nice thing about lisp's dynamic variables: they automatically interact properly with threads. (defvar *foo* nil) (let ((*foo* 5)) ; *foo* has value of 5 for all functions called from here, but only in this thread. In other threads it'll still be nil. ) ; *foo* has gone back to nil. James From m.u.k.2 at gawab.com Tue May 3 19:44:10 2005 From: m.u.k.2 at gawab.com (m.u.k) Date: Tue, 3 May 2005 17:44:10 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: Message-ID: Hi, Guido van Rossum wrote in news:ca471dc205050309156962d3ff at mail.gmail.com: > Your efforts would be better directed towards fixing the causes of the > fatal errors. > > I see no need to hook Py_FatalError, but since it's open source, you > are of course free to patch your own copy if your urge is truly > irresistible. 
Or I guess you could run Python under supervision of gdb > and trap it that way. Well, I admit it is a bit trivial (as is its implementation); at least nobody wanted it within Python's 10+ year lifetime. Indeed I'm using my own patched copy, I just thought it'd be good if some other naughty boy playing dangerous games with interpreter internals didn't have to spend hours in a debugger trying to reproduce the crash. > But tell me, what do you want the process to do instead of > terminating? Py_FatalError is used in situations where raising an > exception is impossible or would do more harm than good. The need for this is only for logging purposes, e.g. the process just terminates on a client machine, you have no logs, no clues (except a coredump): a nightmare! Some sort of log would be invaluable here. Best regards. From jepler at unpythonic.net Tue May 3 19:54:23 2005 From: jepler at unpythonic.net (Jeff Epler) Date: Tue, 3 May 2005 12:54:23 -0500 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: Message-ID: <20050503175422.GF8344@unpythonic.net> On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote: > But tell me, what do you want the process to do instead of > terminating? Py_FatalError is used in situations where raising an > exception is impossible or would do more harm than good. In an application which embeds Python, I want to show the application's standard error dialog, which doesn't call any Python APIs (but does do things like capture the call stack at the time of the error). For this use, it doesn't matter that no further calls to those APIs are possible. Jeff From m.u.k.2 at gawab.com Tue May 3 19:47:57 2005 From: m.u.k.2 at gawab.com (m.u.k) Date: Tue, 3 May 2005 17:47:57 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503090452.648C.JCARLSON@uci.edu> Message-ID: Hi, Josiah Carlson wrote in news:20050503090452.648C.JCARLSON at uci.edu: > In looking at the use of Py_FatalError in the Python Sources (it's a 10 > meg tarball that is well worth the download), it looks as though its use > shows a Fatal error (hence the name). Things like "Inconsistant > interned string state" or "Immortal interned string died" or "Can't > initialize type", etc. > > Essentially, those errors generally signify "the internal state of > python is messed up", whether that be by C extension, or even a bug in > Python. The crucial observation is that many of them have ambiguous > possible recoveries. How do you come back from "Can't initialize type", > or even 'gc couldn't allocate "__del__"'? The hook is not for coming back, it is just for logging; see my previous post, please. Best regards. From skip at pobox.com Tue May 3 20:11:10 2005 From: skip at pobox.com (Skip Montanaro) Date: Tue, 3 May 2005 13:11:10 -0500 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> Message-ID: <17015.48830.223391.390538@montanaro.dyndns.org> >>>>> "Guido" == Guido van Rossum writes: Guido> [Skip Montanaro] >> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think >> this might be a common stumbling block for people using this >> construct.
Guido> How many try/finally statements have you written inside a loop? Guido> In my experience this is extreeeemely rare. I found no Guido> occurrences in the standard library. How'd we start talking about try/finally? To the casual observer, this looks like "break" should break out of the loop: while True: block synchronized(v1): ... if v1.field: break time.sleep(1) The PEP says: Note that it is left in the middle whether a block-statement represents a loop or not; this is up to the iterator, but in the most common case BLOCK1 is executed exactly once. That suggests to me it's still not clear if the block statement is actually a looping statement. If not, then "break" should almost certainly break out of the while loop. BTW, what did you mean by "left in the middle" mean? I interpreted it as "still undecided", but it's an idiom I've never seen. Perhaps it should be replaced by something more clear. Skip From python at rcn.com Tue May 3 20:26:05 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 14:26:05 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: Message-ID: <001101c5500d$a7be7140$c704a044@oemcomputer> [Raymond] > > Likewise, is it correct that "yield" is anti-parallel to the current > > meaning? Inside a generator, it returns control upwards to the caller. > > But inside a block-iterator, it pushes control downwards (?inwards) to > > the block it controls. [Guido van Rossum] > I have a hard time visualizing the difference. They feel the same to > me, and the implementation (from the generator's POV) is identical: > yield suspends the current frame, returning to the previous frame from > the call to next() or __next__(), and the suspended frame can be > resumed by calling next() / __next__() again. This concept ought to be highlighted in the PEP because it explains clearly what "yield" does and it may help transition from a non-Dutch mental model. I expect that many folks (me included) think in terms of caller vs callee with a parallel spatial concept of enclosing vs enclosed. In that model, the keywords "continue", "break", "yield", and "return" all imply a control transfer from the enclosed back to the encloser. In contrast, the new use of yield differs in that the suspended frame transfers control from the encloser to the enclosed. > > Are there some good use cases that do not involve resource locking? > > IIRC, that same use case was listed a prime motivating example for > > decorators (i.e. @syncronized). TOOWTDI suggests that a single use case > > should not be used to justify multiple, orthogonal control structures. > > Decorators don't need @synchronized as a motivating use case; there > are plenty of other use cases. No doubt about that. > Anyway, @synchronized was mostly a demonstration toy; whole method > calls are rarely the right granularity of locking. Agreed. Since that is the case, there should be some effort to shift some of the examples towards real use cases where a block-iterator is the appropriate solution. It need not hold-up releasing the PEP to comp.lang.python, but it would go a long way towards improving the quality of the subsequent discussion. > (BTW in the latest > version of PEP 340 I've renamed synchronized to locking; many people > complained about the strange Javaesque term.) That was diplomatic. Personally, I find it amusing when there is an early focus on naming rather than on functionality, implementation issues, use cases, usability, and goodness-of-fit within the language. 
> > It would be great if we could point to some code in the standard library > > or in a major Python application that would be better (cleaner, faster, > > or clearer) if re-written using blocks and block-iterators > look > more closely at Queue, and you'll find that the two such methods use > different locks! I don't follow this one. Tim's uses of not_empty and not_full are orthogonal (pertaining to pending gets at one end of the queue and to pending puts at the other end). The other use of the mutex is independent of either pending puts or gets; instead, it is a weak attempt to minimize what can happen to the queue during a size query. While the try/finallys could get factored-out into separate blocks, I do not see how the code could be considered better off. There is a slight worsening of all metrics of merit: line counts, total number of function defs, number of calls, or number of steps executed outside the lock (important given that the value a query result declines rapidly once the lock is released). > Also the use case for closing a file upon leaving a block, while > clearly a resource allocation use case, doesn't work well with a > decorator. Right. > I just came across another use case that is fairly common in the > standard library: redirecting sys.stdout. This is just a beauty (in > fact I'll add it to the PEP): > > def saving_stdout(f): > save_stdout = sys.stdout > try: > sys.stdout = f > yield > finally: > sys.stdout = save_stdout This is the strongest example so far. When adding it to the PEP, it would be useful to contrast the code with simpler alternatives like PEP 288's g.throw() or PEP 325's g.close(). On the plus side, the block-iterator approach factors out code common to multiple callers. On the minus side, the other PEPs involve simpler mechanisms and their learning curve would be nearly zero. These pluses and minuses are important because apply equally to all examples using blocks for initialization/finalization. Raymond From gvanrossum at gmail.com Tue May 3 20:31:55 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 11:31:55 -0700 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <17015.48830.223391.390538@montanaro.dyndns.org> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: [Skip Montanaro] > >> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think > >> this might be a common stumbling block for people using this > >> construct. > > Guido> How many try/finally statements have you written inside a loop? > Guido> In my experience this is extreeeemely rare. I found no > Guido> occurrences in the standard library. [Skip again] > How'd we start talking about try/finally? Because it provides by far the dominant use case for 'block'. The block-statement is intended to replace many boilerplace uses of try/finally. In addition, it's also a coroutine invocation primitive. > To the casual observer, this > looks like "break" should break out of the loop: > > while True: > block synchronized(v1): > ... > if v1.field: > break > time.sleep(1) Without 'block' this would be written as try/finally. And my point is that people just don't write try/finally inside a while loop very often (I found *no* examples in the entire standard library). 
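(For concreteness: spelled out by hand, Tom's example would read something like the following, treating v1 itself as the lock; here the 'break' does of course reach the outer loop, which is exactly the difference being discussed.)

    while True:
        v1.acquire()
        try:
            if v1.field:
                break
        finally:
            v1.release()
        time.sleep(1)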
> The PEP says: > > Note that it is left in the middle whether a block-statement > represents a loop or not; this is up to the iterator, but in the > most common case BLOCK1 is executed exactly once. > > That suggests to me it's still not clear if the block statement is actually > a looping statement. If not, then "break" should almost certainly break out > of the while loop. Dynamically, it's most likely not a loop. But the compiler doesn't know that, so the compiler considers it a loop. > BTW, what did you mean by "left in the middle" mean? I interpreted it as > "still undecided", but it's an idiom I've never seen. Perhaps it should be > replaced by something more clear. It may be a Dutch phrase that doesn't translate to English as wel as I thought. It doesn't exactly mean "still undecided" but more "depends on your POV". I'll use something different, and also clarify that as far as break/continue are concerned, it *is* a loop. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From jcarlson at uci.edu Tue May 3 20:34:52 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Tue, 03 May 2005 11:34:52 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: Message-ID: <20050503112637.648F.JCARLSON@uci.edu> "m.u.k" wrote: > > Hi, > > Guido van Rossum wrote in > news:ca471dc205050309156962d3ff at mail.gmail.com: > > > Your efforts would be better directed towards fixing the causes of the > > fatal errors. > > > > I see no need to hook Py_FatalError, but since it's open source, you > > are of course free to patch your own copy if your urge is truly > > irresistible. Or I guess you could run Python under supervision of gdb > > and trap it that way. > > Well, I admit it is a bit triva(as its implementation), at least nobody > wanted it within Python's 10+ lifetime. Indeed Im using my own patched copy, > I just thought it'd be good some other naughty boy playing dangerous games > with interpreter internals not spend hours in debugger trying to reproduce > the crash. > > > But tell me, what do you want the process to do instead of > > terminating? Py_FatalError is used in situations where raising an > > exception is impossible or would do more harm than good. > > The need for this is only logging purposes. eg the process just terminates > on client machine, you have no logs, no clues(except a coredump), nightmare!. > Some sort of log would be invaluable here. Offering any hook for Py_FatalError may not even be enough, as some of those errors are caused by insufficient memory. What if a hook were available, but it couldn't be called because there wasn't enough memory? Of course there is the option of pre-allocating a few kilobytes, then just before one calls the hook, freeing that memory so that the hook can execute (assuming the hook is small enough). I'm not sure if this is a desireable general mechanic, but it may be sufficient for you. If you do figure out a logging mechanism that is almost guaranteed to execute on FatalError, post it to sourceforge. - Josiah From gvanrossum at gmail.com Tue May 3 20:48:09 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 11:48:09 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <001101c5500d$a7be7140$c704a044@oemcomputer> References: <001101c5500d$a7be7140$c704a044@oemcomputer> Message-ID: > [Raymond] > > > Likewise, is it correct that "yield" is anti-parallel to the current > > > meaning? Inside a generator, it returns control upwards to the caller. 
> > > But inside a block-iterator, it pushes control downwards (?inwards) to > > > the block it controls. > > [Guido van Rossum] > > I have a hard time visualizing the difference. They feel the same to > > me, and the implementation (from the generator's POV) is identical: > > yield suspends the current frame, returning to the previous frame from > > the call to next() or __next__(), and the suspended frame can be > > resumed by calling next() / __next__() again. [Raymond] > This concept ought to be highlighted in the PEP because it explains > clearly what "yield" does and it may help transition from a non-Dutch > mental model. I expect that many folks (me included) think in terms of > caller vs callee with a parallel spatial concept of enclosing vs > enclosed. In that model, the keywords "continue", "break", "yield", and > "return" all imply a control transfer from the enclosed back to the > encloser. I'm still confused and surprised that you think I need to explain what yield does, since the PEP doesn't change one bit about this. The encloser/enclosed parallel to caller/callee doesn't make sense to me; but that may just because I'm Dutch. > In contrast, the new use of yield differs in that the suspended frame > transfers control from the encloser to the enclosed. Why does your notion of who encloses whom suddenly reverse when you go from a for-loop to a block-statement? This all feels very strange to me. > > Anyway, @synchronized was mostly a demonstration toy; whole method > > calls are rarely the right granularity of locking. > > Agreed. Since that is the case, there should be some effort to shift > some of the examples towards real use cases where a block-iterator is > the appropriate solution. It need not hold-up releasing the PEP to > comp.lang.python, but it would go a long way towards improving the > quality of the subsequent discussion. Um? I thought I just showed that locking *is* a good use case for the block-statement and you agreed; now why would I have to move away from it? I think I'm thoroughly confused by your critique of the PEP. Perhaps you could suggest some concrete rewritings to knock me out of my confusion? > Personally, I find it amusing when there is an > early focus on naming rather than on functionality, implementation > issues, use cases, usability, and goodness-of-fit within the language. Well, the name of a proposed concept does a lot to establish its first impression. First imressions matter! > > > It would be great if we could point to some code in the standard library > > > or in a major Python application that would be better (cleaner, faster, > > > or clearer) if re-written using blocks and block-iterators > > > look > > more closely at Queue, and you'll find that the two such methods use > > different locks! > > I don't follow this one. Tim's uses of not_empty and not_full are > orthogonal (pertaining to pending gets at one end of the queue and to > pending puts at the other end). The other use of the mutex is > independent of either pending puts or gets; instead, it is a weak > attempt to minimize what can happen to the queue during a size query. I meant to use this as an example of the unsuitability of the @synchronized decorator, since it implies that all synchronization is on the same mutex, thereby providing a use case for the locking block-statement. I suspect we're violently in agreement though. > While the try/finallys could get factored-out into separate blocks, I do > not see how the code could be considered better off. 
There is a slight > worsening of all metrics of merit: line counts, total number of > function defs, number of calls, or number of steps executed outside the > lock (important given that the value a query result declines rapidly > once the lock is released). I don't see how the line count metric would lose: a single "locking()" primitive exported by the threading module would be usable by all code that currently uses try/finally to acquire and release a lock. Performance needn't suffer either, if the locking() primitive is implemented in C (it could be a straightforward translation of example 6 into C). > > I just came across another use case that is fairly common in the > > standard library: redirecting sys.stdout. This is just a beauty (in > > fact I'll add it to the PEP): > > > > def saving_stdout(f): > > save_stdout = sys.stdout > > try: > > sys.stdout = f > > yield > > finally: > > sys.stdout = save_stdout > > This is the strongest example so far. When adding it to the PEP, it > would be useful to contrast the code with simpler alternatives like PEP > 288's g.throw() or PEP 325's g.close(). On the plus side, the > block-iterator approach factors out code common to multiple callers. On > the minus side, the other PEPs involve simpler mechanisms and their > learning curve would be nearly zero. These pluses and minuses are > important because apply equally to all examples using blocks for > initialization/finalization. Where do you see a learning curve for blocks? -- --Guido van Rossum (home page: http://www.python.org/~guido/) From m.u.k.2 at gawab.com Tue May 3 20:58:40 2005 From: m.u.k.2 at gawab.com (m.u.k) Date: Tue, 3 May 2005 18:58:40 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> Message-ID: Hi, Josiah Carlson wrote in news:20050503112637.648F.JCARLSON at uci.edu: > Offering any hook for Py_FatalError may not even be enough, as some of > those errors are caused by insufficient memory. What if a hook were > available, but it couldn't be called because there wasn't enough memory? > > Of course there is the option of pre-allocating a few kilobytes, then > just before one calls the hook, freeing that memory so that the hook can > execute (assuming the hook is small enough). I'm not sure if this is a > desireable general mechanic, but it may be sufficient for you. If you > do figure out a logging mechanism that is almost guaranteed to execute > on FatalError, post it to sourceforge. IMHO this should be left to hooker(apparerently not right word, but you get the point :) ). If he allocates more mem. or does heavy stuff, that will just fail. Anyway abort() is a failure too. Either abort() will end the process or OS will on such a critical error. Best regards. From jimjjewett at gmail.com Tue May 3 21:07:37 2005 From: jimjjewett at gmail.com (Jim Jewett) Date: Tue, 3 May 2005 15:07:37 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification Message-ID: [Raymond Hettinger] >> Likewise, is it correct that "yield" is anti-parallel to the current >> meaning? Inside a generator, it returns control upwards to the caller. >> But inside a block-iterator, it pushes control downwards (?inwards) to >> the block it controls. Guido: > I have a hard time visualizing the difference. In a normal generator, someone makes a call to establish the generator, which then becomes a little island -- anyone can call the generator, and it returns control back to whoever made the last call. 
With the block, every yield returns to a single designated callback. This callback had to be established at the same time the block was created, and must be textually inside it. (An indented suite to the "block XXX:" line.) >> Are there some good use cases that do not involve resource locking? > Decorators don't need @synchronized as a motivating use case; > there are plenty of other use cases. But are there plenty of other use cases for PEP 340? If not, then why do we need PEP 340? Are decorators not strong enough, or is it just that people aren't comfortable yet? If it is a matter of comfort or recipies, then the new construct might have just as much trouble. (So this one is not a loop, and you can tell the difference because ... uh, just skip that advanced stuff.) > Anyway, @synchronized was mostly a demonstration toy; whole > method calls are rarely the right granularity of locking. That is an important difference -- though I'm not sure that the critical part *shouldn't* be broken out into a separate method. >> I've scanned through the code base looking for some places >> to apply the idea and have come up empty handed. > I presume you mentally discarded the resource allocation use > cases where the try/finally statement was the outermost statement > in the function body, since those would be helped by @synchronized; > but look more closely at Queue, and you'll find that the two such > methods use different locks! qsize, empty, and full could be done with a lockself decorator. Effectively, they *are* lockself decorators for the _xxx functions that subclasses are told to override. If you're talking about put and get, decorators don't help as much, but I'm not sure blocks are much better. You can't replace the outermost try ... finally with a common decorator because the locks are self variables. A block, by being inside a method, would be delayed until self exists -- but that outer lock is only a tiny fraction of the boilerplate. It doesn't help with if not block: if self._STATE(): raise STATEException elif timeout is None: while self._STATE(): self.not_STATE.wait() else: if timeout < 0: raise ValueError("'timeout' must be a positive number") endtime = _time() + timeout while self._STATE(): remaining = endtime - _time() if remaining <= 0.0: raise STATEException self.not_STATE.wait(remaining) val = self._RealMethod(item) # OK, the put optimizes out this and the return self.not_OTHERSTATE.notify() return val I wouldn't object to a helper method, but using a block just to get rid of four lines (two of which are the literals "try:" and "finally:") seems barely worth doing, let alone with special new syntax. > Also the use case for closing a file upon leaving a block, while > clearly a resource allocation use case, doesn't work well with a > decorator. def autoclose(fn): def outer(filename, *args, **kwargs): f = open(filename) val = fn(f, *args, **kwargs) f.close() return val return outer @autoclose def f1(f): for line in f: print line > I just came across another use case that is fairly common in the > standard library: redirecting sys.stdout. This is just a beauty (in > fact I'll add it to the PEP): > def saving_stdout(f): > save_stdout = sys.stdout > try: > sys.stdout = f > yield > finally: > sys.stdout = save_stdout Why does this need a yield? Why not just a regular call to the function? If you're trying to generalize the redirector, then this also works as a decorator. 
The nested functions (and the *args, **kwargs, if you don't inherit from a standard dedcorator) is a bit of an annoyance, but I'm not sure the new iterator form will be any easier to explain. def saving_stdout(f): import sys # Just in case... def captured_stream(fn): def redirect(*args, **kwargs): save_stdout = sys.stdout try: sys.stdout = f return fn (*args, **kwargs) finally: sys.stdout = save_stdout return redirect return captured_stream o=StringIO() @saving_stdout(o) ... From tim.peters at gmail.com Tue May 3 21:13:52 2005 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 3 May 2005 15:13:52 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <001101c5500d$a7be7140$c704a044@oemcomputer> Message-ID: <1f7befae050503121346833d97@mail.gmail.com> [Raymond] >>>> It would be great if we could point to some code in the standard library >>>> or in a major Python application that would be better (cleaner, faster, >>>> or clearer) if re-written using blocks and block-iterators [Guido] >>> look more closely at Queue, and you'll find that the two such methods >>> use different locks! [Raymond] >> I don't follow this one. Tim's uses of not_empty and not_full are >> orthogonal (pertaining to pending gets at one end of the queue and to >> pending puts at the other end). The other use of the mutex is >> independent of either pending puts or gets; instead, it is a weak >> attempt to minimize what can happen to the queue during a size query. [Guido] > I meant to use this as an example of the unsuitability of the > @synchronized decorator, since it implies that all synchronization is > on the same mutex, thereby providing a use case for the locking > block-statement. Queue may be a confusing example. Older versions of Queue did indeed use more than one mutex. The _current_ (2.4+) version of Queue uses only one mutex, but shared across two condition variables (`not_empty` and `not_full` are condvars in current Queue, not locks). Where, e.g., current Queue.put() starts with self.not_full.acquire() it _could_ say self.not_empty.acquire() instead with the same semantics, or it could say self.mutex.acquire() They all do an acquire() on the same mutex. If put() needs to wait, it needs to wait on the not_full condvar, so it's conceptually clearest for put() to spell it the first of these ways. Because Queue does use condvars now instead of plain locks, I wouldn't approve of any gimmick purporting to hide the acquire/release's in put() or get(): that those are visible is necessary to seeing that the _condvar_ protocol is being followed ("must acquire() before wait(); must be acquire()'ed during notify(); no path should leave the condvar acquire()d 'for a long time' before a wait() or release()"). From gvanrossum at gmail.com Tue May 3 21:40:11 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 12:40:11 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: Message-ID: > [Raymond Hettinger] > >> Likewise, is it correct that "yield" is anti-parallel to the current > >> meaning? Inside a generator, it returns control upwards to the caller. > >> But inside a block-iterator, it pushes control downwards (?inwards) to > >> the block it controls. > [Guido] > > I have a hard time visualizing the difference. [Jim Jewett] > In a normal generator, someone makes a call to establish the > generator, which then becomes a little island -- anyone can call > the generator, and it returns control back to whoever made the last call. 
> > With the block, every yield returns to a single designated callback. > This callback had to be established at the same time the block was > created, and must be textually inside it. (An indented suite to the > "block XXX:" line.) Doesn't convince me. The common use for a regular generator is in a for-loop, where every yield also returns to a single designated place (calling it callback is really deceptive!). And with a block, you're free to put the generator call ahead of the block so you can call next() on it manually: it = EXPR1 block it: BLOCK1 is totally equivalent to block EXPR1: BLOCK1 but the first form lets you call next() on it as you please (until the block is exited, for sure). > But are there plenty of other use cases for PEP 340? Yes. Patterns like "do this little dance in a try/finally block" and "perform this tune when you catch an XYZ exception" are pretty common in larger systems and are effectively abstracted away using the block-statement and an appropriate iterator. The try/finally use case often also has some setup that needs to go right before the try (and sometimes some more setup that needs to go *inside* the try). Being able to write this once makes it a lot easier when the "little dance" has to be changed everywhere it is performed. > If not, then why do we need PEP 340? Are decorators not strong > enough, or is it just that people aren't comfortable yet? If it is a > matter of comfort or recipies, then the new construct might have > just as much trouble. (So this one is not a loop, and you can tell > the difference because ... uh, just skip that advanced stuff.) PEP 340 and decorators are totally different things, and the only vaguely common use case would be @synchronized, which is *not* a proper use case for decorators, but "safe locking" is definitely a use case for PEP 340. > > Anyway, @synchronized was mostly a demonstration toy; whole > > method calls are rarely the right granularity of locking. > > That is an important difference -- though I'm not sure that the critical > part *shouldn't* be broken out into a separate method. I'll be the judge of that. I have plenty of examples where breaking it out would create an entirely artificial helper method that takes several arguments just because it needs to use stuff that its caller has set up for it. > > I presume you mentally discarded the resource allocation use > > cases where the try/finally statement was the outermost statement > > in the function body, since those would be helped by @synchronized; > > but look more closely at Queue, and you'll find that the two such > > methods use different locks! > > qsize, empty, and full could be done with a lockself decorator. > Effectively, they *are* lockself decorators for the _xxx functions > that subclasses are told to override. Actually you're pointing out a bug in the Queue module: these *should* be using a try/finally clause to ensure the mutex is released even if the inner call raises an exception. I hadn't noticed these before because I was scanning only for "finally". If a locking primitive had been available, I'm sure it would have been used here. > If you're talking about put and get, decorators don't help as much, > but I'm not sure blocks are much better. > > You can't replace the outermost try ... finally with a common decorator > because the locks are self variables. A block, by being inside a method, > would be delayed until self exists -- but that outer lock is only a > tiny fraction of the boilerplate. 
It doesn't help with > [...example deleted...] > I wouldn't object to a helper method, but using a block just to get rid of four > lines (two of which are the literals "try:" and "finally:") seems barely worth > doing, let alone with special new syntax. Well, to me it does; people have been requesting new syntax for this specific case for a long time (that's where PEP 310 is coming from). > > Also the use case for closing a file upon leaving a block, while > > clearly a resource allocation use case, doesn't work well with a > > decorator. > > def autoclose(fn): > def outer(filename, *args, **kwargs): > f = open(filename) > val = fn(f, *args, **kwargs) > f.close() > return val > return outer > > @autoclose > def f1(f): > for line in f: > print line But the auto-closing file, even more than the self-releasing lock, most often occurs in the middle of some code that would be unnatural to turn into a helper method just so that you can use a decorator pattern. In fact your example is so confusing that I can't figure out whether it has a bug or whether I'm just confused. This is *not* a good use case for decorators. > > I just came across another use case that is fairly common in the > > standard library: redirecting sys.stdout. This is just a beauty (in > > fact I'll add it to the PEP): > > > def saving_stdout(f): > > save_stdout = sys.stdout > > try: > > sys.stdout = f > > yield > > finally: > > sys.stdout = save_stdout > > Why does this need a yield? Why not just a regular call to the > function? Because PEP 340 uses yield to pass control to the body of the block-statement. (I have to resist the urge to add, ", dummy!" :-) I can't tell whether you have totally not grasped PEP 340, or you are proposing to solve all its use cases by defining an explicit function or method representing the body of the block. The latter solution leads to way too much ugly code -- all that function-definition boilerplate is worse than the try/finally boilerplate we're trying to hide! > If you're trying to generalize the redirector, then this > also works as a decorator. The nested functions (and the *args, > **kwargs, if you don't inherit from a standard dedcorator) is a > bit of an annoyance, but I'm not sure the new iterator form will > be any easier to explain. > > def saving_stdout(f): > import sys # Just in case... > def captured_stream(fn): > def redirect(*args, **kwargs): > save_stdout = sys.stdout > try: > sys.stdout = f > return fn (*args, **kwargs) > finally: > sys.stdout = save_stdout > return redirect > return captured_stream > > o=StringIO() > @saving_stdout(o) > ... This has absolutely nothing to recommend it. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From gvanrossum at gmail.com Tue May 3 21:48:05 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 12:48:05 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <1f7befae050503121346833d97@mail.gmail.com> References: <001101c5500d$a7be7140$c704a044@oemcomputer> <1f7befae050503121346833d97@mail.gmail.com> Message-ID: [Tim] > Because Queue does use condvars now instead of plain locks, I wouldn't > approve of any gimmick purporting to hide the acquire/release's in > put() or get(): that those are visible is necessary to seeing that > the _condvar_ protocol is being followed ("must acquire() before > wait(); must be acquire()'ed during notify(); no path should leave the > condvar acquire()d 'for a long time' before a wait() or release()"). So you think that this would be obscure? 
A generic condition variable use could look like this: block locking(self.condvar): while not self.items: self.condvar.wait() self.process(self.items) self.items = [] instead of this: self.condvar.acquire() try: while not self.items: self.condvar.wait() self.process(self.items) self.items = [] finally: self.condvar.release() I find that the "block locking" version looks just fine; it makes the scope of the condition variable quite clear despite not having any explicit acquire() or release() calls (there are some abstracted away in the wait() call too!). -- --Guido van Rossum (home page: http://www.python.org/~guido/) From tim.peters at gmail.com Tue May 3 22:10:42 2005 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 3 May 2005 16:10:42 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: Message-ID: <1f7befae0505031310200564c6@mail.gmail.com> ... [Jim Jewett] >> qsize, empty, and full could be done with a lockself decorator. >> Effectively, they *are* lockself decorators for the _xxx functions >> that subclasses are told to override. [Guido] > Actually you're pointing out a bug in the Queue module: these *should* > be using a try/finally clause to ensure the mutex is released even if > the inner call raises an exception. Yup! OTOH, if those dead-simple methods raised an exception, the Queue has probably gone wholly insane anyway. > I hadn't noticed these before because I was scanning only for "finally" > > If a locking primitive had been available, I'm sure it would have been > used here. That too. From hpk at trillke.net Tue May 3 22:14:00 2005 From: hpk at trillke.net (holger krekel) Date: Tue, 3 May 2005 22:14:00 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: Message-ID: <20050503201400.GE30548@solar.trillke.net> Hi Guido, On Mon, May 02, 2005 at 17:55 -0700, Guido van Rossum wrote: > These are the loose ends on the PEP (apart from filling in some > missing sections): > > 1. Decide on a keyword to use, if any. I just read the PEP340 basically the first time so bear with me. First i note that introducing a keyword 'block' would break lots of programs, among it half of PyPy. Unlike many other keywords 'block' is a pretty common variable name. For invoking blocktemplates i like the no-keyword approach, instead. However, i would find it much clearer if *defining* blocktemplates used a new keyword, like: blocktemplate opening(filename, mode="r"): ... because this immediately tells me what the purpose and semantics of the folowing definition is. The original overloading of 'def' to mean generators if the body contains a yield statement was already a matter of discussion (ASFAIK). When i came to Python it was at 2.2 and i remember wondering about this "def" oddity. Extending poor old 'def' functions now to possibly mean block templates gives me semantical overload even if it is justified from an implementation point of view. I am talking purely about (my sense of) code readability here not about implementation. cheers, holger From gmilas at gmail.com Tue May 3 21:50:11 2005 From: gmilas at gmail.com (Gheorghe Milas) Date: Tue, 3 May 2005 19:50:11 +0000 (UTC) Subject: [Python-Dev] 2 words keyword for block Message-ID: I'm not really in position to speak but since I only saw people trying to come up with a keyword only using one word and without much success I would venture to suggest the possibility of making a keyword out of two words. Would there be a huge problem to use 2 words to make up a keyword? 
like for example or if using space is a real problem in template thread_safe(lock): in template redirected_stdout(stream): in template use_and_close_file(path) as file: in template as_transaction(): in template auto_retry(times=3, failas=IOError): From gvanrossum at gmail.com Tue May 3 22:20:38 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 13:20:38 -0700 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <20050503201400.GE30548@solar.trillke.net> References: <20050503201400.GE30548@solar.trillke.net> Message-ID: [Holger] > > 1. Decide on a keyword to use, if any. > > I just read the PEP340 basically the first time so bear with me. Thanks for reviewing! > First i note that introducing a keyword 'block' would break > lots of programs, among it half of PyPy. Unlike many other > keywords 'block' is a pretty common variable name. For > invoking blocktemplates i like the no-keyword approach, instead. Good point (the code from Queue.py quoted by Jim Jewett also uses block as a variable name :-). There has been much argument on both sides. I guess we may need to have a subcommittee to select the keyword (if any) ... Maybe if we can't go without a keyword, 'with' would be okay after all; I'm not so strongly in favor of a Pascal/VB-style with-statement after reading the C# developers' comments (see reference in the PEP). > However, i would find it much clearer if *defining* blocktemplates > used a new keyword, like: > > blocktemplate opening(filename, mode="r"): > ... > > because this immediately tells me what the purpose and semantics > of the folowing definition is. The original overloading of 'def' to > mean generators if the body contains a yield statement was already a > matter of discussion (ASFAIK). When i came to Python it was at 2.2 > and i remember wondering about this "def" oddity. > > Extending poor old 'def' functions now to possibly mean block > templates gives me semantical overload even if it is justified > from an implementation point of view. I am talking purely > about (my sense of) code readability here not about implementation. Hm... Maybe you also want to have separate function and procedure keywords? Or static typing? 'def' can be used to define all sorts of things, that is Python's beauty! -- --Guido van Rossum (home page: http://www.python.org/~guido/) From tim.peters at gmail.com Tue May 3 22:21:35 2005 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 3 May 2005 16:21:35 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <001101c5500d$a7be7140$c704a044@oemcomputer> <1f7befae050503121346833d97@mail.gmail.com> Message-ID: <1f7befae05050313212db5d4df@mail.gmail.com> [Tim] >> Because Queue does use condvars now instead of plain locks, I wouldn't >> approve of any gimmick purporting to hide the acquire/release's in >> put() or get(): that those are visible is necessary to seeing that >> the _condvar_ protocol is being followed ("must acquire() before >> wait(); must be acquire()'ed during notify(); no path should leave the >> condvar acquire()d 'for a long time' before a wait() or release()"). [Guido] > So you think that this would be obscure? 
A generic condition variable > use could look like this: > > block locking(self.condvar): > while not self.items: > self.condvar.wait() > self.process(self.items) > self.items = [] > > instead of this: > > self.condvar.acquire() > try: > while not self.items: > self.condvar.wait() > self.process(self.items) > self.items = [] > finally: > self.condvar.release() > > I find that the "block locking" version looks just fine; it makes the > scope of the condition variable quite clear despite not having any > explicit acquire() or release() calls (there are some abstracted away > in the wait() call too!). Actually typing it all out like that makes it hard to dislike . Yup, that reads fine to me too. I don't think anyone has mentioned this yet, so I will: library writers using Decimal (or more generally HW 754 gimmicks) have a need to fiddle lots of thread-local state ("numeric context"), and must restore it no matter how the routine exits. Like "boost precision to twice the user's value over the next 12 computations, then restore", and "no matter what happens here, restore the incoming value of the overflow-happened flag". It's just another instance of temporarily taking over a shared resource, but I think it's worth mentioning that there are a lot of things "like that" in the world, and to which decorators don't really sanely apply. From jcarlson at uci.edu Tue May 3 22:39:21 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Tue, 03 May 2005 13:39:21 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: <20050503112637.648F.JCARLSON@uci.edu> Message-ID: <20050503132639.6492.JCARLSON@uci.edu> "m.u.k" wrote: > Josiah Carlson wrote in > news:20050503112637.648F.JCARLSON at uci.edu: > > > Offering any hook for Py_FatalError may not even be enough, as some of > > those errors are caused by insufficient memory. What if a hook were > > available, but it couldn't be called because there wasn't enough memory? > > > > Of course there is the option of pre-allocating a few kilobytes, then > > just before one calls the hook, freeing that memory so that the hook can > > execute (assuming the hook is small enough). I'm not sure if this is a > > desireable general mechanic, but it may be sufficient for you. If you > > do figure out a logging mechanism that is almost guaranteed to execute > > on FatalError, post it to sourceforge. > > IMHO this should be left to hooker(apparerently not right word, but you get > the point :) ). If he allocates more mem. or does heavy stuff, that will just > fail. Anyway abort() is a failure too. Either abort() will end the process or > OS will on such a critical error. I'm not talking about doing memory-intensive callbacks, I'm talking about the function call itself. From what I understand, any function call in Python requires a memory allocation. This is trivially true in the case of rentrant Python calls; which requires the allocation of a frame object from heap memory, and in the case of all calls, from C stack memory. If you cannot allocate a frame for __del__ method calling (one of the error conditions), you certainly aren't going to be able to call a Python callback (no heap memory), and may not have enough stack memory required by your logging function; even if it is written in C (especially if you construct a nontrivial portion of the message in memory before it is printed). If I'm wrong, I'd like to hear it, but I'm still waiting for your patch on sourceforge. 
- Josiah From hpk at trillke.net Tue May 3 23:26:36 2005 From: hpk at trillke.net (holger krekel) Date: Tue, 3 May 2005 23:26:36 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> Message-ID: <20050503212636.GF30548@solar.trillke.net> [Guido] > [Holger] > > However, i would find it much clearer if *defining* blocktemplates > > used a new keyword, like: > > > > blocktemplate opening(filename, mode="r"): > > ... > > > > because this immediately tells me what the purpose and semantics > > of the folowing definition is. The original overloading of 'def' to > > mean generators if the body contains a yield statement was already a > > matter of discussion (ASFAIK). When i came to Python it was at 2.2 > > and i remember wondering about this "def" oddity. > > > > Extending poor old 'def' functions now to possibly mean block > > templates gives me semantical overload even if it is justified > > from an implementation point of view. I am talking purely > > about (my sense of) code readability here not about implementation. > > Hm... Maybe you also want to have separate function and procedure > keywords? Or static typing? 'def' can be used to define all sorts of > things, that is Python's beauty! Sure, 'def' is nice and i certainly wouldn't introduce a new keyword for adding e.g. static typing to function 'defs'. But for my taste, blocktemplates derive enough from the old-style function/sub-routine notion that many people still think of when seing a 'def'. When (new) people would see something like 'blocktemplate ...:' they know they have to look it up in the language documentation or in some book under 'blocktemplate' instead of trying to figure out (what the hell) this "function" or "generator" does and how they can use it. Or they might simply think they can invoke it from a for-loop which - as far as i understand - could lead to silent errors, no? Let me add that with the growing number of Python programmers (as stated in your Pycon2005 keynote) it seems to make sense to increase emphasis on how new syntax/concepts will be viewed/used by possibly 100'dreds of thousands of programmers already familiar with (some version of) Python. But i also see your point of confronting people with the fact that Python has a nice unified 'def' statement so i guess it's a balancing act. cheers, holger From python at rcn.com Tue May 3 23:30:23 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 17:30:23 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: Message-ID: <001901c55027$50c101e0$c704a044@oemcomputer> > > In contrast, the new use of yield differs in that the suspended frame > > transfers control from the encloser to the enclosed. > > Why does your notion of who encloses whom suddenly reverse when you go > from a for-loop to a block-statement? This all feels very strange to > me. After another reading of the PEP, it seems fine. On the earlier readings, the "yield" felt disorienting because the body of the block is subordinate to the block-iterator yet its code is co-located with the caller (albeit set-off with a colon and indentation). > I meant to use this as an example of the unsuitability of the > @synchronized decorator, since it implies that all synchronization is > on the same mutex, thereby providing a use case for the locking > block-statement. > > I suspect we're violently in agreement though. Right :-) > > This is the strongest example so far. 
When adding it to the PEP, it > > would be useful to contrast the code with simpler alternatives like PEP > > 288's g.throw() or PEP 325's g.close(). On the plus side, the > > block-iterator approach factors out code common to multiple callers. On > > the minus side, the other PEPs involve simpler mechanisms and their > > learning curve would be nearly zero. These pluses and minuses are > > important because apply equally to all examples using blocks for > > initialization/finalization. > > Where do you see a learning curve for blocks? Altering the meaning of a for-loop; introducing a new keyword; extending the semantics of "break" and "continue"; allowing try/finally inside a generator; introducing new control flow; adding new magic methods __next__ and __exit__; adding a new context for "as"; and tranforming "yield" from statement semantics to expression semantics. This isn't a lightweight proposal and not one where we get transference of knowledge from other languages (except for a few users of Ruby, Smalltalk, etc). By comparision, g.throw() or g.close() are trivially simple approaches to generator/iterator finalization. In section on new for-loop specification, what is the purpose of "arg"? Can it be replaced with the constant None? itr = iter(EXPR1) brk = False while True: try: VAR1 = next(itr, None) except StopIteration: brk = True break BLOCK1 if brk: BLOCK2 In "block expr as var", can "var" be any lvalue? block context() as inputfil, outputfil, errorfil: for i, line in enumerate(inputfil): if not checkformat(line): print >> errorfil, line else: print >> outputfil, secret_recipe(line) In re-reading the examples, it occurred to me that the word "block" already has meaning in the context of threading.Lock.acquire() which has an optional blocking argument defaulting to 1. In example 4, consider adding a comment that the "continue" has its normal (non-extending) meaning. The examples should demonstrate the operation of the extended form of "continue", "break", and "return" in the body of the block. Raymond From eric.nieuwland at xs4all.nl Tue May 3 23:44:39 2005 From: eric.nieuwland at xs4all.nl (Eric Nieuwland) Date: Tue, 3 May 2005 23:44:39 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: <37179cd212e38a8e041b0d39ee0160de@xs4all.nl> Guido van Rossum wrote: > [Skip Montanaro] >> To the casual observer, this >> looks like "break" should break out of the loop: >> >> while True: >> block synchronized(v1): >> ... >> if v1.field: >> break >> time.sleep(1) > > Without 'block' this would be written as try/finally. And my point is > that people just don't write try/finally inside a while loop very > often (I found *no* examples in the entire standard library). Errr... Dutch example: Dining Philosophers (Dijkstra) --eric From bjourne at gmail.com Tue May 3 23:54:45 2005 From: bjourne at gmail.com (=?ISO-8859-1?Q?BJ=F6rn_Lindqvist?=) Date: Tue, 3 May 2005 23:54:45 +0200 Subject: [Python-Dev] PEP 340: Only for try/finally? In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: <740c3aec05050314544178f57f@mail.gmail.com> > > Guido> How many try/finally statements have you written inside a loop? > > Guido> In my experience this is extreeeemely rare. 
I found no > > Guido> occurrences in the standard library. > > [Skip again] > > How'd we start talking about try/finally? > > Because it provides by far the dominant use case for 'block'. The > block-statement is intended to replace many boilerplace uses of > try/finally. In addition, it's also a coroutine invocation primitive. Maybe I'm not understanding something, but why should "block" only be for less boilerplate in try/finally's? I spent an hour grepping through the standard library and there are indeed lots of use cases for some blocks to replace try/finallys. There are opportunities for block opening(file) and block locked(mutex) everywhere! But why stop there? Lots of functions that takes a callable as argument could be upgraded to use the new block syntax. Because it is a cool way to do template method, isn't it? Take wrapper() in curses/wrapper.py for example. Why have it like this: wrapper(curses_wrapped_main) when you can have it like this: .block wrapper(): . (main program stuff) . (...) Or assertRaises in unittest.py, why call it like this: self.assertRaises(TypeError, lambda: a*x) When you can squash the lambda like this: .block self.assertRaises(TypeError): . a*x Or for another use case, in gl-code you often write glBegin().. glDrawBlah().. glEnd(). Make it properly indented!: .block glNowDraw(): # glBegin(); yield; glEnd() . glDrawBlah() Make your own repeat-until loop: .def until(cond): . while True: . yield None . if cond: . break .block until(lambda: s == "quit"): . s = sys.stdin.readline() It seems like the possibilities are endless. Maybe too endless? Because this new feature is so similar to anonymous functions, but is not quite anonymous functions, so why not introduce anonymous functions instead, that could make all the things block can, and more? But as I said, I'm misunderstanding something. -- mvh Bj?rn From pje at telecommunity.com Wed May 4 00:02:56 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 18:02:56 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <001901c55027$50c101e0$c704a044@oemcomputer> References: Message-ID: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> At 05:30 PM 5/3/05 -0400, Raymond Hettinger wrote: >By comparision, g.throw() or g.close() are trivially simple approaches >to generator/iterator finalization. That reminds me of something; in PEP 333 I proposed use of a 'close()' attribute in anticipation of PEP 325, so that web applications implemented as generators could take advantage of resource cleanup. Is there any chance that as part of PEP 340, 'close()' could translate to the same as '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' is going to be a bit of a pain, especially since there's code in the field now with that assumption. From gvanrossum at gmail.com Wed May 4 00:04:46 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 15:04:46 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <001901c55027$50c101e0$c704a044@oemcomputer> References: <001901c55027$50c101e0$c704a044@oemcomputer> Message-ID: [Guido] > > Where do you see a learning curve for blocks? 
[Raymond] > Altering the meaning of a for-loop; introducing a new keyword; extending > the semantics of "break" and "continue"; allowing try/finally inside a > generator; introducing new control flow; adding new magic methods > __next__ and __exit__; adding a new context for "as"; and tranforming > "yield" from statement semantics to expression semantics. This isn't a > lightweight proposal and not one where we get transference of knowledge > from other languages (except for a few users of Ruby, Smalltalk, etc). [Bah, gmail just lost my draft. :-( Trying to reconstruct...] But there are several separable proposals in the PEP. Using "continue EXPR" which calls its.__next__(EXPR) which becomes the return value of a yield-expression is entirely orthogonal (and come to think of it the PEP needs a motivating example for this). And come to think of it, using a generator to "drive" a block statement is also separable; with just the definition of the block statement from the PEP you could implement all the examples using a class (similar to example 6, which is easily turned into a template). I think that seeing just two of the examples would be enough for most people to figure out how to write their own, so that's not much of a learning curve IMO. > By comparision, g.throw() or g.close() are trivially simple approaches > to generator/iterator finalization. But much more clumsy to use since you have to write your own try/finally. > In section on new for-loop specification, what is the purpose of "arg"? > Can it be replaced with the constant None? No, it is set by the "continue EXPR" translation given just below it. I'll add a comment; other people also missed this. > In "block expr as var", can "var" be any lvalue? Yes. That's what I meant by "VAR1 is an arbitrary assignment target (which may be a comma-separated list)". I'm adding an example that shows this usage. > In re-reading the examples, it occurred to me that the word "block" > already has meaning in the context of threading.Lock.acquire() which has > an optional blocking argument defaulting to 1. Yeah, Holger also pointed out that block is a common variable name... :-( > In example 4, consider adding a comment that the "continue" has its > normal (non-extending) meaning. I'd rather not, since this would just increase the confusion between the body of the generator (where yield has a special meaning) vs. the body of the block-statement (where continue, break, return and exceptions have a special meaning). Also note example 5, which has a yield inside a block-statement. This is the block statement's equivalent to using a for-loop with a yield in its body in a regular generator when it is invoking another iterator or generator recursively. > The examples should demonstrate the operation of the extended form of > "continue", "break", and "return" in the body of the block. Good point. (Although break and return don't really have an extended form -- they just get new semantics in a block-statement.) I'll have to think about those. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From pje at telecommunity.com Wed May 4 00:11:40 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 18:11:40 -0400 Subject: [Python-Dev] PEP 340: Only for try/finally? 
In-Reply-To: <740c3aec05050314544178f57f@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: <5.1.1.6.0.20050503180546.03471d70@mail.telecommunity.com> At 11:54 PM 5/3/05 +0200, BJ?rn Lindqvist wrote: >It seems like the possibilities are endless. Maybe too endless? >Because this new feature is so similar to anonymous functions, but is >not quite anonymous functions, so why not introduce anonymous >functions instead, that could make all the things block can, and more? >But as I said, I'm misunderstanding something. Anonymous functions can't rebind variables in their enclosing function. It could be argued that it's better to fix this, rather than inventing a new macro-like facility, but I don't know how such a rebinding facility could preserve readability as well as PEP 340 does. Also, many of your examples are indeed improvements over calling a function that takes a function. The block syntax provides a guarantee that the block will be executed immediately or not at all. Once you are past the block suite in the code, you know it will not be re-executed, because no reference to it is ever held by the called function. You do not have this same guarantee when you see a function-taking-function being invoked. So, a block suite tells you that the control flow is more-or-less linear, whereas a function definition raises the question of *when* that function will be executed, and whether you have exhaustive knowledge of the possible places from which it may be called. From nidoizo at yahoo.com Wed May 4 00:29:33 2005 From: nidoizo at yahoo.com (Nicolas Fleury) Date: Tue, 03 May 2005 18:29:33 -0400 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: Guido van Rossum wrote: > [Skip Montanaro] >> Guido> How many try/finally statements have you written inside a loop? >> Guido> In my experience this is extreeeemely rare. I found no >> Guido> occurrences in the standard library. > >>How'd we start talking about try/finally? > > Because it provides by far the dominant use case for 'block'. The > block-statement is intended to replace many boilerplace uses of > try/finally. In addition, it's also a coroutine invocation primitive. I would expect programmers to do more than only replace existing try/finally blocks. The support for RAII patterns in Python might result in more use of RAII primitives and some may fit very well inside a loop. It might not be a bad idea to look at what other languages are doing with RAII. Also, even if there's no occurence right now in the standard library, it doesn't mean it has always been the case in the code evolution, where debugging such pitfall would not be cool. FWIW, I expect most generators used in block-syntax to not be loops. What would imply to support these to pass "break" to parent loop at run-time? Maybe generators are not the way to go, but could be supported natively by providing a __block__ function, very similarly to sequences providing an __iter__ function for for-loops? We could avoid explaining to a newbie why the following code doesn't work if "opening" could be implemented in way that it works. 
for filename in filenames: block opening(filename) as file: if someReason: break By the way, FWIW, my preference if to have no keyword, making it clearer that some block statements are loops and others not, but probably amplifying the "break" problem. Regards, Nicolas From gvanrossum at gmail.com Wed May 4 00:33:37 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 15:33:37 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> References: <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> Message-ID: [Phillip] > That reminds me of something; in PEP 333 I proposed use of a 'close()' > attribute in anticipation of PEP 325, so that web applications implemented > as generators could take advantage of resource cleanup. Is there any > chance that as part of PEP 340, 'close()' could translate to the same as > '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' > is going to be a bit of a pain, especially since there's code in the field > now with that assumption. Maybe if you drop support for the "separate protocol" alternative... :-) I had never heard of that PEP. How much code is there in the field? Written by whom? I suppose you can always write a decorator that takes care of the mapping. I suppose it should catch and ignore the StopIteration that __exit__(StopIteration) is likely to throw. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From nbastin at opnet.com Wed May 4 00:36:23 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Tue, 3 May 2005 18:36:23 -0400 Subject: [Python-Dev] Py_UNICODE madness Message-ID: The documentation for Py_UNICODE states the following: "This type represents a 16-bit unsigned storage type which is used by Python internally as basis for holding Unicode ordinals. On platforms where wchar_t is available and also has 16-bits, Py_UNICODE is a typedef alias for wchar_t to enhance native platform compatibility. On all other platforms, Py_UNICODE is a typedef alias for unsigned short." However, we have found this not to be true on at least certain RedHat versions (maybe all, but I'm not willing to say that at this point). pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, and PY_UNICODE_SIZE is 4. Needless to say, this isn't consistent with the docs. It also creates quite a few problems when attempting to interface Python with other libraries which produce unicode data. Is this a bug, or is this behaviour intended? It turns out that at some point in the past, this created problems for tkinter as well, so someone just changed the internal unicode representation in tkinter to be 4 bytes as well, rather than tracking down the real source of the problem. Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is it dependent on your platform? (in which case we can give up now on Python unicode compatibility with any other libraries). At the very least, if we can't guarantee the internal representation, then the PyUnicode_FromUnicode API needs to go away, and be replaced with something capable of transcoding various unicode inputs into the internal python representation. -- Nick From gvanrossum at gmail.com Wed May 4 00:39:04 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 15:39:04 -0700 Subject: [Python-Dev] PEP 340: Breaking out. 
In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: > FWIW, I expect most generators used in block-syntax to not be loops. > What would imply to support these to pass "break" to parent loop at > run-time? I proposed this at some point during the discussion leading up to the PEP and it was boohed away as too fragile (and I agree). You're just going to have to learn to deal with it, just as you can't break out of two nested loops (but you can return from the innermost loop). > Maybe generators are not the way to go, but could be > supported natively by providing a __block__ function, very similarly to > sequences providing an __iter__ function for for-loops? Sorry, I have no idea what you are proposing here. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From gvanrossum at gmail.com Wed May 4 00:44:13 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 15:44:13 -0700 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: References: Message-ID: I think that documentation is wrong; AFAIK Py_UNICODE has always been allowed to be either 16 or 32 bits, and the source code goes through great lengths to make sure that you get a link error if you try to combine extensions built with different assumptions about its size. On 5/3/05, Nicholas Bastin wrote: > The documentation for Py_UNICODE states the following: > > "This type represents a 16-bit unsigned storage type which is used by > Python internally as basis for holding Unicode ordinals. On platforms > where wchar_t is available and also has 16-bits, Py_UNICODE is a > typedef alias for wchar_t to enhance native platform compatibility. On > all other platforms, Py_UNICODE is a typedef alias for unsigned > short." > > However, we have found this not to be true on at least certain RedHat > versions (maybe all, but I'm not willing to say that at this point). > pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, > and PY_UNICODE_SIZE is 4. Needless to say, this isn't consistent with > the docs. It also creates quite a few problems when attempting to > interface Python with other libraries which produce unicode data. > > Is this a bug, or is this behaviour intended? > > It turns out that at some point in the past, this created problems for > tkinter as well, so someone just changed the internal unicode > representation in tkinter to be 4 bytes as well, rather than tracking > down the real source of the problem. > > Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is > it dependent on your platform? (in which case we can give up now on > Python unicode compatibility with any other libraries). At the very > least, if we can't guarantee the internal representation, then the > PyUnicode_FromUnicode API needs to go away, and be replaced with > something capable of transcoding various unicode inputs into the > internal python representation. 
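(One concrete data point that may help here: whether a given interpreter was built with a 2-byte or a 4-byte Py_UNICODE can be checked from Python itself via sys.maxunicode. A minimal sketch, stdlib only:)

    import sys

    if sys.maxunicode == 0xFFFF:
        print "narrow build: Py_UNICODE is 2 bytes"
    else:
        print "wide build: Py_UNICODE is 4 bytes"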
> 
> --
> Nick
> 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> http://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org
> 

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)

From jimjjewett at gmail.com Wed May 4 00:56:50 2005
From: jimjjewett at gmail.com (Jim Jewett)
Date: Tue, 3 May 2005 18:56:50 -0400
Subject: [Python-Dev] PEP 340 -- concept clarification
In-Reply-To: 
References: 
Message-ID: 

Summary:

Resource Managers are a good idea.
First Class Suites may be a good idea.

Block Iterators try to split the difference. They're not as powerful
as First Class Suites, and not as straightforward as Resource
Managers. This particular middle ground didn't work out so well.

On 5/3/05, Guido van Rossum wrote:
> [Jim Jewett]
...
> > With the block, every yield returns to a single designated callback.
> > This callback had to be established at the same time the block was
> > created, and must be textually inside it. 
(An indented suite to the > > "block XXX:" line.) > Doesn't convince me. The common use for a regular generator is in a > for-loop, where every yield also returns to a single designated place > (calling it callback is really deceptive!). I do not consider the body of a for-loop a to be callback; the generator has no knowledge of that body. But with a Block Iterator, the generator (or rather, its unrolled version) does need to textually contain the to-be-included suite -- which is why that suite smells like a callback function that just doesn't happen to be named. > And with a block, you're free to put the generator call ahead of the > block so you can call next() on it manually: > > it = EXPR1 > block it: > BLOCK1 > ... lets you call next() on it as you please (until the > block is exited, for sure). For a Resource Manager, the only thing this could do is effectively discard the BLOCK1, because the yields would have been used up (and the resource deallocated). I suppose this is another spelling of "resources are not loops". > > But are there plenty of other use cases for PEP 340? > Yes. Patterns like "do this little dance in a try/finally block" and > "perform this tune when you catch an XYZ exception" are pretty common ... Let me rephrase ... The Block Iterator syntax gets awkward if it needs to yield more than once (and the exits are not interchangable). You have said that is OK because most Resource Managers only yield once. But if you're willing to accept that, then why not just limit it to a Resource Manager instead of an Iterator? Resource Managers could look similar to the current proposal, but would be less ambitious. They should have absolutely no connection to loops/iterators/generators. There should be no internal secret loop. if they use the "yield" keyword, it should be described as "yielding control" rather than "yielding the next value." There would be only one yielding of control per Resource Manager. If limiting the concept to Resource Managers is not acceptable, then I still don't think Block Iterators are the right answer -- though First Class Suites might be. (And so might "No Changes at all".) Reasoning: If there is only one yield, then you're really just wrapping the call to the (unnamed) suite. (Q) Why are decorators not appropriate? (A1) In some cases, the wrapper needs to capture an instance-variable, which isn't available at definition-time. (A2) Decorators can be ugly. This is often because the need to return a complete replacement callable leads to too many nested functions. These are both problems with decorators. They do argue for improving the decorator syntax, but not for throwing out the concept. I don't think that Block Iterators will really clear things up -- to me, they just look like a different variety of fog. If decoration doesn't work, why not use a regular function that takes a callback? Pass the callback instead of defining an anonymous suite. Call the callback instead of writing the single yield. ... > ... you are proposing to solve all its use cases by defining an > explicit function or method representing the body of the block. Yes. > The latter solution leads to way too much ugly code -- all that > function-definition boilerplate is worse than the try/finally > boilerplate we're trying to hide! In the cases I've actually seen, the ugly function definition portions are in the decorator, rather than the regular function. 
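A rough sketch of what such a single-yield template could look like if it were driven by an ordinary helper function rather than new syntax (all names here are hypothetical, and the would-be suite is passed as a plain callable):

    import threading

    def run_managed(template, body):
        gen = template()
        gen.next()              # run the acquire half, stop at the single yield
        try:
            body()              # the code that would be the indented suite
        finally:
            try:
                gen.next()      # resume past the yield so the release half runs
            except StopIteration:
                pass            # release ran even if body() raised

    def locked(lock):
        def template():
            lock.acquire()
            yield               # the one "yield of control"
            lock.release()
        return template

    mylock = threading.Lock()
    hits = []
    run_managed(locked(mylock), lambda: hits.append(1))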
It trades a little ugliness that gets repeated all over the place for a lot of ugliness that happens only once (in the decorator). That said, I'm willing to believe that breaking out a method might sometimes be a bad idea. In which case you probably want First Class (and decorable) Suites. If First Class Suites are not acceptable in general, then let's figure out where they are acceptable. For me, Resource Manager is a good use case, but Block Iterator is not. -jJ From gvanrossum at gmail.com Wed May 4 01:08:29 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 16:08:29 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: Message-ID: Sorry Jim, but I just don't think you & I were intended to be on the same language design committee. Nothing you say seems to be making any sense to me these days. Maybe someone else can channel you effectively, but I'm not going to try to do a line-by-line response to your email quoted below. On 5/3/05, Jim Jewett wrote: > Summary: > > Resource Managers are a good idea. > First Class Suites may be a good idea. > > Block Iterators try to split the difference. They're not as powerful > as First Class Suites, and not as straightforward as Resource > Managers. This particular middle ground didn't work out so well. > > On 5/3/05, Guido van Rossum wrote: > > [Jim Jewett] > ... > > > With the block, every yield returns to a single designated callback. > > > This callback had to be established at the same time the block was > > > created, and must be textually inside it. (An indented suite to the > > > "block XXX:" line.) > > > Doesn't convince me. The common use for a regular generator is in a > > for-loop, where every yield also returns to a single designated place > > (calling it callback is really deceptive!). > > I do not consider the body of a for-loop a to be callback; the generator > has no knowledge of that body. > > But with a Block Iterator, the generator (or rather, its unrolled version) > does need to textually contain the to-be-included suite -- which is why > that suite smells like a callback function that just doesn't happen to be > named. > > > And with a block, you're free to put the generator call ahead of the > > block so you can call next() on it manually: > > > > it = EXPR1 > > block it: > > BLOCK1 > > > ... lets you call next() on it as you please (until the > > block is exited, for sure). > > For a Resource Manager, the only thing this could do is effectively > discard the BLOCK1, because the yields would have been used > up (and the resource deallocated). > > I suppose this is another spelling of "resources are not loops". > > > > But are there plenty of other use cases for PEP 340? > > > Yes. Patterns like "do this little dance in a try/finally block" and > > "perform this tune when you catch an XYZ exception" are pretty common > > ... > > Let me rephrase ... > > The Block Iterator syntax gets awkward if it needs to yield more than > once (and the exits are not interchangable). You have said that is OK > because most Resource Managers only yield once. > > But if you're willing to accept that, then why not just limit it to a Resource > Manager instead of an Iterator? Resource Managers could look similar > to the current proposal, but would be less ambitious. They should have > absolutely no connection to loops/iterators/generators. There should be > no internal secret loop. 
if they use the "yield" keyword, it should be > described as "yielding control" rather than "yielding the next value." There > would be only one yielding of control per Resource Manager. > > If limiting the concept to Resource Managers is not acceptable, then > I still don't think Block Iterators are the right answer -- though First Class > Suites might be. (And so might "No Changes at all".) > > Reasoning: > > If there is only one yield, then you're really just wrapping the call to > the (unnamed) suite. > > (Q) Why are decorators not appropriate? > > (A1) In some cases, the wrapper needs to capture an > instance-variable, which isn't available at definition-time. > (A2) Decorators can be ugly. This is often because the > need to return a complete replacement callable leads to too > many nested functions. > > These are both problems with decorators. They do argue for > improving the decorator syntax, but not for throwing out the > concept. I don't think that Block Iterators will really clear things > up -- to me, they just look like a different variety of fog. > > If decoration doesn't work, why not use a regular function > that takes a callback? Pass the callback instead of defining an > anonymous suite. Call the callback instead of writing the single > yield. > > ... > > > ... you are proposing to solve all its use cases by defining an > > explicit function or method representing the body of the block. > > Yes. > > > The latter solution leads to way too much ugly code -- all that > > function-definition boilerplate is worse than the try/finally > > boilerplate we're trying to hide! > > In the cases I've actually seen, the ugly function definition portions > are in the decorator, rather than the regular function. It trades a > little ugliness that gets repeated all over the place for a lot of ugliness > that happens only once (in the decorator). > > That said, I'm willing to believe that breaking out a method might > sometimes be a bad idea. In which case you probably want > First Class (and decorable) Suites. > > If First Class Suites are not acceptable in general, then let's figure > out where they are acceptable. For me, Resource Manager is a good > use case, but Block Iterator is not. > > -jJ > -- --Guido van Rossum (home page: http://www.python.org/~guido/) From pje at telecommunity.com Wed May 4 01:27:42 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 19:27:42 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> Message-ID: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> At 03:33 PM 5/3/05 -0700, Guido van Rossum wrote: >[Phillip] > > That reminds me of something; in PEP 333 I proposed use of a 'close()' > > attribute in anticipation of PEP 325, so that web applications implemented > > as generators could take advantage of resource cleanup. Is there any > > chance that as part of PEP 340, 'close()' could translate to the same as > > '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' > > is going to be a bit of a pain, especially since there's code in the field > > now with that assumption. > >Maybe if you drop support for the "separate protocol" alternative... :-) I don't understand you. Are you suggesting a horse trade, or...? >I had never heard of that PEP. How much code is there in the field? 
Maybe a dozen or so web applications and frameworks (including Zope, Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and mod_python). A lot of the servers are based on my wsgiref library, though, so it probably wouldn't be too horrible a job to make everybody add support; I might even be able to fudge wsgiref so that wsgiref-based servers don't even see an issue. Modifying the spec is potentially more controversial, however; it'll have to go past the Web-SIG, and I assume the first thing that'll be asked is, "Why aren't generators getting a close() method then?", so I figured I should ask that question first. I'd completely forgotten about this being an issue until Raymond mentioned g.close(); I'd previously gotten the impression that PEP 325 was expected to be approved, otherwise I wouldn't have written support for it into PEP 333. >Written by whom? I used to know who all had written implementations, but there are now too many to keep track of. From pje at telecommunity.com Wed May 4 01:36:47 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 19:36:47 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> References: <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> Message-ID: <5.1.1.6.0.20050503193348.031096f0@mail.telecommunity.com> At 07:27 PM 5/3/05 -0400, Phillip J. Eby wrote: >Modifying the spec is potentially more controversial, however; it'll have >to go past the Web-SIG, and I assume the first thing that'll be asked is, >"Why aren't generators getting a close() method then?", so I figured I >should ask that question first. You know what, never mind. I'm still going to write the Web-SIG so they know the change is coming, but this is really a very minor thing; just a feature we won't get "for free" as a side effect of PEP 325. Your decorator idea is a trivial solution, but it would also be trivial to allow WSGI server implementations to call __exit__ on generators. None of this affects existing code in the field, because today you can't write a try/finally in a generator anyway. Therefore, nobody is relying on this feature, therefore it's basically moot. From gvanrossum at gmail.com Wed May 4 01:41:53 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 16:41:53 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> References: <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> Message-ID: > >Maybe if you drop support for the "separate protocol" alternative... :-) > > I don't understand you. Are you suggesting a horse trade, or...? Only tongue-in-cheek. :-) > >I had never heard of that PEP. How much code is there in the field? > > Maybe a dozen or so web applications and frameworks (including Zope, > Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and > mod_python). A lot of the servers are based on my wsgiref library, though, > so it probably wouldn't be too horrible a job to make everybody add > support; I might even be able to fudge wsgiref so that wsgiref-based > servers don't even see an issue. 
> > Modifying the spec is potentially more controversial, however; it'll have > to go past the Web-SIG, and I assume the first thing that'll be asked is, > "Why aren't generators getting a close() method then?", so I figured I > should ask that question first. > > I'd completely forgotten about this being an issue until Raymond mentioned > g.close(); I'd previously gotten the impression that PEP 325 was expected > to be approved, otherwise I wouldn't have written support for it into PEP 333. > > >Written by whom? > > I used to know who all had written implementations, but there are now too > many to keep track of. Given all that, it's not infeasible to add a close() method to generators as a shortcut for this: def close(self): try: self.__exit__(StopIteration) except StopIteration: pass else: # __exit__() didn't raise RuntimeError("or some other exception") I'd like the block statement to be defined exclusively in terms of __exit__() though. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From tdelaney at avaya.com Wed May 4 01:45:38 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Wed, 4 May 2005 09:45:38 +1000 Subject: [Python-Dev] PEP 340 -- loose ends Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204D9@au3010avexu1.global.avaya.com> Another loose end (which can partially explain why I still thought __next__ took an exception ;) In "Specification: Generator Exit Handling":: "When __next__() is called with an argument that is not None, the yield-expression that it resumes will return the value attribute of the argument." I think this should read:: "When __next__() is called with an argument that is not None, the yield-expression that it resumes will return the argument." Tim Delaney From python at jwp.name Wed May 4 01:54:27 2005 From: python at jwp.name (James William Pye) Date: Tue, 03 May 2005 16:54:27 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: <20050503175422.GF8344@unpythonic.net> References: <20050503175422.GF8344@unpythonic.net> Message-ID: <1115164467.62180.17.camel@localhost> On Tue, 2005-05-03 at 12:54 -0500, Jeff Epler wrote: > On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote: > > But tell me, what do you want the process to do instead of > > terminating? Py_FatalError is used in situations where raising an > > exception is impossible or would do more harm than good. > > In an application which embeds Python, I want to show the application's > standard error dialog, which doesn't call any Python APIs (but does do > things like capture the call stack at the time of the error). For this > use, it doesn't matter that no further calls to those APIs are possible. > > Jeff +1 Here. In my case(postgresql), it would probably be wiser to map Py_Fatal's to Postgres' ereport(FATAL,(...)), as it does appear to do some cleaning up on exit, and if there's a remote user, it could actually give the user the message. [http://python.project.postgresql.org] -- Regards, James William Pye From python at rcn.com Wed May 4 01:57:02 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 19:57:02 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: Message-ID: <000601c5503b$cdaa26a0$41b6958d@oemcomputer> > But there are several separable proposals in the PEP. Using "continue > EXPR" which calls its.__next__(EXPR) which becomes the return value of > a yield-expression is entirely orthogonal (and come to think of it the > PEP needs a motivating example for this).
> > And come to think of it, using a generator to "drive" a block > statement is also separable; with just the definition of the block > statement from the PEP you could implement all the examples using a > class (similar to example 6, which is easily turned into a template). I think that realization is important. It would be great to have a section of the PEP that focuses on separability and matching features to benefits. Start with above observation that the proposed examples can be achieved with generators driving the block statement. When the discussion hits comp.lang.python, a separability section will help focus the conversation (there's a flaw/issue/dislike about feature x; however, features y/z and related benefits do not depend on x). Essentially, having generators as block drivers is the base proposal. Everything else is an elaboration. > > By comparision, g.throw() or g.close() are trivially simple approaches > > to generator/iterator finalization. > > But much more clumsy to use since you have to write your own try/finally. Sometimes easy makes up for clumsy. > > In re-reading the examples, it occurred to me that the word "block" > > already has meaning in the context of threading.Lock.acquire() which has > > an optional blocking argument defaulting to 1. > > Yeah, Holger also pointed out that block is a common variable name... :-( Someone mentioned "suite" as a suitable alternative. That word seems to encompass the same conceptual space without encroaching on existing variable and argument names. Also, "suite" reads as a noun. In contrast, "block" has a verb form that too easily misconnects with the name of the block-iterator expression -- what comes to mind when you see block sender() or block next_message(). Performance-wise, I cringe at the thought of adding any weight at all to the for-loop semantics. The current version is super lightweight and clean. Adding anything to it will likely have a comparatively strong negative effect on timings. It's too early for that discussion, but keep it in mind. That's pretty much it for my first readings of the PEP. All-in-all it has come together nicely. Raymond From python at rcn.com Wed May 4 01:59:31 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 19:59:31 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: Message-ID: <000701c5503c$265212e0$41b6958d@oemcomputer> > it's not infeasible to add a close() method to > generators as a shortcut for this: > > def close(self): > try: > self.__exit__(StopIteration) > except StopIteration: > break > else: > # __exit__() didn't > raise RuntimeError("or some other exception") > > I'd like the block statement to be defined exclusively in terms of > __exit__() though. That sounds like a winner. Raymond From pje at telecommunity.com Wed May 4 02:05:23 2005 From: pje at telecommunity.com (Phillip J. 
Eby) Date: Tue, 03 May 2005 20:05:23 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> Message-ID: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote: >Given all that, it's not infeasible to add a close() method to >generators as a shortcut for this: > > def close(self): > try: > self.__exit__(StopIteration) > except StopIteration: > break > else: > # __exit__() didn't > raise RuntimeError("or some other exception") > >I'd like the block statement to be defined exclusively in terms of >__exit__() though. Sure. PEP 325 proposes a "CloseGenerator" exception in place of "StopIteration", however, because: """ Issues: should StopIteration be reused for this purpose? Probably not. We would like close to be a harmless operation for legacy generators, which could contain code catching StopIteration to deal with other generators/iterators. """ I don't know enough about the issue to offer either support or opposition for this idea, though. From gvanrossum at gmail.com Wed May 4 02:07:39 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 17:07:39 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <000601c5503b$cdaa26a0$41b6958d@oemcomputer> References: <000601c5503b$cdaa26a0$41b6958d@oemcomputer> Message-ID: [Guido] > > And come to think of it, using a generator to "drive" a block > > statement is also separable; with just the definition of the block > > statement from the PEP you could implement all the examples using a > > class (similar to example 6, which is easily turned into a template). [Raymond Hettinger] > I think that realization is important. It would be great to have a > section of the PEP that focuses on separability and matching features to > benefits. Start with above observation that the proposed examples can > be achieved with generators driving the block statement. Good idea. I'm kind of stuck for time (have used up most of my Python time for the next few weeks) -- if you or someone else could volunteer some text I'd appreciate it. > When the discussion hits comp.lang.python, a separability section will > help focus the conversation (there's a flaw/issue/dislike about feature > x; however, features y/z and related benefits do not depend on x). Right. The PEP started with me not worrying too much about motivation or use cases but instead focusing on precise specification of the mechanisms, since there was a lot of confusion over that. Now that's out of the way, motivation (you might call it "spin" :-) becomes more important. > Essentially, having generators as block drivers is the base proposal. > Everything else is an elaboration. Right. > Someone mentioned "suite" as a suitable alternative. That word seems to > encompass the same conceptual space without encroaching on existing > variable and argument names. Alas, the word "suite" is used extensively when describing Python's syntax. > Performance-wise, I cringe at the thought of adding any weight at all to > the for-loop semantics. The current version is super lightweight and > clean. Adding anything to it will likely have a comparatively strong > negative effect on timings. It's too early for that discussion, but > keep it in mind. 
A for-loop without a "continue EXPR" in it shouldn't need to change at all; the tp_iternext slot could be filled with either __next__ or next whichever is defined. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From gvanrossum at gmail.com Wed May 4 02:17:46 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 17:17:46 -0700 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> References: <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> Message-ID: On 5/3/05, Phillip J. Eby wrote: > At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote: > >Given all that, it's not infeasible to add a close() method to > >generators as a shortcut for this: > > > > def close(self): > > try: > > self.__exit__(StopIteration) > > except StopIteration: > > break > > else: > > # __exit__() didn't > > raise RuntimeError("or some other exception") > > > >I'd like the block statement to be defined exclusively in terms of > >__exit__() though. (So do you want this feature now or not? Earlier you said it was no big deal.) > Sure. PEP 325 proposes a "CloseGenerator" exception in place of > "StopIteration", however, because: > > """ > Issues: should StopIteration be reused for this purpose? Probably > not. We would like close to be a harmless operation for legacy > generators, which could contain code catching StopIteration to > deal with other generators/iterators. > """ > > I don't know enough about the issue to offer either support or opposition > for this idea, though. That would be an issue for the generator finalization proposed by the PEP as well. But I kind of doubt that it's an issue; you'd have to have a try/except catching StopIteration around a yield statement that resumes the generator before this becomes an issue, and that sounds extremely improbable. If at all possible I'd rather not have to define a new exception for this purpose. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From tdelaney at avaya.com Wed May 4 02:31:18 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Wed, 4 May 2005 10:31:18 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com> Guido van Rossum wrote: > I'd like the block statement to be defined exclusively in terms of > __exit__() though. This does actually suggest something to me (note - just a thought - no real idea if it's got any merit). Are there any use cases proposed for the block-statement (excluding the for-loop) that do *not* involve resource cleanup (i.e. need an __exit__)? This could be the distinguishing feature between for-loops and block-statements: 1. If an iterator declares __exit__, it cannot be used in a for-loop. For-loops do not guarantee resource cleanup. 2. If an iterator does not declare __exit__, it cannot be used in a block-statement. Block-statements guarantee resource cleanup. This gives separation of API (and thus purpose) whilst maintaining the simplicity of the concept. Unfortunately, generators then become a pain :( We would need additional syntax to declare that a generator was a block generator. OTOH, this may not be such a problem. 
Any generator that contains a finally: around a yield automatically gets an __exit__, and any that doesn't, doesn't. Although that feels *way* too magical to me (esp. in light of my example below, which *doesn't* use finally). I'd prefer a separate keyword for block generators. In that case, having finally: around a yield would be a syntax error in a "normal" generator. :: resource locking(lock): lock.acquire() try: yield finally: lock.release() block locking(myLock): # Code here executes with myLock held. The lock is # guaranteed to be released when the block is left (even # if via return or by an uncaught exception). To use a (modified) example from another email:: class TestCase: resource assertRaises (self, excClass): try: yield except excClass: return else: if hasattr(excClass, '__name__'): excName = excClass.__name__ else: excName = str(excClass) raise self.failureException, "%s is not raised" % excName block self.assertRaises(TypeError): raise TypeError Note that this *does* require cleanup, but without using a finally: clause - the except: and else: are the cleanup code. Tim Delaney From python at rcn.com Wed May 4 02:37:38 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 20:37:38 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: Message-ID: <000901c55041$7a06f5e0$41b6958d@oemcomputer> > > I think that realization is important. It would be great to have a > > section of the PEP that focuses on separability and matching features to > > benefits. Start with above observation that the proposed examples can > > be achieved with generators driving the block statement. > > Good idea. I'm kind of stuck for time (have used up most of my Python > time for the next few weeks) -- if you or someone else could volunteer > some text I'd appreciate it. I'll take a crack at it in the morning (we all seem to be on borrowed time this week). > > When the discussion hits comp.lang.python, a separability section will > > help focus the conversation (there's a flaw/issue/dislike about feature > > x; however, features y/z and related benefits do not depend on x). > > Right. The PEP started with me not worrying too much about motivation > or use cases but instead focusing on precise specification of the > mechanisms, since there was a lot of confusion over that. Now that's > out of the way, motivation (you might call it "spin" :-) becomes more > important. Perhaps the cover announcement should impart the initial spin as a request for the community to create, explore, and learn from use cases. That will help make the discussion more constructive, less abstract, and more grounded in reality (wishful thinking). That probably beats, "Here's 3500 words of proposal; do you like it?". Raymond From pje at telecommunity.com Wed May 4 02:47:41 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 20:47:41 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: References: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> Message-ID: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com> At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote: >(So do you want this feature now or not? Earlier you said it was no big deal.) 
It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer to do the actual implementation of the 'close()' method myself, because it's about the same amount of work as updating PEP 333 and sorting out any political issues that might arise therefrom. :) >But I kind of doubt that it's an issue; you'd have to have a >try/except catching StopIteration around a yield statement that >resumes the generator before this becomes an issue, and that sounds >extremely improbable. But it does exist, alas; see the 'itergroup()' and 'xmap()' functions of this cookbook recipe: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66448/ Or more pointedly, the 'roundrobin()' example in the Python 2.4 documentation: http://www.python.org/doc/lib/deque-recipes.html And there are other examples as well: http://www.faqts.com/knowledge_base/view.phtml/aid/13516 http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934 From pje at telecommunity.com Wed May 4 02:55:32 2005 From: pje at telecommunity.com (Phillip J. Eby) Date: Tue, 03 May 2005 20:55:32 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com> References: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> Message-ID: <5.1.1.6.0.20050503205342.03cc4250@mail.telecommunity.com> At 08:47 PM 5/3/05 -0400, Phillip J. Eby wrote: > http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934 Oops; that one's not really a valid example; the except StopIteration just has a harmless "pass", and it's not in a loop. From python at rcn.com Wed May 4 03:00:49 2005 From: python at rcn.com (Raymond Hettinger) Date: Tue, 3 May 2005 21:00:49 -0400 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com> Message-ID: <001901c55044$b6cb4460$41b6958d@oemcomputer> > >(So do you want this feature now or not? Earlier you said it was no big > deal.) > > It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer > to > do the actual implementation of the 'close()' method myself, because it's > about the same amount of work as updating PEP 333 and sorting out any > political issues that might arise therefrom. :) Can I recommend tabling this one for the time being. My sense is that it can be accepted independently of PEP 340 but that it should wait until afterwards because the obvious right-thing-to-do will be influenced by what happens with 340. Everyone's bandwidth is being maxed-out at this stage. So it is somewhat helpful to keep focused on the core proposal of generator driven block/suite thingies. Raymond From python at jwp.name Wed May 4 03:54:46 2005 From: python at jwp.name (James William Pye) Date: Tue, 03 May 2005 18:54:46 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: <20050503132639.6492.JCARLSON@uci.edu> References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> Message-ID: <1115171686.62180.48.camel@localhost> On Tue, 2005-05-03 at 13:39 -0700, Josiah Carlson wrote: > If I'm wrong, I'd like to hear it, but I'm still waiting for your patch > on sourceforge. Well, if he lost/loses interest for whatever reason, I'd be willing to provide. Although, if m.u.k. 
is going to write it, please be sure to include a CPP macro/define, so that embedders could recognize the feature without having to run explicit checks or do version based fingerprinting. (I'd be interested to follow the patch if you(muk) put it up!) Hrm, although, I don't think it would be wise to allow extension modules to set this. IMO, there should be some attempt to protect it; ie, once it's initialized, don't allow reinitialization, as if the embedder is handling it, it should be handled through the duration of the process. So, a static function pointer in pythonrun.c initialized to NULL, a protective setter that will only allow setting if the pointer is NULL, and Py_FatalError calling the pointer if pointer != Py_FatalError. Should [Py_FatalError] fall through if the hook didn't terminate the process to provide some level of warranty that the process will indeed die? Sound good? -- Regards, James William Pye From tdelaney at avaya.com Wed May 4 04:30:55 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Wed, 4 May 2005 12:30:55 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204DC@au3010avexu1.global.avaya.com> Delaney, Timothy C (Timothy) wrote: > Guido van Rossum wrote: > >> I'd like the block statement to be defined exclusively in terms of >> __exit__() though. > > 1. If an iterator declares __exit__, it cannot be used in a for-loop. > For-loops do not guarantee resource cleanup. > > 2. If an iterator does not declare __exit__, it cannot be used in a > block-statement. > Block-statements guarantee resource cleanup. Now some thoughts have solidified in my mind ... I'd like to define some terminology that may be useful. resource protocol: __next__ __exit__ Note: __iter__ is explicitly *not* required. resource: An object that conforms to the resource protocol. resource generator: A generator function that produces a resource. resource usage statement/suite: A suite that uses a resource. With this conceptual framework, I think the following makes sense: - Keyword 'resource' for defining a resource generator. - Keyword 'use' for using a resource. e.g. :: resource locker (lock): lock.acquire() try: yield finally: lock.release() use locker(lock): # do stuff Tim Delaney From nbastin at opnet.com Wed May 4 05:15:38 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Tue, 3 May 2005 23:15:38 -0400 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: References: Message-ID: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com> On May 3, 2005, at 6:44 PM, Guido van Rossum wrote: > I think that documentation is wrong; AFAIK Py_UNICODE has always been > allowed to be either 16 or 32 bits, and the source code goes through > great lengths to make sure that you get a link error if you try to > combine extensions built with different assumptions about its size. That makes PyUnicode_FromUnicode() a lot less useful. Well, really, not useful at all. You might suggest that PyUnicode_FromWideChar is more useful, but that's only true on platforms that support wchar_t. Is there no universally supported way of moving buffers of unicode data (as common data types, like unsigned short, etc.) into Python from C? 
-- Nick From gvanrossum at gmail.com Wed May 4 05:42:11 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Tue, 3 May 2005 20:42:11 -0700 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com> References: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com> Message-ID: I really don't know. Effbot, MvL and/or MAL should know. On 5/3/05, Nicholas Bastin wrote: > > On May 3, 2005, at 6:44 PM, Guido van Rossum wrote: > > > I think that documentation is wrong; AFAIK Py_UNICODE has always been > > allowed to be either 16 or 32 bits, and the source code goes through > > great lengths to make sure that you get a link error if you try to > > combine extensions built with different assumptions about its size. > > That makes PyUnicode_FromUnicode() a lot less useful. Well, really, > not useful at all. > > You might suggest that PyUnicode_FromWideChar is more useful, but > that's only true on platforms that support wchar_t. > > Is there no universally supported way of moving buffers of unicode data > (as common data types, like unsigned short, etc.) into Python from C? > > -- > Nick > > -- --Guido van Rossum (home page: http://www.python.org/~guido/) From greg.ewing at canterbury.ac.nz Wed May 4 05:50:19 2005 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Wed, 04 May 2005 15:50:19 +1200 Subject: [Python-Dev] 2 words keyword for block In-Reply-To: References: Message-ID: <4278467B.1090404@canterbury.ac.nz> Gheorghe Milas wrote: > in template thread_safe(lock): > in template redirected_stdout(stream): > in template use_and_close_file(path) as file: > in template as_transaction(): > in template auto_retry(times=3, failas=IOError): -1. This is unpythonically verbose. If I wanted to get lots of finger exercise typing redundant keywords, I'd program in COBOL. :-) -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing at canterbury.ac.nz +--------------------------------------+ From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 10:09:11 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 10:09:11 +0200 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally Message-ID: Hello, after proposing this here (albeit deep in the PEP 340 thread) and getting a somewhat affirmatory response from Guido, I have written something that could become a PEP if sufficiently hatched... --------------------------- PEP: XXX Title: Unifying try-except and try-finally Version: $Revision: $ Last-Modified: $Date: $ Author: Reinhold Birkenfeld Status: Draft Type: Standards Track Content-Type: text/plain Created: 04-May-2005 Post-History: Abstract This PEP proposes a change in the syntax and semantics of try statements to allow combined try-except-finally blocks. This means in short that it would be valid to write try: except Exception: finally: Rationale/Proposal There are many use cases for the try-except statement and for the try-finally statement per se; however, often one needs to catch exceptions and execute some cleanup code afterwards. 
It is slightly annoying and not very intelligible that one has to write f = None try: try: f = open(filename) text = f.read() except IOError: print 'An error occured' finally: if f: f.close() So it is proposed that a construction like this try: except Ex1: else: finally: be exactly the same as the legacy try: try: except Ex1: else: finally: This is backwards compatible, and every try statement that is legal today would continue to work. Changes to the grammar The grammar for the try statement, which is currently try_stmt: ('try' ':' suite (except_clause ':' suite)+ ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite) would have to become try_stmt: ('try' ':' suite (except_clause ':' suite)+ ['else' ':' suite] ['finally' ':' suite] | 'try' ':' suite (except_clause ':' suite)* ['else' ':' suite] 'finally' ':' suite) Implementation As the PEP author currently does not have sufficient knowledge of the CPython implementation, he is unfortunately not able to deliver one. References None yet. Copyright This document has been placed in the public domain. ----------------------- Reinhold -- Mail address is perfectly valid! From mal at egenix.com Wed May 4 10:39:16 2005 From: mal at egenix.com (M.-A. Lemburg) Date: Wed, 04 May 2005 10:39:16 +0200 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: References: Message-ID: <42788A34.30301@egenix.com> Nicholas Bastin wrote: > The documentation for Py_UNICODE states the following: > > "This type represents a 16-bit unsigned storage type which is used by > Python internally as basis for holding Unicode ordinals. On platforms > where wchar_t is available and also has 16-bits, Py_UNICODE is a > typedef alias for wchar_t to enhance native platform compatibility. On > all other platforms, Py_UNICODE is a typedef alias for unsigned > short." > > However, we have found this not to be true on at least certain RedHat > versions (maybe all, but I'm not willing to say that at this point). > pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, > and PY_UNICODE_SIZE is 4. Needless to say, this isn't consistent with > the docs. It also creates quite a few problems when attempting to > interface Python with other libraries which produce unicode data. > > Is this a bug, or is this behaviour intended? It's a documentation bug. The above was true in Python 2.0 and still is for standard Python builds. The optional 32-bit support was added later on (in Python 2.1 IIRC) and is only used if Python is compiled with --enable-unicode=ucs4. Unfortunately, RedHat and others have made the UCS4 build their default which caused and is still causing lots of problems with Python extensions shipped as binaries, e.g. RPMs or other packages. > It turns out that at some point in the past, this created problems for > tkinter as well, so someone just changed the internal unicode > representation in tkinter to be 4 bytes as well, rather than tracking > down the real source of the problem. > > Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is > it dependent on your platform? (in which case we can give up now on > Python unicode compatibility with any other libraries). Depends on the way Python was compiled. > At the very > least, if we can't guarantee the internal representation, then the > PyUnicode_FromUnicode API needs to go away, and be replaced with > something capable of transcoding various unicode inputs into the > internal python representation. We have PyUnicode_Decode() for that. 
PyUnicode_FromUnicode is useful and meant for working directly on Py_UNICODE buffers. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, May 04 2005) >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! :::: From p.f.moore at gmail.com Wed May 4 10:57:45 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 4 May 2005 09:57:45 +0100 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> Message-ID: <79990c6b050504015762d004ac@mail.gmail.com> On 5/3/05, Nicolas Fleury wrote: > We could avoid explaining to a newbie why the following code doesn't > work if "opening" could be implemented in way that it works. > > for filename in filenames: > block opening(filename) as file: > if someReason: break My initial feeling was that this is a fairly major issue, not just from an education POV, but also because it would be easy to imagine someone converting f = open(name) ... close(f) into opening(name) as f: ... as part of a maintenance fix (an exception fails to close the file) or tidy-up (upgrading code to conform to new best practices). But when I tried to construct a plausible example, I couldn't find a case which made real-life sense. For example, with Nicolas' original example: for name in filenames: opening(name) as f: if condition: break I can't think of a reasonable condition which wouldn't involve reading the file - which either involves an inner loop (and we already can't break out of two loops, so the third one implied by the opening block makes things no worse), or needs the whole file reading (which can be done via f = open(); data = f.read(); f.close() and the opening block doesn't actually help...) So I believe the issue is less serious than I supposed at first - it's certainly a teaching issue, but might not come up often enough in real life to matter. Oh, and by the way - I prefer the keywordless form of the block statement (as used in my examples above). But it may exacerbate the issue with break unless we have a really strong name for these constructs ("break exits the innermost enclosing for, while, or um, one of those things which nearly used the block keyword...") Actually, maybe referring to them as "block statements", but using no keyword, is perfectly acceptable. As I write, I'm finding it more and more natural. Paul. From m.u.k.2 at gawab.com Wed May 4 11:35:42 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 09:35:42 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> Message-ID: Hi, Josiah Carlson wrote in news:20050503132639.6492.JCARLSON at uci.edu: >>> strip.... >> IMHO this should be left to hooker(apparerently not right word, but you >> get the point :) ). If he allocates more mem. or does heavy stuff, that >> will just fail. Anyway abort() is a failure too. Either abort() will >> end the process or OS will on such a critical error. > > I'm not talking about doing memory-intensive callbacks, I'm talking > about the function call itself. 
> >>From what I understand, any function call in Python requires a memory > allocation. This is trivially true in the case of rentrant Python calls; > which requires the allocation of a frame object from heap memory, and in > the case of all calls, from C stack memory. If you cannot allocate a > frame for __del__ method calling (one of the error conditions), you > certainly aren't going to be able to call a Python callback (no heap > memory), and may not have enough stack memory required by your logging > function; even if it is written in C (especially if you construct a > nontrivial portion of the message in memory before it is printed). > > If I'm wrong, I'd like to hear it, but I'm still waiting for your patch > on sourceforge. > - Josiah Wait a minute I guess I wasn't clear on that: The callback will be only in C level smtg like "PySetFatalError_CallBack" , there will be no way to hook it from Python because as you said Python may have crashed hard like "Can't initialize type". Best regards. From m.u.k.2 at gawab.com Wed May 4 11:46:14 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 09:46:14 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> Message-ID: Hi, James William Pye wrote in news:1115171686.62180.48.camel at localhost: > On Tue, 2005-05-03 at 13:39 -0700, Josiah Carlson wrote: > >> If I'm wrong, I'd like to hear it, but I'm still waiting for your patch >> on sourceforge. > > Well, if he lost/loses interest for whatever reason, I'd be willing to > provide. > > Although, if m.u.k. is going to write it, please be sure to include a > CPP macro/define, so that embedders could recognize the feature without > having to run explicit checks or do version based fingerprinting. (I'd > be interested to follow the patch if you(muk) put it up!) > > Hrm, although, I don't think it would be wise to allow extension modules > to set this. IMO, there should be some attempt to protect it; ie, once > it's initialized, don't allow reinitialization, as if the embedder is > handling it, it should be handled through the duration of the process. > So, a static function pointer in pythonrun.c initialized to NULL, a > protective setter that will only allow setting if the pointer is NULL, > and Py_FatalError calling the pointer if pointer != Py_FatalError. > > Should [Py_FatalError] fall through if the hook didn't terminate the > process to provide some level of warranty that the process will indeed > die? > > Sound good? I haven't lost interest, I expect to publish at most in a couple of days at SourceForge. The reinit. issue: The old way of returning old callback when a new callback is set sounds OK. Or better way: there may be an array to hold all the callbacks, Py_FatalError iterates and call each. M. Utku Karatas Best regards. 
From ncoghlan at gmail.com Wed May 4 12:25:32 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 04 May 2005 20:25:32 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <8dffe7626eaf9a89812c2828bcf96efe@fuhm.net> References: <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <8dffe7626eaf9a89812c2828bcf96efe@fuhm.net> Message-ID: <4278A31C.5000001@gmail.com> James Y Knight wrote: > On May 3, 2005, at 12:53 PM, Guido van Rossum wrote: > >>def saving_stdout(f): >> save_stdout = sys.stdout >> try: >> sys.stdout = f >> yield >> finally: >> sys.stdout = save_stdout > > > I hope you aren't going to be using that in any threaded program. sys.stdout is a global - threading issues are inherent in monkeying with it. At least this approach allows all code that redirects stdout to be easily serialised: def redirect_stdout(f, the_lock=Lock()): locking(the_lock): save_stdout = sys.stdout try: sys.stdout = f yield finally: sys.stdout = save_stdout Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Wed May 4 12:42:19 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 04 May 2005 20:42:19 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <1f7befae05050313212db5d4df@mail.gmail.com> References: <001101c5500d$a7be7140$c704a044@oemcomputer> <1f7befae050503121346833d97@mail.gmail.com> <1f7befae05050313212db5d4df@mail.gmail.com> Message-ID: <4278A70B.8040804@gmail.com> Tim Peters wrote: > I don't think anyone has mentioned this yet, so I will: library > writers using Decimal (or more generally HW 754 gimmicks) have a need > to fiddle lots of thread-local state ("numeric context"), and must > restore it no matter how the routine exits. Like "boost precision to > twice the user's value over the next 12 computations, then restore", > and "no matter what happens here, restore the incoming value of the > overflow-happened flag". It's just another instance of temporarily > taking over a shared resource, but I think it's worth mentioning that > there are a lot of things "like that" in the world, and to which > decorators don't really sanely apply. To turn this example into PEP 340 based code: # A template to be provided by the decimal module # Context is thread-local, so there is no threading problem def in_context(context): old_context = getcontext() try: setcontext(context) yield finally: setcontext(old_context) Used as follows: block decimal.in_context(Context(prec=12)): # Perform higher precision operations here Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From fredrik at pythonware.com Wed May 4 13:08:56 2005 From: fredrik at pythonware.com (Fredrik Lundh) Date: Wed, 4 May 2005 13:08:56 +0200 Subject: [Python-Dev] anonymous blocks References: <426DB7C8.5020708@canterbury.ac.nz><426E3B01.1010007@canterbury.ac.nz><5.1.1.6.0.20050427105524.02479e70@mail.telecommunity.com><5.1.1.6.0.20050427164323.0332c2b0@mail.telecommunity.com> Message-ID: Guido van Rossum wrote: > Fredrik, what does your intuition tell you? 
having been busy with other stuff for nearly a week, and seeing that the PEP is now at version 1.22, my intuition tells me that it's time to read the PEP again before I have any opinion on anything ;-) From ncoghlan at gmail.com Wed May 4 13:33:25 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 04 May 2005 21:33:25 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com> References: <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <2mhdhkz8aw.fsf@starship.python.net> <000b01c54ff6$a0fb4ca0$c704a044@oemcomputer> <5.1.1.6.0.20050503130527.02471ad0@mail.telecommunity.com> Message-ID: <4278B305.6040704@gmail.com> Phillip J. Eby wrote: > This and other examples from the PEP still have a certain awkwardness of > phrasing in their names. A lot of them seem to cry out for a "with" > prefix, although maybe that's part of the heritage of PEP 310. But Lisp > has functions like 'with-open-file', so I don't think that it's *all* a PEP > 310 influence on the examples. I've written up a few examples in the course of the discussion, and the more of them I have written, the more the keywordless syntax has grown on me. No meaningful name like 'with' or 'in' is appropriate for all possible block iterators, which leaves only keyword-for-the-sake-of-a-keyword options like 'block' or 'suite'. With block statements viewed as user-defined blocks, leaving the keyword out lets the block iterator be named whatever is appropriate to making the block statement read well. If a leading 'with' is needed, just include it in the name. That is, instead of a 'block statement with the locking block iterator', you write a 'locking statement'. Instead of a block statement with the opening block iterator', you write an 'opening statement'. The benefit didn't stand out for me until writing examples with real code around the start of the block statement. Unlike existing statements, the keyword is essentially irrelevant in understanding the implications of the statement - the important thing is the block iterator being used. That is hard to see when the keyword is the only thing dedented from the contained suite. Consider some of the use cases from the PEP, but put inside function definitions to make it harder to pick out the name of the block iterator: def my_func(): block locking(the_lock): do_some_operation_while_holding_the_lock() Versus: def my_func(): locking(the_lock): do_some_operation_while_holding_the_lock() And: def my_func(filename): block opening(filename) as f: for line in f: print f Versus: def my_func(filename): opening(filename) as f: for line in f: print f And a few more without the contrast: def my_func(): do_transaction(): db.delete_everything() def my_func(): auto_retry(3, IOError): f = urllib.urlopen("http://python.org/peps/pep-0340.html") print f.read() def my_func(): opening(filename, "w") as f: with_stdout(f): print "Hello world" When Guido last suggested this, the main concern seemed to be that the documentation for every block iterator would need to explain the semantics of block statements, since the block iterator name is the only name to be looked up in the documentation. But they don't need to explain the general semantics, they only need to explain _their_ semantics, and possibly provide a pointer to the general block statement documentation. That is, explain _what_ the construct does (which is generally straightforward), not _how_ it does it (which is potentially confusing). E.g. 
def locking(the_lock): """Executes the following nested block while holding the supplied lock Ensures the lock is acquired before entering the block and released when the block is exited (including via exceptions or return statements). If None is supplied as the argument, no locking occurs. """ if the_lock is None: yield else: the_lock.acquire() try: yield finally: the_lock.release() Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Wed May 4 13:58:29 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 04 May 2005 21:58:29 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com> References: <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> <001901c55027$50c101e0$c704a044@oemcomputer> <5.1.1.6.0.20050503175901.0212aa00@mail.telecommunity.com> <5.1.1.6.0.20050503191911.03206050@mail.telecommunity.com> <5.1.1.6.0.20050503200040.03171960@mail.telecommunity.com> <5.1.1.6.0.20050503202653.02fa28f0@mail.telecommunity.com> Message-ID: <4278B8E5.7060303@gmail.com> > At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote: >>But I kind of doubt that it's an issue; you'd have to have a >>try/except catching StopIteration around a yield statement that >>resumes the generator before this becomes an issue, and that sounds >>extremely improbable. The specific offending construct is: yield itr.next() Wrapping that in StopIteration can be quite convenient, and is probably too common to ignore - Phillip's examples reminded me that some my _own_ code uses this trick. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Wed May 4 14:32:01 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 04 May 2005 22:32:01 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com> References: <338366A6D2E2CA4C9DAEAE652E12A1DE721278@au3010avexu1.global.avaya.com> Message-ID: <4278C0C1.4040201@gmail.com> Delaney, Timothy C (Timothy) wrote: > Guido van Rossum wrote: > > >>I'd like the block statement to be defined exclusively in terms of >>__exit__() though. > > > This does actually suggest something to me (note - just a thought - no > real idea if it's got any merit). > > Are there any use cases proposed for the block-statement (excluding the > for-loop) that do *not* involve resource cleanup (i.e. need an > __exit__)? > > This could be the distinguishing feature between for-loops and > block-statements: > > 1. If an iterator declares __exit__, it cannot be used in a for-loop. > For-loops do not guarantee resource cleanup. > > 2. If an iterator does not declare __exit__, it cannot be used in a > block-statement. > Block-statements guarantee resource cleanup. > > This gives separation of API (and thus purpose) whilst maintaining the > simplicity of the concept. Unfortunately, generators then become a pain > :( We would need additional syntax to declare that a generator was a > block generator. Ah, someone else did post this idea first :) To deal with the generator issue, one option would be to follow up on Phillip's idea of a decorator to convert a generator (or perhaps any standard iterator) into a block iterator. 
I think this would also do wonders for emphasising the difference between for loops and block statements. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Wed May 4 16:10:31 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 00:10:31 +1000 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <79990c6b050504015762d004ac@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> Message-ID: <4278D7D7.2040805@gmail.com> Paul Moore wrote: > Oh, and by the way - I prefer the keywordless form of the block > statement (as used in my examples above). But it may exacerbate the > issue with break unless we have a really strong name for these > constructs "may" exacerbate? Something of an understatement, unfortunately. 'break' and 'continue' are going to be useless in most block statements, and in most of the cases where that is true, one would expect them to break out of a surrounding loop. Reconsidering the non-looping semantics I discussed way back at the start of this exercise, this is what using auto_retry would look like with a single-pass block statement: for attempt in auto_retry(3, IOError): attempt: # Do something! # Including break and continue to get on with the next attempt! (Now that's what I call a user defined statement - we're iterating over a list of them!) Anyway, auto_retry and the block statements it returns could be implemented as: def suppressing(exc=Exception, on_success=None): """Suppresses the specified exception in the following block""" try: yield except exc: pass else: if on_success is not None: on_success() def just_do_it(): """Simply executes the following block""" yield def auto_retry(times, exc=Exception): """Generates the specified number of attempts""" class cb(): succeeded = False def __init__(self): cb.succeeded = True for i in xrange(times-1): yield suppressing(exc, cb) if cb.succeeded: break else: yield just_do_it() (Prettier than my last attempt at writing this, but still not very pretty. However, I'm willing to trade the need for that callback in the implementation of auto_retry to get non-surprising behaviour from break and continue, as the latter is visible to the majority of users, but the former is not) Note that the code above works, even *if* the block statement is a looping construct, making a mess out of TOOWTDI. Making it single pass also simplifies the semantics of the block statement (using VAR1 and EXPR1 from PEP 340): finalised = False block_itr = EXPR1 try: try: VAR1 = block_itr.next() except StopIteration: # Can still choose not to run the block at all finalised = True except: # There was an exception. Handle it or just reraise it. finalised = True exc = sys.exc_info() ext = getattr(block_itr, "__exit__", None) if ext is not None: ext(*exc) # May re-raise *exc else: raise *exc # Well, the moral equivalent :-) finally: if not finalised: # The block finished cleanly, or exited via # break, return or continue. Clean up the iterator. 
ext = getattr(block_itr, "__exit__", None) if ext is not None: try: ext(StopIteration) except StopIteration: pass With single-pass semantics, an iterator used in a block statement would have it's .next() method called exactly once, and it's __exit__ method called exactly once if the call to .next() does not raise StopIteration. And there's no need to mess with the meaning of break, return or continue - they behave as usual, affecting the surrounding scope rather than the block statement. The only new thing needed is an __exit__ method on generators (and the block syntax itself, of course). Looks like I've come full circle, and am back to arguing for semantics closer to those in PEP 310. But I have a better reason now :) > Actually, > maybe referring to them as "block statements", but using no keyword, > is perfectly acceptable. As I write, I'm finding it more and more > natural. Same here. Especially if the semantics are tweaked so that it *is* a straightforward statement instead of a loop. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From nbastin at opnet.com Wed May 4 16:14:31 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Wed, 4 May 2005 10:14:31 -0400 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: <42788A34.30301@egenix.com> References: <42788A34.30301@egenix.com> Message-ID: On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote: >> At the very least, if we can't guarantee the internal representation, >> then the PyUnicode_FromUnicode API needs to go away, and be replaced >> with something capable of transcoding various unicode inputs into the >> internal python representation. > > We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is > useful and meant for working directly on Py_UNICODE buffers. Is this API documented anywhere? (It's not in the Unicode Object section of the API doc). Also, this is quite inefficient if the source data is in UTF-16, because it appears that I'll have to transcode my data to utf-8 before I can pass it to this function, but I guess I'll have to live with that. -- Nick From pedronis at strakt.com Wed May 4 16:20:04 2005 From: pedronis at strakt.com (Samuele Pedroni) Date: Wed, 04 May 2005 16:20:04 +0200 Subject: [Python-Dev] Python Language track at Europython, still possibilities to submit talks Message-ID: <4278DA14.8030301@strakt.com> I'm the track chair of the Python Language track at Europython (27-29 June, G?teborg, Sweden) . The general deadlline for talk submission has been extended until the 7th of May. There are still open slots for the language track. So if someone with (core) language interests is or may be interested in partecipating, there's still the possibility to submit talks about idioms, patterns, recent new additions to language (for example new 2.4 features), or other language related topics. http://www.europython.org/sections/tracks_and_talks/propose_a_talk/#language http://www.europython.org/sections/tracks_and_talks/propose_a_talk/ http://www.europython.org Regards, Samuele Pedroni, Python Language Track chair. 
From python at jwp.name Wed May 4 16:58:54 2005 From: python at jwp.name (James William Pye) Date: Wed, 04 May 2005 07:58:54 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> Message-ID: <1115218734.62180.119.camel@localhost> On Wed, 2005-05-04 at 09:46 +0000, M.Utku K. wrote: > The reinit. issue: The old way of returning old callback when a new > callback is set sounds OK. Or better way: there may be an array to hold all > the callbacks, Py_FatalError iterates and call each. Why should reinitialization be allowed at all? Seems to me that this feature should be exclusively reserved for an embedding application to handle the fatal in an application specific way; ie ereport(FATAL,()) in PostgreSQL, which quickly exits after some cleanup. Why should an extension module be allowed to set this, or reset it? -- Regards, James William Pye From aleaxit at yahoo.com Wed May 4 17:14:54 2005 From: aleaxit at yahoo.com (Alex Martelli) Date: Wed, 4 May 2005 08:14:54 -0700 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <79990c6b050504015762d004ac@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> Message-ID: On May 4, 2005, at 01:57, Paul Moore wrote: > tried to construct a plausible example, I couldn't find a case which > made real-life sense. For example, with Nicolas' original example: > > for name in filenames: > opening(name) as f: > if condition: break > > I can't think of a reasonable condition which wouldn't involve reading > the file - which either involves an inner loop (and we already can't > break out of two loops, so the third one implied by the opening block > makes things no worse), or needs the whole file reading (which can be Looking for a file with a certain magicnumber in its 1st two bytes...? for name in filenames: opening(name) as f: if f.read(2) == 0xFEB0: break This does seem to make real-life sense to me... Alex From p.f.moore at gmail.com Wed May 4 17:27:48 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 4 May 2005 16:27:48 +0100 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> Message-ID: <79990c6b0505040827941ff0@mail.gmail.com> On 5/4/05, Alex Martelli wrote: > > On May 4, 2005, at 01:57, Paul Moore wrote: > > > > I can't think of a reasonable condition which wouldn't involve reading > > the file - which either involves an inner loop (and we already can't > > break out of two loops, so the third one implied by the opening block > > makes things no worse), or needs the whole file reading (which can be > > Looking for a file with a certain magicnumber in its 1st two bytes...? > > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: break > > This does seem to make real-life sense to me... Yes, that'd do. I can't say I think it would be common, but it's a valid case. And the workaround is the usual messy flag variable: for name in filenames: found = False opening(name) as f: if f.read(2) == 0xFEB0: found = True if found: break Yuk. Paul. 
From steven.bethard at gmail.com Wed May 4 17:35:18 2005 From: steven.bethard at gmail.com (Steven Bethard) Date: Wed, 4 May 2005 09:35:18 -0600 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <4278D7D7.2040805@gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4278D7D7.2040805@gmail.com> Message-ID: On 5/4/05, Nick Coghlan wrote: > With single-pass semantics, an iterator used in a block statement would have > it's .next() method called exactly once, and it's __exit__ method called exactly > once if the call to .next() does not raise StopIteration. And there's no need to > mess with the meaning of break, return or continue - they behave as usual, > affecting the surrounding scope rather than the block statement. > > The only new thing needed is an __exit__ method on generators (and the block > syntax itself, of course). Makes me wonder if we shouldn't just return to the __enter__() and __exit__() names of PEP 310[1] where for a generator __enter__() is just an alias for next(). We could even require Phillip J. Eby's "blockgenerator" decorator to rename next() to __enter__(), and add the appropriate __exit__() method. Something like: class blockgen(object): def __init__(self, gen): self.gen = gen def __enter__(self): self.gen.next() def __exit__(self): # cause finally blocks to be executed def blockgenerator(genfunc): def getblockgen(*args, **kwargs): return blockgen(genfunc(*args, **kwargs)) return getblockgen to be used like: @blockgenerator def locking(lock): lock.acquire() try: yield finally: lock.release() 'Course, it might be even nicer if try/finally around a yield could only be used with block generators... To get a syntax error, we'd have to replace the decorator with a new syntax, e.g. Tim Delaney's "resource" instead of "def" syntax or maybe using something like "blockyield" or "resourceyield" instead of "yield" (though these are probably too long)... Steve [1]http://www.python.org/peps/pep-0310.html -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy From m.u.k.2 at gawab.com Wed May 4 17:29:33 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 15:29:33 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> <1115218734.62180.119.camel@localhost> Message-ID: James William Pye wrote in news:1115218734.62180.119.camel at localhost: > On Wed, 2005-05-04 at 09:46 +0000, M.Utku K. wrote: >> The reinit. issue: The old way of returning old callback when a new >> callback is set sounds OK. Or better way: there may be an array to hold >> all the callbacks, Py_FatalError iterates and call each. > > Why should reinitialization be allowed at all? Seems to me that this > feature should be exclusively reserved for an embedding application to > handle the fatal in an application specific way; ie ereport(FATAL,()) in > PostgreSQL, which quickly exits after some cleanup. Why should an > extension module be allowed to set this, or reset it? What if more than one extension needs it ? Curently Im doing callback_type SetCallBack(callback_type newfunc) This will set the callback to newfunc and return the old one. Extension developer may discard or call them at his own will. What do you think? 
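The save-and-chain arrangement being discussed for the fatal-error callback is the same pattern Python code already uses with sys.excepthook; a minimal Python-level sketch of the idea, offered purely as an analogy since the C-level hook itself is still being debated (the function names here are invented):

    import sys

    _previous_hook = None

    def fatal_hook(exc_type, exc_value, tb):
        # Application-specific handling first ...
        sys.stderr.write("fatal: %s\n" % (exc_value,))
        # ... then delegate to whatever hook was installed before us,
        # which is what returning the old callback makes possible.
        if _previous_hook is not None:
            _previous_hook(exc_type, exc_value, tb)

    def install_fatal_hook():
        global _previous_hook
        _previous_hook = sys.excepthook   # remember the old callback
        sys.excepthook = fatal_hook       # install the new one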
From mal at egenix.com Wed May 4 19:39:55 2005 From: mal at egenix.com (M.-A. Lemburg) Date: Wed, 04 May 2005 19:39:55 +0200 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: References: <42788A34.30301@egenix.com> Message-ID: <427908EB.2050904@egenix.com> Nicholas Bastin wrote: > > On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote: > >>> At the very least, if we can't guarantee the internal representation, >>> then the PyUnicode_FromUnicode API needs to go away, and be replaced >>> with something capable of transcoding various unicode inputs into the >>> internal python representation. >> >> >> We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is >> useful and meant for working directly on Py_UNICODE buffers. > > > Is this API documented anywhere? (It's not in the Unicode Object > section of the API doc). Also, this is quite inefficient if the source > data is in UTF-16, because it appears that I'll have to transcode my > data to utf-8 before I can pass it to this function, but I guess I'll > have to live with that. Not at all. You pass in the pointer, the function does the rest: http://docs.python.org/api/builtinCodecs.html -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, May 04 2005) >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try mxODBC.Zope.DA for Windows,Linux,Solaris,FreeBSD for free ! :::: From nbastin at opnet.com Wed May 4 17:54:32 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Wed, 4 May 2005 11:54:32 -0400 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: <427908EB.2050904@egenix.com> References: <42788A34.30301@egenix.com> <427908EB.2050904@egenix.com> Message-ID: <27a821b5583891d3ea4dfa62f0eafeb8@opnet.com> On May 4, 2005, at 1:39 PM, M.-A. Lemburg wrote: > Nicholas Bastin wrote: >> On May 4, 2005, at 4:39 AM, M.-A. Lemburg wrote: >>>> At the very least, if we can't guarantee the internal >>>> representation, then the PyUnicode_FromUnicode API needs to go >>>> away, and be replaced with something capable of transcoding various >>>> unicode inputs into the internal python representation. >>> >>> >>> We have PyUnicode_Decode() for that. PyUnicode_FromUnicode is >>> useful and meant for working directly on Py_UNICODE buffers. >> Is this API documented anywhere? (It's not in the Unicode Object >> section of the API doc). Also, this is quite inefficient if the >> source data is in UTF-16, because it appears that I'll have to >> transcode my data to utf-8 before I can pass it to this function, but >> I guess I'll have to live with that. > > Not at all. You pass in the pointer, the function does the rest: Ah, I missed the codec registry lookup. Thanks. I'll change the Py_UNICODE doc, if anyone has a suggestion as to what to change it *to*... -- Nick From nbastin at opnet.com Wed May 4 17:59:40 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Wed, 4 May 2005 11:59:40 -0400 Subject: [Python-Dev] New Py_UNICODE doc Message-ID: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> The current documentation for Py_UNICODE states: "This type represents a 16-bit unsigned storage type which is used by Python internally as basis for holding Unicode ordinals. On platforms where wchar_t is available and also has 16-bits, Py_UNICODE is a typedef alias for wchar_t to enhance native platform compatibility. 
On all other platforms, Py_UNICODE is a typedef alias for unsigned short." I propose changing this to: "This type represents the storage type which is used by Python internally as the basis for holding Unicode ordinals. On platforms where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t to enhance native platform compatibility. On all other platforms, Py_UNICODE is a typedef alias for unsigned short. Extension module developers should make no assumptions about the size of this type on any given platform." If no one has a problem with that, I'll make the change in CVS. -- Nick From theller at python.net Wed May 4 18:08:54 2005 From: theller at python.net (Thomas Heller) Date: Wed, 04 May 2005 18:08:54 +0200 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> (Nicholas Bastin's message of "Wed, 4 May 2005 11:59:40 -0400") References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> Message-ID: <4qdjm0rt.fsf@python.net> Nicholas Bastin writes: > The current documentation for Py_UNICODE states: > > "This type represents a 16-bit unsigned storage type which is used by > Python internally as basis for holding Unicode ordinals. On platforms > where wchar_t is available and also has 16-bits, Py_UNICODE is a > typedef alias for wchar_t to enhance native platform compatibility. On > all other platforms, Py_UNICODE is a typedef alias for unsigned > short." > > I propose changing this to: > > "This type represents the storage type which is used by Python > internally as the basis for holding Unicode ordinals. On platforms > where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t > to enhance native platform compatibility. On all other platforms, > Py_UNICODE is a typedef alias for unsigned short. Extension module > developers should make no assumptions about the size of this type on > any given platform." > > If no one has a problem with that, I'll make the change in CVS. AFAIK, you can configure Python to use 16-bits or 32-bits Unicode chars, independend from the size of wchar_t. The HAVE_USABLE_WCHAR_T macro can be used by extension writers to determine if Py_UNICODE is the same as wchar_t. At least that's my understanding, so the above seems still wrong. And +1 for trying to clean up this confusion. Thomas From jepler at unpythonic.net Wed May 4 18:12:26 2005 From: jepler at unpythonic.net (Jeff Epler) Date: Wed, 4 May 2005 11:12:26 -0500 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> <1115218734.62180.119.camel@localhost> Message-ID: <20050504161226.GD14737@unpythonic.net> On Wed, May 04, 2005 at 03:29:33PM +0000, M.Utku K. wrote: > James William Pye wrote in > news:1115218734.62180.119.camel at localhost: > > Why should reinitialization be allowed at all? Seems to me that this > > feature should be exclusively reserved for an embedding application to > > handle the fatal in an application specific way; ie ereport(FATAL,()) in > > PostgreSQL, which quickly exits after some cleanup. Why should an > > extension module be allowed to set this, or reset it? > > What if more than one extension needs it ? I agree with James; As I imagine this feature, it is for programs that embed Python, not for extensions. Whether the hook would be written to prevent this from being done, or whether it would just be documented as "for embedders only", I don't care. 
In my own application, I didn't use a setter function, I just created a new global variable. This works fine for me. It doesn't prevent the (abusive, in my view) hooking of the error handler by any old extension, but since my application doesn't currently import shared modules it doesn't matter. --- /tmp/Python-2.3/Python/pythonrun.c 2003-07-15 20:54:38.000000000 -0500 +++ ./pythonrun.c 2005-04-11 13:32:39.000000000 -0500 @@ -1435,9 +1435,14 @@ /* Print fatal error message and abort */ +void (*Py_FatalErrorHandler)(const char *msg) = NULL; void Py_FatalError(const char *msg) { + if(Py_FatalErrorHandler != NULL) { + Py_FatalErrorHandler(msg); + fprintf(stderr, "PyFatalErrorHandler returned\n"); + } fprintf(stderr, "Fatal Python error: %s\n", msg); #ifdef MS_WINDOWS OutputDebugString("Fatal Python error: "); -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 189 bytes Desc: not available Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/3c3b1daf/attachment.pgp From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 18:23:03 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 18:23:03 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <79990c6b0505040827941ff0@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <79990c6b0505040827941ff0@mail.gmail.com> Message-ID: Paul Moore wrote: > On 5/4/05, Alex Martelli wrote: >> >> On May 4, 2005, at 01:57, Paul Moore wrote: >> > >> > I can't think of a reasonable condition which wouldn't involve reading >> > the file - which either involves an inner loop (and we already can't >> > break out of two loops, so the third one implied by the opening block >> > makes things no worse), or needs the whole file reading (which can be >> >> Looking for a file with a certain magicnumber in its 1st two bytes...? >> >> for name in filenames: >> opening(name) as f: >> if f.read(2) == 0xFEB0: break >> >> This does seem to make real-life sense to me... > > Yes, that'd do. I can't say I think it would be common, but it's a > valid case. And the workaround is the usual messy flag variable: > > for name in filenames: > found = False > opening(name) as f: > if f.read(2) == 0xFEB0: found = True > if found: break Is there anything we could do about this? Reinhold -- Mail address is perfectly valid! From adamsz at gmail.com Wed May 4 18:41:02 2005 From: adamsz at gmail.com (Adam Souzis) Date: Wed, 4 May 2005 09:41:02 -0700 Subject: [Python-Dev] "begin" as keyword for pep 340 Message-ID: I'm a bit surpised that no one has yet [1] suggested "begin" as a keyword instead "block" as it seems to express the intent of blocks and is concise and readable. For example, here are the examples in PEP 340 rewritten using "begin": begin locking(): ... begin opening(path) as f: #how about: begin using_file(path) as f: ... begin transaction(db): ... begin auto_retry(3): ... begin redirecting_stdout: .... Probably the biggest problem with "begin" is that it is relatively common as an identify. For example, Greping through Python's Lib directory, begin is used as a method name twice (in httplib and idlelib.pyshell) and as a local twice (in mhlib and pyassemb). 
However, i can't think of many instances where there would be ambiguity in usage -- could 'begin' be a pseudo-identifier like "as" for some transitional time? -- adam [1] (Or maybe GMail's search has failed me ;) From michele.simionato at gmail.com Wed May 4 18:49:17 2005 From: michele.simionato at gmail.com (Michele Simionato) Date: Wed, 4 May 2005 12:49:17 -0400 Subject: [Python-Dev] my first post: asking about a "decorator" module Message-ID: <4edc17eb050504094950154ed0@mail.gmail.com> My first post to python-dev, but I guess my name is not completely unknown in this list ;) Actually, I have been wondering about subscribing to python-dev for at least a couple of years, but never did it, because of the limited amount of time I have to follow all the interesting mailing lists in the world. However, in the last few months I have been involved with teaching Python and I have decided to follow more closely the development to keep myself updated on what is going on. Plus, I have some ideas I would like to share with people in this list. One of them concerns decorators. Are there plans to improve decorators support in future Python versions? By "improving decorator support" I mean for instance a module in the standard library providing some commonly used decorators such as ``memoize``, or utilities to create and compose decorators, and things like that. I have been doing some work on decorators lately and I would be willing to help is there is a general interest about a "decorator" module. Actually, I have already a good candidate function for that module, and plenty of recipes. I submitted an early version of the idea some time ago on c.l.py http://groups-beta.google.com/group/comp.lang.python/browse_frm/thread/60f22ed33af5dbcb/5f870d271456ccf3?q=simionato+decorate&rnum=1&hl=en#5f870d271456ccf3 but I could as well flesh it out and deliver a module people can play with and see if they like it. This is especially interesting in this moment, since decorators may address many of the use cases of PEP 340 (not all of them). I need to write down some documentation, but it could be done by tomorrow. What do people think? Michele Simionato From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 18:57:58 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 18:57:58 +0200 Subject: [Python-Dev] "begin" as keyword for pep 340 In-Reply-To: References: Message-ID: Adam Souzis wrote: > I'm a bit surpised that no one has yet [1] suggested "begin" as a > keyword instead "block" as it seems to express the intent of blocks > and is concise and readable. For example, here are the examples in > PEP 340 rewritten using "begin": > > begin locking(): > ... I don't know, but I always would expect "end" to follow each begin somewhere... the-good-old-pascal-days-ly yours, Reinhold PS: What about "using"? Too C#-ish? -- Mail address is perfectly valid! From mwh at python.net Wed May 4 19:02:20 2005 From: mwh at python.net (Michael Hudson) Date: Wed, 04 May 2005 18:02:20 +0100 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> (Nicholas Bastin's message of "Wed, 4 May 2005 11:59:40 -0400") References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> Message-ID: <2mhdhiyler.fsf@starship.python.net> Nicholas Bastin writes: > The current documentation for Py_UNICODE states: > > "This type represents a 16-bit unsigned storage type which is used by > Python internally as basis for holding Unicode ordinals. 
On platforms > where wchar_t is available and also has 16-bits, Py_UNICODE is a > typedef alias for wchar_t to enhance native platform compatibility. On > all other platforms, Py_UNICODE is a typedef alias for unsigned > short." > > I propose changing this to: > > "This type represents the storage type which is used by Python > internally as the basis for holding Unicode ordinals. On platforms > where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t > to enhance native platform compatibility. This just isn't true. Have you read ./configure --help recently? > On all other platforms, Py_UNICODE is a typedef alias for unsigned > short. Extension module developers should make no assumptions about > the size of this type on any given platform." I like this last sentence, though. > If no one has a problem with that, I'll make the change in CVS. I have a problem with replacing one lie with another :) Cheers, mwh -- Just put the user directories on a 486 with deadrat7.1 and turn the Octane into the afforementioned beer fridge and keep it in your office. The lusers won't notice the difference, except that you're more cheery during office hours. -- Pim van Riezen, asr From gvanrossum at gmail.com Wed May 4 19:09:43 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Wed, 4 May 2005 10:09:43 -0700 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: Nice one. Should be a piece of cake to implement. Please talk to peps at python.org about getting it checked into the PEP database. I'm +1 on accepting this now -- anybody against? On 5/4/05, Reinhold Birkenfeld wrote: > Hello, > > after proposing this here (albeit deep in the PEP 340 thread) and > getting a somewhat affirmatory response from Guido, I have written > something that could become a PEP if sufficiently hatched... > > --------------------------- > > PEP: XXX > Title: Unifying try-except and try-finally > Version: $Revision: $ > Last-Modified: $Date: $ > Author: Reinhold Birkenfeld > Status: Draft > Type: Standards Track > Content-Type: text/plain > Created: 04-May-2005 > Post-History: > > Abstract > > This PEP proposes a change in the syntax and semantics of try > statements to allow combined try-except-finally blocks. This > means in short that it would be valid to write > > try: > > except Exception: > > finally: > > > Rationale/Proposal > > There are many use cases for the try-except statement and > for the try-finally statement per se; however, often one needs > to catch exceptions and execute some cleanup code afterwards. > It is slightly annoying and not very intelligible that > one has to write > > f = None > try: > try: > f = open(filename) > text = f.read() > except IOError: > print 'An error occured' > finally: > if f: > f.close() > > So it is proposed that a construction like this > > try: > > except Ex1: > > > else: > > finally: > > > be exactly the same as the legacy > > try: > try: > > except Ex1: > > > else: > > finally: > > > This is backwards compatible, and every try statement that is > legal today would continue to work. 
> > Changes to the grammar > > The grammar for the try statement, which is currently > > try_stmt: ('try' ':' suite (except_clause ':' suite)+ > ['else' ':' suite] | 'try' ':' suite 'finally' ':' suite) > > would have to become > > try_stmt: ('try' ':' suite (except_clause ':' suite)+ > ['else' ':' suite] ['finally' ':' suite] | > 'try' ':' suite (except_clause ':' suite)* > ['else' ':' suite] 'finally' ':' suite) > > Implementation > > As the PEP author currently does not have sufficient knowledge > of the CPython implementation, he is unfortunately not able > to deliver one. > > References > > None yet. > > Copyright > > This document has been placed in the public domain. > > ----------------------- > Reinhold > > -- > Mail address is perfectly valid! > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > http://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: http://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (home page: http://www.python.org/~guido/) From gjc at inescporto.pt Wed May 4 19:14:01 2005 From: gjc at inescporto.pt (Gustavo J. A. M. Carneiro) Date: Wed, 04 May 2005 18:14:01 +0100 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword Message-ID: <1115226841.7909.24.camel@localhost> I have not read every email about this subject, so sorry if this has already been mentioned. In PEP 340 I read: block EXPR1 as VAR1: BLOCK1 I think it would be much clearer this (plus you save one keyword): block VAR1 = EXPR1: BLOCK1 Regards. -- Gustavo J. A. M. Carneiro The universe is always one step beyond logic. From nbastin at opnet.com Wed May 4 19:19:34 2005 From: nbastin at opnet.com (Nicholas Bastin) Date: Wed, 4 May 2005 13:19:34 -0400 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: <2mhdhiyler.fsf@starship.python.net> References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> <2mhdhiyler.fsf@starship.python.net> Message-ID: <19b53d82c6eb11419ddf4cb529241f64@opnet.com> On May 4, 2005, at 1:02 PM, Michael Hudson wrote: > Nicholas Bastin writes: > >> The current documentation for Py_UNICODE states: >> >> "This type represents a 16-bit unsigned storage type which is used by >> Python internally as basis for holding Unicode ordinals. On platforms >> where wchar_t is available and also has 16-bits, Py_UNICODE is a >> typedef alias for wchar_t to enhance native platform compatibility. >> On >> all other platforms, Py_UNICODE is a typedef alias for unsigned >> short." >> >> I propose changing this to: >> >> "This type represents the storage type which is used by Python >> internally as the basis for holding Unicode ordinals. On platforms >> where wchar_t is available, Py_UNICODE is a typedef alias for wchar_t >> to enhance native platform compatibility. > > This just isn't true. Have you read ./configure --help recently? Ok, so the above statement is true if the user does not set --enable-unicode=ucs[24] (was reading the whar_t test in configure.in, and not the generated configure help). Alternatively, we shouldn't talk about the size at all, and just leave the first and last sentences: "This type represents the storage type which is used by Python internally as the basis for holding Unicode ordinals. Extension module developers should make no assumptions about the size of this type on any given platform." 
-- Nick From rodsenra at gpr.com.br Wed May 4 19:30:21 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Wed, 04 May 2005 17:30:21 -0000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> Message-ID: <20020107054513.566d74ed@localhost.localdomain> [ Guido ]: > 1. Decide on a keyword to use, if any. Shouldn't be the other way around ? Decide to use *no* keyword, if that could be avoided. In my large inexperience *no keyword* is much better (if feasible): 1) No name conflicts with previous code: block, blocktemplate, whatever 2) ':' is already a block (broader sense) indication 3) Improved readbility: <> def locking_opening(lock, filename, mode="r"): block locking(lock): block opening(filename) as f: yield f <> def locking_opening(lock, filename, mode="r"): locking(lock): opening(filename) as f: yield f 4) Better to make the language parser more complex than the language exposed to end-users Following the PEP and this thread, it seems to me that __no keyword__ is less preferable than __some keyword__(=='block' so far), and I wonder why is not the reverse. Perhaps I missed something ? Besides, I think this solves many issues AOP was trying to tackle in a much cleaner, elegant -- therefore pythonic -- way. Outstanding. best regards, Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From barry at python.org Wed May 4 19:36:15 2005 From: barry at python.org (Barry Warsaw) Date: Wed, 04 May 2005 13:36:15 -0400 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <1115228174.21868.18.camel@geddy.wooz.org> On Wed, 2005-05-04 at 13:09, Guido van Rossum wrote: > Nice one. Should be a piece of cake to implement. Please talk to > peps at python.org about getting it checked into the PEP database. +1! -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 307 bytes Desc: This is a digitally signed message part Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/120d6578/attachment.pgp From rodsenra at gpr.com.br Wed May 4 20:23:17 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Wed, 04 May 2005 18:23:17 -0000 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <20020107063813.0e603043@localhost.localdomain> [ Guido ]: > Nice one. Should be a piece of cake to implement. Please talk to > peps at python.org about getting it checked into the PEP database. > > I'm +1 on accepting this now -- anybody against? +1 Last week, while I was giving a Python course (in Rio de Janeiro-Brazil) some students attempted to use try/except/finally blocks. I had to dig the grammar to prove to them that it was __not already__ supported. cheers, Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From python at rcn.com Wed May 4 20:26:49 2005 From: python at rcn.com (Raymond Hettinger) Date: Wed, 4 May 2005 14:26:49 -0400 Subject: [Python-Dev] my first post: asking about a "decorator" module In-Reply-To: <4edc17eb050504094950154ed0@mail.gmail.com> Message-ID: <002401c550d6$d669ab80$11bd2c81@oemcomputer> > Are there plans to improve decorators support in future Python versions? 
> By "improving decorator support" I mean for instance a module in the > standard > library providing some commonly used decorators such as ``memoize``, > or utilities to create and compose decorators, and things like that. Ultimately, some of these will likely end-up in the library. For the time being, I think it best that these get posted and evolve either as Wiki entries or as ASPN entries. The best practices and proven winners have yet to emerge. Solidifying first attempts is likely not a good idea. Putting tools in the standard library should be the last evolutionary step, not the first. Raymond Hettinger From jwp at localhost.lit.jwp.name Wed May 4 19:52:53 2005 From: jwp at localhost.lit.jwp.name (James William Pye) Date: Wed, 04 May 2005 10:52:53 -0700 Subject: [Python-Dev] Need to hook Py_FatalError In-Reply-To: References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> <1115218734.62180.119.camel@localhost> Message-ID: <1115229173.62180.171.camel@localhost> On Wed, 2005-05-04 at 15:29 +0000, M.Utku K. wrote: > Extension developer may discard or call them at his own will. That's the issue, an extension developer shouldn't be able to discard it, as I, the embedder, do not want my hook to be clobbered. The extension developer doesn't set the context of the application, the embedder does. > What if more than one extension needs it ? Firstly, I don't think it is likely an extension module *by itself* would ever have to initialize something that would *require* some form of cleanup if the app were to fatal out. If it did, I imagine that it would be suspect of poor design, any exceptions likely to be few and far between. Now, that doesn't mean its use during the process might not create some state or side effect where cleanup would be nice. Although, chances are that such cleanup should occur during normal operations, and be handled via a Python exception, something that a fatal is not. -- Regards, James William Pye -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 187 bytes Desc: This is a digitally signed message part Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/60abdfb4/attachment.pgp From fredrik at pythonware.com Wed May 4 20:29:18 2005 From: fredrik at pythonware.com (Fredrik Lundh) Date: Wed, 4 May 2005 20:29:18 +0200 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword References: <1115226841.7909.24.camel@localhost> Message-ID: Gustavo J. A. M. Carneiro wrote: > I have not read every email about this subject, so sorry if this has > already been mentioned. > > In PEP 340 I read: > > block EXPR1 as VAR1: > BLOCK1 > > I think it would be much clearer this (plus you save one keyword): > > block VAR1 = EXPR1: > BLOCK1 clearer for whom? where else is this construct used in Python? From fredrik at pythonware.com Wed May 4 20:33:10 2005 From: fredrik at pythonware.com (Fredrik Lundh) Date: Wed, 4 May 2005 20:33:10 +0200 Subject: [Python-Dev] New Py_UNICODE doc References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> <4qdjm0rt.fsf@python.net> Message-ID: Thomas Heller wrote: > AFAIK, you can configure Python to use 16-bits or 32-bits Unicode chars, > independend from the size of wchar_t. The HAVE_USABLE_WCHAR_T macro > can be used by extension writers to determine if Py_UNICODE is the same as > wchar_t. 
note that "usable" is more than just "same size"; it also implies that widechar predicates (iswalnum etc) works properly with Unicode characters, under all locales. From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 20:36:34 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 20:36:34 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <20020107054513.566d74ed@localhost.localdomain> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: Rodrigo Dias Arruda Senra wrote: > [ Guido ]: > > 1. Decide on a keyword to use, if any. > > Shouldn't be the other way around ? > Decide to use *no* keyword, if that could be avoided. [...] > Following the PEP and this thread, it seems to me that __no keyword__ > is less preferable than __some keyword__(=='block' so far), and I wonder > why is not the reverse. Perhaps I missed something ? There is one problem with using no keyword: You cannot use arbitrary expressions in the new statement. Consider: resource = opening("file.txt") block resource: (...) resource = opening("file.txt") resource: (...) The latter would have to be forbidden. (Seeing these examples, I somehow strongly dislike "block"; "with" or "using" seem really better) Reinhold -- Mail address is perfectly valid! From tim.peters at gmail.com Wed May 4 20:41:22 2005 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 4 May 2005 14:41:22 -0400 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <1f7befae05050411416c198c54@mail.gmail.com> [Guido] > I'm +1 on accepting this now -- anybody against? I'm curious to know if you (Guido) remember why you removed this feature in Python 0.9.6? From the HISTORY file: """ New features in 0.9.6: - stricter try stmt syntax: cannot mix except and finally clauses on 1 try """ IIRC (and I may well not), half of people guessed wrong about whether an exception raised in an "except:" suite would or would not skip execution of the same-level "finally:" suite. try: 1/0 except DivisionByZero: 2/0 finally: print "yes or no?" The complementary question is whether an exception in the "finally:" suite will be handled by the same-level "except:" suites. There are obvious answers to both, of course. The question is whether they're the _same_ obvious answers across responders <0.7 wink>. From bob at redivi.com Wed May 4 20:45:25 2005 From: bob at redivi.com (Bob Ippolito) Date: Wed, 4 May 2005 14:45:25 -0400 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword In-Reply-To: References: <1115226841.7909.24.camel@localhost> Message-ID: <2655785A-547C-4832-AC3F-A9271DB61A06@redivi.com> On May 4, 2005, at 2:29 PM, Fredrik Lundh wrote: > Gustavo J. A. M. Carneiro wrote: > > >> I have not read every email about this subject, so sorry if this >> has >> already been mentioned. >> >> In PEP 340 I read: >> >> block EXPR1 as VAR1: >> BLOCK1 >> >> I think it would be much clearer this (plus you save one keyword): >> >> block VAR1 = EXPR1: >> BLOCK1 >> > > clearer for whom? where else is this construct used in Python? It might be more clear to have the "var" on the left. The only place it's used on the right (that I know of) is in import statements when using the "as" clause. Assignment, for loops, generator expressions, list comprehensions, etc. always have the var on the left. 
-bob From noamraph at gmail.com Wed May 4 20:57:42 2005 From: noamraph at gmail.com (Noam Raphael) Date: Wed, 4 May 2005 20:57:42 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: On 5/4/05, Reinhold Birkenfeld wrote: > > There is one problem with using no keyword: You cannot use arbitrary expressions > in the new statement. Consider: > > resource = opening("file.txt") > block resource: > (...) > > resource = opening("file.txt") > resource: > (...) > > The latter would have to be forbidden. Can you explain why it would have to be forbidden please? Thanks, Noam From shane at hathawaymix.org Wed May 4 21:02:40 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 13:02:40 -0600 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> Message-ID: <42791C50.3090107@hathawaymix.org> Alex Martelli wrote: > Looking for a file with a certain magicnumber in its 1st two bytes...? > > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: break > > This does seem to make real-life sense to me... I'd like to suggest a small language enhancement that would fix this example. Allow the break and continue statements to use a keyword, either "for" or "while", to state that the code should break out of both the block statement and the innermost "for" or "while" statement. The example above would change to: for name in filenames: opening(name) as f: if f.read(2) == 0xFEB0: break for This could be a separate PEP if necessary. When a "break for" is used in a block statement, it should raise a new kind of exception, BreakForLoop, and the block statement should propagate the exception. When used outside a block statement, "break for" can use existing Python byte code to jump directly to the next appropriate statement. Shane From theller at python.net Wed May 4 20:59:24 2005 From: theller at python.net (Thomas Heller) Date: Wed, 04 May 2005 20:59:24 +0200 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: (Fredrik Lundh's message of "Wed, 4 May 2005 20:33:10 +0200") References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> <4qdjm0rt.fsf@python.net> Message-ID: "Fredrik Lundh" writes: > Thomas Heller wrote: > >> AFAIK, you can configure Python to use 16-bits or 32-bits Unicode chars, >> independend from the size of wchar_t. The HAVE_USABLE_WCHAR_T macro >> can be used by extension writers to determine if Py_UNICODE is the same as >> wchar_t. > > note that "usable" is more than just "same size"; it also implies that widechar > predicates (iswalnum etc) works properly with Unicode characters, under all > locales. Ok, so who is going to collect the wisdom of this thread into the docs? Thomas From m.u.k.2 at gawab.com Wed May 4 20:52:51 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 18:52:51 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> <1115218734.62180.119.camel@localhost> Message-ID: Hi all, > strip..... > What if more than one extension needs it ? 
> Curently Im doing > > callback_type SetCallBack(callback_type newfunc) > > This will set the callback to newfunc and return the old one. Extension > developer may discard or call them at his own will. What do you think? > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > http://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > http://mail.python.org/mailman/options/python-dev/python-python-dev%40m.g > mane.org > Ok then, it will be a one shot callback registration. By the way declaration of the func ( void SetFatalError_Callback(PyFatalError_Func func) ) is in "pyerrors.h" but implemenatiton is is in "Pythonrun.c". Is it OK? Im listening for more. Best regards. From shane at hathawaymix.org Wed May 4 21:08:59 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 13:08:59 -0600 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword In-Reply-To: <1115226841.7909.24.camel@localhost> References: <1115226841.7909.24.camel@localhost> Message-ID: <42791DCB.5050703@hathawaymix.org> Gustavo J. A. M. Carneiro wrote: > In PEP 340 I read: > > block EXPR1 as VAR1: > BLOCK1 > > I think it would be much clearer this (plus you save one keyword): > > block VAR1 = EXPR1: > BLOCK1 I think you misunderstood the statement. EXPR1 creates an iterator, then VAR1 iterates over the values returns by the iterator. VAR1 never sees the iterator. Using your syntax would reinforce the misinterpretation that VAR1 sees the iterator. Shane From rodsenra at gpr.com.br Wed May 4 21:12:33 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Wed, 4 May 2005 16:12:33 -0300 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: <20050504161233.541b3fc6@localhost.localdomain> [ Senra ]: > > [ Guido ]: > > > 1. Decide on a keyword to use, if any. > > > > Shouldn't be the other way around ? > > Decide to use *no* keyword, if that could be avoided. [ Reinhold ]: > There is one problem with using no keyword: You cannot use arbitrary expressions > in the new statement. Consider: > > resource = opening("file.txt") > block resource: > (...) > > resource = opening("file.txt") > resource: > (...) > > The latter would have to be forbidden. I'm not quite sure why, but there seem to be a workaround (forseen in PEP 340). And people seem to be "using" this already <0.5 wink>: [Alex Martelli]: > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: break Moreover, an anonymous block should have no <> (neither 'block', 'with', 'using') to be true anonymous <1.0-Tim-Peter'ly wink> cheers, Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From shane at hathawaymix.org Wed May 4 21:14:03 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 13:14:03 -0600 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword In-Reply-To: <42791DCB.5050703@hathawaymix.org> References: <1115226841.7909.24.camel@localhost> <42791DCB.5050703@hathawaymix.org> Message-ID: <42791EFB.7090200@hathawaymix.org> Shane Hathaway wrote: > Gustavo J. A. M. Carneiro wrote: > >> In PEP 340 I read: >> >> block EXPR1 as VAR1: >> BLOCK1 >> >> I think it would be much clearer this (plus you save one keyword): >> >> block VAR1 = EXPR1: >> BLOCK1 > > > I think you misunderstood the statement. 
EXPR1 creates an iterator, > then VAR1 iterates over the values returns by the iterator. VAR1 never ^^^^^^^^^^ returned by > sees the iterator. Using your syntax would reinforce the > misinterpretation that VAR1 sees the iterator. From mitja.marn at gmail.com Wed May 4 21:10:34 2005 From: mitja.marn at gmail.com (Mitja Marn) Date: Wed, 4 May 2005 21:10:34 +0200 Subject: [Python-Dev] "begin" as keyword for pep 340 In-Reply-To: References: Message-ID: On 5/4/05, Reinhold Birkenfeld wrote: > PS: What about "using"? Too C#-ish? Another idea from a hobbyist programmer: "holding" or mabe just "hold". Like this: hold locked(myLock): # Code here executes with myLock held. The lock is # guaranteed to be released when the block is left (even # if via return or by an uncaught exception). hold opened("/etc/passwd") as f: for line in f: print line.rstrip() From m.u.k.2 at gawab.com Wed May 4 21:05:35 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 19:05:35 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: <20050503112637.648F.JCARLSON@uci.edu> <20050503132639.6492.JCARLSON@uci.edu> <1115171686.62180.48.camel@localhost> <1115218734.62180.119.camel@localhost> Message-ID: "M.Utku K." wrote in news:Xns964CE11B16061token@ 80.91.229.5: > _Callback(PyFatalError_Func func) ) > is in "pyerrors.h" but implemenatiton is > is in "Pythonrun.c". Is it OK? Im listening for more. > Sorry, just checked decl. will be in "pydebug.h" From skip at pobox.com Wed May 4 21:19:25 2005 From: skip at pobox.com (Skip Montanaro) Date: Wed, 4 May 2005 14:19:25 -0500 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <17017.8253.375246.734512@montanaro.dyndns.org> Guido> Nice one. Should be a piece of cake to implement. Please talk to Guido> peps at python.org about getting it checked into the PEP database. Guido> I'm +1 on accepting this now -- anybody against? I'm not against it, but I thought there were ambiguity reasons that this construct wasn't already implemented. I'm pretty sure people have asked about it before but been rebuffed. Here's a message with Reinhold's fingerprints on it: http://mail.python.org/pipermail/python-list/2004-June/227008.html Here's another one: http://mail.python.org/pipermail/python-list/2003-November/193159.html Both reference other articles which presumably have more details about the reasoning, but I was unable to find them with a quick search. Skip From python-dev at zesty.ca Wed May 4 21:20:09 2005 From: python-dev at zesty.ca (Ka-Ping Yee) Date: Wed, 4 May 2005 14:20:09 -0500 (CDT) Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <42791C50.3090107@hathawaymix.org> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <42791C50.3090107@hathawaymix.org> Message-ID: On Wed, 4 May 2005, Shane Hathaway wrote: > I'd like to suggest a small language enhancement that would fix this > example. Allow the break and continue statements to use a keyword, > either "for" or "while", to state that the code should break out of both > the block statement and the innermost "for" or "while" statement. The > example above would change to: > > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: > break for This is very elegant. 
It works beautifully with "break", though at first that natural analogs "continue for", "continue while" appear to conflict with Guido's proposed extension to "continue". But if we choose the keyword "with" to introduce an anonymous block, it comes out rather nicely: continue with 2 That's easier to read than "continue 2", in my opinion. (If it's not too cute for you.) Anyway, i like the general idea of letting the programmer specify exactly which block to break/continue, instead of leaving it looking ambiguous. Explicit is better than implicit, right? -- ?!ng From rodsenra at gpr.com.br Wed May 4 21:24:10 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Wed, 4 May 2005 16:24:10 -0300 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <42791C50.3090107@hathawaymix.org> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <42791C50.3090107@hathawaymix.org> Message-ID: <20050504162410.7e7fcd14@localhost.localdomain> [ Shane Hathaway ]: > I'd like to suggest a small language enhancement that would fix this > example. Allow the break and continue statements to use a keyword, > either "for" or "while", to state that the code should break out of both > the block statement and the innermost "for" or "while" statement. The > example above would change to: > > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: > break for > > This could be a separate PEP if necessary. When a "break for" is used > in a block statement, it should raise a new kind of exception, > BreakForLoop, and the block statement should propagate the exception. > When used outside a block statement, "break for" can use existing Python > byte code to jump directly to the next appropriate statement. What about nested blocks ? When they act as iterators that would be desireable too. What to do then: - baptize blocks -> break - keep them anonymous -> break #enclosing_scope_counter - do not support them cheers, Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From barry at python.org Wed May 4 21:20:31 2005 From: barry at python.org (Barry Warsaw) Date: Wed, 04 May 2005 15:20:31 -0400 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com> References: <1f7befae05050411416c198c54@mail.gmail.com> Message-ID: <1115234431.21863.23.camel@geddy.wooz.org> On Wed, 2005-05-04 at 14:41, Tim Peters wrote: > IIRC (and I may well not), half of people guessed wrong about whether > an exception raised in an "except:" suite would or would not skip > execution of the same-level "finally:" suite. It would not, obviously . > try: > 1/0 > except DivisionByZero: > 2/0 > finally: > print "yes or no?" > > The complementary question is whether an exception in the "finally:" > suite will be handled by the same-level "except:" suites. It would not, obviously . > There are obvious answers to both, of course. The question is whether > they're the _same_ obvious answers across responders <0.7 wink>. It only matters that it's the same obvious answers across all responders who are right. :) -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 307 bytes Desc: This is a digitally signed message part Url : http://mail.python.org/pipermail/python-dev/attachments/20050504/8599b268/attachment.pgp From shane at hathawaymix.org Wed May 4 21:31:23 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 13:31:23 -0600 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <42791C50.3090107@hathawaymix.org> Message-ID: <4279230B.4050902@hathawaymix.org> Ka-Ping Yee wrote: > On Wed, 4 May 2005, Shane Hathaway wrote: >> >> for name in filenames: >> opening(name) as f: >> if f.read(2) == 0xFEB0: >> break for > > > This is very elegant. Thanks. > It works beautifully with "break", though at > first that natural analogs "continue for", "continue while" appear to > conflict with Guido's proposed extension to "continue". > > But if we choose the keyword "with" to introduce an anonymous block, > it comes out rather nicely: > > continue with 2 > > That's easier to read than "continue 2", in my opinion. (If it's not > too cute for you.) Or perhaps: continue yield 2 This would create some symmetry, since generators will retrieve the value passed by a continue statement using a yield expression. Shane From gvanrossum at gmail.com Wed May 4 21:27:03 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Wed, 4 May 2005 12:27:03 -0700 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com> References: <1f7befae05050411416c198c54@mail.gmail.com> Message-ID: [Tim] > I'm curious to know if you (Guido) remember why you removed this > feature in Python 0.9.6? From the HISTORY file: > > """ > New features in 0.9.6: > - stricter try stmt syntax: cannot mix except and finally clauses on 1 try > """ > > IIRC (and I may well not), half of people guessed wrong about whether > an exception raised in an "except:" suite would or would not skip > execution of the same-level "finally:" suite. > > try: > 1/0 > except DivisionByZero: > 2/0 > finally: > print "yes or no?" > > The complementary question is whether an exception in the "finally:" > suite will be handled by the same-level "except:" suites. No. The rule of thumb is that control only passes forward. > There are obvious answers to both, of course. The question is whether > they're the _same_ obvious answers across responders <0.7 wink>. I think the main person confused was me. :-) In addition, at the time I don't think I knew Java -- certainly I didn't know it well enough to realize that it gives this construct the meaning proposed by Reinhold's PEP. -- --Guido van Rossum (home page: http://www.python.org/~guido/) From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 21:28:23 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 21:28:23 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: Noam Raphael wrote: > On 5/4/05, Reinhold Birkenfeld wrote: >> >> There is one problem with using no keyword: You cannot use arbitrary expressions >> in the new statement. Consider: >> >> resource = opening("file.txt") >> block resource: >> (...) 
>> >> resource = opening("file.txt") >> resource: >> (...) >> >> The latter would have to be forbidden. > > Can you explain why it would have to be forbidden please? Well, with it you could create suites with _any_ introducing identifier. Consider: with: (...) synchronized: (...) try: (...) transaction: (...) Do you understand my concern? It would be very, very hard to discern these "user-defined statements" from real language constructs. Reinhold -- Mail address is perfectly valid! From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 21:33:29 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 21:33:29 +0200 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <1f7befae05050411416c198c54@mail.gmail.com> References: <1f7befae05050411416c198c54@mail.gmail.com> Message-ID: Tim Peters wrote: > [Guido] >> I'm +1 on accepting this now -- anybody against? > > I'm curious to know if you (Guido) remember why you removed this > feature in Python 0.9.6? From the HISTORY file: > > """ > New features in 0.9.6: > - stricter try stmt syntax: cannot mix except and finally clauses on 1 try > """ > > IIRC (and I may well not), half of people guessed wrong about whether > an exception raised in an "except:" suite would or would not skip > execution of the same-level "finally:" suite. With the arrival of Java and C#, which both have this feature, I think the wrong guesses are minimized. I think the behaviour of the "else" clause is much harder to guess, mainly when used with the looping constructs. > try: > 1/0 > except DivisionByZero: > 2/0 > finally: > print "yes or no?" > > The complementary question is whether an exception in the "finally:" > suite will be handled by the same-level "except:" suites. No, as except clauses can only occur before the finally clause, and execution should not go backwards. > There are obvious answers to both, of course. The question is whether > they're the _same_ obvious answers across responders <0.7 wink>. Reinhold -- Mail address is perfectly valid! From python-dev at zesty.ca Wed May 4 21:42:50 2005 From: python-dev at zesty.ca (Ka-Ping Yee) Date: Wed, 4 May 2005 14:42:50 -0500 (CDT) Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: Reinhold Birkenfeld wrote: > There is one problem with using no keyword: You cannot use arbitrary > expressions in the new statement. [...] > resource = opening("file.txt") > resource: > (...) > > The latter would have to be forbidden. Noam Raphael wrote: > Can you explain why it would have to be forbidden please? Reinhold Birkenfeld wrote: > Well, with it you could create suites with _any_ introducing > identifier. Consider: > > with: > (...) > > synchronized: > (...) > > try: > (...) > > transaction: > (...) > > Do you understand my concern? It would be very, very hard to discern > these "user-defined statements" from real language constructs. I think part of the debate is about whether that's good or bad. I happen to agree with you -- i think a keyword is necessary -- but i believe some people see an advantage in having the flexibility to make a "real-looking" construct. As i see it the argument boils down to: Python is not Lisp. There are good reasons why the language has keywords, why it distinguishes statements from expressions, uses indentation, and so on. 
All of these properties cause Python programs to be made of familiar and easily recognizable patterns instead of degenerating into a homogeneous pile of syntax. -- ?!ng From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 21:45:36 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 21:45:36 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: Ka-Ping Yee wrote: > Reinhold Birkenfeld wrote: >> Well, with it you could create suites with _any_ introducing >> identifier. Consider: >> >> with: >> (...) >> >> synchronized: >> (...) >> >> try: >> (...) >> >> transaction: >> (...) >> >> Do you understand my concern? It would be very, very hard to discern >> these "user-defined statements" from real language constructs. > > I think part of the debate is about whether that's good or bad. > I happen to agree with you -- i think a keyword is necessary -- > but i believe some people see an advantage in having the flexibility > to make a "real-looking" construct. Yes. But it would only be crippled, as the "keyword" would have to be a pre-constructed generator instance which cannot be easily reused as a library export (at least, it is not intended this way). > As i see it the argument boils down to: Python is not Lisp. > > There are good reasons why the language has keywords, why it > distinguishes statements from expressions, uses indentation, and > so on. All of these properties cause Python programs to be made > of familiar and easily recognizable patterns instead of degenerating > into a homogeneous pile of syntax. Big ACK. Reinhold -- Mail address is perfectly valid! From shane at hathawaymix.org Wed May 4 21:54:48 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 13:54:48 -0600 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: <42792888.10209@hathawaymix.org> Reinhold Birkenfeld wrote: > Noam Raphael wrote: > >>On 5/4/05, Reinhold Birkenfeld wrote: >>>resource = opening("file.txt") >>>resource: >>> (...) >>> >>>The latter would have to be forbidden. >> >>Can you explain why it would have to be forbidden please? > > > Well, with it you could create suites with _any_ introducing > identifier. Consider: > [...] > > transaction: > (...) > > > Do you understand my concern? It would be very, very hard to discern > these "user-defined statements" from real language constructs. For each block statement, it is necessary to create a *new* iterator, since iterators that have stopped are required to stay stopped. So at a minimum, used-defined statements will need to call something, and thus will have parentheses. The parentheses might be enough to make block statements not look like built-in keywords. PEP 340 seems to punish people for avoiding the parentheses: transaction = begin_transaction() transaction: db.execute('insert 3 into mytable') transaction: db.execute('insert 4 into mytable') I expect that only '3' would be inserted in mytable. The second use of the transaction iterator will immediately raise StopIteration. 
Shane From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 21:53:50 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 21:53:50 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42792888.10209@hathawaymix.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: Shane Hathaway wrote: > For each block statement, it is necessary to create a *new* iterator, Right. > since iterators that have stopped are required to stay stopped. So at a > minimum, used-defined statements will need to call something, and thus > will have parentheses. The parentheses might be enough to make block > statements not look like built-in keywords. > > PEP 340 seems to punish people for avoiding the parentheses: > > transaction = begin_transaction() > > transaction: > db.execute('insert 3 into mytable') > > transaction: > db.execute('insert 4 into mytable') > > I expect that only '3' would be inserted in mytable. The second use of > the transaction iterator will immediately raise StopIteration. Yes, but wouldn't you think that people would misunderstand it in this way? Reinhold -- Mail address is perfectly valid! From bac at OCF.Berkeley.EDU Wed May 4 22:10:21 2005 From: bac at OCF.Berkeley.EDU (Brett C.) Date: Wed, 04 May 2005 13:10:21 -0700 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <42792C2D.3080609@ocf.berkeley.edu> Guido van Rossum wrote: > Nice one. Should be a piece of cake to implement. Please talk to > peps at python.org about getting it checked into the PEP database. > > I'm +1 on accepting this now -- anybody against? > I'm +1. A couple of us discussed this at PyCon during the last day of the sprints and we all thought that it could be done, but none of us felt strong enough to write the PEP immediately. So thanks to Reinhold for picking up our slack. =) -Brett From tim.peters at gmail.com Wed May 4 22:14:23 2005 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 4 May 2005 16:14:23 -0400 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: <1f7befae05050411416c198c54@mail.gmail.com> Message-ID: <1f7befae05050413141e1128dc@mail.gmail.com> [Reinhold Birkenfeld] > ... > I think the behaviour of the "else" clause is much harder to guess, > mainly when used with the looping constructs. No, that's obvious . What about `else` mixed with try/except/finally? try: A except: B else: C finally: D If A executes without exception, does D execute before or after C? I'm not saying we can't make up reasonable answers. I'm saying they look more-or-less arbitrary, while the current nested forms are always clear. From shane at hathawaymix.org Wed May 4 22:23:41 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 14:23:41 -0600 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: <42792F4D.4010706@hathawaymix.org> Reinhold Birkenfeld wrote: > Shane Hathaway wrote: > > >>For each block statement, it is necessary to create a *new* iterator, > > > Right. > > >>since iterators that have stopped are required to stay stopped. So at a >>minimum, used-defined statements will need to call something, and thus >>will have parentheses. 
The parentheses might be enough to make block >>statements not look like built-in keywords. >> >>PEP 340 seems to punish people for avoiding the parentheses: >> >> transaction = begin_transaction() >> >> transaction: >> db.execute('insert 3 into mytable') >> >> transaction: >> db.execute('insert 4 into mytable') >> >>I expect that only '3' would be inserted in mytable. The second use of >>the transaction iterator will immediately raise StopIteration. > > > Yes, but wouldn't you think that people would misunderstand it in this way? Yes, they might. Just to be clear, the risk is that people will try to write statements without parentheses and get burned because their code doesn't get executed, right? A possible workaround is to identify iterators that have already finished. StopIteration doesn't distinguish between an iterator that never yields any values from an iterator that has yielded all of its values. Maybe there should be a subclass of StopIteration like "AlreadyStoppedIteration". Then, if a block statement gets an AlreadyStoppedIteration exception from its iterator, it should convert that to an error like "InvalidBlockError". Shane From gjc at inescporto.pt Wed May 4 22:21:15 2005 From: gjc at inescporto.pt (Gustavo J. A. M. Carneiro) Date: Wed, 04 May 2005 21:21:15 +0100 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword In-Reply-To: <42791DCB.5050703@hathawaymix.org> References: <1115226841.7909.24.camel@localhost> <42791DCB.5050703@hathawaymix.org> Message-ID: <1115238075.10836.4.camel@emperor> On Wed, 2005-05-04 at 13:08 -0600, Shane Hathaway wrote: > Gustavo J. A. M. Carneiro wrote: > > In PEP 340 I read: > > > > block EXPR1 as VAR1: > > BLOCK1 > > > > I think it would be much clearer this (plus you save one keyword): > > > > block VAR1 = EXPR1: > > BLOCK1 > > I think you misunderstood the statement. EXPR1 creates an iterator, > then VAR1 iterates over the values returns by the iterator. VAR1 never > sees the iterator. Using your syntax would reinforce the > misinterpretation that VAR1 sees the iterator. In that case, block VAR1 in EXPR1: BLOCK1 And now I see how using 'for' statements (perhaps slightly changed) turned up in the discussion. Sorry for the noise. -- Gustavo J. A. M. Carneiro The universe is always one step beyond logic From skip at pobox.com Wed May 4 22:27:33 2005 From: skip at pobox.com (Skip Montanaro) Date: Wed, 4 May 2005 15:27:33 -0500 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <1f7befae05050413141e1128dc@mail.gmail.com> References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> Message-ID: <17017.12341.491136.991788@montanaro.dyndns.org> Tim> What about `else` mixed with try/except/finally? Tim> try: Tim> A Tim> except: Tim> B Tim> else: Tim> C Tim> finally: Tim> D Tim> If A executes without exception, does D execute before or after C? According to Guido, execution is A, C, D in the normal case and A, B, D in the exceptional case. Execution never jumps back. Tim> I'm not saying we can't make up reasonable answers. I'm saying Tim> they look more-or-less arbitrary, while the current nested forms Tim> are always clear. As far as arbitrary answers go, execution only in the forward direction seems more reasonable than jumping forward and back. 
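The A, C, D ordering is easy to check today with the nested spelling, which is the only one current Python accepts; attempt() below is just a throwaway helper for the demonstration::

    def attempt(fail):
        order = []
        try:
            try:
                order.append('A')
                if fail:
                    raise ValueError
            except ValueError:
                order.append('B')
            else:
                order.append('C')
        finally:
            order.append('D')
        return order

    print attempt(False)    # ['A', 'C', 'D'] -- else runs before finally
    print attempt(True)     # ['A', 'B', 'D']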
Skip From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 22:28:23 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 22:28:23 +0200 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <1f7befae05050413141e1128dc@mail.gmail.com> References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> Message-ID: Tim Peters wrote: > [Reinhold Birkenfeld] >> ... >> I think the behaviour of the "else" clause is much harder to guess, >> mainly when used with the looping constructs. > > No, that's obvious . OK, I'm persuaded. Well you wield the Force, master. > What about `else` mixed with try/except/finally? > > try: > A > except: > B > else: > C > finally: > D > > If A executes without exception, does D execute before or after C? Given the order of the clauses, is it so ambiguous? Reinhold -- Mail address is perfectly valid! From aahz at pythoncraft.com Wed May 4 22:52:27 2005 From: aahz at pythoncraft.com (Aahz) Date: Wed, 4 May 2005 13:52:27 -0700 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <79990c6b0505040827941ff0@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <79990c6b0505040827941ff0@mail.gmail.com> Message-ID: <20050504205226.GA25641@panix.com> On Wed, May 04, 2005, Paul Moore wrote: > > Yes, that'd do. I can't say I think it would be common, but it's a > valid case. And the workaround is the usual messy flag variable: > > for name in filenames: > found = False > opening(name) as f: > if f.read(2) == 0xFEB0: found = True > if found: break My standard workaround is using exceptions, but I'm not sure how that interacts with a block: try: for name in filenames: with opened(name) as f: if f.read(2) == 0xFEB0: raise Found except Found: pass -- Aahz (aahz at pythoncraft.com) <*> http://www.pythoncraft.com/ "It's 106 miles to Chicago. We have a full tank of gas, a half-pack of cigarettes, it's dark, and we're wearing sunglasses." "Hit it." From rodsenra at gpr.com.br Wed May 4 23:29:47 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Wed, 4 May 2005 18:29:47 -0300 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> Message-ID: <20050504182947.6b95773a@localhost.localdomain> [ Reinhold Birkenfeld ]: > Well, with it you could create suites with _any_ introducing > identifier. Consider: > > with: > (...) > > synchronized: > (...) > > try: > (...) > > transaction: > (...) > > > Do you understand my concern? I definetely see your point. However,... > It would be very, very hard to discern > these "user-defined statements" from real language constructs. - today it is hard to distinguish a user-defined function from a builtin function. What is the problem with the former (anonymous blocks) that is accepted for the later (functions). I guess the solution is the same for both: documentation. - 'try' would probably be an invalid identifier/expression in a block, as well as any other reserved words. So no confusion arises from '''try:''' nor '''while''' nor '''for''' nor '''except''' etc [ Ka-Ping Yee ]: > My point is There are good reasons why the language has keywords, why it > distinguishes statements from expressions, uses indentation, and > so on. 
All of these properties cause Python programs to be made > of familiar and easily recognizable patterns instead of degenerating > into a homogeneous pile of syntax. I am completely in favour of preserving Python's readability and simplicity. But metaclasses and decorators introduced opportunities for some magical spells. Either you know their definition and how they modify its subjects or your code understanding might be harmed to a certain degree. They were born without being baptized with a keyword, why should blocks ? I think that the absence of 'name clashing', alone, is the strong argument in favour of the __no keyword__ proposal. Recognizing a __no keyword__ block would be very easy. If you did not recognize it as something you already knew, then it was a block <0.2 wink>. best regards, Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From reinhold-birkenfeld-nospam at wolke7.net Wed May 4 23:31:53 2005 From: reinhold-birkenfeld-nospam at wolke7.net (Reinhold Birkenfeld) Date: Wed, 04 May 2005 23:31:53 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <20050504205226.GA25641@panix.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <79990c6b0505040827941ff0@mail.gmail.com> <20050504205226.GA25641@panix.com> Message-ID: Aahz wrote: > On Wed, May 04, 2005, Paul Moore wrote: >> >> Yes, that'd do. I can't say I think it would be common, but it's a >> valid case. And the workaround is the usual messy flag variable: >> >> for name in filenames: >> found = False >> opening(name) as f: >> if f.read(2) == 0xFEB0: found = True >> if found: break > > My standard workaround is using exceptions, but I'm not sure how that > interacts with a block: > > try: > for name in filenames: > with opened(name) as f: > if f.read(2) == 0xFEB0: > raise Found > except Found: > pass >From a naive point of view, it should definitely work as expected. >From the PEP point of view, no clue. *hope* Reinhold -- Mail address is perfectly valid! From martin at v.loewis.de Thu May 5 00:00:34 2005 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 05 May 2005 00:00:34 +0200 Subject: [Python-Dev] Py_UNICODE madness In-Reply-To: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com> References: <8c00ac9db9a6be3e4a937eb6290c9838@opnet.com> Message-ID: <42794602.7080609@v.loewis.de> Nicholas Bastin wrote: > That makes PyUnicode_FromUnicode() a lot less useful. Well, really, > not useful at all. Why do you say that? Py_UNICODE is as good a type to store characters as any other, and if you happen to have a Py_UNICODE[], you can use that function to build a unicode object. > You might suggest that PyUnicode_FromWideChar is more useful, but > that's only true on platforms that support wchar_t. Useful to do what? PyInt_FromLong isn't useful if you have a void*, either... > Is there no universally supported way of moving buffers of unicode data > (as common data types, like unsigned short, etc.) into Python from C? There is no common Unicode type in C, period (be it Python or not). Your best bet is to prepare a Py_UNICODE[], either by copying from your favourite Unicode type, or by casting it, e.g. 
#if Py_UNICODE_IS_AS_WIDE_AS_MY_UNICODE_TYPE
    Py_UNICODE *data = (Py_UNICODE *) my_data;
    int do_free = 0;
#else
    Py_UNICODE *data = malloc(sizeof(Py_UNICODE) * my_data_len);
    int do_free = 1;
    for (int i = 0; i < my_data_len; i++)
        data[i] = my_data[i];
#endif
    /* PyUnicode_FromUnicode takes the buffer and its length */
    PyObject *uni = PyUnicode_FromUnicode(data, my_data_len);
    if (do_free)
        free(data);

Regards, Martin

From martin at v.loewis.de Thu May 5 00:03:37 2005 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 05 May 2005 00:03:37 +0200 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: <19b53d82c6eb11419ddf4cb529241f64@opnet.com> References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> <2mhdhiyler.fsf@starship.python.net> <19b53d82c6eb11419ddf4cb529241f64@opnet.com> Message-ID: <427946B9.6070500@v.loewis.de>

Nicholas Bastin wrote:
> "This type represents the storage type which is used by Python
> internally as the basis for holding Unicode ordinals. Extension module
> developers should make no assumptions about the size of this type on
> any given platform."

But people want to know "Is Python's Unicode 16-bit or 32-bit?" So the documentation should explicitly say "it depends".

Regards, Martin

From shane at hathawaymix.org Thu May 5 00:20:45 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 16:20:45 -0600 Subject: [Python-Dev] New Py_UNICODE doc In-Reply-To: <427946B9.6070500@v.loewis.de> References: <9e09f34df62dfcf799155a4ff4fdc327@opnet.com> <2mhdhiyler.fsf@starship.python.net> <19b53d82c6eb11419ddf4cb529241f64@opnet.com> <427946B9.6070500@v.loewis.de> Message-ID: <42794ABD.2080405@hathawaymix.org>

Martin v. Löwis wrote:
> Nicholas Bastin wrote:
>>"This type represents the storage type which is used by Python
>>internally as the basis for holding Unicode ordinals. Extension module
>>developers should make no assumptions about the size of this type on
>>any given platform."
>
> But people want to know "Is Python's Unicode 16-bit or 32-bit?"
> So the documentation should explicitly say "it depends".

On a related note, it would help if the documentation provided a little more background on unicode encoding. Specifically, that UCS-2 is not the same as UTF-16, even though they're both two bytes wide and most of the characters are the same. UTF-16 can encode 4-byte characters, while UCS-2 can't. A Py_UNICODE is either UCS-2 or UCS-4. It took me quite some time to figure that out so I could produce a patch [1]_ for PyXPCOM that fixes its unicode support.

.. [1] https://bugzilla.mozilla.org/show_bug.cgi?id=281156

Shane

From shane.holloway at ieee.org Thu May 5 00:45:58 2005 From: shane.holloway at ieee.org (Shane Holloway (IEEE)) Date: Wed, 04 May 2005 16:45:58 -0600 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <17017.12341.491136.991788@montanaro.dyndns.org> References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> <17017.12341.491136.991788@montanaro.dyndns.org> Message-ID: <427950A6.6090408@ieee.org>

And per the PEP, I think explaining that::

    try:
        A
    except:
        B
    else:
        C
    finally:
        D

is *exactly* equivalent to::

    try:
        try:
            A
        except:
            B
        else:
            C
    finally:
        D

resolved all the questions about control flow for me. Well, assuming that the implementation makes the explanation true. ;)

From m.u.k.2 at gawab.com Thu May 5 00:51:30 2005 From: m.u.k.2 at gawab.com (M.Utku K.) Date: Wed, 4 May 2005 22:51:30 +0000 (UTC) Subject: [Python-Dev] Need to hook Py_FatalError References: Message-ID:

Hi,

Added the patch to the patch manager on SF. Best regards.
From tdelaney at avaya.com Thu May 5 01:28:01 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Thu, 5 May 2005 09:28:01 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com> Nick Coghlan wrote: > Ah, someone else did post this idea first :) I knew I was standing on the shoulders of others :) > To deal with the generator issue, one option would be to follow up on > Phillip's idea of a decorator to convert a generator (or perhaps any > standard iterator) into a block iterator. > > I think this would also do wonders for emphasising the difference > between for loops and block statements. I think if we are going to emphasise the difference, a decorator does not go far enough. To use a decorator, this *must* be valid syntax:: def gen(): try: yield finally: print 'Done!' However, that generator cannot be properly used in a for-loop. So it's only realistically valid with the decorator, and used in a block statement (resource suite ;) My feeling is that the above should be a SyntaxError, as it currently is, and that a new keyword is needed which explicitly allows the above, and creates an object conforming to the resource protocal (as I called it). Tim Delaney From shane.holloway at ieee.org Thu May 5 01:29:39 2005 From: shane.holloway at ieee.org (Shane Holloway (IEEE)) Date: Wed, 04 May 2005 17:29:39 -0600 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42792888.10209@hathawaymix.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: <42795AE3.10808@ieee.org> Shane Hathaway wrote: > For each block statement, it is necessary to create a *new* iterator, > since iterators that have stopped are required to stay stopped. So at a > minimum, used-defined statements will need to call something, and thus > will have parentheses. The parentheses might be enough to make block > statements not look like built-in keywords. Definitely true for generators. Not necessarily true for iterators in general:: class Example(object): value = 0 result = False def __iter__(self): return self def next(self): self.result = not self.result if self.result: self.value += 1 return self.value else: raise StopIteration() :: >>> e = Example() >>> list(e) [1] >>> list(e) [2] >>> list(e) [3] It might actually be workable in the transaction scenario, as well as others. I'm not sure if I love or hate the idea though. Another thing. In the specification of the Anonymous Block function, is there a reason that "itr = EXPR1" instead of "itr = iter(EXPR1)"? It seems to be a dis-symmetry with the 'for' loop specification. Thanks, -Shane (Holloway) ;) From tdelaney at avaya.com Thu May 5 01:37:18 2005 From: tdelaney at avaya.com (Delaney, Timothy C (Timothy)) Date: Thu, 5 May 2005 09:37:18 +1000 Subject: [Python-Dev] PEP 340: Breaking out. Message-ID: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com> Aahz wrote: > My standard workaround is using exceptions, but I'm not sure how that > interacts with a block: > > try: > for name in filenames: > with opened(name) as f: > if f.read(2) == 0xFEB0: > raise Found > except Found: > pass For any sane block iterator, it will work as expected. However, an evil block iterator could suppress the `Found` exception. A sane block iterator should re-raise the original exception. 
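For what it's worth, Aahz's workaround can be tried today without any block statement by letting a plain try/finally play the part of a well-behaved opened(); the Found class, the file names and the two-byte signature below are made up for illustration::

    class Found(Exception):
        """Raised purely for control flow, as in the workaround above."""

    def first_match(filenames):
        try:
            for name in filenames:
                f = open(name, 'rb')
                try:
                    if f.read(2) == '\xfe\xb0':
                        raise Found(name)
                finally:
                    f.close()       # cleanup still runs as Found passes by
        except Found, exc:
            return exc.args[0]
        return None

A block iterator that swallowed Found instead of re-raising it would silently turn every search into "no match", which is exactly the evil behaviour described above.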
Tim Delaney

From gvanrossum at gmail.com Thu May 5 01:56:41 2005 From: gvanrossum at gmail.com (Guido van Rossum) Date: Wed, 4 May 2005 16:56:41 -0700 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: Message-ID:

I'm forced by my day job to temporarily withdraw from the discussion about PEP 340 (I've used up my Python quota for the next several weeks). If agreement is reached in python-dev to suggest specific changes to the PEP, please let me know via mail sent directly to me and not cc'ed to python-dev. But please only if there is broad agreement on something.

-- --Guido van Rossum (home page: http://www.python.org/~guido/)

From tim.peters at gmail.com Thu May 5 02:29:07 2005 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 4 May 2005 20:29:07 -0400 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <427950A6.6090408@ieee.org> References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> <17017.12341.491136.991788@montanaro.dyndns.org> <427950A6.6090408@ieee.org> Message-ID: <1f7befae0505041729512ca505@mail.gmail.com>

[Shane Holloway]
> And per the PEP, I think explaining that::
>
>     try:
>         A
>     except:
>         B
>     else:
>         C
>     finally:
>         D
>
> is *exactly* equivalent to::
>
>     try:
>         try:
>             A
>         except:
>             B
>         else:
>             C
>     finally:
>         D
>
> resolved all the questions about control flow for me. Well, assuming
> that the implementation makes the explanation true. ;)

Yup! It's not unreasonable to abbreviate it, but the second form is obvious on the face of it, and can already be written. I'm neutral on adding the slightly muddier shortcut.

From edcjones at comcast.net Thu May 5 05:37:20 2005 From: edcjones at comcast.net (Edward C. Jones) Date: Wed, 04 May 2005 23:37:20 -0400 Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python Message-ID: <427994F0.6020504@comcast.net>

Recently I needed some information about the floating point numbers on my machine. So I wrote a tiny C99 program with the line

    printf("%a\n", DBL_EPSILON);

The answer was "0x1p-52". A search of comp.lang.python shows that I was not alone. Here are some ideas.

1. Add to Python the constants in "float.h" and "limits.h".

2. Add the C99 "%a" format to the "%" operator for strings and allow it in floating point literals.

3. Add full "tostring" and "fromstring" capabilities for Python numeric types. "tostring(x)" would return a string containing the binary representation of x. For example, if x is a Python float, "tostring(x)" would have eight characters. "fromstring(s, atype)" does the reverse, so fromstring(tostring(x), type(x)) == x.

4. Add some functions that process floating point types at a low level. I suggest borrowing from C:

    (mantissa, exponent) = frexp(x)

where mantissa is a float and exponent is an int. The mantissa can be 0.0 or 0.5 <= mantissa < 1.0. Also x = mantissa * 2**exponent. If x == 0.0, the function returns (0.0, 0). (This is almost a quote from Harbison and Steele.)

5. Add the C99 constants and functions involving special floating point values: "FP_INFINITE", "FP_NAN", "FP_NORMAL", "FP_SUBNORMAL", "FP_ZERO", "fpclassify", "isfinite", "isinf", "isnan", "isnormal", "signbit", "copysign", "nan", "nextafter", and "nexttoward". There has been controversy about these in the past, but I am in favor of them. The documentation should discuss portability.
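Parts of items 3 and 4 can already be approximated with the existing math and struct modules; a small sketch, assuming an IEEE-754 double platform::

    import math, struct

    m, e = math.frexp(math.pi)            # the (mantissa, exponent) pair of item 4
    assert 0.5 <= m < 1.0
    assert math.ldexp(m, e) == math.pi    # exact round trip

    # item 3's tostring/fromstring round trip, via struct:
    s = struct.pack('>d', math.pi)        # 8-byte big-endian IEEE-754 image
    assert struct.unpack('>d', s)[0] == math.pi

    # DBL_EPSILON for IEEE-754 doubles is 2**-52, the "0x1p-52" above:
    eps = 2.0 ** -52
    assert 1.0 + eps != 1.0

Items 1, 2 and 5 (the float.h/limits.h constants, "%a", and the classification functions) are the parts with no obvious pure-Python spelling today.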
From greg.ewing at canterbury.ac.nz Thu May 5 06:33:03 2005 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 05 May 2005 16:33:03 +1200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42792888.10209@hathawaymix.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: <4279A1FF.9030308@canterbury.ac.nz> Shane Hathaway wrote: > For each block statement, it is necessary to create a *new* iterator, > since iterators that have stopped are required to stay stopped. So at a > minimum, used-defined statements will need to call something, and thus > will have parentheses. Not necessarily! class Frobbing: def __neg__(self): begin_frobbing() try: yield finally: end_frobbing() frobbing = Frobbing() ... -frobbing: do_something() -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing at canterbury.ac.nz +--------------------------------------+ From shane at hathawaymix.org Thu May 5 06:36:43 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 22:36:43 -0600 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42795AE3.10808@ieee.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> <42795AE3.10808@ieee.org> Message-ID: <4279A2DB.6040504@hathawaymix.org> Shane Holloway (IEEE) wrote: > Another thing. In the specification of the Anonymous Block function, is > there a reason that "itr = EXPR1" instead of "itr = iter(EXPR1)"? It > seems to be a dis-symmetry with the 'for' loop specification. Hmm... yeah. That's strange. In fact, if it gets changed to "itr = iter(EXPR1)", as it probably ought to, all of the existing examples will continue to work. It will also be safe to start block iterators with a single variable, nullifying my argument about parentheses. So Reinhold's examples stand, except for the "try" block, since it clashes with a keyword. They read well, but when something goes wrong in the code, how would a new programmer crack these nuts? with: (...) synchronized: (...) transaction: (...) > Thanks, > -Shane (Holloway) ;) Once in a while I read a post by Shane H????way and start wondering when I wrote it and how I could've forgotten about it. And then I realize I didn't. :-) Shane From greg.ewing at canterbury.ac.nz Thu May 5 06:38:29 2005 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 05 May 2005 16:38:29 +1200 Subject: [Python-Dev] PEP 340: Only for try/finally? In-Reply-To: <740c3aec05050314544178f57f@mail.gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <740c3aec05050314544178f57f@mail.gmail.com> Message-ID: <4279A345.8050708@canterbury.ac.nz> BJ?rn Lindqvist wrote: > But why stop there? Lots of functions that takes a callable as > argument could be upgraded to use the new block syntax. Actually, this is something that occurred to me in potential support of a thunk implementation: It's possible that many functions already out there which take function arguments could *already* be used with a thunk-based block statement, without needing to be re-written at all. 
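One way to read that observation, with a made-up report_time() helper (the block spelling in the comment is hypothetical syntax, not something that runs today)::

    import time

    def report_time(label, action):
        # an existing higher-order function that knows nothing about blocks
        start = time.time()
        try:
            action()
        finally:
            print label, 'took', time.time() - start, 'seconds'

    def work():
        sum(range(1000000))

    report_time('work', work)       # how it must be spelled today

    # A thunk-based block statement could hand the same, unmodified helper
    # the indented body as its 'action' argument, something like:
    #
    #     report_time('work'):
    #         sum(range(1000000))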
-- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing at canterbury.ac.nz +--------------------------------------+ From shane at hathawaymix.org Thu May 5 06:51:17 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Wed, 04 May 2005 22:51:17 -0600 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <4279A1FF.9030308@canterbury.ac.nz> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> <4279A1FF.9030308@canterbury.ac.nz> Message-ID: <4279A645.9060004@hathawaymix.org> Greg Ewing wrote: > Shane Hathaway wrote: > >>For each block statement, it is necessary to create a *new* iterator, >>since iterators that have stopped are required to stay stopped. So at a >>minimum, used-defined statements will need to call something, and thus >>will have parentheses. > > > Not necessarily! > > class Frobbing: > > def __neg__(self): > begin_frobbing() > try: > yield > finally: > end_frobbing() > > frobbing = Frobbing() > > ... > > -frobbing: > do_something() Larry Wall would hire you in a heartbeat. ;-) Maybe there's really no way to prevent people from writing cute but obscure block statements. A keyword like "block" or "suite" would give the reader something firm to hold on to. Shane From greg.ewing at canterbury.ac.nz Thu May 5 07:12:07 2005 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 05 May 2005 17:12:07 +1200 Subject: [Python-Dev] PEP 340 -- Clayton's keyword? In-Reply-To: <42792888.10209@hathawaymix.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: <4279AB27.3010405@canterbury.ac.nz> How about user-defined keywords? Suppose you could write statement opening def opening(path, mode): f = open(path, mode) try: yield finally: close(f) which would then allow opening "myfile", "w" as f: do_something_with(f) The 'statement' statement declares to the parser that an identifier is to be treated as a keyword introducing a block statement when it appears as the first token in a statement. This would allow keywordless block-statements that look very similar to built-in statements, without any danger of forgetting to make a function call, since a call would be implicit in all such block-statements. A 'statement' declaration would be needed in all modules which use the generator, e.g. statement opening from filestuff import opening For convenience, this could be abbreviated to from filestuff import statement opening There could also be an abbreviation def statement opening(...): ... for when you're defining and using it in the same module. Sufficiently smart editors would understand the 'statement' declarations and highlight accordingly, making these user-defined statements look even more like the native ones. -- Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg.ewing at canterbury.ac.nz +--------------------------------------+ From jbone at place.org Thu May 5 07:33:23 2005 From: jbone at place.org (Jeff Bone) Date: Thu, 5 May 2005 00:33:23 -0500 Subject: [Python-Dev] PEP 340 -- Clayton's keyword? 
In-Reply-To: <4279AB27.3010405@canterbury.ac.nz> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> <4279AB27.3010405@canterbury.ac.nz> Message-ID: <2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org> +1 This is awesome. BTW, did we totally abandon the question of using block: as RHS? jb On May 5, 2005, at 12:12 AM, Greg Ewing wrote: > How about user-defined keywords? > > Suppose you could write > > statement opening > > def opening(path, mode): > f = open(path, mode) > try: > yield > finally: > close(f) > > which would then allow > > opening "myfile", "w" as f: > do_something_with(f) > > The 'statement' statement declares to the parser that an > identifier is to be treated as a keyword introducing a > block statement when it appears as the first token in a > statement. > > This would allow keywordless block-statements that look > very similar to built-in statements, without any danger of > forgetting to make a function call, since a call would > be implicit in all such block-statements. > > A 'statement' declaration would be needed in all modules > which use the generator, e.g. > > statement opening > from filestuff import opening > > For convenience, this could be abbreviated to > > from filestuff import statement opening > > There could also be an abbreviation > > def statement opening(...): > ... > > for when you're defining and using it in the same module. > > Sufficiently smart editors would understand the 'statement' > declarations and highlight accordingly, making these > user-defined statements look even more like the native > ones. > > -- > Greg Ewing, Computer Science Dept, > +--------------------------------------+ > University of Canterbury, | A citizen of NewZealandCorp, > a | > Christchurch, New Zealand | wholly-owned subsidiary of USA > Inc. | > greg.ewing at canterbury.ac.nz > +--------------------------------------+ > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > http://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: http://mail.python.org/mailman/options/python-dev/jbone > %40place.org > From shane at hathawaymix.org Thu May 5 08:18:38 2005 From: shane at hathawaymix.org (Shane Hathaway) Date: Thu, 05 May 2005 00:18:38 -0600 Subject: [Python-Dev] [OT] Re: PEP 340 -- Clayton's keyword? In-Reply-To: <2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> <4279AB27.3010405@canterbury.ac.nz> <2BA5DA3C-BF3D-4B6B-A53B-EAFDCDF8CAE8@place.org> Message-ID: <4279BABE.6040208@hathawaymix.org> Just a little offtopic note to Jeff Bone: Jeff, every time I send a message to Python-Dev, your "Mail.app 2.0" sends me a nasty auto-reply that I can't quote in public. Please stop. Since I can't seem to reach you by email, I'm trying to reach you through this mailing list. The note refers to something about "Shantar"; maybe that will help you figure out what's wrong. Shane From martin at v.loewis.de Thu May 5 08:39:54 2005 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 05 May 2005 08:39:54 +0200 Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python In-Reply-To: <427994F0.6020504@comcast.net> References: <427994F0.6020504@comcast.net> Message-ID: <4279BFBA.2020608@v.loewis.de> Edward C. Jones wrote: > The documentation should discuss portability. 
This is the critical issue here. Discussing portability is not enough; these features really ought to be either available on a majority of the installations, or not available at all. In particular, they would need to be available on Windows. I haven't check whether VC 7.1 provides them, and if it doesn't, somebody would have to provide a "direct" implementation. I'd say "contributions are welcome". Regards, Martin From python-dev at zesty.ca Thu May 5 08:59:42 2005 From: python-dev at zesty.ca (Ka-Ping Yee) Date: Thu, 5 May 2005 01:59:42 -0500 (CDT) Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com> References: <338366A6D2E2CA4C9DAEAE652E12A1DE025204E5@au3010avexu1.global.avaya.com> Message-ID: On Thu, 5 May 2005, Delaney, Timothy C (Timothy) wrote: > Aahz wrote: > > My standard workaround is using exceptions, but I'm not sure how that > > interacts with a block: > > > > try: > > for name in filenames: > > with opened(name) as f: > > if f.read(2) == 0xFEB0: > > raise Found > > except Found: > > pass > > For any sane block iterator, it will work as expected. However, an evil > block iterator could suppress the `Found` exception. I was thinking about more use cases for the block statement, and one of the ideas was an exception-logging block: def logged_exceptions(file): try: yield except Exception, value: file.write(repr(value) + '\n') block logged_exceptions(file): do stuff do stuff do stuff ...but then i wasn't sure whether this was supposed to be possible in the proposed scheme. Currently, generators do not catch exceptions raised in the code that they yield values to, because the target of the yield is in a higher stack frame. This makes sense from a language design perspective, since there is no try...finally construct lexically wrapping the thing that raises the exception. In current Python, for example, this says 'caught outside generator': def spam_generator(): try: yield 'spam' except ValueError, value: print 'caught inside generator' try: g = spam_generator() i = g.next() raise ValueError(5) except ValueError, value: print 'caught outside generator' But now i'm confused. Tim's words above seem to suggest that the interior generator could actually catch exceptions raised outside, by whatever is on the other end of the yield. So, could a block statement really catch that exception inside? I think it might be too confusing if it were possible. -- ?!ng From jcarlson at uci.edu Thu May 5 09:27:54 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Thu, 05 May 2005 00:27:54 -0700 Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python In-Reply-To: <427994F0.6020504@comcast.net> References: <427994F0.6020504@comcast.net> Message-ID: <20050505000111.64BD.JCARLSON@uci.edu> "Edward C. Jones" wrote: > 3. Add full "tostring" and "fromstring" capabilities for Python numeric > types. "tostring(x)" would return a string containing the binary > representation of x. For example, if x is a Python float, "tostring(x)" > would have eight characters. "fromstring(s, atype)" does the reserve, so > fromstring(tostring(x), type(x)) == x For floats: struct.pack("d",...) struct.unpack("d",...) For 32-bit signed integers: struct.pack("l",...) struct.unpack("l",...) For 64 bit signed integers: struct.pack("Q",...) struct.unpack("Q",...) Heck, you can even get big-endian output on little-endian machines (or vv.) if you want! Or you can toss the signs on the integers, get shorts, or even chars. 
Python already has such functionality in the standard library, though perhaps it isn't the most aptly named (being in 'struct' rather than a 'value_packing' module...though its functionality was for packing and unpacking of c-style structs...). Alternatively, you can create an array (using similar typecodes), and use the .tostring() and .fromstring() mechanism. > 4. Add some functions that process floating point types at a low level. > I suggest borrowing from C > (mantissa, exponent) = frexp(x) What about the sign? Here's an implementation for you that includes the sign... def frexp(f): if not isinstance(f, float): raise TypeError, "Requires float argument" v, = struct.unpack(">Q", struct.pack(">d", f)) #we ignore denormalized values, NANs, and infinities... return v>>63, 1 + (v&(2**52-1))/(2.0**52), ((v>>52)&2047)-1023 Is that enough? Or did you want to convert back into a float? def inv_frexp(sign, mantissa, exponent): #I don't know what this is normally called in C... v = bool(sign)*2**63 v += (abs(int(exponent+1023))&2047)*2**52 v += abs(int(((mantissa-1)*2**52)))&(2**52-1) f, = struct.unpack(">d", struct.pack(">Q", v)) return f Yeah, there's some bit work in there, and some is merely protection against foolish inputs, but it's not that bad. - Josiah From t-meyer at ihug.co.nz Thu May 5 09:29:02 2005 From: t-meyer at ihug.co.nz (Tony Meyer) Date: Thu, 5 May 2005 19:29:02 +1200 Subject: [Python-Dev] python-dev Summary for 2005-04-16 through 2005-04-30 [draft] Message-ID: Here's April Part Two. If anyone can take their eyes of the anonymous block threads for a moment and give this a once-over, that would be great! Please send any corrections or suggestions to Tim (tlesher at gmail.com), Steve (steven.bethard at gmail.com) and/or me, rather than cluttering the list. Ta! ====================== Summary Announcements ====================== --------------- Exploding heads --------------- After a gentle introduction for our first summary, python-dev really let loose last fortnight; not only with the massive PEP 340 discussion, but also more spin-offs than a `popular`_ `TV`_ `series`_, and a few stand-alone threads. Nearly a week into May, and the PEP 340 talk shows no sign of abating; this is unfortunate, since Steve's head may explode if he has to write anything more about anonymous blocks. Just as well there are three of us! .. _popular: http://imdb.com/title/tt0060028/ .. _TV: http://imdb.com/title/tt0098844/ .. _series: http://imdb.com/title/tt0247082/ [TAM] ------- PEP 340 ------- A request for anonymous blocks by Shannon -jj Behrens launched a massive discussion about a variety of related ideas. This discussion is split into different sections for the sake of readability, but as the sections are extracted from basically the same discussion, it may be easiest to read them in the following order: 1. `Localized Namespaces`_ 2. `The Control Flow Management Problem`_ 3. `Block Decorators`_ 4. `PEP 310 Updates Requested`_ 5. `Sharing Namespaces`_ 6. `PEP 340 Proposed`_ [SJB] ========= Summaries ========= -------------------- Localized Namespaces -------------------- Initially, the "anonymous blocks" discussion focused on introducing statement-local namespaces as a replacement for lambda expressions. This would have allowed localizing function definitions to a single namespace, e.g.:: foo = property(get_foo) where: def get_foo(self): ... where get_foo is only accessible within the ``foo = ...`` assignment statement. 
However, this proposal seemed mainly to be motivated by a desire to avoid "namespace pollution", an issue which Guido felt was not really that much of a problem. Contributing threads: - `anonymous blocks `__ [SJB] ----------------------------------- The Control Flow Management Problem ----------------------------------- Guido suggested that if new syntax were to be introduced for "anonymous blocks", it should address the more important problem of being able to extract common patterns of control flow. A very typical example of such a problem, and thus one of the recurring examples in the thread, is that of a typical acquire/release pattern, e.g.:: lock.acquire() try: CODE finally: lock.release() Guido was hoping that syntactic sugar and an appropriate definition of locking() could allow such code to be written as:: locking(lock): CODE where locking() would factor out the acquire(), try/finally and release(). For such code to work properly, ``CODE`` would have to execute in the enclosing namespace, so it could not easily be converted into a def-statement. Some of the suggested solutions to this problem: - `Block Decorators`_ - `PEP 310 Updates Requested`_ - `Sharing Namespaces`_ - `PEP 340 Proposed`_ Contributing threads: - `anonymous blocks `__ [SJB] ---------------- Block Decorators ---------------- One of the first solutions to `The Control Flow Management Problem`_ was "block decorators". Block decorators were functions that accepted a "block object" (also referred to in the thread as a "thunk"), defined a particular control flow, and inserted calls to the block object at the appropriate points in the control flow. Block objects would have been much like function objects, in that they encapsulated a sequence of statements, except that they would have had no local namespace; names would have been looked up in their enclosing function. Block decorators would have wrapped sequences of statements in much the same way as function decorators wrap functions today. "Block decorators" would have allowed locking() to be written as:: def locking(lock): def block_deco(block): lock.acquire() try: block() finally: lock.release() return block_deco and invoked as:: @locking(lock): CODE The implementation of block objects would have been somewhat complicated if a block object was a first class object and could be passed to other functions. This would have required all variables used in a block object to be "cells" (which provide slower access than normal name lookup). Additionally, first class block objects, as a type of callable, would have confused the meaning of the return statement - should the return exit the block or the enclosing function? Contributing threads: - `anonymous blocks `__ - `Anonymous blocks: Thunks or iterators? `__ [SJB] ------------------------- PEP 310 Updates Requested ------------------------- Another suggested solution to `The Control Flow Management Problem`_ was the resuscitation of `PEP 310`_, which described a protocol for invoking the __enter__() and __exit__() methods of an object at the beginning and ending of a set of statements. This PEP was originally intended mainly to address the acquisition/release problem, an example of which is discussed in `The Control Flow Management Problem`_ as the locking() problem. 
Unmodified, `PEP 310`_ could handle the locking() problem defining locking() as:: class locking(object): def __init__(self, lock): self.lock = lock def __enter__(self): self.lock.acquire() def __exit__(self): self.lock.release() and invoking it as:: with locking(lock): CODE In addition, an extended version of the `PEP 310`_ protocol which augmented the __enter__() and __exit__() methods with __except__() and __else__() methods provided a simple syntax for some of the transactional use cases as well. Contributing threads: - `PEP 310 and exceptions: `__ - `__except__ use cases: `__ - `Integrating PEP 310 with PEP 340 `__ .. _PEP 310: http://www.python.org/peps/pep-0310.html [SJB] ------------------ Sharing Namespaces ------------------ Jim Jewett suggested that `The Control Flow Management Problem`_ could be solved in many cases by allowing arbitrary blocks of code to share namespaces with other blocks of code. As injecting such arbitrary code into a template has been traditionally done in other languages with compile-time macros, this thread briefly discussed some of the reasons for not wanting compile-time macros in Python, most importantly, that Python's compiler is "too dumb" to do much at compile time. The discussion then moved to runtime "macros", which would essentially inject code into functions at runtime. The goal here was that the injected code could share a namespace with the function into which it was injected. Jim Jewett proposed a strawman implementation that would mark includable chunks of code with a "chunk" keyword, and require these chunks to be included using an "include" keyword. The major problems with this approach were that names could "magically" appear in a function after including a chunk, and that functions that used an include statement would have dramatically slower name lookup as Python's lookup optimizations rely on static analysis. Contributing threads: - `defmacro (was: Anonymous blocks): `__ - `anonymous blocks vs scope-collapse: `__ - `scope-collapse: `__ - `anonymous blocks as scope-collapse: detailed proposal `__ [SJB] ---------------- PEP 340 Proposed ---------------- In the end, Guido decided that what he really wanted as a solution to `The Control Flow Management Problem`_ was the simplicity of something like generators that would let him write locking() as something like:: def locking(lock): lock.acquire() try: yield finally: lock.release() and invoke it as something like:: block locking(lock): CODE where the yield statement indicates the return of control flow to ``CODE``. Unlike a generator in a normal for-loop, the generator in such a "block statement" would need to guarantee that the finally block was executed by the end of the block statement. For this reason, and the fact that many felt that overloading for-loops for such non-loops might be confusing, Guido suggested introducing "block statements" as a new separate type of statement. `PEP 340`_ explains the proposed implementation. Essentially, a block-statement takes a block-iterator object, and calls its __next__() method in a while loop until it is exhausted, in much the same way that a for-statement works now. However, for generators and well-behaved block-iterators, the block-statement guarantees that the block-iterator is exhausted, by calling the block-iterator's __exit__() method. Block-iterator objects can also have values passed into them using a new extended continue statement. 
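Very roughly, the loop described above can be pictured as follows; this is a much-simplified sketch rather than the PEP's actual expansion: the block's body is modelled here as a callable, and the value passing (the extended continue) and the exception details that make up most of the PEP are left out::

    def run_block(blk, body):
        while True:
            try:
                value = blk.__next__()     # the block-iterator's next value
            except StopIteration:
                break                      # iterator finished normally
            try:
                body(value)
            except:
                blk.__exit__()             # let the iterator run its cleanup
                raise                      # before the exception escapes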
Several people were especially excited about this prospect as it seemed very useful for a variety of event-driven programming techniques. A few issues still remained unresolved at the time this summary was written: Should the block-iterator protocol be distinct from the current iterator protocol? Phillip J. Eby campaigned for this, suggesting that by treating them as two separate protocols, block-iterators could be prevented from accidentally being used in for-loops where their cleanup code would be silently omitted. Having two separate protocols would also allow objects to implement both protocols if appropriate, e.g. perhaps sometimes file objects should close themselves, and sometimes they shouldn't. Guido seemed inclined to instead merge the two protocols, allowing block-iterators to be used in both for-statements and block-statements, saying that the benefits of having two very similar but distinct protocols were too small. What syntax should block-statements use? This one was still undecided, with dozens of different keywords having been proposed. Syntaxes that did not seem to have any major detractors at the time this summary was written: * ``EXPR as NAME: BLOCK`` * ``@EXPR as NAME: BLOCK`` * ``in EXPR as NAME: BLOCK`` * ``block EXPR as NAME: BLOCK`` * ``finalize EXPR as NAME: BLOCK`` Contributing threads: - `anonymous blocks `__ - `next(arg) `__ - `PEP 340 - possible new name for block-statement `__ - `PEP 340: syntax suggestion - try opening(filename) as f: `__ - `Keyword for block statements `__ - `About block statement name alternative `__ - `Integrating PEP 310 with PEP 340 `__ .. _PEP 340: http://www.python.org/peps/pep-0340.html [SJB] ------------------ A switch statement ------------------ The switch statement (a la `PEP 275`_) came up again this month, as a spin-off (wasn't everything?) of the `PEP 340`_ discussion. The main benefit of such a switch statement is avoiding Python function calls, which are very slow compared to branching to inlined Python code. In addition, repetition (the name of the object being compared, and the comparison operator) is avoided. Although Brian Beck contributed a `switch cookbook recipe`_, the discussion didn't progress far enough to make any changes or additions to `PEP 275`_. Contributing threads: - `switch statement `__ .. _PEP 275: http://www.python.org/peps/pep-0275.html .. _switch cookbook recipe: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/410692 [TAM] ---------------- Pattern Matching ---------------- Discussion about a dictionary-lookup switch statement branched out to more general discussion about pattern matching statements, like Ocaml's "match", and Haskell's case statement. There was reasonable interest in pattern matching, but since types are typically a key feature of pattern matching, and many of the elegant uses of pattern matching use recursion to traverse data structures, both of which hinder coming up with a viable Python implementation (at least while CPython lacks tail-recursion elimination). The exception to this is matching strings, where regular expressions provide a method of pattern specification, and multi-way branches based on string content is common. As such, it seems unlikely that any proposals for new pattern matching language features will be made at this time. 
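For reference, the dictionary-lookup dispatch that both of the sections above refer to usually takes a shape like the sketch below (a generic illustration, not Brian Beck's actual recipe; note that it still pays the function-call cost that a true switch statement would avoid)::

    def handle_get(path):
        return 'GET ' + path

    def handle_put(path):
        return 'PUT ' + path

    dispatch = {'GET': handle_get, 'PUT': handle_put}

    def process(method, path):
        try:
            handler = dispatch[method]
        except KeyError:
            return 'unsupported method'     # the default branch
        return handler(path)

    print process('GET', '/index.html')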
Contributing threads: - `switch statement `__ [TAM] ----------------------------------------- Read-only property access inconsistencies ----------------------------------------- Barry Warsaw noticed that the exception thrown for read-only properties of C extension types is a TypeError, while for read-only properties of Python (new-style) classes is an AttributeError, and `wrote a patch`_ to clean up the inconsistency. Guido pronounced that this was an acceptable fix for 2.5, and so the change was checked in. Along the way, he wondered whether in the long-term, AttributeError should perhaps inherit from TypeError. Contributing threads: - `Inconsistent exception for read-only properties? `__ .. _wrote a patch: http://sourceforge.net/tracker/index.php?func=detail&aid=1184449&group_id=54 70&atid=105470 [TAM] -------------------- site.py enhancements -------------------- Bob Ippolito asked for review of a `patch to site.py`_ to solve three deficiencies for Python 2.5: - It is no longer true that all site dirs must exist on the file system - The directories added to sys.path by .pth files are not scanned for further .pth files - .pth files cannot use os.path.expanduser() Greg Ewing suggested additionally scanning the directory containing the main .py file for .pth files to make it easier to have collections of Python programs sharing a common set of modules. Possibly swamped by the `PEP 340`_ threads, further discussion trailed off, so it's likely that Bob's patch will be applied, but without Greg's proposal. Contributing threads: - `site enhancements (request for review) `__ .. _patch to site.py: http://python.org/sf/1174614 [TAM] ------------------------------------- Passing extra arguments when building ------------------------------------- Martin v. L?wis helped Brett C out with adding some code to facilitate a Py_COMPILER_DEBUG build for use on the AST branch. Specifically, an EXTRA_CFLAGS environment variable was added to the build process, to enable passing additional arguments to the compiler, that aren't modified by configure, and that change binary compatibility. Contributing threads: - `Proper place to put extra args for building `__ [TAM] --------------------------- zipfile module improvements --------------------------- Bob Ippolito pointed out that the "2GB bug" that was `supposed to be fixed`_ was not, and opened a `new bug and patch`_ that should fix the issue correctly, as well as a bug that sets the ZipInfo record's platform to Windows, regardless of the actual platform. He suggested that someone should consider rewriting the zipfile module to include a repair feature, and be up-to-date with `the latest specifications`_, and include support for Apple's zip format extensions. Charles Hartman added deleting a file from a zipfile to this list. Bob suggested that one of the most useful additions to the zipfile module would be a stream interface for reading and writing (a `patch to read large items`_ along these lines already exists). Guido liked this idea and suggested that Bob rework the zipfile module, if possible, for Python 2.5. Contributing threads: - `zipfile still has 2GB boundary bug `__ .. _supposed to be fixed: http://python.org/sf/679953 .. _new bug and patch: http://python.org/sf/1189216 .. _the latest specifications: http://www.pkware.com/company/standards/appnote/ .. 
_patch to read large items: http://python.org/sf/1121142 [TAM] ------------------------------------- super_getattro() and attribute lookup ------------------------------------- Phil Thompson asked a number of questions about super_getattro(), and attribute lookup. Michael Hudson answered these, and pointed out that there has been `some talk`_ of having a tp_lookup slot in typeobjects. However, he noted that he has many other pots on the boil at the moment, so is unlikely to work on it soon. Contributing threads: - `super_getattro() Behaviour `__ .. _some talk: http://mail.python.org/pipermail/python-dev/2005-March/052150.html [TAM] -------------------------------- Which objects are memory cached? -------------------------------- Facundo Batista asked about memory caching of objects (for performance reasons), to aid in explaining how to think using name/object and not variable/value. In practice, ints between -5 and 100 are cached, 1-character strings are often cached, and string literals that resemble Python identifiers are often interned. It was noted that the reference manual specifies that immutables *may* be cached, but that CPython specifics, such as which objects are, are omitted so that people will not think of them as fixed; Terry Reedy reiterated his suggestion that implementation details such as this are documented separately, elsewhere. Guido and Greg Ewing pointed out that when explaining it is important for the explainees to understand that mutable objects are never in danger of being shared. Contributing threads: - `Caching objects in memory `__ [TAM] ------------------------------ Unregistering atexit functions ------------------------------ Nick Jacobson noted that while you can mark functions to be called with at exit with the 'register' method, there's no 'unregister' method to remove them from the stack of functions to be called. Many suggestions were made about how this could already be done, however, including using try/finally, writing the cleanup routines in such a way that they could detect reentry, passing a class to register(), and managing a list of functions oneself. General pearls of wisdom outlined included: - if one devotes time to "making a case", then one should also devote equal effort to researching the hazards and API issues. - "Potentially useful" is usually trumped by "potentially harmful". - if the API is awkward or error-prone, that is a bad sign. Contributing threads: - `atexit missing an unregister method `__ [TAM] ----------------------- Pickling buffer objects ----------------------- Travis Oliphant proposed a patch to the pickle module to allow pickling of the buffer object. His use case was avoiding copying array data into a string before writing to a file in Numeric, while still having Numeric arrays interact seamlessly with other pickleable types. The proposal was to unpickle the object as a string (since a `bytes object`_ does not exist) rather than to mutable-byte buffer objects, to maintain backwards compatibility. Other than a couple of questions, no objections were raised, so a patch is likely to appear. Contributing threads: - `Pickling buffer objects. `__ .. _bytes object: http://python.org/peps/pep-0296.html [TAM] =============== Skipped Threads =============== - `Another Anonymous Block Proposal `__ - `PEP 340: What is "ret" in block statement semantics? 
`__ - `anonymous blocks (off topic: match) `__ - `Reference counting when entering and exiting scopes `__ - `How do you get yesterday from a time object `__ - `shadow password module (spwd) is never built due to error in setup.py `__ - `Problem with embedded python `__ - `python.org crashing Mozilla? `__ - `noob question regarding the interpreter `__ - `Newish test failures `__ - `Error checking in init functions `__ - `a few SF bugs which can (probably) be closed `__ - `Check out a new way to read threaded conversations. `__ - `Python 2.1 in HP-UX `__ - `Python tests fails on HP-UX 11.11 and core dumps `__ - `IPV6 with Python- 4.2.1 on HPUX `__ - `Fwd: CFP: DLS05: ACM Dynamic Languages Symposium `__ - `PyCon 2005 keynote on-line `__ - `PyCallable_Check redeclaration `__ - `os.urandom uses closed FD (sf 1177468) `__ - `Removing --with-wctype-functions support `__ From jcarlson at uci.edu Thu May 5 09:53:58 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Thu, 05 May 2005 00:53:58 -0700 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <42791C50.3090107@hathawaymix.org> Message-ID: <20050505004244.64C0.JCARLSON@uci.edu> Ka-Ping Yee wrote: > > On Wed, 4 May 2005, Shane Hathaway wrote: > > > > for name in filenames: > > opening(name) as f: > > if f.read(2) == 0xFEB0: > > break for > > continue with 2 > There is something about that I just don't like. I can't really put my finger on it right now, so perhaps it is merely personal aesthetics. I'll sleep on it and see if I can come up with a good reason why I don't like it. - Josiah From jcarlson at uci.edu Thu May 5 10:10:53 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Thu, 05 May 2005 01:10:53 -0700 Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python In-Reply-To: <20050505000111.64BD.JCARLSON@uci.edu> References: <427994F0.6020504@comcast.net> <20050505000111.64BD.JCARLSON@uci.edu> Message-ID: <20050505010955.64C3.JCARLSON@uci.edu> Josiah Carlson wrote: unsigned vvvvvv > For 64 bit signed integers: > struct.pack("Q",...) > struct.unpack("Q",...) My fingers were typing too fast (I do much work with unsigned 64 bit integers, but not much with unsigned ones). - Josiah From mwh at python.net Thu May 5 10:17:30 2005 From: mwh at python.net (Michael Hudson) Date: Thu, 05 May 2005 09:17:30 +0100 Subject: [Python-Dev] Adding DBL_MANTISSA and such to Python In-Reply-To: <427994F0.6020504@comcast.net> (Edward C. Jones's message of "Wed, 04 May 2005 23:37:20 -0400") References: <427994F0.6020504@comcast.net> Message-ID: <2mzmvaw0h1.fsf@starship.python.net> "Edward C. Jones" writes: > Recently I needed some information about the floating point numbers on > my machine. So I wrote a tiny C99 program with the line > > printf("%a\n", DBL_EPSILON); > > The answer was "0x1p-52". > > A search of comp.lang.python shows that I was not alone. Here are some > ideas. > > 1. Add to Python the constants in "float.h" and "limits.h". Where? > 2. Add the C99 "%a" format to the "%" operator for strings and allow it > in floating point literals. Is there an implementation of this somewhere? We mostly certainly are not demanding a C99 compiler yet. > 3. Add full "tostring" and "fromstring" capabilities for Python numeric > types. "tostring(x)" would return a string containing the binary > representation of x. For example, if x is a Python float, "tostring(x)" > would have eight characters. "fromstring(s, atype)" does the reserve, so > fromstring(tostring(x), type(x)) == x We have this already in the struct module. 
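(As a quick, purely illustrative round trip with the existing module --
the format code and values below are just examples:)

    >>> import struct
    >>> s = struct.pack("<d", 1.5)    # 8-byte little-endian IEEE-754 double
    >>> len(s)
    8
    >>> struct.unpack("<d", s)[0] == 1.5
    True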
I have a patch that should improve the robustness of these functions on IEEE-754 platforms in the face of special values that you can review if you like: http://python.org/sf/1181301 (my not-so-recent anguished "will one of you bastards please review this and/or 1180995 for me?" still applies, btw) > 4. Add some functions that process floating point types at a low level. > I suggest borrowing from C > (mantissa, exponent) = frexp(x) > where mantissa is a float and exponent is an int. The mantissa can be > 0.0 or 0.5 <= mantissa < 1.0. Also x = mamtissa * 2**exponent. If > x == 0.0, the function returns (0.0, 0). (This is almost a quote from > Harbison and Steele.) >>> math.frexp(math.pi) (0.78539816339744828, 2) What am I missing? > 5. Add the C99 constants and functions involving special floating point > values: "FP_INFINITE", "FP_NAN", "FP_NORMAL", "FP_SUBNORMAL", "FP_ZERO", > "fpclassify", "isfinite", "isinf", "isnan", "isnormal", "signbit", > "copysign", "nan", "nextafter", and "nexttoward". There has been > controversy about these in the past, but I am in favor of them. The > documentation should discuss portability. If you can supply a patch to make all the compilers out there behave with respect to these functions, I'll be impressed (they seem to exist on Mac OS X 10.3, dunno if they work though :). Cheers, mwh -- If you're talking "useful", I'm not your bot. -- Tim Peters, 08 Nov 2001 From mwh at python.net Thu May 5 10:10:17 2005 From: mwh at python.net (Michael Hudson) Date: Thu, 05 May 2005 09:10:17 +0100 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <427950A6.6090408@ieee.org> (Shane Holloway's message of "Wed, 04 May 2005 16:45:58 -0600") References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> <17017.12341.491136.991788@montanaro.dyndns.org> <427950A6.6090408@ieee.org> Message-ID: <2m4qdixfdi.fsf@starship.python.net> "Shane Holloway (IEEE)" writes: > And per the PEP, I think the explaining that:: > > try: > A > except: > B > else: > C > finally: > D > > is *exactly* equivalent to:: > > try: > try: > A > except: > B > else: > C > finally: > D > > Resolved all the questions about control flow for me. Well, yes, that makes sense, but also raises a small "and the point is...?" flag in my head. Cheers, mwh -- This is the fixed point problem again; since all some implementors do is implement the compiler and libraries for compiler writing, the language becomes good at writing compilers and not much else! -- Brian Rogoff, comp.lang.functional From fredrik at pythonware.com Thu May 5 10:11:32 2005 From: fredrik at pythonware.com (Fredrik Lundh) Date: Thu, 5 May 2005 10:11:32 +0200 Subject: [Python-Dev] PEP 340: propose to get rid of 'as' keyword References: <1115226841.7909.24.camel@localhost><42791DCB.5050703@hathawaymix.org> <1115238075.10836.4.camel@emperor> Message-ID: Gustavo J. A. M. Carneiro wrote: > In that case, > > block VAR1 in EXPR1: > BLOCK1 > > And now I see how using 'for' statements (perhaps slightly changed) > turned up in the discussion. you're moving through this discussion exactly backwards; the current proposal stems from the observation that "for-loop plus generators" in today's Python does in fact provide a block implementation that solves many use cases in an elegant way. PEP 340 builds on this, sorts out a couple of weak points in the current design, and adds an elegant syntax for most remaining use cases. 
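(for readers who haven't followed that far back, the idiom being referred
to looks roughly like this in today's Python -- the helper name is made
up for the example:)

    def opening(name):
        f = open(name)
        yield f          # the "block" body runs during this iteration
        f.close()        # reached only if the loop finishes normally

    for f in opening("/etc/passwd"):
        for line in f:
            print line.rstrip()

the obvious weak point being that there is no way to guarantee the close
on an exception or an early break, since yield isn't currently allowed
inside try/finally.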
From python at rcn.com Thu May 5 11:53:12 2005 From: python at rcn.com (Raymond Hettinger) Date: Thu, 5 May 2005 05:53:12 -0400 Subject: [Python-Dev] my first post: asking about a "decorator" module In-Reply-To: <4edc17eb0505042347a9d02be@mail.gmail.com> Message-ID: <002a01c55158$64275b80$11bd2c81@oemcomputer> > > Ultimately, some of these will likely end-up in the library. For the > > time being, I think it best that these get posted and evolve either as > > Wiki entries or as ASPN entries. The best practices and proven winners > > have yet to emerge. Solidifying first attempts is likely not a good > > idea. Putting tools in the standard library should be the last > > evolutionary step, not the first. > > Yes, of course. I just wanted to know it there was interest on the > subject. Yes, there has been quite a bit of interest including several ASPN recipes and a wiki: http://www.python.org/moin/PythonDecoratorLibrary Raymond From ncoghlan at gmail.com Thu May 5 12:44:25 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 20:44:25 +1000 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4278D7D7.2040805@gmail.com> Message-ID: <4279F909.6030206@gmail.com> Steven Bethard wrote: > Makes me wonder if we shouldn't just return to the __enter__() and > __exit__() names of PEP 310[1] where for a generator __enter__() is > just an alias for next(). We could even require Phillip J. Eby's > "blockgenerator" decorator to rename next() to __enter__(), and add > the appropriate __exit__() method. You must be reading my mind or something. . . Unless there is something in today's 80-odd messages to make it redundant, look for a post entitled something like "Minimalist PEP 340 (aka PEP 310 redux)" Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Thu May 5 12:55:14 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 20:55:14 +1000 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> Message-ID: <4279FB92.5050501@gmail.com> Alex Martelli wrote: > Looking for a file with a certain magicnumber in its 1st two bytes...? > > for name in filenames: > opening(name) as f: > if f.read(2) == 0xFEB0: break > > This does seem to make real-life sense to me... Also consider the vast semantic differences between: locking(lock): for item in items: if can_handle(item): break for item in items: locking(lock): if can_handle(item): break Instead of simply acquiring and releasing the lock on each iteration as one might expect, moving to the latter version *also* causes every item to be checked, instead of only items up to the first one that can be handled. The break magically becomes meaningless. How does this even come close to executable pseudocode? I also think another factor is that currently, instead of doing try/finally's in loops, there is a tendency to push the try/finally into a function, then call that function inside the loop. 
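Something like this (reusing the names from the locking example above,
purely for illustration):

    def check_item(lock, item):
        lock.acquire()
        try:
            return can_handle(item)
        finally:
            lock.release()

    for item in items:
        if check_item(lock, item):
            break   # plain old for-loop break - no ambiguity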
The introduction of block statements means that a number of those inner functions are likely to be handled as block statements instead - with the above highly confusing result. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Thu May 5 13:00:59 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 21:00:59 +1000 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: <42795AE3.10808@ieee.org> References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> <42795AE3.10808@ieee.org> Message-ID: <4279FCEB.4020207@gmail.com> Shane Holloway (IEEE) wrote: > It might actually be workable in the transaction scenario, as well as > others. I'm not sure if I love or hate the idea though. Given that this is officially a violation of the iterator protocol. . . (check the docs for well-behaved iterators) > Another thing. In the specification of the Anonymous Block function, is > there a reason that "itr = EXPR1" instead of "itr = iter(EXPR1)"? It > seems to be a dis-symmetry with the 'for' loop specification. Indeed - and a deliberate one, at least partly to discourage caching of block iterators. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ronaldoussoren at mac.com Thu May 5 13:13:58 2005 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Thu, 5 May 2005 13:13:58 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <4279FB92.5050501@gmail.com> References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4279FB92.5050501@gmail.com> Message-ID: On 5-mei-2005, at 12:55, Nick Coghlan wrote: > Alex Martelli wrote: > >> Looking for a file with a certain magicnumber in its 1st two >> bytes...? >> >> for name in filenames: >> opening(name) as f: >> if f.read(2) == 0xFEB0: break >> >> This does seem to make real-life sense to me... >> > > Also consider the vast semantic differences between: > > locking(lock): > for item in items: > if can_handle(item): break > > for item in items: > locking(lock): > if can_handle(item): break > > > Instead of simply acquiring and releasing the lock on each > iteration as one > might expect, moving to the latter version *also* causes every item > to be > checked, instead of only items up to the first one that can be > handled. The > break magically becomes meaningless. How does this even come close > to executable > pseudocode? What's bothering me about the proposed semantics is that block statement behaves like a loop while most use cases do no looping whatsoever. Furthermore the it doesn't feel like loop either. In all three examples on this page I'd assume that the break would break out of the for loop. 
Ronald From ncoghlan at gmail.com Thu May 5 13:46:25 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 21:46:25 +1000 Subject: [Python-Dev] PEP 340 -- concept clarification In-Reply-To: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com> References: <338366A6D2E2CA4C9DAEAE652E12A1DE72127A@au3010avexu1.global.avaya.com> Message-ID: <427A0791.7000707@gmail.com> Delaney, Timothy C (Timothy) wrote: > Nick Coghlan wrote: > I think if we are going to emphasise the difference, a decorator does > not go far enough. To use a decorator, this *must* be valid syntax:: > > def gen(): > try: > yield > finally: > print 'Done!' > > However, that generator cannot be properly used in a for-loop. So it's > only realistically valid with the decorator, and used in a block > statement (resource suite ;) > > My feeling is that the above should be a SyntaxError, as it currently > is, and that a new keyword is needed which explicitly allows the above, > and creates an object conforming to the resource protocal (as I called > it). I think adding __exit__ and __del__ methods to generators will suffice - for a normal generator, it *will* get cleaned up eventually. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From ncoghlan at gmail.com Thu May 5 14:32:59 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 05 May 2005 22:32:59 +1000 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <2m4qdixfdi.fsf@starship.python.net> References: <1f7befae05050411416c198c54@mail.gmail.com> <1f7befae05050413141e1128dc@mail.gmail.com> <17017.12341.491136.991788@montanaro.dyndns.org> <427950A6.6090408@ieee.org> <2m4qdixfdi.fsf@starship.python.net> Message-ID: <427A127B.3050901@gmail.com> Michael Hudson wrote: > "Shane Holloway (IEEE)" writes: > > >>And per the PEP, I think the explaining that:: >> >> try: >> A >> except: >> B >> else: >> C >> finally: >> D >> >>is *exactly* equivalent to:: >> >> try: >> try: >> A >> except: >> B >> else: >> C >> finally: >> D >> >>Resolved all the questions about control flow for me. > > > Well, yes, that makes sense, but also raises a small "and the point > is...?" flag in my head. Someone writing a patch and profiling the two versions would serve to convince me :) Cheers, Nick. P.S. Well, assuming the flattened version is faster. . . 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From martin at v.loewis.de Thu May 5 14:58:02 2005 From: martin at v.loewis.de (=?ISO-8859-1?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 05 May 2005 14:58:02 +0200 Subject: [Python-Dev] PEP 340 keyword: after Message-ID: <427A185A.90504@v.loewis.de> I haven't followed the PEP 340 discussion in detail, but as the PEP doesn't list keywords that have been considered and rejected, I'd like to propose my own: use "after" instead of "block": after opening("/etc/passwd") as f: for line in f: print line.rstrip() after locking(myLock): # code that needs to hold the lock Regards, Martin From rodsenra at gpr.com.br Thu May 5 15:23:39 2005 From: rodsenra at gpr.com.br (Rodrigo Dias Arruda Senra) Date: Thu, 5 May 2005 10:23:39 -0300 Subject: [Python-Dev] PEP 340 keyword: after In-Reply-To: <427A185A.90504@v.loewis.de> References: <427A185A.90504@v.loewis.de> Message-ID: <20050505102339.7b745670@localhost.localdomain> On Thu, 05 May 2005 14:58:02 +0200 "Martin v. L?wis" wrote: > I haven't followed the PEP 340 discussion in detail, > but as the PEP doesn't list keywords that have been > considered and rejected, I'd like to propose my own: > use "after" instead of "block": > > after opening("/etc/passwd") as f: > for line in f: > print line.rstrip() > > after locking(myLock): > # code that needs to hold the lock > And *after* fits very nice for the examples above. However, it might get weird for: after transaction(db): # code inbetween new_trasn/ commit_or_abort The code pattern that will 'wrap' the block might not always make sense with the chosen keyword, if that keyword is not semantically neutral. (not time-related, not function-related, etc). Notice that is _no keyword_ is chosen, nothing prevents us from using (even if by aliasing): after_opening("/etc/passwd") as f: for line in f: print line.rstrip() after_locking(myLock): # code that needs to hold the lock My two cents. Senra -- Rodrigo Senra -- MSc Computer Engineer rodsenra(at)gpr.com.br GPr Sistemas Ltda http://www.gpr.com.br/ Personal Blog http://rodsenra.blogspot.com/ From eric.nieuwland at xs4all.nl Thu May 5 16:36:03 2005 From: eric.nieuwland at xs4all.nl (Eric Nieuwland) Date: Thu, 5 May 2005 16:36:03 +0200 Subject: [Python-Dev] PEP 340 -- loose ends In-Reply-To: References: <20050503201400.GE30548@solar.trillke.net> <20020107054513.566d74ed@localhost.localdomain> <42792888.10209@hathawaymix.org> Message-ID: <6992f471550fb63f339e0a0dad6ca8a5@xs4all.nl> Reinhold Birkenfeld wrote: > Shane Hathaway wrote: >> PEP 340 seems to punish people for avoiding the parentheses: >> >> transaction = begin_transaction() >> >> transaction: >> db.execute('insert 3 into mytable') >> >> transaction: >> db.execute('insert 4 into mytable') >> >> I expect that only '3' would be inserted in mytable. The second use >> of >> the transaction iterator will immediately raise StopIteration. > > Yes, but wouldn't you think that people would misunderstand it in this > way? This could be solved if the translation of block EXPR1 as VAR1: BLOCK1 would change from: itr = EXPR1 # The iterator ret = False # True if a return statement is active ...etc... to: itr = iter(EXPR1) # The iterator ret = False # True if a return statement is active ...etc... --eric -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: text/enriched Size: 1134 bytes Desc: not available Url : http://mail.python.org/pipermail/python-dev/attachments/20050505/bda648c0/attachment.bin From eric.nieuwland at xs4all.nl Thu May 5 16:51:46 2005 From: eric.nieuwland at xs4all.nl (Eric Nieuwland) Date: Thu, 5 May 2005 16:51:46 +0200 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4279FB92.5050501@gmail.com> Message-ID: Ronald Oussoren wrote: > What's bothering me about the proposed semantics is that block > statement behaves like a loop while most use cases do no looping > whatsoever. > Furthermore the it doesn't feel like loop either. In all three > examples on this page I'd assume > that the break would break out of the for loop. I'm bothered the same way. IMHO control constructs should be very clear. No implicit looping, conditionals etc. Especially since the main reason to have this whole discussion is about resource management. The main pattern of use I have in mind is: resource = grab/allocate/open/whatever(...) try: do something possibly with the resource except ...: ... finally: ... resource.release/deallocate/close/whatever() This is linear. No looping whatsoever. And easily translated to a simple language construct and a protocol: class resource(object): def __init__(self,...): # store resource parameters def __acquire__(self): # whatever it takes to grab the resource def __release__(self): # free the resource res = resource(...) acquire res: do something possibly with the resource except ...: ... finally: ... The resource is automagically released at the end of the 'acquire' block (keyword up for other proposals :-) An alternative syntax could also be allowed: acquire resource(...) as res: ...etc... Then 'res' would be undefined after the 'acquire' block. --eric From ncoghlan at iinet.net.au Thu May 5 17:03:54 2005 From: ncoghlan at iinet.net.au (Nick Coghlan) Date: Fri, 06 May 2005 01:03:54 +1000 Subject: [Python-Dev] PEP 340: Non-looping version (aka PEP 310 redux) Message-ID: <427A35DA.8050505@iinet.net.au> The discussion on the meaning of break when nesting a PEP 340 block statement inside a for loop has given me some real reasons to prefer PEP 310's single pass semantics for user defined statements (more on that at the end). The suggestion below is my latest attempt at combining the ideas of the two PEP's. For the keyword, I've used the abbreviation 'stmt' (for statement). I find it reads pretty well, and the fact that it *isn't* a real word makes it easier for me to track to the next item on the line to find out the actual statement name (I think this might be similar to the effect of 'def' not being a complete word making it easier for me to pick out the function name). I consequently use 'user statement' or 'user defined statement' to describe what PEP 340 calls anonymous block statements. I'm still fine with the concept of not using a keyword at all, though. Cheers, Nick. 
== User Defined Statement Usage Syntax == stmt EXPR1 [as VAR1]: BLOCK1 == User Defined Statement Semantics == the_stmt = EXPR1 terminated = False try: stmt_enter = the_stmt.__enter__ stmt_exit = the_stmt.__exit__ except AttributeError: raise TypeError("User statement required") try: VAR1 = stmt_enter() # Omit 'VAR1 =' if no 'as' clause except TerminateBlock: pass # Block is not entered at all in this case # If an else clause were to be permitted, the # associated block would be executed here else: try: try: BLOCK1 except: exc = sys.exc_info() terminated = True try: stmt_exit(*exc) except TerminateBlock: pass finally: if not terminated: try: stmt_exit(TerminateBlock) except TerminateBlock: pass Key points: * The supplied expression must have both __enter__ and __exit__ methods. * The result of the __enter__ method is assigned to VAR1 if VAR1 is given. * BLOCK1 is not executed if __enter__ raises an exception * A new exception, TerminateBlock, is used to signal statement completion * The __exit__ method is called with the exception tuple if an exception occurs * Otherwise it is called with TerminateBlock as the argument * The __exit__ method can suppress an exception by converting it to TerminateBlock or by returning without reraising the exception * return, break, continue and raise StopIteration are all OK inside BLOCK1. They affect the surrounding scope, and are in no way tampered with by the user defined statement machinery (some user defined statements may choose to suppress the raising of StopIteration, but the basic machinery doesn't do that) * Decouples user defined statements from yield expressions, the enhanced continue statement and generator finalisation. == New Builtin: statement == def statement(factory): try: factory.__enter__ factory.__exit__ # Supplied factory is already a user statement factory return factory except AttributeError: # Assume supplied factory is an iterable factory # Use it to create a user statement factory class stmt_factory(object): def __init__(*args, **kwds) self = args[0] self.itr = iter(factory(*args[1:], **kwds)) def __enter__(self): try: return self.itr.next() except StopIteration: raise TerminateBlock def __exit__(self, *exc_info): try: stmt_exit = self.itr.__exit__ except AttributeError: try: self.itr.next() except StopIteration: pass raise *exc_info # i.e. re-raise the supplied exception else: try: stmt_exit(*exc_info) except StopIteration: raise TerminateBlock Key points: * The supplied factory is returned unchanged if it supports the statement API (such as a class with both __enter__ and __exit__ methods) * An iterable factory (such as a generator, or class with an __iter__ method) is converted to a block statement factory * Either way, the result is a callable whose results can be used as EXPR1 in a user defined statement. * For statements constructed from iterators, the iterator's next() method is called once when entering the statement, and the result is assigned to VAR1 * If the iterator has an __exit__ method, it is invoked when the statement is exited. The __exit__ method is passed the exception information (which may indicate that no exception occurred). * If the iterator does not have an __exit__ method, it's next() method is invoked a second time instead * When an iterator is used to drive a user defined statement, StopIteration is translated to TerminateBlock * Main intended use is as a generator decorator * Decouples user defined statements from yield expressions, the enhanced continue statement and generator finalisation. 
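(To make the intended use concrete, here is a small sketch. The generator
below is ordinary Python; only the 'stmt' line relies on the proposed
syntax and the 'statement' builtin described above, so read it as
pseudocode until such machinery actually exists:)

    @statement
    def opening(filename, mode="r"):
        f = open(filename, mode)
        yield f        # __enter__ hands this value to the 'as' clause
        f.close()      # reached via the second next() call from __exit__

    stmt opening("/etc/passwd") as f:
        for line in f:
            print line.rstrip()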
== Justification for non-looping semantics == For most use cases, the effect PEP 340 block statements have on break and continue statements is both surprising and undesirable. This is highlighted by the major semantic difference between the following two cases: stmt locking(lock): for item in items: if handle(item): break for item in items: stmt locking(lock): if handle(item): break Instead of simply acquiring and releasing the lock on each iteration, as one would legitimately expect, the latter piece of code actually processes all of the items, instead of breaking out of the loop once one of the items is handled. With non-looping user defined statements, the above code works in the obvious fashion (the break statement ends the for loop, not the lock acquisition). With non-looping semantics, the implementation of the examples in PEP 340 is essentially identical - just add an invocation of @statement to the start of the generators. It also becomes significantly easier to write user defined statements manually as there is no need to track state: class locking: def __init__(self, lock): self.lock = lock def __enter__(self): self.lock.acquire() def __exit__(self, exc_type, value=None, traceback=None): self.lock.release() if type is not None: raise exc_type, value, traceback The one identified use case for a user-defined loop was PJE's auto_retry. We already have user-defined loops in the form of custom iterators, and there is nothing stopping an iterator from returning user defined statements like this: for attempt in auto_retry(3, IOError): stmt attempt: # Do something! # Including break to give up early # Or continue to try again without raising IOError The implementation of auto-retry is messier than it is with all user defined statement being loops, but I think the benefits of non-looping semantics justify that sacrifice. Besides, it really isn't all that bad: class auto_retry(3, IOError): def __init__(self, times, exc=Exception): self.times = xrange(times-1) self.exc = exc self.succeeded = False def __iter__(self): attempt = self.attempt for i in self.times: yield attempt() if self.succeeded: break else: yield self.last_attempt() @statement def attempt(self): try: yield None self.succeeded = True except self.exc: pass @statement def last_attempt(self): yield None (Third time lucky! One day I'll remember that Python has these things called classes designed to elegantly share state between a collection of related functions and generators. . .) 
The above code for auto_retry assumes that generators supply an __exit__ method as described in PEP 340 - without that, auto_retry.attempt would need to be written as a class since it needs to know if an exception was thrown or not: class auto_retry(3, IOError): def __init__(self, times, exc=Exception): self.times = xrange(times-1) self.exc = exc self.succeeded = False def __iter__(self): attempt = self.attempt for i in self.times: yield attempt(self) if self.succeeded: break else: yield self.last_attempt() class attempt(object): def __init__(self, outer): self.outer = outer def __enter__(self): pass def __exit__(self, exc_type, value=None, traceback=None): if exc_type is None: self.outer.succeeded = true elif exc_type not in self.outer.exc raise exc_type, value, traceback @statement def last_attempt(self): yield None -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From steven.bethard at gmail.com Thu May 5 17:05:44 2005 From: steven.bethard at gmail.com (Steven Bethard) Date: Thu, 5 May 2005 09:05:44 -0600 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: <4279F909.6030206@gmail.com> References: <20050503150510.GA13595@onegeek.org> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4278D7D7.2040805@gmail.com> <4279F909.6030206@gmail.com> Message-ID: On 5/5/05, Nick Coghlan wrote: > Steven Bethard wrote: > > Makes me wonder if we shouldn't just return to the __enter__() and > > __exit__() names of PEP 310[1] where for a generator __enter__() is > > just an alias for next(). We could even require Phillip J. Eby's > > "blockgenerator" decorator to rename next() to __enter__(), and add > > the appropriate __exit__() method. > > You must be reading my mind or something. . . > > Unless there is something in today's 80-odd messages to make it redundant, look > for a post entitled something like "Minimalist PEP 340 (aka PEP 310 redux)" Yeah, I should have linked to that discussion [1]. I wonder if it would be possible to update PEP 310 with your ideas, or perhaps start a new PEP? I'd like to see a competitor for PEP 340 that addresses some of the issues that came up, e.g. that the block-statement doesn't look like a loop, so break and continue might look like they break out of an enclosing loop. It might also be a good place to mirror Guido's PEP 340 examples with PEP 310-style examples -- I know the first attempts at writing some of them weren't as clean as the later attempts, so it would be nice to have somewhere to look for the "current version" of everything. STeVe [1]http://mail.python.org/pipermail/python-dev/2005-April/053039.html -- You can wordify anything if you just verb it. 
--- Bucky Katt, Get Fuzzy From eric.nieuwland at xs4all.nl Thu May 5 17:07:21 2005 From: eric.nieuwland at xs4all.nl (Eric Nieuwland) Date: Thu, 5 May 2005 17:07:21 +0200 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: References: Message-ID: <9768a4652f7ffa1778560b3548609827@xs4all.nl> Reinhold Birkenfeld wrote: > Changes to the grammar > > The grammar for the try statement, which is currently > > try_stmt: ('try' ':' suite (except_clause ':' suite)+ > ['else' ':' suite] | 'try' ':' suite 'finally' ':' > suite) > > would have to become > > try_stmt: ('try' ':' suite (except_clause ':' suite)+ > ['else' ':' suite] ['finally' ':' suite] | > 'try' ':' suite (except_clause ':' suite)* > ['else' ':' suite] 'finally' ':' suite) Wouldn't it be easier to change it to: try_stmt: ('try' ':' suite (except_clause ':' suite)* ['else' ':' suite] ['finally' ':' suite] ) ? --eric From cpr at emsoftware.com Thu May 5 16:55:18 2005 From: cpr at emsoftware.com (Chris Ryland) Date: Thu, 5 May 2005 14:55:18 +0000 (UTC) Subject: [Python-Dev] PEP 340 keyword: after References: <427A185A.90504@v.loewis.de> <20050505102339.7b745670@localhost.localdomain> Message-ID: Rodrigo Dias Arruda Senra gpr.com.br> writes: > The code pattern that will 'wrap' the block might > not always make sense with the chosen keyword, if > that keyword is not semantically neutral. > (not time-related, not function-related, etc). > > Notice that is _no keyword_ is chosen, nothing prevents us > from using (even if by aliasing): > > after_opening("/etc/passwd") as f: > for line in f: > print line.rstrip() > > after_locking(myLock): > # code that needs to hold the lock I hate to add to what could be an endless discussion, but... ;-) In this case, "while" is the better time-related prefix, whether keyword (hopeless, due to ages-old boolean-controlled loop association) or function, since you want to imply that the code block is going on *while* the lock is held or *while* the file is open (and you also want to imply that afterwards, something else happens, i.e., cleanup). while_locked(myLock): # code that needs to hold the lock --Chris Ryland, Em Software From ncoghlan at gmail.com Thu May 5 17:19:13 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 06 May 2005 01:19:13 +1000 Subject: [Python-Dev] Pre-PEP: Unifying try-except and try-finally In-Reply-To: <9768a4652f7ffa1778560b3548609827@xs4all.nl> References: <9768a4652f7ffa1778560b3548609827@xs4all.nl> Message-ID: <427A3971.8030400@gmail.com> Eric Nieuwland wrote: > Wouldn't it be easier to change it to: > > try_stmt: ('try' ':' suite (except_clause ':' suite)* > ['else' ':' suite] ['finally' ':' suite] ) > ? What does a try statement with neither an except clause nor a finally clause mean? Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From mwh at python.net Thu May 5 17:25:03 2005 From: mwh at python.net (Michael Hudson) Date: Thu, 05 May 2005 16:25:03 +0100 Subject: [Python-Dev] PEP 340 keyword: after In-Reply-To: (Chris Ryland's message of "Thu, 5 May 2005 14:55:18 +0000 (UTC)") References: <427A185A.90504@v.loewis.de> <20050505102339.7b745670@localhost.localdomain> Message-ID: <2mvf5xwv8w.fsf@starship.python.net> Chris Ryland writes: > In this case, "while" is the better time-related prefix, whether Indeed. while_execution_is_lexically_in_the_next_block lock(theLock): ... Anyone? . 
Cheers, mwh -- Every day I send overnight packages filled with rabid weasels to people who use frames for no good reason. -- The Usenet Oracle, Oracularity #1017-1 From p.f.moore at gmail.com Thu May 5 18:01:28 2005 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 5 May 2005 17:01:28 +0100 Subject: [Python-Dev] PEP 340 - Remaining issues Message-ID: <79990c6b0505050901fe38af1@mail.gmail.com> On 5/5/05, Steven Bethard wrote: > I wonder if it would be possible to update PEP 310 with your ideas, > or perhaps start a new PEP? I'd like to see a competitor for PEP 340 that > addresses some of the issues that came up, e.g. that the block-statement > doesn't look like a loop, so break and continue might look like they break > out of an enclosing loop. In an attempt to bring things back under control, can I summarise what I believe are the outstanding issues? 1. Choice (or not) of a keyword. I honestly believe that there will never be a consensus on this, and we'd be better deferring the decision to Guido's judgement. 2. Separate protocol or not? I'm not entirely sure I have a view on this, but it feels related to the looping question below. I do like being able to write these things as generators, and I don't mind needing a decorator (although I, personally, don't feel a compelling *need* for one). 3. Looping blocks, and the break issue. I see a consensus forming here that blocks should *not* loop. No-one has come up with a strong case for looping blocks, except the auto_retry example, and Nick (I think it was Nick Coghlan, sorry if my memory is wrong) demonstrated how to build this case from a for loop and a non-looping block. Given that Guido has stated that he is willing to accept a consensus decision on changes to the PEP, can I suggest that rather than writing a competitor, someone (who understands the technicalities better than me) simply propose a modification to PEP 340 that does not loop[1]. I think the separate protocol issue is subtler - maybe it's just a case of renaming some methods and specifying a decorator, but I really don't understand the issues at this depth. I apologise if this post (1) misrepresents anyone's view, or (2) hinders things rather than helping. But I feel that we are pretty close to a solution here, and I fear that more competing PEPs will simply muddy the waters. Paul. [1] My simplistic view is that you may be able to get away with changing the specification of the anonymous blok statement's expansion just to remove the "while True". There's some fixup needed to avoid the one "break" in the expansion, and probably a lot of little details that make this far harder than I'm assuming - but maybe that's the starting point... From ncoghlan at gmail.com Thu May 5 18:04:50 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 06 May 2005 02:04:50 +1000 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4278D7D7.2040805@gmail.com> <4279F909.6030206@gmail.com> Message-ID: <427A4422.4@gmail.com> Steven Bethard wrote: > I wonder if it > would be possible to update PEP 310 with your ideas, or perhaps start > a new PEP? I'd like to see a competitor for PEP 340 that addresses > some of the issues that came up, e.g. that the block-statement doesn't > look like a loop, so break and continue might look like they break out > of an enclosing loop. 
It might also be a good place to mirror Guido's > PEP 340 examples with PEP 310-style examples -- I know the first > attempts at writing some of them weren't as clean as the later > attempts, so it would be nice to have somewhere to look for the > "current version" of everything. Well, Michael Hudson and Paul Moore are the current authors of PEP 310, so updating it with any of my ideas would be their call. Either way, my latest and greatest version of the non-looping block statement semantics can be found here: http://mail.python.org/pipermail/python-dev/2005-May/053400.html Some key advantages of that proposal are: 1. It's not a loop, so nesting it inside another loop 'just works' 2. Manual protocol implementations are _significantly_ easier to write 3. try/finally can be done with generators _without_ changing generators 4. try/except/else can be done with generators if they provide an __exit__ method that raises the exception at the point of the last yield 5. Clearly distinct construct, no potential for confusion with for loops 6. Generators must be clearly marked as creating a user defined statement (although this could be changed by giving them an __enter__ method and an __exit__ method) The one downside relative to PEP 340 is that looping constructs like auto_retry are slightly harder to write, albeit not hugely so (once you remember that an iterable can be a class instead of a generator!). On the usage front, I find the 'loop over an iterator returning user defined statements' does a much better job of making the iteration clear, so I'd be inclined to count that as an advantage of a PEP 310 style approach. Anyway, I've already been spending more time on this than I should (sleep is optional, right?), so I won't be prettying it up into PEP format any time soon. I have no objection to someone else rolling some of the ideas into a PEP, though :) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From cgarciaf at lucent.com Thu May 5 18:05:38 2005 From: cgarciaf at lucent.com (Carlos Garcia) Date: Thu, 5 May 2005 18:05:38 +0200 Subject: [Python-Dev] problems with memory management Message-ID: <007501c5518c$477d9150$ba565887@1068801y07c0j> Hi All, I do hava a problem with python and it is that it raise an outofmemory (i comment lines in Py.java to avoid system.exit, to debug), i try to debug this issue with jprobe and realize that i get the exception even although the java heap is not in the limit, i book 64- 256M and the java heap was less than 60 M. The program is a command line that receive a line that python parse an call some java classes ti execute the appropiate command, any idea? Thansk, ========================================================== Carlos Garc?a Phone : +34 91 714 8796 Lucent Technologies e-mail : cgarciaf at lucent.com Avenida de Bruselas , 8 - 28108 Alcobendas (Madrid) ========================================================== -------------- next part -------------- An HTML attachment was scrubbed... 
URL: http://mail.python.org/pipermail/python-dev/attachments/20050505/cfd99ea1/attachment.htm From rrr at ronadam.com Thu May 5 18:27:20 2005 From: rrr at ronadam.com (Ron Adam) Date: Thu, 05 May 2005 12:27:20 -0400 Subject: [Python-Dev] PEP 340 keyword: Extended while syntax In-Reply-To: <427A185A.90504@v.loewis.de> References: <427A185A.90504@v.loewis.de> Message-ID: <427A4968.5080308@ronadam.com> I expect there's an obvious reason why this hasn't been suggested already that I'm not currently thinking of, but here it is anyway. :-) How about an *extended while* syntax as a block keyword alternative? Reasoning: The block statement resembles a "while" block in some ways in that it is a conditional block that may be executed only once, or possibly not at all (or many times). And the word "while" is also descriptive of how a block is used. while VAR1 from EXPR1(): BLOCK This will require a new keyword/operator 'from' to use in a 'from' expression: VAR1 from EXPR1() Where EXPR1 returns an anonymous iterator, and the expression (VAR1 from EXPR1()) evaluates as True only if a value from the EXPR1 iterator is received. Or possibly False if it is received and is None. [* see below] The "for" tests for the name binding instead of testing the value of VAR1, it may also be desirable to check VAR1 for None after it is recieved. This would be translated as follows: 1 --> while VAR1 from EXPR1(): raise an error if EXPR1 is not an iterator. 2 --> while (VAR1 = _EXPR1_iter.__next__()): # internal 3 --> while True: # if VAR1 gets a new value or 3 -> while False: # if VAR1 fails to get a value [*]or 3 -> while False: # if VAR1 receives None * Undecided on check for None. An iterator could always return something, so testing for None would be needed; or it could refuse and break the request somehow after it is called. In the later case None could be a valid return value it may not desirable to finalize the block. A while *might* be able to test for both. while VAR1 from EXPR1() and VAR1!=None: or ... while VAR1 from EXPR1() and VAR1: Order of placement could make a difference. while VAR1 and VAR1 from EXPR1(): This would test the *last* VAR1 before getting a new value. That might be useful in some situations. This may also be inconsistent with how expressions are currently evaluated. I'm not sure if it's allowed for names to rebound while evaluating an expression. Examples: while lock from locking(myLock): # Code here executes with myLock held. while f from opening("/etc/passwd"): for line in f: print line.rstrip() while retry from auto_retry(3, IOError): f = urllib.urlopen("http://python.org/peps/pep-0340.html") print f.read() while f from locking_opening(myLock, "/etc/passwd"): for line in f: print line.rstrip() while f from opening(filename, "w"): while re_out from redirecting_stdout(f): print "Hello world" while f, err from opening_w_error("/etc/passwd", "a"): if err: print "IOError:", err else: f.write("guido::0:0::/:/bin/sh\n") Because the *from expression* evaluates to a bool, it might be useful in other places, although there may be reason to prevent it from being used as such. if VAR1 from GEN: print VAR1 else: print "GEN didn't give me anything" Another possibility is the use of xrange() in a block statements/ or extended while statements. while VAR1 from xrange(100): block This may blur the distinction between "for" loops and "while" loops, although it may be a *good* thing since "for" can then always used sequences, and the *extended while syntax* always use iterators. 
Which to use, would be up to the programmer. With that change xrange() support could be removed from "for" statements in Python 3000, (I think Guido wants to do that.), and it then could be used with "extended while" statements. With this suggestion there will still only be two looping constructs, "for" and "while", and I think the distinction between a normal "while" and an extended "while" is made clear with the "from" keyword. I think this would be much easier to understand, IMO, and also much easier to read and teach as well. It uses already familiar syntax and adds a new expression keyword instead of a new statement keyword. A symbol might be possible instead of "from", so adding new keywords could be avoided if "from" is out of the question. Optimistically, Ron_Adam From ncoghlan at gmail.com Thu May 5 18:33:45 2005 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 06 May 2005 02:33:45 +1000 Subject: [Python-Dev] PEP 340 - Remaining issues In-Reply-To: <79990c6b0505050901fe38af1@mail.gmail.com> References: <79990c6b0505050901fe38af1@mail.gmail.com> Message-ID: <427A4AE9.80007@gmail.com> Paul Moore wrote: > 1. Choice (or not) of a keyword. I honestly believe that there will > never be a consensus on this, and we'd be better deferring the > decision to Guido's judgement. The keyword-less approach is less confusing when the block statement is not a loop, as that eliminates the suprising behaviour of break and continue statements. If there is a keyword, the wide variety of user-defined statements means that any real English word will be a bad fit for at least some of them. Something relatively nonsensical, but usefully mnemonic (like 'stmt') may be a good way to go. > 2. Separate protocol or not? I'm not entirely sure I have a view on > this, but it feels related to the looping question below. I do like > being able to write these things as generators, and I don't mind > needing a decorator (although I, personally, don't feel a compelling > *need* for one). If the block statement doesn't loop, the PEP 310 protocol makes a lot more sense. A function (usable as a generator decorator) can then be provided to convert from a callable that returns iterables to a callable that returns objects that support the PEP 310 protocol. > Given that Guido has stated that he is willing to accept a consensus > decision on changes to the PEP, can I suggest that rather than writing > a competitor, someone (who understands the technicalities better than > me) simply propose a modification to PEP 340 that does not loop My attempt at doing exactly that is "PEP 340: Non-looping version (aka PEP 310 redux)" [1] And the seemingly simple change ('run the block at most once') had far more wide-ranging ramifications than I expected. > I think the separate protocol issue is subtler - maybe it's just a > case of renaming some methods and specifying a decorator, but I really > don't understand the issues at this depth. When I was writing my suggested semantics for a non-looping version, the use of an iteration protocol (next, StopIteration) became obviously inappropriate. So while having a separate protocol is a little murky when block statements are loops, the PEP 310 interface protocol is a clear winner when block statements are _not_ loops. > I apologise if this post (1) misrepresents anyone's view, or (2) > hinders things rather than helping. But I feel that we are pretty > close to a solution here, and I fear that more competing PEPs will > simply muddy the waters. 
In this case, I think having a separate document (perhaps PEP 310, or maybe a Wiki page) to describe how a non-looping block statement can support all of the identified use cases for the PEP 340's block statement will be clearer than trying to describe the two main alternatives in the same PEP. Cheers, Nick. [1] http://mail.python.org/pipermail/python-dev/2005-May/053400.html -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net From rrr at ronadam.com Thu May 5 18:48:29 2005 From: rrr at ronadam.com (Ron Adam) Date: Thu, 05 May 2005 12:48:29 -0400 Subject: [Python-Dev] PEP 340: Breaking out. In-Reply-To: References: <20050503150510.GA13595@onegeek.org> <427797D5.8030207@cirad.fr> <17015.39213.522060.873605@montanaro.dyndns.org> <17015.48830.223391.390538@montanaro.dyndns.org> <79990c6b050504015762d004ac@mail.gmail.com> <4279FB92.5050501@gmail.com> Message-ID: <427A4E5D.1080904@ronadam.com> Eric Nieuwland wrote: > This is linear. No looping whatsoever. And easily translated to a > simple language construct and a protocol: > > class resource(object): > def __init__(self,...): > # store resource parameters > def __acquire__(self): > # whatever it takes to grab the resource > def __release__(self): > # free the resource I wanted to see what the examples in PEP340 would look like written with standard class's using object inheritance and overriding to define resource managers. If anyone's interested I can post it. My block class is non-looping as I found in most cases looping isn't required, and looping complicates things because you have to pass around a loop expression due to not all loops will want to behave that same way. The solution was to put the loop in the body method and call a repeat_body method (the repeated body section) which is added to the class when needed. Ron_Adam From tjreedy at udel.edu Thu May 5 19:17:16 2005 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 5 May 2005 13:17:16 -0400 Subject: [Python-Dev] problems with memory management References: <007501c5518c$477d9150$ba565887@1068801y07c0j> Message-ID: > I do hava a problem with python and it is that it raise an outofmemory > >(i comment lines in Py.java to avoid system.exit, to debug), Questions about using current Python belong on the Python list or comp.lang.python. Python-dev is for discussions about developing future versions. From steven.bethard at gmail.com Thu May 5 19:23:20 2005 From: steven.bethard at gmail.com (Steven Bethard) Date: Thu, 5 May 2005 11:23:20 -0600 Subject: [Python-Dev] PEP 340: Non-looping version (aka PEP 310 redux) In-Reply-To: <427A35DA.8050505@iinet.net.au> References: <427A35DA.8050505@iinet.net.au> Message-ID: On 5/5/05, Nick Coghlan wrote: > The discussion on the meaning of break when nesting a PEP 340 block statement > inside a for loop has given me some real reasons to prefer PEP 310's single pass > semantics for user defined statements (more on that at the end). The suggestion > below is my latest attempt at combining the ideas of the two PEP's. > [snip] > * An iterable factory (such as a generator, or class with an __iter__ method) is > converted to a block statement factory I like the non-looping proposal a lot, but I'd still prefer that iterators were not usable as statements. 
As I understand it, the main motivation for wanting iterators to be usable as statements is that generators provide a very simple way of creating iterators, and we'd like to have an equally simple way of creating statments. The simplicity of generators is that using a "yield" statement inside a "def" statement magically modifies the function so that it returns "iterator" objects. I'd like to see a parallel for block-statements, so that using an "XXX" statement inside a "def" statement magically modifies the function so that it returns "statement" objects. To illustrate my point, I'm going to assume a no-keyword syntax for calling statement objects and I'm going to steal your "stmt" keyword to replace "yield". So, for example, to create the "opening" statement from PEP 340, you would write it almost exactly the same: def opening(filename, mode="r"): f = open(filename, mode) try: stmt f finally: f.close() This would create a generator-like object that instead of providing __iter__() and next() methods, provides __enter__() and __exit__() methods. It could then be called like: opening("temp.txt") as f: for line in f: print line I like this for a few reasons: * statement-generators (or whatever you want to call them) are just as easy to declare as normal generators are * try/finally statements around a "yield" would still be invalid syntax, as generators can't generally guarantee proper finalization semantics. * statement objects can't be accidentally used in for-loops; they don't have __iter__() or next() methods * statement objects can be clearly documented separately from iterator objects; there would be no need for them to refer to each other I don't know the generator implementation very well, but I would think that statement-generators could share almost all the code of normal generators by simply changing the name of a slot or two and adding the __exit__ code already proposed by PEP 340. STeVe -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy From gustavo at niemeyer.net Thu May 5 19:36:52 2005 From: gustavo at niemeyer.net (Gustavo Niemeyer) Date: Thu, 5 May 2005 14:36:52 -0300 Subject: [Python-Dev] PEP 340 keyword: Extended while syntax In-Reply-To: <427A4968.5080308@ronadam.com> References: <427A185A.90504@v.loewis.de> <427A4968.5080308@ronadam.com> Message-ID: <20050505173652.GA6947@burma.localdomain> Greetings, > Reasoning: The block statement resembles a "while" block in some ways in > that it is a conditional block that may be executed only once, or > possibly not at all (or many times). And the word "while" is also > descriptive of how a block is used. > > while VAR1 from EXPR1(): > BLOCK This is an interesting propose, but for a different PEP. In the current propose VAR1 is not evaluated for truthness, and many of the usage examples doesn't even require it. This looks quite strange, for instance: while dummy from locking(myLock): # Do something And also, this would require a break necessarily: while (foo, bar) from locking(): # Pass > This will require a new keyword/operator 'from' to use in a 'from' > expression: 'from' is already a keyword, btw. -- Gustavo Niemeyer http://niemeyer.net From jcarlson at uci.edu Thu May 5 20:08:35 2005 From: jcarlson at uci.edu (Josiah Carlson) Date: Thu, 05 May 2005 11:08:35 -0700 Subject: [Python-Dev] PEP 340: Breaking out. 
In-Reply-To: References: <20050505004244.64C0.JCARLSON@uci.edu> Message-ID: <20050505095935.64C6.JCARLSON@uci.edu> Ka-Ping Yee wrote: > > On Thu, 5 May 2005, Josiah Carlson wrote: > > Ka-Ping Yee wrote: > > > continue with 2 > > > > There is something about that I just don't like. > > Just to clarify: if by you mean "nesting level", did it appear > that the 2 in my example was a count of block levels? I didn't > mean it that way -- the 2 is a value passed in to the generator > that appears as the value of the "yield" expression, as in PEP 340. I remember reading that, but I seem to have forgotten it when I was composing my reply. Thankfully, sleeping on it has helped me discover what I really don't like. With the 'passing value' semantic, the [ ] [ ] is only useful for the deepest loop of a particular type. Take for example... for ...: for ...: for ...: break/continue [for] That break or continue can only affect that last for loop. It doesn't make any easier the use of nested fors, nested whiles, or even nested blocks. It only really helps you if you mix and match all possible looping constructs, and even then, only gives the granularity of the most recent block of a particular type. In that sense, I think it is a nonstarter, because it doesn't really add functionality in common uses of for and while statements. If one allowed [] [] , [], then one could jump to arbitrary loops. Now, I'm not condoning this, and I don't even like it. Sure, it allows breaking or continuing to any for, while, or block statement in the current scope, but the argument is as equivalently ambiguous as package-relative imports using a leading integer (http://python.org/peps/pep-0328.html). Now, one can remove ambiguity if we were able to 'label' while loops and for loops producing [