From python at discontinuity.net  Wed Apr  1 06:52:46 2015
From: python at discontinuity.net (Davin Potts)
Date: Tue, 31 Mar 2015 23:52:46 -0500
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To:
References:
Message-ID: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>

Hi Victor --

I am personally interested in seeing all tests pass on OpenBSD and am
willing to put forth effort to help make that happen. I would be happy to
be added to any issues that get opened against OpenBSD. That said, I have
concerns about when and how these failures came about -- specifically, I
worry that other devs committed the changes which prompted these failures,
yet paid no attention and took no responsibility when it happened. Having
monitored certain buildbots for a while to see how the community behaves --
and how devs fail to react when a failure is triggered by a commit -- I
think we should do much better at taking individual responsibility for the
failures we prompt.

Davin

> On Mar 28, 2015, at 04:53, Victor Stinner wrote:
>
> Hi,
>
> The OpenBSD buildbot has been failing with the same failures for many
> months now (e.g. test_socket). Is anyone interested in fixing them? If
> not, would it be possible to turn it off to get a better view of
> regressions? Currently, there are too many red buildbots to quickly see
> regressions introduced by recent commits.
>
> http://buildbot.python.org/all/builders/x86%20OpenBSD%205.5%203.x
>
> Victor

From hasan.diwan at gmail.com  Wed Apr  1 07:12:28 2015
From: hasan.diwan at gmail.com (Hasan Diwan)
Date: Tue, 31 Mar 2015 22:12:28 -0700
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
Message-ID:

I, too, would be interested in having tests pass on OpenBSD (and NetBSD)
and am willing to do whatever I have to to make that happen. -- H

On 31 March 2015 at 21:52, Davin Potts wrote:
> Hi Victor --
>
> I am personally interested in seeing all tests pass on OpenBSD and am
> willing to put forth effort to help make that happen. I would be happy
> to be added to any issues that get opened against OpenBSD. That said, I
> have concerns about when and how these failures came about --
> specifically, I worry that other devs committed the changes which
> prompted these failures, yet paid no attention and took no responsibility
> when it happened. Having monitored certain buildbots for a while to see
> how the community behaves -- and how devs fail to react when a failure is
> triggered by a commit -- I think we should do much better at taking
> individual responsibility for the failures we prompt.
>
> Davin
>
> > On Mar 28, 2015, at 04:53, Victor Stinner wrote:
> >
> > Hi,
> >
> > The OpenBSD buildbot has been failing with the same failures for many
> > months now (e.g. test_socket). Is anyone interested in fixing them? If
> > not, would it be possible to turn it off to get a better view of
> > regressions? Currently, there are too many red buildbots to quickly see
> > regressions introduced by recent commits.
> >
> > http://buildbot.python.org/all/builders/x86%20OpenBSD%205.5%203.x
> >
> > Victor

--
OpenPGP: https://hasan.d8u.us/gpg.key
Sent from my mobile device
Envoyé de mon portable

From victor.stinner at gmail.com  Wed Apr  1 09:32:33 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 1 Apr 2015 09:32:33 +0200
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
Message-ID:

Le mercredi 1 avril 2015, Davin Potts a écrit :
>
> I am personally interested in seeing all tests pass on OpenBSD and am
> willing to put forth effort to help make that happen.

Great!

> I would be happy to be added to any issues that get opened against
> OpenBSD.

You can do a search in bugs.python.org. If I recall correctly, there are
only a few open issues.

> That said, I have concerns about when and how these failures came about
> -- specifically, I worry that other devs committed the changes which
> prompted these failures, yet paid no attention and took no responsibility
> when it happened.

That's the purpose of my previous email. If a buildbot is always red,
nobody checks it anymore. I'm quite sure that the current OpenBSD 5.5
buildbot has always been red since it was set up.

Sometimes I try to fix some issues. Fixing OpenBSD issues usually means
installing OpenBSD at home, because most issues are specific to OpenBSD.
Even for LibreSSL, OpenBSD behaves differently (on the version number)
than FreeBSD.

> Having monitored certain buildbots for a while to see how the community
> behaves -- and how devs fail to react when a failure is triggered by a
> commit -- I think we should do much better at taking individual
> responsibility for the failures we prompt.

When I started to work on Python, I was surprised not to get emails from
buildbots. There are different issues: some buildbots are always red, and
there are still many sporadic failures -- temporary network failures seen
as bugs, threading issues, multiprocessing issues. Recently, I saw some
rare asyncio issues. So we should reduce the number of these issues before
spamming developers.

Currently, there is an IRC bot on #python-dev which notifies when a
buildbot's color changes. Or sometimes I check the huge waterfall page. By
the way, it has become difficult to browse this page because there are too
many offline buildbots.

Victor
From storchaka at gmail.com  Wed Apr  1 09:33:13 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Wed, 01 Apr 2015 10:33:13 +0300
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
Message-ID:

On 01.04.15 07:52, Davin Potts wrote:
> I am personally interested in seeing all tests pass on OpenBSD and am
> willing to put forth effort to help make that happen. I would be happy
> to be added to any issues that get opened against OpenBSD. [...]

http://bugs.python.org/issue?%40columns=id%2Cactivity%2Ctitle%2Ccreator%2Cassignee%2Cstatus%2Ctype&%40filter=status&%40search_text=openbsd&submit=search&status=1

From victor.stinner at gmail.com  Wed Apr  1 10:01:28 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 1 Apr 2015 10:01:28 +0200
Subject: [Python-Dev] MemoryError and other bugs on AMD64 OpenIndiana 3.x
In-Reply-To: <55184C92.9050103@jcea.es>
References: <55184C92.9050103@jcea.es>
Message-ID:

Hi,

2015-03-29 21:03 GMT+02:00 Jesus Cea :
> I have contacted the machine manager and he has said to me that 16GB were
> taken in the "/tmp" directory by the buildbot user. He has deleted them
> and everything should be fine now. Let's see.
>
> Anyway, if buildbot is leaving garbage behind, we should take care of it.
> Doing a daily cron job for manual cleanup seems a bit hacky :).

Usually, all created files are removed; Serhiy even added a check for that
to regrtest recently. The problem is that Python may keep some created
files around when a test fails.

> I would be interested, too, in getting an email when one of my buildbots
> is red for more than, let's say, 4 days in a row. An email per day would
> be fine.

There are still MemoryError failures:

http://buildbot.python.org/all/builders/AMD64%20OpenIndiana%203.x/builds/9543/steps/test/logs/stdio
http://buildbot.python.org/all/builders/AMD64%20OpenIndiana%203.x/builds/9542/steps/test/logs/stdio
http://buildbot.python.org/all/builders/AMD64%20OpenIndiana%203.x/builds/9541/steps/test/logs/stdio
etc.

Victor

From cory at lukasa.co.uk  Wed Apr  1 11:40:15 2015
From: cory at lukasa.co.uk (Cory Benfield)
Date: Wed, 1 Apr 2015 10:40:15 +0100
Subject: [Python-Dev] http.client: Determining Content-Length
In-Reply-To: <18A92DFA-B232-405C-B894-C5464BF777D3@gmail.com>
References: <18A92DFA-B232-405C-B894-C5464BF777D3@gmail.com>
Message-ID:

On 31 March 2015 at 17:55, Demian Brecht wrote:
> Hi all,
>
> I'm not sure whether this should go to python-list or here, but given
> that it's a premature code review for http.client, I figured I'd post
> here first.
>
> I'm in the process of adding support for chunked transfer encoding to
> http.client (issue #12319). One of the bits of functionality I'm working
> on is ironing out some of the kinks in determining the content length of
> request bodies.
> Given the number of data types allowed as bodies, it does get a little
> messy, so I wanted to get some feedback here and see if anyone can shoot
> holes through it prior to updating the current patch. The tests are
> passing, but there may be use cases not accounted for in the new
> implementation.
>
> https://gist.github.com/demianbrecht/f94be5a51e32bb9c81e1
>
> The above is intended to replace _set_content_length (the current state
> of the patch can be seen here:
> http://bugs.python.org/review/12319/diff/14331/Lib/http/client.py). A
> comprehensive list of the data types currently supported can be found
> here: http://bugs.python.org/issue23740. Comments and feedback are much
> appreciated.

Nothing problematic leaps out at me here, but only if I look at your
proposal in the context of the full patch.

Your full patch does seem to have an edge case: for HTTP/1.1, if I pass a
generator and no Content-Length, you'll blow up by indexing into it. That's
probably fine (your only option in this case would be to frame the request
by half-closing the TCP stream, which I seem to recall you ruling out
elsewhere), but you might want to wrap the exception in something more
helpful: not sure.

Cory

From mail at timgolden.me.uk  Wed Apr  1 12:47:36 2015
From: mail at timgolden.me.uk (Tim Golden)
Date: Wed, 01 Apr 2015 11:47:36 +0100
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To:
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
Message-ID: <551BCCC8.3080200@timgolden.me.uk>

On 01/04/2015 08:32, Victor Stinner wrote:
> When I started to work on Python, I was surprised not to get emails from
> buildbots.
>
> Currently, there is an IRC bot on #python-dev which notifies when a
> buildbot's color changes. Or sometimes I check the huge waterfall page.
> By the way, it has become difficult to browse this page because there
> are too many offline buildbots.

On the back of Victor's recent emails re buildbots, I've knocked something
up which can be scheduled to email the status of some or all buildbots:

https://github.com/tjguk/buildbotinfo

It's rough-and-ready but it does work. If it's useful to anyone, feel free
to use / clone / fork / whatever. By all means send PRs or raise Issues,
but I've already overrun the little time I'd given myself to get this
working, so I'm not sure when I'll get to them.

There are two modules: the underlying buildbot.py, which lightly wraps the
XML-RPC interface; and buildbotinfo.py, which uses it to generate some
readable output according to some selection criteria. See the project
README for some details:

https://github.com/tjguk/buildbotinfo/blob/master/README.rst

TJG

From victor.stinner at gmail.com  Wed Apr  1 14:13:44 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 1 Apr 2015 14:13:44 +0200
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To: <551BCCC8.3080200@timgolden.me.uk>
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
 <551BCCC8.3080200@timgolden.me.uk>
Message-ID:

Hi,

2015-04-01 12:47 GMT+02:00 Tim Golden :
> On the back of Victor's recent emails re buildbots, I've knocked
> something up which can be scheduled to email the status of some or all
> buildbots:
>
> https://github.com/tjguk/buildbotinfo

Are you aware of this previous project?
https://code.google.com/p/bbreport/

I also wrote two very simple scripts to download and parse buildbot output:

https://bitbucket.org/haypo/misc/src/5929cc110f0352cecb384adceae3647f26fa693e/python/buildbot_download.py?at=default
https://bitbucket.org/haypo/misc/src/5929cc110f0352cecb384adceae3647f26fa693e/python/buildbot_parse.py?at=default

I'm more interested in the parser part. My first goal was to compute a
one-line summary: success, timeout, fatal error, etc. The next step would
be to parse test failures so that tests known to fail can be ignored and
only regressions are reported. That would be better than the current "red"
/ "green" status. For example, I want to always ignore MemoryError, simply
because I'm unable to fix the buildbots which are known to regularly fail
with MemoryError (e.g. OpenIndiana).

I used Jenkins, which uses JUnit reports. It's much more powerful because
it computes statistics on tests to check which tests fail randomly, which
failures are new, etc. I don't think it would be complex to generate a
JUnit report (or another easy-to-parse report) in regrtest. But XML is
maybe not the best format for handling timeouts and fatal errors :-)

Victor

From mail at timgolden.me.uk  Wed Apr  1 14:21:31 2015
From: mail at timgolden.me.uk (Tim Golden)
Date: Wed, 01 Apr 2015 13:21:31 +0100
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To:
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
 <551BCCC8.3080200@timgolden.me.uk>
Message-ID: <551BE2CB.50709@timgolden.me.uk>

On 01/04/2015 13:13, Victor Stinner wrote:
> Hi,
>
> 2015-04-01 12:47 GMT+02:00 Tim Golden :
>> On the back of Victor's recent emails re buildbots, I've knocked
>> something up which can be scheduled to email the status of some or all
>> buildbots:
>>
>> https://github.com/tjguk/buildbotinfo
>
> Are you aware of this previous project?
> https://code.google.com/p/bbreport/

Yes -- I originally forked it with a view to extending it the way I wanted,
but I was rewriting so much that in the end I started afresh and ended up
with just my own code. (The way you do...)

[... snip useful ideas about more finely-tuned results ...]

I'd also thought of various directions things could go. TBH, though, I
knew that if I started to get ambitious I'd end up with something which
tried to do everything and did nothing. I might go somewhere with it in
the future, but for now it does what it does, and if it helps someone --
including me -- at the simplest level, well, that's a good thing.

TJG

From neale at sinenomine.net  Wed Apr  1 16:16:12 2015
From: neale at sinenomine.net (Neale Ferguson)
Date: Wed, 1 Apr 2015 14:16:12 +0000
Subject: [Python-Dev] AF_IUCV support for sockets
Message-ID:

Hi,

I have opened an enhancement request (23830) with a suggested patch that
adds support for the AF_IUCV protocol family. IUCV is a
hypervisor-mediated communications method for z/VM guest virtual machines.
Linux on z Systems (aka s390x) has supported this via AF_IUCV sockets for
many years (added to the kernel in Feb 2007). The suggested patch adds
support for this socket type to Python 2.7.9. I have built Python on both
s390x and x86_64: both build cleanly, and the test added to test_socket.py
runs cleanly on s390x and is skipped on other platforms.
----------
components: Library (Lib)
files: af_iucv.patch
keywords: patch
messages: 239743
nosy: neale
priority: normal
severity: normal
status: open
title: Add AF_IUCV support to sockets
type: enhancement
versions: Python 2.7
Added file: http://bugs.python.org/file38765/af_iucv.patch

I would like help in improving the patch and making it a candidate for
acceptance into the code base.

Neale

From zachary.ware+pydev at gmail.com  Wed Apr  1 17:27:54 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Wed, 1 Apr 2015 10:27:54 -0500
Subject: [Python-Dev] [Python-checkins] cpython: Check deques against common sequence tests (except for slicing).
In-Reply-To: <20150401151114.1901.60002@psf.io>
References: <20150401151114.1901.60002@psf.io>
Message-ID:

On Wed, Apr 1, 2015 at 10:11 AM, raymond.hettinger wrote:
> https://hg.python.org/cpython/rev/393189326adb
> changeset:   95350:393189326adb
> user:        Raymond Hettinger
> date:        Wed Apr 01 08:11:09 2015 -0700
> summary:
>   Check deques against common sequence tests (except for slicing).
>
> files:
>   Lib/test/test_deque.py |  16 ++++++++++++++++
>   1 files changed, 16 insertions(+), 0 deletions(-)
>
>
> diff --git a/Lib/test/test_deque.py b/Lib/test/test_deque.py
> --- a/Lib/test/test_deque.py
> +++ b/Lib/test/test_deque.py
> @@ -843,6 +843,21 @@
>          # SF bug #1486663 -- this used to erroneously raise a TypeError
>          SubclassWithKwargs(newarg=1)
>
> +class TestSequence(seq_tests.CommonTest):
> +    type2test = deque
> +
> +    def test_getitem(self):
> +        # For now, bypass tests that require slicing
> +        pass
> +
> +    def test_getslice(self):
> +        # For now, bypass tests that require slicing
> +        pass
> +
> +    def test_subscript(self):
> +        # For now, bypass tests that require slicing
> +        pass

Instead of making these empty passing tests, it's better to set them to
'None' so that unittest doesn't run them (and thus doesn't report success
when it hasn't actually tested anything).

--
Zach

From pje at telecommunity.com  Wed Apr  1 23:35:53 2015
From: pje at telecommunity.com (PJ Eby)
Date: Wed, 1 Apr 2015 17:35:53 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
Message-ID:

I recently got an inquiry from some of my users about porting to Python 3
some of my libraries that make use of the Python 2 __metaclass__ facility.
While checking up on the status of PEP 422 today, I found out about its
recently proposed replacement, PEP 487.

While PEP 487 is a generally fine PEP, it actually *rules out* the specific
use case that I wanted PEP 422 for in the first place: dynamic addition of
callbacks or decorators for use at class creation time, without requiring
explicit inheritance or metaclass participation. (So that e.g. method
decorators can access the enclosing class at class definition time.)

As discussed prior to the creation of PEP 422, it is not possible to port
certain features of my libraries to Python 3 without some form of that
ability, and the only thing I know of that could even *potentially* provide
it outside of PEP 422 is monkeypatching __build_class__ (which might not
even work). That is, the very thing that PEP 422 was created to avoid the
need for. ;-)

One possible alteration would be to replace __init_subclass__ with some
sort of __init_class__, invoked on the class that provides it, not just on
subclasses. That would allow the kind of dynamic decoration that PEP 422
allows. However, this approach was rather specifically ruled out in earlier
consideration of PEP 422, so....
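For concreteness, here is a rough sketch of the difference (the
__init_class__ name and behavior are hypothetical, sketched from this
discussion rather than from any accepted proposal):

    class Base:
        def __init_subclass__(cls, **kwargs):
            # PEP 487: runs when a *subclass* of Base is created,
            # but not when Base itself is defined
            ...

    class Plugin:
        def __init_class__(cls):
            # hypothetical: would run for Plugin itself as soon as the
            # class statement completes -- which is what dynamic class
            # decoration needs
            ...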
Another alternative would be to have the default __init_subclass__ look at
a class-level __decorators__ attribute, as originally discussed for PEP
422. That would solve *my* problem, but it feels too much like adding more
than One Way To Do It.

So... honestly, I'm not sure where to go from here. Is there any chance
that this is going to be changed, or revert to the PEP 422 approach, or...
something? If so, what Python version will the "something" be in? Or is
this use case just going to be a dead parrot in Python 3, period?

From ncoghlan at gmail.com  Thu Apr  2 04:39:01 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 2 Apr 2015 12:39:01 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On 2 April 2015 at 07:35, PJ Eby wrote:
> I recently got an inquiry from some of my users about porting to Python 3
> some of my libraries that make use of the Python 2 __metaclass__
> facility. While checking up on the status of PEP 422 today, I found out
> about its recently proposed replacement, PEP 487.
>
> While PEP 487 is a generally fine PEP, it actually *rules out* the
> specific use case that I wanted PEP 422 for in the first place: dynamic
> addition of callbacks or decorators for use at class creation time,
> without requiring explicit inheritance or metaclass participation. (So
> that e.g. method decorators can access the enclosing class at class
> definition time.)

How hard is the requirement against relying on a mixin class or class
decorator to request the defining-class-aware method decorator support? Is
the main concern the fact that failing to apply the right decorator/mixin
at the class level becomes a potentially silent failure, where the
class-aware method decorators aren't invoked properly?

> So... honestly, I'm not sure where to go from here. Is there any chance
> that this is going to be changed, or revert to the PEP 422 approach,
> or... something? If so, what Python version will the "something" be in?
> Or is this use case just going to be a dead parrot in Python 3, period?

My preference at this point would definitely be to introduce a mixin class
into the affected libraries and frameworks with an appropriate PEP 487
style __init_subclass__ that was a no-op in Python 2 (which would rely on
metaclass injection instead), but implemented the necessary "defining
class aware" method decorator support in Python 3.

The question of dynamically injecting additional base classes from the
class body, to allow the use of certain method decorators to imply
specific class-level behaviour, could then be addressed as a separate
proposal (e.g. making the case for an "__append_mixins__" attribute),
rather than being linked directly to the question of how we go about
defining inherited creation-time behaviour without needing a custom
metaclass.

Regards,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From pje at telecommunity.com  Thu Apr  2 08:38:17 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 02:38:17 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Wed, Apr 1, 2015 at 10:39 PM, Nick Coghlan wrote:
> On 2 April 2015 at 07:35, PJ Eby wrote:
>> I recently got an inquiry from some of my users about porting to Python
>> 3 some of my libraries that make use of the Python 2 __metaclass__
>> facility. While checking up on the status of PEP 422 today, I found out
>> about its recently proposed replacement, PEP 487.
>>
>> While PEP 487 is a generally fine PEP, it actually *rules out* the
>> specific use case that I wanted PEP 422 for in the first place: dynamic
>> addition of callbacks or decorators for use at class creation time,
>> without requiring explicit inheritance or metaclass participation. (So
>> that e.g. method decorators can access the enclosing class at class
>> definition time.)
>
> How hard is the requirement against relying on a mixin class or class
> decorator to request the defining-class-aware method decorator support?
> Is the main concern the fact that failing to apply the right
> decorator/mixin at the class level becomes a potentially silent failure,
> where the class-aware method decorators aren't invoked properly?

The concern is twofold: it breaks proper information hiding/DRY, *and* it
fails silently. It should not be necessary for clients of package A1 (that
uses a decorator built using package B2) to mix in a metaclass or
decorator from package C3 (because B2 implemented its decorators using
C3), just for package A1's decorator to work properly in the *client
package's class*. (And then, of course, this all silently breaks if you
forget, and the breakage might happen at the A1, B2, or C3 level.)

Without a way to hook into the class creation process, there is no way to
verify correctness and prevent the error from passing silently. (OTOH, if
there *is* a way to hook into the creation process, the problem is solved:
there's no need to mix anything in anyway, because the hook can do
whatever the mixin was supposed to do.)

The only way PEP 487 could be a solution is if the default
`object.__init_subclass__` supported one of the earlier __decorators__ or
__autodecorate__ proposals, or if the PEP were for an `__init_class__`
that operated on the defining class, instead of operating only on
subclasses. (I need to hook the creation of a class that's *being
defined*, not the definition of its future subclasses.)

> My preference at this point would definitely be to introduce a mixin
> class into the affected libraries and frameworks with an appropriate PEP
> 487 style __init_subclass__ that was a no-op in Python 2 (which would
> rely on metaclass injection instead), but implemented the necessary
> "defining class aware" method decorator support in Python 3.

If this were suitable for the use case, I'd have done it already.
DecoratorTools has had a mixin that provides a __class_init__ feature
since 2007, which could be ported to Python 3 in a straightforward manner
as a third-party module. (It's just a mixin that provides a metaclass;
under 3.x it could probably just be a plain metaclass with no mixin.)

> The question of dynamically injecting additional base classes from the
> class body, to allow the use of certain method decorators to imply
> specific class-level behaviour, could then be addressed as a separate
> proposal (e.g. making the case for an "__append_mixins__" attribute),
> rather than being linked directly to the question of how we go about
> defining inherited creation-time behaviour without needing a custom
> metaclass.

Then maybe we should do that first, since PEP 487 doesn't do anything you
can't *already* do with a mixin, all the way back to Python 2.2.

IOW, there's no need to modify the core just to have *that* feature, since
if you control the base class you can already do what PEP 487 does in
essentially every version of Python, ever. If that's all PEP 487 is going
to do, it should just be a PyPI package on a stdlib-inclusion track, not a
change to core Python.
It's not actually adding back any of the dynamicness (dynamicity?
hookability?) that PEP 3115 took away.

From ncoghlan at gmail.com  Thu Apr  2 10:46:11 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 2 Apr 2015 18:46:11 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On 2 April 2015 at 16:38, PJ Eby wrote:
>
> IOW, there's no need to modify the core just to have *that* feature,
> since if you control the base class you can already do what PEP 487 does
> in essentially every version of Python, ever. If that's all PEP 487 is
> going to do, it should just be a PyPI package on a stdlib-inclusion
> track, not a change to core Python.

The specific feature that PEP 487 is adding is the ability to customise
creation of subclasses without risking the introduction of a metaclass
conflict. That allows it to be used in situations where adopting any of
the existing metaclass-based mechanisms would require a potential
compatibility break (as well as being far more approachable as a mechanism
than the use of custom metaclasses).

The gap I agree this approach leaves is a final post-namespace-execution
step that supports establishing any class-level invariants implied by
decorators and other functions used in the class body. Python 2 allowed
that to be handled with a dynamically generated __metaclass__, and PEP 422
through __autodecorate__, while PEP 487 currently has no equivalent
mechanism.

Regards,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Thu Apr  2 12:25:01 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 2 Apr 2015 20:25:01 +1000
Subject: [Python-Dev] OpenBSD buildbot has many failures
In-Reply-To:
References: <778DE168-859E-4A7A-B9DE-2A49FF9EE7BA@discontinuity.net>
Message-ID:

On 1 April 2015 at 17:32, Victor Stinner wrote:
> Currently, there is an IRC bot on #python-dev which notifies when a
> buildbot's color changes. Or sometimes I check the huge waterfall page.
> By the way, it has become difficult to browse this page because there
> are too many offline buildbots.

The categorised links at https://www.python.org/dev/buildbot/ can be a bit
more usable (but the stable links naturally exclude the systems we've
never got working reliably in the first place).

An idea that's been kicking around for a while has been to start using
BuildBot's ephemeral test client support to kick off fresh VMs in
Rackspace for x86_64 testing on the OS versions available there, rather
than relying solely on donated test systems. Unfortunately there are
enough unknowns involved that we don't even know how hard it's likely to
be to set up, without someone being in a position to contribute the time
to start figuring it out.

Regards,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ethan at stoneleaf.us  Thu Apr  2 18:33:42 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 2 Apr 2015 09:33:42 -0700
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID: <20150402163341.GA7842@stoneleaf.us>

On Thu, Apr 02, 2015 at 06:46:11PM +1000, Nick Coghlan wrote:
> On 2 April 2015 at 16:38, PJ Eby wrote:
> > IOW, there's no need to modify the core just to have *that* feature,
> > since if you control the base class you can already do what PEP 487
> > does in essentially every version of Python, ever.
> > If that's all PEP 487 is going to do, it should just be a PyPI package
> > on a stdlib-inclusion track, not a change to core Python. It's not
> > actually adding back any of the dynamicness (dynamicity? hookability?)
> > that PEP 3115 took away.
>
> The specific feature that PEP 487 is adding is the ability to customise
> creation of subclasses without risking the introduction of a metaclass
> conflict. That allows it to be used in situations where adopting any of
> the existing metaclass-based mechanisms would require a potential
> compatibility break (as well as being far more approachable as a
> mechanism than the use of custom metaclasses).
>
> The gap I agree this approach leaves is a final post-namespace-execution
> step that supports establishing any class-level invariants implied by
> decorators and other functions used in the class body. Python 2 allowed
> that to be handled with a dynamically generated __metaclass__, and PEP
> 422 through __autodecorate__, while PEP 487 currently has no equivalent
> mechanism.

Perhaps PEP 422 should be back in the running then (possibly reduced in
scope, I haven't read it in a while), along with PEP 487, since they seem
to target different areas.

--
~Ethan~

From pje at telecommunity.com  Thu Apr  2 19:42:16 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 13:42:16 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Thu, Apr 2, 2015 at 4:46 AM, Nick Coghlan wrote:
> On 2 April 2015 at 16:38, PJ Eby wrote:
>>
>> IOW, there's no need to modify the core just to have *that* feature,
>> since if you control the base class you can already do what PEP 487
>> does in essentially every version of Python, ever. If that's all PEP
>> 487 is going to do, it should just be a PyPI package on a
>> stdlib-inclusion track, not a change to core Python. It's not actually
>> adding back any of the dynamicness (dynamicity? hookability?) that PEP
>> 3115 took away.
>
> The specific feature that PEP 487 is adding is the ability to customise
> creation of subclasses without risking the introduction of a metaclass
> conflict. That allows it to be used in situations where adopting any of
> the existing metaclass-based mechanisms would require a potential
> compatibility break

But metaclass conflicts are *also* fixable in end-user code, and have been
since 2.2. All you need to do is use a metaclass *function* that
automatically merges the metaclasses involved, which essentially amounts
to doing `class MergedMeta(base1.__class__, base2.__class__, ...)`.
(Indeed, I've had a library for doing just that since 2002; it originally
ran on Python 2.2.)

On Python 3, it's even easier to use that approach, because you can just
use something like `class whatever(base1, base2, metaclass=noconflict)`
whenever a conflict comes up. (And because the implementation wouldn't
have to deal with classic classes or __metaclass__, as my Python 2
implementation has to.)

IOW, *all* of PEP 487 is straightforward to implement in userspace as a
metaclass and a function that already exist off-the-shelf in Python 2...
and whose implementations would be simplified by porting them to Python 3
and dropping any extraneous features:

* http://svn.eby-sarna.com/PEAK/src/peak/util/Meta.py?view=markup
  (the `makeClass` function does what my hypothetical `noconflict` above
  does, with a slightly different API, and support for classic classes,
  __metaclass__, etc., that could all be stripped out)

* http://svn.eby-sarna.com/DecoratorTools/peak/util/decorators.py?view=markup
  (see the `classy_class` metaclass and `classy` mixin base that implement
  features similar to `__init_subclass__`, plus others that could be
  stripped out)

Basically, you can pull out those functions/classes (and whatever else
they use in those modules), port 'em to Python 3, make any API changes
deemed suitable, and call it a day. And the resulting code could go to a
stdlib metaclass utility module after a reasonable break-in period.

> (as well as being far more approachable as a mechanism than the use of
> custom metaclasses).

Sure, nobody's arguing that it's not a desirable feature. I *implemented*
that mechanism for Python 2 (eight years ago) because it's easier to use,
even for those of us who are fully versed in the dark metaclass arts. ;-)
Here's the documentation:

http://peak.telecommunity.com/DevCenter/DecoratorTools#meta-less-classes

So the feature doesn't even require *stdlib* adoption, let alone changes
to the Python core. (Heck, I wasn't even the first to implement this
feature: Zope had it for Python *1.5.2*, in their ExtensionClass.) It's a
totally solved problem in Python 2, although the solution is admittedly
not widely known. If the PEP 487 metaclass library, however, were to just
port some bits of my code to Python 3, this could be a done deal already
and available in *all* versions of Python 3, not just the next one.

> The gap I agree this approach leaves is a final post-namespace-execution
> step that supports establishing any class-level invariants implied by
> decorators and other functions used in the class body. Python 2 allowed
> that to be handled with a dynamically generated __metaclass__, and PEP
> 422 through __autodecorate__, while PEP 487 currently has no equivalent
> mechanism.

Right. And it's *only* having such a mechanism available by *default* that
requires a language change. Conversely, if we *are* making a language
change, then adding a hook that allows method decorators to access the
just-defined class provides roughly the same generality that Python 2 had
in this respect.

All I want is the ability for method decorators to find out what class
they were added to, at the time the class is built, rather than having to
wait for an access or invocation that may never come. This could be as
simple as __build_class__ or type.__call__ looking through the new class's
dictionary for objects with a `__used_in_class__(cls, name)` method, e.g.:

    for k, v in dict.items():
        if hasattr(v, '__used_in_class__'):
            v.__used_in_class__(cls, k)

This doesn't do what PEP 487 or 422 do, but it's the bare minimum for what
I need, and it actually allows this type of decorator to avoid any frame
inspection, because it can just add a __used_in_class__ attribute to the
functions being decorated.

(It actually also eliminates another common use for metaclasses and class
decorators, e.g. in ORMs and other structure-like classes that need
properties to know their names without duplicating code, as in
`x = field(int)` vs `x = field('x', int)`.
I have a fair amount of code that does this sort of thing, too, though
it's not in urgent need of porting to Python 3.)

From pje at telecommunity.com  Thu Apr  2 21:32:48 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 15:32:48 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
Message-ID:

On Thu, Apr 2, 2015 at 1:42 PM, PJ Eby wrote:
> If the PEP 487 metaclass library, however, were to just port some bits of
> my code to Python 3, this could be a done deal already and available in
> *all* versions of Python 3, not just the next one.

Just for the heck of it, here's an actual implementation and demo of PEP
487 that I've tested with 3.1, 3.2, and 3.4 (I didn't have a copy of 3.3
handy):

https://gist.github.com/pjeby/75ca26f8d2a7a0c68e30

The first module is just a demo that shows the features in use. The second
module is the implementation. Notice that the actual *functionality* of
PEP 487 is just *16 lines* in Python 3... including docstrings and an
`__all__` definition. ;-)

The other 90 lines of code are only there to implement the `noconflict`
feature for fixing metaclass conflicts... and quite a lot of *those* lines
are comments and docstrings. ;-)

Anyway, I think this demo is a knockout argument for why PEP 487 doesn't
need a language change: if you're writing an __init_subclass__ method, you
just include the `pep487.init_subclasses` base in your base classes, and
you're done. It'll silently fail if you leave it out (but you'll notice
that right away), and it *won't* fail in third-party subclasses because
the *third party* didn't include it.

In contrast, PEP 422 provided a way to have both the features contemplated
by 487 *and* a way to allow method-level decorators to discover the class
at class creation time. If there's going to be a language change, it
should include that latter feature from the outset.

From lkb.teichmann at gmail.com  Thu Apr  2 13:47:54 2015
From: lkb.teichmann at gmail.com (Martin Teichmann)
Date: Thu, 2 Apr 2015 13:47:54 +0200
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

Hi everyone,

for those new to the discussion, I am the author of PEP 487, which has
been discussed here:

https://mail.python.org/pipermail/python-ideas/2015-February/032249.html

> The concern is twofold: it breaks proper information hiding/DRY, *and*
> it fails silently. It should not be necessary for clients of package A1
> (that uses a decorator built using package B2) to mix in a metaclass or
> decorator from package C3 (because B2 implemented its decorators using
> C3), just for package A1's decorator to work properly in the *client
> package's class*. (And then, of course, this all silently breaks if you
> forget, and the breakage might happen at the A1, B2, or C3 level.)

I am just not capable of understanding things at such an abstract level;
would it be possible to give a simple example of what you did, and how,
in Python 2?

> IOW, there's no need to modify the core just to have *that* feature,
> since if you control the base class you can already do what PEP 487
> does in essentially every version of Python, ever. If that's all PEP
> 487 is going to do, it should just be a PyPI package on a
> stdlib-inclusion track, not a change to core Python. It's not actually
> adding back any of the dynamicness (dynamicity? hookability?) that PEP
> 3115 took away.

That was my point.
You can find the PyPI package at: https://pypi.python.org/pypi/metaclass

Greetings

Martin

From lkb.teichmann at gmail.com  Fri Apr  3 00:24:53 2015
From: lkb.teichmann at gmail.com (Martin Teichmann)
Date: Fri, 3 Apr 2015 00:24:53 +0200
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

Hi everyone,

> > If the PEP 487 metaclass library, however, were to just port some bits
> > of my code to Python 3, this could be a done deal already and
> > available in *all* versions of Python 3, not just the next one.
>
> Just for the heck of it, here's an actual implementation and demo of PEP
> 487 that I've tested with 3.1, 3.2, and 3.4 (I didn't have a copy of 3.3
> handy):

The implementation mentioned in PEP 487 itself even works on Python 2.7.

The whole point of PEP 487 was to reduce PEP 422 so much that it can be
written in Python and back-ported. This is also discussed in PEP 487.
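To illustrate, here is a minimal sketch of how such a back-portable
mechanism can be written in plain Python (a rough approximation for this
thread, not the actual reference implementation from the PEP or the PyPI
package):

    class SubclassInitMeta(type):
        def __new__(mcs, name, bases, ns, **kwargs):
            # strip class keyword arguments before they reach type.__new__
            return super().__new__(mcs, name, bases, ns)

        def __init__(cls, name, bases, ns, **kwargs):
            super().__init__(name, bases, ns)
            # look up __init_subclass__ on the nearest base; because it is
            # a classmethod, it gets the newly created class bound as cls,
            # so it fires for subclasses only
            hook = getattr(super(cls, cls), '__init_subclass__', None)
            if hook is not None:
                hook(**kwargs)

    class SubclassInit(metaclass=SubclassInitMeta):
        @classmethod
        def __init_subclass__(cls, **kwargs):
            pass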
An updated discussion of PEP 487 can be found here:

https://mail.python.org/pipermail/python-ideas/2015-March/032538.html

Now you want to be able to write decorators whose details are filled in at
class creation time. Currently this is typically done by a metaclass. With
PEP 487 in place, it can also be done using a mixin class.

Your point is that you want to be able to use your decorators without
having to ask users to also inherit from a specific class. I personally
don't think that's desirable. Many frameworks out there have such
decorators and mandatory base classes, and that works fine. The only
problem remains once you need to inherit from more than one of those
classes, as their metaclasses most likely clash. This is what PEP 487
fixes. It fixes this with just a handful of lines of Python code, but I
consider that shortness an advantage.

So my opinion is that it is not too hard a requirement to ask a user to
inherit from a specific mixin class for the sake of using a decorator.
What do other people think?

Greetings

Martin

From ncoghlan at gmail.com  Fri Apr  3 03:31:04 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 3 Apr 2015 11:31:04 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On 3 April 2015 at 08:24, Martin Teichmann wrote:
> [...]
>
> Your point is that you want to be able to use your decorators without
> having to ask users to also inherit from a specific class. I personally
> don't think that's desirable. Many frameworks out there have such
> decorators and mandatory base classes, and that works fine. The only
> problem remains once you need to inherit from more than one of those
> classes, as their metaclasses most likely clash. This is what PEP 487
> fixes. It fixes this with just a handful of lines of Python code, but I
> consider that shortness an advantage.
>
> So my opinion is that it is not too hard a requirement to ask a user to
> inherit from a specific mixin class for the sake of using a decorator.
> What do other people think?

If I'm understanding PJE's main concern correctly, it's that this approach
requires explicitly testing that the decorator has been applied correctly
in your automated tests every time you use it, as otherwise there's a risk
of a silent failure when you use the decorator but omit the mandatory base
class that makes the decorator work correctly.

I'm actually OK with the "mandatory base class" approach myself - it's the
way most Python frameworks operate, and it acts as a visible indicator of
the "concept space" a particular class is operating in (e.g. "Django
Form", "SQLAlchemy Model") rather than having that information be implicit
in the decorators being used.

However, I'm also now wondering if it may be possible to reach out to the
pylint authors (similar to what Brett did for the "pylint --py3k" flag)
and ask for a way to make it easy to register "base class, decorator"
pairs, where pylint will complain if it sees a particular method decorator
but can't determine at analysis time whether the named base class is in
the MRO for the class defining the method.

Regards,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From pje at telecommunity.com  Fri Apr  3 03:32:29 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 21:32:29 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Thu, Apr 2, 2015 at 6:24 PM, Martin Teichmann wrote:
> The whole point of PEP 487 was to reduce PEP 422 so much that it can be
> written in Python and back-ported.

As I said earlier, it's a fine feature and should be in the stdlib for
Python 3. (But it should have a `noconflict` feature added, and it doesn't
need a language change.)

However, since my specific use case was the one PEP 422 was originally
written to solve, and PEP 487 does not address that use case, it is not a
suitable substitute *for PEP 422*.

This is also not your fault; you didn't force Nick to withdraw it, after
all. ;-)

My main concern in this thread, however, is ensuring either that the use
case behind PEP 422 doesn't get dropped, or that Nick is now okay with me
implementing that feature by monkeypatching __build_class__. Since he
practically begged me not to do that in 2012, and IIRC *specifically
created* PEP 422 to provide an alternative way for me to accomplish this
*specific* use case, I wanted to see what his current take was.
(That is, did he forget the history of the PEP, or does he no longer care
about userspace code hooking __build_class__? Is there some other proposal
that would be a viable alternative? etc.)

> Now you want to be able to write decorators whose details are filled in
> at class creation time.

Not "now"; it's been possible to do this in Python 2 for over a decade,
and code that does so is in current use by other packages. The package
providing this feature (DecoratorTools) was downloaded 145 times today,
and 3274 times in the past month, so there is active, current use of it by
other Python 2 packages. (Though I don't know how many of them depend
directly or indirectly upon this particular feature.)

Currently, however, it is not possible to port this feature of
DecoratorTools (or any other package that uses that feature, recursively)
to Python 3, due to the removal of __metaclass__ and the lack of any
suitable substitute hook.

> Your point is that you want to be able to use your decorators without
> having to ask users to also inherit from a specific class. I personally
> don't think that's desirable. Many frameworks out there have such
> decorators and mandatory base classes, and that works fine.

The intended use case is for generic method decorators that have nothing
to do with the base class per se, so inheriting from a specific base class
is an anti-feature in this case.

> The only problem remains once you need to inherit from more than one of
> those classes, as their metaclasses most likely clash. This is what PEP
> 487 fixes.

No, it addresses the issue for certain *specific* metaclass use cases. It
does not solve the problem of metaclass conflicts in general; for that you
need something like the sample `noconflict` code I posted, which works for
Python 3.1+ and doesn't require a language change.

> So my opinion is that it is not too hard a requirement to ask a user to
> inherit from a specific mixin class for the sake of using a decorator.

If this logic were applied to PEP 487 as it currently stands, the PEP
should be rejected, since its use case is even *more* easily accomplished
by inheriting from a specific mixin class. (Since the feature only works
on subclasses anyway!)

Further, if the claim is that metaclass conflict potential makes PEP 487
worthy of a language change, then by the same logic method decorators are
just as worthy of a language change, since any mixin required to use a
method decorator would be *just as susceptible* to metaclass conflicts as
SubclassInit. (Notably, the stdlib's ABCMeta is a common cause of
metaclass conflicts in Python 2.6+ -- if you mix in anything that
implements an ABC by subclassing it, you will get a metaclass conflict.)

Finally, I of course disagree with the conclusion that it's okay to
require mixins in order for method decorators to access the containing
class, since it is not a requirement in Python 2, due to the availability
of the __metaclass__ hook. Further, PEP 422 was previously approved to fix
this problem, and has a patch in progress, so I'm understandably upset by
its sudden withdrawal and the lack of a suitable replacement.

So personally, I think that PEP 422 should be un-withdrawn (or replaced
with something else), and PEP 487 should be retargeted towards defining a
`metaclass` module for the stdlib, including a `noconflict` implementation
to address metaclass conflict issues. (Mine or someone else's, as long as
it works.) PEP 487 should not be a proposal to change the language, as the
provided features don't require it.
(And it definitely shouldn't pre-empt a separately useful feature that
*does* require a language change.)

At this point, though, I mostly just want to get some kind of closure.
After three years, I'd like to know whether this is a yea or a nay, so I
can port the thing and move on, whether it's through a standardized
mechanism or ugly monkeypatching. Honestly, the only reason I'm even
discussing this in the first place is because 1) Nick pleaded with me
three years ago not to hold off porting until a standardized way of doing
this could be added to the language, and 2) I've had some recent inquiries
from users about porting PEAK-Rules (which uses this particular feature of
DecoratorTools) to Python 3. So I went to check on PEP 422's status, and
here we are.

From tritium-list at sdamon.com  Fri Apr  3 03:37:10 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Thu, 02 Apr 2015 21:37:10 -0400
Subject: [Python-Dev] version of freshly built 2.7 python
In-Reply-To: <20150403012929.GA8957@stoneleaf.us>
References: <20150403012929.GA8957@stoneleaf.us>
Message-ID: <551DEEC6.30702@sdamon.com>

Are you building from Mercurial or a source tarball?

On 4/2/2015 21:29, Ethan Furman wrote:
> I just built the latest version of Python 2.7 on my development machine
> -- or so I thought. When I invoke it, I get:
>
> Python 2.7.6+ (2.7:1beb3e0507fa, Apr 2 2015, 17:57:53)
>
> Why am I not seeing 2.7.9?
>
> --
> ~Ethan~

From zachary.ware+pydev at gmail.com  Fri Apr  3 04:17:46 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Thu, 2 Apr 2015 21:17:46 -0500
Subject: [Python-Dev] version of freshly built 2.7 python
In-Reply-To: <20150403012929.GA8957@stoneleaf.us>
References: <20150403012929.GA8957@stoneleaf.us>
Message-ID:

On Thursday, April 2, 2015, Ethan Furman wrote:
> I just built the latest version of Python 2.7 on my development machine
> -- or so I thought. When I invoke it, I get:
>
> Python 2.7.6+ (2.7:1beb3e0507fa, Apr 2 2015, 17:57:53)
>
> Why am I not seeing 2.7.9?

https://hg.python.org/cpython/rev/1beb3e0507fa/

I'd say update and try again. Taking a wild guess as to why you're on the
wrong revision: if you use the hg 'share' extension to keep separate
working copies of each branch, remember that 'hg pull --update' doesn't
update if you already have all changesets from the server due to a pull in
another 'shared' copy.

Hope this helps,

--
Zach

--
Sent from Gmail Mobile

From greg.ewing at canterbury.ac.nz  Fri Apr  3 04:29:31 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2015 15:29:31 +1300
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID: <551DFB0B.7060207@canterbury.ac.nz>

On 04/03/2015 02:31 PM, Nick Coghlan wrote:
> If I'm understanding PJE's main concern correctly, it's that this
> approach requires explicitly testing that the decorator has been applied
> correctly in your automated tests every time you use it, as otherwise
> there's a risk of a silent failure when you use the decorator but omit
> the mandatory base class that makes the decorator work correctly.

Could the decorator be designed to detect that situation somehow? E.g.
the first time the decorated method is called, check that the required
base class is present.

--
Greg

From pje at telecommunity.com  Fri Apr  3 05:03:21 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 23:03:21 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: <551DFB0B.7060207@canterbury.ac.nz>
References: <551DFB0B.7060207@canterbury.ac.nz>
Message-ID:

On Thu, Apr 2, 2015 at 10:29 PM, Greg Ewing wrote:
> On 04/03/2015 02:31 PM, Nick Coghlan wrote:
>>
>> If I'm understanding PJE's main concern correctly, it's that this
>> approach requires explicitly testing that the decorator has been
>> applied correctly in your automated tests every time you use it, as
>> otherwise there's a risk of a silent failure when you use the decorator
>> but omit the mandatory base class that makes the decorator work
>> correctly.
>
> Could the decorator be designed to detect that situation somehow? E.g.
> the first time the decorated method is called, check that the required
> base class is present.

No, because in the most relevant use case, the method will never be called
if the base class isn't present. For more details, see also the previous
discussion at

https://mail.python.org/pipermail/python-dev/2012-June/119883.html

From ethan at stoneleaf.us  Fri Apr  3 05:05:19 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 2 Apr 2015 20:05:19 -0700
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: <551DFB0B.7060207@canterbury.ac.nz>
References: <551DFB0B.7060207@canterbury.ac.nz>
Message-ID: <20150403030519.GB8957@stoneleaf.us>

On 04/03, Greg Ewing wrote:
> On 04/03/2015 02:31 PM, Nick Coghlan wrote:
>> [...]
>
> Could the decorator be designed to detect that situation somehow? E.g.
> the first time the decorated method is called, check that the required
> base class is present.

That feels like a horrible workaround. The proper place for setup code is
in the setup.

--
~Ethan

From pje at telecommunity.com  Fri Apr  3 05:07:49 2015
From: pje at telecommunity.com (PJ Eby)
Date: Thu, 2 Apr 2015 23:07:49 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Thu, Apr 2, 2015 at 9:31 PM, Nick Coghlan wrote:
> On 3 April 2015 at 08:24, Martin Teichmann wrote:
> However, I'm also now wondering if it may be possible to reach out to
> the pylint authors (similar to what Brett did for the "pylint --py3k"
> flag) and ask for a way to make it easy to register "base class,
> decorator" pairs, where pylint will complain if it sees a particular
> method decorator but can't determine at analysis time whether the named
> base class is in the MRO for the class defining the method.

Will it *also* check the calling chain of the decorator, or any other
thing that's called or invoked in the class body, to find out if
somewhere, somehow, it asks for a class decoration? If not, it's not going
to help with this use case.

There are many ways to solve this problem by re-adding a hook -- you and I
have proposed several, in 2012 and now.
There are none, however, which do not involve putting back the hookability that Python 3 took out, except by using hacks like sys.settrace() or monkeypatching __build_class__.

From ethan at stoneleaf.us  Fri Apr  3 05:19:09 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 2 Apr 2015 20:19:09 -0700
Subject: [Python-Dev] version of freshly built 2.7 python
In-Reply-To: <551DEEC6.30702@sdamon.com>
References: <20150403012929.GA8957@stoneleaf.us> <551DEEC6.30702@sdamon.com>
Message-ID: <20150403031909.GA9120@stoneleaf.us>

On 04/02, Alexander Walters wrote:
> On 4/2/2015 21:29, Ethan Furman wrote:
>>
>> I just built the latest version of Python 2.7 on my development machine -- or so I thought.  When I invoke it, I get:
>>
>> Python 2.7.6+ (2.7:1beb3e0507fa, Apr  2 2015, 17:57:53)
>>
>> Why am I not seeing 2.7.9?
>
> Are you building from mercurial or a source tarball?

Mercurial:

ethan at code:~/source/python/python2.7$ hg parent
changeset:   90450:1beb3e0507fa
branch:      2.7
parent:      90434:b428b803f71f
user:        Zachary Ware
date:        Thu Apr 24 13:20:27 2014 -0500
files:       Lib/test/test_itertools.py
description:
Issue #21346: Fix typos in test_itertools.  Patch by Brian Kearns.

--
~Ethan~

From brian at python.org  Fri Apr  3 05:27:02 2015
From: brian at python.org (Brian Curtin)
Date: Thu, 2 Apr 2015 22:27:02 -0500
Subject: [Python-Dev] version of freshly built 2.7 python
In-Reply-To: <20150403031909.GA9120@stoneleaf.us>
References: <20150403012929.GA8957@stoneleaf.us> <551DEEC6.30702@sdamon.com> <20150403031909.GA9120@stoneleaf.us>
Message-ID: 

On Thu, Apr 2, 2015 at 10:19 PM, Ethan Furman wrote:
> On 04/02, Alexander Walters wrote:
>> On 4/2/2015 21:29, Ethan Furman wrote:
>>>
>>> I just built the latest version of Python 2.7 on my development machine -- or so I thought.  When I invoke it, I get:
>>>
>>> Python 2.7.6+ (2.7:1beb3e0507fa, Apr  2 2015, 17:57:53)
>>>
>>> Why am I not seeing 2.7.9?
>>
>> Are you building from mercurial or a source tarball?
>
> Mercurial:
>
> ethan at code:~/source/python/python2.7$ hg parent
> changeset:   90450:1beb3e0507fa
> branch:      2.7
> parent:      90434:b428b803f71f
> user:        Zachary Ware
> date:        Thu Apr 24 13:20:27 2014 -0500
> files:       Lib/test/test_itertools.py
> description:
> Issue #21346: Fix typos in test_itertools.  Patch by Brian Kearns.

That's almost a year old. Update it?
A fresh pull only garnered 13 new commits, and an hg update in the 2.7 directory changed nothing. I even tried updating to default, then back to 2.7, but I get the same parent.

hg branches shows

default                    90453:d84a69b7ba72
2.7                        90450:1beb3e0507fa
3.1                        90264:c7b93519807a
2.6                        90100:23a60d89dbd4
3.4                        90451:901b9afc918e (inactive)
3.3                        90266:a8445ead2f9e (inactive)
3.2                        90265:0a7d4cdc4c8d (inactive)

hmmm... those numbers are about 5000 off!

Okay, deleting and recloning...

Okay, the revisions are matching up now -- don't know what happened, but nothing a simple nuke couldn't solve!

Now-wondering-if-I-ever-did-an-hg-pull'ly yrs,

~Ethan~

From ncoghlan at gmail.com  Fri Apr  3 10:21:10 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 3 Apr 2015 18:21:10 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: 
References: 
Message-ID: 

On 3 April 2015 at 11:32, PJ Eby wrote:
> On Thu, Apr 2, 2015 at 6:24 PM, Martin Teichmann wrote:
>> The whole point of PEP 487 was to reduce PEP 422 so much that
>> it can be written in python and back-ported.
>
> As I said earlier, it's a fine feature and should be in the stdlib for
> Python 3.  (But it should have a `noconflict` feature added, and it
> doesn't need a language change.)
>
> However, since my specific use case was the one PEP 422 was originally
> written to solve, and PEP 487 does not address that use case, it is
> not a suitable substitute *for PEP 422*.
>
> This is also not your fault; you didn't force Nick to withdraw it,
> after all.  ;-)
>
> My main concern in this thread, however, is ensuring that either the
> use case behind PEP 422 doesn't get dropped, or that Nick is now okay
> with me implementing that feature by monkeypatching __build_class__.
> Since he practically begged me not to do that in 2012, and IIRC
> *specifically created* PEP 422 to provide an alternative way for me to
> accomplish this *specific* use case, I wanted to see what his current
> take was.  (That is, did he forget the history of the PEP, or does he
> no longer care about userspace code hooking __build_class__?  Is there
> some other proposal that would be a viable alternative?  etc.)

It's actually both - I'd forgotten the origins of PEP 422, *and* I've come around to the view that this level of implicit behaviour is a bad idea because it's inherently confusing. The various incarnations of PEP 422 ended up remaining at least somewhat confusing for *me* and I wrote it. The "what happens when?" story in PEP 487 is much simpler.

That means I'm now OK with monkeypatching __build_class__ being the only way to get dynamic hooking of the class currently being defined from the class body - folks that really want that behaviour can monkeypatch it in, while folks that think it's a bad idea don't need to worry about it.

>> Now you want to be able to write decorators whose details
>> are filled in at class creation time.
>
> Not "now"; it's been possible to do this in Python 2 for over a
> decade, and code that does so is in current use by other packages.
> The package providing this feature (DecoratorTools) was downloaded 145
> times today, and 3274 times in the past month, so there is active,
> current use of it by other Python 2 packages.  (Though I don't know
> how many of them depend directly or indirectly upon this particular
> feature.)
>
> Currently, however, it is not possible to port this feature of
> DecoratorTools (or any other package that uses that feature,
> recursively) to Python 3, due to the removal of __metaclass__ and the
> lack of any suitable substitute hook.

Right, any post-namespace-execution modification of class definitions in Python 3 currently has to be declared in the class definition header, either as an explicit decorator, as a metaclass declaration, or through inheritance from a particular base class. PEP 487 doesn't change that, it just adds a way for the last case to work without needing a custom metaclass in the picture.

PEP 422 *did* change that, by providing a way for the namespace itself to register a post-execution hook, akin to one of the ways __metaclass__ could be used in Python 2.

What's changed since I first wrote PEP 422 is that I've come to view the requirement to be explicit in Python 3 as a gain for readability, rather than as a limitation to be eliminated - there will always be *some* hint in the class header that post-namespace-execution modifications may be happening, even if it's just having a declared base class other than object.

>> Your point is that you want to be able to use your decorators
>> without having to ask users to also inherit a specific class.
>> I personally don't think that's desirable. Many frameworks out
>> there have such kind of decorators and mandatory base classes
>> and that works fine.
>
> The intended use case is for generic method decorators that have
> nothing to do with the base class per se, so inheriting from a
> specific base class is an anti-feature in this case.

If that's the use case, it would be preferable to explore enhancing the type-based dispatch capabilities in functools as a follow-up to Guido's work on type hinting enhancements.

>> The only problem remains once you need to
>> inherit more than one of those classes, as their metaclasses
>> most likely clash. This is what PEP 487 fixes.
>
> No, it addresses the issue for certain *specific* metaclass use cases.
> It does not solve the problem of metaclass conflict in general; for
> that you need something like the sample `noconflict` code I posted,
> which works for Python 3.1+ and doesn't require a language change.

Neither PEP 422 nor 487 is designed to eliminate metaclass conflicts in general; they're primarily designed to let base classes run arbitrary code after the namespace has been executed in a subclass definition *without* needing a custom metaclass. They both introduce a new tier in the metaprogramming hierarchy between explicit class decorators and full custom metaclasses.

>> So my opinion is that it is not too hard a requirement to ask
>> a user to inherit a specific mixin class for the sake of using
>> a decorator.
>
> If this logic were applied to PEP 487 as it currently stands, the PEP
> should be rejected, since its use case is even *more* easily
> accomplished by inheriting from a specific mixin class.  (Since the
> feature only works on subclasses anyway!)

No, you can't do it currently without risking a backwards incompatibility through the introduction of a custom metaclass.

> Further, if the claim is that metaclass conflict potential makes PEP
> 487 worthy of a language change, then by the same logic method
> decorators are just as worthy of a language change, since any mixin
> required to use a method decorator would be *just as susceptible* to
> metaclass conflicts as SubclassInit.
There wouldn't be a custom metaclass involved in the native implementation of PEP 487, only in the backport.

> Finally, I of course disagree with the conclusion that it's okay to
> require mixins in order for method decorators to access the containing
> class, since it is not a requirement in Python 2, due to the
> availability of the __metaclass__ hook.

Right, that's the crux of the disagreement: is the fact that you can inject a postprocessor into a Python 2 class namespace while it is being executed a good thing, or does it inherently lead to surprising code where it isn't clear where the postprocessing injection happened?

When I first wrote PEP 422 I was of the view that "Python 2 allows class definition postprocessing injection, we should allow it in Python 3 as well". I've since changed my view to "Having to declare post-processing of a class definition up front as a decorator, base class or metaclass is a good thing for readability, as otherwise there's nothing obvious when reading a class definition that tells you whether or not postprocessing may happen, so you have to assume it's possible for *every* class definition".

> Further, PEP 422 was
> previously approved to fix this problem, and has a patch in progress,
> so I'm understandably upset by its sudden withdrawal and lack of
> suitable replacement.

PEP 422 has never been approved - it's only ever been Draft or Deferred (and now Withdrawn). If you would like to take it over and champion it as a competitor to PEP 487, I'd be fine with that, I just no longer think it's a good idea myself.

> At this point, though, I mostly just want to get some kind of closure.
> After three years, I'd like to know if this is a yea or nay, so I can
> port the thing and move on, whether it's through a standardized
> mechanism or ugly monkeypatching.  Honestly, the only reason I'm even
> discussing this in the first place is because 1) Nick pleaded with me
> three years ago to hold off porting until a standardized way of
> doing this could be added to the language,

My apologies for leading you up the garden path - back then I genuinely thought bringing back the capability was a good idea, but the combination of the original deferral to process the previous lot of feedback, and the more recent discussions with Martin that led to the creation of PEP 487 and the withdrawal of PEP 422, changed my mind.

> and 2) I've had some recent
> inquiries from users about porting PEAK-Rules (which uses this
> particular feature of DecoratorTools) to Python 3.  So I went to check
> on PEP 422's status, and here we are.

Given my change of heart, I believe that at this point, if you were willing to champion a revived PEP 422 that implemented the behaviour you're after, that would be ideal, with monkeypatching the desired behaviour in as a fallback plan if the PEP is still ultimately rejected. Alternatively, you could go the monkeypatching path first, and then potentially seek standardisation later after you've had some practical experience with it - I now consider it an orthogonal capability to the feature in PEP 487, so the acceptance of the latter wouldn't necessarily preclude acceptance of a hook for class postprocessing injection.

Regards,
Nick.
-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com  Fri Apr  3 10:26:50 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 3 Apr 2015 18:26:50 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: 
References: 
Message-ID: 

On 3 April 2015 at 18:21, Nick Coghlan wrote:
> That means I'm now OK with monkeypatching __build_class__ being the
> only way to get dynamic hooking of the class currently being defined
> from the class body - folks that really want that behaviour can
> monkeypatch it in, while folks that think it's a bad idea don't need
> to worry about it.
[snip]
> PEP 422 has never been approved - it's only ever been Draft or
> Deferred (and now Withdrawn). If you would like to take it over and
> champion it as a competitor to PEP 487, I'd be fine with that, I just
> no longer think it's a good idea myself.
[snip]
> Given my change of heart, I believe that at this point, if you were
> willing to champion a revived PEP 422 that implemented the behaviour
> you're after, that would be ideal, with monkeypatching the desired
> behaviour in as a fallback plan if the PEP is still ultimately
> rejected. Alternatively, you could go the monkeypatching path first,
> and then potentially seek standardisation later after you've had some
> practical experience with it - I now consider it an orthogonal
> capability to the feature in PEP 487, so the acceptance of the latter
> wouldn't necessarily preclude acceptance of a hook for class
> postprocessing injection.

Heh, editing fail. Writing out this post in full made me realise the last of these options was a potentially reasonable path forward, but I didn't go back and correct the earlier sections where I was viewing the situation differently.

So if the post reads as self-contradictory, that's because it is, but the last option is the one that best reflects my current thinking :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From larry at hastings.org  Fri Apr  3 11:56:53 2015
From: larry at hastings.org (Larry Hastings)
Date: Fri, 03 Apr 2015 02:56:53 -0700
Subject: [Python-Dev] Do we need to sign Windows files with GnuPG?
Message-ID: <551E63E5.6080805@hastings.org>

As of Python 3.5 Steve Dower has taken over the Windows builds of Python from Martin von Löwis. He's also taken over for 2.7--though Martin's still doing builds for 3.4. For both versions, Steve is using all-new tooling for the build process. The output is different, too; he's producing .exe installers instead of .msi installers, and he has snazzy new "web-based" installers where the initial download is small, then it downloads the rest dynamically.

Steve's also changed the authentication process. His new installers rely on a Windows digital signature technology called Authenticode where the signature is built right into the .exe file. Windows platforms will automatically authenticate executables signed with Authenticode, so this is both secure and convenient.

Martin's build process also digitally signed the files he built, but not using Authenticode (or at least I don't think so). Like the Mac and source code releases, his automation used GnuPG to produce separate ".asc" files containing digital signatures. This meant authentication was a manual process.

The Authenticode approach sounds great. But there are advantages to the GnuPG approach too:

* Using GnuPG means we can authenticate the files from any platform, not just Windows.
If there were a security breach on the Python content delivery network, any developer could get GnuPG for their platform and authenticate that the installers are unmodified. If we use Authenticode alone, we lose that ability.

* GnuPG is agnostic about the data it digitally signs. So, for example, Martin's build process digitally signs the Windows help file--the ".chm" file--produced by his build process. The help file Steve builds is currently completely unsigned; Steve says he can try signing it but he's not sure it'll work. Note that .chm files actually /can/ contain live code, so this is at least a plausible vector for attack.

My Windows development days are firmly behind me. So I don't really have an opinion here. So I put it to you, Windows Python developers: do you care about GnuPG signatures on Windows-specific files? Or do you not care?

//arry/

p.s. And, of course, my thanks to both Steve and Martin for their past and continuing service to the Python community! It's a pleasure working with each of them. (Both of them? I forget how English works.)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mal at egenix.com  Fri Apr  3 12:47:58 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 03 Apr 2015 12:47:58 +0200
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: <551E63E5.6080805@hastings.org>
References: <551E63E5.6080805@hastings.org>
Message-ID: <551E6FDE.5090801@egenix.com>

On 03.04.2015 11:56, Larry Hastings wrote:
> My Windows development days are firmly behind me.  So I don't really have an opinion here.  So I put
> it to you, Windows Python developers: do you care about GnuPG signatures on Windows-specific files?
> Or do you not care?

Regardless of target platform, I firmly believe we should (continue to) GPG sign all distribution files as well as provide hash files/values for them. This is very useful to detect corrupted downloads or files which were not created by the original packagers.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From martin at martindengler.com  Fri Apr  3 12:58:07 2015
From: martin at martindengler.com (Martin Dengler)
Date: Fri, 3 Apr 2015 11:58:07 +0100
Subject: [Python-Dev] Do we need to sign Windows files with GnuPG?
In-Reply-To: <551E63E5.6080805@hastings.org>
References: <551E63E5.6080805@hastings.org>
Message-ID: <20150403105807.GR17467@edge.thecloud.net>

On Fri, Apr 03, 2015 at 02:56:53AM -0700, Larry Hastings wrote:
> So I put it to you, Windows Python developers: do you care
> about GnuPG signatures on Windows-specific files?  Or do you not care?

Developer using Python on Windows here. I care, yes. It's valuable and significant to be able to authenticate the files on different platforms (your GnuPG advantage #1), and using the same tools. It's also useful to have a very mature tool to do that. And valuable that all files can be so validated, not just the .exes.
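(For the record, the check itself is a one-liner. With the release manager's key already imported, it is something like:

    gpg --verify python-3.4.3.msi.asc python-3.4.3.msi

where the file names are just an example.)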
> //arry/

Martin
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 181 bytes
Desc: not available
URL: 

From victor.stinner at gmail.com  Fri Apr  3 13:56:44 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Fri, 3 Apr 2015 13:56:44 +0200
Subject: [Python-Dev] Socket timeout: reset timeout at each successful syscall?
Message-ID: 

Hi,

I reworked the socket and ssl modules to better handle signals (to implement PEP 475, retry on EINTR). These changes require recomputing the timeout, because syscalls are called in a loop until they no longer fail with EINTR (or EWOULDBLOCK or EAGAIN). Most socket methods exit when the underlying syscall succeeds.

The problem is that the socket.sendall() method may require multiple syscalls. In this case, does the timeout count for the total time or only for a single syscall? Asked differently: should we reset the timeout each time a syscall succeeds?

Let's say that a server limits the bandwidth to 10 MB per second per client (or per connection). If I want to send 1000 MB, the request will take 100 seconds. Do you expect a timeout exception on calling sendall() with a timeout of 60 seconds? Each call to send() may succeed in less than 60 seconds, for example each send() may send 1 MB in one second.

In the socket documentation, I understand that the socket timeout is the total duration of an operation, not the maximum duration of a single syscall:

https://docs.python.org/dev/library/socket.html#socket-timeouts
"In timeout mode, operations fail if they cannot be completed within the timeout specified for the socket (they raise a timeout exception) or if the system returns an error."

In Python 2.7, 3.4 and 3.5, socket.sendall() resets the timeout after each send() success.

We had similar questions in the asyncio module, especially for StreamReader methods which can require multiple reads. I propose a patch to add a timeout parameter to StreamReader: it resets the timeout after each successful read. It's already possible to put a global timeout on any asyncio operation. StreamReader timeout patch:
https://bugs.python.org/issue23236

Note: there is also an open issue to add a socket.recvall() method which would raise similar questions on timeout:
https://bugs.python.org/issue1103213

Victor

From victor.stinner at gmail.com  Fri Apr  3 14:21:48 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Fri, 3 Apr 2015 14:21:48 +0200
Subject: [Python-Dev] Socket timeout: reset timeout at each successful syscall?
In-Reply-To: 
References: 
Message-ID: 

Oh, I forgot to explain that the problem is even more important in the SSL module. In SSL, a single "recv()" may need to *send* data. The SSL protocol is more complex and may require multiple OpenSSL calls, which imply multiple send/recv syscalls.

I wrote a patch for the ssl module to implement PEP 475: recompute the timeout if the method (ex: select()) is interrupted by a signal.
https://bugs.python.org/issue23853

My patch uses a global timeout; it doesn't reset the timeout at each successful OpenSSL function call. And so it changes the behaviour compared to Python 3.4. If we want to keep the same behaviour, we can always reset the timeout except if a function was interrupted by a signal.

I opened the issue because the handshake() hangs if I send a signal every millisecond, because the timeout is not recomputed. So select() is called in a loop, always with the same timeout, and it always fails with EINTR.
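To make it concrete, here is a rough Python sketch of the deadline-based retry loop I mean (the real implementation is in C; sock_call() in socketmodule.c does something similar, and the names here are only illustrative):

import select
import socket
import time

def wait_readable(sock, timeout):
    # Compute an absolute deadline once, so that signals arriving in the
    # middle cannot extend the total allowed time.
    deadline = time.monotonic() + timeout
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            raise socket.timeout("timed out")
        try:
            readable, _, _ = select.select([sock], [], [], remaining)
        except InterruptedError:
            # EINTR: recompute the remaining time and retry
            continue
        if readable:
            return
        raise socket.timeout("timed out")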
Victor

2015-04-03 13:56 GMT+02:00 Victor Stinner :
> Hi,
>
> I reworked the socket and ssl modules to better handle signals (to
> implement PEP 475, retry on EINTR). These changes require recomputing
> the timeout, because syscalls are called in a loop until they no longer
> fail with EINTR (or EWOULDBLOCK or EAGAIN). Most socket methods exit
> when the underlying syscall succeeds.
>
> The problem is that the socket.sendall() method may require multiple
> syscalls. In this case, does the timeout count for the total time or
> only for a single syscall? Asked differently: should we reset the
> timeout each time a syscall succeeds?
>
> Let's say that a server limits the bandwidth to 10 MB per second per
> client (or per connection). If I want to send 1000 MB, the request
> will take 100 seconds. Do you expect a timeout exception on calling
> sendall() with a timeout of 60 seconds? Each call to send() may
> succeed in less than 60 seconds, for example each send() may send 1 MB
> in one second.
>
> In the socket documentation, I understand that the socket timeout is
> the total duration of an operation, not the maximum duration of a
> single syscall:
>
> https://docs.python.org/dev/library/socket.html#socket-timeouts
> "In timeout mode, operations fail if they cannot be completed within
> the timeout specified for the socket (they raise a timeout exception)
> or if the system returns an error."
>
> In Python 2.7, 3.4 and 3.5, socket.sendall() resets the timeout after
> each send() success.
>
> We had similar questions in the asyncio module, especially for
> StreamReader methods which can require multiple reads. I propose a
> patch to add a timeout parameter to StreamReader: it resets the
> timeout after each successful read. It's already possible to put a
> global timeout on any asyncio operation. StreamReader timeout patch:
> https://bugs.python.org/issue23236
>
> Note: there is also an open issue to add a socket.recvall() method
> which would raise similar questions on timeout:
> https://bugs.python.org/issue1103213
>
> Victor

From p.f.moore at gmail.com  Fri Apr  3 14:25:25 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 3 Apr 2015 13:25:25 +0100
Subject: [Python-Dev] Do we need to sign Windows files with GnuPG?
In-Reply-To: <551E63E5.6080805@hastings.org>
References: <551E63E5.6080805@hastings.org>
Message-ID: 

On 3 April 2015 at 10:56, Larry Hastings wrote:
> My Windows development days are firmly behind me.  So I don't really have an
> opinion here.  So I put it to you, Windows Python developers: do you care
> about GnuPG signatures on Windows-specific files?  Or do you not care?

I don't have a very strong security background, so take my views with a pinch of salt, but I see Authenticode as a way of being sure that what I *run* is "OK". Whereas a GPG signature lets me check that the content of a file is as intended. So there are benefits to both, and I think we should continue to provide GPG signatures. (Disclaimer: I've never in my life actually *checked* a GPG signature for a file...)

Paul

From barry at python.org  Fri Apr  3 15:33:51 2015
From: barry at python.org (Barry Warsaw)
Date: Fri, 3 Apr 2015 09:33:51 -0400
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: <551E63E5.6080805@hastings.org>
References: <551E63E5.6080805@hastings.org>
Message-ID: <20150403093351.71ff9751@limelight.wooz.org>

On Apr 03, 2015, at 02:56 AM, Larry Hastings wrote:

>My Windows development days are firmly behind me.  So I don't really have an
>opinion here.
>So I put it to you, Windows Python developers: do you care
>about GnuPG signatures on Windows-specific files?  Or do you not care?

They're not mutually exclusive, so why not do both? I think the advantage of being able to verify the files on any platform is useful.

Cheers,
-Barry

From brian at python.org  Fri Apr  3 15:44:36 2015
From: brian at python.org (Brian Curtin)
Date: Fri, 3 Apr 2015 08:44:36 -0500
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: 
References: <551E63E5.6080805@hastings.org>
Message-ID: 

On Fri, Apr 3, 2015 at 7:25 AM, Paul Moore wrote:
> On 3 April 2015 at 10:56, Larry Hastings wrote:
>> My Windows development days are firmly behind me.  So I don't really have an
>> opinion here.  So I put it to you, Windows Python developers: do you care
>> about GnuPG signatures on Windows-specific files?  Or do you not care?
>
> I don't have a very strong security background, so take my views with
> a pinch of salt, but I see Authenticode as a way of being sure that
> what I *run* is "OK". Whereas a GPG signature lets me check that the
> content of a file is as intended. So there are benefits to both, and I
> think we should continue to provide GPG signatures. (Disclaimer: I've
> never in my life actually *checked* a GPG signature for a file...)

I haven't been on Windows in a bit, but this is my understanding/expectation as well.

From narsu1984 at foxmail.com  Fri Apr  3 11:51:11 2015
From: narsu1984 at foxmail.com (=?utf-8?B?TmFyc3U=?=)
Date: Fri, 3 Apr 2015 17:51:11 +0800
Subject: [Python-Dev] Installing Python from the source code on Windows
Message-ID: 

Hi Python,

I'm working on a game project, using C++ as the main language and Python as the scripting language. I've built Python from the source code on Windows, and when I invoked the method Py_Initialize, my program exited. After some tests I realized that as long as I move the Python-2.7.9/Lib directory into my program's directory, it works fine.

Please help me out. Thank you for your time.

Narsu
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From lkb.teichmann at gmail.com  Fri Apr  3 14:44:18 2015
From: lkb.teichmann at gmail.com (Martin Teichmann)
Date: Fri, 3 Apr 2015 14:44:18 +0200
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: 
References: 
Message-ID: 

> When I first wrote PEP 422 I was of the view that "Python 2 allows
> class definition postprocessing injection, we should allow it in
> Python 3 as well". I've since changed my view to "Having to declare
> post-processing of a class definition up front as a decorator, base
> class or metaclass is a good thing for readability, as otherwise
> there's nothing obvious when reading a class definition that tells you
> whether or not postprocessing may happen, so you have to assume it's
> possible for *every* class definition".

Nick, I couldn't agree more with you, yet I think PJ actually brought up a very interesting point. Post-processing is a very common thing these days, and has been re-written so many times that I think it is about time that something like it should be in the standard library.

I'm less thinking about decorated methods, more about descriptors. They always have the problem that they don't know which attribute they belong to, so every author of a framework that defines descriptors writes a metaclass which goes through all the descriptors and tells them their attribute name. I propose to have some metaclass in the standard library that does that.
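To make the problem concrete, here is a toy example of the kind of descriptor I mean; the hook name matches the proposal below, everything else is made up:

class Typed:
    """A descriptor that needs to know the attribute name it is bound to."""
    def __init__(self, kind):
        self.kind = kind
        self.name = None   # filled in at class creation time

    def __post_process__(self, name, owner):
        # called once by the metaclass sketched below
        self.name = name

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        return instance.__dict__[self.name]

    def __set__(self, instance, value):
        if not isinstance(value, self.kind):
            raise TypeError("%s must be a %s" % (self.name, self.kind.__name__))
        instance.__dict__[self.name] = value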
I think it would fit nicely in my metaclass module proposed in PEP 487. It would basically do the following:

class Metaclass(type):
    def __init__(self, name, bases, dict):
        super().__init__(name, bases, dict)
        for k, v in dict.items():
            if hasattr(v, "__post_process__"):
                v.__post_process__(k, self)

So each descriptor could define a __post_process__ hook that tells it the attribute name and also the class it belongs to. This would for sure also work for decorated methods.

This should mature on PyPI, then be introduced into the standard library, and if demand is really that high, maybe even be introduced into type.__init__. It should be noted that this can also be easily written as a PEP 487 class using __subclass_init__; I just used the classical metaclass notation as I guess people are more used to that.

This proposal can actually be seen as an extension to the __class__ and super() mechanism of normal methods: methods currently have the privilege to know which classes they are defined in, while descriptors don't. So we could unify all this by giving functions a __post_process__ method which sets the __class__ in the function body. This is about the same as what happened when functions got a __get__ method to turn them into object methods.

While this all is in the making, PJ could monkey-patch __build_class__ to do the steps described above, until it gets accepted into CPython. So I pose the question to PJ: would such an approach solve the problems you have?

Greetings

Martin

From ncoghlan at gmail.com  Fri Apr  3 17:04:38 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 4 Apr 2015 01:04:38 +1000
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: 
References: 
Message-ID: 

On 4 Apr 2015 00:29, "Martin Teichmann" wrote:
>
> > When I first wrote PEP 422 I was of the view that "Python 2 allows
> > class definition postprocessing injection, we should allow it in
> > Python 3 as well". I've since changed my view to "Having to declare
> > post-processing of a class definition up front as a decorator, base
> > class or metaclass is a good thing for readability, as otherwise
> > there's nothing obvious when reading a class definition that tells you
> > whether or not postprocessing may happen, so you have to assume it's
> > possible for *every* class definition".
>
> Nick, I couldn't agree more with you, yet I think PJ actually brought
> up a very interesting point. Post-processing is a very common thing
> these days, and has been re-written so many times that I think it is
> about time that something like it should be in the standard library.
>
> I'm less thinking about decorated methods, more about descriptors.
> They always have the problem that they don't know which attribute they
> belong to, so every author of a framework that defines descriptors
> writes a metaclass which goes through all the descriptors and tells
> them their attribute name.

Extending the descriptor protocol to include a per-descriptor hook that's called at class definition time sounds like a potentially nice way to go to me. While you *could* still use it to arbitrarily mutate the class object, it's much clearer that's not the intended purpose, so I don't see it as a major problem.

Cheers,
Nick.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From pje at telecommunity.com Fri Apr 3 18:02:28 2015 From: pje at telecommunity.com (PJ Eby) Date: Fri, 3 Apr 2015 12:02:28 -0400 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: References: Message-ID: On Fri, Apr 3, 2015 at 8:44 AM, Martin Teichmann wrote: > This proposal can actually be seen as an extension to the __class__ > and super() mechanism of normal methods: methods currently have the > priviledge to know which classes they are defined in, while descriptors > don't. So we could unify all this by giving functions a __post_process__ > method which sets the __class__ in the function body. This is about the > same as what happened when functions got a __get__ method to turn > them into object methods. > > While this all is in the making, PJ could monkey-patch __build_class__ > to do the steps described above, until it gets accepted into cpython. > So I pose the question to PJ: would such an approach solve the > problems you have? Universal member post-processing actually works *better* for the motivating use case than the metaclass or class level hooks, so yes. In practice, there is one potential hiccup, and that's that decorators which aren't aware of __post_process__ will end up masking it. But that's not an insurmountable obstacle. From status at bugs.python.org Fri Apr 3 18:08:19 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 3 Apr 2015 18:08:19 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20150403160819.017C1566AE@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-03-27 - 2015-04-03) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 4838 (+19) closed 30782 (+50) total 35620 (+69) Open issues with patches: 2252 Issues opened (52) ================== #23466: PEP 461: Inconsistency between str and bytes formatting of int http://bugs.python.org/issue23466 reopened by ethan.furman #23752: Cleanup io.FileIO http://bugs.python.org/issue23752 reopened by haypo #23791: Identify Moved Lines with difflib http://bugs.python.org/issue23791 opened by bkiefer #23794: http package should support HTTP/2 http://bugs.python.org/issue23794 opened by alex #23795: argparse -- incorrect usage for mutually exclusive http://bugs.python.org/issue23795 opened by jotomicron #23796: BufferedReader.peek() crashes if closed http://bugs.python.org/issue23796 opened by vadmium #23797: Mac OS X locale setup in thread happens sometime after run() i http://bugs.python.org/issue23797 opened by barry-scott #23799: Join started threads in tests http://bugs.python.org/issue23799 opened by serhiy.storchaka #23800: Support POSIX uselocale interface for thread-specific locale s http://bugs.python.org/issue23800 opened by ned.deily #23802: patch: __deepcopy__ memo dict argument usage http://bugs.python.org/issue23802 opened by danielsh #23804: SSLSocket.recv(0) receives up to 1024 bytes http://bugs.python.org/issue23804 opened by vadmium #23805: _hashlib compile error? 
http://bugs.python.org/issue23805 opened by SinusAlpha #23806: documentation for no_proxy is missing from the python3 urllib http://bugs.python.org/issue23806 opened by r.david.murray #23807: Improved test coverage for calendar.py command line http://bugs.python.org/issue23807 opened by Thana Annis #23808: Symlink to directory on Windows 8 http://bugs.python.org/issue23808 opened by serhiy.storchaka #23809: RFE: emit a warning when a module in a traceback shadows a std http://bugs.python.org/issue23809 opened by ncoghlan #23810: Suboptimal stacklevel of deprecation warnings for formatter an http://bugs.python.org/issue23810 opened by Arfrever #23811: Python py_compile error message inconsistent and missing newli http://bugs.python.org/issue23811 opened by akshetp #23812: asyncio.Queue.put_nowait(), followed get() task cancellation l http://bugs.python.org/issue23812 opened by gustavo #23813: RSS and Atom feeds of buildbot results are broken http://bugs.python.org/issue23813 opened by serhiy.storchaka #23814: argparse: Parser level defaults do not always override argumen http://bugs.python.org/issue23814 opened by kop #23815: Segmentation fault when create _tkinter objects http://bugs.python.org/issue23815 opened by serhiy.storchaka #23816: struct.unpack returns null pascal strings - [first] bug report http://bugs.python.org/issue23816 opened by jonheiner #23817: Consider FreeBSD like any other OS in SOVERSION http://bugs.python.org/issue23817 opened by bapt #23819: test_asyncio fails when run under -O http://bugs.python.org/issue23819 opened by brett.cannon #23820: test_importlib fails under -O http://bugs.python.org/issue23820 opened by brett.cannon #23822: test_py_compile fails under -O http://bugs.python.org/issue23822 opened by brett.cannon #23825: test_idle fails under -OO http://bugs.python.org/issue23825 opened by brett.cannon #23826: test_enum fails under -OO http://bugs.python.org/issue23826 opened by brett.cannon #23828: test_socket testCmsgTruncLen0 gets "received malformed or impr http://bugs.python.org/issue23828 opened by brett.cannon #23830: Add AF_IUCV support to sockets http://bugs.python.org/issue23830 opened by neale #23831: tkinter canvas lacks of moveto method. 
http://bugs.python.org/issue23831 opened by mps #23832: pdb's `longlist` shows only decorator if that one contains a l http://bugs.python.org/issue23832 opened by Gerrit.Holl #23833: email.header.Header folding modifies headers that include semi http://bugs.python.org/issue23833 opened by dseg #23835: configparser does not convert defaults to strings http://bugs.python.org/issue23835 opened by aragilar #23837: read pipe transport tries to resume reading after loop is gone http://bugs.python.org/issue23837 opened by mwf #23839: Clear caches after every test http://bugs.python.org/issue23839 opened by serhiy.storchaka #23840: tokenize.open() leaks an open binary file on TextIOWrapper err http://bugs.python.org/issue23840 opened by haypo #23841: py34 OrderedDict is using weakref for root reference http://bugs.python.org/issue23841 opened by sorin #23842: SystemError: ../Objects/longobject.c:998: bad argument to inte http://bugs.python.org/issue23842 opened by doko #23843: ssl.wrap_socket doesn't handle virtual TLS hosts http://bugs.python.org/issue23843 opened by nagle #23845: test_ssl: fails on recent libressl with SSLV3_ALERT_HANDSHAKE_ http://bugs.python.org/issue23845 opened by ced #23846: asyncio : ProactorEventLoop raised BlockingIOError when Thread http://bugs.python.org/issue23846 opened by kernel0 #23848: faulthandler: setup an exception handler on Windows http://bugs.python.org/issue23848 opened by haypo #23849: Leaks in test_deque http://bugs.python.org/issue23849 opened by serhiy.storchaka #23850: Missing documentation for Py_TPFLAGS_HAVE_NEWBUFFER http://bugs.python.org/issue23850 opened by Giacomo.Alzetta #23852: Wrong FD_DIR file name on OpenBSD http://bugs.python.org/issue23852 opened by ced #23853: PEP 475: handle EINTR in the ssl module http://bugs.python.org/issue23853 opened by haypo #23855: Missing Sanity Check for malloc() in PC/_msi.c http://bugs.python.org/issue23855 opened by dogbert2 #23856: build.bat -e shouldn't also build http://bugs.python.org/issue23856 opened by tim.golden #23857: Make default HTTPS certificate verification setting configurab http://bugs.python.org/issue23857 opened by rkuska #23859: asyncio: document behaviour of wait() cancellation http://bugs.python.org/issue23859 opened by haypo Most recent 15 issues with no replies (15) ========================================== #23859: asyncio: document behaviour of wait() cancellation http://bugs.python.org/issue23859 #23855: Missing Sanity Check for malloc() in PC/_msi.c http://bugs.python.org/issue23855 #23850: Missing documentation for Py_TPFLAGS_HAVE_NEWBUFFER http://bugs.python.org/issue23850 #23849: Leaks in test_deque http://bugs.python.org/issue23849 #23848: faulthandler: setup an exception handler on Windows http://bugs.python.org/issue23848 #23846: asyncio : ProactorEventLoop raised BlockingIOError when Thread http://bugs.python.org/issue23846 #23845: test_ssl: fails on recent libressl with SSLV3_ALERT_HANDSHAKE_ http://bugs.python.org/issue23845 #23841: py34 OrderedDict is using weakref for root reference http://bugs.python.org/issue23841 #23835: configparser does not convert defaults to strings http://bugs.python.org/issue23835 #23832: pdb's `longlist` shows only decorator if that one contains a l http://bugs.python.org/issue23832 #23831: tkinter canvas lacks of moveto method. 
http://bugs.python.org/issue23831 #23825: test_idle fails under -OO http://bugs.python.org/issue23825 #23816: struct.unpack returns null pascal strings - [first] bug report http://bugs.python.org/issue23816 #23815: Segmentation fault when create _tkinter objects http://bugs.python.org/issue23815 #23813: RSS and Atom feeds of buildbot results are broken http://bugs.python.org/issue23813 Most recent 15 issues waiting for review (15) ============================================= #23857: Make default HTTPS certificate verification setting configurab http://bugs.python.org/issue23857 #23856: build.bat -e shouldn't also build http://bugs.python.org/issue23856 #23855: Missing Sanity Check for malloc() in PC/_msi.c http://bugs.python.org/issue23855 #23853: PEP 475: handle EINTR in the ssl module http://bugs.python.org/issue23853 #23852: Wrong FD_DIR file name on OpenBSD http://bugs.python.org/issue23852 #23848: faulthandler: setup an exception handler on Windows http://bugs.python.org/issue23848 #23845: test_ssl: fails on recent libressl with SSLV3_ALERT_HANDSHAKE_ http://bugs.python.org/issue23845 #23842: SystemError: ../Objects/longobject.c:998: bad argument to inte http://bugs.python.org/issue23842 #23840: tokenize.open() leaks an open binary file on TextIOWrapper err http://bugs.python.org/issue23840 #23839: Clear caches after every test http://bugs.python.org/issue23839 #23837: read pipe transport tries to resume reading after loop is gone http://bugs.python.org/issue23837 #23830: Add AF_IUCV support to sockets http://bugs.python.org/issue23830 #23826: test_enum fails under -OO http://bugs.python.org/issue23826 #23825: test_idle fails under -OO http://bugs.python.org/issue23825 #23822: test_py_compile fails under -O http://bugs.python.org/issue23822 Top 10 most discussed issues (10) ================================= #23756: Tighten definition of bytes-like objects http://bugs.python.org/issue23756 12 msgs #23857: Make default HTTPS certificate verification setting configurab http://bugs.python.org/issue23857 12 msgs #23602: Implement __format__ for Fraction http://bugs.python.org/issue23602 11 msgs #23840: tokenize.open() leaks an open binary file on TextIOWrapper err http://bugs.python.org/issue23840 11 msgs #23466: PEP 461: Inconsistency between str and bytes formatting of int http://bugs.python.org/issue23466 9 msgs #23796: BufferedReader.peek() crashes if closed http://bugs.python.org/issue23796 9 msgs #23817: Consider FreeBSD like any other OS in SOVERSION http://bugs.python.org/issue23817 8 msgs #22643: Integer overflow in case_operation http://bugs.python.org/issue22643 7 msgs #23812: asyncio.Queue.put_nowait(), followed get() task cancellation l http://bugs.python.org/issue23812 7 msgs #23830: Add AF_IUCV support to sockets http://bugs.python.org/issue23830 7 msgs Issues closed (43) ================== #1483: xml.sax.saxutils.prepare_input_source ignores character stream http://bugs.python.org/issue1483 closed by serhiy.storchaka #10395: new os.path function to extract common prefix based on path co http://bugs.python.org/issue10395 closed by serhiy.storchaka #13583: sqlite3.Row doesn't support slice indexes http://bugs.python.org/issue13583 closed by serhiy.storchaka #14904: test_unicode_repr_oflw (in test_bigmem) crashes http://bugs.python.org/issue14904 closed by serhiy.storchaka #16840: Tkinter doesn't support large integers http://bugs.python.org/issue16840 closed by serhiy.storchaka #18473: some objects pickled by Python 3.x are not unpicklable in Pyth 
http://bugs.python.org/issue18473 closed by serhiy.storchaka #21319: WindowsRegistryFinder never added to sys.meta_path http://bugs.python.org/issue21319 closed by tim.golden #22117: Rewrite pytime.h to work on nanoseconds http://bugs.python.org/issue22117 closed by haypo #22390: test.regrtest should complain if a test doesn't remove tempora http://bugs.python.org/issue22390 closed by serhiy.storchaka #22500: Argparse always stores True for positional arguments http://bugs.python.org/issue22500 closed by paul.j3 #22672: float arguments in scientific notation not supported by argpar http://bugs.python.org/issue22672 closed by paul.j3 #22977: Unformatted ???Windows Error 0x%X??? exception message on Wine http://bugs.python.org/issue22977 closed by serhiy.storchaka #23171: csv.writer.writerow() does not accept generator (must be coerc http://bugs.python.org/issue23171 closed by serhiy.storchaka #23485: PEP 475: handle EINTR in the select and selectors module http://bugs.python.org/issue23485 closed by haypo #23555: behavioural change in argparse set_defaults in python 2.7.9 http://bugs.python.org/issue23555 closed by paul.j3 #23611: Pickle nested names with protocols < 4 http://bugs.python.org/issue23611 closed by serhiy.storchaka #23618: PEP 475: handle EINTR in the socket module (connect) http://bugs.python.org/issue23618 closed by haypo #23663: Crash on failure in ctypes on Cygwin http://bugs.python.org/issue23663 closed by berker.peksag #23745: Exception when parsing an email using email.parser.BytesParser http://bugs.python.org/issue23745 closed by r.david.murray #23771: Timeouts on "x86 Ubuntu Shared 3.x" buildbot http://bugs.python.org/issue23771 closed by haypo #23781: Add private _PyErr_ReplaceException() in 2.7 http://bugs.python.org/issue23781 closed by serhiy.storchaka #23783: Leak in PyObject_ClearWeakRefs http://bugs.python.org/issue23783 closed by serhiy.storchaka #23785: Leak in TextIOWrapper.tell() http://bugs.python.org/issue23785 closed by serhiy.storchaka #23789: decimal.Decimal: __format__ behaviour http://bugs.python.org/issue23789 closed by mark.dickinson #23792: help crash leaves terminal in raw mode http://bugs.python.org/issue23792 closed by r.david.murray #23793: Support __add__, __mul__, and __imul__ for deques. http://bugs.python.org/issue23793 closed by rhettinger #23798: NameError in the encode fixer http://bugs.python.org/issue23798 closed by serhiy.storchaka #23801: cgi.FieldStorage has different (wrong?) 
behavior on Python3 th http://bugs.python.org/issue23801 closed by python-dev #23803: str.partition() is broken in 3.4 http://bugs.python.org/issue23803 closed by serhiy.storchaka #23818: test_enum failing test_class_nested_enum_and_pickle_protocol_f http://bugs.python.org/issue23818 closed by serhiy.storchaka #23821: test_pdb fails under -O http://bugs.python.org/issue23821 closed by serhiy.storchaka #23823: "Generalization" misused in deque docs http://bugs.python.org/issue23823 closed by rhettinger #23824: in-place addition of a shadowed class field http://bugs.python.org/issue23824 closed by r.david.murray #23827: test.test_multiprocessing_spawn fails under -Werror http://bugs.python.org/issue23827 closed by serhiy.storchaka #23829: test_warnings fails under -Werror http://bugs.python.org/issue23829 closed by serhiy.storchaka #23834: socketmodule.c: add sock_call() to fix how the timeout is reco http://bugs.python.org/issue23834 closed by haypo #23836: PEP 475: Enhance faulthandler to handle better reentrant calls http://bugs.python.org/issue23836 closed by haypo #23838: linecache and MemoryError http://bugs.python.org/issue23838 closed by serhiy.storchaka #23844: test_ssl: fails on recent libressl version with BAD_DH_P_LENGT http://bugs.python.org/issue23844 closed by python-dev #23847: Add xml pretty print option to ElementTree http://bugs.python.org/issue23847 closed by serhiy.storchaka #23851: PEP 475: _posixsubprocess retries close() on EINTR http://bugs.python.org/issue23851 closed by haypo #23854: qtconsole and all windows based python have issues loading http://bugs.python.org/issue23854 closed by r.david.murray #23858: Look for local sqlite3 by parsing -I/-L flags in linux as well http://bugs.python.org/issue23858 closed by ned.deily From Steve.Dower at microsoft.com Fri Apr 3 19:35:09 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Fri, 3 Apr 2015 17:35:09 +0000 Subject: [Python-Dev] Do we need to sign Windows files with GnuPG? In-Reply-To: <551E63E5.6080805@hastings.org> References: <551E63E5.6080805@hastings.org> Message-ID: Larry Hastings wrote: > Steve's also changed the authentication process. His new installers rely on a > Windows digital signature technology called Authenticode where the signature is > built right into the .exe file. Windows platforms will automatically > authenticate executables signed with Authenticode, so this is both secure and > convenient. > > Martin's build process also digitally signed the files he built, but not using > Authenticode (or at least I don't think so). Like the Mac and source code > releases, his automation used GnuPG to produce separate ".asc" files containing > digital signatures. This meant authentication was a manual process. Martin previously only signed the installer with Authenticode, and generated a signature with GnuPG for the installer. My change now signs every binary and MSI in the entire installation with Authenticode, and for now I've stopped creating a GPG signature for the installers. I'm still providing sizes and MD5 hashes for the user-visible downloads (except for the last alpha release, thanks Larry for covering for me). With the installer also being a downloader, there are now actually 30+ files uploaded for each Windows release. Most of these are never seen by users unless they run the installer with /layout (sorry for not having changed this to /download yet... 
it's not as easily customizable as I'd hoped, but /layout is the standard name for this command), and if they're being downloaded by the installer then both hashes (embedded in the installer) and Authenticode signatures (embedded in each file) are being checked, and the files will be blocked if they don't match. So verifying the EXE installer should always be sufficient to trust the rest of the installable files.

> The Authenticode approach sounds great.  But there are advantages to the GnuPG
> approach too:

For reference, the main advantage of Authenticode signing is shown at https://technet.microsoft.com/en-us/library/dd835561(v=ws.10).aspx - about halfway down there are screenshots of the various dialogs that are displayed when you run signed vs. unsigned vs. blocked applications. It also helps bypass SmartScreen, which will block downloaded files until they've developed a minimum level of trust. Simply having an Authenticode signature on the initial download meets this level.

(The summary of my opinion is that these two checks are sufficient for the initial EXE download, and the embedded hashes and signatures are sufficient for the rest. Having python.exe et al signed is a bonus that we've never done in the past.)

> * Using GnuPG means we can authenticate the files from any platform, not just
> Windows. If there were a security breach on the Python content delivery network,
> any developer could get GnuPG for their platform and authenticate that the
> installers are unmodified. If we use Authenticode alone, we lose that ability.

There are tools out there for validating Authenticode on Linux, though none of them seem to be as complete as on Windows (it really needs the OS certificate store to be completely reliable), so I can certainly see the value in being able to verify these against a signature. My only question is whether/how this is better with GPG compared to, say, a SHA hash? I don't currently have a GPG key (to my knowledge), so it's not like there's any preexisting trust to build from - or am I misunderstanding how GPG works here?

> * GnuPG is agnostic about the data it digitally signs. So, for example, Martin's
> build process digitally signs the Windows help file--the ".chm" file--produced
> by his build process. The help file Steve builds is currently completely
> unsigned; Steve says he can try signing it but he's not sure it'll work. Note
> that .chm files actually can contain live code, so this is at least a plausible
> vector for attack.

Authenticode is not supported for CHM files, unfortunately. If this is the only file that we decide needs GPG, I'd vote to stop offering the download apart from the interpreter :) (Among other things, I'm not supposed to use GPG without specific permission from the lawyers at work because of the license...)

> My Windows development days are firmly behind me.  So I don't really have an
> opinion here.  So I put it to you, Windows Python developers: do you care about
> GnuPG signatures on Windows-specific files?  Or do you not care?

The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash, then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page).

Cheers,
Steve

From mal at egenix.com  Fri Apr  3 19:55:41 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 03 Apr 2015 19:55:41 +0200
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: 
References: <551E63E5.6080805@hastings.org>
Message-ID: <551ED41D.2070909@egenix.com>

On 03.04.2015 19:35, Steve Dower wrote:
>> My Windows development days are firmly behind me.  So I don't really have an
>> opinion here.  So I put it to you, Windows Python developers: do you care about
>> GnuPG signatures on Windows-specific files?  Or do you not care?
>
> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash, then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page).

FWIW: I regularly check the GPG sigs on all important downloaded files, regardless of which platform they target, including the Windows installers for Python or any other Windows installers I use which provide such sigs.

The reason is simple: the signature is a proof of authenticity which is not bound to a particular file format or platform, and before running .exes it's good to know that they were built by the right people and not manipulated by trojans, viruses or malicious proxies.

Is that a good enough reason to continue providing the GPG sigs, or do you need more proof of goodness? ;-)

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From pje at telecommunity.com  Fri Apr  3 21:23:53 2015
From: pje at telecommunity.com (PJ Eby)
Date: Fri, 3 Apr 2015 15:23:53 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: 
References: 
Message-ID: 

On Fri, Apr 3, 2015 at 11:04 AM, Nick Coghlan wrote:
> Extending the descriptor protocol to include a per-descriptor hook that's
> called at class definition time sounds like a potentially nice way to go to
> me. While you *could* still use it to arbitrarily mutate the class object,
> it's much clearer that's not the intended purpose, so I don't see it as a
> major problem.

Just to be clear, mutating the class object was never the point for my main use case that needs the PEP 422 feature; it was for method overloads that are called remotely and need to be registered elsewhere. For some of my other use cases, adding metadata to the class is a convenient way to do things, but classes are generally weak-referenceable so the add-on data can be (and often is) stored in a weak-key dictionary rather than placed directly on the class.
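For instance, a minimal sketch of that pattern (the helper names here are invented for illustration):

import weakref

# Add-on metadata keyed by class; nothing is stored on the class itself,
# and an entry disappears when its class is garbage collected.
_metadata = weakref.WeakKeyDictionary()

def remember(cls, **info):
    _metadata.setdefault(cls, {}).update(info)

def metadata_for(cls):
    return _metadata.get(cls, {})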
From tjreedy at udel.edu Fri Apr 3 22:14:32 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 03 Apr 2015 16:14:32 -0400 Subject: [Python-Dev] Installing Python from the source code on Windows In-Reply-To: References: Message-ID: On 4/3/2015 5:51 AM, Narsu wrote: > Hi Python > I'm working on a game project, using c++ as main > language, and using python as script. I've built > the Python from the source code on Windows, and when I > invoked method Py_Initialize my program exited. After some tests > I realized as long as I move the Python-2.7.9/Lib file > to my program file, it works fine. > Please help me out. Py-dev is for developing *future* versions of python. Questions about using *current* versions of python should usually go to python-list. PS: consider using 3.x for scripting. -- Terry Jan Reedy From pje at telecommunity.com Fri Apr 3 22:36:04 2015 From: pje at telecommunity.com (PJ Eby) Date: Fri, 3 Apr 2015 16:36:04 -0400 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: References: Message-ID: On Fri, Apr 3, 2015 at 4:21 AM, Nick Coghlan wrote: > That means I'm now OK with monkeypatching __build_class__ being the > only way to get dynamic hooking of the class currently being defined > from the class body - folks that really want that behaviour can > monkeypatch it in, while folks that think it's a bad idea don't need > to worry about. I'd still prefer to only do that as an emulation of an agreed-upon descriptor notification protocol, such that it's a backport of an approved PEP, so I hope we can work that out. But I guess if not, then whatever works. I just wish you'd been okay with it in 2012, as there was more than once in the last few years where I had some downtime and thought about trying to do some porting work. :-( And in the meantime, the only alternative Python implementation I know of that's made *any* headway on Python 3 in the last few years (i.e., PyPy 3) *includes* a compatibly monkeypatchable __build_class__. It appears that the *other* obstacles to making a compatible Python 3 implementation are a lot tougher for implementers to get over than compatibility with __build_class__. ;-) > Neither PEP 422 nor 487 are designed to eliminate metaclass conflicts > in general, they're primarily designed to let base classes run > arbitrary code after the namespace has been executed in a subclass > definition *without* needing a custom metaclass. And yet the argument was being made that the lack of custom metaclass was a feature because it avoided conflict. I'm just trying to point out that if avoiding conflict is desirable, building *every possible metaclass feature* into the Python core isn't a scalable solution. At this point, co-operative inheritance is a well-understood model in Python, so providing an API to automatically mix metaclasses (explicitly, at first) seems like a good step towards solving the metaclass conflict problem in general. When Guido introduced the new MRO scheme in Python 2.2, he noted that the source he'd gotten that scheme from had explained that it could be extended to automatically mixing metaclasses, but he (Guido) didn't want to do that in Python until more experience was had with the new MRO scheme in general. And I think we have enough experience with that *now*, to be able to take a step forward, by providing a stdlib-blessed metaclass mixer.
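[Illustration, not part of the original thread: a rough sketch of the kind of `noconflict()` metaclass mixer being described -- derive a combined metaclass from the metaclasses of the given bases. A simplified reconstruction under stated assumptions, not PJ's actual 90-line version.]

    def noconflict(*bases):
        """Return a metaclass that derives from the metaclasses of *bases*."""
        metas = []
        for base in bases:
            meta = type(base)
            if meta is type or any(issubclass(m, meta) for m in metas):
                continue  # plain `type` and already-covered metas add nothing
            metas = [m for m in metas if not issubclass(meta, m)]
            metas.append(meta)
        if not metas:
            return type
        if len(metas) == 1:
            return metas[0]
        return type('NoConflictMeta', tuple(metas), {})

    class MetaA(type): pass
    class MetaB(type): pass
    class A(metaclass=MetaA): pass
    class B(metaclass=MetaB): pass

    # `class C(A, B)` would raise "metaclass conflict"; this does not:
    class C(A, B, metaclass=noconflict(A, B)):
        pass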
It not only makes the prototype, PyPI-based version of PEP 487 more usable immediately, it will also encourage people to develop metaclasses as *mixins* rather than one-size-fits-all monoliths. For example, there's no reason that both of PEP 487's features need to live in the *same* metaclass, if you could trivially mix metaclasses at the point where you inherit from bases with different metaclasses. (And eventually, a future version of Python could do the mixing automatically, without the `noconflict` function. The theory was well-understood for other languages, after all, before Python 2.2 even came out.) > No, you can't do it currently without risking a backwards > incompatibility through the introduction of a custom metaclass. Right... which is precisely why I'm suggesting the `noconflict()` metaclass factory function as a *general* solution for providing useful metaclasses, and why I think that PEP 487 should break the namespacing and subclass init features into separate metaclasses, and add that noconflict feature. It will then become a good example for people moving forward writing metaclasses. Basically, as long as you don't have the pointless conflict errors, you can write co-operative metaclass mixins as easily as you can write regular co-operative mixins. I was missing this point myself because I've been too steeped in Python 2's complexities: writing a usable version of `noconflict()` is a lot more complex and its invocation far more obscure. In Python 2, there's classic classes, class- and module-level __metaclass__, ExtensionClass, and all sorts of other headaches for automatic mixing. In Python 3, though, all that stuff goes out the window, and even my 90-line version that's almost half comments is probably still overengineered compared to what's actually needed to do the mixing. >> Further, if the claim is that metaclass conflict potential makes PEP >> 487 worthy of a language change, then by the same logic method >> decorators are just as worthy of a language change, since any mixin >> required to use a method decorator would be *just as susceptible* to >> metaclass conflicts as SubclassInit. > > There wouldn't be a custom metaclass involved in the native > implementation of PEP 487, only in the backport. Right... and if there were a native implementation of PEP 422, that would also be the case for PEP 422. The point is that if PEP 487 can justify a *language* change to avoid needing a metaclass, then arguably PEP 422 has an even *better* justification, because its need to avoid needing a metaclass is at least as strong. Indeed, you said the same yourself as recently as 2013: https://mail.python.org/pipermail/python-dev/2013-February/123925.html > PEP 422 has never been approved I took this post from Guido (and his other comments in the same thread where you asked for Pronouncement on it) as meaning it was basically a done deal, approved in all but some final edits: https://mail.python.org/pipermail/python-dev/2013-February/123957.html > Given my change of heart, I believe that at this point, if you were > willing to champion a revived PEP 422 that implemented the behaviour > you're after, that would be ideal, with monkeypatching the desired > behaviour in as a fallback plan if the PEP is still ultimately > rejected.
Alternatively, you could go the monkeypatching path first, > and then potentially seek standardisation later after you've had some > practical experience with it - I now consider it an orthogonal > capability to the feature in PEP 487, so the acceptance of the latter > wouldn't necessarily preclude acceptance of a hook for class > postprocessing injection. A lot of things have changed since the original discussion, mostly in the direction of me having even *less* time for Python work than previously, so it's unlikely that me championing a PEP is a realistic possibility. Frankly, I'm immensely fatigued at the discussion *already*, and the need to go over the same questions a *third* time seems like not something I'm going to want to put energy into. However it sounds like there *is* some growing consensus towards the idea of simply notifying interested class members of their class membership, so if there ends up being a consensus to standardize *that* protocol and what part of the class-building process it gets invoked in, then I will implement a backport (or use such a backport if someone else implements it), when I actually start porting my libraries to Python 3. But that would make my timeline somewhat dependent on how much of a consensus there is, and how much clarity I could get before going forward. Even if PEP 422 never was officially tagged with "Approved" status in the PEP itself, our 2013 conversation with Guido made it sound like it was totally a done deal; if there was something *other* than PEP 487 that threw it off that track, I never saw it. So I'm understandably a little bit reluctant to start off implementing a new protocol that then two or three years from *now* will suddenly not be a done deal any more, with whatever I did being retroactively declared the wrong thing to do again. I suppose, though, that that my best option all in all is just to do whatever the heck seems best for porting, and worry about standardization later. If a member-notification protocol is standardized, I can always change DecoratorTools to use it *later*, after all, as long as the actual implementation mechanism inside DecoratorTools is opaque to its consumers. (i.e., if I don't actually expose the member-notification protocol directly) (And, in retrospect, I could have, and probably should have, taken this approach from the get-go in 2012. It just seemed really, *really* important to you back then that I *not* do it.) From Steve.Dower at microsoft.com Sat Apr 4 00:14:20 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Fri, 3 Apr 2015 22:14:20 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <551ED41D.2070909@egenix.com> References: <551E63E5.6080805@hastings.org> , <551ED41D.2070909@egenix.com> Message-ID: The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. Cheers, Steve Top-posted from my Windows Phone ________________________________ From: M.-A. 
Lemburg Sent: 4/3/2015 10:55 To: Steve Dower; Larry Hastings; Python Dev; python-committers Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? On 03.04.2015 19:35, Steve Dower wrote: >> My Windows development days are firmly behind me. So I don't really have an >> opinion here. So I put it to you, Windows Python developers: do you care about >> GnuPG signatures on Windows-specific files? Or do you not care? > > The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). FWIW: I regularly check the GPG sigs on all important downloaded files, regardless of which platform they target, including the Windows installers for Python or any other Windows installers I use which provide such sigs. The reason is simple: The signature is a proof of authenticity which is not bound to a particular file format or platform and before running .exes it's good to know that they were built by the right people and not manipulated by trojans, viruses or malicious proxies. Is that a good enough reason to continue providing the GPG sigs or do you need more proof of goodness ? ;-) -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From mal at egenix.com Sat Apr 4 00:38:08 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sat, 04 Apr 2015 00:38:08 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> Message-ID: <551F1650.8070808@egenix.com> On 04.04.2015 00:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. Hashes only provide checks against file corruption (and then only if you can trust the hash values). GPG provides all the benefits of public key encryption on arbitrary files (not just code). The main benefit in case of downloadable installers is to be able to make sure that the files are authentic, meaning that they were created and signed by the people listed as packagers.
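[Illustration, not part of the original thread: what the check Marc-Andre describes looks like in practice, as a small Python wrapper around the standard `gpg` command line. It assumes `gpg` is on PATH and the signer's public key has already been imported; the file names are placeholders.]

    import subprocess

    def gpg_verify(target, signature):
        """Verify a detached GPG signature over *target*."""
        result = subprocess.run(['gpg', '--verify', signature, target])
        return result.returncode == 0

    # e.g. gpg_verify('python-3.4.3.msi', 'python-3.4.3.msi.asc')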
There is no CA infrastructure involved as for SSL certificates or Authenticode, but it's easy to get the keys from key servers given the key signatures available from python.org's download pages. If you want to sign a package file using GPG, you will need to create your own key, upload it to the key servers and then place the signature up on the download page. Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed: https://www.python.org/ftp/python/3.4.3/ Cheers, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: M.-A. Lemburg > Sent: ?4/?3/?2015 10:55 > To: Steve Dower; Larry Hastings; Python Dev; python-committers > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? > > On 03.04.2015 19:35, Steve Dower wrote: >>> My Windows development days are firmly behind me. So I don't really have an >>> opinion here. So I put it to you, Windows Python developers: do you care about >>> GnuPG signatures on Windows-specific files? Or do you not care? >> >> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). > > FWIW: I regularly check the GPG sigs on all important downloaded > files, regardless of which platform they target, including the > Windows installers for Python or any other Windows installers > I use which provide such sigs. > > The reason is simple: > The signature is a proof of authenticity which is not bound to > a particular file format or platform and before running .exes > it's good to know that they were built by the right people and > not manipulated by trojans, viruses or malicious proxies. > > Is that a good enough reason to continue providing the GPG > sigs or do you need more proof of goodness ? ;-) > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. 
Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > From donald at stufft.io Sat Apr 4 02:49:09 2015 From: donald at stufft.io (Donald Stufft) Date: Fri, 3 Apr 2015 20:49:09 -0400 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <551F1650.8070808@egenix.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> Message-ID: <97D7CADF-318A-4FA7-89A6-E27319B99A21@stufft.io> > On Apr 3, 2015, at 6:38 PM, M.-A. Lemburg wrote: > > On 04.04.2015 00:14, Steve Dower wrote: >> The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) >> >> Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. > > Hashes only provide checks against file corruption (and then > only if you can trust the hash values). GPG provides all the > benefits of public key encryption on arbitrary files (not just > code). > > The main benefit in case of downloadable installers is to > be able to make sure that the files are authentic, meaning that > they were created and signed by the people listed as packagers. > > There is no CA infrastructure involved as for SSL certificates > or Authenticode, but it's easy to get the keys from key servers > given the key signatures available from python.org's download > pages. FTR if we're relying on people to get the GPG keys from the download pages then there's no additional benefit over just using a hash published on the same page. In order to get additional benefit we'd need to get Steve's key signed by enough people to get him into the strong set. > > If you want to sign a package file using GPG, you will need > to create your own key, upload it to the key servers and then > place the signature up on the download page. > > Relying only on Authenticode for Windows installers would > result in a break in technology w/r to the downloads we > make available for Python, since all other files are (usually) > GPG signed: > > https://www.python.org/ftp/python/3.4.3/ > > Cheers, > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source > >>> Python/Zope Consulting and Support ... http://www.egenix.com/ > >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ > >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > > >> Cheers, >> Steve >> >> Top-posted from my Windows Phone >> ________________________________ >> From: M.-A. Lemburg >> Sent: 4/3/2015 10:55 >> To: Steve Dower; Larry Hastings; Python Dev; python-committers >> Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG?
>> >> On 03.04.2015 19:35, Steve Dower wrote: >>>> My Windows development days are firmly behind me. So I don't really have an >>>> opinion here. So I put it to you, Windows Python developers: do you care about >>>> GnuPG signatures on Windows-specific files? Or do you not care? >>> >>> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). >> >> FWIW: I regularly check the GPG sigs on all important downloaded >> files, regardless of which platform they target, including the >> Windows installers for Python or any other Windows installers >> I use which provide such sigs. >> >> The reason is simple: >> The signature is a proof of authenticity which is not bound to >> a particular file format or platform and before running .exes >> it's good to know that they were built by the right people and >> not manipulated by trojans, viruses or malicious proxies. >> >> Is that a good enough reason to continue providing the GPG >> sigs or do you need more proof of goodness ? ;-) >> >> -- >> Marc-Andre Lemburg >> eGenix.com >> >> Professional Python Services directly from the Source >>>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ >> ________________________________________________________________________ >> >> ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: >> >> >> eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 >> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg >> Registered at Amtsgericht Duesseldorf: HRB 46611 >> http://www.egenix.com/company/contact/ >> > > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From njs at pobox.com Sat Apr 4 03:16:00 2015 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 3 Apr 2015 18:16:00 -0700 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <97D7CADF-318A-4FA7-89A6-E27319B99A21@stufft.io> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <97D7CADF-318A-4FA7-89A6-E27319B99A21@stufft.io> Message-ID: On Apr 3, 2015 5:50 PM, "Donald Stufft" wrote: > > > > On Apr 3, 2015, at 6:38 PM, M.-A. 
Lemburg wrote: > > On 04.04.2015 00:14, Steve Dower wrote: > >> The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > >> > >> Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. > > > > Hashes only provide checks against file corruption (and then > > only if you can trust the hash values). GPG provides all the > > benefits of public key encryption on arbitrary files (not just > > code). > > > > The main benefit in case of downloadable installers is to > > be able to make sure that the files are authentic, meaning that > > they were created and signed by the people listed as packagers. > > > > There is no CA infrastructure involved as for SSL certificates > > or Authenticode, but it's easy to get the keys from key servers > > given the key signatures available from python.org's download > > pages. > > FTR if we're relying on people to get the GPG keys from the download > pages then there's no additional benefit over just using a hash > published on the same page. > > In order to get additional benefit we'd need to get Steve's key > signed by enough people to get him into the strong set. I don't think that's true -- e.g. people who download the key for checking 3.5.0 will still have it when 3.5.1 is released, and notice if something silently changes. In general distributing a key id widely on webpages / mailing lists / using it consistently over multiple releases all increase security, even if they fall short of perfect. Even the web of trust isn't particularly trustworthy, it's just useful because it's harder to attack two targets (the webserver and the WoT) than it is to attack one. In any case, getting his key into the strong set ought to be trivial given that pycon is next week. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From mal at egenix.com Sat Apr 4 12:02:17 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sat, 04 Apr 2015 12:02:17 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <97D7CADF-318A-4FA7-89A6-E27319B99A21@stufft.io> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <97D7CADF-318A-4FA7-89A6-E27319B99A21@stufft.io> Message-ID: <551FB6A9.2040700@egenix.com> On 04.04.2015 02:49, Donald Stufft wrote: > >> On Apr 3, 2015, at 6:38 PM, M.-A. Lemburg wrote: >> >> On 04.04.2015 00:14, Steve Dower wrote: >>> The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) >>> >>> Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. >> >> Hashes only provide checks against file corruption (and then >> only if you can trust the hash values).
GPG provides all the >> benefits of public key encryption on arbitrary files (not just >> code). >> >> The main benefit in case of downloadable installers is to >> be able to make sure that the files are authentic, meaning that >> they were created and signed by the people listed as packagers. >> >> There is no CA infrastructure involved as for SSL certificates >> or Authenticode, but it's easy to get the keys from key servers >> given the key signatures available from python.org's download >> pages. > > FTR if we're relying on people to get the GPG keys from the download > pages then there's no additional benefit over just using a hash > published on the same page. Well, it's still better than just the hashes... > In order to get additional benefit we'd need to get Steve's key > signed by enough people to get him into the strong set. ...but having the key signed by fellow core devs will certainly add more goodness :-) >> If you want to sign a package file using GPG, you will need >> to create your own key, upload it to the key servers and then >> place the signature up on the download page. >> >> Relying only on Authenticode for Windows installers would >> result in a break in technology w/r to the downloads we >> make available for Python, since all other files are (usually) >> GPG signed: >> >> https://www.python.org/ftp/python/3.4.3/ >> >> Cheers, >> -- >> Marc-Andre Lemburg >> eGenix.com >> >> Professional Python Services directly from the Source >>>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ >> ________________________________________________________________________ >> >> ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: >> >> >> eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 >> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg >> Registered at Amtsgericht Duesseldorf: HRB 46611 >> http://www.egenix.com/company/contact/ >> >> >>> Cheers, >>> Steve >>> >>> Top-posted from my Windows Phone >>> ________________________________ >>> From: M.-A. Lemburg >>> Sent: 4/3/2015 10:55 >>> To: Steve Dower; Larry Hastings; Python Dev; python-committers >>> Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? >>> >>> On 03.04.2015 19:35, Steve Dower wrote: >>>>> My Windows development days are firmly behind me. So I don't really have an >>>>> opinion here. So I put it to you, Windows Python developers: do you care about >>>>> GnuPG signatures on Windows-specific files? Or do you not care? >>>> >>>> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). >>> >>> FWIW: I regularly check the GPG sigs on all important downloaded >>> files, regardless of which platform they target, including the >>> Windows installers for Python or any other Windows installers >>> I use which provide such sigs.
>>> >>> The reason is simple: >>> The signature is a proof of authenticity which is not bound to >>> a particular file format or platform and before running .exes >>> it's good to know that they were built by the right people and >>> not manipulated by trojans, viruses or malicious proxies. >>> >>> Is that a good enough reason to continue providing the GPG >>> sigs or do you need more proof of goodness ? ;-) >>> >>> -- >>> Marc-Andre Lemburg >>> eGenix.com >>> >>> Professional Python Services directly from the Source >>>>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ >>> ________________________________________________________________________ >>> >>> ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: >>> >>> >>> eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 >>> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg >>> Registered at Amtsgericht Duesseldorf: HRB 46611 >>> http://www.egenix.com/company/contact/ >>> >> >> _______________________________________________ >> python-committers mailing list >> python-committers at python.org >> https://mail.python.org/mailman/listinfo/python-committers > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From solipsis at pitrou.net Sat Apr 4 13:27:25 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 4 Apr 2015 13:27:25 +0200 Subject: [Python-Dev] Socket timeout: reset timeout at each successful syscall? References: Message-ID: <20150404132725.21892ee5@fsol> On Fri, 3 Apr 2015 13:56:44 +0200 Victor Stinner wrote: > > The problem is that the socket.sendall() method may require multiple > syscalls. In this case, does the timeout count for the total time or > only for a single syscall? Asked differently: should we reset the > timeout each time a syscall succeed? From a user's point of view, it should count for the total time, IMO. If people want a timeout for each syscall, they should call send() iteratively. Regards Antoine. From wes.turner at gmail.com Sat Apr 4 15:42:55 2015 From: wes.turner at gmail.com (Wes Turner) Date: Sat, 4 Apr 2015 08:42:55 -0500 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
In-Reply-To: <551F1650.8070808@egenix.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> Message-ID: So, AFAIU from this discussion: * Authenticode does not have a PKI * GPG does have PKI * ASC signatures are signed checksums As far as downstream packaging on Windows (people who should/could be subscribed to release ANNs): For Chocolatey NuGet: * https://chocolatey.org/packages/python * https://chocolatey.org/packages/python.x86 * https://chocolatey.org/packages/python2 * https://chocolatey.org/packages/python-x86_32 * https://chocolatey.org/packages/python3 Python(x,y): * https://code.google.com/p/pythonxy/ For Anaconda (the MS Azure chosen python distribution): * http://docs.continuum.io/anaconda/install.html#windows-install ... These should/could/are checking GPG signatures for Windows packages downstream. http://www.scipy.org/install.html On Apr 3, 2015 5:38 PM, "M.-A. Lemburg" wrote: > On 04.04.2015 00:14, Steve Dower wrote: > > The thing is, that's exactly the same goodness as Authenticode gives, > except everyone gets that for free and meanwhile you're the only one who > has admitted to using GPG on Windows :) > > > > Basically, what I want to hear is that GPG sigs provide significantly > better protection than hashes (and I can provide better than MD5 for all > files if it's useful), taking into consideration that (I assume) I'd have > to obtain a signing key for GPG and unless there's a CA involved like there > is for Authenticode, there's no existing trust in that key. > > Hashes only provide checks against file corruption (and then > only if you can trust the hash values). GPG provides all the > benefits of public key encryption on arbitrary files (not just > code). > > The main benefit in case of downloadable installers is to > be able to make sure that the files are authentic, meaning that > they were created and signed by the people listed as packagers. > > There is no CA infrastructure involved as for SSL certificates > or Authenticode, but it's easy to get the keys from key servers > given the key signatures available from python.org's download > pages. > > If you want to sign a package file using GPG, you will need > to create your own key, upload it to the key servers and then > place the signature up on the download page. > > Relying only on Authenticode for Windows installers would > result in a break in technology w/r to the downloads we > make available for Python, since all other files are (usually) > GPG signed: > > https://www.python.org/ftp/python/3.4.3/ > > Cheers, > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source > >>> Python/Zope Consulting and Support ... http://www.egenix.com/ > >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ > >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > > > > Cheers, > > Steve > > > > Top-posted from my Windows Phone > > ________________________________ > > From: M.-A.
Lemburg > > Sent: 4/3/2015 10:55 > > To: Steve Dower; Larry > Hastings; Python Dev python-dev at python.org>; python-committers python-committers at python.org> > > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows > files with GnuPG? > > > > On 03.04.2015 19:35, Steve Dower wrote: > >>> My Windows development days are firmly behind me. So I don't really > have an > >>> opinion here. So I put it to you, Windows Python developers: do you > care about > >>> GnuPG signatures on Windows-specific files? Or do you not care? > >> > >> The later replies seem to suggest that they are general goodness that > nobody on Windows will use. If someone convinces me (or steamrolls me, > that's fine too) that the goodness of GPG is better than a hash then I'll > look into adding it into the process. Otherwise I'll happily add hash > generation into the upload process (which I'm going to do anyway for the > ones displayed on the download page). > > > > FWIW: I regularly check the GPG sigs on all important downloaded > > files, regardless of which platform they target, including the > > Windows installers for Python or any other Windows installers > > I use which provide such sigs. > > > > The reason is simple: > > The signature is a proof of authenticity which is not bound to > > a particular file format or platform and before running .exes > > it's good to know that they were built by the right people and > > not manipulated by trojans, viruses or malicious proxies. > > > > Is that a good enough reason to continue providing the GPG > > sigs or do you need more proof of goodness ? ;-) > > > > -- > > Marc-Andre Lemburg > > eGenix.com > > > > Professional Python Services directly from the Source > >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ > >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ > >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > > ________________________________________________________________________ > > > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > > Registered at Amtsgericht Duesseldorf: HRB 46611 > > http://www.egenix.com/company/contact/ > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gmludo at gmail.com Sat Apr 4 16:05:24 2015 From: gmludo at gmail.com (Ludovic Gasc) Date: Sat, 4 Apr 2015 16:05:24 +0200 Subject: [Python-Dev] Socket timeout: reset timeout at each successful syscall? In-Reply-To: <20150404132725.21892ee5@fsol> References: <20150404132725.21892ee5@fsol> Message-ID: On Sat, Apr 4, 2015 at 1:27 PM, Antoine Pitrou wrote: > On Fri, 3 Apr 2015 13:56:44 +0200 > Victor Stinner wrote: > > > > The problem is that the socket.sendall() method may require multiple > > syscalls. In this case, does the timeout count for the total time or > > only for a single syscall? Asked differently: should we reset the > > timeout each time a syscall succeeds? > > From a user's point of view, it should count for the total time, IMO. > If people want a timeout for each syscall, they should call send() > iteratively. I agree with Antoine on a global timeout.
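[Illustration, not part of the original thread: the behaviour being agreed on here -- one overall deadline for the whole sendall(), implemented by looping on send() as Antoine suggests. A sketch only, not the actual CPython change.]

    import socket
    import time

    def sendall_global_timeout(sock, data, timeout):
        """Send all of *data* under one overall deadline, instead of
        resetting the timeout after each successful send()."""
        deadline = time.monotonic() + timeout
        view = memoryview(data)
        while view:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                raise socket.timeout('sendall() timed out')
            sock.settimeout(remaining)  # remaining budget for this syscall
            sent = sock.send(view)
            view = view[sent:]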
When you exchange data with the external world, very often, you can't trust the other party. If you reset the timeout each time you send or receive something, the other party could use this property to send a few bytes over a long time, forcing you to block for almost nothing during a longer time than with a global timeout. It's like an old granny at the checkout of a supermarket who pays coin by coin: you don't know if it's just her age or if she is being slow on purpose. It's the same principle as the Slowloris attack: http://ha.ckers.org/slowloris/ BTW, when I learnt AsyncIO, I reimplemented the Slowloris attack with AsyncIO in a quick'n'dirty Python script. The funny thing is that the Python script consumed less CPU and was more efficient than the original Perl script. If somebody is interested, I can send you the script. -- Ludovic Gasc (GMLudo) http://www.gmludo.eu/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Steve.Dower at microsoft.com Sat Apr 4 16:24:46 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sat, 4 Apr 2015 14:24:46 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> Message-ID: "Authenticode does not have a PKI" If you got that from this discussion, I need everyone to at least skim read this: https://msdn.microsoft.com/en-us/library/ie/ms537361(v=vs.85).aspx Authenticode uses the same certificate infrastructure as SSL (note: not the same certificates). As I see it, anyone running on Windows has access to verification that is at least as good as GPG, and the only people who would benefit from GPG sigs are those checking Windows files on another OS or those with an existing GPG workflow on Windows (before this thread, I knew nobody who used GPG on Windows for anything, so forgive me for thinking this is very rare). Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Wes Turner Sent: 4/4/2015 6:42 To: M. -A. Lemburg Cc: Python-Dev; python-committers; Larry Hastings; Steve Dower Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? So, AFAIU from this discussion: * Authenticode does not have a PKI * GPG does have PKI * ASC signatures are signed checksums As far as downstream packaging on Windows (people who should/could be subscribed to release ANNs): For Chocolatey NuGet: * https://chocolatey.org/packages/python * https://chocolatey.org/packages/python.x86 * https://chocolatey.org/packages/python2 * https://chocolatey.org/packages/python-x86_32 * https://chocolatey.org/packages/python3 Python(x,y): * https://code.google.com/p/pythonxy/ For Anaconda (the MS Azure chosen python distribution): * http://docs.continuum.io/anaconda/install.html#windows-install ... These should/could/are checking GPG signatures for Windows packages downstream. http://www.scipy.org/install.html On Apr 3, 2015 5:38 PM, "M.-A.
Lemburg" > wrote: On 04.04.2015 00:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. Hashes only provide checks against file corruption (and then only if you can trust the hash values). GPG provides all the benefits of public key encryption on arbitrary files (not just code). The main benefit in case of downloadable installers is to be able to make sure that the files are authentic, meaning that they were created and signed by the people listed as packagers. There is no CA infrastructure involved as for SSL certificates or Authenticode, but it's easy to get the keys from key servers given the key signatures available from python.org's download pages. If you want to sign a package file using GPG, you will need to create your own key, upload it to the key servers and then place the signature up on the download page. Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed: https://www.python.org/ftp/python/3.4.3/ Cheers, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: M.-A. Lemburg> > Sent: ?4/?3/?2015 10:55 > To: Steve Dower>; Larry Hastings>; Python Dev>; python-committers> > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? > > On 03.04.2015 19:35, Steve Dower wrote: >>> My Windows development days are firmly behind me. So I don't really have an >>> opinion here. So I put it to you, Windows Python developers: do you care about >>> GnuPG signatures on Windows-specific files? Or do you not care? >> >> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). > > FWIW: I regularly check the GPG sigs on all important downloaded > files, regardless of which platform they target, including the > Windows installers for Python or any other Windows installers > I use which provide such sigs. 
> > The reason is simple: > The signature is a proof of authenticity which is not bound to > a particular file format or platform and before running .exes > it's good to know that they were built by the right people and > not manipulated by trojans, viruses or malicious proxies. > > Is that a good enough reason to continue providing the GPG > sigs or do you need more proof of goodness ? ;-) > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bcannon at gmail.com Sat Apr 4 16:33:57 2015 From: bcannon at gmail.com (Brett Cannon) Date: Sat, 04 Apr 2015 14:33:57 +0000 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (e10ad4d4d490): sum=333 In-Reply-To: <20150404084759.15211.4930@psf.io> References: <20150404084759.15211.4930@psf.io> Message-ID: Anyone know what is causing the deque leakage? On Sat, Apr 4, 2015, 04:48 wrote: > results for e10ad4d4d490 on branch "default" > -------------------------------------------- > > test_collections leaked [0, -4, 0] references, sum=-4 > test_collections leaked [0, -2, 0] memory blocks, sum=-2 > test_deque leaked [91, 91, 91] references, sum=273 > test_deque leaked [21, 21, 21] memory blocks, sum=63 > test_functools leaked [0, 0, 3] memory blocks, sum=3 > > > Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', > '3:3:/home/psf-users/antoine/refleaks/reflogK5OSop', '--timeout', '7200'] > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjamin at python.org Sat Apr 4 16:53:18 2015 From: benjamin at python.org (Benjamin Peterson) Date: Sat, 04 Apr 2015 10:53:18 -0400 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (e10ad4d4d490): sum=333 In-Reply-To: References: <20150404084759.15211.4930@psf.io> Message-ID: <1428159198.3969075.249273981.63788F29@webmail.messagingengine.com> On Sat, Apr 4, 2015, at 10:33, Brett Cannon wrote: > Anyone know what is causing the deque leakage? https://hg.python.org/cpython/rev/3409f4d945e8 From Steve.Dower at microsoft.com Sat Apr 4 16:41:14 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sat, 4 Apr 2015 14:41:14 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
In-Reply-To: <551F1650.8070808@egenix.com> References: <551E63E5.6080805@hastings.org> , <551ED41D.2070909@egenix.com> , <551F1650.8070808@egenix.com> Message-ID: "Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed" This is the point of this discussion. I'm willing to make such a break because I believe Authenticode is so much more convenient for end users that it isn't worth producing GPG signatures. So far, the responses seem to be: "I'd use them on Windows" x1 "I'd consider using them on another OS" x2-3 "Please don't change" everyone else At least that's the impression I'm getting, so I hope that helps clarify why I'm still not convinced it's that critical. Cheers, Steve Top-posted from my Windows Phone ________________________________ From: M.-A. Lemburg Sent: ?4/?3/?2015 15:38 To: Steve Dower; Larry Hastings; Python Dev; python-committers Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? On 04.04.2015 00:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. Hashes only provide checks against file corruption (and then only if you can trust the hash values). GPG provides all the benefits of public key encryption on arbitrary files (not just code). The main benefit in case of downloadable installers is to be able to make sure that the files are authentic, meaning that they were created and signed by the people listed as packagers. There is no CA infrastructure involved as for SSL certificates or Authenticode, but it's easy to get the keys from key servers given the key signatures available from python.org's download pages. If you want to sign a package file using GPG, you will need to create your own key, upload it to the key servers and then place the signature up on the download page. Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed: https://www.python.org/ftp/python/3.4.3/ Cheers, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: M.-A. 
Lemburg > Sent: 4/3/2015 10:55 > To: Steve Dower; Larry Hastings; Python Dev; python-committers > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? > > On 03.04.2015 19:35, Steve Dower wrote: >>> My Windows development days are firmly behind me. So I don't really have an >>> opinion here. So I put it to you, Windows Python developers: do you care about >>> GnuPG signatures on Windows-specific files? Or do you not care? >> >> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). > > FWIW: I regularly check the GPG sigs on all important downloaded > files, regardless of which platform they target, including the > Windows installers for Python or any other Windows installers > I use which provide such sigs. > > The reason is simple: > The signature is a proof of authenticity which is not bound to > a particular file format or platform and before running .exes > it's good to know that they were built by the right people and > not manipulated by trojans, viruses or malicious proxies. > > Is that a good enough reason to continue providing the GPG > sigs or do you need more proof of goodness ? ;-) > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Sat Apr 4 17:04:10 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Sat, 4 Apr 2015 17:04:10 +0200 Subject: [Python-Dev] Socket timeout: reset timeout at each successful syscall? In-Reply-To: References: <20150404132725.21892ee5@fsol> Message-ID: Le samedi 4 avril 2015, Ludovic Gasc a écrit : > > From a user's point of view, it should count for the total time, IMO. >> If people want a timeout for each syscall, they should call send() >> iteratively. > > I agree with Antoine on a global timeout. > Ok, I also agree. I will modify sendall in Python 3.5 and suggest to loop on send to keep the same behaviour on timeout. I don't want to change Python 3.4 or 2.7, it would be strange to have a different behaviour in a minor Python version. Victor -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Sat Apr 4 18:10:23 2015 From: barry at python.org (Barry Warsaw) Date: Sat, 4 Apr 2015 12:10:23 -0400 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> Message-ID: <20150404121023.3fefee2f@limelight.wooz.org> On Apr 04, 2015, at 02:41 PM, Steve Dower wrote: >"Relying only on Authenticode for Windows installers would result in a break >in technology w/r to the downloads we make available for Python, since all >other files are (usually) GPG signed" It's the "only" part I have a question about. Does the use of Authenticode preclude detached GPG signatures of the exe file? I can't see how it would, but maybe there's something (well, a lot of somethings ;) I don't know about Windows. If not, then what's the problem with also providing a GPG signature? Cheers, -Barry From brett at python.org Sat Apr 4 18:37:52 2015 From: brett at python.org (Brett Cannon) Date: Sat, 04 Apr 2015 16:37:52 +0000 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (e10ad4d4d490): sum=333 In-Reply-To: <1428159198.3969075.249273981.63788F29@webmail.messagingengine.com> References: <20150404084759.15211.4930@psf.io> <1428159198.3969075.249273981.63788F29@webmail.messagingengine.com> Message-ID: Thanks for fixing it! On Sat, Apr 4, 2015, 10:53 Benjamin Peterson wrote: > > > On Sat, Apr 4, 2015, at 10:33, Brett Cannon wrote: > > Anyone know what is causing the deque leakage? > > https://hg.python.org/cpython/rev/3409f4d945e8 > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mal at egenix.com Sat Apr 4 18:57:28 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sat, 04 Apr 2015 18:57:28 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> , <551ED41D.2070909@egenix.com> , <551F1650.8070808@egenix.com> Message-ID: <552017F8.3060401@egenix.com> On 04.04.2015 16:41, Steve Dower wrote: > "Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed" > > This is the point of this discussion. I'm willing to make such a break because I believe Authenticode is so much more convenient for end users that it isn't worth producing GPG signatures. So far, the responses seem to be: > > "I'd use them on Windows" x1 > "I'd consider using them on another OS" x2-3 > "Please don't change" everyone else > > At least that's the impression I'm getting, so I hope that helps clarify why I'm still not convinced it's that critical. Just to clarify: I have absolutely nothing against using Authenticode on Windows :-) I'm only trying to convince you that *additionally* providing GPG sigs for Windows downloads is a good thing and we should not stop doing this, since it makes verification of downloaded files easier. It's not hard to do, can be automated and provides additional security which can be verified on any platform, not only Windows. Cheers, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... 
http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From Steve.Dower at microsoft.com Sat Apr 4 16:35:40 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sat, 4 Apr 2015 14:35:40 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com>, , Message-ID: Small clarification: these certificates *are* the same format as for SSL, and OpenSSL is able to validate them in the same way as well as generate them (but not extract embedded ones, AFAICT). But generally SSL certificates are not marked as suitable for code signing, so you need to buy a separate one. Both Martin and I have the PSF's code signing cert private key, which is how we can sign with the "Python Software Foundation" name. The public key is embedded into every signed file, just like an SSL cert is attached to a site or an S/MIME cert is embedded in a signed email. Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Steve Dower Sent: 4/4/2015 7:25 To: Wes Turner; M. -A. Lemburg Cc: python-committers; Python-Dev Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? "Authenticode does not have a PKI" If you got that from this discussion, I need everyone to at least skim read this: https://msdn.microsoft.com/en-us/library/ie/ms537361(v=vs.85).aspx Authenticode uses the same certificate infrastructure as SSL (note: not the same certificates). As I see it, anyone running on Windows has access to verification that is at least as good as GPG, and the only people who would benefit from GPG sigs are those checking Windows files on another OS or those with an existing GPG workflow on Windows (before this thread, I knew nobody who used GPG on Windows for anything, so forgive me for thinking this is very rare). Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Wes Turner Sent: 4/4/2015 6:42 To: M. -A. Lemburg Cc: Python-Dev; python-committers; Larry Hastings; Steve Dower Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? So, AFAIU from this discussion: * Authenticode does not have a PKI * GPG does have a PKI * ASC signatures are signed checksums As far as downstream packaging on Windows (people who should/could be subscribed to release ANNs): For Chocolatey NuGet:

* https://chocolatey.org/packages/python
* https://chocolatey.org/packages/python.x86
* https://chocolatey.org/packages/python2
* https://chocolatey.org/packages/python-x86_32
* https://chocolatey.org/packages/python3

Python(x,y):

* https://code.google.com/p/pythonxy/

For Anaconda (the MS Azure chosen python distribution):

* http://docs.continuum.io/anaconda/install.html#windows-install

... These should/could/are checking GPG signatures for Windows packages downstream. http://www.scipy.org/install.html On Apr 3, 2015 5:38 PM, "M.-A.
Lemburg" > wrote: On 04.04.2015 00:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. Hashes only provide checks against file corruption (and then only if you can trust the hash values). GPG provides all the benefits of public key encryption on arbitrary files (not just code). The main benefit in case of downloadable installers is to be able to make sure that the files are authentic, meaning that they were created and signed by the people listed as packagers. There is no CA infrastructure involved as for SSL certificates or Authenticode, but it's easy to get the keys from key servers given the key signatures available from python.org's download pages. If you want to sign a package file using GPG, you will need to create your own key, upload it to the key servers and then place the signature up on the download page. Relying only on Authenticode for Windows installers would result in a break in technology w/r to the downloads we make available for Python, since all other files are (usually) GPG signed: https://www.python.org/ftp/python/3.4.3/ Cheers, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: M.-A. Lemburg> > Sent: ?4/?3/?2015 10:55 > To: Steve Dower>; Larry Hastings>; Python Dev>; python-committers> > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? > > On 03.04.2015 19:35, Steve Dower wrote: >>> My Windows development days are firmly behind me. So I don't really have an >>> opinion here. So I put it to you, Windows Python developers: do you care about >>> GnuPG signatures on Windows-specific files? Or do you not care? >> >> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). > > FWIW: I regularly check the GPG sigs on all important downloaded > files, regardless of which platform they target, including the > Windows installers for Python or any other Windows installers > I use which provide such sigs. 
> > The reason is simple: > The signature is a proof of authenticity which is not bound to > a particular file format or platform and before running .exes > it's good to know that they were built by the right people and > not manipulated by trojans, viruses or malicious proxies. > > Is that a good enough reason to continue providing the GPG > sigs or do you need more proof of goodness ? ;-) > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From kbk at shore.net Sat Apr 4 21:02:05 2015 From: kbk at shore.net (Kurt B. Kaiser) Date: Sat, 04 Apr 2015 15:02:05 -0400 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> Message-ID: <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> For the record, that is a Symantec/Verisign code signing certificate. We paid $1123 for it last April. It expires April 2017. If you don't switch to a different vendor, e.g. startssl, please contact me for renewal in 2017. KBK On Sat, Apr 4, 2015, at 10:35 AM, Steve Dower wrote: > Small clarification: there certificates *are* the same format as for SSL, > and OpenSSL it's able to validate them in the same way as well as > generate them (but not extract embedded ones, AFAICT). But generally SSL > certificates are not marked as suitable for code signing so you need to > buy a separate one. > > Both Martin and I have the PSF's code signing cert private key, which is > how we can sign with the "Python Software Foundation" name. The public > key is embedded into every signed file, just like an SSL cert is attached > to a site or an S/MIME cert is embedded in a signed email. > > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: Steve Dower > Sent: ?4/?4/?2015 7:25 > To: Wes Turner; M. -A. > Lemburg > Cc: python-committers; > Python-Dev > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows > files with GnuPG? > > "Authenticode does not have a PKI" > > If you got that from this discussion, I need everyone to at least skim > read this: > https://msdn.microsoft.com/en-us/library/ie/ms537361(v=vs.85).aspx > > Authenticode uses the same certificate infrastructure as SSL (note: not > the same certificates). 
As I see it, anyone running on Windows has access > to verification that is at least as good as GPG, and the only people who > would benefit from GPG sigs are those checking Windows files on another > OS or those with an existing GPG workflow on Windows (before this thread, > I knew nobody who used GPG on Windows for anything, so forgive me for > thinking this is very rare). > > Cheers, > Steve > > Top-posted from my Windows Phone > ________________________________ > From: Wes Turner > Sent: ?4/?4/?2015 6:42 > To: M. -A. Lemburg > Cc: Python-Dev; > python-committers; Larry > Hastings; Steve > Dower > Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows > files with GnuPG? > > > So, AFAIU from this discussion: > > * Authenticode does not have a PKI > * GPG does have PKI > * ASC signatures are signed checksums > > As far as downstream packaging on Windows (people who should/could be > subscribed to release ANNs): > > For Choclatey NuGet: > > * https://chocolatey.org/packages/python > * https://chocolatey.org/packages/python.x86 > * https://chocolatey.org/packages/python2 > * https://chocolatey.org/packages/python-x86_32 > * https://chocolatey.org/packages/python3 > > Python(x,y): > > * https://code.google.com/p/pythonxy/ > > For Anaconda (the MS Azure chosen python distribution): > > * http://docs.continuum.io/anaconda/install.html#windows-install > > ... > > These should/could/are checking GPG signatures for Windows packages > downstream. > > http://www.scipy.org/install.html > > On Apr 3, 2015 5:38 PM, "M.-A. Lemburg" > > wrote: > On 04.04.2015 00:14, Steve Dower wrote: > > The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) > > > > Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. > > Hashes only provide checks against file corruption (and then > only if you can trust the hash values). GPG provides all the > benefits of public key encryption on arbitrary files (not just > code). > > The main benefit in case of downloadable installers is to > be able to make sure that the files are authentic, meaning that > they were created and signed by the people listed as packagers. > > There is no CA infrastructure involved as for SSL certificates > or Authenticode, but it's easy to get the keys from key servers > given the key signatures available from python.org's > download > pages. > > If you want to sign a package file using GPG, you will need > to create your own key, upload it to the key servers and then > place the signature up on the download page. > > Relying only on Authenticode for Windows installers would > result in a break in technology w/r to the downloads we > make available for Python, since all other files are (usually) > GPG signed: > > https://www.python.org/ftp/python/3.4.3/ > > Cheers, > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Source > >>> Python/Zope Consulting and Support ... http://www.egenix.com/ > >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ > >>> mxODBC, mxDateTime, mxTextTools ... 
http://python.egenix.com/ > ________________________________________________________________________ > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > > > > Cheers, > > Steve > > > > Top-posted from my Windows Phone > > ________________________________ > > From: M.-A. Lemburg> > > Sent: ?4/?3/?2015 10:55 > > To: Steve Dower>; Larry Hastings>; Python Dev>; python-committers> > > Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? > > > > On 03.04.2015 19:35, Steve Dower wrote: > >>> My Windows development days are firmly behind me. So I don't really have an > >>> opinion here. So I put it to you, Windows Python developers: do you care about > >>> GnuPG signatures on Windows-specific files? Or do you not care? > >> > >> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). > > > > FWIW: I regularly check the GPG sigs on all important downloaded > > files, regardless of which platform they target, including the > > Windows installers for Python or any other Windows installers > > I use which provide such sigs. > > > > The reason is simple: > > The signature is a proof of authenticity which is not bound to > > a particular file format or platform and before running .exes > > it's good to know that they were built by the right people and > > not manipulated by trojans, viruses or malicious proxies. > > > > Is that a good enough reason to continue providing the GPG > > sigs or do you need more proof of goodness ? ;-) > > > > -- > > Marc-Andre Lemburg > > eGenix.com > > > > Professional Python Services directly from the Source > >>>> Python/Zope Consulting and Support ... http://www.egenix.com/ > >>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ > >>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ > > ________________________________________________________________________ > > > > ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: > > > > > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > > Registered at Amtsgericht Duesseldorf: HRB 46611 > > http://www.egenix.com/company/contact/ > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers From mal at egenix.com Sat Apr 4 21:35:09 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sat, 04 Apr 2015 21:35:09 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
In-Reply-To: <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> Message-ID: <55203CED.5030609@egenix.com> On 04.04.2015 21:02, Kurt B. Kaiser wrote: > For the record, that is a Symantec/Verisign code signing certificate. We > paid $1123 for it last April. It expires April 2017. > > If you don't switch to a different vendor, e.g. startssl, please contact > me for renewal in 2017. FWIW: The PSF mostly uses StartSSL nowadays and they also support code signing certificates. Given that this option is a lot cheaper than Verisign, I think we should switch, unless there are significant reasons not to. We should revisit this in 2017. > KBK > > On Sat, Apr 4, 2015, at 10:35 AM, Steve Dower wrote: >> Small clarification: there certificates *are* the same format as for SSL, >> and OpenSSL it's able to validate them in the same way as well as >> generate them (but not extract embedded ones, AFAICT). But generally SSL >> certificates are not marked as suitable for code signing so you need to >> buy a separate one. >> >> Both Martin and I have the PSF's code signing cert private key, which is >> how we can sign with the "Python Software Foundation" name. The public >> key is embedded into every signed file, just like an SSL cert is attached >> to a site or an S/MIME cert is embedded in a signed email. >> >> Cheers, >> Steve >> >> Top-posted from my Windows Phone >> ________________________________ >> From: Steve Dower >> Sent: ?4/?4/?2015 7:25 >> To: Wes Turner; M. -A. >> Lemburg >> Cc: python-committers; >> Python-Dev >> Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows >> files with GnuPG? >> >> "Authenticode does not have a PKI" >> >> If you got that from this discussion, I need everyone to at least skim >> read this: >> https://msdn.microsoft.com/en-us/library/ie/ms537361(v=vs.85).aspx >> >> Authenticode uses the same certificate infrastructure as SSL (note: not >> the same certificates). As I see it, anyone running on Windows has access >> to verification that is at least as good as GPG, and the only people who >> would benefit from GPG sigs are those checking Windows files on another >> OS or those with an existing GPG workflow on Windows (before this thread, >> I knew nobody who used GPG on Windows for anything, so forgive me for >> thinking this is very rare). >> >> Cheers, >> Steve >> >> Top-posted from my Windows Phone >> ________________________________ >> From: Wes Turner >> Sent: ?4/?4/?2015 6:42 >> To: M. -A. Lemburg >> Cc: Python-Dev; >> python-committers; Larry >> Hastings; Steve >> Dower >> Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows >> files with GnuPG? >> >> >> So, AFAIU from this discussion: >> >> * Authenticode does not have a PKI >> * GPG does have PKI >> * ASC signatures are signed checksums >> >> As far as downstream packaging on Windows (people who should/could be >> subscribed to release ANNs): >> >> For Choclatey NuGet: >> >> * https://chocolatey.org/packages/python >> * https://chocolatey.org/packages/python.x86 >> * https://chocolatey.org/packages/python2 >> * https://chocolatey.org/packages/python-x86_32 >> * https://chocolatey.org/packages/python3 >> >> Python(x,y): >> >> * https://code.google.com/p/pythonxy/ >> >> For Anaconda (the MS Azure chosen python distribution): >> >> * http://docs.continuum.io/anaconda/install.html#windows-install >> >> ... 
>> >> These should/could/are checking GPG signatures for Windows packages >> downstream. >> >> http://www.scipy.org/install.html >> >> On Apr 3, 2015 5:38 PM, "M.-A. Lemburg" >> > wrote: >> On 04.04.2015 00:14, Steve Dower wrote: >>> The thing is, that's exactly the same goodness as Authenticode gives, except everyone gets that for free and meanwhile you're the only one who has admitted to using GPG on Windows :) >>> >>> Basically, what I want to hear is that GPG sigs provide significantly better protection than hashes (and I can provide better than MD5 for all files if it's useful), taking into consideration that (I assume) I'd have to obtain a signing key for GPG and unless there's a CA involved like there is for Authenticode, there's no existing trust in that key. >> >> Hashes only provide checks against file corruption (and then >> only if you can trust the hash values). GPG provides all the >> benefits of public key encryption on arbitrary files (not just >> code). >> >> The main benefit in case of downloadable installers is to >> be able to make sure that the files are authentic, meaning that >> they were created and signed by the people listed as packagers. >> >> There is no CA infrastructure involved as for SSL certificates >> or Authenticode, but it's easy to get the keys from key servers >> given the key signatures available from python.org's >> download >> pages. >> >> If you want to sign a package file using GPG, you will need >> to create your own key, upload it to the key servers and then >> place the signature up on the download page. >> >> Relying only on Authenticode for Windows installers would >> result in a break in technology w/r to the downloads we >> make available for Python, since all other files are (usually) >> GPG signed: >> >> https://www.python.org/ftp/python/3.4.3/ >> >> Cheers, >> -- >> Marc-Andre Lemburg >> eGenix.com >> >> Professional Python Services directly from the Source >>>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ >> ________________________________________________________________________ >> >> ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: >> >> >> eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 >> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg >> Registered at Amtsgericht Duesseldorf: HRB 46611 >> http://www.egenix.com/company/contact/ >> >> >>> Cheers, >>> Steve >>> >>> Top-posted from my Windows Phone >>> ________________________________ >>> From: M.-A. Lemburg> >>> Sent: ?4/?3/?2015 10:55 >>> To: Steve Dower>; Larry Hastings>; Python Dev>; python-committers> >>> Subject: Re: [python-committers] [Python-Dev] Do we need to sign Windows files with GnuPG? >>> >>> On 03.04.2015 19:35, Steve Dower wrote: >>>>> My Windows development days are firmly behind me. So I don't really have an >>>>> opinion here. So I put it to you, Windows Python developers: do you care about >>>>> GnuPG signatures on Windows-specific files? Or do you not care? >>>> >>>> The later replies seem to suggest that they are general goodness that nobody on Windows will use. If someone convinces me (or steamrolls me, that's fine too) that the goodness of GPG is better than a hash then I'll look into adding it into the process. Otherwise I'll happily add hash generation into the upload process (which I'm going to do anyway for the ones displayed on the download page). 
>>> >>> FWIW: I regularly check the GPG sigs on all important downloaded >>> files, regardless of which platform they target, including the >>> Windows installers for Python or any other Windows installers >>> I use which provide such sigs. >>> >>> The reason is simple: >>> The signature is a proof of authenticity which is not bound to >>> a particular file format or platform and before running .exes >>> it's good to know that they were built by the right people and >>> not manipulated by trojans, viruses or malicious proxies. >>> >>> Is that a good enough reason to continue providing the GPG >>> sigs or do you need more proof of goodness ? ;-) >>> >>> -- >>> Marc-Andre Lemburg >>> eGenix.com >>> >>> Professional Python Services directly from the Source >>>>>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>>>>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>>>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ >>> ________________________________________________________________________ >>> >>> ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: >>> >>> >>> eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 >>> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg >>> Registered at Amtsgericht Duesseldorf: HRB 46611 >>> http://www.egenix.com/company/contact/ >>> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com >> _______________________________________________ >> python-committers mailing list >> python-committers at python.org >> https://mail.python.org/mailman/listinfo/python-committers > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source >>> Python/Zope Consulting and Support ... http://www.egenix.com/ >>> mxODBC.Zope.Database.Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::: Try our new mxODBC.Connect Python Database Interface for free ! :::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From kbk at shore.net Sat Apr 4 21:49:01 2015 From: kbk at shore.net (Kurt B. Kaiser) Date: Sat, 04 Apr 2015 15:49:01 -0400 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <55203CED.5030609@egenix.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> <55203CED.5030609@egenix.com> Message-ID: <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com> On Sat, Apr 4, 2015, at 03:35 PM, M.-A. Lemburg wrote: > On 04.04.2015 21:02, Kurt B. Kaiser wrote: > > For the record, that is a Symantec/Verisign code signing > > certificate. We paid $1123 for it last April. It expires > > April 2017. > > > > If you don't switch to a different vendor, e.g. startssl, please > > contact me for renewal in 2017. 
> > FWIW: The PSF mostly uses StartSSL nowadays and they also support code > signing certificates. Given that this option is a lot cheaper than > Verisign, I think we should switch, unless there are significant > reasons not to. We should revisit this in 2017. Agree - apparently the startssl process for getting a signing cert is complex/obscure, so we should start early. Let me know if I can help providing PSF organization verification. KBK From mal at egenix.com Sat Apr 4 21:54:28 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Sat, 04 Apr 2015 21:54:28 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> <55203CED.5030609@egenix.com> <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com> Message-ID: <55204174.9010404@egenix.com> On 04.04.2015 21:49, Kurt B. Kaiser wrote: > > > On Sat, Apr 4, 2015, at 03:35 PM, M.-A. Lemburg wrote: > >> On 04.04.2015 21:02, Kurt B. Kaiser wrote: > >>> For the record, that is a Symantec/Verisign code signing > >>> certificate.
We paid $1123 for it last April. It expires > >>> April 2017. > >>> > >>> If you don't switch to a different vendor, e.g. startssl, please > >>> contact me for renewal in 2017. > >> > >> FWIW: The PSF mostly uses StartSSL nowadays and they also support code > >> signing certificates. Given that this option is a lot cheaper than > >> Verisign, I think we should switch, unless there are significant > >> reasons not to. We should revisit this in 2017. > > > > Agree - apparently the startssl process for getting a signing cert is > > complex/obscure, so we should start early. > > Not really. Once you have the org verification it's really easy. > > > Let me know if I can help providing PSF organization verification. > > I already completed that for the current cycle. One can hope. We shall see :-) KBK From ericsnowcurrently at gmail.com Sun Apr 5 00:00:55 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Sat, 4 Apr 2015 16:00:55 -0600 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: References: Message-ID: On Fri, Apr 3, 2015 at 6:44 AM, Martin Teichmann wrote: >> When I first wrote PEP 422 I was of the view that "Python 2 allows >> class definition postprocessing injection, we should allow it in >> Python 3 as well". I've since changed my view to "Having to declare >> post-processing of a class definition up front as a decorator, base >> class or metaclass is a good thing for readability, as otherwise >> there's nothing obvious when reading a class definition that tells you >> whether or not postprocessing may happen, so you have to assume it's >> possible for *every* class definition". > > Nick, I couldn't agree more with you, yet I think PJ actually brought > up a very interesting point. Post-processing is a very common thing > these days, and has been re-written so many times that I think it is > about time that something like it should be in the standard library. > > I'm less thinking about decorated methods, more about descriptors. > They always have the problem that they don't know which attribute they > belong to, so every author of a framework that defines descriptors > writes a metaclass which goes through all the descriptors and tells > them their attribute name. > > I propose to have some metaclass in the standard library that does > that. I think it would fit nicely in my metaclass module proposed in > PEP 487. > > It would basically do the following:
>
>     class Metaclass(type):
>         def __init__(self, name, bases, dict):
>             super().__init__(name, bases, dict)
>             for k, v in dict.items():
>                 if hasattr(v, "__post_process__"):
>                     v.__post_process__(k, self)
>
> So each descriptor could define a __post_process__ hook that tells > it the attribute name and also the class it belongs to. This would for > sure also work for decorated methods. > > This should mature on PyPI, then be introduced into the standard library, > and if demand is really that high, maybe even be introduced into > type.__init__. It should be noted that this can also be easily written > as a PEP 487 class using __subclass_init__; I just used the classical > metaclass notion as I guess people are more used to that. > > This proposal can actually be seen as an extension to the __class__ > and super() mechanism of normal methods: methods currently have the > privilege to know which classes they are defined in, while descriptors > don't. So we could unify all this by giving functions a __post_process__ > method which sets the __class__ in the function body.
This is about the > same as what happened when functions got a __get__ method to turn > them into object methods. I've felt for a long time that it would be helpful in some situations to have a reverse descriptor protocol. What you are describing it a strict subset of that concept (which is fine). It may be worth considering a method name that would also be used for that more generic reverse descriptor, rather than having 2 names for the same thing. Even if such a protocol never materializes, the name borrowed here would still be informative. -eric From greg.ewing at canterbury.ac.nz Sun Apr 5 02:40:48 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Sun, 05 Apr 2015 12:40:48 +1200 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: References: Message-ID: <55208490.4080106@canterbury.ac.nz> Eric Snow wrote: > I've felt for a long time that it would be helpful in some situations > to have a reverse descriptor protocol. Can you elaborate on what you mean by that? -- Greg From Steve.Dower at microsoft.com Sun Apr 5 03:07:10 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sun, 5 Apr 2015 01:07:10 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <20150404121023.3fefee2f@limelight.wooz.org> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> , <20150404121023.3fefee2f@limelight.wooz.org> Message-ID: There's no problem, per se, but initially it was less trouble to use the trusted PSF certificate and native support than to add an extra step using a program I don't already use and trust, am restricted in use by my employer (because of the license and the fact there are alternatives), and developing the trust in a brand new certificate. Eventually the people saying "do it" will win through sheer persistence, since I'll get sick of trying to get a more detailed response and just concede. Not sure if that's how we want to be running the project though... Top-posted from my Windows Phone ________________________________ From: Barry Warsaw Sent: ?4/?4/?2015 9:11 To: python-dev at python.org Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? On Apr 04, 2015, at 02:41 PM, Steve Dower wrote: >"Relying only on Authenticode for Windows installers would result in a break >in technology w/r to the downloads we make available for Python, since all >other files are (usually) GPG signed" It's the "only" part I have a question about. Does the use of Authenticode preclude detached GPG signatures of the exe file? I can't see how it would, but maybe there's something (well, a lot of somethings ;) I don't know about Windows. If not, then what's the problem with also providing a GPG signature? Cheers, -Barry _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Sun Apr 5 03:33:36 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 5 Apr 2015 11:33:36 +1000 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: References: Message-ID: On 4 April 2015 at 06:36, PJ Eby wrote: > On Fri, Apr 3, 2015 at 4:21 AM, Nick Coghlan wrote: >> No, you can't do it currently without risking a backwards >> incompatibility through the introduction of a custom metaclass. > > Right... which is precisely why I'm suggesting the `noconflict()` > metaclass factory function as a *general* solution for providing > useful metaclasses, and why I think that PEP 487 should break the > namespacing and subclass init features into separate metaclasses, and > add that noconflict feature. It will then become a good example for > people moving forward writing metaclasses. > > Basically, as long as you don't have the pointless conflict errors, > you can write co-operative metaclass mixins as easily as you can write > regular co-operative mixins. I was missing this point myself because > I've been too steeped in Python 2's complexities: writing a usable > version of `noconflict()` is a lot more complex and its invocation far > more obscure. In Python 2, there's classic classes, class- and > module-level __metaclass__, ExtensionClass, and all sorts of other > headaches for automatic mixing. In Python 3, though, all that stuff > goes out the window, and even my 90-line version that's almost half > comments is probably still overengineered compared to what's actually > needed to do the mixing. D'oh, I had the same problem you did - I'd been assuming this was entirely infeasible because of all the complexities it involved back in Python 2, and had never adequately reconsidered the question in a PEP 3115 based world :( So actually reading https://gist.github.com/pjeby/75ca26f8d2a7a0c68e30 properly, you're starting to convince me that a "noconflict" metaclass resolver would be a valuable and viable addition to the Python 3 type system machinery. The future possible language level enhancement would then be to make that automatic resolution of metaclass conflicts part of the *default* metaclass determination process. I realise you've been trying to explain that to me for a few days now, I'm just writing it out explicitly to make it clear I finally get it :) >> Given my change of heart, I believe that at this point, if you were >> willing to champion a revived PEP 422 that implemented the behaviour >> you're after, that would be ideal, with monkeypatching the desired >> behaviour in as a fallback plan if the PEP is still ultimately >> rejected. Alternatively, you could go the monkeypatching path first, >> and then potentially seek standardisation later after you've had some >> practical experience with it - I now consider it an orthogonal >> capability to the feature in PEP 487, so the acceptance of the latter >> wouldn't necessarily preclude acceptance of a hook for class >> postprocessing injection. > > A lot of things have changed since the original discussion, mostly in > the direction of me having even *less* time for Python work than > previously, so it's unlikely that me championing a PEP is a realistic > possibility. Frankly, I'm immensely fatigued at the discussion > *already*, and the need to go over the same questions a *third* time > seems like not something I'm going to want to put energy into. 
Heh, one of the main reasons PEP 422 ended up languishing for so long is that I started putting more time into other projects (PyPA, the PSF, the import system, Python 3 advocacy, etc), so with both you & me occupied elsewhere, we didn't really have anyone driving the discussion forward on the metaclass machinery side of things. Martin's very pertinent challenges to some of the unnecessary design complexity in PEP 422 have noticeably changed that dynamic for the better :) > However it sounds like there *is* some growing consensus towards the > idea of simply notifying interested class members of their class > membership, so if there ends up being a consensus to standardize > *that* protocol and what part of the class-building process it gets > invoked in, then I will implement a backport (or use such a backport > if someone else implements it), when I actually start porting my > libraries to Python 3. But that would make my timeline somewhat > dependent on how much of a consensus there is, and how much clarity I > could get before going forward. In a separate RFE, Martin convinced me that we really want to kick as much of this off from type.__init__ as we can, and that the problem with zero-argument super() currently not working when called from metaclass __init__ methods should be treated as a bug in the way zero-argument super() is currently implemented. That position makes a lot of sense to me (especially since it was backed up with a patch to fix the bug using a modified implementation that's inspired by the way that setting __qualname__ works), and is what makes it possible to assume we can just use the metaclass system to deal with this, rather than having to rely on modifications to __build_class__.
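To make that direction concrete, here is a minimal sketch of driving subclass notification from type.__init__. All names (SubclassInitMeta, __subclass_init__, Plugin) are illustrative only, and this is not the PEP 487 reference implementation; the explicit two-argument form of super() is used deliberately, since the zero-argument form inside a metaclass __init__ is exactly the behaviour described above as buggy:

    class SubclassInitMeta(type):
        def __init__(cls, name, bases, namespace):
            # Explicit super() call; see the zero-argument super() caveat above.
            super(SubclassInitMeta, cls).__init__(name, bases, namespace)
            # Notify the nearest ancestor that defines the hook, so a class
            # defining __subclass_init__ is not asked to process itself.
            for base in cls.__mro__[1:]:
                hook = base.__dict__.get('__subclass_init__')
                if hook is not None:
                    hook(cls)
                    break

    class Plugin(metaclass=SubclassInitMeta):
        registry = []
        def __subclass_init__(cls):
            Plugin.registry.append(cls)

    class JsonPlugin(Plugin):
        pass  # automatically recorded in Plugin.registry

Because the hook runs from type.__init__, every subclass is registered at class-creation time with no decorator and no extra metaclass needed in user code.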
My apologies for that - while I don't actually recall what I was thinking when I said it, I suspect I was all fired up that PEP 422 was definitely the right answer, and hence thought I'd have an official solution in place for you in fairly short order. I should have let you know explicitly when I started having doubts about it, so you could reassess your porting options. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sun Apr 5 03:59:01 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 5 Apr 2015 11:59:01 +1000 Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration) In-Reply-To: <55208490.4080106@canterbury.ac.nz> References: <55208490.4080106@canterbury.ac.nz> Message-ID: On 5 April 2015 at 10:40, Greg Ewing wrote: > Eric Snow wrote: >> >> I've felt for a long time that it would be helpful in some situations >> to have a reverse descriptor protocol. > > > Can you elaborate on what you mean by that? My guess from the name and the context: having a way to notify descriptor objects when they're added to or removed from a class. That is, the current descriptor protocol works as follows: 1. Descriptor *instances* are stored on *class* objects 2. Descriptor *instance methods* are invoked to control class *instance* attribute access & modification 3. Descriptors play no role in *class* attribute access & modification, you have to put descriptors on a metaclass for that A "reverse descriptor" protocol would be aimed at addressing point 3 without needing a custom metaclass. For example, suppose we took Martin's "__post_process__" idea, and instead changed it to: def __addclassattr__(self, cls, attr): "Notifies the descriptor of a new reference from a class attribute" ... def __delclassattr__(self, cls, attr): "Notifies the descriptor of the removal of a reference from a class attribute" ... If a descriptor had an "__addclassattr__" defined, then not only would type.__init__ call it during type creation, but so would class attribute assignment (on the descriptor being assigned, not on one that is already present). Class attribute *deletion* would call "descr.__delclassattr__(cls, attr)" on the current descriptor value, and "__delclassattr__" would also be called on the *current* attribute descriptor when a class attribute is set to something different. Class attribute retrieval would remain unaffected, and just return the descriptor object as it does now. It would be up to the descriptor implementor to decide how to handle "__addclassattr__" being called multiple times on the same descriptor instance (e.g. you might allow it for aliasing purposes within a single class, but throw an exception if you try to register the same instance with a second class without removing it from the old one first). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From njs at pobox.com Sun Apr 5 05:21:49 2015 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 4 Apr 2015 20:21:49 -0700 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> Message-ID: On Sat, Apr 4, 2015 at 6:07 PM, Steve Dower wrote: > There's no problem, per se, but initially it was less trouble to use the > trusted PSF certificate and native support than to add an extra step using a > program I don't already use and trust, am restricted in use by my employer > (because of the license and the fact there are alternatives), and developing > the trust in a brand new certificate. > > Eventually the people saying "do it" will win through sheer persistence, > since I'll get sick of trying to get a more detailed response and just > concede. Not sure if that's how we want to be running the project though... I don't get the impression that there's any particularly detailed rationale that people aren't giving you; it's just that to the average python-dev denizen, gpg-signing seems to provide some mild benefits and with no downside. The certificate trust issue isn't a downside, just a mild dilution of the upside. And I suspect python-dev generally doesn't put much weight on the extra effort required (release managers have all been using gpg for decades, it's pretty trivial), or see any reason why Microsoft's internal GPL-hate should have any effect on the PSF's behaviour. Though it's kinda inconvenient for you, obviously. (I guess you could call Larry or someone, read them a hash over the phone, and then have them create the actual gpg signatures.) -n From robertc at robertcollins.net Sun Apr 5 06:59:13 2015 From: robertc at robertcollins.net (Robert Collins) Date: Sun, 5 Apr 2015 16:59:13 +1200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> Message-ID: On 4 April 2015 at 11:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except > everyone gets that for free and meanwhile you're the only one who has > admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better > protection than hashes (and I can provide better than MD5 for all files if > it's useful), taking into consideration that (I assume) I'd have to obtain a > signing key for GPG and unless there's a CA involved like there is for > Authenticode, there's no existing trust in that key. GPG sigs will provide protection against replay attacks [unless we're proposing to revoke signatures on old point releases with known security vulnerabilities - something that Window software vendors tend not to do because of the dramatic and immediate effect on the deployed base...] This is not relevant for things we're hosting on SSL, but is if anyone is mirroring our installers around. They dont' seem to be so perhaps its a bit 'meh'. OTOH I also think there is value in consistency: signing all our artifacts makes checking back on them later easier, should we need to. One question, if you will - I don't think this was asked so far - is authenticode verifiable from Linux, without Windows? And does it work for users of WINE ? -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From larry at hastings.org Sun Apr 5 10:06:01 2015 From: larry at hastings.org (Larry Hastings) Date: Sun, 05 Apr 2015 01:06:01 -0700 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> Message-ID: <5520ECE9.1060500@hastings.org> On 04/04/2015 08:21 PM, Nathaniel Smith wrote: > (I guess you could call Larry or someone, read them a hash over the > phone, and then have them create the actual gpg signatures.) By sheer coincidence, I believe Steve and I both live in the Seattle area...! //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From Steve.Dower at microsoft.com Sun Apr 5 06:43:14 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sun, 5 Apr 2015 04:43:14 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> , Message-ID: <1428208996091.85387@microsoft.com> Nathaniel Smith wrote: > And I suspect python-dev generally doesn't put much weight on the > extra effort required (release managers have all been using gpg for > decades, it's pretty trivial) I'm aware of this, but still don't see it as a reason to unnecessarily duplicate process. > or see any reason why Microsoft's internal GPL-hate should have any > effect on the PSF's behaviour. Seems the "internal GPL-hate" has softened even more than I was aware. The history for GPG was spotty, but my request was automatically approved, so I guess the line has been moved far enough away that I've lost that excuse :) Now I just have to find the time to learn how to use it... Cheers, Steve From solipsis at pitrou.net Sun Apr 5 15:41:23 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sun, 5 Apr 2015 15:41:23 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> <5520ECE9.1060500@hastings.org> Message-ID: <20150405154123.10e263bb@fsol> On Sun, 05 Apr 2015 01:06:01 -0700 Larry Hastings wrote: > > On 04/04/2015 08:21 PM, Nathaniel Smith wrote: > > (I guess you could call Larry or someone, read them a hash over the > > phone, and then have them create the actual gpg signatures.) > > By sheer coincidence, I believe Steve and I both live in the Seattle > area...! Meaning the phone works well enough there? Regards Antoine. From Steve.Dower at microsoft.com Sun Apr 5 14:07:53 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Sun, 5 Apr 2015 12:07:53 +0000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> , Message-ID: "One question, if you will - I don't think this was asked so far - is authenticode verifiable from Linux, without Windows? And does it work for users of WINE ?" I've seen some info suggesting that it's verifiable, but you do need to extract the cert and calculate the hash against less than the signed file. Seemed like Mono had a tool for it, but OpenSSL can handle the cert. Currently the new installer doesn't run on Wine because of missing APIs (since I want to discuss alternate distribution ideas I haven't treated this as a priority), and I've heard they haven't implemented enough crypto yet to handle it, but that could be outdated. 
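For anyone who wants to try that from Linux, a rough sketch using the third-party osslsigncode tool plus plain OpenSSL (tool availability and exact flag spellings vary between versions, and the installer filename is just an example):

    # Check the Authenticode signature on a downloaded installer
    osslsigncode verify python-3.4.3.msi

    # Extract the PKCS#7 signature blob and list the certificate chain
    osslsigncode extract-signature -in python-3.4.3.msi -out sig.der
    openssl pkcs7 -inform DER -in sig.der -print_certs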
"GPG sigs will provide protection against replay attacks" How does this work? Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Robert Collins Sent: ?4/?4/?2015 21:59 To: Steve Dower Cc: M.-A. Lemburg; Larry Hastings; Python Dev; python-committers Subject: Re: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? On 4 April 2015 at 11:14, Steve Dower wrote: > The thing is, that's exactly the same goodness as Authenticode gives, except > everyone gets that for free and meanwhile you're the only one who has > admitted to using GPG on Windows :) > > Basically, what I want to hear is that GPG sigs provide significantly better > protection than hashes (and I can provide better than MD5 for all files if > it's useful), taking into consideration that (I assume) I'd have to obtain a > signing key for GPG and unless there's a CA involved like there is for > Authenticode, there's no existing trust in that key. GPG sigs will provide protection against replay attacks [unless we're proposing to revoke signatures on old point releases with known security vulnerabilities - something that Window software vendors tend not to do because of the dramatic and immediate effect on the deployed base...] This is not relevant for things we're hosting on SSL, but is if anyone is mirroring our installers around. They dont' seem to be so perhaps its a bit 'meh'. OTOH I also think there is value in consistency: signing all our artifacts makes checking back on them later easier, should we need to. One question, if you will - I don't think this was asked so far - is authenticode verifiable from Linux, without Windows? And does it work for users of WINE ? -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud -------------- next part -------------- An HTML attachment was scrubbed... URL: From larry at hastings.org Sun Apr 5 19:00:55 2015 From: larry at hastings.org (Larry Hastings) Date: Sun, 05 Apr 2015 10:00:55 -0700 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <20150405154123.10e263bb@fsol> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> <5520ECE9.1060500@hastings.org> <20150405154123.10e263bb@fsol> Message-ID: <55216A47.8090605@hastings.org> On 04/05/2015 06:41 AM, Antoine Pitrou wrote: > On Sun, 05 Apr 2015 01:06:01 -0700 > Larry Hastings wrote: >> On 04/04/2015 08:21 PM, Nathaniel Smith wrote: >>> (I guess you could call Larry or someone, read them a hash over the >>> phone, and then have them create the actual gpg signatures.) >> By sheer coincidence, I believe Steve and I both live in the Seattle >> area...! > Meaning the phone works well enough there? Meaning we could do it properly in person. Anyway we're gonna take care of it at PyCon. //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben+python at benfinney.id.au Mon Apr 6 01:15:17 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 06 Apr 2015 09:15:17 +1000 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? 
From ben+python at benfinney.id.au Mon Apr 6 01:15:17 2015
From: ben+python at benfinney.id.au (Ben Finney)
Date: Mon, 06 Apr 2015 09:15:17 +1000
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> <1428208996091.85387@microsoft.com>
Message-ID: <85oan2yxoa.fsf@benfinney.id.au>

Steve Dower writes:

> Nathaniel Smith wrote:
> > And I suspect python-dev generally doesn't put much weight on the
> > extra effort required (release managers have all been using gpg for
> > decades, it's pretty trivial)
>
> I'm aware of this, but still don't see it as a reason to unnecessarily
> duplicate process.

That's a good argument. But it's one against Authenticode, because
that's a single-platform process that duplicates an existing convention
to use an open, free standard: OpenPGP certificates.

So the demands of "why do we need to duplicate this work?" should be
made to Microsoft for choosing to re-invent that long-standing and
superior (because open, free-software, and cross-platform) wheel.

--
 \        "At my lemonade stand I used to give the first glass away free |
  `\       and charge five dollars for the second glass. The refill |
_o__)      contained the antidote." --Emo Philips |
Ben Finney

From pje at telecommunity.com Mon Apr 6 01:17:06 2015
From: pje at telecommunity.com (PJ Eby)
Date: Sun, 5 Apr 2015 19:17:06 -0400
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Sat, Apr 4, 2015 at 9:33 PM, Nick Coghlan wrote:
> So actually reading https://gist.github.com/pjeby/75ca26f8d2a7a0c68e30
> properly, you're starting to convince me that a "noconflict" metaclass
> resolver would be a valuable and viable addition to the Python 3 type
> system machinery.
>
> The future possible language level enhancement would then be to make
> that automatic resolution of metaclass conflicts part of the *default*
> metaclass determination process. I realise you've been trying to
> explain that to me for a few days now, I'm just writing it out
> explicitly to make it clear I finally get it :)

I'm glad you got around to reading it. Sometimes it's really frustrating
trying to get things like that across.

What's funny is that once I actually 1) wrote that version, and 2) ended
up doing a version of six's with_metaclass() function so I could write
2/3 mixed code in DecoratorTools, I realized that there isn't actually
any reason why I can't write a Python 2 version of noconflict. Indeed,
with a slight change to eliminate ClassType from the metaclass candidate
list, the Python 3 version would also work as the Python 2 version: just
use it as the explicit __metaclass__, or use with_metaclass, i.e.:

    class something(base1, base2, ...):
        __metaclass__ = noconflict
        # ...

or:

    class something(with_metaclass(noconflict, base1, base2, ...)):
        # ...

And the latter works syntactically from Python 2.3 on up.

> My apologies for that - while I don't actually recall what I was
> thinking when I said it, I suspect I was all fired up that PEP 422 was
> definitely the right answer, and hence thought I'd have an official
> solution in place for you in fairly short order. I should have let you
> know explicitly when I started having doubts about it, so you could
> reassess your porting options.

Well, at least it's done now. Clearing up the issue allowed me to spend
some time on porting some of the relevant libraries this weekend, where
I promptly ran into challenges with several of the *other* features
removed from Python 3 (like tuple arguments), but fortunately those are
issues more of syntactic convenience than irreplaceable functionality.
;-)
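For readers following along, a rough Python 3 sketch of the kind of
"noconflict" resolver being discussed is below. This is not PJ's actual
DecoratorTools code, just an illustration of the technique: pick the most
derived metaclass among the bases, and synthesize a combined one when no
single winner exists.

    def noconflict(name, bases, namespace):
        """Build a class, synthesizing a combined metaclass if needed."""
        metaclasses = []
        for base in bases:
            meta = type(base)
            if any(issubclass(m, meta) for m in metaclasses):
                continue  # already covered by a more derived candidate
            metaclasses = [m for m in metaclasses if not issubclass(meta, m)]
            metaclasses.append(meta)
        if not metaclasses:
            return type(name, bases, namespace)
        if len(metaclasses) == 1:
            winner = metaclasses[0]
        else:
            # No single most-derived metaclass exists: create one on the fly.
            winner = type("Meta_" + "_".join(m.__name__ for m in metaclasses),
                          tuple(metaclasses), {})
        return winner(name, bases, namespace)

    class Meta1(type): pass
    class Meta2(type): pass
    class A(metaclass=Meta1): pass
    class B(metaclass=Meta2): pass

    # class C(A, B): pass          # raises "metaclass conflict" TypeError
    class C(A, B, metaclass=noconflict):  # resolved automatically
        pass

Because noconflict is a callable rather than a class, Python simply calls
it with (name, bases, namespace), so it can return a class created by
whatever metaclass it computes.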
From ericsnowcurrently at gmail.com Mon Apr 6 17:22:23 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 6 Apr 2015 09:22:23 -0600
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To: <55208490.4080106@canterbury.ac.nz>
References: <55208490.4080106@canterbury.ac.nz>
Message-ID:

On Sat, Apr 4, 2015 at 6:40 PM, Greg Ewing wrote:
> Eric Snow wrote:
>> I've felt for a long time that it would be helpful in some situations
>> to have a reverse descriptor protocol.
>
> Can you elaborate on what you mean by that?

Sure. It's more python-ideas territory (I posted about it a few years
back). The idea is to allow an object the opportunity to handle being
bound to a name. So if a type defines __bound__ (or similar) then it
will be called for the instance being bound to a name:
type(obj).__bound__(obj, name). There would also be an __unbound__, but
that is less of an issue here.

I'm still not convinced such a reverse descriptor protocol is practical
as a general approach (though no less than the current descriptor
protocol). However, I do see a distinct correspondence with the
"__post_process__" method being considered here. So I wanted to point
out the possibility of a more general approach for the sake of its
impact on the name and semantics of a descriptor post-process method.
While I expect __post_process__ would be called at a different place
than __bound__, the responsibility of both would still be identical.

-eric

From ericsnowcurrently at gmail.com Mon Apr 6 18:12:07 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 6 Apr 2015 10:12:07 -0600
Subject: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
In-Reply-To:
References:
Message-ID:

On Fri, Apr 3, 2015 at 6:44 AM, Martin Teichmann wrote:
> Nick, I couldn't agree more with you, yet I think PJ actually brought
> up a very interesting point. Post-processing is a very common thing
> these days, and has been re-written so many times that I think it is
> about time that something like it should be in the standard library.

Here's another approach that would help. Support a mechanism for
inheriting class decorators. Classes would have an attribute like
__subclass_decorators__ that would hold a tuple of all the inherited
decorators. They would be applied, in order, in __build_class__ before
any explicit decorators are.

One way to accomplish this is with a meta-decorator, e.g.
"classutil.inherited_decorator". You would decorate a class decorator
with it:

    def inherited_decorator(deco):
        def new_deco(cls):
            cls = deco(cls)
            try:
                inherited = cls.__subclass_decorators__
            except AttributeError:
                cls.__subclass_decorators__ = (deco,)
            else:
                cls.__subclass_decorators__ = inherited + (deco,)
            return cls
        return new_deco

    @inherited_decorator
    def register(cls):
        registry.add(cls)
        return cls

    @register
    class X:
        ...

The downside to the meta-decorator is that it isn't apparent when the
class decorator is used that it will be inherited. It could also be used
directly to make it more apparent:

    def register(cls):
        registry.add(cls)
        return cls

    @inherited(register)
    class X:
        ...

However, that doesn't read well.
Syntax would be better, but is a harder sell and a little grittier:

    def register(cls):
        registry.add(cls)
        return cls

    @@register
    class X:
        ...

-eric

From guido at python.org Tue Apr 7 03:08:25 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 6 Apr 2015 18:08:25 -0700
Subject: [Python-Dev] PEP 8 update
Message-ID:

I've taken the liberty of adding the following old but good rule to
PEP 8 (I was surprised to find it wasn't already there since I've lived
by this for ages):

- Be consistent in return statements. Either all return statements in a
  function should return an expression, or none of them should. If any
  return statement returns an expression, any return statements where no
  value is returned should explicitly state this as return None, and an
  explicit return statement should be present at the end of the function
  (if reachable).

  Yes:

      def foo(x):
          if x >= 0:
              return math.sqrt(x)
          else:
              return None

      def bar(x):
          if x < 0:
              return None
          return math.sqrt(x)

  No:

      def foo(x):
          if x >= 0:
              return math.sqrt(x)

      def bar(x):
          if x < 0:
              return
          return math.sqrt(x)

--
--Guido van Rossum (python.org/~guido)

From barry at python.org Tue Apr 7 03:31:33 2015
From: barry at python.org (Barry Warsaw)
Date: Mon, 6 Apr 2015 21:31:33 -0400
Subject: [Python-Dev] PEP 8 update
In-Reply-To:
References:
Message-ID: <20150406213133.7ac73e1f@anarchist.wooz.org>

On Apr 06, 2015, at 06:08 PM, Guido van Rossum wrote:
> I've taken the liberty of adding the following old but good rule to PEP 8
> (I was surprised to find it wasn't already there since I've lived by this
> for ages):
>
> Be consistent in return statements. Either all return statements in a
> function should return an expression, or none of them should. If any return
> statement returns an expression, any return statements where no value is
> returned should explicitly state this as return None, and an explicit
> return statement should be present at the end of the function (if
> reachable).

+1

Odd synchronicity: Today I discovered an old interface that was
documented as returning a "thing or None" but the latter was relying on
implicit None return in some cases. Fixed of course in exactly the way
PEP 8 now recommends. :)

Cheers,
-Barry

From rob.cliffe at btinternet.com Tue Apr 7 04:11:30 2015
From: rob.cliffe at btinternet.com (Rob Cliffe)
Date: Tue, 07 Apr 2015 03:11:30 +0100
Subject: [Python-Dev] PEP 8 update
In-Reply-To:
References:
Message-ID: <55233CD2.3050508@btinternet.com>

On 07/04/2015 02:08, Guido van Rossum wrote:
> I've taken the liberty of adding the following old but good rule to
> PEP 8 (I was surprised to find it wasn't already there since I've
> lived by this for ages):
>
> Be consistent in return statements. Either all return statements
> in a function should return an expression, or none of them should.
> If any return statement returns an expression, any return
> statements where no value is returned should explicitly state this
> as return None, and an explicit return statement should be present
> at the end of the function (if reachable).
>
> Yes:
>
>     def foo(x):
>         if x >= 0:
>             return math.sqrt(x)
>         else:
>             return None
>
That would seem to be good style and common sense.
As a matter of interest, how far away from mainstream am I in
preferring, *in this particular example* (obviously it might be
different for more complicated computation),

    def foo(x):
        return math.sqrt(x) if x >= 0 else None

I probably have a personal bias towards compact code, but it does seem
to me that the latter says exactly what it means, no more and no less,
and therefore is somewhat more readable. (Easier to keep the reader's
attention for 32 non-whitespace characters than 40.)

Sorry if this is irrelevant to Guido's point.

Rob Cliffe

From steve at pearwood.info Tue Apr 7 05:02:55 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Tue, 7 Apr 2015 13:02:55 +1000
Subject: [Python-Dev] PEP 8 update
In-Reply-To: <55233CD2.3050508@btinternet.com>
References: <55233CD2.3050508@btinternet.com>
Message-ID: <20150407030238.GZ25453@ando.pearwood.info>

On Tue, Apr 07, 2015 at 03:11:30AM +0100, Rob Cliffe wrote:
> As a matter of interest, how far away from mainstream am I in
> preferring, *in this particular example* (obviously it might be
> different for more complicated computation),
>
>     def foo(x):
>         return math.sqrt(x) if x >= 0 else None
>
> I probably have a personal bias towards compact code, but it does seem
> to me that the latter says exactly what it means, no more and no less,
> and therefore is somewhat more readable. (Easier to keep the reader's
> attention for 32 non-whitespace characters than 40.)

In my opinion, code like that is a good example of why the ternary if
operator was resisted for so long :-) Sometimes you can have code which
is just too compact. My own preference would be:

    def foo(x):
        if x >= 0:
            return math.sqrt(x)
        return None

but I'm not terribly fussed about whether the "else" is added or not,
whether the return is on the same line as the if, and other minor
variations.

--
Steve

From mistersheik at gmail.com Tue Apr 7 12:34:35 2015
From: mistersheik at gmail.com (Neil Girdhar)
Date: Tue, 7 Apr 2015 06:34:35 -0400
Subject: [Python-Dev] Build is broken for me after updating
Message-ID:

Ever since I updated, I am getting:

In file included from Objects/dictobject.c:236:0:
Objects/clinic/dictobject.c.h:70:26: fatal error: stringlib/eq.h: No such file or directory
 #include "stringlib/eq.h"

But, Objects/stringlib/eq.h exists. Replacing the include with
"Objects/stringlib/eq.h" seems to make this error go away, but others
follow. Would anyone happen to know why this is happening?

Neil

From mistersheik at gmail.com Tue Apr 7 12:42:50 2015
From: mistersheik at gmail.com (Neil Girdhar)
Date: Tue, 7 Apr 2015 06:42:50 -0400
Subject: [Python-Dev] PEP 448 review
In-Reply-To:
References:
Message-ID:

Hello,

Following up with PEP 448, I've gone over the entire code review except
a few points as mentioned at the issue:
http://bugs.python.org/review/2292/. I'm hoping that this will get done
at the PyCon sprints. Is there any way I can help?

I couldn't make it to PyCon, but I do live in Montreal. I would be more
than happy to make time to meet up with anyone who wants to help with
the review, e.g. Laika's usually a good place to work and have a coffee
( http://www.yelp.ca/biz/la%C3%AFka-montr%C3%A9al-2 ). Code reviews tend
to go much faster face-to-face.

Also, I'm definitely interested in meeting any Python developers for
coffee or drinks. I know the city pretty well.
:)

Best,

Neil

On Tue, Mar 17, 2015 at 9:49 AM, Brett Cannon wrote:
> On Mon, Mar 16, 2015 at 7:11 PM Neil Girdhar wrote:
>> Hi everyone,
>>
>> I was wondering what is left with the PEP 448 (
>> http://bugs.python.org/issue2292) code review? Big thanks to Benjamin,
>> Ethan, and Serhiy for reviewing some (all?) of the code. What is the next
>> step of this process?
>
> My suspicion is that if no one steps up between now and PyCon to do a
> complete code review of the final patch, we as a group will try to get it
> done at the PyCon sprints. I have made the issue a release blocker to help
> make sure it gets reviewed and doesn't accidentally get left behind.

From solipsis at pitrou.net Tue Apr 7 14:26:51 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Tue, 7 Apr 2015 14:26:51 +0200
Subject: [Python-Dev] PEP 8 update
References: <55233CD2.3050508@btinternet.com>
Message-ID: <20150407142651.17469355@fsol>

On Tue, 07 Apr 2015 03:11:30 +0100
Rob Cliffe wrote:
>
> On 07/04/2015 02:08, Guido van Rossum wrote:
> > I've taken the liberty of adding the following old but good rule to
> > PEP 8 (I was surprised to find it wasn't already there since I've
> > lived by this for ages):
> >
> > Be consistent in return statements. Either all return statements
> > in a function should return an expression, or none of them should.
> > If any return statement returns an expression, any return
> > statements where no value is returned should explicitly state this
> > as return None, and an explicit return statement should be present
> > at the end of the function (if reachable).
> >
> > Yes:
> >
> >     def foo(x):
> >         if x >= 0:
> >             return math.sqrt(x)
> >         else:
> >             return None
> >
> That would seem to be good style and common sense.
>
> As a matter of interest, how far away from mainstream am I in
> preferring, *in this particular example* (obviously it might be
> different for more complicated computation),
>
>     def foo(x):
>         return math.sqrt(x) if x >= 0 else None

I agree with you on this.

Regards

Antoine.

From benhoyt at gmail.com Tue Apr 7 14:47:25 2015
From: benhoyt at gmail.com (Ben Hoyt)
Date: Tue, 7 Apr 2015 08:47:25 -0400
Subject: [Python-Dev] PEP 8 update
In-Reply-To: <20150407030238.GZ25453@ando.pearwood.info>
References: <55233CD2.3050508@btinternet.com> <20150407030238.GZ25453@ando.pearwood.info>
Message-ID:

> My own preference would be:
>
>     def foo(x):
>         if x >= 0:
>             return math.sqrt(x)
>         return None

Kind of getting into the weeds here, but I would always invert this to
"return errors early, and keep the normal flow at the main indentation
level". Depends a little on what foo() means, but it seems to me the
"return None" case is the exceptional/error case, so this would be:

    def foo(x):
        if x < 0:
            return None
        return math.sqrt(x)

The handling of errors is done first, and the normal case stays at the
main indentation level. Is this discussed in style guides at all? I
don't see anything directly in PEP 8, but I might be missing something.

Oh wait, I just noticed this is exactly how Guido has it in his PEP
addition with the definition of bar().
:-|

-Ben

From ncoghlan at gmail.com Tue Apr 7 18:20:10 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 8 Apr 2015 02:20:10 +1000
Subject: [Python-Dev] PEP 448 review
In-Reply-To:
References:
Message-ID:

On 7 Apr 2015 03:43, "Neil Girdhar" wrote:
>
> Hello,
>
> Following up with PEP 448, I've gone over the entire code review except a
> few points as mentioned at the issue: http://bugs.python.org/review/2292/.
> I'm hoping that this will get done at the PyCon sprints. Is there any way
> I can help?
>
> I couldn't make it to PyCon, but I do live in Montreal. I would be more
> than happy to make time to meet up with anyone who wants to help with the
> review, e.g. Laika's usually a good place to work and have a coffee (
> http://www.yelp.ca/biz/la%C3%AFka-montr%C3%A9al-2 ). Code reviews tend to
> go much faster face-to-face.

My understanding is that it's possible to register for the post-PyCon
sprints, even if you're not attending PyCon itself.

Cheers,
Nick.

From stefan_ml at behnel.de Tue Apr 7 20:34:24 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Tue, 07 Apr 2015 20:34:24 +0200
Subject: [Python-Dev] PEP 8 update
In-Reply-To: <20150407142651.17469355@fsol>
References: <55233CD2.3050508@btinternet.com> <20150407142651.17469355@fsol>
Message-ID:

Antoine Pitrou wrote on 07.04.2015 at 14:26:
> On Tue, 07 Apr 2015 03:11:30 +0100
> Rob Cliffe wrote:
>>
>> On 07/04/2015 02:08, Guido van Rossum wrote:
>>> I've taken the liberty of adding the following old but good rule to
>>> PEP 8 (I was surprised to find it wasn't already there since I've
>>> lived by this for ages):
>>>
>>> Be consistent in return statements. Either all return statements
>>> in a function should return an expression, or none of them should.
>>> If any return statement returns an expression, any return
>>> statements where no value is returned should explicitly state this
>>> as return None, and an explicit return statement should be present
>>> at the end of the function (if reachable).
>>>
>>> Yes:
>>>
>>>     def foo(x):
>>>         if x >= 0:
>>>             return math.sqrt(x)
>>>         else:
>>>             return None
>>>
>> That would seem to be good style and common sense.
>>
>> As a matter of interest, how far away from mainstream am I in
>> preferring, *in this particular example* (obviously it might be
>> different for more complicated computation),
>>
>>     def foo(x):
>>         return math.sqrt(x) if x >= 0 else None
>
> I agree with you on this.

+1, the ternary operator reads best when there is a "normal" case that
can go first and a "special" or "unusual" case that can go last and does
not apply under the (given) normal conditions.

Stefan

From guido at python.org Tue Apr 7 21:31:49 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 7 Apr 2015 12:31:49 -0700
Subject: [Python-Dev] PEP 8 update
In-Reply-To:
References: <55233CD2.3050508@btinternet.com> <20150407142651.17469355@fsol>
Message-ID:

Talk about hijacking an unrelated thread... :-( Though there's maybe a
style rule for the ternary operator to be devised here? :-)

On Apr 7, 2015 11:36 AM, "Stefan Behnel" wrote:
> Antoine Pitrou wrote on 07.04.2015 at 14:26:
> > On Tue, 07 Apr 2015 03:11:30 +0100
> > Rob Cliffe wrote:
> >>
> >> On 07/04/2015 02:08, Guido van Rossum wrote:
> >>> I've taken the liberty of adding the following old but good rule to
> >>> PEP 8 (I was surprised to find it wasn't already there since I've
> >>> lived by this for ages):
> >>>
> >>> Be consistent in return statements.
> >>> Either all return statements
> >>> in a function should return an expression, or none of them should.
> >>> If any return statement returns an expression, any return
> >>> statements where no value is returned should explicitly state this
> >>> as return None, and an explicit return statement should be present
> >>> at the end of the function (if reachable).
> >>>
> >>> Yes:
> >>>
> >>>     def foo(x):
> >>>         if x >= 0:
> >>>             return math.sqrt(x)
> >>>         else:
> >>>             return None
> >>>
> >> That would seem to be good style and common sense.
> >>
> >> As a matter of interest, how far away from mainstream am I in
> >> preferring, *in this particular example* (obviously it might be
> >> different for more complicated computation),
> >>
> >>     def foo(x):
> >>         return math.sqrt(x) if x >= 0 else None
> >
> > I agree with you on this.
>
> +1, the ternary operator reads best when there is a "normal" case that
> can go first and a "special" or "unusual" case that can go last and does
> not apply under the (given) normal conditions.
>
> Stefan
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org

From cristifati0 at gmail.com Tue Apr 7 21:58:49 2015
From: cristifati0 at gmail.com (Cristi Fati)
Date: Tue, 7 Apr 2015 22:58:49 +0300
Subject: [Python-Dev] ctypes module
Message-ID:

Hi all,

Not sure whether you got this question, or this is the right
distribution list:

Intel has deprecated the Itanium architecture, and Windows also
deprecated its versions (currently 2003 and 2008) that run on IA64.

However Python (2.7.3) is compilable on Windows IA64, but the ctypes
module (1.1.0) which is now part of Python is not (the source files have
been removed). What was the reason for its disablement?

I am asking because an older version of ctypes (1.0.2) which came as a
separate extension module (I used to compile it with Python 2.4.5) was
available for WinIA64; I found (and fixed) a nasty buffer overrun in it.

Regards,
Cristi Fati.

From steve at pearwood.info Wed Apr 8 04:16:14 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Wed, 8 Apr 2015 12:16:14 +1000
Subject: [Python-Dev] PEP 8 update
In-Reply-To:
References: <55233CD2.3050508@btinternet.com> <20150407030238.GZ25453@ando.pearwood.info>
Message-ID: <20150408021614.GC5760@ando.pearwood.info>

On Tue, Apr 07, 2015 at 08:47:25AM -0400, Ben Hoyt wrote:
> > My own preference would be:
> >
> >     def foo(x):
> >         if x >= 0:
> >             return math.sqrt(x)
> >         return None
>
> Kind of getting into the weeds here, but I would always invert this to
> "return errors early, and keep the normal flow at the main indentation
> level". Depends a little on what foo() means, but it seems to me the
> "return None" case is the exceptional/error case, so this would be:
>
>     def foo(x):
>         if x < 0:
>             return None
>         return math.sqrt(x)

While *in general* I agree with "handle the error case early", there are
cases where "handle the normal case early" is better, and I think that
this is one of them.

Also, inverting the comparison isn't appropriate, due to float NANs.
With the first version, foo(NAN) returns None (which I assumed was
deliberate by the OP). In your version, it returns NAN.

But as you say, we're now deep into the weeds...

--
Steve
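To make the NAN point concrete, a short interactive sketch (not from the
original thread): every comparison involving a NaN is False, so the
inverted guard falls through to the sqrt call, which simply propagates
the NaN.

    >>> import math
    >>> nan = float("nan")
    >>> nan >= 0, nan < 0    # NaN comparisons are always False
    (False, False)
    >>> math.sqrt(nan)       # so the inverted version returns NAN, not None
    nan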
From fijall at gmail.com Wed Apr 8 12:36:00 2015
From: fijall at gmail.com (Maciej Fijalkowski)
Date: Wed, 8 Apr 2015 12:36:00 +0200
Subject: [Python-Dev] ctypes module
In-Reply-To:
References:
Message-ID:

I presume the reason was that no one wants to maintain code for the
case where there are no buildbots available and there is no development
time available. You are free to put back in the files and see if they
work (they might not), but such things are usually removed if they're a
maintenance burden. I would be happy to assist you with finding someone
willing to do commercial maintenance of ctypes for itanium, but asking
python devs to do it for free is a bit too much.

Cheers,
fijal

On Tue, Apr 7, 2015 at 9:58 PM, Cristi Fati wrote:
> Hi all,
>
> Not sure whether you got this question, or this is the right distribution
> list:
>
> Intel has deprecated the Itanium architecture, and Windows also deprecated
> its versions (currently 2003 and 2008) that run on IA64.
>
> However Python (2.7.3) is compilable on Windows IA64, but the ctypes
> module (1.1.0) which is now part of Python is not (the source files have
> been removed). What was the reason for its disablement?
>
> I am asking because an older version of ctypes (1.0.2) which came as a
> separate extension module (I used to compile it with Python 2.4.5) was
> available for WinIA64; I found (and fixed) a nasty buffer overrun in it.
>
> Regards,
> Cristi Fati.
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com

From ncoghlan at gmail.com Wed Apr 8 13:32:59 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 8 Apr 2015 21:32:59 +1000
Subject: [Python-Dev] ctypes module
In-Reply-To:
References:
Message-ID:

On 8 April 2015 at 20:36, Maciej Fijalkowski wrote:
> I presume the reason was that no one wants to maintain code for the
> case where there are no buildbots available and there is no
> development time available. You are free to put back in the files and
> see if they work (they might not), but such things are usually removed
> if they're a maintenance burden. I would be happy to assist you with
> finding someone willing to do commercial maintenance of ctypes for
> itanium, but asking python devs to do it for free is a bit too much.

As a point of reference, even Red Hat dropped Itanium support for
RHEL6+ - you have to go all the way back to RHEL5 to find a version we
still support running on Itanium.

For most of CPython, keeping it running on arbitrary architectures
often isn't too difficult, as libc abstracts away a lot of the
hardware details. libffi (and hence ctypes) are notable exceptions to
that :)

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From fijall at gmail.com Wed Apr 8 14:17:24 2015
From: fijall at gmail.com (Maciej Fijalkowski)
Date: Wed, 8 Apr 2015 14:17:24 +0200
Subject: [Python-Dev] ctypes module
In-Reply-To:
References:
Message-ID:

for the record libffi supports itanium officially (but as usual I'm
very skeptical how well it works on less used platforms)
https://sourceware.org/libffi/

On Wed, Apr 8, 2015 at 1:32 PM, Nick Coghlan wrote:
> On 8 April 2015 at 20:36, Maciej Fijalkowski wrote:
>> I presume the reason was that no one wants to maintain code for the
>> case where there are no buildbots available and there is no
>> development time available.
>> You are free to put back in the files and
>> see if they work (they might not), but such things are usually removed
>> if they're a maintenance burden. I would be happy to assist you with
>> finding someone willing to do commercial maintenance of ctypes for
>> itanium, but asking python devs to do it for free is a bit too much.
>
> As a point of reference, even Red Hat dropped Itanium support for
> RHEL6+ - you have to go all the way back to RHEL5 to find a version we
> still support running on Itanium.
>
> For most of CPython, keeping it running on arbitrary architectures
> often isn't too difficult, as libc abstracts away a lot of the
> hardware details. libffi (and hence ctypes) are notable exceptions to
> that :)
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From cristifati0 at gmail.com Wed Apr 8 14:49:14 2015
From: cristifati0 at gmail.com (Cristi Fati)
Date: Wed, 8 Apr 2015 15:49:14 +0300
Subject: [Python-Dev] ctypes module
In-Reply-To:
References:
Message-ID:

Hi all, thank you for your responses. Apparently I was wrong in my
previous email: ctypes (1.0.1 or 1.0.2) didn't have support for WinIA64
(libffi), there was an in-house implementation. However we are using the
IA64 module extensively (including to successfully call LsaLogonUser).

A much simpler example:

    Python 2.7.3 (default, Oct 22 2014, 12:21:16) [MSC v.1600 64 bit (Itanium)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import ctypes
    >>> ctypes.cdll.kernel32.GetTickCount()
    -79897956
    >>>

On Wed, Apr 8, 2015 at 3:17 PM, Maciej Fijalkowski wrote:
> for the record libffi supports itanium officially (but as usual I'm
> very skeptical how well it works on less used platforms)
> https://sourceware.org/libffi/
>
> On Wed, Apr 8, 2015 at 1:32 PM, Nick Coghlan wrote:
> > On 8 April 2015 at 20:36, Maciej Fijalkowski wrote:
> >> I presume the reason was that no one wants to maintain code for the
> >> case where there are no buildbots available and there is no
> >> development time available. You are free to put back in the files and
> >> see if they work (they might not), but such things are usually removed
> >> if they're a maintenance burden. I would be happy to assist you with
> >> finding someone willing to do commercial maintenance of ctypes for
> >> itanium, but asking python devs to do it for free is a bit too much.
> >
> > As a point of reference, even Red Hat dropped Itanium support for
> > RHEL6+ - you have to go all the way back to RHEL5 to find a version we
> > still support running on Itanium.
> >
> > For most of CPython, keeping it running on arbitrary architectures
> > often isn't too difficult, as libc abstracts away a lot of the
> > hardware details. libffi (and hence ctypes) are notable exceptions to
> > that :)
> >
> > Cheers,
> > Nick.
> >
> > --
> > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
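A side note on the negative tick count in the session above: by default
ctypes assumes every foreign function returns a signed C int, so an
unsigned 32-bit value past 2**31 shows up negative. A sketch of the
usual fix (Windows-only, illustrative):

    import ctypes

    # GetTickCount() returns an unsigned 32-bit millisecond counter;
    # declaring the correct return type avoids the signed-int wraparound.
    GetTickCount = ctypes.windll.kernel32.GetTickCount
    GetTickCount.restype = ctypes.c_uint32
    print(GetTickCount())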
From regebro at gmail.com Wed Apr 8 17:18:15 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 8 Apr 2015 17:18:15 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
Message-ID:

Hi!

I wrote PEP-431 two years ago, and never got around to implementing it.
This year I got some renewed motivation after Berker Peksağ made an
effort of implementing it. I'm planning to work more on this during the
PyCon sprints, and also have a BoF session or similar during the
conference.

Anyone interested in a session on this, mail me and we'll set up a time
and place!

//Lennart

------------------

If anyone is interested in the details of the problem, this is it.

The big problem is the ambiguous times, like 02:30, a time when you move
the clock back one hour, as there are two different 02:30's that day. I
wrote down my experiences with looking into and trying to implement
several different solutions. And the problem there is actually how to
tell the datetime if it is before or after the changeover.

== How others have solved it ==

=== dateutil.tz: Ignore the problem ===

dateutil.tz simply ignores the problems with ambiguous datetimes, keeping
them ambiguous.

=== pytz: One timezone instance per changeover ===

Pytz implements ambiguous datetimes by having one class per timezone.
Each change in the UTC offset, either because of a DST changeover or
because the timezone changes, is represented as one instance of the
class.

All instances are held in a list which is a class attribute of the
timezone class. You flag which DST changeover you are in by using
different instances as the datetime's tzinfo. Since the timezone this way
knows if it is DST or not, the datetime as a whole knows if it's DST or
not.

Benefits:
- Only known possible implementation without modifying stdlib, which of
  course was a requirement, as pytz is a third-party library.
- DST offset can be quickly returned, as it does not need to be
  calculated.
Drawbacks:
- A complex and highly magical implementation of timezones that is hard
  to understand.
- Required new normalize()/localize() functions on the timezone, and
  hence the API is not stdlib's API.
- Hundreds of instances per timezone means slightly more memory usage.

== Options for PEP 431 ==

=== Stdlib option 0: Ignore it ===

I don't think this is an option, really. Listed for completeness.

=== Stdlib option 1: One timezone instance per changeover ===

Option 1 is to do it like pytz, have one timezone instance per
changeover. However, this is likely not possible to do without
fundamentally changing the datetime API, or making it very hard to use.

For example, when creating a datetime instance and passing in a tzinfo
today this tzinfo is just attached to the datetime. But when having
multiple instances of tzinfos this means you have to select the correct
one to pass in. pytz solves this with the .localize() method, which lets
the timezone class choose which instance to pass in.

We can't pass in the timezone class into datetime(), because that would
require datetime.__new__ to create new datetimes as a part of the
timezone arithmetic. These in turn would create new datetimes in __new__
as a part of the timezone arithmetic, which in turn, yeah, you get it...

I haven't been able to solve that issue without either changing the
API/usage, or getting infinite recursions.

Benefits:
- Proven solution through pytz.
- Fast dst() call.
Drawbacks:
- Trying to use this technique with the current API tends to create
  infinite recursions. It seems to require big API changes.
- Slow datetime() instance creation.

=== Stdlib option 2: A datetime _is_dst flag ===

By having a flag on the datetime instance that says "this is in DST or
not" the timezone implementation can be kept simpler.

You also have to calculate whether the datetime is in DST or not, either
when creating it, which demands datetime object creations and causes
infinite recursions, or when needed, which means you can get "ambiguous
datetime" errors at unexpected times later.
Also, when trying to implement this, I get bogged down in the
complexities of how tzinfo and datetime call each other back and forth,
and when to pass in the current is_dst and when to pass in the desired
is_dst, etc. The API and current implementation are not designed with
this case in mind, and it gets very tricky.

Benefits:
- Simpler tzinfo() implementations.
Drawbacks:
- It seems likely that we must change some APIs.
- This in turn may affect the pytz implementation. Or not, hard to say.
- The DST offset must use slow timezone calculations. However, since
  datetimes are immutable it can be a cached, lazy, one-time operation.

=== Stdlib option 3: UTC internal representation ===

Having UTC as the internal representation makes the whole issue go away.
Datetimes are no longer ambiguous, except when creating, so checks need
to be done during creation, but that should be possible without datetime
creation in this case, resolving the infinite recursion problem.

Benefits:
- Problem solved.
- Minimal API changes.
Drawbacks:
- Backwards compatibility with pickles.
- Possible other backwards incompatibility problems.
- Both DST offset and date time display representation must use slow
  timezone calculations. However, since datetimes are immutable it can
  be a cached, lazy, one-time operation.

I'm currently trying to implement solution #2 above. Feedback is
welcome.

From alexander.belopolsky at gmail.com Wed Apr 8 19:17:55 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 8 Apr 2015 13:17:55 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On Wed, Apr 8, 2015 at 11:18 AM, Lennart Regebro wrote:
> === Stdlib option 2: A datetime _is_dst flag ===
>
> By having a flag on the datetime instance that says "this is in DST or
> not" the timezone implementation can be kept simpler.

I floated this idea [1] back in the days when we discussed the
datetime.timestamp() method. The attraction was that such an API would
be familiar to the users of POSIX mktime and struct tm, but history has
shown that these POSIX APIs were insufficient in many situations, and
struct tm was extended by many libraries to include non-standard
tm_gmtoff and tm_zone fields.

With datetime, we also have a problem that POSIX APIs don't have to deal
with: local time arithmetic. What is t + timedelta(1) when t falls on
the day before a DST change? How would you set the isdst flag in the
result?

[1] http://bugs.python.org/issue2736#msg124237
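Alexander's arithmetic question can be made concrete with pytz
(third-party), whose normalize() step exists precisely because plain
datetime arithmetic ignores the DST change; the dates below refer to the
US spring-forward on 2015-03-08 and are illustrative only:

    from datetime import datetime, timedelta
    import pytz

    eastern = pytz.timezone("US/Eastern")
    t = eastern.localize(datetime(2015, 3, 7, 12, 0))  # noon EST, day before
    u = t + timedelta(days=1)    # naive arithmetic keeps the stale EST tzinfo
    v = eastern.normalize(u)     # 2015-03-08 13:00 EDT: 24 elapsed hours later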
From carl at oddbird.net Wed Apr 8 19:37:24 2015
From: carl at oddbird.net (Carl Meyer)
Date: Wed, 08 Apr 2015 11:37:24 -0600
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID: <55256754.10204@oddbird.net>

Hi Lennart,

On 04/08/2015 09:18 AM, Lennart Regebro wrote:
> I wrote PEP-431 two years ago, and never got around to implementing it.
> This year I got some renewed motivation after Berker Peksağ made an
> effort of implementing it.
> I'm planning to work more on this during the PyCon sprints, and also
> have a BoF session or similar during the conference.
>
> Anyone interested in a session on this, mail me and we'll set up a
> time and place!

I'm interested in the topic, and would probably attend a BoF at PyCon.
Comments below:

> If anyone is interested in the details of the problem, this is it.
>
> The big problem is the ambiguous times, like 02:30, a time when you
> move the clock back one hour, as there are two different 02:30's that
> day. I wrote down my experiences with looking into and trying to
> implement several different solutions. And the problem there is
> actually how to tell the datetime if it is before or after the
> changeover.
>
> == How others have solved it ==
>
> === dateutil.tz: Ignore the problem ===
>
> dateutil.tz simply ignores the problems with ambiguous datetimes,
> keeping them ambiguous.
>
> === pytz: One timezone instance per changeover ===
>
> Pytz implements ambiguous datetimes by having one class per timezone.
> Each change in the UTC offset, either because of a DST changeover or
> because the timezone changes, is represented as one instance of the
> class.
>
> All instances are held in a list which is a class attribute of the
> timezone class. You flag which DST changeover you are in by using
> different instances as the datetime's tzinfo. Since the timezone this
> way knows if it is DST or not, the datetime as a whole knows if it's
> DST or not.
>
> Benefits:
> - Only known possible implementation without modifying stdlib, which
>   of course was a requirement, as pytz is a third-party library.
> - DST offset can be quickly returned, as it does not need to be
>   calculated.
> Drawbacks:
> - A complex and highly magical implementation of timezones that is
>   hard to understand.
> - Required new normalize()/localize() functions on the timezone, and
>   hence the API is not stdlib's API.
> - Hundreds of instances per timezone means slightly more memory usage.
>
> == Options for PEP 431 ==
>
> === Stdlib option 0: Ignore it ===
>
> I don't think this is an option, really. Listed for completeness.
>
> === Stdlib option 1: One timezone instance per changeover ===
>
> Option 1 is to do it like pytz, have one timezone instance per
> changeover. However, this is likely not possible to do without
> fundamentally changing the datetime API, or making it very hard to use.
>
> For example, when creating a datetime instance and passing in a tzinfo
> today this tzinfo is just attached to the datetime. But when having
> multiple instances of tzinfos this means you have to select the correct
> one to pass in. pytz solves this with the .localize() method, which
> lets the timezone class choose which instance to pass in.
>
> We can't pass in the timezone class into datetime(), because that would
> require datetime.__new__ to create new datetimes as a part of the
> timezone arithmetic. These in turn would create new datetimes in
> __new__ as a part of the timezone arithmetic, which in turn, yeah, you
> get it...
>
> I haven't been able to solve that issue without either changing the
> API/usage, or getting infinite recursions.
>
> Benefits:
> - Proven solution through pytz.
> - Fast dst() call.
> Drawbacks:
> - Trying to use this technique with the current API tends to create
>   infinite recursions. It seems to require big API changes.
> - Slow datetime() instance creation.

I think "proven solution" is a significant benefit. Today, anyone who is
serious about correct timezone handling in Python is almost certainly
using pytz. So is adopting pytz's expanded API into the stdlib really a
big problem? It probably presents _fewer_ back-compatibility issues with
real-world code than taking a different approach from pytz would.
> === Stdlib option 2: A datetime _is_dst flag ===
>
> By having a flag on the datetime instance that says "this is in DST or
> not" the timezone implementation can be kept simpler.

Is this really adequate? pytz's implementation handles far more than "is
DST or not", it also correctly handles historical timezone changes. How
would those be handled under this proposal?

> You also have to calculate whether the datetime is in DST or not,
> either when creating it, which demands datetime object creations and
> causes infinite recursions, or when needed, which means you can get
> "ambiguous datetime" errors at unexpected times later.
>
> Also, when trying to implement this, I get bogged down in the
> complexities of how tzinfo and datetime call each other back and forth,
> and when to pass in the current is_dst and when to pass in the desired
> is_dst, etc. The API and current implementation are not designed with
> this case in mind, and it gets very tricky.
>
> Benefits:
> - Simpler tzinfo() implementations.
> Drawbacks:
> - It seems likely that we must change some APIs.
> - This in turn may affect the pytz implementation. Or not, hard to say.
> - The DST offset must use slow timezone calculations. However, since
>   datetimes are immutable it can be a cached, lazy, one-time operation.
>
> === Stdlib option 3: UTC internal representation ===
>
> Having UTC as the internal representation makes the whole issue go
> away. Datetimes are no longer ambiguous, except when creating, so
> checks need to be done during creation, but that should be possible
> without datetime creation in this case, resolving the infinite
> recursion problem.
>
> Benefits:
> - Problem solved.
> - Minimal API changes.
> Drawbacks:
> - Backwards compatibility with pickles.
> - Possible other backwards incompatibility problems.
> - Both DST offset and date time display representation must use slow
>   timezone calculations. However, since datetimes are immutable it can
>   be a cached, lazy, one-time operation.

If designing a library from scratch without any back-compat
considerations, this is probably the first approach I would try. I would
favor either solution 1 or 3.

Carl

From ryan at ryanhiebert.com Wed Apr 8 21:46:37 2015
From: ryan at ryanhiebert.com (Ryan Hiebert)
Date: Wed, 8 Apr 2015 14:46:37 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <55256754.10204@oddbird.net>
References: <55256754.10204@oddbird.net>
Message-ID: <6AD536DF-2119-4E08-BC5A-11B98AC71A41@ryanhiebert.com>

On Apr 8, 2015, at 12:37, Carl Meyer wrote:
>> Anyone interested in a session on this, mail me and we'll set up a
>> time and place!
>
> I'm interested in the topic, and would probably attend a BoF at PyCon.

I'm of a similar mind.
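The ambiguous 02:30 from Lennart's write-up can be shown concretely with
pytz's is_dst disambiguation (third-party; Europe/Warsaw fell back on
2014-10-26, so this is only an illustration of the API under discussion):

    from datetime import datetime
    import pytz

    warsaw = pytz.timezone("Europe/Warsaw")
    naive = datetime(2014, 10, 26, 2, 30)          # this wall-clock time occurs twice
    first = warsaw.localize(naive, is_dst=True)    # 02:30 CEST, UTC+02:00
    second = warsaw.localize(naive, is_dst=False)  # 02:30 CET,  UTC+01:00
    # warsaw.localize(naive, is_dst=None) raises AmbiguousTimeError instead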
From ischwabacher at wisc.edu Wed Apr 8 21:57:29 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 08 Apr 2015 14:57:29 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7750f8091a0d97.552587da@wiscmail.wisc.edu>
References: <7750f8091a0d97.552587da@wiscmail.wisc.edu>
Message-ID: <75b0de0d1a5379.552541d9@wiscmail.wisc.edu>

On 15-04-08, Lennart Regebro wrote:
> === Stdlib option 2: A datetime _is_dst flag ===
>
> By having a flag on the datetime instance that says "this is in DST or
> not" the timezone implementation can be kept simpler.

Storing the offset itself instead of a flag makes things conceptually
cleaner. You get a representation that's slightly harder to construct
from the sorts of information you have lying around (tz name, naive
datetime, is_dst flag) but is no harder to construct *and* validate, and
then is easier to work with and harder to misuse. As an added bonus, you
get a representation that's still meaningful when time changes happen
for political rather than DST reasons.

Pros:
- tzinfo.utcoffset() and local->UTC conversions don't require zoneinfo
  access.
- it's harder to represent "I know this time is DST but I don't know
  what tz it's in" [I put this in pro because I don't see how this kind
  of ambiguity can lead to anything but trouble, but ymmv]
- this representation is meaningful regardless of whether a time zone
  has DST
- this representation meaningfully disambiguates across time changes not
  resulting from DST

Cons:
- tzinfo.dst() requires zoneinfo access
- tzinfo.tzname() requires zoneinfo access IF you want those horrible
  ambiguous abbreviations (is CST America/Chicago or America/Havana?)
  [I really wanted to put this in pro]
- (datetime, offset, tz) triples require validation [but so do
  (datetime, is_dst, tz) triples, even though it's easy to pretend they
  don't]
- constructing an aware datetime from a naive one, an is_dst flag, and a
  time zone requires zoneinfo access [but this is needed for the
  validation step anyway]

On 15-04-08, Alexander Belopolsky wrote:
> With datetime, we also have a problem that POSIX APIs don't have to
> deal with: local time arithmetic. What is t + timedelta(1) when t
> falls on the day before a DST change? How would you set the isdst flag
> in the result?

It's whatever time comes 60*60*24 seconds after t in the same time zone,
because the timedelta class isn't expressive enough to represent
anything but absolute time differences (nor should it be, IMO). But it
might make sense to import dateutil.relativedelta or
mxDateTime.RelativeDateTime into the stdlib to make relative time
differences (same local time on the next day, for instance) possible.

ijs
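A fixed-offset tzinfo of the kind Isaac describes has been in the stdlib
since 3.2 as datetime.timezone; a small sketch of pinning down the
repeated 02:30 by explicit offset (dates illustrative, using the Central
European fall-back on 2014-10-26):

    from datetime import datetime, timezone, timedelta

    # The repeated 02:30 on a fall-back night, disambiguated by offset:
    before = datetime(2014, 10, 26, 2, 30, tzinfo=timezone(timedelta(hours=2)))  # CEST
    after = datetime(2014, 10, 26, 2, 30, tzinfo=timezone(timedelta(hours=1)))   # CET
    assert before < after  # the offsets make the two instants distinct and ordered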
From regebro at gmail.com Wed Apr 8 23:38:44 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 8 Apr 2015 23:38:44 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <55256754.10204@oddbird.net>
References: <55256754.10204@oddbird.net>
Message-ID:

On Wed, Apr 8, 2015 at 7:37 PM, Carl Meyer wrote:
> Hi Lennart,
>
> On 04/08/2015 09:18 AM, Lennart Regebro wrote:
>> I wrote PEP-431 two years ago, and never got around to implementing it.
>> This year I got some renewed motivation after Berker Peksağ made an
>> effort of implementing it.
>> I'm planning to work more on this during the PyCon sprints, and also
>> have a BoF session or similar during the conference.
>>
>> Anyone interested in a session on this, mail me and we'll set up a
>> time and place!
>
> I'm interested in the topic, and would probably attend a BoF at PyCon.

Cool!

> So is adopting pytz's expanded API into the stdlib really a big problem?

Maybe, maybe not. But that API is also needlessly complicated, precisely
because it's working around the limitations of datetime.tzinfo. In the
PEP I remove those limitations but keep the simpler API. With a solution
based on how pytz does it, I don't think that's possible.

> Is this really adequate? pytz's implementation handles far more than
> "is DST or not", it also correctly handles historical timezone changes.
> How would those be handled under this proposal?

Those would still be handled. The flag is only to flag if it's DST or
not in a timestamp that is otherwise ambiguous.

//Lennart

From regebro at gmail.com Wed Apr 8 23:47:45 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 8 Apr 2015 23:47:45 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <75b0de0d1a5379.552541d9@wiscmail.wisc.edu>
References: <75b0de0d1a5379.552541d9@wiscmail.wisc.edu>
On Wed, Apr 8, 2015 at 9:57 PM, Isaac Schwabacher wrote:
> On 15-04-08, Lennart Regebro wrote:
>> === Stdlib option 2: A datetime _is_dst flag ===
>>
>> By having a flag on the datetime instance that says "this is in DST or not"
>> the timezone implementation can be kept simpler.
>
> Storing the offset itself instead of a flag makes things conceptually
> cleaner.

The concept would be pretty much the same, but yes, you would be able to
save lookups in the dst() call, so that's a benefit. I was planning on
storing the dst() offset at least lazily, but I'm not sure that means we
can get rid of the flag.
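To make "lazily" concrete, one possible shape for this -- just a sketch,
not the PEP's actual design, with pytz standing in for whatever zoneinfo
access a stdlib implementation would use:

    from datetime import datetime
    import pytz

    class LazyDstInfo:
        """Carry (zone name, naive local time, is_dst); look the offset up on demand."""

        def __init__(self, zone_name, naive_dt, is_dst):
            self.zone_name = zone_name
            self.naive_dt = naive_dt
            self.is_dst = is_dst
            self._offset = None  # not computed yet

        def utcoffset(self):
            if self._offset is None:
                # The comparatively expensive zone database lookup runs
                # at most once per instance.
                tz = pytz.timezone(self.zone_name)
                self._offset = tz.localize(self.naive_dt,
                                           is_dst=self.is_dst).utcoffset()
            return self._offset

    # 01:30 on the US fall-back day is ambiguous; the flag picks a reading.
    info = LazyDstInfo('America/Chicago', datetime(2013, 11, 3, 1, 30),
                       is_dst=False)
    print(info.utcoffset())  # -1 day, 18:00:00, i.e. UTC-06:00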
> As an added bonus, you get a representation that's still meaningful when
> time changes happen for political rather than DST reasons.

That's a good point. Although you would still have to use the is_dst flag
to choose in that case, even if neither or both is actually DST. Unless we
come up with another name for that flag, which I don't think is important.

//Lennart

From alexander.belopolsky at gmail.com  Thu Apr 9 00:32:41 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 8 Apr 2015 18:32:41 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <75b0de0d1a5379.552541d9@wiscmail.wisc.edu>
On Wed, Apr 8, 2015 at 3:57 PM, Isaac Schwabacher wrote:
>
> On 15-04-08, Lennart Regebro wrote:
> > === Stdlib option 2: A datetime _is_dst flag ===
> >
> > By having a flag on the datetime instance that says "this is in DST or not"
> > the timezone implementation can be kept simpler.
>
> Storing the offset itself instead of a flag makes things conceptually
> cleaner.

It is important to distinguish two notions that are unfortunately both
called a "time zone." For lack of better terms, let me define "named
offsets" and "locations" as follows:

A "named offset" is an abbreviation such as UTC, EST, MSK or MSD which (at
any given time) corresponds to a fixed offset from UTC. Since at different
historical times the same abbreviation, such as MSK, may correspond to a
different offset, you cannot always derive one from the other, and the
extended struct tm provides fields for both: tm_gmtoff and tm_zone.

A "location" is the name of a geographical entity, such as
America/New_York, that follows the same timekeeping rules.

The isdst flag is necessary when you know the location and the local time
and want to find out the corresponding UTC time. In many locations, there
is one hour every year when knowing the local time is not enough, because
the same local time corresponds to two different UTC times. This happens
in the US when we move the clock back one hour some day in the Fall. UTC
time marches on, but local time repeats the same hour. So in order to know
what 01:30 AM is in New York, you also need to know whether it is before
we moved the clocks back or after.

So "storing the offset" and "storing a flag" are not two alternative
solutions to the same problem -- these are two solutions to two different
problems.

..

> On 15-04-08, Alexander Belopolsky wrote:
> > With datetime, we also have a problem that POSIX APIs don't have to
> > deal with: local time arithmetic. What is t + timedelta(1) when t
> > falls on the day before a DST change? How would you set the isdst
> > flag in the result?
> It's whatever time comes 60*60*24 seconds after t in the same time zone,
> because the timedelta class isn't expressive enough to represent
> anything but absolute time differences (nor should it be, IMO).

This is not what most users expect. They expect

    datetime(y, m, d, 12, tzinfo=New_York) + timedelta(1)

to be

    datetime(y, m, d+1, 12, tzinfo=New_York)

From ischwabacher at wisc.edu  Thu Apr 9 00:52:27 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 08 Apr 2015 17:52:27 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7750e1f41a3875.5525b106@wiscmail.wisc.edu>
Message-ID: <7730929f1a1248.55256adb@wiscmail.wisc.edu>

On 15-04-08, Alexander Belopolsky wrote:
>
> On Wed, Apr 8, 2015 at 3:57 PM, Isaac Schwabacher wrote:
> >
> > On 15-04-08, Lennart Regebro wrote:
> > > === Stdlib option 2: A datetime _is_dst flag ===
> > >
> > > By having a flag on the datetime instance that says "this is in DST or not"
> > > the timezone implementation can be kept simpler.
> >
> > Storing the offset itself instead of a flag makes things conceptually
> > cleaner.
>
> It is important to distinguish two notions that are unfortunately both
> called a "time zone." For lack of better terms, let me define "named
> offsets" and "locations" as follows:
>
> A "named offset" is an abbreviation such as UTC, EST, MSK or MSD which
> (at any given time) corresponds to a fixed offset from UTC. Since at
> different historical times the same abbreviation, such as MSK, may
> correspond to a different offset, you cannot always derive one from the
> other, and the extended struct tm provides fields for both: tm_gmtoff
> and tm_zone.
>
> A "location" is the name of a geographical entity, such as
> America/New_York, that follows the same timekeeping rules.
>
> The isdst flag is necessary when you know the location and the local
> time and want to find out the corresponding UTC time.
> In many locations, there is one hour every year when knowing the local
> time is not enough, because the same local time corresponds to two
> different UTC times.
>
> This happens in the US when we move the clock back one hour some day in
> the Fall. UTC time marches on, but local time repeats the same hour. So
> in order to know what 01:30 AM is in New York, you also need to know
> whether it is before we moved the clocks back or after.
>
> So "storing the offset" and "storing a flag" are not two alternative
> solutions to the same problem -- these are two solutions to two
> different problems.

I'm viewing a time zone as a map from UTC to local time; for example,
America/Chicago is a time zone. I'm not proposing storing the offset as an
alternative to storing the *time zone*, I'm proposing it as an alternative
to storing whether a given time is DST or not. We really don't care
whether a time is DST or not; we care about the local time and the offset,
and we need the time zone because otherwise we can't relate our time to
other times. So our times would look like "2013-11-03 01:30:00-0500
America/Chicago" and an hour later, "2013-11-03 01:30:00-0600
America/Chicago". And all of that information is stored in the datetime
object.

> ..
> > On 15-04-08, Alexander Belopolsky wrote:
> > > With datetime, we also have a problem that POSIX APIs don't have to
> > > deal with: local time arithmetic. What is t + timedelta(1) when t
> > > falls on the day before a DST change? How would you set the isdst
> > > flag in the result?
> >
> > It's whatever time comes 60*60*24 seconds after t in the same time
> > zone, because the timedelta class isn't expressive enough to represent
> > anything but absolute time differences (nor should it be, IMO).
>
> This is not what most users expect. They expect
>
>     datetime(y, m, d, 12, tzinfo=New_York) + timedelta(1)
>
> to be
>
>     datetime(y, m, d+1, 12, tzinfo=New_York)

The current definition of datetime.timedelta doesn't allow any reasonable
definition of

    datetime(2013, 11, 3, 23, 30, tz=zoneinfo('America/Chicago'))
    - datetime(2013, 11, 3, 0, 30, tz=zoneinfo('America/Chicago'))

other than timedelta(1), because the seconds field is not allowed to be
more than 60*60*24.
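(You can check that with pytz today, which resolves aware subtraction in
absolute time; an illustration, assuming pytz is installed:

    from datetime import datetime
    import pytz

    chicago = pytz.timezone('America/Chicago')
    a = chicago.localize(datetime(2013, 11, 3, 0, 30))   # 00:30 CDT, UTC-05:00
    b = chicago.localize(datetime(2013, 11, 3, 23, 30))  # 23:30 CST, UTC-06:00
    print(b - a)  # 1 day, 0:00:00

The wall clock only advanced 23 hours, but the repeated 01:00-02:00 hour
makes the absolute difference exactly 24 hours.)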
ijs

From alexander.belopolsky at gmail.com  Thu Apr 9 01:31:12 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 8 Apr 2015 19:31:12 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7730929f1a1248.55256adb@wiscmail.wisc.edu>
On Wed, Apr 8, 2015 at 6:52 PM, Isaac Schwabacher wrote:
>
> > So "storing the offset" and "storing a flag" are not two alternative
> > solutions to the same problem -- these are two solutions to two
> > different problems.
>
> I'm viewing a time zone as a map from UTC to local time; for example,
> America/Chicago is a time zone. I'm not proposing storing the offset as
> an alternative to storing the *time zone*, I'm proposing it as an
> alternative to storing whether a given time is DST or not.

When you are proposing to store something, you also need to specify where
you are proposing to store it. In the current design, local time
information is stored in the datetime object, and the rules that govern
the UTC-to-local mapping (and back) are stored in the tzinfo object. The
implementors of concrete tzinfo subclasses are supposed to have access to
the time zone rules and implement the utcoffset(dt), dst(dt) and
tzname(dt) methods.

Storing isdst in the datetime object would allow utcoffset(dt) to
distinguish between 1:30AM before the clock change and 1:30AM after.
Where do you propose to store the offset? If you store it in dt, why
would you need the tzinfo?

> We really don't care whether a time is DST or not;

You may not care about it, but the current tzinfo interface and the
tzinfo.fromutc(dt) method do. Whatever new features are added to the
standard library, they cannot break existing uses. This means that
whatever concrete tzinfo implementations we add to the stdlib, they must
provide an implementation of the tzinfo.dst(dt) method.

> So our times would look like "2013-11-03 01:30:00-0500 America/Chicago"
> and an hour later, "2013-11-03 01:30:00-0600 America/Chicago". And all
> of that information is stored in the datetime object.

I don't think this is what most people would expect. People are used to
seeing

    $ TZ=America/Chicago date
    Wed Apr  8 18:26:01 CDT 2015

or

    $ TZ=America/Chicago date +"%c %z"
    Wed 08 Apr 2015 06:27:09 PM CDT -0500

and not having the location as a part of their timestamps.

Regardless, whatever the proposal to add timezones to the stdlib will end
up being, it must include the ability to implement an equivalent of the
UNIX date command shown above.
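To see what hangs on that question, here is a toy version of the "flag on
the datetime" idea. Everything below is illustrative only: the isdst
attribute is hypothetical (today's datetime has no such field, so a
subclass carries it here), and the rules are hard-coded to the 2013 US
fall-back rather than read from any zone database:

    from datetime import datetime, timedelta, tzinfo

    class FlaggedDateTime(datetime):
        isdst = False  # hypothetical flag, defaulting to standard time

    class Chicago2013(tzinfo):
        """Toy zone that only knows about the 2013-11-03 fall-back."""

        def utcoffset(self, dt):
            # 01:00-01:59 on 2013-11-03 occurs twice; the flag breaks the tie.
            if (dt.year, dt.month, dt.day, dt.hour) == (2013, 11, 3, 1):
                return timedelta(hours=-5) if dt.isdst else timedelta(hours=-6)
            return timedelta(hours=-6)  # everything else treated as CST

        def dst(self, dt):
            return timedelta(hours=1) if dt.isdst else timedelta(0)

        def tzname(self, dt):
            return 'CDT' if dt.isdst else 'CST'

    first = FlaggedDateTime(2013, 11, 3, 1, 30, tzinfo=Chicago2013())
    first.isdst = True  # the 01:30 before the clocks went back
    second = FlaggedDateTime(2013, 11, 3, 1, 30, tzinfo=Chicago2013())

    print(first.utcoffset())   # -1 day, 19:00:00  (UTC-05:00)
    print(second.utcoffset())  # -1 day, 18:00:00  (UTC-06:00)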
From rosuav at gmail.com  Thu Apr 9 01:32:12 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Thu, 9 Apr 2015 09:32:12 +1000
Subject: [Python-Dev] Status on PEP-431 Timezones
On Thu, Apr 9, 2015 at 8:32 AM, Alexander Belopolsky wrote:
> A "named offset" is an abbreviation such as UTC, EST, MSK or MSD which
> (at any given time) corresponds to a fixed offset from UTC.

That assumes the abbreviations are unique. They're not. Just this morning
I had to explain to a new student of mine that no, my time zone is not
"EST" = New York time, it's actually "EST" = Melbourne time. Granted, most
of the time New York and Melbourne are on opposite DST schedules, so one
will be EST and one EDT, but that trick won't always help you.

(BTW, thanks Lennart for your "Blame it on Caesar" PyCon talk. I linked my
student to it as a "for further information" resource. Good fun, and a
great summary of why political time is such a minefield.)

If this proposal goes through, in some form or another, will there be One
Obvious Way to do timezone-aware calculations in Python? That would
definitely be an improvement.
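A quick way to see the collision (illustrative; pytz assumed, and the
abbreviations printed depend on your tz database version -- older
releases used the bare "EST" for Melbourne described above, newer ones
say AEST/AEDT):

    from datetime import datetime
    import pytz

    for zone in ('America/New_York', 'Australia/Melbourne'):
        tz = pytz.timezone(zone)
        jan = tz.localize(datetime(2015, 1, 15, 12, 0)).tzname()
        jul = tz.localize(datetime(2015, 7, 15, 12, 0)).tzname()
        print(zone, jan, jul)  # New York prints EST EDT; Melbourne varies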
ChrisA

From alexander.belopolsky at gmail.com  Thu Apr 9 01:44:49 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 8 Apr 2015 19:44:49 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
On Wed, Apr 8, 2015 at 7:32 PM, Chris Angelico wrote:
> On Thu, Apr 9, 2015 at 8:32 AM, Alexander Belopolsky wrote:
> > A "named offset" is an abbreviation such as UTC, EST, MSK or MSD which
> > (at any given time) corresponds to a fixed offset from UTC.
>
> That assumes the abbreviations are unique. They're not. Just this
> morning I had to explain to a new student of mine that no, my time
> zone is not "EST" = New York time, it's actually "EST" = Melbourne
> time. Granted, most of the time New York and Melbourne are on opposite
> DST schedules, so one will be EST and one EDT, but that trick won't
> always help you.

I should have been more precise in my definitions. A "named offset" is a
pair (tm_gmtoff, tm_zone). Given a location and a UTC time (UNIX
timestamp), you should be able to produce a "named offset":

    $ TZ=Australia/Melbourne date -d @1428536256 +"%z %Z"
    +1000 EST

The "name" part is usually redundant, but convenient for human readers.
The opposite is not true: you cannot derive the location from either or
both parts.
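The same derivation spelled in Python (an illustration; pytz assumed, and
the abbreviation again depends on the tz database vintage):

    from datetime import datetime
    import pytz

    melbourne = pytz.timezone('Australia/Melbourne')
    dt = datetime.fromtimestamp(1428536256, melbourne)  # location + UNIX timestamp
    print(dt.strftime('%z %Z'))  # +1000 EST (or "+1000 AEST" with newer tz data)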
From AlexMLord at gmail.com  Thu Apr 9 02:11:50 2015
From: AlexMLord at gmail.com (Alex Lord)
Date: Wed, 8 Apr 2015 20:11:50 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
Newb question time, what's BoF

On Wed, Apr 8, 2015 at 7:31 PM, Alexander Belopolsky wrote:
> On Wed, Apr 8, 2015 at 6:52 PM, Isaac Schwabacher wrote:
> > > So "storing the offset" and "storing a flag" are not two alternative
> > > solutions to the same problem -- these are two solutions to two
> > > different problems.
> >
> > I'm viewing a time zone as a map from UTC to local time; [...]
>
> [... full reply quoted above ...]
>
> Regardless, whatever the proposal to add timezones to the stdlib will
> end up being, it must include the ability to implement an equivalent of
> the UNIX date command shown above.
From alexander.belopolsky at gmail.com  Thu Apr 9 02:20:00 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 8 Apr 2015 20:20:00 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
On Wed, Apr 8, 2015 at 8:11 PM, Alex Lord wrote:
> Newb question time, what's BoF

http://en.wikipedia.org/wiki/Birds_of_a_feather_%28computing%29

From regebro at gmail.com  Thu Apr 9 04:34:36 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Thu, 9 Apr 2015 04:34:36 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
On Thu, Apr 9, 2015 at 1:31 AM, Alexander Belopolsky wrote:
> Storing isdst in the datetime object would allow utcoffset(dt) to
> distinguish between 1:30AM before the clock change and 1:30AM after.
> Where do you propose to store the offset? If you store it in dt, why
> would you need the tzinfo?

Because otherwise you don't know what tzinfo the datetime uses, and you
need to know that if you add or subtract a timedelta, or compare with
another datetime.

> Regardless, whatever the proposal to add timezones to the stdlib will
> end up being, it must include the ability to implement an equivalent of
> the UNIX date command shown above.

That's a strftime() issue; I think that issue is already solved.
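For what it's worth, strftime() on a correctly localized datetime can
already reproduce the date output. A sketch, assuming pytz, with the
timestamp back-computed to match Alexander's example:

    from datetime import datetime
    import pytz

    dt = datetime.fromtimestamp(1428535561, pytz.timezone('America/Chicago'))
    print(dt.strftime('%a %b %d %H:%M:%S %Z %Y'))  # Wed Apr 08 18:26:01 CDT 2015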
From ischwabacher at wisc.edu  Thu Apr 9 17:59:44 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Thu, 09 Apr 2015 10:59:44 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76c089df1136a9.5526a18f@wiscmail.wisc.edu>
Message-ID: <7720d92a112c25.55265ba0@wiscmail.wisc.edu>

On 15-04-08, Alexander
Belopolsky wrote:
>
> On Wed, Apr 8, 2015 at 6:52 PM, Isaac Schwabacher wrote:
> >
> > > So "storing the offset" and "storing a flag" are not two
> > > alternative solutions to the same problem - these are two solutions
> > > to two different problems.
> >
> > I'm viewing a time zone as a map from UTC to local time; for example,
> > America/Chicago is a time zone. I'm not proposing storing the offset
> > as an alternative to storing the *time zone*, I'm proposing it as an
> > alternative to storing whether a given time is DST or not.
>
> When you are proposing to store something, you also need to specify
> where you are proposing to store it. In the current design, local time
> information is stored in the datetime object, and the rules that govern
> the UTC-to-local mapping (and back) are stored in the tzinfo object.
> The implementors of concrete tzinfo subclasses are supposed to have
> access to time zone rules and implement the utcoffset(dt), dst(dt) and
> tzname(dt) methods.
>
> Storing isdst in the datetime object would allow utcoffset(dt) to
> distinguish between 1:30AM before the clock change and 1:30AM after.
> Where do you propose to store the offset?

I propose to add an offset field to datetime.datetime. It should be set
precisely when the tz field is set, and it should contain either a
timedelta or an integer number of (possibly fractional) seconds,
depending on what color the bikeshed gets painted. This is, IIUC,
precisely where Lennart is proposing to store the is_dst flag.

I just looked through the datetime documentation, and it looks like the
currently blessed way of creating an aware datetime from a naive local
datetime and a tzinfo is datetime.replace(), which is too low-level to
handle the job. This method will either need to grow some validation, or
there will need to be a tzinfo.localize(localtime, is_dst) method as
well.

> If you store it in dt, why would you need the tzinfo?

You can't do arithmetic without a time zone. Which is to say, if you try
to do arithmetic without a time zone, you're implicitly doing it in a
fixed-offset time zone. So there's no way to disambiguate between
"2013-11-03 01:30:00-0500 America/Chicago" + "1 hour" = "2013-11-03
01:30:00-0600 America/Chicago" and "2013-11-03 01:30:00-0500
America/New_York" + "1 hour" = "2013-11-03 02:30:00-0500
America/New_York".
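Concretely, the America/Chicago half of that example, sketched with the
third-party pytz package (its localize()/normalize() dance approximates
what a (localtime, offset, tzinfo) datetime would do natively; nothing
here is proposed stdlib API):

from datetime import datetime, timedelta
import pytz

chicago = pytz.timezone("America/Chicago")
# The repeated 01:30 on 2013-11-03, disambiguated by the is_dst flag:
first = chicago.localize(datetime(2013, 11, 3, 1, 30), is_dst=True)
print(first)                                     # 2013-11-03 01:30:00-05:00
# One elapsed hour later is the *same* wall clock at the other offset:
print(chicago.normalize(first + timedelta(hours=1)))
                                                 # 2013-11-03 01:30:00-06:00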
> > We really don't care whether a time is DST or not;
>
> You may not care about it, but the current tzinfo interface and the
> tzinfo.fromutc(dt) method do. Whatever new features are added to the
> standard library, they cannot break existing uses. This means that
> whatever concrete tzinfo implementations we add to stdlib, they must
> provide an implementation of the tzinfo.dst(dt) method.

I don't really mean we don't care; I mean that it's not close to
orthogonal to the other things we care about in the way that local time
and UTC time are. You'll never ask for *just* the DST flag. So making you
look up in zoneinfo whether a given (localtime, offset, tzinfo) triple
represents DST or STD is not as onerous, because if you used the other
representation, you'd still be looking in zoneinfo for some other piece
of information that the user asked for at the same time. So why not
implement this in datetime.astimezone()?

> > So our times would look like "2013-11-03 01:30:00-0500 America/Chicago"
> > and an hour later, "2013-11-03 01:30:00-0600 America/Chicago". And all
> > of that information is stored in the datetime object.
>
> I don't think this is what most people would expect
>
> $ TZ=America/Chicago date
> Wed Apr 8 18:26:01 CDT 2015
>
> or
>
> $ TZ=America/Chicago date +"%c %z"
> Wed 08 Apr 2015 06:27:09 PM CDT -0500
>
> and not have location as a part of their timestamps.
>
> Regardless, whatever the proposal to add timezones to stdlib will end up
> being, it must include the ability to implement an equivalent of the
> UNIX date command shown above.

Sorry, I meant that to show off the internal representation, rather than
intending it as the literal string representation. But I *do* think that
the right representation of a time zone-aware time is (localtime, offset,
tzinfo), and the __repr__ should reflect that. Anyone who wants "CDT"
instead of "-0500 America/Chicago" is talking to a human instead of a
computer and should use strftime.

ijs
From alexander.belopolsky at gmail.com Thu Apr 9 18:20:50 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 12:20:50 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7720d92a112c25.55265ba0@wiscmail.wisc.edu>
Message-ID: 

On Thu, Apr 9, 2015 at 11:59 AM, Isaac Schwabacher wrote:

> I just looked through the datetime documentation, and it looks like the
> currently blessed way of creating an aware datetime from a naive local
> datetime and a tzinfo is datetime.replace, which is too low level to
> handle the job.

Not quite. That's how you would create a UTC datetime, but from there you
can get to your local timezone by calling astimezone() with no arguments:

>>> print(datetime.now(timezone.utc).astimezone())
2015-04-09 12:16:58.576272-04:00
From ischwabacher at wisc.edu Thu Apr 9 18:49:51 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Thu, 09 Apr 2015 11:49:51 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7650d796111e1b.5526ad79@wiscmail.wisc.edu>
Message-ID: <765096e31165cc.5526675f@wiscmail.wisc.edu>

On 15-04-09, Alexander Belopolsky wrote:
>
> On Thu, Apr 9, 2015 at 11:59 AM, Isaac Schwabacher wrote:
>
> > I just looked through the datetime documentation, and it looks like
> > the currently blessed way of creating an aware datetime from a naive
> > local datetime and a tzinfo is datetime.replace, which is too low
> > level to handle the job.
>
> Not quite. That's how you would create a UTC datetime, but from there
> you can get to your local timezone by calling astimezone() with no
> arguments:
>
> >>> print(datetime.now(timezone.utc).astimezone())
> 2015-04-09 12:16:58.576272-04:00

All of the datetimes in this example are time zone aware; I meant if
someone passed you datetime(2013, 11, 3, 1, 30) without a time zone.
astimezone assumes that the input naive time is UTC, which is not the
case here.
ijs

From alexander.belopolsky at gmail.com Thu Apr 9 18:55:19 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 12:55:19 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7720d92a112c25.55265ba0@wiscmail.wisc.edu>
Message-ID: 

On Thu, Apr 9, 2015 at 11:59 AM, Isaac Schwabacher wrote:

> > Storing isdst in the datetime object would allow utcoffset(dt) to
> > distinguish between 1:30AM before the clock change and 1:30AM after.
> > Where do you propose to store the offset?
>
> I propose to add an offset field to datetime.datetime. It should be set
> precisely when the tz field is set, and it should contain either a
> timedelta or an integer number of (possibly fractional) seconds,
> depending on what color the bikeshed gets painted. This is, IIUC,
> precisely where Lennart is proposing to store the is_dst flag.

Can you describe a situation where having an isdst flag is not
sufficient?

Note that in many applications you want to find the appropriate offset
knowing only local time and location, so requiring users to supply the
offset defeats the purpose of the time zone database. In many
applications, the isdst flag can be hidden from the user. For example, a
scheduling application can pretend that a repeated hour does not exist
and always assume that 01:30 AM is the first 01:30 AM. (In many business
applications, it is a good idea anyway.) Alternatively, a user attempting
to schedule an event at an ambiguous time can be presented with a warning
and a request to pick one of two possible times. This would be a much
friendlier interface than one that always requires the user to specify
the UTC offset.
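A sketch of that "first 01:30" policy, written against pytz
(schedule_first() is an invented helper name, not an existing API):

from datetime import datetime
import pytz

def schedule_first(naive, zone_name):
    """Resolve an ambiguous wall time to its first occurrence."""
    tz = pytz.timezone(zone_name)
    try:
        return tz.localize(naive, is_dst=None)   # raises if ambiguous
    except pytz.exceptions.AmbiguousTimeError:
        return tz.localize(naive, is_dst=True)   # the pre-change reading

print(schedule_first(datetime(2013, 11, 3, 1, 30), "America/Chicago"))
# 2013-11-03 01:30:00-05:00 -- the first of the two 01:30s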
From alexander.belopolsky at gmail.com Thu Apr 9 19:00:45 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 13:00:45 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <765096e31165cc.5526675f@wiscmail.wisc.edu>
Message-ID: 

On Thu, Apr 9, 2015 at 12:49 PM, Isaac Schwabacher wrote:

> if someone passed you datetime(2013, 11, 3, 1, 30) without a time zone.
> astimezone assumes that the input naive time is UTC, which is not the
> case here.

No, it does not. Please read the documentation: "self must be aware
(self.tzinfo must not be None, and self.utcoffset() must not return
None)."
<https://docs.python.org/3/library/datetime.html#datetime.datetime.astimezone>

From cristifati0 at gmail.com Thu Apr 9 19:01:33 2015
From: cristifati0 at gmail.com (Cristi Fati)
Date: Thu, 9 Apr 2015 20:01:33 +0300
Subject: [Python-Dev] ctypes module
In-Reply-To: 
References: 
Message-ID: 

Sorry if I'm pushy... Would it be worth me sending a patch with the
changes? (I mean, would it bring value to the product? Were there
requests for ctypes on WinIA64?)

If the answer to the above question is not "No", since the changes are in
libffi, should I send patches to you or to the libffi guys? (I'm
unfamiliar with the licensing terms.)

Regards,
Cristi Fati.

On Wed, Apr 8, 2015 at 3:49 PM, Cristi Fati wrote:

> Hi all, thank you for your responses. Apparently I was wrong in my
> previous email: ctypes (1.0.1 or 1.0.2) didn't have support for WinIA64
> (libffi); there was an in-house implementation. However, we are using
> the IA64 module extensively (including to successfully call
> LsaLogonUser).
>
> A much simpler example:
>
> Python 2.7.3 (default, Oct 22 2014, 12:21:16) [MSC v.1600 64 bit
> (Itanium)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
> >>> import ctypes
> >>> ctypes.cdll.kernel32.GetTickCount()
> -79897956
> >>>
>
> On Wed, Apr 8, 2015 at 3:17 PM, Maciej Fijalkowski wrote:
>
>> for the record libffi supports itanium officially (but as usual I'm
>> very skeptical how well it works on less used platforms)
>> https://sourceware.org/libffi/
>>
>> On Wed, Apr 8, 2015 at 1:32 PM, Nick Coghlan wrote:
>> > On 8 April 2015 at 20:36, Maciej Fijalkowski wrote:
>> >> I presume the reason was that no one wants to maintain code for the
>> >> case where there are no buildbots available and there is no
>> >> development time available. You are free to put back in the files
>> >> and see if they work (they might not), but such things are usually
>> >> removed if they're a maintenance burden. I would be happy to assist
>> >> you with finding someone willing to do commercial maintenance of
>> >> ctypes for itanium, but asking python devs to do it for free is a
>> >> bit too much.
>> >
>> > As a point of reference, even Red Hat dropped Itanium support for
>> > RHEL6+ - you have to go all the way back to RHEL5 to find a version
>> > we still support running on Itanium.
>> >
>> > For most of CPython, keeping it running on arbitrary architectures
>> > often isn't too difficult, as libc abstracts away a lot of the
>> > hardware details. libffi (and hence ctypes) are notable exceptions
>> > to that :)
>> >
>> > Cheers,
>> > Nick.
>> >
>> > --
>> > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
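As an aside on the GetTickCount() output quoted above: the negative
number is just ctypes' default signed-int return type, not a broken
build. Declaring the result type gives the expected unsigned count (a
Windows-only sketch; the printed value is illustrative):

import ctypes

kernel32 = ctypes.windll.kernel32                # Windows only
kernel32.GetTickCount.restype = ctypes.c_uint32  # default restype is c_int
print(kernel32.GetTickCount())                   # e.g. 4215069340,
                                                 # not -79897956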
From alexander.belopolsky at gmail.com Thu Apr 9 21:25:43 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 15:25:43 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76509083115717.55268789@wiscmail.wisc.edu>
Message-ID: 

On Thu, Apr 9, 2015 at 3:07 PM, Isaac Schwabacher wrote:

> > No, it does not. Please read the documentation: "self must be aware
> > (self.tzinfo must not be None, and self.utcoffset() must not return
> > None)."
>
> Whoops, you're right. But that's even worse -- it doesn't give you a
> way to convert a naive datetime at all. Currently the only way from
> "2013-11-03 01:30:00" to "2013-11-03 01:30:00-0500 America/Chicago" is
> still datetime.replace().

Well, you are right, but at least we do have a localtime utility hidden
in the email package:

>>> from datetime import *
>>> from email.utils import localtime
>>> print(localtime(datetime.now()))
2015-04-09 15:19:12.840000-04:00

You can read the discussion for the reasons it did not make it into
datetime.
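That helper also accepts a mktime-style isdst flag (a sketch; the printed
offsets assume the process runs with TZ=America/Chicago under the 2013
rules):

from datetime import datetime
from email.utils import localtime

ambiguous = datetime(2013, 11, 3, 1, 30)  # naive; this wall time occurs twice
print(localtime(ambiguous, isdst=1))      # 2013-11-03 01:30:00-05:00 (CDT)
print(localtime(ambiguous, isdst=0))      # 2013-11-03 01:30:00-06:00 (CST)
print(localtime(ambiguous))               # isdst=-1: let the platform guess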
From alexander.belopolsky at gmail.com Thu Apr 9 21:58:53 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 15:58:53 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76c0f92b11575d.55268f0d@wiscmail.wisc.edu>
Message-ID: 

On Thu, Apr 9, 2015 at 3:39 PM, Isaac Schwabacher wrote:

> > Well, you are right, but at least we do have a localtime utility
> > hidden in the email package:
> >
> > >>> from datetime import *
> > >>> from email.utils import localtime
> > >>> print(localtime(datetime.now()))
> > 2015-04-09 15:19:12.840000-04:00
>
> But that's restricted to the system time zone. Nothing good ever comes
> from the system time zone...

Let's solve one problem at a time. You cannot have complete TZ support
without supporting the local timezone. The advantage of starting with the
local timezone is that most systems have a semi-standard interface for
obtaining the local zone name and offset for any UTC time.

To support multiple zones, one would need to bundle an external library,
which opens a lot of questions that are orthogonal to the issue at hand:
how do we deal with ambiguous local times in the presence of DST or other
political time shifts? email.utils.localtime solves this problem in the
same way as POSIX mktime does: by introducing an optional isdst flag. The
main objection to this interface was that in most use cases the
programmer has no intelligent way to set this flag to anything other than
-1. An alternative solution would be to have localtime(dt) (or
astimezone(dt)) return a list of aware local times that in most cases
will contain only one element, but may contain 0 or 2 when dt falls on a
switchover.
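One possible shape of that alternative, mocked up on pytz
(local_candidates() is an invented name, not a proposed API):

from datetime import datetime
import pytz

def local_candidates(naive, tz):
    """Every aware reading of a naive wall time: usually 1,
    2 during a fall-back, 0 in a spring-forward gap."""
    try:
        return [tz.localize(naive, is_dst=None)]
    except pytz.exceptions.AmbiguousTimeError:
        return [tz.localize(naive, is_dst=True),
                tz.localize(naive, is_dst=False)]
    except pytz.exceptions.NonExistentTimeError:
        return []

chicago = pytz.timezone("America/Chicago")
print(local_candidates(datetime(2013, 11, 3, 1, 30), chicago))  # 2 readings
print(local_candidates(datetime(2013, 3, 10, 2, 30), chicago))  # []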
From ischwabacher at wisc.edu Thu Apr 9 21:07:05 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Thu, 09 Apr 2015 14:07:05 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <726096741127d9.5526cda9@wiscmail.wisc.edu>
Message-ID: <76509083115717.55268789@wiscmail.wisc.edu>

On 15-04-09, Alexander Belopolsky wrote:
>
> On Thu, Apr 9, 2015 at 11:59 AM, Isaac Schwabacher wrote:
> >
> > > Storing isdst in the datetime object would allow utcoffset(dt) to
> > > distinguish between 1:30AM before the clock change and 1:30AM
> > > after. Where do you propose to store the offset?
> >
> > I propose to add an offset field to datetime.datetime. It should be
> > set precisely when the tz field is set, and it should contain either
> > a timedelta or an integer number of (possibly fractional) seconds,
> > depending on what color the bikeshed gets painted. This is, IIUC,
> > precisely where Lennart is proposing to store the is_dst flag.
>
> Can you describe a situation where having an isdst flag is not
> sufficient?
>
> Note that in many applications you want to find the appropriate offset
> knowing only local time and location, so requiring users to supply the
> offset defeats the purpose of the time zone database. In many
> applications, the isdst flag can be hidden from the user.
> For example, a scheduling application can pretend that a repeated hour
> does not exist and always assume that 01:30 AM is the first 01:30 AM.
> (In many business applications, it is a good idea anyway.)
> Alternatively, a user attempting to schedule an event at an ambiguous
> time can be presented with a warning and a request to pick one of two
> possible times. This would be a much friendlier interface than one that
> always requires the user to specify the UTC offset.

I'm suggesting an internal representation, not a user interface. The only
user-facing consequence of this internal representation would be exposing
the offset to datetime.replace() and as a field on the object. All of the
other interfaces that convert naive datetimes into aware datetimes would
still use the is_dst flag as specified in the PEP. The intention is to
make it easy to implement arithmetic and things like relative timedeltas.

But looking at datetime.h, which seems to be willing to sacrifice
conceptual simplicity in order to pack a datetime into as few bytes in
memory as possible, it seems like whatever made that a good idea makes
this a bad one. :/

> > if someone passed you datetime(2013, 11, 3, 1, 30) without a time
> > zone. astimezone assumes that the input naive time is UTC, which is
> > not the case here.
>
> No, it does not. Please read the documentation: "self must be aware
> (self.tzinfo must not be None, and self.utcoffset() must not return
> None)."

Whoops, you're right. But that's even worse -- it doesn't give you a way
to convert a naive datetime at all. Currently the only way from
"2013-11-03 01:30:00" to "2013-11-03 01:30:00-0500 America/Chicago" is
still datetime.replace().

ijs
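For illustration only, the (localtime, offset, tzinfo) triple can be
mocked up today on pytz; AwareStamp and stamp() are invented names with
no relation to any actual patch:

from collections import namedtuple
from datetime import datetime
import pytz

AwareStamp = namedtuple("AwareStamp", "localtime offset tz")

def stamp(aware):
    """Split an aware datetime into the proposed triple."""
    return AwareStamp(aware.replace(tzinfo=None),
                      aware.utcoffset(), aware.tzinfo)

chicago = pytz.timezone("America/Chicago")
early = stamp(chicago.localize(datetime(2013, 11, 3, 1, 30), is_dst=True))
late = stamp(chicago.localize(datetime(2013, 11, 3, 1, 30), is_dst=False))
print(early.localtime == late.localtime)  # True: identical wall clock
print(early.offset, late.offset)          # differ: offset disambiguates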
From ischwabacher at wisc.edu Thu Apr 9 21:39:09 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Thu, 09 Apr 2015 14:39:09 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76c0914511604d.5526d522@wiscmail.wisc.edu>
Message-ID: <76c0f92b11575d.55268f0d@wiscmail.wisc.edu>

On 15-04-09, Alexander Belopolsky wrote:
>
> On Thu, Apr 9, 2015 at 3:07 PM, Isaac Schwabacher wrote:
> >
> > > No, it does not. Please read the documentation: "self must be aware
> > > (self.tzinfo must not be None, and self.utcoffset() must not return
> > > None)."
> >
> > Whoops, you're right. But that's even worse -- it doesn't give you a
> > way to convert a naive datetime at all.
> > Currently the only way from "2013-11-03 01:30:00" to "2013-11-03 01:30:00-0500 America/Chicago" is still datetime.replace().
>
> Well, you are right, but at least we do have a localtime utility hidden in the email package:
>
> >>> from datetime import *
> >>> from email.utils import localtime
> >>> print(localtime(datetime.now()))
> 2015-04-09 15:19:12.840000-04:00
>
> You can read for the reasons it did not make it into datetime.

But that's restricted to the system time zone. Nothing good ever comes from the system time zone...

ijs

From ischwabacher at wisc.edu Thu Apr 9 22:51:29 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Thu, 09 Apr 2015 15:51:29 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7720a4d6111b67.5526e63e@wiscmail.wisc.edu>
References: <7720b012114283.552690c7@wiscmail.wisc.edu> ... <7720a4d6111b67.5526e63e@wiscmail.wisc.edu>
Message-ID: <76f0a490117c27.5526a001@wiscmail.wisc.edu>

On 15-04-09, Alexander Belopolsky wrote:
> > On Thu, Apr 9, 2015 at 3:39 PM, Isaac Schwabacher wrote:
> > > > Well, you are right, but at least we do have a localtime utility hidden in the email package:
> > > >
> > > > >>> from datetime import *
> > > > >>> from email.utils import localtime
> > > > >>> print(localtime(datetime.now()))
> > > > 2015-04-09 15:19:12.840000-04:00
> > > >
> > > > You can read for the reasons it did not make it into datetime.
> > >
> > > But that's restricted to the system time zone. Nothing good ever comes from the system time zone...
>
> Let's solve one problem at a time. You cannot have complete TZ support without supporting the local timezone. The advantage of starting with the local timezone is that most systems have a semi-standard interface to obtain local zone name and offset for any UTC time.
> To support multiple zones, one would need to bundle an external library, which opens a lot of questions that are orthogonal to the issue at hand: how to deal with ambiguous local times in the presence of DST or other political time shifts?

PEP 431 proposes to import zoneinfo into the stdlib, so there will be a standard way to get this information. That's kind of the whole point. But there's really no good cross-platform way to get the name of the system time zone. tzlocal figures out what it is for certain classes of systems for which it's possible, and is also being proposed for stdlib inclusion in the PEP.

> email.utils.localtime solves this problem in the same way as POSIX mktime does: by introducing an optional isdst flag. The main objection to this interface was that in most use cases the programmer has no intelligent way to set this flag to anything other than -1.
>
> An alternative solution would be to have localtime(dt) (or astimezone(dt)) return a list of aware local times that in most cases will contain only one element, but may contain 0 or 2 when dt falls on a switchover.

This functionality should probably be exposed (in Ruby it's TZInfo::TimeZone#periods_for_local, where TZInfo is an external package; I don't think it's easily available in pytz), but if it's the default way to do a conversion, I suspect that it's more likely to just lead to poor error reporting.

ijs

From regebro at gmail.com Thu Apr 9 22:56:52 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Thu, 9 Apr 2015 22:56:52 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References: <7640a73b11533e.55269104@wiscmail.wisc.edu> ... <76c0f92b11575d.55268f0d@wiscmail.wisc.edu>
Message-ID:

There is plenty of time to book an open space now. I would prefer sometime Saturday. Any wishes?

From alexander.belopolsky at gmail.com Fri Apr 10 04:21:35 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 22:21:35 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
Message-ID:

On Thu, Apr 9, 2015 at 4:51 PM, Isaac Schwabacher wrote:
> > > > Well, you are right, but at least we do have a localtime utility hidden in the email package:
> > > >
> > > > >>> from datetime import *
> > > > >>> from email.utils import localtime
> > > > >>> print(localtime(datetime.now()))
> > > > 2015-04-09 15:19:12.840000-04:00
> > > >
> > > > You can read for the reasons it did not make it into datetime.
> > >
> > > But that's restricted to the system time zone. Nothing good ever comes from the system time zone...
> >
> > Let's solve one problem at a time. ...
>
> PEP 431 proposes to import zoneinfo into the stdlib, ...

I am changing the subject so that we can focus on one question without diverting to PEP-size issues that are better suited for python-ideas.

I would like to add functionality to the datetime module that would solve a seemingly simple problem: given a naive datetime instance assumed to be in local time, construct the corresponding aware datetime object with
tzinfo set to an appropriate fixed offset datetime.timezone instance.

Python 3 has this functionality implemented in the email package since version 3.3, and it appears to work well even in the ambiguous hour

>>> from email.utils import localtime
>>> from datetime import datetime
>>> localtime(datetime(2014,11,2,1,30)).strftime('%c %z %Z')
'Sun Nov 2 01:30:00 2014 -0400 EDT'
>>> localtime(datetime(2014,11,2,1,30), isdst=0).strftime('%c %z %Z')
'Sun Nov 2 01:30:00 2014 -0500 EST'

However, in a location with a more interesting history, you can get a situation tha

From alexander.belopolsky at gmail.com Fri Apr 10 05:14:10 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Thu, 9 Apr 2015 23:14:10 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

Sorry for a truncated message. Please scroll past the quoted portion.

On Thu, Apr 9, 2015 at 10:21 PM, Alexander Belopolsky <alexander.belopolsky at gmail.com> wrote:
>
> On Thu, Apr 9, 2015 at 4:51 PM, Isaac Schwabacher wrote:
>> > > > Well, you are right, but at least we do have a localtime utility hidden in the email package:
>> > > >
>> > > > >>> from datetime import *
>> > > > >>> from email.utils import localtime
>> > > > >>> print(localtime(datetime.now()))
>> > > > 2015-04-09 15:19:12.840000-04:00
>> > > >
>> > > > You can read for the reasons it did not make it into datetime.
>> > >
>> > > But that's restricted to the system time zone. Nothing good ever comes from the system time zone...
>> >
>> > Let's solve one problem at a time. ...
>>
>> PEP 431 proposes to import zoneinfo into the stdlib, ...
>
> I am changing the subject so that we can focus on one question without diverting to PEP-size issues that are better suited for python-ideas.
>
> I would like to add functionality to the datetime module that would solve a seemingly simple problem: given a naive datetime instance assumed to be in local time, construct the corresponding aware datetime object with tzinfo set to an appropriate fixed offset datetime.timezone instance.
>
> Python 3 has this functionality implemented in the email package since version 3.3, and it appears to work well even in the ambiguous hour
>
> >>> from email.utils import localtime
> >>> from datetime import datetime
> >>> localtime(datetime(2014,11,2,1,30)).strftime('%c %z %Z')
> 'Sun Nov 2 01:30:00 2014 -0400 EDT'
> >>> localtime(datetime(2014,11,2,1,30), isdst=0).strftime('%c %z %Z')
> 'Sun Nov 2 01:30:00 2014 -0500 EST'
>
> However, in a location with a more interesting history, you can get a
> situation that
Look what happened on July 1, 1990. At 2 AM, the clocks in Ukraine were moved back one hour. So times like 01:30 AM happened twice there on that day. Let's see how Python handles this situation $ TZ=Europe/Kiev python3 >>> from email.utils import localtime >>> from datetime import datetime >>> localtime(datetime(1990,7,1,1,30)).strftime('%c %z %Z') 'Sun Jul 1 01:30:00 1990 +0400 MSD' So far so good, I've got the first of the two 01:30AM's. But what if I want the other 01:30AM? Well, >>> localtime(datetime(1990,7,1,1,30), isdst=0).strftime('%c %z %Z') 'Sun Jul 1 01:30:00 1990 +0300 EEST' gives me "the other 01:30AM", but it is counter-intuitive: I have to ask for the standard (winter) time to get the daylight savings (summer) time. The uncertainty about how to deal with the repeated hour was the reason why email.utils.localtime-like interface did not make it to the datetime module. The main objection to the isdst flag was that in most situations, determining whether DST is in effect is as hard as finding the UTC offset, so reducing the problem of finding the UTC offset to the one of finding the value for isdst does not solve much. I now realize that the problem is simply in the name for the flag. While we cannot often tell what isdst should be and in some situations the actual DST status does not differentiate between the two possible times, we can always say whether we want to get the first or the second time. In other words, instead of localtime(dt, isdst=-1), we may want localtime(dt, which=0) where "which" is used to resolve the ambiguity: "which=0" means return the first (in UTC order) of the two times and "which=1" means return the second. (In the non-ambiguous cases "which" is ignored.) An alternative solution would be make localtime(dt) return a list of 0, 1 or 2 instances, but this will probably make a common usage (the case when the user does not care which time she gets) more cumbersome. -------------- next part -------------- An HTML attachment was scrubbed... 
From ncoghlan at gmail.com Fri Apr 10 12:12:02 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 10 Apr 2015 06:12:02 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76509083115717.55268789@wiscmail.wisc.edu>
References: <7620c53c1a66f0.5525af9b@wiscmail.wisc.edu> ... <76509083115717.55268789@wiscmail.wisc.edu>
Message-ID:

On 9 Apr 2015 16:41, "Isaac Schwabacher" wrote:
> I'm suggesting an internal representation, not a user interface. The only user-facing consequence of this internal representation would be exposing the offset to datetime.replace() and as a field on the object. All of the other interfaces that convert naive datetimes into aware datetimes would still use the is_dst flag as specified in the PEP. The intention is to make it easy to implement arithmetic and things like relative timedeltas.
>
> But looking at datetime.h, which seems to be willing to sacrifice conceptual simplicity in order to pack a datetime into as few bytes in memory as possible, it seems like whatever made that a good idea makes this a bad one. :/

The question of "store the DST flag" vs "store the offset" is essentially a data normalisation one - there's only a single bit of additional information actually needed (whether the time is DST or not in the annual hour of ambiguity), which can then be combined with the local time, the location and the zone info database to get the *actual* offset. The developer-facing API should thus focus on letting developers provide that additional bit of information needed to query the zone info database correctly.

Caching the result of the database zone offset lookup on the instance may be useful as a speed optimisation (at the cost of additional memory usage per instance), but that difference should be quantified before introducing the redundancy, rather than being assumed to be necessary. As an internal space/speed trade-off, it's also not a decision that needs to be made as part of the PEP discussion - the external API will be the same regardless of whether or not the offset is cached after looking it up from the zone info database.

A question the PEP perhaps *should* consider is whether or not to offer an API allowing datetime objects to be built from a naive datetime, a fixed offset and a location, throwing NonExistentTimeError if the given date, time and offset doesn't match either the DST or non-DST times at that location.

Cheers,
Nick.

P.S. The description of NonExistentTimeError in the PEP doesn't seem quite right, as it currently says it will only be thrown if "is_dst=None", which seems like a copy and paste error from the definition of AmbiguousTimeError.
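[That last suggestion amounts to asking whether the claimed offset matches either reading the zone database produces for the given wall time. A rough sketch of the idea on top of pytz, with a hypothetical helper name and ValueError standing in for the PEP's proposed NonExistentTimeError:

import pytz
from datetime import datetime, timedelta

def datetime_from_offset(naive, offset, zone_name):
    # Validate a (wall time, UTC offset) pair against the zone's
    # DST and non-DST readings; return the matching aware datetime.
    tz = pytz.timezone(zone_name)
    for is_dst in (True, False):
        candidate = tz.localize(naive, is_dst=is_dst)
        if candidate.utcoffset() == offset:
            return candidate
    raise ValueError("%s never has offset %s in %s" % (naive, offset, zone_name))

# The ambiguous 2013-11-03 01:30 in Chicago resolves cleanly either way:
datetime_from_offset(datetime(2013, 11, 3, 1, 30), timedelta(hours=-5), 'America/Chicago')
datetime_from_offset(datetime(2013, 11, 3, 1, 30), timedelta(hours=-6), 'America/Chicago')]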
From ncoghlan at gmail.com Fri Apr 10 12:38:54 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 10 Apr 2015 06:38:54 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On 9 Apr 2015 23:15, "Alexander Belopolsky" wrote:
>
> Sorry for a truncated message. Please scroll past the quoted portion.
>
> On Thu, Apr 9, 2015 at 10:21 PM, Alexander Belopolsky <alexander.belopolsky at gmail.com> wrote:
>>
>> On Thu, Apr 9, 2015 at 4:51 PM, Isaac Schwabacher wrote:
>>> > > > Well, you are right, but at least we do have a localtime utility hidden in the email package:
>>> > > >
>>> > > > >>> from datetime import *
>>> > > > >>> from email.utils import localtime
>>> > > > >>> print(localtime(datetime.now()))
>>> > > > 2015-04-09 15:19:12.840000-04:00
>>> > > >
>>> > > > You can read for the reasons it did not make it into datetime.
>>> > >
>>> > > But that's restricted to the system time zone. Nothing good ever comes from the system time zone...
>>> >
>>> > Let's solve one problem at a time. ...
>>>
>>> PEP 431 proposes to import zoneinfo into the stdlib, ...
>>
>> I am changing the subject so that we can focus on one question without diverting to PEP-size issues that are better suited for python-ideas.
>>
>> I would like to add functionality to the datetime module that would solve a seemingly simple problem: given a naive datetime instance assumed to be in local time, construct the corresponding aware datetime object with tzinfo set to an appropriate fixed offset datetime.timezone instance.
>>
>> Python 3 has this functionality implemented in the email package since version 3.3, and it appears to work well even in the ambiguous hour
>>
>> >>> from email.utils import localtime
>> >>> from datetime import datetime
>> >>> localtime(datetime(2014,11,2,1,30)).strftime('%c %z %Z')
>> 'Sun Nov 2 01:30:00 2014 -0400 EDT'
>> >>> localtime(datetime(2014,11,2,1,30), isdst=0).strftime('%c %z %Z')
>> 'Sun Nov 2 01:30:00 2014 -0500 EST'
>>
>> However, in a location with a more interesting history, you can get a
>> situation that
> The uncertainty about how to deal with the repeated hour was the reason why an email.utils.localtime-like interface did not make it to the datetime module.
>
> The main objection to the isdst flag was that in most situations, determining whether DST is in effect is as hard as finding the UTC offset, so reducing the problem of finding the UTC offset to the one of finding the value for isdst does not solve much.
>
> I now realize that the problem is simply in the name of the flag. While we cannot often tell what isdst should be and in some situations the actual DST status does not differentiate between the two possible times, we can always say whether we want to get the first or the second time.
>
> In other words, instead of localtime(dt, isdst=-1), we may want localtime(dt, which=0) where "which" is used to resolve the ambiguity: "which=0" means return the first (in UTC order) of the two times and "which=1" means return the second. (In the non-ambiguous cases "which" is ignored.)

It actually took me a long time to understand that the "isdst" flag in this context related to the following chain of reasoning:

1. Due to various reasons, local time offsets relative to UTC may change, thus repeating certain subsets of local time
2. Repeated local times usually relate to winding clocks back an hour at the end of a DST period
3. "isdst=True" thus refers to "before the local time change winds the clocks back", while "isdst=False" refers to *after* the clocks are wound back

As Alexander says, you can reduce the amount of assumed knowledge needed to understand the API by focusing on the ambiguity resolution directly, without assuming that the *reason* for the ambiguity is "end of DST period".

From a pedagogical point of view, having a separate API that returned 0, 1, or 2 results for a local time lookup could thus help make it clear that local time to absolute time conversions are effectively a database lookup problem, and that timezone offset changes (whether historical or cyclical) mean that the mapping isn't 1:1 - some expressible local times never actually happen, while others happen more than once.

For the normal APIs, NonExistentTimeError would then correspond with the case where the record lookup API returned no results, while the suggested "which" index would handle the two-results case without assuming the repeated local time was specifically due to the end of a DST period.

Regards,
Nick.

From stuart at stuartbishop.net Fri Apr 10 12:59:43 2015
From: stuart at stuartbishop.net (Stuart Bishop)
Date: Fri, 10 Apr 2015 17:59:43 +0700
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References: <55256754.10204@oddbird.net>
Message-ID:

On 9 April 2015 at 04:38, Lennart Regebro wrote:
>> So is adopting pytz's expanded API into the stdlib really a big problem?
>
> Maybe, maybe not. But that API is also needlessly complicated, precisely because it's working around the limitations of datetime.tzinfo. In the PEP I remove those limitations but keep the simpler API. With a solution based on how pytz does it, I don't think that's possible.

Speaking as the pytz author and maintainer, I can categorically state that pytz's extended API sucks.

>> Is this really adequate? pytz's implementation handles far more than "is DST or not", it also correctly handles historical timezone changes. How would those be handled under this proposal?
>
> Those would still be handled. The flag is only to flag if it's DST or
> not in a timestamp that is otherwise ambiguous.

Yeah. I look forward to datetime growing an isdst flag, which if nothing else means they can round trip with the 9-tuples used by the time module, and a simple LocalTz class can be written that uses time.mktime and friends. I'd have done it myself if I remembered any C. At that point, the most disgusting parts of pytz can be torn out, giving it the standard documented API, and I'm more than happy to do that bit.

--
Stuart Bishop
http://www.stuartbishop.net/
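[For a flavour of what that LocalTz could look like: a minimal sketch that defers to the C library through time.mktime, assuming whole-hour DST and a platform whose mktime honours the tm_isdst hint. The isdst constructor argument stands in for the flag the datetime itself would carry once it grows one:

import time
from datetime import datetime, timedelta, tzinfo

class LocalTz(tzinfo):
    def __init__(self, isdst=-1):
        self._isdst = isdst  # -1 lets mktime guess; 0/1 disambiguates

    def _stamp(self, dt):
        tt = (dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second,
              dt.weekday(), 0, self._isdst)
        return time.mktime(tt)

    def utcoffset(self, dt):
        stamp = self._stamp(dt)
        # local reading minus UTC reading of the same instant
        return datetime.fromtimestamp(stamp) - datetime.utcfromtimestamp(stamp)

    def dst(self, dt):
        # crude: assumes any DST offset is exactly one hour
        isdst = time.localtime(self._stamp(dt)).tm_isdst
        return timedelta(hours=1) if isdst > 0 else timedelta(0)

    def tzname(self, dt):
        return time.tzname[time.localtime(self._stamp(dt)).tm_isdst > 0]]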
From stuart at stuartbishop.net Fri Apr 10 14:27:38 2015
From: stuart at stuartbishop.net (Stuart Bishop)
Date: Fri, 10 Apr 2015 19:27:38 +0700
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References: <7750a5d91a3002.5525afd8@wiscmail.wisc.edu> ... <76509083115717.55268789@wiscmail.wisc.edu>
Message-ID:

On 10 April 2015 at 17:12, Nick Coghlan wrote:
> On 9 Apr 2015 16:41, "Isaac Schwabacher" wrote:
>
>> I'm suggesting an internal representation, not a user interface. The only user-facing consequence of this internal representation would be exposing the offset to datetime.replace() and as a field on the object. All of the other interfaces that convert naive datetimes into aware datetimes would still use the is_dst flag as specified in the PEP. The intention is to make it easy to implement arithmetic and things like relative timedeltas.
>>
>> But looking at datetime.h, which seems to be willing to sacrifice conceptual simplicity in order to pack a datetime into as few bytes in memory as possible, it seems like whatever made that a good idea makes this a bad one. :/

For the history here (at least as I recall it), the is_dst flag was deliberately left out of the datetime class because this single bit would have bumped the pickle size by a whole byte. At the time, the primary use case was for storing timestamps in Zope2's ZODB database. Soon after, it became apparent to most people that naive timestamps and web applications don't mix, making the optimization pointless since the pickle size blew out anyway with people storing the timezone (and this is also the reason pytz.utc takes pains to be the minimum sized pickle). It's probably all rather silly in today's 64-bit world.

> The question of "store the DST flag" vs "store the offset" is essentially a data normalisation one - there's only a single bit of additional information actually needed (whether the time is DST or not in the annual hour of ambiguity), which can then be combined with the local time, the location and the zone info database to get the *actual* offset.

The dst flag must be stored in the datetime, either as a boolean or encoded in the timezone structure. If you only have the offset, you lose interoperability with the time module (and the standard IANA zoneinfo library, posix etc., which it wraps).
(dt + timedelta(hours=1)).ctime() will give an incorrect answer when you cross a DST transition.

> A question the PEP perhaps *should* consider is whether or not to offer an API allowing datetime objects to be built from a naive datetime, a fixed offset and a location, throwing NonExistentTimeError if the given date, time and offset doesn't match either the DST or non-DST times at that location.

I don't think you need a specific API, apart from being able to construct a tzinfo using nothing but an offset (lots of people require this for things like parsing email headers, which is why pytz has the FixedOffset class).

datetime.now().astimezone(FixedOffset(-1200)).astimezone(timezone('Australia/Melbourne'), is_dst=None)

> P.S. The description of NonExistentTimeError in the PEP doesn't seem quite right, as it currently says it will only be thrown if "is_dst=None", which seems like a copy and paste error from the definition of AmbiguousTimeError.

The PEP is correct here. If you explicitly specify is_dst, you know which side of the transition you are on and which offset to use. You can calculate datetime(2000, 12, 31, 2, 0, 0, 0).astimezone(foo, is_dst=False) using the non-DST offset and get an answer. It might not have ever appeared on the clock on your wall, but it is better than a punch in the face. If you want a punch in the face, is_dst=None refuses to guess and you get the exception. (Except for those cases where the timezone offset is changed without a DST transition, but that is rare enough that everyone pretends they don't exist.)

(Thanks Lennart!)
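[The round-trip hazard mentioned above is easy to demonstrate with pytz, using Melbourne's end of DST on 5 April 2015 (zone spelled the IANA way, Australia/Melbourne): datetime arithmetic never re-consults the zone, so the result must be normalized before anything downstream trusts its offset.

import pytz
from datetime import datetime, timedelta

melb = pytz.timezone('Australia/Melbourne')
t = melb.localize(datetime(2015, 4, 5, 2, 30), is_dst=True)  # the first 02:30, AEDT
later = t + timedelta(hours=1)  # the +11:00 offset is carried along untouched
print(later.strftime('%c %z %Z'))  # 03:30 +1100 AEDT -- never appeared on any wall clock
print(melb.normalize(later).strftime('%c %z %Z'))  # 02:30 +1000 AEST -- the real answer]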
So does this mean that consistent use of is_dst=False throughout the summer would allow people who refuse to accept government regulation into their timekeeping, and are willing to be out of step with their neighbors, to match their software to their wall-clock time?

From status at bugs.python.org Fri Apr 10 18:08:21 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 10 Apr 2015 18:08:21 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150410160821.3809356647@psf.upfronthosting.co.za>

ACTIVITY SUMMARY (2015-04-03 - 2015-04-10) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 4823 (-15) closed 30844 (+62) total 35667 (+47) Open issues with patches: 2238 Issues opened (38) ================== #16840: Tkinter doesn't support large integers http://bugs.python.org/issue16840 reopened by haypo #21859: Add Python implementation of FileIO http://bugs.python.org/issue21859 reopened by serhiy.storchaka #23860: Failure to check return value from lseek() in Modules/mmapmodu http://bugs.python.org/issue23860 opened by dogbert2 #23861: Make stdprinter use DebugOutputString when no stdout/stderr av http://bugs.python.org/issue23861 opened by steve.dower #23863: Fix EINTR Socket Module issues in 2.7 http://bugs.python.org/issue23863 opened by mcjeff #23864: issubclass without registration only works for "one-trick pony http://bugs.python.org/issue23864 opened by mjpieters #23865: Fix possible leaks in close methods http://bugs.python.org/issue23865 opened by serhiy.storchaka #23867: Argument Clinic: inline parsing code for 1-argument functions http://bugs.python.org/issue23867 opened by serhiy.storchaka #23868: Uninitialized objects are tracked by the garbage collector http://bugs.python.org/issue23868 opened by h.venev #23869: Initialization is being done in PyType_GenericAlloc http://bugs.python.org/issue23869 opened by h.venev #23870: pprint collections classes http://bugs.python.org/issue23870 opened by serhiy.storchaka #23874: Encrypted MSI fails to install with code 2755 http://bugs.python.org/issue23874 opened by jason.coombs #23876: Fix mkdir() call for Watcom compilers on UNIX-like platforms http://bugs.python.org/issue23876 opened by Jeffrey.Armstrong #23878: Missing sanity checks for various C library function calls...
http://bugs.python.org/issue23878 opened by dogbert2 #23880: Tkinter: getint and getdouble should support Tcl_Obj http://bugs.python.org/issue23880 opened by serhiy.storchaka #23882: unittest discovery and namespaced packages http://bugs.python.org/issue23882 opened by Florian.Apolloner #23883: __all__ lists are incomplete http://bugs.python.org/issue23883 opened by serhiy.storchaka #23885: urllib.quote horribly mishandles unicode as second parameter http://bugs.python.org/issue23885 opened by koriakin #23886: faulthandler_user should use _PyThreadState_Current http://bugs.python.org/issue23886 opened by Albert.Zeyer #23887: HTTPError doesn't have a good "repr" representation http://bugs.python.org/issue23887 opened by facundobatista #23888: Fixing fractional expiry time bug in cookiejar http://bugs.python.org/issue23888 opened by ssh #23889: Speedup inspect.Signature.bind http://bugs.python.org/issue23889 opened by yselivanov #23890: assertRaises increases reference counter http://bugs.python.org/issue23890 opened by Vjacheslav.Fyodorov #23891: Tutorial doesn't mention either pip or virtualenv http://bugs.python.org/issue23891 opened by akuchling #23892: Introduce sys.implementation.opt_levels http://bugs.python.org/issue23892 opened by brett.cannon #23893: Forward-port future_builtins http://bugs.python.org/issue23893 opened by brett.cannon #23894: lib2to3 doesn't recognize rb'...' as a raw byte string in Pyth http://bugs.python.org/issue23894 opened by eli.bendersky #23895: PATCH: python socket module fails to build on Solaris when -zi http://bugs.python.org/issue23895 opened by andy_js #23896: lib2to3 doesn't provide a grammar where exec is a function http://bugs.python.org/issue23896 opened by eli.bendersky #23897: Update Python 3 extension module porting guide http://bugs.python.org/issue23897 opened by ncoghlan #23898: inspect() changes in Python3.4 are not compatible with objects http://bugs.python.org/issue23898 opened by zzzeek #23899: HTTP regression in distutils uploads to chishop http://bugs.python.org/issue23899 opened by jason.coombs #23900: Add a default docstring to Enum subclasses http://bugs.python.org/issue23900 opened by ncoghlan #23901: Force console stdout to use UTF8 on Windows http://bugs.python.org/issue23901 opened by paul.moore #23902: let exception react to being raised or the setting of magic pr http://bugs.python.org/issue23902 opened by abathur #23903: Generate PC/python3.def by scraping headers http://bugs.python.org/issue23903 opened by zach.ware #23904: pathlib.PurePath does not accept bytes components http://bugs.python.org/issue23904 opened by bru #23906: poplib maxline behaviour may be wrong http://bugs.python.org/issue23906 opened by gnarvaja Most recent 15 issues with no replies (15) ========================================== #23906: poplib maxline behaviour may be wrong http://bugs.python.org/issue23906 #23904: pathlib.PurePath does not accept bytes components http://bugs.python.org/issue23904 #23903: Generate PC/python3.def by scraping headers http://bugs.python.org/issue23903 #23900: Add a default docstring to Enum subclasses http://bugs.python.org/issue23900 #23899: HTTP regression in distutils uploads to chishop http://bugs.python.org/issue23899 #23897: Update Python 3 extension module porting guide http://bugs.python.org/issue23897 #23896: lib2to3 doesn't provide a grammar where exec is a function http://bugs.python.org/issue23896 #23895: PATCH: python socket module fails to build on Solaris when -zi http://bugs.python.org/issue23895 #23894: 
lib2to3 doesn't recognize rb'...' as a raw byte string in Pyth http://bugs.python.org/issue23894 #23892: Introduce sys.implementation.opt_levels http://bugs.python.org/issue23892 #23891: Tutorial doesn't mention either pip or virtualenv http://bugs.python.org/issue23891 #23889: Speedup inspect.Signature.bind http://bugs.python.org/issue23889 #23888: Fixing fractional expiry time bug in cookiejar http://bugs.python.org/issue23888 #23886: faulthandler_user should use _PyThreadState_Current http://bugs.python.org/issue23886 #23876: Fix mkdir() call for Watcom compilers on UNIX-like platforms http://bugs.python.org/issue23876 Most recent 15 issues waiting for review (15) ============================================= #23903: Generate PC/python3.def by scraping headers http://bugs.python.org/issue23903 #23898: inspect() changes in Python3.4 are not compatible with objects http://bugs.python.org/issue23898 #23895: PATCH: python socket module fails to build on Solaris when -zi http://bugs.python.org/issue23895 #23893: Forward-port future_builtins http://bugs.python.org/issue23893 #23888: Fixing fractional expiry time bug in cookiejar http://bugs.python.org/issue23888 #23887: HTTPError doesn't have a good "repr" representation http://bugs.python.org/issue23887 #23880: Tkinter: getint and getdouble should support Tcl_Obj http://bugs.python.org/issue23880 #23878: Missing sanity checks for various C library function calls... http://bugs.python.org/issue23878 #23876: Fix mkdir() call for Watcom compilers on UNIX-like platforms http://bugs.python.org/issue23876 #23870: pprint collections classes http://bugs.python.org/issue23870 #23867: Argument Clinic: inline parsing code for 1-argument functions http://bugs.python.org/issue23867 #23865: Fix possible leaks in close methods http://bugs.python.org/issue23865 #23863: Fix EINTR Socket Module issues in 2.7 http://bugs.python.org/issue23863 #23860: Failure to check return value from lseek() in Modules/mmapmodu http://bugs.python.org/issue23860 #23857: Make default HTTPS certificate verification setting configurab http://bugs.python.org/issue23857 Top 10 most discussed issues (10) ================================= #23857: Make default HTTPS certificate verification setting configurab http://bugs.python.org/issue23857 28 msgs #23496: Steps for Android Native Build of Python 3.4.2 http://bugs.python.org/issue23496 27 msgs #23863: Fix EINTR Socket Module issues in 2.7 http://bugs.python.org/issue23863 12 msgs #15443: datetime module has no support for nanoseconds http://bugs.python.org/issue15443 8 msgs #21859: Add Python implementation of FileIO http://bugs.python.org/issue21859 8 msgs #23893: Forward-port future_builtins http://bugs.python.org/issue23893 8 msgs #23852: Wrong computation of max_fd on OpenBSD http://bugs.python.org/issue23852 7 msgs #23898: inspect() changes in Python3.4 are not compatible with objects http://bugs.python.org/issue23898 7 msgs #23883: __all__ lists are incomplete http://bugs.python.org/issue23883 5 msgs #6818: remove/delete method for zipfile/tarfile objects http://bugs.python.org/issue6818 4 msgs Issues closed (56) ================== #2174: xml.sax.xmlreader does not support the InputSource protocol http://bugs.python.org/issue2174 closed by fdrake #2276: distutils out-of-date for runtime_library_dirs flag on OS X http://bugs.python.org/issue2276 closed by r.david.murray #3566: httplib persistent connections violate MUST in RFC2616 sec 8.1 http://bugs.python.org/issue3566 closed by r.david.murray #6650: sre_parse contains a 
confusing generic error message
     http://bugs.python.org/issue6650  closed by serhiy.storchaka

#7511: msvc9compiler.py: ValueError when trying to compile with VC Ex
     http://bugs.python.org/issue7511  closed by zach.ware

#8841: getopt errors should be specialized
     http://bugs.python.org/issue8841  closed by r.david.murray

#9248: multiprocessing.pool: Proposal: "waitforslot"
     http://bugs.python.org/issue9248  closed by r.david.murray

#10590: Parameter type error for xml.sax.parseString(string, ...)
     http://bugs.python.org/issue10590  closed by serhiy.storchaka

#10838: subprocess __all__ is incomplete
     http://bugs.python.org/issue10838  closed by gregory.p.smith

#12082: Python/import.c still references fstat even with DONT_HAVE_FST
     http://bugs.python.org/issue12082  closed by r.david.murray

#13238: Add shell command helpers to subprocess module
     http://bugs.python.org/issue13238  closed by r.david.murray

#14812: Change file associations to not be a default installer feature
     http://bugs.python.org/issue14812  closed by r.david.murray

#15011: Change Scripts to bin on Windows
     http://bugs.python.org/issue15011  closed by r.david.murray

#15133: tkinter.BooleanVar.get() behavior and docstring disagree
     http://bugs.python.org/issue15133  closed by serhiy.storchaka

#15582: Enhance inspect.getdoc to follow inheritance chains
     http://bugs.python.org/issue15582  closed by serhiy.storchaka

#18544: subprocess.Popen support for redirection of arbitrary file des
     http://bugs.python.org/issue18544  closed by r.david.murray

#18965: 2to3 can produce illegal bytes literals
     http://bugs.python.org/issue18965  closed by serhiy.storchaka

#19247: Describe surrogateescape algorithm in the Library Reference
     http://bugs.python.org/issue19247  closed by ncoghlan

#19738: pytime.c loses precision under Windows
     http://bugs.python.org/issue19738  closed by haypo

#20611: socket.create_connection() doesn't handle EINTR properly
     http://bugs.python.org/issue20611  closed by gregory.p.smith

#20660: Starting a second multiprocessing.Manager causes INCREF on all
     http://bugs.python.org/issue20660  closed by r.david.murray

#22144: ellipsis needs better display in lexer documentation
     http://bugs.python.org/issue22144  closed by r.david.murray

#22649: Use _PyUnicodeWriter in case_operation()
     http://bugs.python.org/issue22649  closed by haypo

#22721: pprint output for sets and dicts is not stable
     http://bugs.python.org/issue22721  closed by serhiy.storchaka

#23021: Get rid of references to PyString in Modules/
     http://bugs.python.org/issue23021  closed by berker.peksag

#23025: ssl.RAND_bytes docs should mention os.urandom
     http://bugs.python.org/issue23025  closed by berker.peksag

#23027: test_warnings fails with -Werror
     http://bugs.python.org/issue23027  closed by berker.peksag

#23062: test_argparse --version test cases
     http://bugs.python.org/issue23062  closed by berker.peksag

#23199: libpython27.a in amd64 release is 32-bit
     http://bugs.python.org/issue23199  closed by zach.ware

#23338: PyErr_Format in ctypes uses invalid parameter
     http://bugs.python.org/issue23338  closed by serhiy.storchaka

#23400: Inconsistent behaviour of multiprocessing.Queue() if sem_open
     http://bugs.python.org/issue23400  closed by berker.peksag

#23411: Update urllib.parse.__all__
     http://bugs.python.org/issue23411  closed by serhiy.storchaka

#23459: Linux: expose the new execveat() syscall
     http://bugs.python.org/issue23459  closed by haypo

#23492: Argument Clinic: improve generated parser for 1-argument funct
     http://bugs.python.org/issue23492  closed by serhiy.storchaka

#23500: Argument Clinic: multiple macro definition
     http://bugs.python.org/issue23500  closed by larry

#23501: Argument Clinic: generate code into separate files by default
     http://bugs.python.org/issue23501  closed by serhiy.storchaka

#23570: Change "with subprocess.Popen():" (context manager) to ignore
     http://bugs.python.org/issue23570  closed by haypo

#23772: pymonotonic() gone backward on "AMD64 Debian root 3.x" buildbo
     http://bugs.python.org/issue23772  closed by haypo

#23799: Join started threads in tests
     http://bugs.python.org/issue23799  closed by serhiy.storchaka

#23807: Improved test coverage for calendar.py command line
     http://bugs.python.org/issue23807  closed by Thana Annis

#23825: test_idle fails under -OO
     http://bugs.python.org/issue23825  closed by serhiy.storchaka

#23841: py34 OrderedDict is using weakref for root reference
     http://bugs.python.org/issue23841  closed by rhettinger

#23849: Leaks in test_deque
     http://bugs.python.org/issue23849  closed by rhettinger

#23853: PEP 475: handle EINTR in the ssl module
     http://bugs.python.org/issue23853  closed by haypo

#23856: build.bat -e shouldn't also build
     http://bugs.python.org/issue23856  closed by tim.golden

#23862: subprocess popen arguments using double quotes
     http://bugs.python.org/issue23862  closed by paul.moore

#23866: array module broken
     http://bugs.python.org/issue23866  closed by serhiy.storchaka

#23871: turning itertools.{repeat,count} into indexable iterables
     http://bugs.python.org/issue23871  closed by r.david.murray

#23872: Typo in response in smtpd
     http://bugs.python.org/issue23872  closed by python-dev

#23873: Removal of dead code in smtpd
     http://bugs.python.org/issue23873  closed by python-dev

#23875: Incorrect argument parsing in _ssl
     http://bugs.python.org/issue23875  closed by python-dev

#23877: Build fails when threads are disabled during configure step
     http://bugs.python.org/issue23877  closed by python-dev

#23879: asyncio: SelectorEventLoop.sock_connect() must not retry conne
     http://bugs.python.org/issue23879  closed by haypo

#23881: test_urllib2net: issues with ftp://gatekeeper.research.compaq.
     http://bugs.python.org/issue23881  closed by haypo

#23884: New DateType for argparse (like FileType but for dates)
     http://bugs.python.org/issue23884  closed by petedmarsh

#23905: Python here making CPU go to 100%
     http://bugs.python.org/issue23905  closed by r.david.murray

From ischwabacher at wisc.edu  Fri Apr 10 18:43:06 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Fri, 10 Apr 2015 11:43:06 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7530ba90d0977.5527fd74@wiscmail.wisc.edu>
References: <7530ba90d0977.5527fd74@wiscmail.wisc.edu>
Message-ID: <7790a931d1c08.5527b74a@wiscmail.wisc.edu>

On 15-04-10, Stuart Bishop wrote:
>
> On 10 April 2015 at 17:12, Nick Coghlan wrote:
>
> > The question of "store the DST flag" vs "store the offset" is
> > essentially a data normalisation one - there's only a single bit of
> > additional information actually needed (whether the time is DST or not
> > in the annual hour of ambiguity), which can then be combined with the
> > local time, the location and the zone info database to get the
> > *actual* offset.

One thing that hasn't been mentioned is that is_dst and offset are not
parallel on the datetime.time class, since you have no information about
when in the history of the time zone a time was recorded.  If you only
record the time zone and is_dst flag, then updating zoneinfo can change
the UTC time corresponding to existing aware time objects.  Whereas if
you only store offsets, it can change whether a time is interpreted as
being DST or not (or whether the offset is considered valid at all).
Which is mostly to say that aware time objects that don't also store
dates are probably just a bad idea in general.

> The dst flag must be stored in the datetime, either as a boolean or
> encoded in the timezone structure.  If you only have the offset, you
> lose interoperability with the time module (and the standard IANA
> zoneinfo library, posix etc., which it wraps).
> (dt + timedelta(hours=1)).ctime() will give an incorrect answer when
> you cross a DST transition.

From local time + offset you can compute UTC time, and from there you can
look up in the tzinfo whether it's DST or not.  But yes, I'm starting to
be less enamored of this idea.  I get that it's basically the same as
doing everything in UTC and caching local time, but I wasn't thinking
about the fact that so many existing APIs need (local time, is_dst) that
the friction is a problem.

> > A question the PEP perhaps *should* consider is whether or not to
> > offer an API allowing datetime objects to be built from a naive
> > datetime, a fixed offset and a location, throwing NonExistentTimeError
> > if the given date, time and offset doesn't match either the DST or
> > non-DST times at that location.
>
> I don't think you need a specific API, apart from being able to
> construct a tzinfo using nothing but an offset (lots of people require
> this for things like parsing email headers, which is why pytz has the
> FixedOffset class).
>
> datetime.now().astimezone(FixedOffset(-1200)).astimezone(
>     timezone('Australia/Melbourne', is_dst=None))

This doesn't work:

```
>>> from datetime import *
>>> datetime.now().astimezone(timezone(-timedelta(hours=2)))
ValueError: astimezone() cannot be applied to a naive datetime
```

But also, how is this different from needing to know the offset in order
to construct an aware datetime?
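For concreteness, the fixed-offset half of this already exists in the
stdlib: since Python 3.2, datetime.timezone is a tzinfo built from
nothing but a timedelta, much like pytz's FixedOffset (which, if memory
serves, takes minutes east of UTC, so FixedOffset(-720) roughly matches
the sketch below).  A minimal sketch:

    from datetime import datetime, timedelta, timezone

    # A fixed-offset tzinfo built from nothing but an offset.
    offset = timezone(timedelta(hours=-12))

    # Attaching it at construction time yields an aware datetime,
    # so astimezone() no longer raises ValueError:
    aware = datetime(2015, 4, 10, 18, 43, tzinfo=offset)
    print(aware.astimezone(timezone.utc))  # 2015-04-11 06:43:00+00:00
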
I know that you can do datetime.now(tz), and you can do
datetime(2013, 11, 3, 1, 30, tzinfo=zoneinfo('America/Chicago')), but not
being able to add a time zone to an existing naive datetime is painful
(and strptime doesn't even let you pass in a time zone).  Some of the
people who designed the instruments that we use to collect data
understood why we would care when it was collected, and some of them
didn't, and I need to be able to handle their data regardless.  (The
makers of one device with subsecond resolution opted for maximum
compatibility with Microsoft Excel by writing a CSV with times as naive
days since Dec. 30, 1899 to five decimal digits, but I doubt there's
anything to be done for them.)

In addition, it would be nice to be able to say that going back and forth
between aware and naive datetimes is evil and you should never do it, but
it's a necessity if you want to be able to implement relative timedeltas
in some form.  Unless the stdlib wants to step up with a be-all, end-all
implementation of "this time tomorrow" and friends, there shouldn't be
unnecessary friction here.

> > P.S. The description of NonExistentTimeError in the PEP doesn't seem
> > quite right, as it currently says it will only be thrown if
> > "is_dst=None", which seems like a copy and paste error from the
> > definition of AmbiguousTimeError.
>
> The PEP is correct here. If you explicitly specify is_dst, you know
> which side of the transition you are on and which offset to use. You
> can calculate datetime(2000, 12, 31, 2, 0, 0, 0).astimezone(foo,
> is_dst=False) using the non-DST offset and get an answer. It might not
> have ever appeared on the clock on your wall, but it is better than a
> punch in the face. If you want a punch in the face, is_dst=None
> refuses to guess and you get the exception.

Returning a DST time precisely when is_dst=False is passed isn't a punch
in the face?  I mean, I get what you're saying, and I agree that "given
that this time is skipped, which offset do I want to interpret it in?" is
the right question to ask, but this is horribly counterintuitive to
anyone who hasn't spent a lot of time pondering it.  If nonexistent times
are going to behave that way, the API needs a better solution to Naming
Things.

> (Except for those cases where the timezone offset is changed without a
> DST transition, but that is rare enough that everyone pretends they
> don't exist)

As far as method parameters go, you might as well just say the first time
is DST and the second is STD, and explain in the docs that "is_dst" is
just a mnemonic.  But if you have an is_dst flag on your datetimes,
suddenly this is an issue.

ijs
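Decoding the Excel-style serials Isaac mentions is at least mechanical.
A hypothetical sketch (1899-12-30 is Excel's day zero; the result is, of
course, still naive):

    from datetime import datetime, timedelta

    EXCEL_EPOCH = datetime(1899, 12, 30)  # Excel's day zero

    def from_excel(serial):
        # five decimal digits of a day is roughly 0.9 s resolution
        return EXCEL_EPOCH + timedelta(days=serial)

    print(from_excel(42104.77611))  # about 2015-04-10 18:37:35.904, naive
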
"isdst=True" thus refers to "before the local time change winds the > clocks back", while "isdst=False" refers to *after* the clocks are wound > back > As Alexander says, you can reduce the amount of assumed knowledge needed to > understand the API by focusing on the ambiguity resolution directly without > assuming that the *reason* for the ambiguity is "end of DST period". > This is an excellent summary of my original post. (It is in fact better than the post itself which I therefore did not include in the quote.) For the mathematically inclined, I can reformulate the problem as follows. For any given geographical location, loc, and a moment in time t expressed as UTC time, one can tell what time was shown on a "local clock-tower." This defines a function wall(loc, t). This function is a piece-wise linear function which may have regular or irregular discontinuities. Because of these discontinuities, an equation wall(loc, t) = lt may have 0, 1 or 2 solutions. The DST switchovers are an example of regular discontinuities. In most locations, they follow a somewhat predictable pattern with two discontinuities per year. Irregular discontinuities happen in locations with activist governments and don't follow any general rules. For most world locations past discontinuities are fairly well documented for at least a century and future changes are published with at least 6 months lead time. >From a pedagogical point of view, having a separate API that returned 0, 1, > or 2 results for a local time lookup could thus help make it clear that > local time to absolute time conversions are effectively a database lookup > problem, and that timezone offset changes (whether historical or cyclical) > mean that the mapping isn't 1:1 - some expressible local times never > actually happen, while others happen more than once. > The downside of this API is that naively written code is prone to crashes. Someone unaware of the invalid local times and not caring about the choice between ambiguities may write code like t = utc_times_from_local(lt)[0] which may work fine for many years before someone gets an IndexError and a backtrace in her server log. > For the normal APIs, NonExistentTimeError would then correspond with the > case where the record lookup API returned no results, while the suggested > "which" index would handle the two results case without assuming the > repeated local time was specifically due to the end of a DST period. > The NonExistentTimeError has a similar problem as an API returning an empty list. Seeing NonExistentTimeError in a server log is not a big improvement over seeing an IndexError. Moreover, a program that rejects invalid times on input, but stores them for a long time may see its database silently corrupted after a zoneinfo update. Now it is time to make specific proposal. I would like to extend datetime.astimezone() method to work on naive datetime instances. Such instances will be assumed to be in local time and discontinuities will be handled as follows: 1. wall(t) == lt has a single solution. This is the trivial case and lt.astimezone(utc) and lt.astimezone(utc, which=i) for i=0,1 should return that solution. 2. wall(t) == lt has two solutions t1 and t2 such that t1 < t2. In this case lt.astimezone(utc) == lt.astimezone(utc, which=0) == t1 and lt.astimezone(utc, which=1) == t2. 3. wall(t) == lt has no solution. This happens when there is UTC time t0 such that wall(t0) < lt and wall(t0+epsilon) > lt (a positive discontinuity at time t0). 
   In this case lt.astimezone(utc) should return t0 + lt - wall(t0),
   i.e., we ignore the discontinuity and extend wall(t) linearly past t0.
   Obviously, in this case the invariant wall(lt.astimezone(utc)) == lt
   won't hold.  The "which" flag should be handled as follows:
   lt.astimezone(utc) == lt.astimezone(utc, which=0), and
   lt.astimezone(utc, which=1) == t0 + lt - wall(t0+eps).

With the proposed features in place, one can use the naive code

    t = lt.astimezone(utc)

and get predictable behavior in all cases and no crashes.  A more
sophisticated program can be written like this:

    t1 = lt.astimezone(utc, which=0)
    t2 = lt.astimezone(utc, which=1)
    if t1 == t2:
        t = t1
    elif t2 > t1:
        # ask the user to pick between t1 and t2, or
        # raise AmbiguousLocalTimeError
    else:
        t = t1
        # warn the user that the time was invalid and changed, or
        # raise InvalidLocalTimeError
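To make the three cases and the "which" flag concrete, here is a rough,
hypothetical toy model.  Nothing below is stdlib API: the zone, the
offsets and utc_from_local() are all invented, with utc_from_local()
standing in for the proposed astimezone(utc, which=...):

    from datetime import datetime, timedelta

    T0 = datetime(2015, 3, 29, 1, 0)  # transition instant, in UTC
    OFFSETS = (timedelta(hours=1), timedelta(hours=2))  # before / after T0

    def wall(t):
        # Wall-clock reading for UTC time t (positive discontinuity at T0).
        return t + (OFFSETS[1] if t >= T0 else OFFSETS[0])

    def utc_from_local(lt, which=0):
        # Solve wall(t) == lt, resolving folds and gaps as proposed.
        solutions = []
        for off in OFFSETS:
            t = lt - off            # candidate UTC time under this offset
            if wall(t) == lt:       # keep it only if it maps back to lt
                solutions.append(t)
        if solutions:               # one solution, or two during a fold
            return sorted(solutions)[min(which, len(solutions) - 1)]
        return lt - OFFSETS[which]  # gap: extend wall() linearly past T0

    # 02:30 local never happens (clocks jump from 02:00 to 03:00):
    print(utc_from_local(datetime(2015, 3, 29, 2, 30)))           # 01:30 UTC
    print(utc_from_local(datetime(2015, 3, 29, 2, 30), which=1))  # 00:30 UTC

Note how the gap branch reproduces t0 + lt - wall(t0) for which=0 and
t0 + lt - wall(t0+eps) for which=1, so the "sophisticated program" above
lands in its else branch exactly when lt was an invalid local time.
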
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From alexander.belopolsky at gmail.com  Sat Apr 11 02:29:42 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Fri, 10 Apr 2015 20:29:42 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7790a931d1c08.5527b74a@wiscmail.wisc.edu>
References: <7790a931d1c08.5527b74a@wiscmail.wisc.edu>
Message-ID:

On Fri, Apr 10, 2015 at 12:43 PM, Isaac Schwabacher wrote:

> (and strptime doesn't even let you pass in a time zone)

Not true:

>>> datetime.strptime('-0400 EDT', '%z %Z')
datetime.datetime(1900, 1, 1, 0, 0,
tzinfo=datetime.timezone(datetime.timedelta(-1, 72000), 'EDT'))

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From stuart at stuartbishop.net  Sat Apr 11 07:48:27 2015
From: stuart at stuartbishop.net (Stuart Bishop)
Date: Sat, 11 Apr 2015 12:48:27 +0700
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <5527F2E4.5010805@g.nevcal.com>
References: <5527F2E4.5010805@g.nevcal.com>
Message-ID:

On 10 April 2015 at 22:57, Glenn Linderman wrote:

> On 4/10/2015 5:27 AM, Stuart Bishop wrote:
>
> > P.S.
> > The description of NonExistentTimeError in the PEP doesn't seem quite
> > right, as it currently says it will only be thrown if "is_dst=None",
> > which seems like a copy and paste error from the definition of
> > AmbiguousTimeError.
>
> The PEP is correct here. If you explicitly specify is_dst, you know
> which side of the transition you are on and which offset to use. You
> can calculate datetime(2000, 12, 31, 2, 0, 0, 0).astimezone(foo,
> is_dst=False) using the non-DST offset and get an answer. It might not
> have ever appeared on the clock on your wall, but it is better than a
> punch in the face. If you want a punch in the face, is_dst=None
> refuses to guess and you get the exception.
>
> So does this mean that consistent use of is_dst=False throughout the
> summer would allow people that refuse to accept government regulation
> into their timekeeping, and are willing to be out of step with their
> neighbors, to match their software to their wall-clock time?

No, this logic is only done when necessary.  The goal is that, by
default, you can convert any naive timestamp into any timezone without
raising an exception.  The vast majority of callsites do not require
perfect accuracy, but just want a good enough result they can display or
log.  They do not want to worry about Heisenbugs where, if their
application is run in a timezone with DST transitions, it fails for one
hour per year in the middle of the night.  If you do need accuracy, you
explicitly override the default arguments and take responsibility for
handling the exceptions.

--
Stuart Bishop
http://www.stuartbishop.net/
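Stuart's description maps directly onto pytz's existing localize() API.
A short sketch, assuming pytz is installed (2015-11-01 01:30 happens
twice in US/Eastern):

    from datetime import datetime
    import pytz

    eastern = pytz.timezone('US/Eastern')
    dt = datetime(2015, 11, 1, 1, 30)  # occurs twice: clocks fall back

    print(eastern.localize(dt, is_dst=True))   # 2015-11-01 01:30:00-04:00
    print(eastern.localize(dt, is_dst=False))  # 2015-11-01 01:30:00-05:00
    try:
        eastern.localize(dt, is_dst=None)      # refuse to guess
    except pytz.exceptions.AmbiguousTimeError as err:
        print('ambiguous:', err)
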
From regebro at gmail.com  Sat Apr 11 17:00:32 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Sat, 11 Apr 2015 17:00:32 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

I've booked an open space for 3pm in 512CG. Be there or be too naive!

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org  Sat Apr 11 23:38:17 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 11 Apr 2015 17:38:17 -0400
Subject: [Python-Dev] Tracker user registration problem
Message-ID:

I am sitting next to Christie Wilson (bobcatfish), who is trying to
register a user account for our issue tracker. However, it doesn't seem
to work -- the promised email never arrives. Who can help with this?

--
--Guido van Rossum (python.org/~guido)

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From benjamin at python.org  Sat Apr 11 23:39:35 2015
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 11 Apr 2015 17:39:35 -0400
Subject: [Python-Dev] Tracker user registration problem
In-Reply-To:
References:
Message-ID: <1428788375.4050095.252170001.6A567456@webmail.messagingengine.com>

Did you check spam?

On Sat, Apr 11, 2015, at 17:38, Guido van Rossum wrote:
> I am sitting next to Christie Wilson (bobcatfish) who is trying to
> register a user account for our issue tracker. However it doesn't seem
> to work -- the promised email never arrives. Who can help with this?

From guido at python.org  Sat Apr 11 23:48:55 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 11 Apr 2015 17:48:55 -0400
Subject: [Python-Dev] Tracker user registration problem
In-Reply-To: <1428788375.4050095.252170001.6A567456@webmail.messagingengine.com>
References: <1428788375.4050095.252170001.6A567456@webmail.messagingengine.com>
Message-ID:

It's not in spam, and asking for a password reset email claims that the
email is not registered.

On Sat, Apr 11, 2015 at 5:39 PM, Benjamin Peterson wrote:
> Did you check spam?

--
--Guido van Rossum (python.org/~guido)

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org  Sat Apr 11 23:49:25 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 11 Apr 2015 17:49:25 -0400
Subject: [Python-Dev] Tracker user registration problem
In-Reply-To:
References: <1428788375.4050095.252170001.6A567456@webmail.messagingengine.com>
Message-ID:

Sorry. It was in spam.

On Sat, Apr 11, 2015 at 5:48 PM, Guido van Rossum wrote:
> It's not in spam, and asking for a password reset email claims that the
> email is not registered.

--
--Guido van Rossum (python.org/~guido)

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From steve at holdenweb.com  Sun Apr 12 22:13:57 2015
From: steve at holdenweb.com (Steve Holden)
Date: Sun, 12 Apr 2015 16:13:57 -0400
Subject: [Python-Dev] PyCon Survivor's breakfast
Message-ID: <5CC31D9E-C077-4994-9663-A8C8B68EE371@holdenweb.com>

Hi everybody,

This year sees the third PyCon Survivors' Breakfast at Suite 701, a
restaurant attached to the Place d'Armes hotel just up the road from the
Palais de Congres (map on the Eventbrite page).

All Python developers, PSF members and PyCon delegates are cordially
invited to attend. The event starts at 7:30, but you can roll in whenever
you feel like breakfast (I start early so that those who wish to start
sprinting promptly can still join us).

The places cost about $45 a head (I've been asked about costs; I don't
really know why), and you have a choice of prices, including free. I
don't want anyone to feel that inability to pay need stop them coming
along to join us. Last year's event was a relaxing change from the Palais
de Congres. I hope I see you there.

Sign up at http://bit.ly/PyConSurvivor2015.

Numbers are limited, but last year there was a place for everyone who
wanted to attend. If you are at PyCon, or know someone who is, please let
them know that this invitation extends to everyone who attended.

regards
Steve
--
Steve Holden steve at holdenweb.com / +1 571 484 6266 / +44 113 320 2335 / @holdenweb

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From steve at holdenweb.com  Sun Apr 12 23:19:58 2015
From: steve at holdenweb.com (Steve Holden)
Date: Sun, 12 Apr 2015 17:19:58 -0400
Subject: [Python-Dev] [PSF-Members] PyCon Survivor's breakfast
In-Reply-To: <0916C392-E0B5-4268-8D95-D95F839E7125@wirtel.be>
References: <5CC31D9E-C077-4994-9663-A8C8B68EE371@holdenweb.com>
 <0916C392-E0B5-4268-8D95-D95F839E7125@wirtel.be>
Message-ID:

Indeed. I believe that the date is on the Eventbrite page, but sorry for
omitting obviously necessary information. It's tomorrow. Hope to see you
there.

S

On Apr 12, 2015, at 4:49 PM, Stephane Wirtel wrote:

> 7h30 in the morning? Tomorrow
>
> On 12 avr. 2015, at 4:13 PM, Steve Holden wrote:
>
>> Hi everybody,
>>
>> This year sees the third PyCon Survivors' Breakfast at Suite 701, a
>> restaurant attached to the Place d'Armes hotel just up the road from
>> the Palais de Congres (map on Eventbrite page).
>> [...]
>> Numbers are limited, but last year there was a place for everyone who
>> wanted to attend. If you are at PyCon, or know someone who is, please
>> let them know that this invitation extends to everyone who attended.
>>
>> regards
>> Steve

--
Steve Holden steve at holdenweb.com / +1 571 484 6266 / +44 113 320 2335 / @holdenweb

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From regebro at gmail.com  Mon Apr 13 15:55:08 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Mon, 13 Apr 2015 09:55:08 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

The Open Space was good, and the conclusion was that solution #2 indeed
seems to be the right one. We also concluded that the datetime()
constructor itself likely needs to grow an is_dst flag. There was no
further insight into whether to store an offset or an is_dst flag as an
attribute; I think that will clear up during the implementation.

//Lennart

From alexander.belopolsky at gmail.com  Mon Apr 13 19:45:19 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Mon, 13 Apr 2015 13:45:19 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On Mon, Apr 13, 2015 at 1:24 PM, Chris Barker wrote:

> > Because of these discontinuities, an equation wall(loc, t) = lt may
> > have 0, 1 or 2 solutions.
>
> This is where I'm confused -- I can see how going from "wall" time
> ("local" time, etc) to UTC has 0, 1, or 2 solutions:
>
> One solution most of the time
>
> Zero solutions when we "spring forward" -- i.e. there is no 2:30 am on
> March 8, 2015 in the US timezones that use DST
>
> Two solutions when we "fall back", i.e. there are two 2:30 am
> Nov 1, 2015 in the US timezones that use DST
>
> But I can't see where there are multiple solutions the other way
> around -- doesn't a given UTC time map to one and only one "wall time"
> in a given timezone?
>
> Am I wrong, or is this a semantic question as to what "wall" time means?
You are right about what wall() means, but I should have been more
explicit about knowns and unknowns in the wall(loc, t) = lt equation.

In that equation I considered loc (the geographical place) and lt (the
time on the clock tower) to be known, and t (the universal (UTC) time) to
be unknown.  A solution to the equation is the value of the unknown (t)
given the values of the knowns (loc and lt).

The rest of your exposition is correct, including "a given UTC time maps
to one and only one 'wall time' in a given timezone."  However, different
UTC times may map to the same wall time, and some expressible wall times
are not results of a map of any UTC time.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From chris.barker at noaa.gov  Mon Apr 13 20:05:07 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 13 Apr 2015 11:05:07 -0700
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On Mon, Apr 13, 2015 at 10:45 AM, Alexander Belopolsky
<alexander.belopolsky at gmail.com> wrote:

> > Am I wrong, or is this a semantic question as to what "wall" time
> > means?
>
> You are right about what wall() means, but I should have been more
> explicit about knowns and unknowns in the wall(loc, t) = lt equation.
>
> In that equation I considered loc (the geographical place) and lt (the
> time on the clock tower) to be known and t (the universal (UTC) time)
> to be unknown.  A solution to the equation is the value of the unknown
> (t) given the values of the knowns (loc and lt).
>
> The rest of your exposition is correct including "a given UTC time maps
> to one and only one 'wall time' in a given timezone."  However,
> different UTC times may map to the same wall time and some expressible
> wall times are not results of a map of any UTC time.

got it. I suggest you perhaps word it something like:

    wall_time = f( location, utc_time)

and

    utc_time = f( location, utc_time )

These are two different problems, and one is much harder than the other!
(though both are ugly!)

you can, of course, shorten the names to "wall" and "utc" and "loc" if
you like, but I kind of like long, readable names..

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From chris.barker at noaa.gov  Mon Apr 13 19:24:43 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 13 Apr 2015 10:24:43 -0700
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

Sorry to be brain dead here, but I'm a bit lost:

On Fri, Apr 10, 2015 at 4:32 PM, Alexander Belopolsky
<alexander.belopolsky at gmail.com> wrote:

> For any given geographical location, loc, and a moment in time t
> expressed as UTC time, one can tell what time was shown on a "local
> clock-tower."  This defines a function wall(loc, t).  This function is
> a piece-wise linear function which may have regular or irregular
> discontinuities.

got it.

> Because of these discontinuities, an equation wall(loc, t) = lt may
> have 0, 1 or 2 solutions.

This is where I'm confused -- I can see how going from "wall" time
("local" time, etc) to UTC has 0, 1, or 2 solutions:

One solution most of the time

Zero solutions when we "spring forward" -- i.e. there is no 2:30 am on
March 8, 2015 in the US timezones that use DST

Two solutions when we "fall back", i.e. there are two 2:30 am Nov 1, 2015
in the US timezones that use DST

But I can't see where there are multiple solutions the other way
around -- doesn't a given UTC time map to one and only one "wall time" in
a given timezone?

Am I wrong, or is this a semantic question as to what "wall" time means?

Thanks for any clarification,

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From alexander.belopolsky at gmail.com  Mon Apr 13 21:14:23 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Mon, 13 Apr 2015 15:14:23 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On Mon, Apr 13, 2015 at 2:05 PM, Chris Barker wrote:

> > However, different UTC times may map to the same wall time and some
> > expressible wall times are not results of a map of any UTC time.
>
> got it. I suggest you perhaps word it something like:
>
>     wall_time = f( location, utc_time)
>
> and
>
>     utc_time = f( location, utc_time )
>
> These are two different problems, and one is much harder than the
> other! (though both are ugly!)

You probably meant "utc_time = f( location, wall_time)" in the last
equation, but that would still be wrong.  A somewhat more correct
equation would be

    utc_time = f^(-1)( location, wall_time)

where f^(-1) is the inverse function of f, but since f is not monotonic,
no such inverse exists.  Finding the inverse of f is the same as solving
the equation f(x) = y for any given y.  If f is such that this equation
has only one solution for all possible values of y, then an inverse
exists, but this is not so in our case.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
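For a concrete illustration of why no inverse exists, here is a tiny,
hypothetical toy model (not real tzdata; the transition instant matches
US/Eastern's 2015 fall-back, and wall() is the f above with the location
argument fixed):

    from datetime import datetime, timedelta

    FALL_BACK = datetime(2015, 11, 1, 6, 0)  # 02:00 EDT == 06:00 UTC

    def wall(t):
        # toy US/Eastern: UTC-4 before the transition, UTC-5 after
        return t - timedelta(hours=4 if t < FALL_BACK else 5)

    a = datetime(2015, 11, 1, 5, 30)  # maps to 01:30 (EDT)
    b = datetime(2015, 11, 1, 6, 30)  # also maps to 01:30 (EST)
    assert wall(a) == wall(b)         # two UTC times, one wall time:
                                      # f is not injective, so no inverse
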
From python at mrabarnett.plus.com  Mon Apr 13 22:10:22 2015
From: python at mrabarnett.plus.com (MRAB)
Date: Mon, 13 Apr 2015 21:10:22 +0100
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID: <552C22AE.1020704@mrabarnett.plus.com>

On 2015-04-13 20:14, Alexander Belopolsky wrote:

> A somewhat more correct equation would be
>
>     utc_time = f^(-1)( location, wall_time)
>
> where f^(-1) is the inverse function of f, but since f is not
> monotonic, no such inverse exists.

Don't you mean "f⁻¹"? :-)

> Finding the inverse of f is the same as solving the equation f(x) = y
> for any given y.  If f is such that this equation has only one solution
> for all possible values of y, then an inverse exists, but this is not
> so in our case.

From chris.barker at noaa.gov  Mon Apr 13 23:12:02 2015
From: chris.barker at noaa.gov (Chris Barker)
Date: Mon, 13 Apr 2015 14:12:02 -0700
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References:
Message-ID:

On Mon, Apr 13, 2015 at 12:14 PM, Alexander Belopolsky
<alexander.belopolsky at gmail.com> wrote:

> You probably meant "utc_time = f( location, wall_time)" in the last
> equation,

oops, yes.

> but that would still be wrong.
>
> A somewhat more correct equation would be
>
>     utc_time = f^(-1)( location, wall_time)

In this case I meant "f" as "a function of", so the two fs were not
intended to be the same.

Yes, one is the inverse of the other, and in this case the inverse is not
definable (at least not uniquely). I have no doubt you understand all
this (better than I do); I'm just suggesting that in the discussion we
find a way to be as clear as possible as to which function is being
discussed when.

But anyway -- thanks all for hashing this out -- getting something
reasonable into datetime will be very nice.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From zachary.ware+pydev at gmail.com  Mon Apr 13 23:28:24 2015
From: zachary.ware+pydev at gmail.com (Zachary Ware)
Date: Mon, 13 Apr 2015 16:28:24 -0500
Subject: [Python-Dev] Python3 Stable ABI
Message-ID:

In issue23903, I've created a script that will produce PC/python3.def by
scraping the header files in Include.  There are many, many discrepancies
between what my script generates and what is currently in the repository
(diff below), but in every case I've checked, the script has been right:
what the script finds is actually exported as part of the limited API,
but due to not being in the .def file it's not actually exported from
python3.dll.  Almost all of the differences are things that the script
found that weren't present, but there are a couple of things going the
other way.

The point of this message is to ask everybody who maintains anything in C
to take a look through and make sure everything in their area is properly
guarded (or not) by Py_LIMITED_API.  Alternately, if somebody can find a
bug in my script and brain that's finding too much stuff, that would be
great too.

Ideally, after this is all settled I'd like to add the script to both the
Makefile and the Windows build system, such that PC/python3.def is always
kept up to date and flags changes that weren't meant to be made.

Regards,
--
Zach

(I'm afraid Gmail might mangle this beyond recognition; you can find the
diff at http://bugs.python.org/review/23903/diff/14549/PC/python3.def if
it does.)

diff -r 24f2c0279120 PC/python3.def
--- a/PC/python3.def	Mon Apr 13 15:51:59 2015 -0500
+++ b/PC/python3.def	Mon Apr 13 16:10:34 2015 -0500
@@ -1,13 +1,15 @@
 ; This file specifies the import forwarding for python3.dll
 ; It is used when building python3dll.vcxproj
+; Generated by python3defgen.py, DO NOT modify directly!
LIBRARY "python3" EXPORTS + PyAST_FromNode=python35.PyAST_FromNode + PyAST_FromNodeObject=python35.PyAST_FromNodeObject + PyAST_Validate=python35.PyAST_Validate PyArg_Parse=python35.PyArg_Parse PyArg_ParseTuple=python35.PyArg_ParseTuple PyArg_ParseTupleAndKeywords=python35.PyArg_ParseTupleAndKeywords PyArg_UnpackTuple=python35.PyArg_UnpackTuple - PyArg_VaParse=python35.PyArg_VaParse - PyArg_VaParseTupleAndKeywords=python35.PyArg_VaParseTupleAndKeywords PyArg_ValidateKeywordArguments=python35.PyArg_ValidateKeywordArguments PyBaseObject_Type=python35.PyBaseObject_Type DATA PyBool_FromLong=python35.PyBool_FromLong @@ -39,7 +41,6 @@ PyCFunction_GetFlags=python35.PyCFunction_GetFlags PyCFunction_GetFunction=python35.PyCFunction_GetFunction PyCFunction_GetSelf=python35.PyCFunction_GetSelf - PyCFunction_New=python35.PyCFunction_New PyCFunction_NewEx=python35.PyCFunction_NewEx PyCFunction_Type=python35.PyCFunction_Type DATA PyCallIter_New=python35.PyCallIter_New @@ -58,6 +59,7 @@ PyCapsule_SetPointer=python35.PyCapsule_SetPointer PyCapsule_Type=python35.PyCapsule_Type DATA PyClassMethodDescr_Type=python35.PyClassMethodDescr_Type DATA + PyCmpWrapper_Type=python35.PyCmpWrapper_Type DATA PyCodec_BackslashReplaceErrors=python35.PyCodec_BackslashReplaceErrors PyCodec_Decode=python35.PyCodec_Decode PyCodec_Decoder=python35.PyCodec_Decoder @@ -68,6 +70,7 @@ PyCodec_IncrementalEncoder=python35.PyCodec_IncrementalEncoder PyCodec_KnownEncoding=python35.PyCodec_KnownEncoding PyCodec_LookupError=python35.PyCodec_LookupError + PyCodec_NameReplaceErrors=python35.PyCodec_NameReplaceErrors PyCodec_Register=python35.PyCodec_Register PyCodec_RegisterError=python35.PyCodec_RegisterError PyCodec_ReplaceErrors=python35.PyCodec_ReplaceErrors @@ -122,6 +125,7 @@ PyErr_Fetch=python35.PyErr_Fetch PyErr_Format=python35.PyErr_Format PyErr_FormatV=python35.PyErr_FormatV + PyErr_GetExcInfo=python35.PyErr_GetExcInfo PyErr_GivenExceptionMatches=python35.PyErr_GivenExceptionMatches PyErr_NewException=python35.PyErr_NewException PyErr_NewExceptionWithDoc=python35.PyErr_NewExceptionWithDoc @@ -132,14 +136,25 @@ PyErr_PrintEx=python35.PyErr_PrintEx PyErr_ProgramText=python35.PyErr_ProgramText PyErr_Restore=python35.PyErr_Restore + PyErr_SetExcFromWindowsErr=python35.PyErr_SetExcFromWindowsErr + PyErr_SetExcFromWindowsErrWithFilename=python35.PyErr_SetExcFromWindowsErrWithFilename + PyErr_SetExcFromWindowsErrWithFilenameObject=python35.PyErr_SetExcFromWindowsErrWithFilenameObject + PyErr_SetExcFromWindowsErrWithFilenameObjects=python35.PyErr_SetExcFromWindowsErrWithFilenameObjects + PyErr_SetExcInfo=python35.PyErr_SetExcInfo + PyErr_SetExcWithArgsKwargs=python35.PyErr_SetExcWithArgsKwargs PyErr_SetFromErrno=python35.PyErr_SetFromErrno PyErr_SetFromErrnoWithFilename=python35.PyErr_SetFromErrnoWithFilename PyErr_SetFromErrnoWithFilenameObject=python35.PyErr_SetFromErrnoWithFilenameObject + PyErr_SetFromErrnoWithFilenameObjects=python35.PyErr_SetFromErrnoWithFilenameObjects + PyErr_SetFromWindowsErr=python35.PyErr_SetFromWindowsErr + PyErr_SetFromWindowsErrWithFilename=python35.PyErr_SetFromWindowsErrWithFilename + PyErr_SetImportError=python35.PyErr_SetImportError PyErr_SetInterrupt=python35.PyErr_SetInterrupt PyErr_SetNone=python35.PyErr_SetNone PyErr_SetObject=python35.PyErr_SetObject PyErr_SetString=python35.PyErr_SetString PyErr_SyntaxLocation=python35.PyErr_SyntaxLocation + PyErr_SyntaxLocationEx=python35.PyErr_SyntaxLocationEx PyErr_WarnEx=python35.PyErr_WarnEx PyErr_WarnExplicit=python35.PyErr_WarnExplicit 
PyErr_WarnFormat=python35.PyErr_WarnFormat @@ -171,12 +186,21 @@ PyExc_AssertionError=python35.PyExc_AssertionError DATA PyExc_AttributeError=python35.PyExc_AttributeError DATA PyExc_BaseException=python35.PyExc_BaseException DATA + PyExc_BlockingIOError=python35.PyExc_BlockingIOError DATA + PyExc_BrokenPipeError=python35.PyExc_BrokenPipeError DATA PyExc_BufferError=python35.PyExc_BufferError DATA PyExc_BytesWarning=python35.PyExc_BytesWarning DATA + PyExc_ChildProcessError=python35.PyExc_ChildProcessError DATA + PyExc_ConnectionAbortedError=python35.PyExc_ConnectionAbortedError DATA + PyExc_ConnectionError=python35.PyExc_ConnectionError DATA + PyExc_ConnectionRefusedError=python35.PyExc_ConnectionRefusedError DATA + PyExc_ConnectionResetError=python35.PyExc_ConnectionResetError DATA PyExc_DeprecationWarning=python35.PyExc_DeprecationWarning DATA PyExc_EOFError=python35.PyExc_EOFError DATA PyExc_EnvironmentError=python35.PyExc_EnvironmentError DATA PyExc_Exception=python35.PyExc_Exception DATA + PyExc_FileExistsError=python35.PyExc_FileExistsError DATA + PyExc_FileNotFoundError=python35.PyExc_FileNotFoundError DATA PyExc_FloatingPointError=python35.PyExc_FloatingPointError DATA PyExc_FutureWarning=python35.PyExc_FutureWarning DATA PyExc_GeneratorExit=python35.PyExc_GeneratorExit DATA @@ -185,18 +209,23 @@ PyExc_ImportWarning=python35.PyExc_ImportWarning DATA PyExc_IndentationError=python35.PyExc_IndentationError DATA PyExc_IndexError=python35.PyExc_IndexError DATA + PyExc_InterruptedError=python35.PyExc_InterruptedError DATA + PyExc_IsADirectoryError=python35.PyExc_IsADirectoryError DATA PyExc_KeyError=python35.PyExc_KeyError DATA PyExc_KeyboardInterrupt=python35.PyExc_KeyboardInterrupt DATA PyExc_LookupError=python35.PyExc_LookupError DATA PyExc_MemoryError=python35.PyExc_MemoryError DATA - PyExc_MemoryErrorInst=python35.PyExc_MemoryErrorInst DATA PyExc_NameError=python35.PyExc_NameError DATA + PyExc_NotADirectoryError=python35.PyExc_NotADirectoryError DATA PyExc_NotImplementedError=python35.PyExc_NotImplementedError DATA PyExc_OSError=python35.PyExc_OSError DATA PyExc_OverflowError=python35.PyExc_OverflowError DATA PyExc_PendingDeprecationWarning=python35.PyExc_PendingDeprecationWarning DATA + PyExc_PermissionError=python35.PyExc_PermissionError DATA + PyExc_ProcessLookupError=python35.PyExc_ProcessLookupError DATA PyExc_RecursionErrorInst=python35.PyExc_RecursionErrorInst DATA PyExc_ReferenceError=python35.PyExc_ReferenceError DATA + PyExc_ResourceWarning=python35.PyExc_ResourceWarning DATA PyExc_RuntimeError=python35.PyExc_RuntimeError DATA PyExc_RuntimeWarning=python35.PyExc_RuntimeWarning DATA PyExc_StopIteration=python35.PyExc_StopIteration DATA @@ -205,6 +234,7 @@ PyExc_SystemError=python35.PyExc_SystemError DATA PyExc_SystemExit=python35.PyExc_SystemExit DATA PyExc_TabError=python35.PyExc_TabError DATA + PyExc_TimeoutError=python35.PyExc_TimeoutError DATA PyExc_TypeError=python35.PyExc_TypeError DATA PyExc_UnboundLocalError=python35.PyExc_UnboundLocalError DATA PyExc_UnicodeDecodeError=python35.PyExc_UnicodeDecodeError DATA @@ -215,6 +245,7 @@ PyExc_UserWarning=python35.PyExc_UserWarning DATA PyExc_ValueError=python35.PyExc_ValueError DATA PyExc_Warning=python35.PyExc_Warning DATA + PyExc_WindowsError=python35.PyExc_WindowsError DATA PyExc_ZeroDivisionError=python35.PyExc_ZeroDivisionError DATA PyException_GetCause=python35.PyException_GetCause PyException_GetContext=python35.PyException_GetContext @@ -242,10 +273,12 @@ PyGILState_Release=python35.PyGILState_Release 
PyGetSetDescr_Type=python35.PyGetSetDescr_Type DATA PyImport_AddModule=python35.PyImport_AddModule + PyImport_AddModuleObject=python35.PyImport_AddModuleObject PyImport_AppendInittab=python35.PyImport_AppendInittab PyImport_Cleanup=python35.PyImport_Cleanup PyImport_ExecCodeModule=python35.PyImport_ExecCodeModule PyImport_ExecCodeModuleEx=python35.PyImport_ExecCodeModuleEx + PyImport_ExecCodeModuleObject=python35.PyImport_ExecCodeModuleObject PyImport_ExecCodeModuleWithPathnames=python35.PyImport_ExecCodeModuleWithPathnames PyImport_GetImporter=python35.PyImport_GetImporter PyImport_GetMagicNumber=python35.PyImport_GetMagicNumber @@ -253,8 +286,10 @@ PyImport_GetModuleDict=python35.PyImport_GetModuleDict PyImport_Import=python35.PyImport_Import PyImport_ImportFrozenModule=python35.PyImport_ImportFrozenModule + PyImport_ImportFrozenModuleObject=python35.PyImport_ImportFrozenModuleObject PyImport_ImportModule=python35.PyImport_ImportModule PyImport_ImportModuleLevel=python35.PyImport_ImportModuleLevel + PyImport_ImportModuleLevelObject=python35.PyImport_ImportModuleLevelObject PyImport_ImportModuleNoBlock=python35.PyImport_ImportModuleNoBlock PyImport_ReloadModule=python35.PyImport_ReloadModule PyInterpreterState_Clear=python35.PyInterpreterState_Clear @@ -310,10 +345,18 @@ PyMapping_SetItemString=python35.PyMapping_SetItemString PyMapping_Size=python35.PyMapping_Size PyMapping_Values=python35.PyMapping_Values + PyMarshal_ReadObjectFromString=python35.PyMarshal_ReadObjectFromString + PyMarshal_WriteLongToFile=python35.PyMarshal_WriteLongToFile + PyMarshal_WriteObjectToFile=python35.PyMarshal_WriteObjectToFile + PyMarshal_WriteObjectToString=python35.PyMarshal_WriteObjectToString + PyMem_Calloc=python35.PyMem_Calloc PyMem_Free=python35.PyMem_Free PyMem_Malloc=python35.PyMem_Malloc PyMem_Realloc=python35.PyMem_Realloc PyMemberDescr_Type=python35.PyMemberDescr_Type DATA + PyMember_GetOne=python35.PyMember_GetOne + PyMember_SetOne=python35.PyMember_SetOne + PyMemoryView_FromMemory=python35.PyMemoryView_FromMemory PyMemoryView_FromObject=python35.PyMemoryView_FromObject PyMemoryView_GetContiguous=python35.PyMemoryView_GetContiguous PyMemoryView_Type=python35.PyMemoryView_Type DATA @@ -327,9 +370,15 @@ PyModule_GetFilename=python35.PyModule_GetFilename PyModule_GetFilenameObject=python35.PyModule_GetFilenameObject PyModule_GetName=python35.PyModule_GetName + PyModule_GetNameObject=python35.PyModule_GetNameObject PyModule_GetState=python35.PyModule_GetState PyModule_New=python35.PyModule_New + PyModule_NewObject=python35.PyModule_NewObject PyModule_Type=python35.PyModule_Type DATA + PyNode_AddChild=python35.PyNode_AddChild + PyNode_Free=python35.PyNode_Free + PyNode_ListTree=python35.PyNode_ListTree + PyNode_New=python35.PyNode_New PyNullImporter_Type=python35.PyNullImporter_Type DATA PyNumber_Absolute=python35.PyNumber_Absolute PyNumber_Add=python35.PyNumber_Add @@ -343,6 +392,7 @@ PyNumber_InPlaceAnd=python35.PyNumber_InPlaceAnd PyNumber_InPlaceFloorDivide=python35.PyNumber_InPlaceFloorDivide PyNumber_InPlaceLshift=python35.PyNumber_InPlaceLshift + PyNumber_InPlaceMatrixMultiply=python35.PyNumber_InPlaceMatrixMultiply PyNumber_InPlaceMultiply=python35.PyNumber_InPlaceMultiply PyNumber_InPlaceOr=python35.PyNumber_InPlaceOr PyNumber_InPlacePower=python35.PyNumber_InPlacePower @@ -355,6 +405,7 @@ PyNumber_Invert=python35.PyNumber_Invert PyNumber_Long=python35.PyNumber_Long PyNumber_Lshift=python35.PyNumber_Lshift + PyNumber_MatrixMultiply=python35.PyNumber_MatrixMultiply 
PyNumber_Multiply=python35.PyNumber_Multiply PyNumber_Negative=python35.PyNumber_Negative PyNumber_Or=python35.PyNumber_Or @@ -367,6 +418,7 @@ PyNumber_TrueDivide=python35.PyNumber_TrueDivide PyNumber_Xor=python35.PyNumber_Xor PyOS_AfterFork=python35.PyOS_AfterFork + PyOS_CheckStack=python35.PyOS_CheckStack PyOS_InitInterrupts=python35.PyOS_InitInterrupts PyOS_InputHook=python35.PyOS_InputHook DATA PyOS_InterruptOccurred=python35.PyOS_InterruptOccurred @@ -393,6 +445,7 @@ PyObject_CallMethod=python35.PyObject_CallMethod PyObject_CallMethodObjArgs=python35.PyObject_CallMethodObjArgs PyObject_CallObject=python35.PyObject_CallObject + PyObject_Calloc=python35.PyObject_Calloc PyObject_CheckReadBuffer=python35.PyObject_CheckReadBuffer PyObject_ClearWeakRefs=python35.PyObject_ClearWeakRefs PyObject_DelItem=python35.PyObject_DelItem @@ -405,6 +458,7 @@ PyObject_GC_UnTrack=python35.PyObject_GC_UnTrack PyObject_GenericGetAttr=python35.PyObject_GenericGetAttr PyObject_GenericSetAttr=python35.PyObject_GenericSetAttr + PyObject_GenericSetDict=python35.PyObject_GenericSetDict PyObject_GetAttr=python35.PyObject_GetAttr PyObject_GetAttrString=python35.PyObject_GetAttrString PyObject_GetItem=python35.PyObject_GetItem @@ -431,9 +485,10 @@ PyObject_SetItem=python35.PyObject_SetItem PyObject_Size=python35.PyObject_Size PyObject_Str=python35.PyObject_Str - PyObject_Type=python35.PyObject_Type DATA + PyObject_Type=python35.PyObject_Type PyParser_SimpleParseFileFlags=python35.PyParser_SimpleParseFileFlags PyParser_SimpleParseStringFlags=python35.PyParser_SimpleParseStringFlags + PyParser_SimpleParseStringFlagsFilename=python35.PyParser_SimpleParseStringFlagsFilename PyProperty_Type=python35.PyProperty_Type DATA PyRangeIter_Type=python35.PyRangeIter_Type DATA PyRange_Type=python35.PyRange_Type DATA @@ -474,8 +529,8 @@ PySlice_New=python35.PySlice_New PySlice_Type=python35.PySlice_Type DATA PySortWrapper_Type=python35.PySortWrapper_Type DATA + PyState_AddModule=python35.PyState_AddModule PyState_FindModule=python35.PyState_FindModule - PyState_AddModule=python35.PyState_AddModule PyState_RemoveModule=python35.PyState_RemoveModule PyStructSequence_GetItem=python35.PyStructSequence_GetItem PyStructSequence_New=python35.PyStructSequence_New @@ -484,9 +539,11 @@ PySuper_Type=python35.PySuper_Type DATA PySys_AddWarnOption=python35.PySys_AddWarnOption PySys_AddWarnOptionUnicode=python35.PySys_AddWarnOptionUnicode + PySys_AddXOption=python35.PySys_AddXOption PySys_FormatStderr=python35.PySys_FormatStderr PySys_FormatStdout=python35.PySys_FormatStdout PySys_GetObject=python35.PySys_GetObject + PySys_GetXOptions=python35.PySys_GetXOptions PySys_HasWarnOptions=python35.PySys_HasWarnOptions PySys_ResetWarnOptions=python35.PySys_ResetWarnOptions PySys_SetArgv=python35.PySys_SetArgv @@ -503,6 +560,24 @@ PyThreadState_New=python35.PyThreadState_New PyThreadState_SetAsyncExc=python35.PyThreadState_SetAsyncExc PyThreadState_Swap=python35.PyThreadState_Swap + PyThread_GetInfo=python35.PyThread_GetInfo + PyThread_ReInitTLS=python35.PyThread_ReInitTLS + PyThread_acquire_lock=python35.PyThread_acquire_lock + PyThread_acquire_lock_timed=python35.PyThread_acquire_lock_timed + PyThread_allocate_lock=python35.PyThread_allocate_lock + PyThread_create_key=python35.PyThread_create_key + PyThread_delete_key=python35.PyThread_delete_key + PyThread_delete_key_value=python35.PyThread_delete_key_value + PyThread_exit_thread=python35.PyThread_exit_thread + PyThread_free_lock=python35.PyThread_free_lock + 
PyThread_get_key_value=python35.PyThread_get_key_value + PyThread_get_stacksize=python35.PyThread_get_stacksize + PyThread_get_thread_ident=python35.PyThread_get_thread_ident + PyThread_init_thread=python35.PyThread_init_thread + PyThread_release_lock=python35.PyThread_release_lock + PyThread_set_key_value=python35.PyThread_set_key_value + PyThread_set_stacksize=python35.PyThread_set_stacksize + PyThread_start_new_thread=python35.PyThread_start_new_thread PyTraceBack_Here=python35.PyTraceBack_Here PyTraceBack_Print=python35.PyTraceBack_Print PyTraceBack_Type=python35.PyTraceBack_Type DATA @@ -561,34 +636,51 @@ PyUnicode_AsEncodedString=python35.PyUnicode_AsEncodedString PyUnicode_AsEncodedUnicode=python35.PyUnicode_AsEncodedUnicode PyUnicode_AsLatin1String=python35.PyUnicode_AsLatin1String + PyUnicode_AsMBCSString=python35.PyUnicode_AsMBCSString PyUnicode_AsRawUnicodeEscapeString=python35.PyUnicode_AsRawUnicodeEscapeString + PyUnicode_AsUCS4=python35.PyUnicode_AsUCS4 + PyUnicode_AsUCS4Copy=python35.PyUnicode_AsUCS4Copy PyUnicode_AsUTF16String=python35.PyUnicode_AsUTF16String PyUnicode_AsUTF32String=python35.PyUnicode_AsUTF32String PyUnicode_AsUTF8String=python35.PyUnicode_AsUTF8String PyUnicode_AsUnicodeEscapeString=python35.PyUnicode_AsUnicodeEscapeString PyUnicode_AsWideChar=python35.PyUnicode_AsWideChar - PyUnicode_ClearFreelist=python35.PyUnicode_ClearFreelist + PyUnicode_AsWideCharString=python35.PyUnicode_AsWideCharString + PyUnicode_BuildEncodingMap=python35.PyUnicode_BuildEncodingMap + PyUnicode_ClearFreeList=python35.PyUnicode_ClearFreeList PyUnicode_Compare=python35.PyUnicode_Compare + PyUnicode_CompareWithASCIIString=python35.PyUnicode_CompareWithASCIIString PyUnicode_Concat=python35.PyUnicode_Concat PyUnicode_Contains=python35.PyUnicode_Contains PyUnicode_Count=python35.PyUnicode_Count PyUnicode_Decode=python35.PyUnicode_Decode PyUnicode_DecodeASCII=python35.PyUnicode_DecodeASCII PyUnicode_DecodeCharmap=python35.PyUnicode_DecodeCharmap + PyUnicode_DecodeCodePageStateful=python35.PyUnicode_DecodeCodePageStateful PyUnicode_DecodeFSDefault=python35.PyUnicode_DecodeFSDefault PyUnicode_DecodeFSDefaultAndSize=python35.PyUnicode_DecodeFSDefaultAndSize PyUnicode_DecodeLatin1=python35.PyUnicode_DecodeLatin1 + PyUnicode_DecodeLocale=python35.PyUnicode_DecodeLocale + PyUnicode_DecodeLocaleAndSize=python35.PyUnicode_DecodeLocaleAndSize + PyUnicode_DecodeMBCS=python35.PyUnicode_DecodeMBCS + PyUnicode_DecodeMBCSStateful=python35.PyUnicode_DecodeMBCSStateful PyUnicode_DecodeRawUnicodeEscape=python35.PyUnicode_DecodeRawUnicodeEscape PyUnicode_DecodeUTF16=python35.PyUnicode_DecodeUTF16 PyUnicode_DecodeUTF16Stateful=python35.PyUnicode_DecodeUTF16Stateful PyUnicode_DecodeUTF32=python35.PyUnicode_DecodeUTF32 PyUnicode_DecodeUTF32Stateful=python35.PyUnicode_DecodeUTF32Stateful + PyUnicode_DecodeUTF7=python35.PyUnicode_DecodeUTF7 + PyUnicode_DecodeUTF7Stateful=python35.PyUnicode_DecodeUTF7Stateful PyUnicode_DecodeUTF8=python35.PyUnicode_DecodeUTF8 PyUnicode_DecodeUTF8Stateful=python35.PyUnicode_DecodeUTF8Stateful PyUnicode_DecodeUnicodeEscape=python35.PyUnicode_DecodeUnicodeEscape + PyUnicode_EncodeCodePage=python35.PyUnicode_EncodeCodePage + PyUnicode_EncodeFSDefault=python35.PyUnicode_EncodeFSDefault + PyUnicode_EncodeLocale=python35.PyUnicode_EncodeLocale PyUnicode_FSConverter=python35.PyUnicode_FSConverter PyUnicode_FSDecoder=python35.PyUnicode_FSDecoder PyUnicode_Find=python35.PyUnicode_Find + PyUnicode_FindChar=python35.PyUnicode_FindChar PyUnicode_Format=python35.PyUnicode_Format 
PyUnicode_FromEncodedObject=python35.PyUnicode_FromEncodedObject PyUnicode_FromFormat=python35.PyUnicode_FromFormat @@ -599,30 +691,28 @@ PyUnicode_FromStringAndSize=python35.PyUnicode_FromStringAndSize PyUnicode_FromWideChar=python35.PyUnicode_FromWideChar PyUnicode_GetDefaultEncoding=python35.PyUnicode_GetDefaultEncoding + PyUnicode_GetLength=python35.PyUnicode_GetLength PyUnicode_GetSize=python35.PyUnicode_GetSize + PyUnicode_InternFromString=python35.PyUnicode_InternFromString + PyUnicode_InternImmortal=python35.PyUnicode_InternImmortal + PyUnicode_InternInPlace=python35.PyUnicode_InternInPlace PyUnicode_IsIdentifier=python35.PyUnicode_IsIdentifier PyUnicode_Join=python35.PyUnicode_Join PyUnicode_Partition=python35.PyUnicode_Partition PyUnicode_RPartition=python35.PyUnicode_RPartition PyUnicode_RSplit=python35.PyUnicode_RSplit + PyUnicode_ReadChar=python35.PyUnicode_ReadChar PyUnicode_Replace=python35.PyUnicode_Replace PyUnicode_Resize=python35.PyUnicode_Resize PyUnicode_RichCompare=python35.PyUnicode_RichCompare - PyUnicode_SetDefaultEncoding=python35.PyUnicode_SetDefaultEncoding PyUnicode_Split=python35.PyUnicode_Split PyUnicode_Splitlines=python35.PyUnicode_Splitlines + PyUnicode_Substring=python35.PyUnicode_Substring PyUnicode_Tailmatch=python35.PyUnicode_Tailmatch PyUnicode_Translate=python35.PyUnicode_Translate - PyUnicode_BuildEncodingMap=python35.PyUnicode_BuildEncodingMap - PyUnicode_CompareWithASCIIString=python35.PyUnicode_CompareWithASCIIString - PyUnicode_DecodeUTF7=python35.PyUnicode_DecodeUTF7 - PyUnicode_DecodeUTF7Stateful=python35.PyUnicode_DecodeUTF7Stateful - PyUnicode_EncodeFSDefault=python35.PyUnicode_EncodeFSDefault - PyUnicode_InternFromString=python35.PyUnicode_InternFromString - PyUnicode_InternImmortal=python35.PyUnicode_InternImmortal - PyUnicode_InternInPlace=python35.PyUnicode_InternInPlace PyUnicode_Type=python35.PyUnicode_Type DATA - PyWeakref_GetObject=python35.PyWeakref_GetObject DATA + PyUnicode_WriteChar=python35.PyUnicode_WriteChar + PyWeakref_GetObject=python35.PyWeakref_GetObject PyWeakref_NewProxy=python35.PyWeakref_NewProxy PyWeakref_NewRef=python35.PyWeakref_NewRef PyWrapperDescr_Type=python35.PyWrapperDescr_Type DATA @@ -633,6 +723,8 @@ Py_BuildValue=python35.Py_BuildValue Py_CompileString=python35.Py_CompileString Py_DecRef=python35.Py_DecRef + Py_DecodeLocale=python35.Py_DecodeLocale + Py_EncodeLocale=python35.Py_EncodeLocale Py_EndInterpreter=python35.Py_EndInterpreter Py_Exit=python35.Py_Exit Py_FatalError=python35.Py_FatalError @@ -660,44 +752,95 @@ Py_NewInterpreter=python35.Py_NewInterpreter Py_ReprEnter=python35.Py_ReprEnter Py_ReprLeave=python35.Py_ReprLeave + Py_SetPath=python35.Py_SetPath Py_SetProgramName=python35.Py_SetProgramName Py_SetPythonHome=python35.Py_SetPythonHome Py_SetRecursionLimit=python35.Py_SetRecursionLimit Py_SymtableString=python35.Py_SymtableString Py_VaBuildValue=python35.Py_VaBuildValue + Py_hexdigits=python35.Py_hexdigits DATA + _PyDebug_PrintTotalRefs=python35._PyDebug_PrintTotalRefs + _PyDict_Dummy=python35._PyDict_Dummy + _PyDict_GetItemId=python35._PyDict_GetItemId + _PyDict_GetItemIdWithError=python35._PyDict_GetItemIdWithError + _PyDict_SetItemId=python35._PyDict_SetItemId _PyErr_BadInternalCall=python35._PyErr_BadInternalCall + _PyEval_FiniThreads=python35._PyEval_FiniThreads + _PyGILState_Reinit=python35._PyGILState_Reinit + _PyImportZip_Init=python35._PyImportZip_Init + _PyMethodWrapper_Type=python35._PyMethodWrapper_Type DATA + _PyNamespace_New=python35._PyNamespace_New + 
_PyNamespace_Type=python35._PyNamespace_Type DATA + _PyNone_Type=python35._PyNone_Type DATA + _PyNotImplemented_Type=python35._PyNotImplemented_Type DATA + _PyOS_GetOpt=python35._PyOS_GetOpt + _PyOS_IsMainThread=python35._PyOS_IsMainThread + _PyOS_SigintEvent=python35._PyOS_SigintEvent _PyObject_CallFunction_SizeT=python35._PyObject_CallFunction_SizeT + _PyObject_CallMethodId=python35._PyObject_CallMethodId + _PyObject_CallMethodIdObjArgs=python35._PyObject_CallMethodIdObjArgs + _PyObject_CallMethodId_SizeT=python35._PyObject_CallMethodId_SizeT _PyObject_CallMethod_SizeT=python35._PyObject_CallMethod_SizeT + _PyObject_GC_Calloc=python35._PyObject_GC_Calloc _PyObject_GC_Malloc=python35._PyObject_GC_Malloc _PyObject_GC_New=python35._PyObject_GC_New _PyObject_GC_NewVar=python35._PyObject_GC_NewVar _PyObject_GC_Resize=python35._PyObject_GC_Resize + _PyObject_GetAttrId=python35._PyObject_GetAttrId + _PyObject_HasAttrId=python35._PyObject_HasAttrId + _PyObject_IsAbstract=python35._PyObject_IsAbstract _PyObject_New=python35._PyObject_New _PyObject_NewVar=python35._PyObject_NewVar + _PyObject_SetAttrId=python35._PyObject_SetAttrId _PyState_AddModule=python35._PyState_AddModule + _PySys_SetObjectId=python35._PySys_SetObjectId + _PyThreadState_DeleteExcept=python35._PyThreadState_DeleteExcept _PyThreadState_Init=python35._PyThreadState_Init _PyThreadState_Prealloc=python35._PyThreadState_Prealloc _PyTrash_delete_later=python35._PyTrash_delete_later DATA _PyTrash_delete_nesting=python35._PyTrash_delete_nesting DATA _PyTrash_deposit_object=python35._PyTrash_deposit_object _PyTrash_destroy_chain=python35._PyTrash_destroy_chain + _PyTrash_thread_deposit_object=python35._PyTrash_thread_deposit_object + _PyTrash_thread_destroy_chain=python35._PyTrash_thread_destroy_chain + _PyUnicode_ClearStaticStrings=python35._PyUnicode_ClearStaticStrings + _PyUnicode_FromId=python35._PyUnicode_FromId _PyWeakref_CallableProxyType=python35._PyWeakref_CallableProxyType DATA _PyWeakref_ProxyType=python35._PyWeakref_ProxyType DATA _PyWeakref_RefType=python35._PyWeakref_RefType DATA + _Py_AddToAllObjects=python35._Py_AddToAllObjects _Py_BuildValue_SizeT=python35._Py_BuildValue_SizeT _Py_CheckRecursionLimit=python35._Py_CheckRecursionLimit DATA _Py_CheckRecursiveCall=python35._Py_CheckRecursiveCall _Py_Dealloc=python35._Py_Dealloc + _Py_DumpTraceback=python35._Py_DumpTraceback DATA + _Py_DumpTracebackThreads=python35._Py_DumpTracebackThreads DATA _Py_EllipsisObject=python35._Py_EllipsisObject DATA _Py_FalseStruct=python35._Py_FalseStruct DATA + _Py_ForgetReference=python35._Py_ForgetReference + _Py_GetAllocatedBlocks=python35._Py_GetAllocatedBlocks + _Py_GetRefTotal=python35._Py_GetRefTotal + _Py_HashSecret_Initialized=python35._Py_HashSecret_Initialized DATA + _Py_NegativeRefcount=python35._Py_NegativeRefcount + _Py_NewReference=python35._Py_NewReference _Py_NoneStruct=python35._Py_NoneStruct DATA _Py_NotImplementedStruct=python35._Py_NotImplementedStruct DATA + _Py_PrintReferenceAddresses=python35._Py_PrintReferenceAddresses + _Py_PrintReferences=python35._Py_PrintReferences + _Py_RefTotal=python35._Py_RefTotal DATA _Py_SwappedOp=python35._Py_SwappedOp DATA - _Py_TrueStruct=python35._Py_TrueStruct DATA _Py_VaBuildValue_SizeT=python35._Py_VaBuildValue_SizeT - _PyArg_Parse_SizeT=python35._PyArg_Parse_SizeT - _PyArg_ParseTuple_SizeT=python35._PyArg_ParseTuple_SizeT - _PyArg_ParseTupleAndKeywords_SizeT=python35._PyArg_ParseTupleAndKeywords_SizeT - _PyArg_VaParse_SizeT=python35._PyArg_VaParse_SizeT - 
_PyArg_VaParseTupleAndKeywords_SizeT=python35._PyArg_VaParseTupleAndKeywords_SizeT - _Py_BuildValue_SizeT=python35._Py_BuildValue_SizeT + _Py_add_one_to_index_C=python35._Py_add_one_to_index_C + _Py_add_one_to_index_F=python35._Py_add_one_to_index_F + _Py_device_encoding=python35._Py_device_encoding + _Py_fopen=python35._Py_fopen + _Py_fopen_obj=python35._Py_fopen_obj + _Py_read=python35._Py_read + _Py_stat=python35._Py_stat + _Py_wfopen=python35._Py_wfopen + _Py_wgetcwd=python35._Py_wgetcwd + _Py_wreadlink=python35._Py_wreadlink + _Py_wrealpath=python35._Py_wrealpath + _Py_write=python35._Py_write + _Py_write_noraise=python35._Py_write_noraise From eric at trueblade.com Tue Apr 14 19:40:40 2015 From: eric at trueblade.com (Eric V. Smith) Date: Tue, 14 Apr 2015 13:40:40 -0400 Subject: [Python-Dev] Keyword-only parameters Message-ID: <552D5118.80703@trueblade.com> I'm working on adding a numeric_owner parameter to some tarfile methods (http://bugs.python.org/issue23193), In a review, Berker suggested making the parameter keyword-only. I agree that you'd likely never want to pass just "True", but that "numeric_owner=True" would be a better usage. But, I don't see a lot of keyword-only parameters being added to stdlib code. Is there some position we've taken on this? Barring someone saying "stdlib APIs shouldn't contain keyword-only params", I'm inclined to make numeric_owner keyword-only. Is there anything stopping me from making it keyword-only? Thanks. Eric. From andrew.svetlov at gmail.com Tue Apr 14 19:53:30 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Tue, 14 Apr 2015 13:53:30 -0400 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D5118.80703@trueblade.com> References: <552D5118.80703@trueblade.com> Message-ID: At least asyncio uses keyword-only intensively. On Tue, Apr 14, 2015 at 1:40 PM, Eric V. Smith wrote: > I'm working on adding a numeric_owner parameter to some tarfile methods > (http://bugs.python.org/issue23193), > > In a review, Berker suggested making the parameter keyword-only. I agree > that you'd likely never want to pass just "True", but that > "numeric_owner=True" would be a better usage. > > But, I don't see a lot of keyword-only parameters being added to stdlib > code. Is there some position we've taken on this? Barring someone saying > "stdlib APIs shouldn't contain keyword-only params", I'm inclined to > make numeric_owner keyword-only. > > Is there anything stopping me from making it keyword-only? > > Thanks. > Eric. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From tritium-list at sdamon.com Tue Apr 14 19:45:37 2015 From: tritium-list at sdamon.com (Alexander Walters) Date: Tue, 14 Apr 2015 13:45:37 -0400 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D5118.80703@trueblade.com> References: <552D5118.80703@trueblade.com> Message-ID: <552D5241.3010301@sdamon.com> Lacking anything anyone else says... the use case for keyword only arguments (where they actually make the code better rather than simply being different) is rather limited. On 4/14/2015 13:40, Eric V. Smith wrote: > I'm working on adding a numeric_owner parameter to some tarfile methods > (http://bugs.python.org/issue23193), > > In a review, Berker suggested making the parameter keyword-only. 
I agree > that you'd likely never want to pass just "True", but that > "numeric_owner=True" would be a better usage. > > But, I don't see a lot of keyword-only parameters being added to stdlib > code. Is there some position we've taken on this? Barring someone saying > "stdlib APIs shouldn't contain keyword-only params", I'm inclined to > make numeric_owner keyword-only. > > Is there anything stopping me from making it keyword-only? > > Thanks. > Eric. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com From steve at pearwood.info Tue Apr 14 20:06:18 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 15 Apr 2015 04:06:18 +1000 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D5118.80703@trueblade.com> References: <552D5118.80703@trueblade.com> Message-ID: <20150414180618.GG5663@ando.pearwood.info> On Tue, Apr 14, 2015 at 01:40:40PM -0400, Eric V. Smith wrote: > But, I don't see a lot of keyword-only parameters being added to stdlib > code. Is there some position we've taken on this? Barring someone saying > "stdlib APIs shouldn't contain keyword-only params", I'm inclined to > make numeric_owner keyword-only. I expect that's because keyword-only parameters are quite recent (3.x only) and most of the stdlib is quite old. Keyword-only feels right for this to me too. -- Steve From zachary.ware+pydev at gmail.com Tue Apr 14 20:11:04 2015 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Tue, 14 Apr 2015 13:11:04 -0500 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <20150414180618.GG5663@ando.pearwood.info> References: <552D5118.80703@trueblade.com> <20150414180618.GG5663@ando.pearwood.info> Message-ID: On Tue, Apr 14, 2015 at 1:06 PM, Steven D'Aprano wrote: > On Tue, Apr 14, 2015 at 01:40:40PM -0400, Eric V. Smith wrote: >> But, I don't see a lot of keyword-only parameters being added to stdlib >> code. Is there some position we've taken on this? Barring someone saying >> "stdlib APIs shouldn't contain keyword-only params", I'm inclined to >> make numeric_owner keyword-only. > > I expect that's because keyword-only parameters are quite recent (3.x > only) and most of the stdlib is quite old. > > Keyword-only feels right for this to me too. +1 -- Zach From pmiscml at gmail.com Tue Apr 14 19:56:55 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 14 Apr 2015 20:56:55 +0300 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D5118.80703@trueblade.com> References: <552D5118.80703@trueblade.com> Message-ID: <20150414205655.16ba7242@x230> Hello, On Tue, 14 Apr 2015 13:40:40 -0400 "Eric V. Smith" wrote: > I'm working on adding a numeric_owner parameter to some tarfile > methods (http://bugs.python.org/issue23193), > > In a review, Berker suggested making the parameter keyword-only. I > agree that you'd likely never want to pass just "True", but that > "numeric_owner=True" would be a better usage. > > But, I don't see a lot of keyword-only parameters being added to > stdlib code. Well, majority of stdlib is stable, why would somebody suddenly go and start adding some novelty to otherwise stable and consistent API? But newer parts of stdlib, e.g. asyncio, visibly overuse kw-only args. > Is there some position we've taken on this? 
Barring
> someone saying "stdlib APIs shouldn't contain keyword-only params",
> I'm inclined to make numeric_owner keyword-only.
>
> Is there anything stopping me from making it keyword-only?
>
> Thanks.
> Eric.

--
Best regards,
Paul                          mailto:pmiscml at gmail.com

From jjevnik at quantopian.com  Tue Apr 14 20:09:14 2015
From: jjevnik at quantopian.com (Joe Jevnik)
Date: Tue, 14 Apr 2015 14:09:14 -0400
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: <20150414180618.GG5663@ando.pearwood.info>
References: <552D5118.80703@trueblade.com> <20150414180618.GG5663@ando.pearwood.info>
Message-ID:

I personally find that keyword-only arguments make for nicer APIs. It also
makes subclassing easier because you are free to add new positional
arguments later. Especially for boolean arguments, I think keyword-only is
a great API choice.

On Tue, Apr 14, 2015 at 2:06 PM, Steven D'Aprano wrote:

> On Tue, Apr 14, 2015 at 01:40:40PM -0400, Eric V. Smith wrote:
>
> > But, I don't see a lot of keyword-only parameters being added to stdlib
> > code. Is there some position we've taken on this? Barring someone saying
> > "stdlib APIs shouldn't contain keyword-only params", I'm inclined to
> > make numeric_owner keyword-only.
>
> I expect that's because keyword-only parameters are quite recent (3.x
> only) and most of the stdlib is quite old.
>
> Keyword-only feels right for this to me too.
>
> --
> Steve
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/joe%40quantopian.com
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From lukasz at langa.pl  Tue Apr 14 20:34:23 2015
From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=)
Date: Tue, 14 Apr 2015 14:34:23 -0400
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: <552D5118.80703@trueblade.com>
References: <552D5118.80703@trueblade.com>
Message-ID: <68C6EB7A-49F5-42C7-AFAA-41419AEAFE3D@langa.pl>

If you're introducing a new parameter that is a boolean, making it kw-only
is generally accepted. Some people (myself included) would encourage you
to do so.

Besides asyncio, there are already new arguments that are kw-only in many
modules, including configparser, unittest, xml.etree, xmlrpc,
urllib.request, traceback, tarfile, shutil, ssl, etc.

> On Apr 14, 2015, at 1:40 PM, Eric V. Smith wrote:
>
> I'm working on adding a numeric_owner parameter to some tarfile methods
> (http://bugs.python.org/issue23193),
>
> In a review, Berker suggested making the parameter keyword-only. I agree
> that you'd likely never want to pass just "True", but that
> "numeric_owner=True" would be a better usage.
>
> But, I don't see a lot of keyword-only parameters being added to stdlib
> code. Is there some position we've taken on this? Barring someone saying
> "stdlib APIs shouldn't contain keyword-only params", I'm inclined to
> make numeric_owner keyword-only.
>
> Is there anything stopping me from making it keyword-only?
>
> Thanks.
> Eric.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/lukasz%40langa.pl

From eric at trueblade.com  Tue Apr 14 20:25:43 2015
From: eric at trueblade.com (Eric V.
Smith)
Date: Tue, 14 Apr 2015 14:25:43 -0400
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: References: <552D5118.80703@trueblade.com> <20150414180618.GG5663@ando.pearwood.info>
Message-ID: <552D5BA7.3030503@trueblade.com>

On 04/14/2015 02:11 PM, Zachary Ware wrote:
> On Tue, Apr 14, 2015 at 1:06 PM, Steven D'Aprano wrote:
>> On Tue, Apr 14, 2015 at 01:40:40PM -0400, Eric V. Smith wrote:
>>> But, I don't see a lot of keyword-only parameters being added to stdlib
>>> code. Is there some position we've taken on this? Barring someone saying
>>> "stdlib APIs shouldn't contain keyword-only params", I'm inclined to
>>> make numeric_owner keyword-only.
>>
>> I expect that's because keyword-only parameters are quite recent (3.x
>> only) and most of the stdlib is quite old.
>>
>> Keyword-only feels right for this to me too.
>
> +1

Thanks. I'll make it keyword-only.

Eric.

From regebro at gmail.com  Tue Apr 14 21:04:48 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Tue, 14 Apr 2015 15:04:48 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: References: Message-ID:

OK, so I realized another thing today, and that is that arithmetic
doesn't necessarily round trip.

For example, 2002-10-27 01:00 US/Eastern comes both in DST and STD.

But 2002-10-27 01:00 US/Eastern STD minus two days is 2002-10-25 01:00
US/Eastern DST
However, 2002-10-25 01:00 US/Eastern DST plus two days is 2002-10-27
01:00 US/Eastern, but it is ambiguous if you want DST or not DST.
And you can't pass in an is_dst flag to __add__, so the arithmetic must
just pick one, and the sensible one is to keep to the same DST.

That means that:

    tz = get_timezone('US/Eastern')
    dt = datetime(2002, 10, 27, 1, 0, tz=tz, is_dst=False)
    dt2 = dt - timedelta(days=2) + timedelta(days=2)
    assert dt == dt2

will fail, which will be unexpected for most people. I think there is
no way around this, but I thought I should flag it.

This is a good reason to do all your datetime arithmetic in UTC.

//Lennart

From ethan at stoneleaf.us  Tue Apr 14 21:10:39 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Tue, 14 Apr 2015 12:10:39 -0700
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: <552D5118.80703@trueblade.com>
References: <552D5118.80703@trueblade.com>
Message-ID: <20150414191039.GA14135@stoneleaf.us>

On 04/14, Eric V. Smith wrote:
> But, I don't see a lot of keyword-only parameters being added to stdlib
> code. Is there some position we've taken on this? Barring someone saying
> "stdlib APIs shouldn't contain keyword-only params", I'm inclined to
> make numeric_owner keyword-only.

os.path and shutil make extensive use of keyword-only params.

--
~Ethan~

From larry at hastings.org  Tue Apr 14 21:40:32 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 14 Apr 2015 15:40:32 -0400
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: <20150414205655.16ba7242@x230>
References: <552D5118.80703@trueblade.com> <20150414205655.16ba7242@x230>
Message-ID: <552D6D30.1070308@hastings.org>

On 04/14/2015 01:40 PM, Eric V. Smith wrote:
> I'm working on adding a numeric_owner parameter to some tarfile methods
> (http://bugs.python.org/issue23193),
>
> In a review, Berker suggested making the parameter keyword-only. I agree
> that you'd likely never want to pass just "True", but that
> "numeric_owner=True" would be a better usage.

Boolean parameters are the classic problem that keyword-only parameters
solve.
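As a concrete illustration, a minimal sketch (the signature below is
assumed for illustration, not taken from the actual patch):

    # The bare "*" makes numeric_owner keyword-only:
    def extractall(self, path=".", members=None, *, numeric_owner=False):
        ...

    # tar.extractall("/tmp", numeric_owner=True)  # explicit at the call site
    # tar.extractall("/tmp", None, True)  # TypeError -- True can no longer
    #                                     # be passed positionally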
It forces the caller to provide context for the parameter, solving the mystery-meat API problem of tarfile.extractall(".", None, True) On 04/14/2015 01:56 PM, Paul Sokolovsky wrote: > But newer parts of stdlib, e.g. asyncio, visibly overuse kw-only args. Overuse? asyncio? You mean "that thing Guido just wrote last year"? The most practical definition I've heard for the word "pythonic" is "code like Guido writes". //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From pmiscml at gmail.com Tue Apr 14 22:23:17 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 14 Apr 2015 23:23:17 +0300 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D6D30.1070308@hastings.org> References: <552D5118.80703@trueblade.com> <20150414205655.16ba7242@x230> <552D6D30.1070308@hastings.org> Message-ID: <20150414232317.312e868f@x230> Hello, On Tue, 14 Apr 2015 15:40:32 -0400 Larry Hastings wrote: > On 04/14/2015 01:56 PM, Paul Sokolovsky wrote: > > But newer parts of stdlib, e.g. asyncio, visibly overuse kw-only > > args. > > Overuse? asyncio? You mean "that thing Guido just wrote last > year"? The most practical definition I've heard for the word > "pythonic" is "code like Guido writes". You probably read that sentence too fast, so let's clarify it up: 1. asyncio project, previously known as "tulip" is some 2.5 years old, prooflink: https://github.com/python/asyncio/commit/0b0da72d0d23a4c582ea07dd3d2638021183750e . The fact that people still don't know it well enough (e.g. that it's full of kw-only args, or that it's so old) is sad. 2. asyncio has >1 function which accepts just couple of arguments, one of the arguments being forced as keyword-only. There can be multiple opinions of how to treat such usage of kw-only arguments, and one of the opinions can be summarized as "overuse of kw-only arguments". And to state the obvious, Python had keyword arguments for ages, and people used them all this time. The talk now is to *force* people to use keyword arguments. Done right (as supposedly asyncio tries to do, actual results may vary), it's good. But if it suddenly goes as a fashion to make kw-only args *everywhere*, that can only lead to harder to type and noisier to read code. So yes, people better err on the side of not forcing them as such, leaving choice to the actual users. > //arry/ -- Best regards, Paul mailto:pmiscml at gmail.com From alexander.belopolsky at gmail.com Tue Apr 14 22:52:02 2015 From: alexander.belopolsky at gmail.com (Alexander Belopolsky) Date: Tue, 14 Apr 2015 16:52:02 -0400 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On Wed, Apr 8, 2015 at 11:18 AM, Lennart Regebro wrote: > > I wrote PEP-431 two years ago, and never got around to implement it. > This year I got some renewed motivation after Berker Peksa? made an > effort of implementing it. > I'm planning to work more on this during the PyCon sprints, and also > have a BoF session or similar during the conference. For those who were not at the conference, can someone summarize the post-PyCon status of this PEP? Is Barry still the "BDFL-Delegate"? Is there an updated draft? Should this discussion move to python-ideas? -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg at krypto.org Tue Apr 14 23:11:28 2015 From: greg at krypto.org (Gregory P. 
Smith) Date: Tue, 14 Apr 2015 21:11:28 +0000 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: <552D5241.3010301@sdamon.com> References: <552D5118.80703@trueblade.com> <552D5241.3010301@sdamon.com> Message-ID: On Tue, Apr 14, 2015 at 1:56 PM Alexander Walters wrote: > Lacking anything anyone else says... the use case for keyword only > arguments (where they actually make the code better rather than simply > being different) is rather limited. > > I disagree. For parameters not often passed or beyond 3-4 parameters being used on any API I actually prefer that we start using keyword only parameters for those. I'd say feel free to use keyword only parameters when you believe they make sense. Most of the stdlib does not do it because most of the APIs were defined in the 2.x era and maintaining compatibility with them is desirable for 2AND3 compatible code. But for new things, feel free. -gps > On 4/14/2015 13:40, Eric V. Smith wrote: > > I'm working on adding a numeric_owner parameter to some tarfile methods > > (http://bugs.python.org/issue23193), > > > > In a review, Berker suggested making the parameter keyword-only. I agree > > that you'd likely never want to pass just "True", but that > > "numeric_owner=True" would be a better usage. > > > > But, I don't see a lot of keyword-only parameters being added to stdlib > > code. Is there some position we've taken on this? Barring someone saying > > "stdlib APIs shouldn't contain keyword-only params", I'm inclined to > > make numeric_owner keyword-only. > > > > Is there anything stopping me from making it keyword-only? > > > > Thanks. > > Eric. > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/greg%40krypto.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Tue Apr 14 23:14:21 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 14 Apr 2015 17:14:21 -0400 Subject: [Python-Dev] Keyword-only parameters In-Reply-To: References: <552D5118.80703@trueblade.com> <552D5241.3010301@sdamon.com> Message-ID: Also, why do you think we added the 'lone star' syntax? :-) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From regebro at gmail.com Tue Apr 14 23:18:08 2015 From: regebro at gmail.com (Lennart Regebro) Date: Tue, 14 Apr 2015 17:18:08 -0400 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On Tue, Apr 14, 2015 at 4:52 PM, Alexander Belopolsky wrote: > For those who were not at the conference, can someone summarize the > post-PyCon status of this PEP? > > Is Barry still the "BDFL-Delegate"? Is there an updated draft? Should this > discussion move to python-ideas? There is no status change except that it will need updates, but I'm waiting with that until I know what updates are needed, and that means an implementation is needed first. 
//Lennart

From v+python at g.nevcal.com  Wed Apr 15 00:32:57 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Tue, 14 Apr 2015 15:32:57 -0700
Subject: [Python-Dev] Keyword-only parameters
In-Reply-To: References: <552D5118.80703@trueblade.com> <552D5241.3010301@sdamon.com>
Message-ID: <552D9599.7030809@g.nevcal.com>

On 4/14/2015 2:14 PM, Guido van Rossum wrote:
> Also, why do you think we added the 'lone star' syntax? :-)

Hint: Not because Guido is from Texas....
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bpmaide at yahoo.com.mx  Wed Apr 15 01:50:39 2015
From: bpmaide at yahoo.com.mx (Baldomero Perez martinez)
Date: Tue, 14 Apr 2015 23:50:39 +0000 (UTC)
Subject: [Python-Dev] Learning Python
Message-ID: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>

I want to learn Python and want to get started; I would appreciate any help.

L.I. Baldomero Pérez Martínez
Enc. Proy. Informatica
Fideicomiso Ingenio Atencingo 80326
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From raulcumplido at gmail.com  Wed Apr 15 02:15:37 2015
From: raulcumplido at gmail.com (=?UTF-8?Q?Ra=C3=BAl_Cumplido?=)
Date: Tue, 14 Apr 2015 20:15:37 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>
References: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>
Message-ID:

Hi Baldomero,

This list is for the development of the Python language itself. For
questions about learning or using Python you can use the Spanish-language
list python-es at python.org or the English-language list
python-list at python.org. There is also an English-language list for
help getting started with Python, tutor at python.org.

Regards,
Raúl

On 14 Apr 2015 20:03, "Baldomero Perez martinez" <
bpmaide at yahoo.com.dmarc.invalid.mx> wrote:

> I want to learn Python and want to get started; I would appreciate any
> help.
>
> L.I. Baldomero Pérez Martínez
> Enc. Proy. Informatica
> Fideicomiso Ingenio Atencingo 80326
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/raulcumplido%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From erik.river at gmail.com  Wed Apr 15 02:36:49 2015
From: erik.river at gmail.com (Erik Rivera)
Date: Tue, 14 Apr 2015 20:36:49 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>
References: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>
Message-ID:

Baldomero,

I see you are in the state of Puebla, Mexico. There is the Python Mexico
list, https://mail.python.org/mailman/listinfo/python-mx, and several of
us from Puebla can help you.

Regards.

On 14 April 2015 at 19:50, Baldomero Perez martinez <
bpmaide at yahoo.com.dmarc.invalid.mx> wrote:

> I want to learn Python and want to get started; I would appreciate any
> help.
>
> L.I. Baldomero Pérez Martínez
> Enc. Proy. Informatica
> Fideicomiso Ingenio Atencingo 80326
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/erik.river%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From andrew.svetlov at gmail.com  Wed Apr 15 02:41:10 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Tue, 14 Apr 2015 20:41:10 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: References: <1449835078.3774458.1429055439863.JavaMail.yahoo@mail.yahoo.com>
Message-ID:

I'm sorry. Please use English in the mailing list.

People may not understand your chatting.

2015-04-14 20:36 GMT-04:00 Erik Rivera :

> Baldomero,
>
> I see you are in the state of Puebla, Mexico. There is the Python Mexico
> list, https://mail.python.org/mailman/listinfo/python-mx, and several of
> us from Puebla can help you.
>
> Regards.

--
Thanks,
Andrew Svetlov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From raulcumplido at gmail.com  Wed Apr 15 02:46:01 2015
From: raulcumplido at gmail.com (=?UTF-8?Q?Ra=C3=BAl_Cumplido?=)
Date: Tue, 14 Apr 2015 20:46:01 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: References: Message-ID:

Hi Andrew,

Someone is asking where to find resources to learn Python. We have
redirected him to the Python lists, both in English and in Spanish.

We would have replied in English if it had been something related to
python-dev, but we responded in Spanish since the user may not
understand English.

Kind Regards,
Raúl

2015-04-14 20:41 GMT-04:00 Andrew Svetlov :

> I'm sorry. Please use English in the mailing list.
>
> People may not understand your chatting.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From andrew.svetlov at gmail.com  Wed Apr 15 02:54:36 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Tue, 14 Apr 2015 20:54:36 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: References: Message-ID:

Python-dev is for development OF Python, not for development WITH Python
or Python LEARNING, BTW.

On Tue, Apr 14, 2015 at 8:46 PM, Raúl Cumplido wrote:

> Hi Andrew,
>
> Someone is asking where to find resources to learn Python. We have
> redirected him to the Python lists, both in English and in Spanish.
>
> We would have replied in English if it had been something related to
> python-dev, but we responded in Spanish since the user may not
> understand English.
>
> Kind Regards,
> Raúl

--
Thanks,
Andrew Svetlov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From graffatcolmingov at gmail.com  Wed Apr 15 02:59:31 2015
From: graffatcolmingov at gmail.com (Ian Cordasco)
Date: Tue, 14 Apr 2015 19:59:31 -0500
Subject: [Python-Dev] Learning Python
In-Reply-To: References: Message-ID:

On Tue, Apr 14, 2015 at 7:54 PM, Andrew Svetlov wrote:

> Python-dev is for development OF Python, not for development WITH Python
> or Python LEARNING, BTW.

Andrew, if you translate the text that was sent to Baldomero, you'll see
that's exactly what they said. Please don't be so rude (or lazy) to people
helping someone learn Python, regardless of whether they mistakenly posted
to the wrong list.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From andrew.svetlov at gmail.com  Wed Apr 15 03:12:24 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Tue, 14 Apr 2015 21:12:24 -0400
Subject: [Python-Dev] Learning Python
In-Reply-To: References: Message-ID:

Ok, sorry.
On Tue, Apr 14, 2015 at 8:59 PM, Ian Cordasco wrote:

> Andrew, if you translate the text that was sent to Baldomero, you'll see
> that's exactly what they said. Please don't be so rude (or lazy) to
> people helping someone learn Python, regardless of whether they
> mistakenly posted to the wrong list.

--
Thanks,
Andrew Svetlov
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From stuart at stuartbishop.net  Wed Apr 15 06:59:02 2015
From: stuart at stuartbishop.net (Stuart Bishop)
Date: Wed, 15 Apr 2015 06:59:02 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: References: Message-ID:

On 14 April 2015 at 21:04, Lennart Regebro wrote:
> OK, so I realized another thing today, and that is that arithmetic
> doesn't necessarily round trip.
>
> For example, 2002-10-27 01:00 US/Eastern comes both in DST and STD.
>
> But 2002-10-27 01:00 US/Eastern STD minus two days is 2002-10-25 01:00
> US/Eastern DST
> However, 2002-10-25 01:00 US/Eastern DST plus two days is 2002-10-27
> 01:00 US/Eastern, but it is ambiguous if you want DST or not DST.
> And you can't pass in an is_dst flag to __add__, so the arithmetic must
> just pick one, and the sensible one is to keep to the same DST.

>>> import pytz
>>> from datetime import datetime, timedelta
>>> tz = pytz.timezone('US/Eastern')
>>> dt = tz.localize(datetime(2002, 10, 27, 1, 0), is_dst=False)
>>> dt2 = tz.normalize(dt - timedelta(days=2) + timedelta(days=2))
>>> dt == dt2
True
>>>
>>> tz.normalize(dt - timedelta(days=2))
datetime.datetime(2002, 10, 25, 2, 0, tzinfo=<DstTzInfo 'US/Eastern' EDT-1 day, 20:00:00 DST>)
>>> tz.normalize(tz.normalize(dt - timedelta(days=2)) + timedelta(days=2))
datetime.datetime(2002, 10, 27, 1, 0, tzinfo=<DstTzInfo 'US/Eastern' EST-1 day, 19:00:00 STD>)

2002-10-27 01:00 US/Eastern is_dst=0 is after the DST transition
(EST). Subtracting 48 hours from it crosses the DST boundary and
should give you 2002-10-25 02:00 US/Eastern is_dst=1, prior to the DST
transition (EDT). Adding 48 hours again goes past 2002-10-27 01:00
EDT, crosses the DST boundary, and gives you back 2002-10-27 01:00
EST.

--
Stuart Bishop
http://www.stuartbishop.net/

From regebro at gmail.com  Wed Apr 15 10:09:05 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 15 Apr 2015 04:09:05 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: References: Message-ID:

Yeah, I just realized this. As long as you use timedelta, the
difference is of course not one day, but 24 hours. That solves the
problem, but it is surprising in other ways.

In US/Eastern
datetime.datetime(2002, 10, 27, 1, 0) - datetime.timedelta(1)
needs to become
datetime.datetime(2002, 10, 26, 2, 0)
(Note the hour change)

I was thinking in calendrical arithmetic, which the datetime module
doesn't need to care about.

On Wed, Apr 15, 2015 at 12:59 AM, Stuart Bishop wrote:
> On 14 April 2015 at 21:04, Lennart Regebro wrote:
>> OK, so I realized another thing today, and that is that arithmetic
>> doesn't necessarily round trip.
>>
>> For example, 2002-10-27 01:00 US/Eastern comes both in DST and STD.
>>
>> But 2002-10-27 01:00 US/Eastern STD minus two days is 2002-10-25 01:00
>> US/Eastern DST
>> However, 2002-10-25 01:00 US/Eastern DST plus two days is 2002-10-27
>> 01:00 US/Eastern, but it is ambiguous if you want DST or not DST.
>> And you can't pass in an is_dst flag to __add__, so the arithmetic must
>> just pick one, and the sensible one is to keep to the same DST.
>
>>>> import pytz
>>>> from datetime import datetime, timedelta
>>>> tz = pytz.timezone('US/Eastern')
>>>> dt = tz.localize(datetime(2002, 10, 27, 1, 0), is_dst=False)
>>>> dt2 = tz.normalize(dt - timedelta(days=2) + timedelta(days=2))
>>>> dt == dt2
> True
>>>>
>>>> tz.normalize(dt - timedelta(days=2))
> datetime.datetime(2002, 10, 25, 2, 0, tzinfo=<DstTzInfo 'US/Eastern' EDT-1 day, 20:00:00 DST>)
>>>> tz.normalize(tz.normalize(dt - timedelta(days=2)) + timedelta(days=2))
> datetime.datetime(2002, 10, 27, 1, 0, tzinfo=<DstTzInfo 'US/Eastern' EST-1 day, 19:00:00 STD>)
>
>
> 2002-10-27 01:00 US/Eastern is_dst=0 is after the DST transition
> (EST). Subtracting 48 hours from it crosses the DST boundary and
> should give you 2002-10-25 02:00 US/Eastern is_dst=1, prior to the DST
> transition (EDT). Adding 48 hours again goes past 2002-10-27 01:00
> EDT, crosses the DST boundary, and gives you back 2002-10-27 01:00
> EST.
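The hour change Lennart describes falls out naturally if the subtraction
is done in UTC, along the lines of this small sketch (it assumes pytz is
installed; it is not part of any proposed stdlib API):

    from datetime import datetime, timedelta
    import pytz

    tz = pytz.timezone('US/Eastern')
    dt = tz.localize(datetime(2002, 10, 27, 1, 0), is_dst=False)  # 01:00 EST

    # Convert to UTC, subtract 24 real hours, convert back to local time:
    back = (dt.astimezone(pytz.utc) - timedelta(days=1)).astimezone(tz)
    print(back)  # 2002-10-26 02:00:00-04:00 -- note the hour change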
> > > -- > Stuart Bishop > http://www.stuartbishop.net/ From regebro at gmail.com Wed Apr 15 17:00:47 2015 From: regebro at gmail.com (Lennart Regebro) Date: Wed, 15 Apr 2015 11:00:47 -0400 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: OK, so I just had a realization. Because we want some internal flag to tell if the datetime is in DST or not, the datetime pickle format will change. And the datetime pickle format changing is the biggest reason I had against changing the internal representation to UTC. So because of this, perhaps we actually *should* change the internal representation to UTC, because that makes the issues I'm fighting with now so much simpler. (I'm currently trying to get arithmetic to do the right thing in all cases, which is crazy complicated). We can add support to unpickle previous datetimes, but we won't be able to add forwards compatibility, meaning that pickles saved in Python 3.5 will not be unpicklable in Python 3.4. Please discuss. //Lennart From rosuav at gmail.com Wed Apr 15 17:10:42 2015 From: rosuav at gmail.com (Chris Angelico) Date: Thu, 16 Apr 2015 01:10:42 +1000 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On Thu, Apr 16, 2015 at 1:00 AM, Lennart Regebro wrote: > So because of this, perhaps we actually *should* change the internal > representation to UTC, because that makes the issues I'm fighting with > now so much simpler. (I'm currently trying to get arithmetic to do the > right thing in all cases, which is crazy complicated). If I understand you correctly, then, an aware datetime would represent a unique instant in time (modulo relativity), coupled with some metadata stating what civil timezone it should be understood in terms of. This is the same as a PostgreSQL "timestamp with time zone" field, and IMO is a pretty reliable way to do things. So count me as +1 for this proposal. Bikeshed: Would arithmetic be based on UTC time or Unix time? It'd be more logical to describe it as "adding six hours means adding six hours to the UTC time", but it'd look extremely odd when there's a leap second. ChrisA From regebro at gmail.com Wed Apr 15 17:43:16 2015 From: regebro at gmail.com (Lennart Regebro) Date: Wed, 15 Apr 2015 11:43:16 -0400 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On Wed, Apr 15, 2015 at 11:10 AM, Chris Angelico wrote: > Bikeshed: Would arithmetic be based on UTC time or Unix time? It'd be > more logical to describe it as "adding six hours means adding six > hours to the UTC time", but it'd look extremely odd when there's a > leap second. It would ignore leap seconds. If you want to call that unix time or not is a matter of opinion. Hm. I guess the internal representation *could* be EPOCH + offset, and local times could be calculated properties, which could be cached (or possibly calculated at creation). //Lennart From regebro at gmail.com Wed Apr 15 18:42:19 2015 From: regebro at gmail.com (Lennart Regebro) Date: Wed, 15 Apr 2015 12:42:19 -0400 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On Wed, Apr 15, 2015 at 11:43 AM, Lennart Regebro wrote: > On Wed, Apr 15, 2015 at 11:10 AM, Chris Angelico wrote: >> Bikeshed: Would arithmetic be based on UTC time or Unix time? It'd be >> more logical to describe it as "adding six hours means adding six >> hours to the UTC time", but it'd look extremely odd when there's a >> leap second. > > It would ignore leap seconds. 
If you want to call that unix time or
> not is a matter of opinion. Hm. I guess the internal representation
> *could* be EPOCH + offset, and local times could be calculated
> properties, which could be cached (or possibly calculated at
> creation).

In any case there would probably need to be a PEP on that, and that
means PEP 431 wouldn't make it into 3.5, unless somebody smarter than
me wants to take a shot at it.

//Lennart

From ischwabacher at wisc.edu  Wed Apr 15 18:40:46 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 15 Apr 2015 11:40:46 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7460bc3511cd82.552e9122@wiscmail.wisc.edu>
References: <74b0bbf711f9b6.552e8a42@wiscmail.wisc.edu>
 <7620f27c11fc2c.552e8a7f@wiscmail.wisc.edu>
 <7450edc311fe88.552e8abb@wiscmail.wisc.edu>
 <74b097e311e30a.552e8af7@wiscmail.wisc.edu>
 <74b0fce111d161.552e8b33@wiscmail.wisc.edu>
 <779099d1118aea.552e8b70@wiscmail.wisc.edu>
 <7450a0e311b722.552e8baf@wiscmail.wisc.edu>
 <7450acc211ad3a.552e8beb@wiscmail.wisc.edu>
 <77409ee811e62a.552e8c27@wiscmail.wisc.edu>
 <74b0c4c611d986.552e8ca0@wiscmail.wisc.edu>
 <7150e90e11e9aa.552e8ec7@wiscmail.wisc.edu>
 <7620ed9c11dc2f.552e8f3f@wiscmail.wisc.edu>
 <7100dd9c11a855.552e8f7b@wiscmail.wisc.edu>
 <7740c27a11897e.552e8fb8@wiscmail.wisc.edu>
 <76208d8e1184ca.552e8ff4@wiscmail.wisc.edu>
 <7150ad8711adea.552e9031@wiscmail.wisc.edu>
 <7100e15f11921e.552e906d@wiscmail.wisc.edu>
 <7720b86f11a526.552e90a9@wiscmail.wisc.edu>
 <71008b1f1193de.552e90e6@wiscmail.wisc.edu>
 <7460bc3511cd82.552e9122@wiscmail.wisc.edu>
Message-ID: <7460f8e3119451.552e4e3e@wiscmail.wisc.edu>

On 15-04-15, Lennart Regebro  wrote:
> On Wed, Apr 15, 2015 at 11:10 AM, Chris Angelico  wrote:
> > Bikeshed: Would arithmetic be based on UTC time or Unix time? It'd be
> > more logical to describe it as "adding six hours means adding six
> > hours to the UTC time", but it'd look extremely odd when there's a
> > leap second.
>
> It would ignore leap seconds. If you want to call that unix time or
> not is a matter of opinion. Hm. I guess the internal representation
> *could* be EPOCH + offset, and local times could be calculated
> properties, which could be cached (or possibly calculated at
> creation).

I am on the fence about EPOCH + offset as an internal representation. On the one hand, it is conceptually cleaner than the horrible byte-packing that exists today, but on the other hand, it has implications for future implementations of leap-second-awareness. For instance, offset measures the difference between the local time and UTC. But is it honest-to-goodness leap-second aware UTC, or is it really Unix time? This could be solved by giving each tzinfo a pointer to the UTC from which it is offset, but that sounds like a can of worms we don't want to open. But if we don't, and we store EPOCH + offset, we can't add leap second awareness to UTC without corrupting all persisted aware datetimes.

Also, I didn't mention this before because I figured people were getting sick of my dumb idea, but another advantage of always caching the offset is that you can detect future datetimes that have been corrupted by zoneinfo changes. You need both absolute time and offset to be able to do this.
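For illustration, a rough sketch of that detection idea on top of
pytz (freeze() and is_stale() are names made up for this example, not
an existing or proposed API):

from datetime import datetime
import pytz

def freeze(naive_dt, zone, is_dst=False):
    # Persist the wall time, zone name, disambiguation flag and the
    # UTC offset that the currently installed zoneinfo assigns to it.
    tz = pytz.timezone(zone)
    offset = tz.localize(naive_dt, is_dst=is_dst).utcoffset()
    return (naive_dt, zone, is_dst, offset)

def is_stale(frozen):
    # Recompute the offset under whatever zoneinfo is installed now; a
    # mismatch means the zone's rules changed after the value was saved.
    naive_dt, zone, is_dst, saved_offset = frozen
    tz = pytz.timezone(zone)
    return tz.localize(naive_dt, is_dst=is_dst).utcoffset() != saved_offset

frozen = freeze(datetime(2020, 7, 1, 12, 0), 'US/Eastern')
print(is_stale(frozen))  # False until a tzdata update changes the rules

Without the saved offset there is nothing to compare against, which is
the point being made here.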
ijs

From regebro at gmail.com  Wed Apr 15 18:55:33 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 15 Apr 2015 12:55:33 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <7460f8e3119451.552e4e3e@wiscmail.wisc.edu>
References: <74b0bbf711f9b6.552e8a42@wiscmail.wisc.edu>
 <7620f27c11fc2c.552e8a7f@wiscmail.wisc.edu>
 <7450edc311fe88.552e8abb@wiscmail.wisc.edu>
 <74b097e311e30a.552e8af7@wiscmail.wisc.edu>
 <74b0fce111d161.552e8b33@wiscmail.wisc.edu>
 <779099d1118aea.552e8b70@wiscmail.wisc.edu>
 <7450a0e311b722.552e8baf@wiscmail.wisc.edu>
 <7450acc211ad3a.552e8beb@wiscmail.wisc.edu>
 <77409ee811e62a.552e8c27@wiscmail.wisc.edu>
 <74b0c4c611d986.552e8ca0@wiscmail.wisc.edu>
 <7150e90e11e9aa.552e8ec7@wiscmail.wisc.edu>
 <7620ed9c11dc2f.552e8f3f@wiscmail.wisc.edu>
 <7100dd9c11a855.552e8f7b@wiscmail.wisc.edu>
 <7740c27a11897e.552e8fb8@wiscmail.wisc.edu>
 <76208d8e1184ca.552e8ff4@wiscmail.wisc.edu>
 <7150ad8711adea.552e9031@wiscmail.wisc.edu>
 <7100e15f11921e.552e906d@wiscmail.wisc.edu>
 <7720b86f11a526.552e90a9@wiscmail.wisc.edu>
 <71008b1f1193de.552e90e6@wiscmail.wisc.edu>
 <7460bc3511cd82.552e9122@wiscmail.wisc.edu>
 <7460f8e3119451.552e4e3e@wiscmail.wisc.edu>
Message-ID: 

On Wed, Apr 15, 2015 at 12:40 PM, Isaac Schwabacher
 wrote:
> I am on the fence about EPOCH + offset as an internal representation. On the one hand, it is conceptually cleaner than the horrible byte-packing that exists today, but on the other hand, it has implications for future implementations of leap-second-awareness. For instance, offset measures the difference between the local time and UTC. But is it honest-to-goodness leap-second aware UTC, or is it really Unix time? This could be solved by giving each tzinfo a pointer to the UTC from which it is offset, but that sounds like a can of worms we don't want to open. But if we don't, and we store EPOCH + offset, we can't add leap second awareness to UTC without corrupting all persisted aware datetimes.

That's true; with separate values like there are now, we can easily
allow 23:59:60 as a timestamp during leap seconds. I'm not entirely
sure it makes a big difference though; I don't think we ever want to
deal with leap seconds by default.

I don't think we ever want the standard arithmetic to deal with leap
seconds anyway.

datetime(2012, 6, 30, 23, 30) + timedelta(seconds=3600)

returning

datetime(2012, 7, 1, 0, 29, 59)

I guess leap second implementations should rather have special
functions for arithmetic that deal with this.
//Lennart From ischwabacher at wisc.edu Wed Apr 15 19:33:29 2015 From: ischwabacher at wisc.edu (Isaac Schwabacher) Date: Wed, 15 Apr 2015 12:33:29 -0500 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: <7100d01211ef35.552ea0e2@wiscmail.wisc.edu> References: <74b0bbf711f9b6.552e8a42@wiscmail.wisc.edu> <7620f27c11fc2c.552e8a7f@wiscmail.wisc.edu> <7450edc311fe88.552e8abb@wiscmail.wisc.edu> <74b097e311e30a.552e8af7@wiscmail.wisc.edu> <74b0fce111d161.552e8b33@wiscmail.wisc.edu> <779099d1118aea.552e8b70@wiscmail.wisc.edu> <7450a0e311b722.552e8baf@wiscmail.wisc.edu> <7450acc211ad3a.552e8beb@wiscmail.wisc.edu> <77409ee811e62a.552e8c27@wiscmail.wisc.edu> <74b0c4c611d986.552e8ca0@wiscmail.wisc.edu> <7150e90e11e9aa.552e8ec7@wiscmail.wisc.edu> <7620ed9c11dc2f.552e8f3f@wiscmail.wisc.edu> <7100dd9c11a855.552e8f7b@wiscmail.wisc.edu> <7740c27a11897e.552e8fb8@wiscmail.wisc.edu> <76208d8e1184ca.552e8ff4@wiscmail.wisc.edu> <7150ad8711adea.552e9031@wiscmail.wisc.edu> <7100e15f11921e.552e906d@wiscmail.wisc.edu> <7720b86f11a526.552e90a9@wiscmail.wisc.edu> <71008b1f1193de.552e90e6@wiscmail.wisc.edu> <7460bc3511cd82.552e9122@wiscmail.wisc.edu> <7460f8e3119451.552e4e3e@wiscmail.wisc.edu> <7460ca3f1195ee.552e9cd9@wiscmail.wisc.edu> <76b0a6c711ecfb.552e9e4b@wiscmail.wisc.edu> <7100f4c2119210.552e9e87@wiscmail.wisc.edu> <7100b27d11c91c.552e9ec4@wiscmail.wisc.edu> <74b0992d11b910.552e9f00@wiscmail.wisc.edu> <7740966f11d7c9.552e9f3c@wiscmail.wisc.edu> <76208c4b11ac55.552e9f79@wiscmail.wisc.edu> <715084f311a5db.552e9fb5@wiscmail.wisc.edu> <76b0ebdb11a7c5.552e9ff1@wiscmail.wisc.edu> <77708af611f97c.552ea02e@wiscmail.wisc.edu> <7770fb77118644.552ea06a@wiscmail.wisc.edu> <7100d01211ef35.552ea0e2@wiscmail.wisc.edu> Message-ID: <7460fdf911fbb2.552e5a99@wiscmail.wisc.edu> On 15-04-15, Lennart Regebro wrote: > On Wed, Apr 15, 2015 at 12:40 PM, Isaac Schwabacher > wrote: > > I am on the fence about EPOCH + offset as an internal representation. On the one hand, it is conceptually cleaner than the horrible byte-packing that exists today, but on the other hand, it has implications for future implementations of leap-second-awareness. For instance, offset measures the difference between the local time and UTC. But is it honest-to-goodness leap-second aware UTC, or is it really Unix time? This could be solved by giving each tzinfo a pointer to the UTC from which it is offset, but that sounds like a can of worms we don't want to open. But if don't and we store EPOCH + offset, we can't add leap second awareness to UTC without corrupting all persisted aware datetimes. > > > That's true, with separate values like there is now we can easily > allow 23:59:60 as a timestamp during leap seconds. I'm not entirely > sure it makes a big difference though, I don't think we ever wants to > deal with leap seconds by default. > > I don't think we ever want the standard arithmetic deal with leap > seconds anyway. > > datetime(2012, 6, 30, 23, 30) + timedelta(seconds=3600) returning > datetime(2012, 7, 1, 0, 29, 59) > > > I guess leap second implementations should rather have special > functions for arithmethics that deal with this. You need relative timedeltas to mitigate the pain of leap seconds, yes. But as soon as you have timedeltas that are capable of representing "this number of seconds into the next minute" ("one minute") as opposed to "sixty seconds", this isn't so much of a problem. Though of course subtraction will (and should!) continue to yield timedelta(seconds=3601). 
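As a rough sketch of what such special-purpose helpers could look like
(nothing below is a real or proposed API, and the one-entry leap
second table is deliberately incomplete):

from datetime import datetime, timedelta

# The stdlib simply ignores the leap second inserted at 2012-06-30 23:59:60:
print(datetime(2012, 6, 30, 23, 30) + timedelta(seconds=3600))
# -> 2012-07-01 00:30:00

LEAP_SECONDS = [datetime(2012, 6, 30, 23, 59, 59)]  # demo table, not complete

def elapsed(t1, t2):
    # Physically elapsed time between two UTC datetimes: the naive
    # difference plus one second per leap second inserted in between.
    leaps = sum(1 for ls in LEAP_SECONDS if t1 <= ls < t2)
    return (t2 - t1) + timedelta(seconds=leaps)

print(elapsed(datetime(2012, 6, 30, 23, 30), datetime(2012, 7, 1, 0, 30)))
# -> 1:00:01

Plain datetime subtraction of those two values still yields
timedelta(seconds=3600); only the leap-second-aware helper accounts
for the inserted second.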
From my perspective, the issue is more that the stdlib shouldn't rule out leap seconds. It's reasonable enough to expect users who actually want them to write appropriate tzinfo and timedelta classes, but we don't want to make that impossible the way the old tzinfo interface made DST-aware time zones impossible by insisting that implementers implement a function that wasn't mathematically a function.

I need to think about this more before I can get a real rant going.

ijs

From stuart at stuartbishop.net  Wed Apr 15 21:23:51 2015
From: stuart at stuartbishop.net (Stuart Bishop)
Date: Wed, 15 Apr 2015 21:23:51 +0200
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: 
References: 
Message-ID: 

On 15 April 2015 at 17:00, Lennart Regebro  wrote:
> OK, so I just had a realization.
>
> Because we want some internal flag to tell if the datetime is in DST
> or not, the datetime pickle format will change. And the datetime
> pickle format changing is the biggest reason I had against changing
> the internal representation to UTC.
>
> So because of this, perhaps we actually *should* change the internal
> representation to UTC, because that makes the issues I'm fighting with
> now so much simpler. (I'm currently trying to get arithmetic to do the
> right thing in all cases, which is crazy complicated).

Huh. I didn't think you would need to change any arithmetic (but
haven't looked at this for quite some time). You can already add or
subtract timedeltas to timezone aware datetime instances. The problem
with the existing implementation is the tzinfo instance does not have
enough information to do correct conversions when the time is
ambiguous, so it has to guess. With the addition of the is_dst hint to
the datetime instance, it will no longer need to guess. Arithmetic
remains 'add the timedelta to the naive datetime, and then punt it to
the tzinfo to make any necessary adjustments' and I thought this would
not need to be changed at all.

> We can add support to unpickle previous datetimes, but we won't be
> able to add forwards compatibility, meaning that pickles saved in
> Python 3.5 will not be unpicklable in Python 3.4.

I don't think this can be avoided entirely. Any ideas I can come up
with that might help are worse than requiring devs to convert their
datetimes to strings in the rare case they need their 3.5 pickles read
with 3.4.

--
Stuart Bishop
http://www.stuartbishop.net/

From regebro at gmail.com  Wed Apr 15 21:51:46 2015
From: regebro at gmail.com (Lennart Regebro)
Date: Wed, 15 Apr 2015 15:51:46 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: 
References: 
Message-ID: 

On Wed, Apr 15, 2015 at 3:23 PM, Stuart Bishop  wrote:
> Huh. I didn't think you would need to change any arithmetic

Not really, the problem is in keeping the date normalized after each
call, and doing so the right way.

> Arithmetic
> remains 'add the timedelta to the naive datetime, and then punt it to
> the tzinfo to make any necessary adjustments' and I thought this would
> not need to be changed at all.

Just punting it to tzinfo to make adjustments, i.e. effectively just
doing what normalize() does, creates infinite recursion as there is
more arithmetic in there, so it's not quite that simple.

> I don't think this can be avoided entirely. Any ideas I can come up
> with that might help are worse than requiring devs to convert their
> datetimes to strings in the rare case they need their 3.5 pickles read
> with 3.4.

Pickle forward compatibility isn't really expected anyway...
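To make the UTC-internal idea concrete, a toy model of the design
being discussed (only a sketch under obvious assumptions, in no way
the actual datetime internals):

from datetime import datetime, timedelta
import pytz

class UTCDateTime:
    # Toy model: store the instant as naive UTC, keep the zone only as
    # presentation metadata, derive the local wall time on demand.
    def __init__(self, utc_naive, tz):
        self._utc = utc_naive
        self._tz = tz

    def __add__(self, delta):
        # Arithmetic happens in UTC, so no normalize() step is needed.
        return UTCDateTime(self._utc + delta, self._tz)

    @property
    def local(self):
        # The local time is computed, never stored.
        return pytz.utc.localize(self._utc).astimezone(self._tz)

tz = pytz.timezone('US/Eastern')
dt = UTCDateTime(datetime(2002, 10, 27, 6, 0), tz)  # 01:00 EST
print((dt + timedelta(days=2)).local)  # 2002-10-29 01:00:00-05:00

Arithmetic never has to consult the zone here, which is why the
recursion problem above disappears with a UTC representation.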
From rosuav at gmail.com  Wed Apr 15 21:57:25 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Thu, 16 Apr 2015 05:57:25 +1000
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: 
References: 
Message-ID: 

On Thu, Apr 16, 2015 at 1:43 AM, Lennart Regebro  wrote:
> On Wed, Apr 15, 2015 at 11:10 AM, Chris Angelico  wrote:
>> Bikeshed: Would arithmetic be based on UTC time or Unix time? It'd be
>> more logical to describe it as "adding six hours means adding six
>> hours to the UTC time", but it'd look extremely odd when there's a
>> leap second.
>
> It would ignore leap seconds. If you want to call that unix time or
> not is a matter of opinion. Hm. I guess the internal representation
> *could* be EPOCH + offset, and local times could be calculated
> properties, which could be cached (or possibly calculated at
> creation).

I was just talking about leap seconds, here (which Unix time ignores),
not about the internal representation, which is an implementation
detail. If a timedelta is represented as a number of seconds, then
"adding six hours" really means "adding 6*3600 seconds", and most
people would be VERY surprised if one of those is "consumed" by a leap
second; but it ought at least to be acknowledged in the docs.

ChrisA

From 4kir4.1i at gmail.com  Wed Apr 15 21:12:40 2015
From: 4kir4.1i at gmail.com (Akira Li)
Date: Wed, 15 Apr 2015 22:12:40 +0300
Subject: [Python-Dev] Status on PEP-431 Timezones
References: <7760c2011a39dc.552582a4@wiscmail.wisc.edu>
 <7620d9171a3992.552582e0@wiscmail.wisc.edu>
 <75b08d361a5634.5525831c@wiscmail.wisc.edu>
 <7620c4141a208c.5525840f@wiscmail.wisc.edu>
 <7640f6f91a3fb3.5525844c@wiscmail.wisc.edu>
 <7620e9e81a4383.55258488@wiscmail.wisc.edu>
 <75b0a38e1a1054.552584c4@wiscmail.wisc.edu>
 <7750bef61a2e60.55258503@wiscmail.wisc.edu>
 <776091541a6ac3.55258540@wiscmail.wisc.edu>
 <7640fc041a414e.5525857c@wiscmail.wisc.edu>
 <75b086b61a3536.552585b8@wiscmail.wisc.edu>
 <7760e3ea1a5213.55258634@wiscmail.wisc.edu>
 <7620d3511a1769.55258670@wiscmail.wisc.edu>
 <77309bf71a7f90.552586ac@wiscmail.wisc.edu>
 <75b0a1271a24e2.552586e9@wiscmail.wisc.edu>
 <75f0e7351a1eee.55258725@wiscmail.wisc.edu>
 <7760fdbc1a14b4.5525879e@wiscmail.wisc.edu>
 <7750f8091a0d97.552587da@wiscmail.wisc.edu>
 <75b0de0d1a5379.552541d9@wiscmail.wisc.edu>
Message-ID: <87mw29p5nb.fsf@gmail.com>

Alexander Belopolsky  writes:

> On Wed, Apr 8, 2015 at 3:57 PM, Isaac Schwabacher 
> wrote:
>>
>> On 15-04-08, Alexander Belopolsky wrote:
>> > With datetime, we also have a problem that POSIX APIs don't have to
> deal with: local time
>> > arithmetics. What is t + timedelta(1) when t falls on the day before
> DST change? How would
>> > you set the isdst flag in the result?
>>
>> It's whatever time comes 60*60*24 seconds after t in the same time zone,
> because the timedelta class isn't expressive enough to represent anything
> but absolute time differences (nor should it be, IMO).
>
> This is not what most users expect. They expect
>
> datetime(y, m, d, 12, tzinfo=New_York) + timedelta(1)
>
> to be
>
> datetime(y, m, d+1, 12, tzinfo=New_York)

It is incorrect. If you want d+1 for +timedelta(1), use a **naive**
datetime. Otherwise +timedelta(1) is +24h:

tomorrow = tz.localize(aware_dt.replace(tzinfo=None) + timedelta(1),
                       is_dst=None)
dt_plus24h = tz.normalize(aware_dt + timedelta(1)) # +24h

*tomorrow* and *aware_dt* have the *same* time but it is unknown how
many hours have passed if the utc offset has changed in between.
*dt_plus24h* may have a different time, but exactly 24 hours have
passed between *dt_plus24h* and *aware_dt*

http://stackoverflow.com/questions/441147/how-can-i-subtract-a-day-from-a-python-date

From 4kir4.1i at gmail.com  Wed Apr 15 21:53:16 2015
From: 4kir4.1i at gmail.com (Akira Li)
Date: Wed, 15 Apr 2015 22:53:16 +0300
Subject: [Python-Dev] Status on PEP-431 Timezones
References: 
Message-ID: <87fv81p3rn.fsf@gmail.com>

Lennart Regebro  writes:

> OK, so I realized another thing today, and that is that arithmetic
> doesn't necessarily round trip.
>
> For example, 2002-10-27 01:00 US/Eastern comes both in DST and STD.
>
> But 2002-10-27 01:00 US/Eastern STD minus two days is 2002-10-25 01:00
> US/Eastern DST

"two days" is ambiguous here. It is incorrect if you mean 48 hours
(the difference is 49 hours):

#!/usr/bin/env python3
from datetime import datetime, timedelta
import pytz

tz = pytz.timezone('US/Eastern')
then_isdst = False # STD
then = tz.localize(datetime(2002, 10, 27, 1), is_dst=then_isdst)
now = tz.localize(datetime(2002, 10, 25, 1), is_dst=None) # no utc transition
print((then - now) // timedelta(hours=1))
# -> 49

> However, 2002-10-25 01:00 US/Eastern DST plus two days is 2002-10-27
> 01:00 US/Eastern, but it is ambiguous if you want DST or not DST.

It is not ambiguous if you know what "two days" *in your particular
application* should mean (`day+2` vs. +48h exactly):

print(tz.localize(now.replace(tzinfo=None) + timedelta(2),
                  is_dst=then_isdst))
# -> 2002-10-27 01:00:00-05:00 # +49h
print(tz.normalize(now + timedelta(2))) # +48h
# -> 2002-10-27 01:00:00-04:00

Here's a simple mental model that can be used for date arithmetic:

- naive datetime + timedelta(2) == "same time, elapsed hours unknown"
- aware utc datetime + timedelta(2) == "same time, +48h"
- aware datetime with timezone that may have different utc offsets at
  different times + timedelta(2) == "unknown time, +48h"

"unknown" means that you can't tell without knowing the specific
timezone. It ignores leap seconds.

The 3rd case behaves *as if* the calculations are performed using
these steps (the actual implementation may be different):

1. convert an aware datetime object to utc (dt.astimezone(pytz.utc))
2. do the simple arithmetic using utc time
3. convert the result to the original pytz timezone
   (utc_dt.astimezone(tz))

you don't need `.localize()`, `.normalize()` calls here.

> And you can't pass in an is_dst flag to __add__, so the arithmetic must
> just pick one, and the sensible one is to keep to the same DST.
>
> That means that:
>
> tz = get_timezone('US/Eastern')
> dt = datetime.datetime(2002, 10, 27, 1, 0, tz=tz, is_dst=False)
> dt2 = dt - 420 + 420
> assert dt == dt2
>
> Will fail, which will be unexpected for most people.
>
> I think there is no way around this, but I thought I should flag for
> it.

This is a good reason to do all your date time arithmetic in UTC.
> > //Lennart It won't fail: from datetime import datetime, timedelta import pytz tz = pytz.timezone('US/Eastern') dt = tz.localize(datetime(2002, 10, 27, 1), is_dst=False) delta = timedelta(seconds=420) assert dt == tz.normalize(tz.normalize(dt - delta) + delta) The only reason `tz.normalize()` is used so that tzinfo would be correct for the resulting datetime object; it does not affect the comparison otherwise: assert dt == (dt - delta + delta) #XXX tzinfo may be incorrect assert dt == tz.normalize(dt - delta + delta) # correct tzinfo for the final result From 4kir4.1i at gmail.com Wed Apr 15 22:24:03 2015 From: 4kir4.1i at gmail.com (Akira Li) Date: Wed, 15 Apr 2015 23:24:03 +0300 Subject: [Python-Dev] Status on PEP-431 Timezones References: <7550a7f9d5bf8.5527f8bd@wiscmail.wisc.edu> <74f0a5a5d5f7a.5527f8f9@wiscmail.wisc.edu> <77908bbad2606.5527f936@wiscmail.wisc.edu> <7530c0dad0aac.5527f972@wiscmail.wisc.edu> <7620b290d7acb.5527f9af@wiscmail.wisc.edu> <7530a93fd12e8.5527f9eb@wiscmail.wisc.edu> <74409ce3d674f.5527fa27@wiscmail.wisc.edu> <74f0db36d7d6e.5527fa64@wiscmail.wisc.edu> <74e0b8a4d65a3.5527faa0@wiscmail.wisc.edu> <74d0872fd5f5c.5527fadc@wiscmail.wisc.edu> <74d0c93cd5dfa.5527fb19@wiscmail.wisc.edu> <7440a9edd18fd.5527fb91@wiscmail.wisc.edu> <7620e95ed61ae.5527fbce@wiscmail.wisc.edu> <7470e6d9d3a54.5527fc0a@wiscmail.wisc.edu> <74f0ef60d15b6.5527fc46@wiscmail.wisc.edu> <77909799d6c93.5527fc83@wiscmail.wisc.edu> <74d0f69bd796c.5527fcbf@wiscmail.wisc.edu> <75308783d3c84.5527fcfb@wiscmail.wisc.edu> <7530ba90d0977.5527fd74@wiscmail.wisc.edu> <7790a931d1c08.5527b74a@wiscmail.wisc.edu> Message-ID: <87bnipp2cc.fsf@gmail.com> Isaac Schwabacher writes: > ... > > I know that you can do datetime.now(tz), and you can do datetime(2013, > 11, 3, 1, 30, tzinfo=zoneinfo('America/Chicago')), but not being able > to add a time zone to an existing naive datetime is painful (and > strptime doesn't even let you pass in a time zone). `.now(tz)` is correct. `datetime(..., tzinfo=tz)`) is wrong: if tz is a pytz timezone then you may get a wrong tzinfo (LMT), you should use `tz.localize(naive_dt, is_dst=False|True|None)` instead. > ... From ischwabacher at wisc.edu Wed Apr 15 22:35:13 2015 From: ischwabacher at wisc.edu (Isaac Schwabacher) Date: Wed, 15 Apr 2015 15:35:13 -0500 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: <74b09ad111d439.552ecb61@wiscmail.wisc.edu> References: <7550a7f9d5bf8.5527f8bd@wiscmail.wisc.edu> <74f0a5a5d5f7a.5527f8f9@wiscmail.wisc.edu> <77908bbad2606.5527f936@wiscmail.wisc.edu> <7530c0dad0aac.5527f972@wiscmail.wisc.edu> <7620b290d7acb.5527f9af@wiscmail.wisc.edu> <7530a93fd12e8.5527f9eb@wiscmail.wisc.edu> <74409ce3d674f.5527fa27@wiscmail.wisc.edu> <74f0db36d7d6e.5527fa64@wiscmail.wisc.edu> <74e0b8a4d65a3.5527faa0@wiscmail.wisc.edu> <74d0872fd5f5c.5527fadc@wiscmail.wisc.edu> <74d0c93cd5dfa.5527fb19@wiscmail.wisc.edu> <7440a9edd18fd.5527fb91@wiscmail.wisc.edu> <7620e95ed61ae.5527fbce@wiscmail.wisc.edu> <7470e6d9d3a54.5527fc0a@wiscmail.wisc.edu> <74f0ef60d15b6.5527fc46@wiscmail.wisc.edu> <77909799d6c93.5527fc83@wiscmail.wisc.edu> <74d0f69bd796c.5527fcbf@wiscmail.wisc.edu> <75308783d3c84.5527fcfb@wiscmail.wisc.edu> <7530ba90d0977.5527fd74@wiscmail.wisc.edu> <7790a931d1c08.5527b74a@wiscmail.wisc.edu> <87bnipp2cc.fsf@gmail.com> <7100c8e6119092.552ecb25@wiscmail.wisc.edu> <74b09ad111d439.552ecb61@wiscmail.wisc.edu> Message-ID: <74b0db7f11dbfe.552e8531@wiscmail.wisc.edu> On 15-04-15, Akira Li <4kir4.1i at gmail.com> wrote: > Isaac Schwabacher writes: > > ... 
> > > > I know that you can do datetime.now(tz), and you can do datetime(2013, > > 11, 3, 1, 30, tzinfo=zoneinfo('America/Chicago')), but not being able > > to add a time zone to an existing naive datetime is painful (and > > strptime doesn't even let you pass in a time zone). > > `.now(tz)` is correct. `datetime(..., tzinfo=tz)`) is wrong: if tz is a > pytz timezone then you may get a wrong tzinfo (LMT), you should use > `tz.localize(naive_dt, is_dst=False|True|None)` instead. The whole point of this thread is to finalize PEP 431, which fixes the problem for which `localize()` and `normalize()` are workarounds. When this is done, `datetime(..., tzinfo=tz)` will be correct. ijs From 4kir4.1i at gmail.com Wed Apr 15 22:46:21 2015 From: 4kir4.1i at gmail.com (Akira Li) Date: Wed, 15 Apr 2015 23:46:21 +0300 Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones References: Message-ID: <877ftdp1b6.fsf@gmail.com> Alexander Belopolsky writes: > Sorry for a truncated message. Please scroll past the quoted portion. > > On Thu, Apr 9, 2015 at 10:21 PM, Alexander Belopolsky < > alexander.belopolsky at gmail.com> wrote: > >> >> On Thu, Apr 9, 2015 at 4:51 PM, Isaac Schwabacher >> wrote: >> >>> > > > Well, you are right, but at least we do have a localtime utility >>> hidden in the email package: >>> > > > >>> > > > >>> from datetime import * >>> > > > >>> from email.utils import localtime >>> > > > >>> print(localtime(datetime.now())) >>> > > > 2015-04-09 15:19:12.840000-04:00 >>> > > > >>> > > > You can read for the reasons it >>> did not make into datetime. >>> > > >>> > > But that's restricted to the system time zone. Nothing good ever >>> comes from the system time zone... >>> > >>> > Let's solve one problem at a time. ... >>> >>> PEP 431 proposes to import zoneinfo into the stdlib, ... >> >> >> I am changing the subject so that we can focus on one question without >> diverting to PEP-size issues that are better suited for python ideas. >> >> I would like to add a functionality to the datetime module that would >> solve a seemingly simple problem: given a naive datetime instance assumed >> to be in local time, construct the corresponding aware datetime object with >> tzinfo set to an appropriate fixed offset datetime.timezone instance. >> >> Python 3 has this functionality implemented in the email package since >> version 3.3, and it appears to work well even >> in the ambiguous hour >> >> >>> from email.utils import localtime >> >>> from datetime import datetime >> >>> localtime(datetime(2014,11,2,1,30)).strftime('%c %z %Z') >> 'Sun Nov 2 01:30:00 2014 -0400 EDT' >> >>> localtime(datetime(2014,11,2,1,30), isdst=0).strftime('%c %z %Z') >> 'Sun Nov 2 01:30:00 2014 -0500 EST' >> >> However, in a location with a more interesting history, you can get a >> situation that >> > > would look like this in the zoneinfo database: > > $ zdump -v -c 1992 Europe/Kiev > ... > Europe/Kiev Sat Mar 24 22:59:59 1990 UTC = Sun Mar 25 01:59:59 1990 MSK > isdst=0 > Europe/Kiev Sat Mar 24 23:00:00 1990 UTC = Sun Mar 25 03:00:00 1990 MSD > isdst=1 > Europe/Kiev Sat Jun 30 21:59:59 1990 UTC = Sun Jul 1 01:59:59 1990 MSD > isdst=1 > Europe/Kiev Sat Jun 30 22:00:00 1990 UTC = Sun Jul 1 01:00:00 1990 EEST > isdst=1 > Europe/Kiev Sat Sep 28 23:59:59 1991 UTC = Sun Sep 29 02:59:59 1991 EEST > isdst=1 > Europe/Kiev Sun Sep 29 00:00:00 1991 UTC = Sun Sep 29 02:00:00 1991 EET > isdst=0 > ... > > Look what happened on July 1, 1990. At 2 AM, the clocks in Ukraine were > moved back one hour. 
So times like 01:30 AM happened twice there on that > day. Let's see how Python handles this situation > > $ TZ=Europe/Kiev python3 >>>> from email.utils import localtime >>>> from datetime import datetime >>>> localtime(datetime(1990,7,1,1,30)).strftime('%c %z %Z') > 'Sun Jul 1 01:30:00 1990 +0400 MSD' > > So far so good, I've got the first of the two 01:30AM's. But what if I > want the other 01:30AM? Well, > >>>> localtime(datetime(1990,7,1,1,30), isdst=0).strftime('%c %z %Z') > 'Sun Jul 1 01:30:00 1990 +0300 EEST' > > gives me "the other 01:30AM", but it is counter-intuitive: I have to ask > for the standard (winter) time to get the daylight savings (summer) time. > It looks incorrect. Here's the corresponding pytz code: from datetime import datetime import pytz tz = pytz.timezone('Europe/Kiev') print(tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=False).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0300 EEST print(tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=True).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0400 MSD See also "Enhance support for end-of-DST-like ambiguous time" [1] [1] https://bugs.launchpad.net/pytz/+bug/1378150 `email.utils.localtime()` is broken: from datetime import datetime from email.utils import localtime print(localtime(datetime(1990, 7, 1, 1, 30)).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0300 EEST print(localtime(datetime(1990, 7, 1, 1, 30), isdst=0).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0300 EEST print(localtime(datetime(1990, 7, 1, 1, 30), isdst=1).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0300 EEST print(localtime(datetime(1990, 7, 1, 1, 30), isdst=-1).strftime('%c %z %Z')) # -> Sun Jul 1 01:30:00 1990 +0300 EEST Versions: $ ./python -V Python 3.5.0a3+ $ dpkg -s tzdata | grep -i version Version: 2015b-0ubuntu0.14.04 > The uncertainty about how to deal with the repeated hour was the reason why > email.utils.localtime-like interface did not make it to the datetime > module. "repeated hour" (time jumps back) can be treated like a end-of-DST transition, to resolve ambiguities [1]. > The main objection to the isdst flag was that in most situations, > determining whether DST is in effect is as hard as finding the UTC offset, > so reducing the problem of finding the UTC offset to the one of finding the > value for isdst does not solve much. > > I now realize that the problem is simply in the name for the flag. While > we cannot often tell what isdst should be and in some situations the actual > DST status does not differentiate between the two possible times, we can > always say whether we want to get the first or the second time. > > In other words, instead of localtime(dt, isdst=-1), we may want > localtime(dt, which=0) where "which" is used to resolve the ambiguity: > "which=0" means return the first (in UTC order) of the two times and > "which=1" means return the second. (In the non-ambiguous cases "which" is > ignored.) > > An alternative solution would be make localtime(dt) return a list of 0, 1 > or 2 instances, but this will probably make a common usage (the case when > the user does not care which time she gets) more cumbersome. 
From stuart at stuartbishop.net Wed Apr 15 23:28:08 2015 From: stuart at stuartbishop.net (Stuart Bishop) Date: Wed, 15 Apr 2015 23:28:08 +0200 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: References: Message-ID: On 15 April 2015 at 21:51, Lennart Regebro wrote: > On Wed, Apr 15, 2015 at 3:23 PM, Stuart Bishop wrote: > Just punting it to tzinfo to make adjustments, ie effectively just > doing what normalize() does creates infinite recursion as there is > more arithmetic in there, so it's not quite that simple. This sounds familiar. Its infinite recursion if the tzinfo does its calculations using localized datetimes. If the tzinfo is stripped for the calculations, there is no tzinfo to recurse into. At least this was how I hoped it would work, and it sucks if it doesn't. You could be right that using the UTC representation internally for datetimes with a tzinfo makes the most sense. -- Stuart Bishop http://www.stuartbishop.net/ From 4kir4.1i at gmail.com Thu Apr 16 00:02:48 2015 From: 4kir4.1i at gmail.com (Akira Li) Date: Thu, 16 Apr 2015 01:02:48 +0300 Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones References: Message-ID: <87zj69nj7b.fsf@gmail.com> Alexander Belopolsky writes: > ... > For most world locations past discontinuities are fairly well documented > for at least a century and future changes are published with at least 6 > months lead time. It is important to note that the different versions of the tz database may lead to different tzinfo (utc offset, tzname) even for *past* dates. i.e., (lt, tzid, isdst) is not enough because the result for (lt, tzid(2015b), isdst) may be different from (lt, tzid(X), isdst) where lt = local time e.g., naive datetime tzid = timezone from the tz database e.g., Europe/Kiev isdst = a boolean flag for disambiguation X != 2015b In other words, a fixed utc offset might not be sufficient even for past dates. >... > Moreover, a program that rejects invalid times on input, but stores them > for a long time may see its database silently corrupted after a zoneinfo > update. > Now it is time to make specific proposal. I would like to extend > datetime.astimezone() method to work on naive datetime instances. Such > instances will be assumed to be in local time and discontinuities will be > handled as follows: > > > 1. wall(t) == lt has a single solution. This is the trivial case and > lt.astimezone(utc) and lt.astimezone(utc, which=i) for i=0,1 should return > that solution. > > 2. wall(t) == lt has two solutions t1 and t2 such that t1 < t2. In this > case lt.astimezone(utc) == lt.astimezone(utc, which=0) == t1 and > lt.astimezone(utc, which=1) == t2. In pytz terms: `which = not isdst` (end-of-DST-like transition: isdst changes from True to False in the direction of utc time). It resolves AmbiguousTimeError raised by `tz.localize(naive, is_dst=None)`. > 3. wall(t) == lt has no solution. This happens when there is UTC time t0 > such that wall(t0) < lt and wall(t0+epsilon) > lt (a positive discontinuity > at time t0). In this case lt.astimezone(utc) should return t0 + lt - > wall(t0). I.e., we ignore the discontinuity and extend wall(t) linearly > past t0. Obviously, in this case the invariant wall(lt.astimezone(utc)) == > lt won't hold. The "which" flag should be handled as follows: > lt.astimezone(utc) == lt.astimezone(utc, which=0) and lt.astimezone(utc, > which=0) == t0 + lt - wall(t0+eps). It is inconsistent with the previous case: here `which = isdst` but `which = not isdst` above. 
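For concreteness, this is how the case 1/case 2 convention maps onto
the existing pytz API (is_dst=None forces the ambiguity into the open;
is_dst=True/False pick the two readings):

from datetime import datetime
import pytz

tz = pytz.timezone('US/Eastern')
naive = datetime(2002, 10, 27, 1, 30)  # falls in the repeated hour
try:
    tz.localize(naive, is_dst=None)
except pytz.AmbiguousTimeError:
    print(tz.localize(naive, is_dst=True))   # 2002-10-27 01:30:00-04:00 (t1)
    print(tz.localize(naive, is_dst=False))  # 2002-10-27 01:30:00-05:00 (t2)

The earlier time t1 is the is_dst=True reading, so which=0 corresponds
to is_dst=True, i.e. `which = not isdst`.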
`lt.astimezone(utc, which=0) == t0 + lt - wall(t0+eps)` corresponds to: result = tz.normalize(tz.localize(lt, isdst=False)) i.e., `which = isdst` (t0 is at the start of DST and therefore isdst changes from False to True). It resolves NonExistentTimeError raised by `tz.localize(naive, is_dst=None)`. start-of-DST-like transition ("Spring forward"). For example, from datetime import datetime, timedelta import pytz tz = pytz.timezone('America/New_York') # 2am -- non-existent time print(tz.normalize(tz.localize(datetime(2015, 3, 8, 2), is_dst=False))) # -> 2015-03-08 03:00:00-04:00 # after the jump (wall(t0+eps)) print(tz.localize(datetime(2015, 3, 8, 3), is_dst=None)) # -> 2015-03-08 03:00:00-04:00 # same time, unambiguous # 2:01am -- non-existent time print(tz.normalize(tz.localize(datetime(2015, 3, 8, 2, 1), is_dst=False))) # -> 2015-03-08 03:01:00-04:00 print(tz.localize(datetime(2015, 3, 8, 3, 1), is_dst=None)) # -> 2015-03-08 03:01:00-04:00 # same time, unambiguous # 2:59am non-existent time dt = tz.normalize(tz.localize(datetime(2015, 3, 8, 2, 59), is_dst=True)) print(dt) # -> 2015-03-08 01:59:00-05:00 # before the jump (wall(t0-eps)) print(tz.normalize(dt + timedelta(minutes=1))) # -> 2015-03-08 03:00:00-04:00 > With the proposed features in place, one can use the naive code > > t = lt.astimezone(utc) > > and get predictable behavior in all cases and no crashes. > > A more sophisticated program can be written like this: > > t1 = lt.astimezone(utc, which=0) > t2 = lt.astimezone(utc, which=1) > if t1 == t2: > t = t1 > elif t2 > t1: > # ask the user to pick between t1 and t2 or raise > AmbiguousLocalTimeError > else: > t = t1 > # warn the user that time was invalid and changed or raise > InvalidLocalTimeError From alexander.belopolsky at gmail.com Thu Apr 16 00:14:13 2015 From: alexander.belopolsky at gmail.com (Alexander Belopolsky) Date: Wed, 15 Apr 2015 18:14:13 -0400 Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones In-Reply-To: <877ftdp1b6.fsf@gmail.com> References: <877ftdp1b6.fsf@gmail.com> Message-ID: On Wed, Apr 15, 2015 at 4:46 PM, Akira Li <4kir4.1i at gmail.com> wrote: > > Look what happened on July 1, 1990. At 2 AM, the clocks in Ukraine were > > moved back one hour. So times like 01:30 AM happened twice there on that > > day. Let's see how Python handles this situation > > > > $ TZ=Europe/Kiev python3 > >>>> from email.utils import localtime > >>>> from datetime import datetime > >>>> localtime(datetime(1990,7,1,1,30)).strftime('%c %z %Z') > > 'Sun Jul 1 01:30:00 1990 +0400 MSD' > > > > So far so good, I've got the first of the two 01:30AM's. But what if I > > want the other 01:30AM? Well, > > > >>>> localtime(datetime(1990,7,1,1,30), isdst=0).strftime('%c %z %Z') > > 'Sun Jul 1 01:30:00 1990 +0300 EEST' > > > > gives me "the other 01:30AM", but it is counter-intuitive: I have to ask > > for the standard (winter) time to get the daylight savings (summer) > time. > > > > It looks incorrect. 
Here's the corresponding pytz code:
>
> from datetime import datetime
> import pytz
>
> tz = pytz.timezone('Europe/Kiev')
> print(tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=False).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0300 EEST
> print(tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=True).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0400 MSD
>
> See also "Enhance support for end-of-DST-like ambiguous time" [1]
>
> [1] https://bugs.launchpad.net/pytz/+bug/1378150
>
> `email.utils.localtime()` is broken:

If you think there is a bug in email.utils.localtime - please open an
issue at bugs.python.org.

> from datetime import datetime
> from email.utils import localtime
>
> print(localtime(datetime(1990, 7, 1, 1, 30)).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0300 EEST
> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=0).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0300 EEST
> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=1).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0300 EEST
> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=-1).strftime('%c %z %Z'))
> # -> Sun Jul 1 01:30:00 1990 +0300 EEST
>
>
> Versions:
>
> $ ./python -V
> Python 3.5.0a3+
> $ dpkg -s tzdata | grep -i version
> Version: 2015b-0ubuntu0.14.04

> > The uncertainty about how to deal with the repeated hour was the reason why
> > the email.utils.localtime-like interface did not make it to the datetime
> > module.
>
> "repeated hour" (time jumps back) can be treated like an end-of-DST
> transition, to resolve ambiguities [1].

I don't understand what you are complaining about. It is quite
possible that pytz uses the is_dst flag differently from the way
email.utils.localtime uses isdst. I was not able to find a good
description of what is_dst means in pytz, but localtime's isdst is
documented as follows:

a positive or zero value for *isdst* causes localtime to presume
initially that summer time (for example, Daylight Saving Time) is or
is not (respectively) in effect for the specified time.

Can you demonstrate that email.utils.localtime does not behave as
documented?

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From alexander.belopolsky at gmail.com  Thu Apr 16 01:12:55 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Wed, 15 Apr 2015 19:12:55 -0400
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: 
References: 
Message-ID: 

On Wed, Apr 15, 2015 at 5:28 PM, Stuart Bishop  wrote:
>
> On 15 April 2015 at 21:51, Lennart Regebro  wrote:
> > On Wed, Apr 15, 2015 at 3:23 PM, Stuart Bishop  wrote:
>
> > Just punting it to tzinfo to make adjustments, ie effectively just
> > doing what normalize() does creates infinite recursion as there is
> > more arithmetic in there, so it's not quite that simple.
>
> This sounds familiar. Its infinite recursion if the tzinfo does its
> calculations using localized datetimes. If the tzinfo is stripped for
> the calculations, there is no tzinfo to recurse into. At least this
> was how I hoped it would work, and it sucks if it doesn't. You could
> be right that using the UTC representation internally for datetimes
> with a tzinfo makes the most sense.

There is no infinite recursion in the way the datetime module deals
with zone conversions. However, implementors of tzinfo subclasses
often overlook the fact that the datetime module design mandates
specific rules for what utcoffset() should return for the missing and
ambiguous hours.
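The most common way to trip over those rules is to attach a
variable-offset zone straight to the constructor; with pytz this
bypasses the machinery that localize() and fromutc() rely on and falls
back to the zone's earliest (local mean time) offset:

from datetime import datetime
import pytz

tz = pytz.timezone('US/Eastern')

print(datetime(2002, 10, 27, 1, 30, tzinfo=tz))  # constructor: LMT offset
# -> 2002-10-27 01:30:00-04:56

print(tz.localize(datetime(2002, 10, 27, 1, 30), is_dst=False))
# -> 2002-10-27 01:30:00-05:00

This is the "wrong tzinfo (LMT)" trap mentioned earlier in the thread.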
Granted, the relevant section in the manual [1] is not an easy read and in fact for a long time that documentation itself was displaying a buggy implementation of the LocalTimezone class. [2] Understanding how the design works requires a bit of algebra [3], but I strongly recommend that anyone trying to improve the timezones support in the datetime module, print out those 200 lines of comments and go through them with a pencil following the proofs. Note that one of the key assumptions [3.2] in that write-up does not hold in real life. The assumption is that "standard time" offset does not depend on the point in time. However, I do believe that this assumption can be relaxed without invalidating the main result. I believe we can still have unambiguous fromutc() as long as standard time offset does not change "too often." Basically, if we (generously) allow utcoffset to vary from -24h to +24h, then a "sane" zone can be defined as the one where utcoffset changes at most once in any 48 hour period. If I am right about this and the algebra works out, then we don't need to change datetime module design to properly support all world timezones. [1] https://docs.python.org/3/library/datetime.html#datetime.tzinfo.fromutc [2] http://bugs.python.org/issue9063 [3] https://hg.python.org/cpython/file/132b5376bf34/Lib/datetime.py#l1935 [3.2] https://hg.python.org/cpython/file/132b5376bf34/Lib/datetime.py#l1948 -------------- next part -------------- An HTML attachment was scrubbed... URL: From 4kir4.1i at gmail.com Thu Apr 16 00:22:44 2015 From: 4kir4.1i at gmail.com (Akira Li) Date: Thu, 16 Apr 2015 01:22:44 +0300 Subject: [Python-Dev] Status on PEP-431 Timezones In-Reply-To: <74b0db7f11dbfe.552e8531@wiscmail.wisc.edu> (Isaac Schwabacher's message of "Wed, 15 Apr 2015 15:35:13 -0500") References: <7620b290d7acb.5527f9af@wiscmail.wisc.edu> <7530a93fd12e8.5527f9eb@wiscmail.wisc.edu> <74409ce3d674f.5527fa27@wiscmail.wisc.edu> <74f0db36d7d6e.5527fa64@wiscmail.wisc.edu> <74e0b8a4d65a3.5527faa0@wiscmail.wisc.edu> <74d0872fd5f5c.5527fadc@wiscmail.wisc.edu> <74d0c93cd5dfa.5527fb19@wiscmail.wisc.edu> <7440a9edd18fd.5527fb91@wiscmail.wisc.edu> <7620e95ed61ae.5527fbce@wiscmail.wisc.edu> <7470e6d9d3a54.5527fc0a@wiscmail.wisc.edu> <74f0ef60d15b6.5527fc46@wiscmail.wisc.edu> <77909799d6c93.5527fc83@wiscmail.wisc.edu> <74d0f69bd796c.5527fcbf@wiscmail.wisc.edu> <75308783d3c84.5527fcfb@wiscmail.wisc.edu> <7530ba90d0977.5527fd74@wiscmail.wisc.edu> <7790a931d1c08.5527b74a@wiscmail.wisc.edu> <87bnipp2cc.fsf@gmail.com> <7100c8e6119092.552ecb25@wiscmail.wisc.edu> <74b09ad111d439.552ecb61@wiscmail.wisc.edu> <74b0db7f11dbfe.552e8531@wiscmail.wisc.edu> Message-ID: <87vbgxnia3.fsf@gmail.com> Isaac Schwabacher writes: > On 15-04-15, Akira Li <4kir4.1i at gmail.com> wrote: >> Isaac Schwabacher writes: >> > ... >> > >> > I know that you can do datetime.now(tz), and you can do datetime(2013, >> > 11, 3, 1, 30, tzinfo=zoneinfo('America/Chicago')), but not being able >> > to add a time zone to an existing naive datetime is painful (and >> > strptime doesn't even let you pass in a time zone). >> >> `.now(tz)` is correct. `datetime(..., tzinfo=tz)`) is wrong: if tz is a >> pytz timezone then you may get a wrong tzinfo (LMT), you should use >> `tz.localize(naive_dt, is_dst=False|True|None)` instead. > > The whole point of this thread is to finalize PEP 431, which fixes the > problem for which `localize()` and `normalize()` are workarounds. When > this is done, `datetime(..., tzinfo=tz)` will be correct. > > ijs The input time is ambiguous. 
Even if we assume PEP 431 is implemented in some form, your code is
still missing the isdst parameter (or an analog). PEP 431 won't fix
it; it can't resolve the ambiguity by itself. Notice the is_dst
parameter in the `tz.localize()` call (current API).

.now(tz) works even during end-of-DST transitions (current API) when
the local time is ambiguous.

From ischwabacher at wisc.edu  Thu Apr 16 00:36:50 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 15 Apr 2015 17:36:50 -0500
Subject: [Python-Dev] Status on PEP-431 Timezones
In-Reply-To: <76b0faaa11ddf7.552ee7e9@wiscmail.wisc.edu>
References: <7620b290d7acb.5527f9af@wiscmail.wisc.edu>
 <7530a93fd12e8.5527f9eb@wiscmail.wisc.edu>
 <74409ce3d674f.5527fa27@wiscmail.wisc.edu>
 <74f0db36d7d6e.5527fa64@wiscmail.wisc.edu>
 <74e0b8a4d65a3.5527faa0@wiscmail.wisc.edu>
 <74d0872fd5f5c.5527fadc@wiscmail.wisc.edu>
 <74d0c93cd5dfa.5527fb19@wiscmail.wisc.edu>
 <7440a9edd18fd.5527fb91@wiscmail.wisc.edu>
 <7620e95ed61ae.5527fbce@wiscmail.wisc.edu>
 <7470e6d9d3a54.5527fc0a@wiscmail.wisc.edu>
 <74f0ef60d15b6.5527fc46@wiscmail.wisc.edu>
 <77909799d6c93.5527fc83@wiscmail.wisc.edu>
 <74d0f69bd796c.5527fcbf@wiscmail.wisc.edu>
 <75308783d3c84.5527fcfb@wiscmail.wisc.edu>
 <7530ba90d0977.5527fd74@wiscmail.wisc.edu>
 <7790a931d1c08.5527b74a@wiscmail.wisc.edu>
 <87bnipp2cc.fsf@gmail.com>
 <7100c8e6119092.552ecb25@wiscmail.wisc.edu>
 <74b09ad111d439.552ecb61@wiscmail.wisc.edu>
 <74b0db7f11dbfe.552e8531@wiscmail.wisc.edu>
 <87vbgxnia3.fsf@gmail.com>
 <7740b12311d16c.552ee6bc@wiscmail.wisc.edu>
 <7450994e11f719.552ee6f9@wiscmail.wisc.edu>
 <7150bb6211f4ae.552ee735@wiscmail.wisc.edu>
 <7460ca7211e63f.552ee771@wiscmail.wisc.edu>
 <76b0faaa11ddf7.552ee7e9@wiscmail.wisc.edu>
Message-ID: <7150dae811ea69.552ea1b2@wiscmail.wisc.edu>

On 15-04-15, Akira Li <4kir4.1i at gmail.com> wrote:
> Isaac Schwabacher  writes:
>
> > On 15-04-15, Akira Li <4kir4.1i at gmail.com> wrote:
> >> Isaac Schwabacher  writes:
> >> > ...
> >> >
> >> > I know that you can do datetime.now(tz), and you can do datetime(2013,
> >> > 11, 3, 1, 30, tzinfo=zoneinfo('America/Chicago')), but not being able
> >> > to add a time zone to an existing naive datetime is painful (and
> >> > strptime doesn't even let you pass in a time zone).
> >>
> >> `.now(tz)` is correct. `datetime(..., tzinfo=tz)` is wrong: if tz is a
> >> pytz timezone then you may get a wrong tzinfo (LMT), you should use
> >> `tz.localize(naive_dt, is_dst=False|True|None)` instead.
> >
> > The whole point of this thread is to finalize PEP 431, which fixes the
> > problem for which `localize()` and `normalize()` are workarounds. When
> > this is done, `datetime(..., tzinfo=tz)` will be correct.
> >
> > ijs
>
> The input time is ambiguous. Even if we assume PEP 431 is implemented in
> some form, your code is still missing the isdst parameter (or an
> analog). PEP 431 won't fix it; it can't resolve the ambiguity by
> itself. Notice the is_dst parameter in the `tz.localize()` call (current
> API).

...yeah, I forgot to throw that in there. It was supposed to be there all along. Nothing to see here, move along.

> .now(tz) works even during end-of-DST transitions (current API) when the
> local time is ambiguous.

I know that. That's what I was complaining about-- I was trying to talk about how astimezone() was going to be inadequate even after the PEP was implemented because it couldn't turn naive datetimes into aware ones, and people were giving examples that started with aware datetimes generated by now(tz), which completely went around the point I was trying to make.
But it looks like astimezone() is going to grow an is_dst parameter, and everything will be OK. ijs From Steve.Dower at microsoft.com Thu Apr 16 01:51:10 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Wed, 15 Apr 2015 23:51:10 +0000 Subject: [Python-Dev] Python3 Stable ABI In-Reply-To: References: Message-ID: I don't see any obvious issues, but there may be some that don't need to be marked stable. Given that a mismatch here will cause build errors for users, I'm +1 on checking this in. Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Zachary Ware Sent: ?4/?13/?2015 17:29 To: Python-Dev Subject: [Python-Dev] Python3 Stable ABI In issue23903, I've created a script that will produce PC/python3.def by scraping the header files in Include. There are are many many discrepencies between what my script generates and what is currently in the repository (diff below), but in every case I've checked the script has been right: what the script finds is actually exported as part of the limited API, but due to not being in the .def file it's not actually exported from python3.dll. Almost all of the differences are things that the script found that weren't present, but there are a couple things going the other way. The point of this message is to ask everybody who maintains anything in C to take a look through and make sure everything in their area is properly guarded (or not) by Py_LIMITED_API. Alternately, if somebody can find a bug in my script and brain that's finding too much stuff, that would be great too. Ideally, after this is all settled I'd like to add the script to both the Makefile and the Windows build system, such that PC/python3.def is always kept up to date and flags changes that weren't meant to be made. Regards, -- Zach (I'm afraid Gmail might mangle this beyond recognition, you can find the diff at http://bugs.python.org/review/23903/diff/14549/PC/python3.def if it does.) diff -r 24f2c0279120 PC/python3.def --- a/PC/python3.def Mon Apr 13 15:51:59 2015 -0500 +++ b/PC/python3.def Mon Apr 13 16:10:34 2015 -0500 @@ -1,13 +1,15 @@ ; This file specifies the import forwarding for python3.dll ; It is used when building python3dll.vcxproj +; Generated by python3defgen.py, DO NOT modify directly! 
LIBRARY "python3" EXPORTS + PyAST_FromNode=python35.PyAST_FromNode + PyAST_FromNodeObject=python35.PyAST_FromNodeObject + PyAST_Validate=python35.PyAST_Validate PyArg_Parse=python35.PyArg_Parse PyArg_ParseTuple=python35.PyArg_ParseTuple PyArg_ParseTupleAndKeywords=python35.PyArg_ParseTupleAndKeywords PyArg_UnpackTuple=python35.PyArg_UnpackTuple - PyArg_VaParse=python35.PyArg_VaParse - PyArg_VaParseTupleAndKeywords=python35.PyArg_VaParseTupleAndKeywords PyArg_ValidateKeywordArguments=python35.PyArg_ValidateKeywordArguments PyBaseObject_Type=python35.PyBaseObject_Type DATA PyBool_FromLong=python35.PyBool_FromLong @@ -39,7 +41,6 @@ PyCFunction_GetFlags=python35.PyCFunction_GetFlags PyCFunction_GetFunction=python35.PyCFunction_GetFunction PyCFunction_GetSelf=python35.PyCFunction_GetSelf - PyCFunction_New=python35.PyCFunction_New PyCFunction_NewEx=python35.PyCFunction_NewEx PyCFunction_Type=python35.PyCFunction_Type DATA PyCallIter_New=python35.PyCallIter_New @@ -58,6 +59,7 @@ PyCapsule_SetPointer=python35.PyCapsule_SetPointer PyCapsule_Type=python35.PyCapsule_Type DATA PyClassMethodDescr_Type=python35.PyClassMethodDescr_Type DATA + PyCmpWrapper_Type=python35.PyCmpWrapper_Type DATA PyCodec_BackslashReplaceErrors=python35.PyCodec_BackslashReplaceErrors PyCodec_Decode=python35.PyCodec_Decode PyCodec_Decoder=python35.PyCodec_Decoder @@ -68,6 +70,7 @@ PyCodec_IncrementalEncoder=python35.PyCodec_IncrementalEncoder PyCodec_KnownEncoding=python35.PyCodec_KnownEncoding PyCodec_LookupError=python35.PyCodec_LookupError + PyCodec_NameReplaceErrors=python35.PyCodec_NameReplaceErrors PyCodec_Register=python35.PyCodec_Register PyCodec_RegisterError=python35.PyCodec_RegisterError PyCodec_ReplaceErrors=python35.PyCodec_ReplaceErrors @@ -122,6 +125,7 @@ PyErr_Fetch=python35.PyErr_Fetch PyErr_Format=python35.PyErr_Format PyErr_FormatV=python35.PyErr_FormatV + PyErr_GetExcInfo=python35.PyErr_GetExcInfo PyErr_GivenExceptionMatches=python35.PyErr_GivenExceptionMatches PyErr_NewException=python35.PyErr_NewException PyErr_NewExceptionWithDoc=python35.PyErr_NewExceptionWithDoc @@ -132,14 +136,25 @@ PyErr_PrintEx=python35.PyErr_PrintEx PyErr_ProgramText=python35.PyErr_ProgramText PyErr_Restore=python35.PyErr_Restore + PyErr_SetExcFromWindowsErr=python35.PyErr_SetExcFromWindowsErr + PyErr_SetExcFromWindowsErrWithFilename=python35.PyErr_SetExcFromWindowsErrWithFilename + PyErr_SetExcFromWindowsErrWithFilenameObject=python35.PyErr_SetExcFromWindowsErrWithFilenameObject + PyErr_SetExcFromWindowsErrWithFilenameObjects=python35.PyErr_SetExcFromWindowsErrWithFilenameObjects + PyErr_SetExcInfo=python35.PyErr_SetExcInfo + PyErr_SetExcWithArgsKwargs=python35.PyErr_SetExcWithArgsKwargs PyErr_SetFromErrno=python35.PyErr_SetFromErrno PyErr_SetFromErrnoWithFilename=python35.PyErr_SetFromErrnoWithFilename PyErr_SetFromErrnoWithFilenameObject=python35.PyErr_SetFromErrnoWithFilenameObject + PyErr_SetFromErrnoWithFilenameObjects=python35.PyErr_SetFromErrnoWithFilenameObjects + PyErr_SetFromWindowsErr=python35.PyErr_SetFromWindowsErr + PyErr_SetFromWindowsErrWithFilename=python35.PyErr_SetFromWindowsErrWithFilename + PyErr_SetImportError=python35.PyErr_SetImportError PyErr_SetInterrupt=python35.PyErr_SetInterrupt PyErr_SetNone=python35.PyErr_SetNone PyErr_SetObject=python35.PyErr_SetObject PyErr_SetString=python35.PyErr_SetString PyErr_SyntaxLocation=python35.PyErr_SyntaxLocation + PyErr_SyntaxLocationEx=python35.PyErr_SyntaxLocationEx PyErr_WarnEx=python35.PyErr_WarnEx PyErr_WarnExplicit=python35.PyErr_WarnExplicit 
PyErr_WarnFormat=python35.PyErr_WarnFormat @@ -171,12 +186,21 @@ PyExc_AssertionError=python35.PyExc_AssertionError DATA PyExc_AttributeError=python35.PyExc_AttributeError DATA PyExc_BaseException=python35.PyExc_BaseException DATA + PyExc_BlockingIOError=python35.PyExc_BlockingIOError DATA + PyExc_BrokenPipeError=python35.PyExc_BrokenPipeError DATA PyExc_BufferError=python35.PyExc_BufferError DATA PyExc_BytesWarning=python35.PyExc_BytesWarning DATA + PyExc_ChildProcessError=python35.PyExc_ChildProcessError DATA + PyExc_ConnectionAbortedError=python35.PyExc_ConnectionAbortedError DATA + PyExc_ConnectionError=python35.PyExc_ConnectionError DATA + PyExc_ConnectionRefusedError=python35.PyExc_ConnectionRefusedError DATA + PyExc_ConnectionResetError=python35.PyExc_ConnectionResetError DATA PyExc_DeprecationWarning=python35.PyExc_DeprecationWarning DATA PyExc_EOFError=python35.PyExc_EOFError DATA PyExc_EnvironmentError=python35.PyExc_EnvironmentError DATA PyExc_Exception=python35.PyExc_Exception DATA + PyExc_FileExistsError=python35.PyExc_FileExistsError DATA + PyExc_FileNotFoundError=python35.PyExc_FileNotFoundError DATA PyExc_FloatingPointError=python35.PyExc_FloatingPointError DATA PyExc_FutureWarning=python35.PyExc_FutureWarning DATA PyExc_GeneratorExit=python35.PyExc_GeneratorExit DATA @@ -185,18 +209,23 @@ PyExc_ImportWarning=python35.PyExc_ImportWarning DATA PyExc_IndentationError=python35.PyExc_IndentationError DATA PyExc_IndexError=python35.PyExc_IndexError DATA + PyExc_InterruptedError=python35.PyExc_InterruptedError DATA + PyExc_IsADirectoryError=python35.PyExc_IsADirectoryError DATA PyExc_KeyError=python35.PyExc_KeyError DATA PyExc_KeyboardInterrupt=python35.PyExc_KeyboardInterrupt DATA PyExc_LookupError=python35.PyExc_LookupError DATA PyExc_MemoryError=python35.PyExc_MemoryError DATA - PyExc_MemoryErrorInst=python35.PyExc_MemoryErrorInst DATA PyExc_NameError=python35.PyExc_NameError DATA + PyExc_NotADirectoryError=python35.PyExc_NotADirectoryError DATA PyExc_NotImplementedError=python35.PyExc_NotImplementedError DATA PyExc_OSError=python35.PyExc_OSError DATA PyExc_OverflowError=python35.PyExc_OverflowError DATA PyExc_PendingDeprecationWarning=python35.PyExc_PendingDeprecationWarning DATA + PyExc_PermissionError=python35.PyExc_PermissionError DATA + PyExc_ProcessLookupError=python35.PyExc_ProcessLookupError DATA PyExc_RecursionErrorInst=python35.PyExc_RecursionErrorInst DATA PyExc_ReferenceError=python35.PyExc_ReferenceError DATA + PyExc_ResourceWarning=python35.PyExc_ResourceWarning DATA PyExc_RuntimeError=python35.PyExc_RuntimeError DATA PyExc_RuntimeWarning=python35.PyExc_RuntimeWarning DATA PyExc_StopIteration=python35.PyExc_StopIteration DATA @@ -205,6 +234,7 @@ PyExc_SystemError=python35.PyExc_SystemError DATA PyExc_SystemExit=python35.PyExc_SystemExit DATA PyExc_TabError=python35.PyExc_TabError DATA + PyExc_TimeoutError=python35.PyExc_TimeoutError DATA PyExc_TypeError=python35.PyExc_TypeError DATA PyExc_UnboundLocalError=python35.PyExc_UnboundLocalError DATA PyExc_UnicodeDecodeError=python35.PyExc_UnicodeDecodeError DATA @@ -215,6 +245,7 @@ PyExc_UserWarning=python35.PyExc_UserWarning DATA PyExc_ValueError=python35.PyExc_ValueError DATA PyExc_Warning=python35.PyExc_Warning DATA + PyExc_WindowsError=python35.PyExc_WindowsError DATA PyExc_ZeroDivisionError=python35.PyExc_ZeroDivisionError DATA PyException_GetCause=python35.PyException_GetCause PyException_GetContext=python35.PyException_GetContext @@ -242,10 +273,12 @@ PyGILState_Release=python35.PyGILState_Release 
PyGetSetDescr_Type=python35.PyGetSetDescr_Type DATA PyImport_AddModule=python35.PyImport_AddModule + PyImport_AddModuleObject=python35.PyImport_AddModuleObject PyImport_AppendInittab=python35.PyImport_AppendInittab PyImport_Cleanup=python35.PyImport_Cleanup PyImport_ExecCodeModule=python35.PyImport_ExecCodeModule PyImport_ExecCodeModuleEx=python35.PyImport_ExecCodeModuleEx + PyImport_ExecCodeModuleObject=python35.PyImport_ExecCodeModuleObject PyImport_ExecCodeModuleWithPathnames=python35.PyImport_ExecCodeModuleWithPathnames PyImport_GetImporter=python35.PyImport_GetImporter PyImport_GetMagicNumber=python35.PyImport_GetMagicNumber @@ -253,8 +286,10 @@ PyImport_GetModuleDict=python35.PyImport_GetModuleDict PyImport_Import=python35.PyImport_Import PyImport_ImportFrozenModule=python35.PyImport_ImportFrozenModule + PyImport_ImportFrozenModuleObject=python35.PyImport_ImportFrozenModuleObject PyImport_ImportModule=python35.PyImport_ImportModule PyImport_ImportModuleLevel=python35.PyImport_ImportModuleLevel + PyImport_ImportModuleLevelObject=python35.PyImport_ImportModuleLevelObject PyImport_ImportModuleNoBlock=python35.PyImport_ImportModuleNoBlock PyImport_ReloadModule=python35.PyImport_ReloadModule PyInterpreterState_Clear=python35.PyInterpreterState_Clear @@ -310,10 +345,18 @@ PyMapping_SetItemString=python35.PyMapping_SetItemString PyMapping_Size=python35.PyMapping_Size PyMapping_Values=python35.PyMapping_Values + PyMarshal_ReadObjectFromString=python35.PyMarshal_ReadObjectFromString + PyMarshal_WriteLongToFile=python35.PyMarshal_WriteLongToFile + PyMarshal_WriteObjectToFile=python35.PyMarshal_WriteObjectToFile + PyMarshal_WriteObjectToString=python35.PyMarshal_WriteObjectToString + PyMem_Calloc=python35.PyMem_Calloc PyMem_Free=python35.PyMem_Free PyMem_Malloc=python35.PyMem_Malloc PyMem_Realloc=python35.PyMem_Realloc PyMemberDescr_Type=python35.PyMemberDescr_Type DATA + PyMember_GetOne=python35.PyMember_GetOne + PyMember_SetOne=python35.PyMember_SetOne + PyMemoryView_FromMemory=python35.PyMemoryView_FromMemory PyMemoryView_FromObject=python35.PyMemoryView_FromObject PyMemoryView_GetContiguous=python35.PyMemoryView_GetContiguous PyMemoryView_Type=python35.PyMemoryView_Type DATA @@ -327,9 +370,15 @@ PyModule_GetFilename=python35.PyModule_GetFilename PyModule_GetFilenameObject=python35.PyModule_GetFilenameObject PyModule_GetName=python35.PyModule_GetName + PyModule_GetNameObject=python35.PyModule_GetNameObject PyModule_GetState=python35.PyModule_GetState PyModule_New=python35.PyModule_New + PyModule_NewObject=python35.PyModule_NewObject PyModule_Type=python35.PyModule_Type DATA + PyNode_AddChild=python35.PyNode_AddChild + PyNode_Free=python35.PyNode_Free + PyNode_ListTree=python35.PyNode_ListTree + PyNode_New=python35.PyNode_New PyNullImporter_Type=python35.PyNullImporter_Type DATA PyNumber_Absolute=python35.PyNumber_Absolute PyNumber_Add=python35.PyNumber_Add @@ -343,6 +392,7 @@ PyNumber_InPlaceAnd=python35.PyNumber_InPlaceAnd PyNumber_InPlaceFloorDivide=python35.PyNumber_InPlaceFloorDivide PyNumber_InPlaceLshift=python35.PyNumber_InPlaceLshift + PyNumber_InPlaceMatrixMultiply=python35.PyNumber_InPlaceMatrixMultiply PyNumber_InPlaceMultiply=python35.PyNumber_InPlaceMultiply PyNumber_InPlaceOr=python35.PyNumber_InPlaceOr PyNumber_InPlacePower=python35.PyNumber_InPlacePower @@ -355,6 +405,7 @@ PyNumber_Invert=python35.PyNumber_Invert PyNumber_Long=python35.PyNumber_Long PyNumber_Lshift=python35.PyNumber_Lshift + PyNumber_MatrixMultiply=python35.PyNumber_MatrixMultiply 
PyNumber_Multiply=python35.PyNumber_Multiply PyNumber_Negative=python35.PyNumber_Negative PyNumber_Or=python35.PyNumber_Or @@ -367,6 +418,7 @@ PyNumber_TrueDivide=python35.PyNumber_TrueDivide PyNumber_Xor=python35.PyNumber_Xor PyOS_AfterFork=python35.PyOS_AfterFork + PyOS_CheckStack=python35.PyOS_CheckStack PyOS_InitInterrupts=python35.PyOS_InitInterrupts PyOS_InputHook=python35.PyOS_InputHook DATA PyOS_InterruptOccurred=python35.PyOS_InterruptOccurred @@ -393,6 +445,7 @@ PyObject_CallMethod=python35.PyObject_CallMethod PyObject_CallMethodObjArgs=python35.PyObject_CallMethodObjArgs PyObject_CallObject=python35.PyObject_CallObject + PyObject_Calloc=python35.PyObject_Calloc PyObject_CheckReadBuffer=python35.PyObject_CheckReadBuffer PyObject_ClearWeakRefs=python35.PyObject_ClearWeakRefs PyObject_DelItem=python35.PyObject_DelItem @@ -405,6 +458,7 @@ PyObject_GC_UnTrack=python35.PyObject_GC_UnTrack PyObject_GenericGetAttr=python35.PyObject_GenericGetAttr PyObject_GenericSetAttr=python35.PyObject_GenericSetAttr + PyObject_GenericSetDict=python35.PyObject_GenericSetDict PyObject_GetAttr=python35.PyObject_GetAttr PyObject_GetAttrString=python35.PyObject_GetAttrString PyObject_GetItem=python35.PyObject_GetItem @@ -431,9 +485,10 @@ PyObject_SetItem=python35.PyObject_SetItem PyObject_Size=python35.PyObject_Size PyObject_Str=python35.PyObject_Str - PyObject_Type=python35.PyObject_Type DATA + PyObject_Type=python35.PyObject_Type PyParser_SimpleParseFileFlags=python35.PyParser_SimpleParseFileFlags PyParser_SimpleParseStringFlags=python35.PyParser_SimpleParseStringFlags + PyParser_SimpleParseStringFlagsFilename=python35.PyParser_SimpleParseStringFlagsFilename PyProperty_Type=python35.PyProperty_Type DATA PyRangeIter_Type=python35.PyRangeIter_Type DATA PyRange_Type=python35.PyRange_Type DATA @@ -474,8 +529,8 @@ PySlice_New=python35.PySlice_New PySlice_Type=python35.PySlice_Type DATA PySortWrapper_Type=python35.PySortWrapper_Type DATA + PyState_AddModule=python35.PyState_AddModule PyState_FindModule=python35.PyState_FindModule - PyState_AddModule=python35.PyState_AddModule PyState_RemoveModule=python35.PyState_RemoveModule PyStructSequence_GetItem=python35.PyStructSequence_GetItem PyStructSequence_New=python35.PyStructSequence_New @@ -484,9 +539,11 @@ PySuper_Type=python35.PySuper_Type DATA PySys_AddWarnOption=python35.PySys_AddWarnOption PySys_AddWarnOptionUnicode=python35.PySys_AddWarnOptionUnicode + PySys_AddXOption=python35.PySys_AddXOption PySys_FormatStderr=python35.PySys_FormatStderr PySys_FormatStdout=python35.PySys_FormatStdout PySys_GetObject=python35.PySys_GetObject + PySys_GetXOptions=python35.PySys_GetXOptions PySys_HasWarnOptions=python35.PySys_HasWarnOptions PySys_ResetWarnOptions=python35.PySys_ResetWarnOptions PySys_SetArgv=python35.PySys_SetArgv @@ -503,6 +560,24 @@ PyThreadState_New=python35.PyThreadState_New PyThreadState_SetAsyncExc=python35.PyThreadState_SetAsyncExc PyThreadState_Swap=python35.PyThreadState_Swap + PyThread_GetInfo=python35.PyThread_GetInfo + PyThread_ReInitTLS=python35.PyThread_ReInitTLS + PyThread_acquire_lock=python35.PyThread_acquire_lock + PyThread_acquire_lock_timed=python35.PyThread_acquire_lock_timed + PyThread_allocate_lock=python35.PyThread_allocate_lock + PyThread_create_key=python35.PyThread_create_key + PyThread_delete_key=python35.PyThread_delete_key + PyThread_delete_key_value=python35.PyThread_delete_key_value + PyThread_exit_thread=python35.PyThread_exit_thread + PyThread_free_lock=python35.PyThread_free_lock + 
PyThread_get_key_value=python35.PyThread_get_key_value + PyThread_get_stacksize=python35.PyThread_get_stacksize + PyThread_get_thread_ident=python35.PyThread_get_thread_ident + PyThread_init_thread=python35.PyThread_init_thread + PyThread_release_lock=python35.PyThread_release_lock + PyThread_set_key_value=python35.PyThread_set_key_value + PyThread_set_stacksize=python35.PyThread_set_stacksize + PyThread_start_new_thread=python35.PyThread_start_new_thread PyTraceBack_Here=python35.PyTraceBack_Here PyTraceBack_Print=python35.PyTraceBack_Print PyTraceBack_Type=python35.PyTraceBack_Type DATA @@ -561,34 +636,51 @@ PyUnicode_AsEncodedString=python35.PyUnicode_AsEncodedString PyUnicode_AsEncodedUnicode=python35.PyUnicode_AsEncodedUnicode PyUnicode_AsLatin1String=python35.PyUnicode_AsLatin1String + PyUnicode_AsMBCSString=python35.PyUnicode_AsMBCSString PyUnicode_AsRawUnicodeEscapeString=python35.PyUnicode_AsRawUnicodeEscapeString + PyUnicode_AsUCS4=python35.PyUnicode_AsUCS4 + PyUnicode_AsUCS4Copy=python35.PyUnicode_AsUCS4Copy PyUnicode_AsUTF16String=python35.PyUnicode_AsUTF16String PyUnicode_AsUTF32String=python35.PyUnicode_AsUTF32String PyUnicode_AsUTF8String=python35.PyUnicode_AsUTF8String PyUnicode_AsUnicodeEscapeString=python35.PyUnicode_AsUnicodeEscapeString PyUnicode_AsWideChar=python35.PyUnicode_AsWideChar - PyUnicode_ClearFreelist=python35.PyUnicode_ClearFreelist + PyUnicode_AsWideCharString=python35.PyUnicode_AsWideCharString + PyUnicode_BuildEncodingMap=python35.PyUnicode_BuildEncodingMap + PyUnicode_ClearFreeList=python35.PyUnicode_ClearFreeList PyUnicode_Compare=python35.PyUnicode_Compare + PyUnicode_CompareWithASCIIString=python35.PyUnicode_CompareWithASCIIString PyUnicode_Concat=python35.PyUnicode_Concat PyUnicode_Contains=python35.PyUnicode_Contains PyUnicode_Count=python35.PyUnicode_Count PyUnicode_Decode=python35.PyUnicode_Decode PyUnicode_DecodeASCII=python35.PyUnicode_DecodeASCII PyUnicode_DecodeCharmap=python35.PyUnicode_DecodeCharmap + PyUnicode_DecodeCodePageStateful=python35.PyUnicode_DecodeCodePageStateful PyUnicode_DecodeFSDefault=python35.PyUnicode_DecodeFSDefault PyUnicode_DecodeFSDefaultAndSize=python35.PyUnicode_DecodeFSDefaultAndSize PyUnicode_DecodeLatin1=python35.PyUnicode_DecodeLatin1 + PyUnicode_DecodeLocale=python35.PyUnicode_DecodeLocale + PyUnicode_DecodeLocaleAndSize=python35.PyUnicode_DecodeLocaleAndSize + PyUnicode_DecodeMBCS=python35.PyUnicode_DecodeMBCS + PyUnicode_DecodeMBCSStateful=python35.PyUnicode_DecodeMBCSStateful PyUnicode_DecodeRawUnicodeEscape=python35.PyUnicode_DecodeRawUnicodeEscape PyUnicode_DecodeUTF16=python35.PyUnicode_DecodeUTF16 PyUnicode_DecodeUTF16Stateful=python35.PyUnicode_DecodeUTF16Stateful PyUnicode_DecodeUTF32=python35.PyUnicode_DecodeUTF32 PyUnicode_DecodeUTF32Stateful=python35.PyUnicode_DecodeUTF32Stateful + PyUnicode_DecodeUTF7=python35.PyUnicode_DecodeUTF7 + PyUnicode_DecodeUTF7Stateful=python35.PyUnicode_DecodeUTF7Stateful PyUnicode_DecodeUTF8=python35.PyUnicode_DecodeUTF8 PyUnicode_DecodeUTF8Stateful=python35.PyUnicode_DecodeUTF8Stateful PyUnicode_DecodeUnicodeEscape=python35.PyUnicode_DecodeUnicodeEscape + PyUnicode_EncodeCodePage=python35.PyUnicode_EncodeCodePage + PyUnicode_EncodeFSDefault=python35.PyUnicode_EncodeFSDefault + PyUnicode_EncodeLocale=python35.PyUnicode_EncodeLocale PyUnicode_FSConverter=python35.PyUnicode_FSConverter PyUnicode_FSDecoder=python35.PyUnicode_FSDecoder PyUnicode_Find=python35.PyUnicode_Find + PyUnicode_FindChar=python35.PyUnicode_FindChar PyUnicode_Format=python35.PyUnicode_Format 
PyUnicode_FromEncodedObject=python35.PyUnicode_FromEncodedObject PyUnicode_FromFormat=python35.PyUnicode_FromFormat @@ -599,30 +691,28 @@ PyUnicode_FromStringAndSize=python35.PyUnicode_FromStringAndSize PyUnicode_FromWideChar=python35.PyUnicode_FromWideChar PyUnicode_GetDefaultEncoding=python35.PyUnicode_GetDefaultEncoding + PyUnicode_GetLength=python35.PyUnicode_GetLength PyUnicode_GetSize=python35.PyUnicode_GetSize + PyUnicode_InternFromString=python35.PyUnicode_InternFromString + PyUnicode_InternImmortal=python35.PyUnicode_InternImmortal + PyUnicode_InternInPlace=python35.PyUnicode_InternInPlace PyUnicode_IsIdentifier=python35.PyUnicode_IsIdentifier PyUnicode_Join=python35.PyUnicode_Join PyUnicode_Partition=python35.PyUnicode_Partition PyUnicode_RPartition=python35.PyUnicode_RPartition PyUnicode_RSplit=python35.PyUnicode_RSplit + PyUnicode_ReadChar=python35.PyUnicode_ReadChar PyUnicode_Replace=python35.PyUnicode_Replace PyUnicode_Resize=python35.PyUnicode_Resize PyUnicode_RichCompare=python35.PyUnicode_RichCompare - PyUnicode_SetDefaultEncoding=python35.PyUnicode_SetDefaultEncoding PyUnicode_Split=python35.PyUnicode_Split PyUnicode_Splitlines=python35.PyUnicode_Splitlines + PyUnicode_Substring=python35.PyUnicode_Substring PyUnicode_Tailmatch=python35.PyUnicode_Tailmatch PyUnicode_Translate=python35.PyUnicode_Translate - PyUnicode_BuildEncodingMap=python35.PyUnicode_BuildEncodingMap - PyUnicode_CompareWithASCIIString=python35.PyUnicode_CompareWithASCIIString - PyUnicode_DecodeUTF7=python35.PyUnicode_DecodeUTF7 - PyUnicode_DecodeUTF7Stateful=python35.PyUnicode_DecodeUTF7Stateful - PyUnicode_EncodeFSDefault=python35.PyUnicode_EncodeFSDefault - PyUnicode_InternFromString=python35.PyUnicode_InternFromString - PyUnicode_InternImmortal=python35.PyUnicode_InternImmortal - PyUnicode_InternInPlace=python35.PyUnicode_InternInPlace PyUnicode_Type=python35.PyUnicode_Type DATA - PyWeakref_GetObject=python35.PyWeakref_GetObject DATA + PyUnicode_WriteChar=python35.PyUnicode_WriteChar + PyWeakref_GetObject=python35.PyWeakref_GetObject PyWeakref_NewProxy=python35.PyWeakref_NewProxy PyWeakref_NewRef=python35.PyWeakref_NewRef PyWrapperDescr_Type=python35.PyWrapperDescr_Type DATA @@ -633,6 +723,8 @@ Py_BuildValue=python35.Py_BuildValue Py_CompileString=python35.Py_CompileString Py_DecRef=python35.Py_DecRef + Py_DecodeLocale=python35.Py_DecodeLocale + Py_EncodeLocale=python35.Py_EncodeLocale Py_EndInterpreter=python35.Py_EndInterpreter Py_Exit=python35.Py_Exit Py_FatalError=python35.Py_FatalError @@ -660,44 +752,95 @@ Py_NewInterpreter=python35.Py_NewInterpreter Py_ReprEnter=python35.Py_ReprEnter Py_ReprLeave=python35.Py_ReprLeave + Py_SetPath=python35.Py_SetPath Py_SetProgramName=python35.Py_SetProgramName Py_SetPythonHome=python35.Py_SetPythonHome Py_SetRecursionLimit=python35.Py_SetRecursionLimit Py_SymtableString=python35.Py_SymtableString Py_VaBuildValue=python35.Py_VaBuildValue + Py_hexdigits=python35.Py_hexdigits DATA + _PyDebug_PrintTotalRefs=python35._PyDebug_PrintTotalRefs + _PyDict_Dummy=python35._PyDict_Dummy + _PyDict_GetItemId=python35._PyDict_GetItemId + _PyDict_GetItemIdWithError=python35._PyDict_GetItemIdWithError + _PyDict_SetItemId=python35._PyDict_SetItemId _PyErr_BadInternalCall=python35._PyErr_BadInternalCall + _PyEval_FiniThreads=python35._PyEval_FiniThreads + _PyGILState_Reinit=python35._PyGILState_Reinit + _PyImportZip_Init=python35._PyImportZip_Init + _PyMethodWrapper_Type=python35._PyMethodWrapper_Type DATA + _PyNamespace_New=python35._PyNamespace_New + 
_PyNamespace_Type=python35._PyNamespace_Type DATA + _PyNone_Type=python35._PyNone_Type DATA + _PyNotImplemented_Type=python35._PyNotImplemented_Type DATA + _PyOS_GetOpt=python35._PyOS_GetOpt + _PyOS_IsMainThread=python35._PyOS_IsMainThread + _PyOS_SigintEvent=python35._PyOS_SigintEvent _PyObject_CallFunction_SizeT=python35._PyObject_CallFunction_SizeT + _PyObject_CallMethodId=python35._PyObject_CallMethodId + _PyObject_CallMethodIdObjArgs=python35._PyObject_CallMethodIdObjArgs + _PyObject_CallMethodId_SizeT=python35._PyObject_CallMethodId_SizeT _PyObject_CallMethod_SizeT=python35._PyObject_CallMethod_SizeT + _PyObject_GC_Calloc=python35._PyObject_GC_Calloc _PyObject_GC_Malloc=python35._PyObject_GC_Malloc _PyObject_GC_New=python35._PyObject_GC_New _PyObject_GC_NewVar=python35._PyObject_GC_NewVar _PyObject_GC_Resize=python35._PyObject_GC_Resize + _PyObject_GetAttrId=python35._PyObject_GetAttrId + _PyObject_HasAttrId=python35._PyObject_HasAttrId + _PyObject_IsAbstract=python35._PyObject_IsAbstract _PyObject_New=python35._PyObject_New _PyObject_NewVar=python35._PyObject_NewVar + _PyObject_SetAttrId=python35._PyObject_SetAttrId _PyState_AddModule=python35._PyState_AddModule + _PySys_SetObjectId=python35._PySys_SetObjectId + _PyThreadState_DeleteExcept=python35._PyThreadState_DeleteExcept _PyThreadState_Init=python35._PyThreadState_Init _PyThreadState_Prealloc=python35._PyThreadState_Prealloc _PyTrash_delete_later=python35._PyTrash_delete_later DATA _PyTrash_delete_nesting=python35._PyTrash_delete_nesting DATA _PyTrash_deposit_object=python35._PyTrash_deposit_object _PyTrash_destroy_chain=python35._PyTrash_destroy_chain + _PyTrash_thread_deposit_object=python35._PyTrash_thread_deposit_object + _PyTrash_thread_destroy_chain=python35._PyTrash_thread_destroy_chain + _PyUnicode_ClearStaticStrings=python35._PyUnicode_ClearStaticStrings + _PyUnicode_FromId=python35._PyUnicode_FromId _PyWeakref_CallableProxyType=python35._PyWeakref_CallableProxyType DATA _PyWeakref_ProxyType=python35._PyWeakref_ProxyType DATA _PyWeakref_RefType=python35._PyWeakref_RefType DATA + _Py_AddToAllObjects=python35._Py_AddToAllObjects _Py_BuildValue_SizeT=python35._Py_BuildValue_SizeT _Py_CheckRecursionLimit=python35._Py_CheckRecursionLimit DATA _Py_CheckRecursiveCall=python35._Py_CheckRecursiveCall _Py_Dealloc=python35._Py_Dealloc + _Py_DumpTraceback=python35._Py_DumpTraceback DATA + _Py_DumpTracebackThreads=python35._Py_DumpTracebackThreads DATA _Py_EllipsisObject=python35._Py_EllipsisObject DATA _Py_FalseStruct=python35._Py_FalseStruct DATA + _Py_ForgetReference=python35._Py_ForgetReference + _Py_GetAllocatedBlocks=python35._Py_GetAllocatedBlocks + _Py_GetRefTotal=python35._Py_GetRefTotal + _Py_HashSecret_Initialized=python35._Py_HashSecret_Initialized DATA + _Py_NegativeRefcount=python35._Py_NegativeRefcount + _Py_NewReference=python35._Py_NewReference _Py_NoneStruct=python35._Py_NoneStruct DATA _Py_NotImplementedStruct=python35._Py_NotImplementedStruct DATA + _Py_PrintReferenceAddresses=python35._Py_PrintReferenceAddresses + _Py_PrintReferences=python35._Py_PrintReferences + _Py_RefTotal=python35._Py_RefTotal DATA _Py_SwappedOp=python35._Py_SwappedOp DATA - _Py_TrueStruct=python35._Py_TrueStruct DATA _Py_VaBuildValue_SizeT=python35._Py_VaBuildValue_SizeT - _PyArg_Parse_SizeT=python35._PyArg_Parse_SizeT - _PyArg_ParseTuple_SizeT=python35._PyArg_ParseTuple_SizeT - _PyArg_ParseTupleAndKeywords_SizeT=python35._PyArg_ParseTupleAndKeywords_SizeT - _PyArg_VaParse_SizeT=python35._PyArg_VaParse_SizeT - 
_PyArg_VaParseTupleAndKeywords_SizeT=python35._PyArg_VaParseTupleAndKeywords_SizeT - _Py_BuildValue_SizeT=python35._Py_BuildValue_SizeT + _Py_add_one_to_index_C=python35._Py_add_one_to_index_C + _Py_add_one_to_index_F=python35._Py_add_one_to_index_F + _Py_device_encoding=python35._Py_device_encoding + _Py_fopen=python35._Py_fopen + _Py_fopen_obj=python35._Py_fopen_obj + _Py_read=python35._Py_read + _Py_stat=python35._Py_stat + _Py_wfopen=python35._Py_wfopen + _Py_wgetcwd=python35._Py_wgetcwd + _Py_wreadlink=python35._Py_wreadlink + _Py_wrealpath=python35._Py_wrealpath + _Py_write=python35._Py_write + _Py_write_noraise=python35._Py_write_noraise _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40microsoft.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From martin at v.loewis.de Thu Apr 16 21:34:42 2015 From: martin at v.loewis.de (=?windows-1252?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 16 Apr 2015 21:34:42 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <55204174.9010404@egenix.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <1428174125.3954963.249326869.58843520@webmail.messagingengine.com> <55203CED.5030609@egenix.com> <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com> <55204174.9010404@egenix.com> Message-ID: <55300ED2.8060208@v.loewis.de> Am 04.04.15 um 21:54 schrieb M.-A. Lemburg: >>> FWIW: The PSF mostly uses StartSSL nowadays and they also support code >>> signing certificates. Given that this option is a lot cheaper than >>> Verisign, I think we should switch, unless there are significant >>> reasons not to. We should revisit this in 2017. >> >> Agree - apparently the starlssl process for getting a signing cert is >> complex/obscure, so we should start early. > > Not really. Once you have the org verification it's really easy. > >> Let me know if I can help providing PSF organization verification. > > I already completed that for the current cycle. > I had asked the PSF for a StartSSL certificate when the previous certificate expired, and the PSF was not able to provide one. After waiting several weeks for the PSF to provide the certificate, Kurt then kindly went to Verisign. Kind regards, Martin From martin at v.loewis.de Thu Apr 16 21:38:47 2015 From: martin at v.loewis.de (=?windows-1252?Q?=22Martin_v=2E_L=F6wis=22?=) Date: Thu, 16 Apr 2015 21:38:47 +0200 Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG? In-Reply-To: <1428208996091.85387@microsoft.com> References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com> <551F1650.8070808@egenix.com> <20150404121023.3fefee2f@limelight.wooz.org> , <1428208996091.85387@microsoft.com> Message-ID: <55300FC7.2080303@v.loewis.de> Am 05.04.15 um 06:43 schrieb Steve Dower: > Now I just have to find the time to learn how to use it... I always sign with Kleopatra on Windows. It's really simple: just drag all files you want to sign onto it, configure "detached" signatures, and it will place the signature next to the original file. 
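For readers without Kleopatra, a minimal sketch of the same "detached"
flow, driving gpg from Python (this assumes GnuPG is installed and gpg
is on PATH; the installer file name below is hypothetical):

    import subprocess

    def sign_detached(path):
        # Write an ASCII-armored detached signature next to the file
        # (path + ".asc"), like Kleopatra's "detached" option does.
        subprocess.check_call(["gpg", "--armor", "--detach-sign", path])

    def verify_detached(path):
        # Check the file against its detached signature.
        subprocess.check_call(["gpg", "--verify", path + ".asc", path])

    sign_detached("python-3.5.0a4.exe")      # hypothetical file name
    verify_detached("python-3.5.0a4.exe")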
Regards,
Martin

From martin at v.loewis.de  Thu Apr 16 21:12:53 2015
From: martin at v.loewis.de (=?windows-1252?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Thu, 16 Apr 2015 21:12:53 +0200
Subject: [Python-Dev] Python3 Stable ABI
In-Reply-To:
References:
Message-ID: <553009B5.9010808@v.loewis.de>

Am 13.04.15 um 23:28 schrieb Zachary Ware:
> In issue23903, I've created a script that will produce PC/python3.def
> by scraping the header files in Include. See my comment in the issue.

Having a script to check is good; having it generate the def file
automatically is bad. It's typically the case that changes to the
header files didn't consider the stable ABI, so each change needs to be
reviewed manually. Typically, further changes to other source files
will be necessary.

Regards,
Martin

From facundobatista at gmail.com  Thu Apr 16 23:09:01 2015
From: facundobatista at gmail.com (Facundo Batista)
Date: Thu, 16 Apr 2015 18:09:01 -0300
Subject: [Python-Dev] Not being able to compile: "make: *** [Programs/_freeze_importlib] Error 1"
Message-ID:

Hello!

It's been a while since I compiled Python and ran the test suite. Now
that I'm starting to work on a patch, I wanted to do that as a sanity
check, but even though "./configure" finishes OK, "make" fails :/

(I'm on Ubuntu Trusty, all packages updated, all dependencies
theoretically installed.)

Full trace here:

http://linkode.org/TgkzZw90JUaoodvYzU7zX6

Before going into a deep debug, I thought I'd send a mail to see if
anybody else has hit this issue, whether it's a common problem, and
whether there's a known workaround.

Thanks!!

--
.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Twitter: @facundobatista

From rdmurray at bitdance.com  Thu Apr 16 23:34:26 2015
From: rdmurray at bitdance.com (R. David Murray)
Date: Thu, 16 Apr 2015 17:34:26 -0400
Subject: [Python-Dev] Not being able to compile: "make: *** [Programs/_freeze_importlib] Error 1"
In-Reply-To:
References:
Message-ID: <20150416213427.28A7AB3050C@webabinitio.net>

On Thu, 16 Apr 2015 18:09:01 -0300, Facundo Batista wrote:
> Full trace here:
>
> http://linkode.org/TgkzZw90JUaoodvYzU7zX6
>
> Before going into a deep debug, I thought I'd send a mail to see if
> anybody else has hit this issue, whether it's a common problem, and
> whether there's a known workaround.

Most likely you just need to run 'make touch' so that it doesn't try
to rebuild stuff it doesn't need to (because we check in those
particular build artifacts, like the frozen importlib).

--David

From mal at egenix.com  Fri Apr 17 00:46:20 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 17 Apr 2015 00:46:20 +0200
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: <55300ED2.8060208@v.loewis.de>
References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com>
 <1428174125.3954963.249326869.58843520@webmail.messagingengine.com>
 <55203CED.5030609@egenix.com>
 <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com>
 <55204174.9010404@egenix.com> <55300ED2.8060208@v.loewis.de>
Message-ID: <55303BBC.1050806@egenix.com>

On 16.04.2015 21:34, "Martin v. Löwis" wrote:
> Am 04.04.15 um 21:54 schrieb M.-A. Lemburg:
>>>> FWIW: The PSF mostly uses StartSSL nowadays and they also support code
>>>> signing certificates. Given that this option is a lot cheaper than
>>>> Verisign, I think we should switch, unless there are significant
>>>> reasons not to. We should revisit this in 2017.
>>>
>>> Agree - apparently the StartSSL process for getting a signing cert is
>>> complex/obscure, so we should start early.
>>
>> Not really. Once you have the org verification it's really easy.
>>
>>> Let me know if I can help providing PSF organization verification.
>>
>> I already completed that for the current cycle.
>>

When was that? I never received such a request.

The account I'm using was created in Dec 2014 and the validation
received on 2014-12-17. This is valid for about a year:

https://wiki.python.org/psf/PSF%20SSL%20Certificates

Code signing certificates are valid for two years, so switching to
StartSSL probably doesn't make much sense now, unless perhaps we want
to switch to SHA2 and longer RSA keys (if that's possible for code
signing certs - I'd have to check).

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Apr 17 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From facundobatista at gmail.com  Fri Apr 17 03:23:09 2015
From: facundobatista at gmail.com (Facundo Batista)
Date: Thu, 16 Apr 2015 22:23:09 -0300
Subject: [Python-Dev] Not being able to compile: "make: *** [Programs/_freeze_importlib] Error 1"
In-Reply-To: <20150416213427.28A7AB3050C@webabinitio.net>
References: <20150416213427.28A7AB3050C@webabinitio.net>
Message-ID:

On Thu, Apr 16, 2015 at 6:34 PM, R. David Murray wrote:

> Most likely you just need to run 'make touch' so that it doesn't try
> to rebuild stuff it doesn't need to (because we check in those
> particular build artifacts, like the frozen importlib).

"make touch" didn't fix it, but when doing that I noticed this message:

Modules/Setup.dist is newer than Modules/Setup;
check to make sure you have all the updates you
need in your Modules/Setup file.
Usually, copying Modules/Setup.dist to Modules/Setup will work.

I copied that file, then ran make touch, and then "make" finished OK!!!

Should we update the developer guide with these instructions, or is
all this unlikely to happen to others?

Thanks!!

--
.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Twitter: @facundobatista

From facundobatista at gmail.com  Fri Apr 17 03:32:35 2015
From: facundobatista at gmail.com (Facundo Batista)
Date: Thu, 16 Apr 2015 22:32:35 -0300
Subject: [Python-Dev] How to behave regarding committing
Message-ID:

Hola!

I'm asking this because quite some time has passed since I was active
in the development of our beloved language.

I'm trying not to break any new rules I don't know about.

I opened a bug recently [0], and somebody else proposed a patch that I
like. However, the patch has no test. I will write a test for that
code, but then what?

Shall I just commit and push? Or should the whole branch be proposed
for further review?

Thank you!!
[0] http://bugs.python.org/issue23887

--
.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/
Twitter: @facundobatista

From berker.peksag at gmail.com  Fri Apr 17 03:41:49 2015
From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=)
Date: Fri, 17 Apr 2015 04:41:49 +0300
Subject: [Python-Dev] How to behave regarding committing
In-Reply-To:
References:
Message-ID:

On Fri, Apr 17, 2015 at 4:32 AM, Facundo Batista wrote:
> Hola!
>
> I'm asking this because quite some time has passed since I was active
> in the development of our beloved language.
>
> I'm trying not to break any new rules I don't know about.
>
> I opened a bug recently [0], and somebody else proposed a patch that I
> like. However, the patch has no test. I will write a test for that
> code, but then what?
>
> Shall I just commit and push? Or should the whole branch be proposed
> for further review?

Hi,

Since writing a test for that patch is simple, I'd just commit it to
the default branch.

--Berker

From brett at python.org  Fri Apr 17 15:45:22 2015
From: brett at python.org (Brett Cannon)
Date: Fri, 17 Apr 2015 13:45:22 +0000
Subject: [Python-Dev] Not being able to compile: "make: *** [Programs/_freeze_importlib] Error 1"
In-Reply-To:
References: <20150416213427.28A7AB3050C@webabinitio.net>
Message-ID:

On Thu, Apr 16, 2015 at 9:23 PM Facundo Batista wrote:

> On Thu, Apr 16, 2015 at 6:34 PM, R. David Murray
> wrote:
>
> > Most likely you just need to run 'make touch' so that it doesn't try
> > to rebuild stuff it doesn't need to (because we check in those
> > particular build artifacts, like the frozen importlib).
>
> "make touch" didn't fix it, but when doing that I noticed this message:
>
> Modules/Setup.dist is newer than Modules/Setup;
> check to make sure you have all the updates you
> need in your Modules/Setup file.
> Usually, copying Modules/Setup.dist to Modules/Setup will work.
>
> I copied that file, then ran make touch, and then "make" finished OK!!!
>
> Should we update the developer guide with these instructions, or is
> all this unlikely to happen to others?

It happens on occasion. It definitely wouldn't hurt to add a little
note to the UNIX build section of the devguide saying that this can
come up and to look for the message when running configure or make
(it's always printed first, so it's easy to spot).

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From a.v.shkop at gmail.com  Fri Apr 17 09:40:14 2015
From: a.v.shkop at gmail.com (Alex Shkop)
Date: Fri, 17 Apr 2015 07:40:14 +0000
Subject: [Python-Dev] unittest test discovery and namespace packages
Message-ID:

Hello!

There's an issue concerning test discovery in the unittest module:
basically, unittest doesn't find tests in namespace packages. For more
info see issue http://bugs.python.org/issue23882.

I'm willing to write a patch for this bug, but I need help formulating
how test discovery should work.

The documentation states that all importable modules that match the
pattern will be loaded. This means that test modules inside namespace
packages should be loaded too. But enabling this would change things
drastically. For example, right now running

python -m unittest

inside the CPython source root does nothing. If we enable test
discovery inside namespace packages, this command will start running
the whole Python test suite in Lib/test/.

So I'm looking for someone's help to clarify how test discovery
should work.
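For reference, a minimal sketch of the discovery API in question, as it
behaves today (the start directory and pattern below are just the
documented defaults, not a proposal):

    import unittest

    # Programmatic equivalent of "python -m unittest discover -s . -p 'test*.py'".
    # Directories without __init__.py (namespace packages) are currently
    # not descended into, which is the bug discussed here.
    loader = unittest.TestLoader()
    suite = loader.discover(start_dir=".", pattern="test*.py")
    unittest.TextTestRunner().run(suite)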
Thanks,
Alex

--
Issue in bugtracker - http://bugs.python.org/issue23882
Documentation for discover() method - https://docs.python.org/3.4/library/unittest.html#unittest.TestLoader.discover

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From encukou at gmail.com  Fri Apr 17 15:52:56 2015
From: encukou at gmail.com (Petr Viktorin)
Date: Fri, 17 Apr 2015 15:52:56 +0200
Subject: [Python-Dev] Changing PyModuleDef.m_reload to m_slots
Message-ID: <55311038.5080200@gmail.com>

Hello,
PEP 489, Redesigning extension module loading [0], is currently being
discussed on import-sig, but one question calls for a wider audience.

As a background, the PyModuleDef structure [1] is currently:

    struct PyModuleDef {
        PyModuleDef_Base m_base;
        const char* m_name;
        const char* m_doc;
        Py_ssize_t m_size;
        PyMethodDef *m_methods;
        inquiry m_reload;
        traverseproc m_traverse;
        inquiry m_clear;
        freefunc m_free;
    };

... where the m_reload pointer is unused, and must be NULL.

My proposal is to repurpose this pointer to hold an array of slots, in
the style of PEP 384's PyType_Spec [2], which would allow adding
extensions -- both those needed for PEP 489 and future ones. The result
would be:

    typedef struct {
        int slot;
        void *value;
    } PyModuleDesc_Slot;

    typedef struct PyModuleDef {
        PyModuleDef_Base m_base;
        const char* m_name;
        const char* m_doc;
        Py_ssize_t m_size;
        PyMethodDef *m_methods;
        PyModuleDesc_Slot* m_slots;  /* <-- change here */
        traverseproc m_traverse;
        inquiry m_clear;
        freefunc m_free;
    } PyModuleDef;

The alternative is defining another struct for module definitions, with
parallel versions of the functions that operate on it, and duplication
on the lower level (e.g. a variant of PyModule_GetDef, variants for
PyState_FindModule &c with the supporting registry, extra storage in
module objects, two places to look in for GC/deallocation hooks).

The downside is that, strictly speaking, this would break API/ABI,
though only in pedantic details. The field name could conceivably be
used to get the NULL, or in C99-style designated initializers, but
since NULL is the only valid value, I don't see a reason for that. It
could also be used for zeroing the structure after allocating a
PyModuleDef dynamically.

ABI-wise, this changes a function pointer to a data pointer. To my
knowledge, Python doesn't support a platform where this would matter,
but I admit my knowledge is not complete.

Please share any concerns you might have about this change.

[0] https://www.python.org/dev/peps/pep-0489/
[1] https://docs.python.org/3/c-api/module.html#c.PyModuleDef
[2] https://www.python.org/dev/peps/pep-0384/#type-objects

From status at bugs.python.org  Fri Apr 17 18:08:24 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 17 Apr 2015 18:08:24 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150417160824.84FD356706@psf.upfronthosting.co.za>

ACTIVITY SUMMARY (2015-04-10 - 2015-04-17)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.
Issues counts and deltas: open 4792 (-31) closed 30957 (+113) total 35749 (+82) Open issues with patches: 2240 Issues opened (60) ================== #22980: C extension naming doesn't take bitness into account http://bugs.python.org/issue22980 reopened by lemburg #23908: Check path arguments of os functions for null character http://bugs.python.org/issue23908 opened by serhiy.storchaka #23910: C implementation of namedtuple (WIP) http://bugs.python.org/issue23910 opened by llllllllll #23911: Move path-based bootstrap code to a separate frozen file. http://bugs.python.org/issue23911 opened by eric.snow #23914: pickle fails with SystemError http://bugs.python.org/issue23914 opened by alex #23915: traceback set with BaseException.with_traceback() overwritten http://bugs.python.org/issue23915 opened by abathur #23917: please fall back to sequential compilation when concurrent doe http://bugs.python.org/issue23917 opened by doko #23919: [Windows] test_os fails several C-level assertions http://bugs.python.org/issue23919 opened by zach.ware #23920: Should Clinic have "nullable" or types=NoneType? http://bugs.python.org/issue23920 opened by larry #23921: Standardize documentation whitespace, formatting http://bugs.python.org/issue23921 opened by jedwards #23922: turtle.py and turtledemo use the default tkinter icon http://bugs.python.org/issue23922 opened by Al.Sweigart #23926: skipitem() in getargs.c still supports 'w' and 'w#', and shoul http://bugs.python.org/issue23926 opened by larry #23927: getargs.c skipitem() doesn't skip 'w*' http://bugs.python.org/issue23927 opened by larry #23930: SimpleCookie doesn't parse comma-only separated cookies corre http://bugs.python.org/issue23930 opened by riklaunim #23931: Update DevGuide link in Quickstart Step 1 http://bugs.python.org/issue23931 opened by willingc #23933: Struct module should acept arrays http://bugs.python.org/issue23933 opened by gamaanderson #23934: inspect.signature reporting "()" for all builtin & extension t http://bugs.python.org/issue23934 opened by ncoghlan #23936: Wrong references to deprecated find_module instead of find_spe http://bugs.python.org/issue23936 opened by raulcd #23937: IDLE start maximized http://bugs.python.org/issue23937 opened by zektron42 #23942: Explain naming of the patch files in the bug tracker http://bugs.python.org/issue23942 opened by maciej.szulik #23946: Invalid timestamps reported by os.stat() when Windows FILETIME http://bugs.python.org/issue23946 opened by CristiFati #23947: Add mechanism to import stdlib package bypassing user packages http://bugs.python.org/issue23947 opened by steve.dower #23948: Deprecate os.kill() on Windows http://bugs.python.org/issue23948 opened by jpe #23949: Number of elements display in error message is wrong while unp http://bugs.python.org/issue23949 opened by ulaganathanm123 at gmail.com #23950: Odd behavior with "file" and "filename" attributes in cgi.Fiel http://bugs.python.org/issue23950 opened by deadpixi #23951: Update devguide style to use a similar theme as Docs http://bugs.python.org/issue23951 opened by willingc #23952: Document the 'maxlen' member of the cgi module http://bugs.python.org/issue23952 opened by deadpixi #23953: test_mmap uses cruel and unusual amounts of disk space http://bugs.python.org/issue23953 opened by larry #23954: Pressing enter/return or clicking IDLE's autocomplete does not http://bugs.python.org/issue23954 opened by Al.Sweigart #23955: Add python.ini file for embedded/applocal installs http://bugs.python.org/issue23955 opened by 
steve.dower #23958: compile warnings in libffi http://bugs.python.org/issue23958 opened by steveha #23959: Update imaplib to support RFC3501 http://bugs.python.org/issue23959 opened by maciej.szulik #23960: PyErr_SetImportError doesn't clean up on some errors http://bugs.python.org/issue23960 opened by blackfawn #23961: IDLE autocomplete window does not automatically close when sel http://bugs.python.org/issue23961 opened by Al.Sweigart #23962: Incorrect TimeoutError referenced in concurrent.futures docume http://bugs.python.org/issue23962 opened by ryder.lewis #23963: Windows build error using original openssl source http://bugs.python.org/issue23963 opened by anselm.kruis #23964: Update README documentation for IDLE tests. http://bugs.python.org/issue23964 opened by Al.Sweigart #23965: test_ssl failure on Fedora 22 http://bugs.python.org/issue23965 opened by kushal.das #23966: More clearly expose/explain native and cross-build target info http://bugs.python.org/issue23966 opened by ncoghlan #23967: Make inspect.signature expression evaluation more powerful http://bugs.python.org/issue23967 opened by larry #23968: rename the platform directory from plat-$(MACHDEP) to plat-$(P http://bugs.python.org/issue23968 opened by doko #23969: please set a SOABI for MacOSX http://bugs.python.org/issue23969 opened by doko #23970: Update distutils.msvccompiler for VC14 http://bugs.python.org/issue23970 opened by steve.dower #23971: dict(list) and dict.fromkeys() doesn't account for 2/3 fill ra http://bugs.python.org/issue23971 opened by larry #23972: Asyncio reuseport http://bugs.python.org/issue23972 opened by Boris.FELD #23973: PEP 484 implementation http://bugs.python.org/issue23973 opened by gvanrossum #23974: random.randrange() biased output http://bugs.python.org/issue23974 opened by gurnec #23975: numbers.Rational implements __float__ incorrectly http://bugs.python.org/issue23975 opened by wolma #23976: ZipFile.writestr implies non-regular files http://bugs.python.org/issue23976 opened by dalphus #23977: Enhancing IDLE's test_delegator.py unit test http://bugs.python.org/issue23977 opened by Al.Sweigart #23978: ttk.Style.element_create using incorrect tk.call syntax http://bugs.python.org/issue23978 opened by jmorgensen #23979: Multiprocessing Pool.map pickles arguments passed to workers http://bugs.python.org/issue23979 opened by kieleth #23980: Documentation for format units starting with 'e' is inconsiste http://bugs.python.org/issue23980 opened by larry #23981: Update test_unicodedata.py to use script_helpers http://bugs.python.org/issue23981 opened by bobcatfish #23982: Tkinter in Python 3.4 for Windows incorrectly renders certain http://bugs.python.org/issue23982 opened by MartyMacGyver #23983: Update example in the pty documentation http://bugs.python.org/issue23983 opened by berker.peksag #23984: Documentation error: Descriptors http://bugs.python.org/issue23984 opened by bjonnh #23985: Crash when deleting slices from duplicated bytearray http://bugs.python.org/issue23985 opened by johan #23986: Inaccuracy about "in" keyword for list and tuple http://bugs.python.org/issue23986 opened by wim.glenn #23987: docs about containers membership testing wrong for broken obje http://bugs.python.org/issue23987 opened by ethan.furman Most recent 15 issues with no replies (15) ========================================== #23986: Inaccuracy about "in" keyword for list and tuple http://bugs.python.org/issue23986 #23984: Documentation error: Descriptors http://bugs.python.org/issue23984 #23983: Update 
example in the pty documentation http://bugs.python.org/issue23983 #23982: Tkinter in Python 3.4 for Windows incorrectly renders certain http://bugs.python.org/issue23982 #23978: ttk.Style.element_create using incorrect tk.call syntax http://bugs.python.org/issue23978 #23977: Enhancing IDLE's test_delegator.py unit test http://bugs.python.org/issue23977 #23976: ZipFile.writestr implies non-regular files http://bugs.python.org/issue23976 #23975: numbers.Rational implements __float__ incorrectly http://bugs.python.org/issue23975 #23973: PEP 484 implementation http://bugs.python.org/issue23973 #23968: rename the platform directory from plat-$(MACHDEP) to plat-$(P http://bugs.python.org/issue23968 #23963: Windows build error using original openssl source http://bugs.python.org/issue23963 #23960: PyErr_SetImportError doesn't clean up on some errors http://bugs.python.org/issue23960 #23955: Add python.ini file for embedded/applocal installs http://bugs.python.org/issue23955 #23954: Pressing enter/return or clicking IDLE's autocomplete does not http://bugs.python.org/issue23954 #23952: Document the 'maxlen' member of the cgi module http://bugs.python.org/issue23952 Most recent 15 issues waiting for review (15) ============================================= #23983: Update example in the pty documentation http://bugs.python.org/issue23983 #23981: Update test_unicodedata.py to use script_helpers http://bugs.python.org/issue23981 #23977: Enhancing IDLE's test_delegator.py unit test http://bugs.python.org/issue23977 #23971: dict(list) and dict.fromkeys() doesn't account for 2/3 fill ra http://bugs.python.org/issue23971 #23970: Update distutils.msvccompiler for VC14 http://bugs.python.org/issue23970 #23968: rename the platform directory from plat-$(MACHDEP) to plat-$(P http://bugs.python.org/issue23968 #23967: Make inspect.signature expression evaluation more powerful http://bugs.python.org/issue23967 #23964: Update README documentation for IDLE tests. 
http://bugs.python.org/issue23964 #23963: Windows build error using original openssl source http://bugs.python.org/issue23963 #23962: Incorrect TimeoutError referenced in concurrent.futures docume http://bugs.python.org/issue23962 #23960: PyErr_SetImportError doesn't clean up on some errors http://bugs.python.org/issue23960 #23949: Number of elements display in error message is wrong while unp http://bugs.python.org/issue23949 #23946: Invalid timestamps reported by os.stat() when Windows FILETIME http://bugs.python.org/issue23946 #23936: Wrong references to deprecated find_module instead of find_spe http://bugs.python.org/issue23936 #23934: inspect.signature reporting "()" for all builtin & extension t http://bugs.python.org/issue23934 Top 10 most discussed issues (10) ================================= #22980: C extension naming doesn't take bitness into account http://bugs.python.org/issue22980 34 msgs #9517: Make test.script_helper more comprehensive, and use it in the http://bugs.python.org/issue9517 22 msgs #5438: test_bigmem.test_from_2G_generator uses more memory than expec http://bugs.python.org/issue5438 15 msgs #23949: Number of elements display in error message is wrong while unp http://bugs.python.org/issue23949 11 msgs #23970: Update distutils.msvccompiler for VC14 http://bugs.python.org/issue23970 11 msgs #17445: Handle bytes comparisons in difflib.Differ http://bugs.python.org/issue17445 10 msgs #23910: C implementation of namedtuple (WIP) http://bugs.python.org/issue23910 10 msgs #4254: _cursesmodule.c callable update_lines_cols() http://bugs.python.org/issue4254 9 msgs #19050: [Python 2, Windows] fflush called on pointer to potentially cl http://bugs.python.org/issue19050 9 msgs #21327: socket.type value changes after using settimeout() http://bugs.python.org/issue21327 9 msgs Issues closed (104) =================== #6006: ffi.c compile failures on AIX 5.3 with xlc http://bugs.python.org/issue6006 closed by akuchling #6983: Add specific get_platform() for freebsd http://bugs.python.org/issue6983 closed by akuchling #8882: socketmodule.c`getsockaddrarg() should not check the length of http://bugs.python.org/issue8882 closed by akuchling #9014: Incorrect documentation of the PyObject_HEAD macro http://bugs.python.org/issue9014 closed by gregory.p.smith #9859: Add tests to verify API match of modules with 2 implementation http://bugs.python.org/issue9859 closed by gregory.p.smith #11754: Check calculated constants in test_string.py http://bugs.python.org/issue11754 closed by r.david.murray #12275: urllib.request.HTTPRedirectHandler won't redirect to a URL wit http://bugs.python.org/issue12275 closed by vadmium #12327: in HTTPConnection the are len(body) and TypeError catch except http://bugs.python.org/issue12327 closed by r.david.murray #12652: Keep test.support docs out of the global docs index http://bugs.python.org/issue12652 closed by willingc #12955: urllib.request example should use "with ... 
as:" http://bugs.python.org/issue12955 closed by berker.peksag #13164: importing rlcompleter module writes a control sequence in stdo http://bugs.python.org/issue13164 closed by r.david.murray #13472: devguide doesn???t list all build dependencies http://bugs.python.org/issue13472 closed by willingc #14374: Compiling Python 2.7.2 on HP11i PA-RISC ends with segmentation http://bugs.python.org/issue14374 closed by akuchling #14458: Non-admin installation fails http://bugs.python.org/issue14458 closed by zach.ware #14767: urllib.request.HTTPRedirectHandler raises HTTPError when Locat http://bugs.python.org/issue14767 closed by vadmium #15625: Support u and w codes in memoryview http://bugs.python.org/issue15625 closed by steve.dower #16050: ctypes: callback from C++ to Python fails with Illegal Instruc http://bugs.python.org/issue16050 closed by r.david.murray #16222: some terms not found by devguide's search box http://bugs.python.org/issue16222 closed by willingc #16229: Demo *redemo.py* lacking in Windows installation http://bugs.python.org/issue16229 closed by zach.ware #16405: Explain how to set up the whitespace commit hook locally http://bugs.python.org/issue16405 closed by ned.deily #16501: deprecate RISCOS "support" http://bugs.python.org/issue16501 closed by akuchling #16914: timestamping in smtplib.py to help troubleshoot server timeout http://bugs.python.org/issue16914 closed by r.david.murray #17202: Add .bat line to .hgeol http://bugs.python.org/issue17202 closed by python-dev #17284: create mercurial section in devguide's committing.rst http://bugs.python.org/issue17284 closed by willingc #17380: initproc return value is unclear http://bugs.python.org/issue17380 closed by r.david.murray #17630: Create a pure Python zipfile/tarfile importer http://bugs.python.org/issue17630 closed by gregory.p.smith #17751: ctypes/test/test_macholib.py fails when run from the installed http://bugs.python.org/issue17751 closed by zach.ware #17800: Add gc.needs_finalizing() to check if an object needs finalisi http://bugs.python.org/issue17800 closed by benjamin.peterson #17831: urllib.URLopener.open breaks ActiveDirectory user http://bugs.python.org/issue17831 closed by r.david.murray #17898: gettext bug while parsing plural-forms metadata http://bugs.python.org/issue17898 closed by akuchling #18128: pygettext: non-standard timestamp format in POT-Creation-Date http://bugs.python.org/issue18128 closed by r.david.murray #18166: 'value' attribute for ValueError http://bugs.python.org/issue18166 closed by pitrou #18402: Finding perl64 http://bugs.python.org/issue18402 closed by python-dev #19121: Documentation guidelines enhancements http://bugs.python.org/issue19121 closed by willingc #19292: Make SSLContext.set_default_verify_paths() work on Windows http://bugs.python.org/issue19292 closed by pitrou #19933: Round default argument for "ndigits" http://bugs.python.org/issue19933 closed by steve.dower #20309: Not all method descriptors are callable http://bugs.python.org/issue20309 closed by ncoghlan #20586: Argument Clinic: functions with valid sig but no docstring hav http://bugs.python.org/issue20586 closed by zach.ware #21013: server-specific SSL context configuration http://bugs.python.org/issue21013 closed by pitrou #21116: Failure to create multiprocessing shared arrays larger than 50 http://bugs.python.org/issue21116 closed by pitrou #21146: update gzip usage examples in docs http://bugs.python.org/issue21146 closed by akuchling #21217: inspect.getsourcelines finds wrong lines when lambda 
used argu http://bugs.python.org/issue21217 closed by pitrou #21400: Code coverage documentation is out-of-date. http://bugs.python.org/issue21400 closed by ned.deily #21416: argparse should accept bytes arguments as originally passed http://bugs.python.org/issue21416 closed by zach.ware #21511: Thinko in Lib/quopri.py, decoding b"==" to b"=" http://bugs.python.org/issue21511 closed by r.david.murray #21741: Convert most of the test suite to using unittest.main() http://bugs.python.org/issue21741 closed by zach.ware #22046: ZipFile.read() should mention that it might throw NotImplement http://bugs.python.org/issue22046 closed by gregory.p.smith #22248: urllib.request.urlopen raises exception when 30X-redirect url http://bugs.python.org/issue22248 closed by vadmium #22411: Embedding Python on Windows http://bugs.python.org/issue22411 closed by steve.dower #22631: Feature Request CAN_RAW_FD_FRAMES http://bugs.python.org/issue22631 closed by larry #22982: BOM incorrectly inserted before writing, after seeking in text http://bugs.python.org/issue22982 closed by pitrou #23193: Please support "numeric_owner" in tarfile http://bugs.python.org/issue23193 closed by eric.smith #23309: Hang on interpreter shutdown if daemon thread prints to stdout http://bugs.python.org/issue23309 closed by pitrou #23310: MagicMock initializer fails for magic methods http://bugs.python.org/issue23310 closed by lukasz.langa #23452: Build errors using VS Express 2013 in win32 mode http://bugs.python.org/issue23452 closed by tim.golden #23464: Remove or deprecate JoinableQueue in asyncio docs http://bugs.python.org/issue23464 closed by r.david.murray #23528: Limit decompressed data when reading from GzipFile http://bugs.python.org/issue23528 closed by pitrou #23529: Limit decompressed data when reading from LZMAFile and BZ2File http://bugs.python.org/issue23529 closed by pitrou #23612: 3.5.0a2 Windows installer does not remove 3.5.0a1 http://bugs.python.org/issue23612 closed by zach.ware #23686: Update Windows and OS X installer OpenSSL to 1.0.2a http://bugs.python.org/issue23686 closed by python-dev #23703: urljoin() with no directory segments duplicates filename http://bugs.python.org/issue23703 closed by berker.peksag #23726: Don't enable GC for classes that don't add new fields http://bugs.python.org/issue23726 closed by pitrou #23727: rfc2231 line continuations result in an additional newline http://bugs.python.org/issue23727 closed by r.david.murray #23730: Document return value for ZipFile.extract() http://bugs.python.org/issue23730 closed by python-dev #23731: Implement PEP 488 http://bugs.python.org/issue23731 closed by brett.cannon #23732: Update porting HOWTO about new -b semantics http://bugs.python.org/issue23732 closed by brett.cannon #23733: Update porting HOWTO for bytes interpolation http://bugs.python.org/issue23733 closed by brett.cannon #23761: test_socket fails on Mac OSX 10.9.5 http://bugs.python.org/issue23761 closed by willingc #23811: Python py_compile error message inconsistent and missing newli http://bugs.python.org/issue23811 closed by berker.peksag #23817: Consider FreeBSD like any other OS in SOVERSION http://bugs.python.org/issue23817 closed by haypo #23820: test_importlib fails under -O http://bugs.python.org/issue23820 closed by brett.cannon #23822: test_py_compile fails under -O http://bugs.python.org/issue23822 closed by brett.cannon #23826: test_enum fails under -OO http://bugs.python.org/issue23826 closed by ethan.furman #23865: Fix possible leaks in close methods 
http://bugs.python.org/issue23865 closed by serhiy.storchaka #23900: Add a default docstring to Enum subclasses http://bugs.python.org/issue23900 closed by python-dev #23904: pathlib.PurePath does not accept bytes components http://bugs.python.org/issue23904 closed by python-dev #23907: Python http://bugs.python.org/issue23907 closed by r.david.murray #23909: highlight query string does not work on docs.python.org/2 http://bugs.python.org/issue23909 closed by python-dev #23912: Inconsistent whitespace/formatting in docs/reference/datamodel http://bugs.python.org/issue23912 closed by berker.peksag #23913: error in some byte sizes of docs in array module http://bugs.python.org/issue23913 closed by r.david.murray #23916: module importing performance regression http://bugs.python.org/issue23916 closed by pitrou #23918: symbols namespace pollution http://bugs.python.org/issue23918 closed by python-dev #23923: Integer operations (// or %) on negative numbers product wrong http://bugs.python.org/issue23923 closed by eric.smith #23924: Add OS X to Dev Guide Quickstart build step http://bugs.python.org/issue23924 closed by ned.deily #23925: test_cmd_line failing on PYTHONSTARTUP http://bugs.python.org/issue23925 closed by r.david.murray #23928: SSL wiki page, host name matching, CN and SAN http://bugs.python.org/issue23928 closed by pitrou #23929: Minor typo (way vs. away) in os.renames docstring http://bugs.python.org/issue23929 closed by python-dev #23932: Tutorial section on function annotations is out of date re: PE http://bugs.python.org/issue23932 closed by python-dev #23935: Clean up Clinic's type expressions of buffers http://bugs.python.org/issue23935 closed by larry #23938: Deprecation of windows xp support missing from whats new http://bugs.python.org/issue23938 closed by python-dev #23939: test_get_platform_osx failure on Python 3.5.0a0 osx 10.6 http://bugs.python.org/issue23939 closed by ned.deily #23940: test__supports_universal_builds failure on Python 3.5.0a0 osx http://bugs.python.org/issue23940 closed by ned.deily #23941: Google Login not working http://bugs.python.org/issue23941 closed by ned.deily #23943: Misspellings in a few files http://bugs.python.org/issue23943 closed by berker.peksag #23944: Argument Clinic: wrap impl's declaration if it's too long http://bugs.python.org/issue23944 closed by larry #23945: webbrowser.open opens twice on Windows if BROWSER is set http://bugs.python.org/issue23945 closed by andig2 #23956: Compatibility misspelled in Lib/imp.py http://bugs.python.org/issue23956 closed by python-dev #23957: Additional misspelled in documentation http://bugs.python.org/issue23957 closed by r.david.murray #23988: keyworded argument count is wrong http://bugs.python.org/issue23988 closed by benjamin.peterson #1481347: parse_makefile doesn't handle $$ correctly http://bugs.python.org/issue1481347 closed by ned.deily #1745108: 2.5.1 curses panel segfault in new_panel on aix 5.3 http://bugs.python.org/issue1745108 closed by akuchling #1475523: gettext breaks on plural-forms header http://bugs.python.org/issue1475523 closed by akuchling #1648890: HP-UX: ld -Wl,+b... 
http://bugs.python.org/issue1648890 closed by akuchling

#1284316: Win32: Security problem with default installation directory
http://bugs.python.org/issue1284316 closed by steve.dower

From martin at v.loewis.de  Fri Apr 17 19:31:10 2015
From: martin at v.loewis.de (=?windows-1252?Q?=22Martin_v=2E_L=F6wis=22?=)
Date: Fri, 17 Apr 2015 19:31:10 +0200
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: <55303BBC.1050806@egenix.com>
References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com>
 <1428174125.3954963.249326869.58843520@webmail.messagingengine.com>
 <55203CED.5030609@egenix.com>
 <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com>
 <55204174.9010404@egenix.com> <55300ED2.8060208@v.loewis.de>
 <55303BBC.1050806@egenix.com>
Message-ID: <5531435E.5070904@v.loewis.de>

Am 17.04.15 um 00:46 schrieb M.-A. Lemburg:
>> I had asked the PSF for a StartSSL certificate when the previous
>> certificate expired, and the PSF was not able to provide one. After
>> waiting several weeks for the PSF to provide the certificate, Kurt then
>> kindly went to Verisign.
>
> When was that? I never received such a request.

I sent the request to Jesse Noller, Noah Kantrowitz and Kurt Kaiser on
2014-03-17. On 2014-04-15, Jesse indicated that he had given up.

Regards,
Martin

From mal at egenix.com  Fri Apr 17 19:38:36 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 17 Apr 2015 19:38:36 +0200
Subject: [Python-Dev] [python-committers] Do we need to sign Windows files with GnuPG?
In-Reply-To: <5531435E.5070904@v.loewis.de>
References: <551E63E5.6080805@hastings.org> <551ED41D.2070909@egenix.com>
 <1428174125.3954963.249326869.58843520@webmail.messagingengine.com>
 <55203CED.5030609@egenix.com>
 <1428176941.3962322.249337589.2D987B60@webmail.messagingengine.com>
 <55204174.9010404@egenix.com> <55300ED2.8060208@v.loewis.de>
 <55303BBC.1050806@egenix.com> <5531435E.5070904@v.loewis.de>
Message-ID: <5531451C.9030202@egenix.com>

On 17.04.2015 19:31, "Martin v. Löwis" wrote:
> Am 17.04.15 um 00:46 schrieb M.-A. Lemburg:
>>> I had asked the PSF for a StartSSL certificate when the previous
>>> certificate expired, and the PSF was not able to provide one. After
>>> waiting several weeks for the PSF to provide the certificate, Kurt then
>>> kindly went to Verisign.
>>
>> When was that? I never received such a request.
>
> I sent the request to Jesse Noller, Noah Kantrowitz and Kurt Kaiser on
> 2014-03-17. On 2014-04-15, Jesse indicated that he had given up.

I guess that explains why nothing happened. Jesse owned the StartSSL
account before I took over in Dec last year.

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Apr 17 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...       http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math.
Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/

From robertc at robertcollins.net  Fri Apr 17 20:47:19 2015
From: robertc at robertcollins.net (Robert Collins)
Date: Sat, 18 Apr 2015 06:47:19 +1200
Subject: [Python-Dev] unittest test discovery and namespace packages
In-Reply-To:
References:
Message-ID:

On 17 April 2015 at 19:40, Alex Shkop wrote:
> Hello!
>
> There's an issue concerning test discovery in the unittest module:
> basically, unittest doesn't find tests in namespace packages. For more
> info see issue http://bugs.python.org/issue23882.
>
> I'm willing to write a patch for this bug, but I need help formulating
> how test discovery should work.
>
> The documentation states that all importable modules that match the
> pattern will be loaded. This means that test modules inside namespace
> packages should be loaded too. But enabling this would change things
> drastically. For example, right now running
>
> python -m unittest
>
> inside the CPython source root does nothing. If we enable test
> discovery inside namespace packages, this command will start running
> the whole Python test suite in Lib/test/.

I don't think that 'scan the global namespace' makes a sensible
default definition. The behaviour of discovery with namespace packages
today requires some key to select the namespace - either a locally
discovered directory, which happens to be a namespace package, or the
name of the package to process. Since discovery is recursive, sub
namespace packages should work, but I note there are no explicit tests
to this effect.

I'm sorry I didn't respond earlier on the tracker, didn't see the
issue in my inbox for some reason. Let's discuss there.

-Rob

--
Robert Collins
Distinguished Technologist
HP Converged Cloud

From yselivanov.ml at gmail.com  Fri Apr 17 21:12:50 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 17 Apr 2015 15:12:50 -0400
Subject: [Python-Dev] async/await PEP
Message-ID: <55315B32.5000205@gmail.com>

Hello,

I've just posted a new PEP about adding async/await to python-ideas.
Maybe I should have posted it here instead. Anyway, please take a look.

Thanks,
Yury

From guido at python.org  Fri Apr 17 23:58:25 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 17 Apr 2015 14:58:25 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
Message-ID:

The PEP 484 authors (Jukka, Łukasz and I) have a new draft ready. This
time we're posting to python-dev. While we don't have all the details
totally sorted out, it's a lot closer. We have a BDFL-Delegate (Mark
Shannon), and we've worked out a variety of issues at PyCon. Our hope
remains that we'll get the typing.py module added to CPython 3.5 before
beta 1 goes out (May 24).

Below is a list of key changes since the draft posted to python-dev on
March 20 (for more details see
https://github.com/ambv/typehinting/commits/master/pep-0484.txt) and
the full text of the PEP. The new draft is also in the peps repo
(https://hg.python.org/peps/file/tip/pep-0484.txt) and will soon be on
python.org (https://www.python.org/dev/peps/pep-0484/ -- give it 10-30
minutes).

As always, between draft postings the PEP text is updated frequently by
the authors in a dedicated GitHub repo
(https://github.com/ambv/typehinting), and many detailed discussions
are found in the issue tracker there
(https://github.com/ambv/typehinting/issues). The typing.py module also
lives in that repo
(https://github.com/ambv/typehinting/tree/master/prototyping).
Key changes since 20-Mar-2015 draft
-----------------------------------

Generics:

- Properly specify generics and type variables.
- Add covariant/contravariant flags to TypeVar().

Type specifications:

- Define type aliases.
- Kill Undefined.
- Better specification of `# type:` comments.

Specifics:

- Add Generator (a generic type describing generator objects)
- Document get_type_hints().

Stubs:

- Only .pyi is a valid extension for stub files.
- Add discussion of third-party stubs (see issue #84).

Process:

- Mark Shannon is now BDFL-Delegate.
- Add section on PEP Development Process, with github links.

Other:

- Mention the __future__ proposal (a separate PEP to be developed).
- Lots of editing; remove some sections that didn't specify anything.

Full PEP text
-------------

PEP: 484
Title: Type Hints
Version: $Revision$
Last-Modified: $Date$
Author: Guido van Rossum , Jukka Lehtosalo <jukka.lehtosalo at iki.fi>, Łukasz Langa
BDFL-delegate: Mark Shannon
Discussions-To: Python-Dev
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 29-Sep-2014
Post-History: 16-Jan-2015,20-Mar-2015,17-Apr-2015
Resolution:

Abstract
========

This PEP introduces a standard syntax for type hints using annotations
(PEP 3107) on function definitions. For example, here is a simple
function whose argument and return type are declared in the
annotations::

  def greeting(name: str) -> str:
      return 'Hello ' + name

While these annotations are available at runtime through the usual
``__annotations__`` attribute, *no type checking happens at runtime*.
Instead, the proposal assumes the existence of a separate off-line type
checker which users can run over their source code voluntarily.
Essentially, such a type checker acts as a very powerful linter.

The proposal is strongly inspired by mypy [mypy]_. For example, the type
"sequence of integers" can be written as ``Sequence[int]``. The square
brackets mean that no new syntax needs to be added to the language. The
example here uses a custom class ``Sequence``, imported from a
pure-Python module ``typing``. The ``Sequence[int]`` notation works by
implementing ``__getitem__()`` in the metaclass.

The type system supports unions, generic types, and a special type named
``Any`` which is consistent with (i.e. assignable to and from) all
types. This latter feature is taken from the idea of gradual typing.
Gradual typing and the full type system are explained in PEP 483. Other
approaches from which we have borrowed or to which ours can be compared
and contrasted are described in PEP 482.

Rationale and Goals
===================

PEP 3107 added support for arbitrary annotations on parts of a function
definition. Although no meaning was assigned to annotations then, there
has always been an implicit goal to use them for type hinting, which is
listed as the first possible use case in said PEP.

This PEP aims to provide a standard syntax for type annotations, opening
up Python code to easier static analysis and refactoring, potential
runtime type checking, and performance optimizations utilizing type
information.

Of these goals, static analysis is the most important. This includes
support for off-line type checkers such as mypy, as well as providing a
standard notation that can be used by IDEs for code completion and
refactoring.
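To make this rationale concrete, here is a small example -- ours, not the
PEP's -- of the kind of mistake such a checker catches; the exact
diagnostic wording varies between checkers and versions::

  def greeting(name: str) -> str:
      return 'Hello ' + name

  greeting(42)  # flagged by an off-line checker such as mypy:
                # the argument is an "int" where a "str" is expected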
Non-goals --------- While the proposed typing module will contain some building blocks for runtime type checking -- in particular a useful ``isinstance()`` implementation -- third party packages would have to be developed to implement specific runtime type checking functionality, for example using decorators or metaclasses. Using type hints for performance optimizations is left as an exercise for the reader. It should also be emphasized that Python will remain a dynamically typed language, and the authors have no desire to ever make type hints mandatory, even by convention. What is checked? ================ Any function (or method -- for brevity we won't be repeating this) with at least one argument or return annotation is checked, unless type checking is disabled by the ``@no_type_check`` decorator or a ``# type: ignore`` comment (see below). A checked function should have annotations for all its arguments and its return type, with the exception that the ``self`` argument of a method should not be annotated; it is assumed to have the type of the containing class. Notably, the return type of ``__init__`` should be annotated with ``-> None``. The body of a checked function is checked for consistency with the given annotations. The annotations are also used to check correctness of calls appearing in other checked functions. Functions without any annotations (or whose checking is disabled) are assumed to have type ``Any`` when they are referenced in checked functions, and this should completely silence complaints from the checker regarding those references (although a checker may still request that a type be specified using a cast or a ``# type:`` comment if a more specific type than ``Any`` is needed for analysis of subsequent code). A type checker should understand decorators; this may require annotations on decorator definitions. In particular, a type checker should understand the built-in decorators ``@property``, ``@staticmethod`` and ``@classmethod``. The first argument of a class method should not be annotated; it is assumed to be a subclass of the defining class. Type Definition Syntax ====================== The syntax leverages PEP 3107-style annotations with a number of extensions described in sections below. In its basic form, type hinting is used by filling function annotation slots with classes:: def greeting(name: str) -> str: return 'Hello ' + name This states that the expected type of the ``name`` argument is ``str``. Analogically, the expected return type is ``str``. Expressions whose type is a subtype of a specific argument type are also accepted for that argument. Acceptable type hints --------------------- Type hints may be built-in classes (including those defined in standard library or third-party extension modules), abstract base classes, types available in the ``types`` module, and user-defined classes (including those defined in the standard library or third-party modules). Annotations for built-in classes (and other classes at the discretion of the developer) may be placed in stub files (see below). Annotations must be valid expressions that evaluate without raising exceptions at the time the function is defined (but see below for forward references). The needs of static analysis require that annotations must be simple enough to be interpreted by static analysis tools. In particular, dynamically computed types are not acceptable. 
(This is an intentionally somewhat vague requirement; specific
inclusions and exclusions may be added to future versions of this PEP as
warranted by the discussion.)

In addition to the above, the following special constructs defined below
may be used: ``None``, ``Any``, ``Union``, ``Tuple``, ``Callable``, all
ABCs and stand-ins for concrete classes exported from ``typing``
(e.g. ``Sequence`` and ``Dict``), type variables, and type aliases.

All newly introduced names used to support features described in the
following sections (such as ``Any`` and ``Union``) are available in the
``typing`` module.

Using None
----------

When used in a type hint, the expression ``None`` is considered
equivalent to ``type(None)``.

Type aliases
------------

Type aliases are defined by simple variable assignments::

  Url = str

  def retry(url: Url, retry_count: int) -> None: ...

Note that we recommend capitalizing alias names, since they represent
user-defined types, which (like user-defined classes) are typically
spelled that way.

Type aliases may be as complex as type hints in annotations -- anything
that is acceptable as a type hint is acceptable in a type alias::

  from typing import TypeVar, Iterable, Tuple

  T = TypeVar('T', int, float, complex)
  Vector = Iterable[Tuple[T, T]]

  def inproduct(v: Vector) -> T:
      return sum(x*y for x, y in v)

This is equivalent to::

  from typing import TypeVar, Iterable, Tuple

  T = TypeVar('T', int, float, complex)

  def inproduct(v: Iterable[Tuple[T, T]]) -> T:
      return sum(x*y for x, y in v)

Callable
--------

Frameworks expecting callback functions of specific signatures might be
type hinted using ``Callable[[Arg1Type, Arg2Type], ReturnType]``.
Examples::

  from typing import Callable

  def feeder(get_next_item: Callable[[], str]) -> None:
      # Body

  def async_query(on_success: Callable[[int], None],
                  on_error: Callable[[int, Exception], None]) -> None:
      # Body

It is possible to declare the return type of a callable without
specifying the call signature by substituting a literal ellipsis (three
dots) for the list of arguments::

  def partial(func: Callable[..., str], *args) -> Callable[..., str]:
      # Body

Note that there are no square brackets around the ellipsis. The
arguments of the callback are completely unconstrained in this case (and
keyword arguments are acceptable).

Since using callbacks with keyword arguments is not perceived as a
common use case, there is currently no support for specifying keyword
arguments with ``Callable``. Similarly, there is no support for
specifying callback signatures with a variable number of arguments of a
specific type.

Generics
--------

Since type information about objects kept in containers cannot be
statically inferred in a generic way, abstract base classes have been
extended to support subscription to denote expected types for container
elements. Example::

  from typing import Mapping, Set

  def notify_by_email(employees: Set[Employee],
                      overrides: Mapping[str, str]) -> None: ...

Generics can be parametrized by using a new factory available in
``typing`` called ``TypeVar``. Example::

  from typing import Sequence, TypeVar

  T = TypeVar('T')  # Declare type variable

  def first(l: Sequence[T]) -> T:  # Generic function
      return l[0]

In this case the contract is that the returned value is consistent with
the elements held by the collection.

``TypeVar`` supports constraining parametric types to a fixed set of
possible types. For example, we can define a type variable that ranges
over just ``str`` and ``bytes``. By default, a type variable ranges over
all possible types.
Example of constraining a type variable::

  from typing import TypeVar

  AnyStr = TypeVar('AnyStr', str, bytes)

  def concat(x: AnyStr, y: AnyStr) -> AnyStr:
      return x + y

The function ``concat`` can be called with either two ``str`` arguments
or two ``bytes`` arguments, but not with a mix of ``str`` and ``bytes``
arguments.

Note that subtypes of types constrained by a type variable are treated
as their respective explicitly listed base types in the context of the
type variable. Consider this example::

  class MyStr(str): ...

  x = concat(MyStr('apple'), MyStr('pie'))

The call is valid but the type variable ``AnyStr`` will be set to
``str`` and not ``MyStr``. In effect, the inferred type of the return
value assigned to ``x`` will also be ``str``.

Additionally, ``Any`` is a valid value for every type variable.
Consider the following::

  def count_truthy(elements: List[Any]) -> int:
      return sum(1 for elem in elements if elem)

This is equivalent to omitting the generic notation and just saying
``elements: List``.

User-defined generic types
--------------------------

You can include a ``Generic`` base class to define a user-defined class
as generic. Example::

  from typing import TypeVar, Generic

  T = TypeVar('T')

  class LoggedVar(Generic[T]):
      def __init__(self, value: T, name: str, logger: Logger) -> None:
          self.name = name
          self.logger = logger
          self.value = value

      def set(self, new: T) -> None:
          self.log('Set ' + repr(self.value))
          self.value = new

      def get(self) -> T:
          self.log('Get ' + repr(self.value))
          return self.value

      def log(self, message: str) -> None:
          self.logger.info('{}: {}'.format(self.name, message))

``Generic[T]`` as a base class defines that the class ``LoggedVar``
takes a single type parameter ``T``. This also makes ``T`` valid as a
type within the class body.

The ``Generic`` base class uses a metaclass that defines ``__getitem__``
so that ``LoggedVar[t]`` is valid as a type::

  from typing import Iterable

  def zero_all_vars(vars: Iterable[LoggedVar[int]]) -> None:
      for var in vars:
          var.set(0)

A generic type can have any number of type variables, and type variables
may be constrained. This is valid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')
  S = TypeVar('S')

  class Pair(Generic[T, S]):
      ...

Each type variable argument to ``Generic`` must be distinct. This is
thus invalid::

  from typing import TypeVar, Generic
  ...

  T = TypeVar('T')

  class Pair(Generic[T, T]):  # INVALID
      ...

You can use multiple inheritance with ``Generic``::

  from typing import TypeVar, Generic, Sized

  T = TypeVar('T')

  class LinkedList(Sized, Generic[T]):
      ...

Arbitrary generic types as base classes
---------------------------------------

``Generic[T]`` is only valid as a base class -- it's not a proper type.
However, user-defined generic types such as ``LinkedList[T]`` from the
above example and built-in generic types and ABCs such as ``List[T]``
and ``Iterable[T]`` are valid both as types and as base classes. For
example, we can define a subclass of ``Dict`` that specializes type
arguments::

  from typing import Dict, List, Optional

  class Node: ...

  class SymbolTable(Dict[str, List[Node]]):
      def push(self, name: str, node: Node) -> None:
          self.setdefault(name, []).append(node)

      def pop(self, name: str) -> Node:
          return self[name].pop()

      def lookup(self, name: str) -> Optional[Node]:
          nodes = self.get(name)
          if nodes:
              return nodes[-1]
          return None

``SymbolTable`` is a subclass of ``dict`` and a subtype of ``Dict[str,
List[Node]]``. If a generic base class has a type variable as a type
argument, this makes the defined class generic.
For example, we can define a generic ``LinkedList`` class that is
iterable and a container::

  from typing import TypeVar, Iterable, Container

  T = TypeVar('T')

  class LinkedList(Iterable[T], Container[T]):
      ...

Now ``LinkedList[int]`` is a valid type. Note that we can use ``T``
multiple times in the base class list, as long as we don't use the same
type variable ``T`` multiple times within ``Generic[...]``.

Abstract generic types
----------------------

The metaclass used by ``Generic`` is a subclass of ``abc.ABCMeta``. A
generic class can be an ABC by including abstract methods or properties,
and generic classes can also have ABCs as base classes without a
metaclass conflict.

Forward references
------------------

When a type hint contains names that have not been defined yet, that
definition may be expressed as a string literal, to be resolved later.

A situation where this occurs commonly is the definition of a container
class, where the class being defined occurs in the signature of some of
the methods. For example, the following code (the start of a simple
binary tree implementation) does not work::

  class Tree:
      def __init__(self, left: Tree, right: Tree):
          self.left = left
          self.right = right

To address this, we write::

  class Tree:
      def __init__(self, left: 'Tree', right: 'Tree'):
          self.left = left
          self.right = right

The string literal should contain a valid Python expression (i.e.,
``compile(lit, '', 'eval')`` should be a valid code object) and it
should evaluate without errors once the module has been fully loaded.
The local and global namespace in which it is evaluated should be the
same namespaces in which default arguments to the same function would be
evaluated.

Moreover, the expression should be parseable as a valid type hint, i.e.,
it is constrained by the rules from the section `Acceptable type hints`_
above.

It is allowable to use string literals as *part* of a type hint, for
example::

  class Tree:
      ...
      def leaves(self) -> List['Tree']:
          ...

Union types
-----------

Since accepting a small, limited set of expected types for a single
argument is common, there is a new special factory called ``Union``.
Example::

  from typing import Union

  def handle_employees(e: Union[Employee, Sequence[Employee]]) -> None:
      if isinstance(e, Employee):
          e = [e]
      ...

A type factored by ``Union[T1, T2, ...]`` responds ``True`` to
``issubclass`` checks for ``T1`` and any of its subclasses, ``T2`` and
any of its subclasses, and so on.

One common case of union types is *optional* types. By default, ``None``
is an invalid value for any type, unless a default value of ``None`` has
been provided in the function definition. Examples::

  def handle_employee(e: Union[Employee, None]) -> None: ...

As a shorthand for ``Union[T1, None]`` you can write ``Optional[T1]``;
for example, the above is equivalent to::

  from typing import Optional

  def handle_employee(e: Optional[Employee]) -> None: ...

An optional type is also automatically assumed when the default value is
``None``, for example::

  def handle_employee(e: Employee = None): ...

This is equivalent to::

  def handle_employee(e: Optional[Employee] = None) -> None: ...

The ``Any`` type
----------------

A special kind of type is ``Any``. Every class is a subclass of ``Any``.
This is also true for the builtin class ``object``. However, to the
static type checker these are completely different.
When the type of a value is ``object``, the type checker will reject almost all operations on it, and assigning it to a variable (or using it as a return value) of a more specialized type is a type error. On the other hand, when a value has type ``Any``, the type checker will allow all operations on it, and a value of type ``Any`` can be assigned to a variable (or used as a return value) of a more constrained type. Predefined constants -------------------- Some predefined Boolean constants are defined in the ``typing`` module to enable platform-specific type definitions and such:: from typing import PY2, PY3, WINDOWS, POSIX if PY2: text = unicode else: text = str def f() -> text: ... if WINDOWS: loop = ProactorEventLoop else: loop = UnixSelectorEventLoop It is up to the type checker implementation to define their values, as long as ``PY2 == not PY3`` and ``WINDOWS == not POSIX``. When the program is being executed these always reflect the current platform, and this is also the suggested default when the program is being type-checked. Compatibility with other uses of function annotations ===================================================== A number of existing or potential use cases for function annotations exist, which are incompatible with type hinting. These may confuse a static type checker. However, since type hinting annotations have no runtime behavior (other than evaluation of the annotation expression and storing annotations in the ``__annotations__`` attribute of the function object), this does not make the program incorrect -- it just may cause a type checker to emit spurious warnings or errors. To mark portions of the program that should not be covered by type hinting, you can use one or more of the following: * a ``# type: ignore`` comment; * a ``@no_type_check`` decorator on a class or function; * a custom class or function decorator marked with ``@no_type_check_decorator``. For more details see later sections. In order for maximal compatibility with offline type checking it may eventually be a good idea to change interfaces that rely on annotations to switch to a different mechanism, for example a decorator. In Python 3.5 there is no pressure to do this, however. See also the longer discussion under `Rejected alternatives`_ below. Type comments ============= No first-class syntax support for explicitly marking variables as being of a specific type is added by this PEP. To help with type inference in complex cases, a comment of the following format may be used:: x = [] # type: List[Employee] x, y, z = [], [], [] # type: List[int], List[int], List[str] x, y, z = [], [], [] # type: (List[int], List[int], List[str]) x = [ 1, 2, ] # type: List[int] Type comments should be put on the last line of the statement that contains the variable definition. They can also be placed on ``with`` statements and ``for`` statements, right after the colon. Examples of type comments on ``with`` and ``for`` statements:: with frobnicate() as foo: # type: int # Here foo is an int ... for x, y in points: # type: float, float # Here x and y are floats ... The ``# type: ignore`` comment should be put on the line that the error refers to:: import http.client errors = { 'not_found': http.client.NOT_FOUND # type: ignore } A ``# type: ignore`` comment on a line by itself disables all type checking for the rest of the file. If type hinting proves useful in general, a syntax for typing variables may be provided in a future Python version. 
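Tying together the ``# type:`` comments just described and the ``Any``
type from above, here is a minimal sketch -- written for this posting,
not taken from the PEP; the attribute name is made up -- of what a
checker is expected to accept and reject::

  from typing import Any

  def process(x: object, y: Any) -> None:
      x.frobnicate()       # rejected: ``object`` has no such attribute
      y.frobnicate()       # accepted: any operation is allowed on ``Any``
      s = x  # type: str   # rejected: ``object`` is not consistent with ``str``
      t = y  # type: str   # accepted: ``Any`` is consistent with ``str``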
Casts ===== Occasionally the type checker may need a different kind of hint: the programmer may know that an expression is of a more constrained type than the type checker infers. For example:: from typing import List, cast def find_first_str(a: List[object]) -> str: index = next(i for i, x in enumerate(a) if isinstance(x, str)) # We only get here if there's at least one string in a return cast(str, a[index]) The type checker infers the type ``object`` for ``a[index]``, but we know that (if the code gets to that point) it must be a string. The ``cast(t, x)`` call tells the type checker that we are confident that the type of ``x`` is ``t``. At runtime a cast always returns the expression unchanged -- it does not check the type, and it does not convert or coerce the value. Casts differ from type comments (see the previous section). When using a type comment, the type checker should still verify that the inferred type is consistent with the stated type. When using a cast, the type checker should blindly believe the programmer. Also, casts can be used in expressions, while type comments only apply to assignments. Stub Files ========== Stub files are files containing type hints that are only for use by the type checker, not at runtime. There are several use cases for stub files: * Extension modules * Third-party modules whose authors have not yet added type hints * Standard library modules for which type hints have not yet been written * Modules that must be compatible with Python 2 and 3 * Modules that use annotations for other purposes Stub files have the same syntax as regular Python modules. There is one feature of the ``typing`` module that may only be used in stub files: the ``@overload`` decorator described below. The type checker should only check function signatures in stub files; function bodies in stub files should just be a single ``pass`` statement. The type checker should have a configurable search path for stub files. If a stub file is found the type checker should not read the corresponding "real" module. While stub files are syntactically valid Python modules, they use the ``.pyi`` extension to make it possible to maintain stub files in the same directory as the corresponding real module. This also reinforces the notion that no runtime behavior should be expected of stub files. Function overloading -------------------- The ``@overload`` decorator allows describing functions that support multiple different combinations of argument types. This pattern is used frequently in builtin modules and types. For example, the ``__getitem__()`` method of the ``bytes`` type can be described as follows:: from typing import overload class bytes: ... @overload def __getitem__(self, i: int) -> int: pass @overload def __getitem__(self, s: slice) -> bytes: pass This description is more precise than would be possible using unions (which cannot express the relationship between the argument and return types):: from typing import Union class bytes: ... 
      def __getitem__(self, a: Union[int, slice]) -> Union[int, bytes]:
          pass

Another example where ``@overload`` comes in handy is the type of the
builtin ``map()`` function, which takes a different number of arguments
depending on the type of the callable::

  from typing import Callable, Iterable, Iterator, Tuple, TypeVar, overload

  T1 = TypeVar('T1')
  T2 = TypeVar('T2')
  S = TypeVar('S')

  @overload
  def map(func: Callable[[T1], S], iter1: Iterable[T1]) -> Iterator[S]:
      pass

  @overload
  def map(func: Callable[[T1, T2], S],
          iter1: Iterable[T1], iter2: Iterable[T2]) -> Iterator[S]:
      pass

  # ... and we could add more items to support more than two iterables

Note that we could also easily add items to support ``map(None, ...)``::

  @overload
  def map(func: None, iter1: Iterable[T1]) -> Iterable[T1]:
      pass

  @overload
  def map(func: None,
          iter1: Iterable[T1],
          iter2: Iterable[T2]) -> Iterable[Tuple[T1, T2]]:
      pass

The ``@overload`` decorator may only be used in stub files. While it
would be possible to provide a multiple dispatch implementation using
this syntax, its implementation would require using ``sys._getframe()``,
which is frowned upon. Also, designing and implementing an efficient
multiple dispatch mechanism is hard, which is why previous attempts were
abandoned in favor of ``functools.singledispatch()``. (See PEP 443,
especially its section "Alternative approaches".) In the future we may
come up with a satisfactory multiple dispatch design, but we don't want
such a design to be constrained by the overloading syntax defined for
type hints in stub files.

Storing and distributing stub files
-----------------------------------

The easiest form of stub file storage and distribution is to put them
alongside Python modules in the same directory. This makes them easy to
find by both programmers and the tools. However, since package
maintainers are free not to add type hinting to their packages,
third-party stubs installable by ``pip`` from PyPI are also supported.
In this case we have to consider three issues: naming, versioning,
installation path.

This PEP does not provide a recommendation on a naming scheme that
should be used for third-party stub file packages. Discoverability will
hopefully be based on package popularity, like with Django packages for
example.

Third-party stubs have to be versioned using the lowest version of the
source package that is compatible. Example: FooPackage has versions 1.0,
1.1, 1.2, 1.3, 2.0, 2.1, 2.2. There are API changes in versions 1.1, 2.0
and 2.2. The stub file package maintainer is free to release stubs for
all versions but at least 1.0, 1.1, 2.0 and 2.2 are needed to enable the
end user to type check all versions. This is because the user knows that
the closest *lower or equal* version of stubs is compatible. In the
provided example, for FooPackage 1.3 the user would choose stubs version
1.1.

Note that if the user decides to use the "latest" available source
package, using the "latest" stub files should generally also work if
they're updated often.

Third-party stub packages can use any location for stub storage. The
type checker will search for them using PYTHONPATH. A default fallback
directory that is always checked is ``shared/typehints/python3.5/`` (or
3.6, etc.). Since there can only be one package installed for a given
Python version per environment, no additional versioning is performed
under that directory (just like bare directory installs by ``pip`` in
site-packages). Stub file package authors might use the following
snippet in ``setup.py``::

  ...
data_files=[ ( 'shared/typehints/python{}.{}'.format(*sys.version_info[:2]), pathlib.Path(SRC_PATH).glob('**/*.pyi'), ), ], ... Exceptions ========== No syntax for listing explicitly raised exceptions is proposed. Currently the only known use case for this feature is documentational, in which case the recommendation is to put this information in a docstring. The ``typing`` Module ===================== To open the usage of static type checking to Python 3.5 as well as older versions, a uniform namespace is required. For this purpose, a new module in the standard library is introduced called ``typing``. It holds a set of classes representing builtin types with generics, namely: * Dict, used as ``Dict[key_type, value_type]`` * List, used as ``List[element_type]`` * Set, used as ``Set[element_type]``. See remark for ``AbstractSet`` below. * FrozenSet, used as ``FrozenSet[element_type]`` * Tuple, used by listing the element types, for example ``Tuple[int, int, str]``. Arbitrary-length homogeneous tuples can be expressed using one type and ellipsis, for example ``Tuple[int, ...]``. (The ``...`` here are part of the syntax.) * NamedTuple, used as ``NamedTuple(type_name, [(field_name, field_type), ...])`` and equivalent to ``collections.namedtuple(type_name, [field_name, ...])``. The generic versions of concrete collection types (``Dict``, ``List``, ``Set``, ``FrozenSet``, and homogeneous arbitrary-length ``Tuple``) are mainly useful for annotating return values. For arguments, prefer the abstract collection types defined below, e.g. ``Mapping``, ``Sequence`` or ``AbstractSet``. The ``typing`` module defines the ``Generator`` type for return values of generator functions. It is a subtype of ``Iterable`` and it has additional type variables for the type accepted by the ``send()`` method and the return type of the generator: * Generator, used as ``Generator[yield_type, send_type, return_type]`` It also introduces factories and helper members needed to express generics and union types: * Any, used as ``def get(key: str) -> Any: ...`` * Union, used as ``Union[Type1, Type2, Type3]`` * TypeVar, used as ``X = TypeVar('X', Type1, Type2, Type3)`` or simply ``Y = TypeVar('Y')`` * Callable, used as ``Callable[[Arg1Type, Arg2Type], ReturnType]`` * AnyStr, defined as ``TypeVar('AnyStr', str, bytes)`` All abstract base classes available in ``collections.abc`` are importable from the ``typing`` module, with added generics support: * ByteString * Callable (see above) * Container * Hashable (not generic, but present for completeness) * ItemsView * Iterable * Iterator * KeysView * Mapping * MappingView * MutableMapping * MutableSequence * MutableSet * Sequence * Set, renamed to ``AbstractSet``. This name change was required because ``Set`` in the ``typing`` module means ``set()`` with generics. 
* Sized (not generic, but present for completeness)
* ValuesView

A few one-off types are defined that test for single special methods
(similar to ``Hashable`` or ``Sized``):

* Reversible, to test for ``__reversed__``
* SupportsAbs, to test for ``__abs__``
* SupportsFloat, to test for ``__float__``
* SupportsInt, to test for ``__int__``
* SupportsRound, to test for ``__round__``

The library includes literals for platform-specific type hinting:

* PY2
* PY3, equivalent to ``not PY2``
* WINDOWS
* POSIX, equivalent to ``not WINDOWS``

The following convenience functions and decorators are exported:

* cast, described earlier
* no_type_check, a decorator to disable type checking per class or
  function (see below)
* no_type_check_decorator, a decorator to create your own decorators
  with the same meaning as ``@no_type_check`` (see below)
* overload, described earlier
* get_type_hints, a utility function to retrieve the type hints from a
  function or method. Given a function or method object, it returns a
  dict with the same format as ``__annotations__``, but evaluating
  forward references (which are given as string literals) as expressions
  in the context of the original function or method definition.

The following types are available in the ``typing.io`` module:

* IO (generic over ``AnyStr``)
* BinaryIO (a simple subclass of ``IO[bytes]``)
* TextIO (a simple subclass of ``IO[str]``)

The following types are provided by the ``typing.re`` module:

* Match and Pattern, types of ``re.match()`` and ``re.compile()``
  results (generic over ``AnyStr``)

As a convenience measure, types from ``typing.io`` and ``typing.re`` are
also available in ``typing`` (quoting Guido, "There's a reason those
modules have two-letter names.").

Rejected Alternatives
=====================

During discussion of earlier drafts of this PEP, various objections were
raised and alternatives were proposed. We discuss some of these here and
explain why we reject them. Several main objections were raised.

Which brackets for generic type parameters?
-------------------------------------------

Most people are familiar with the use of angular brackets
(e.g. ``List<int>``) in languages like C++, Java, C# and Swift to
express the parametrization of generic types. The problem with these is
that they are really hard to parse, especially for a simple-minded
parser like Python. In most languages the ambiguities are usually dealt
with by only allowing angular brackets in specific syntactic positions,
where general expressions aren't allowed. (And also by using very
powerful parsing techniques that can backtrack over an arbitrary section
of code.)

But in Python, we'd like type expressions to be (syntactically) the same
as other expressions, so that we can use e.g. variable assignment to
create type aliases. Consider this simple type expression::

  List<int>

From the Python parser's perspective, the expression begins with the
same four tokens (NAME, LESS, NAME, GREATER) as a chained comparison::

  a < b > c  # I.e., (a < b) and (b > c)

We can even make up an example that could be parsed both ways::

  a < b > [ c ]

Assuming we had angular brackets in the language, this could be
interpreted as either of the following two::

  (a<b>)[c]      # I.e., (a<b>).__getitem__(c)
  a < b > ([c])  # I.e., (a < b) and (b > [c])

It would surely be possible to come up with a rule to disambiguate such
cases, but to most users the rules would feel arbitrary and complex. It
would also require us to dramatically change the CPython parser (and
every other parser for Python).
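The chained-comparison reading is easy to check with the ``ast`` module
(this little demonstration is ours, not part of the PEP; the exact
``dump()`` output format differs between Python versions)::

  import ast

  tree = ast.parse("a < b > [c]", mode="eval")
  print(ast.dump(tree.body))
  # One Compare node with ops=[Lt(), Gt()] -- i.e. the chained
  # comparison (a < b) and (b > [c]), not a subscripted generic type.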
It should be noted that Python's current parser is intentionally "dumb"
-- a simple grammar is easier for users to reason about.

For all these reasons, square brackets (e.g. ``List[int]``) are (and
have long been) the preferred syntax for generic type parameters. They
can be implemented by defining the ``__getitem__()`` method on the
metaclass, and no new syntax is required at all. This option works in
all recent versions of Python (starting with Python 2.2). Python is not
alone in this syntactic choice -- generic classes in Scala also use
square brackets.

What about existing uses of annotations?
----------------------------------------

One line of argument points out that PEP 3107 explicitly supports the
use of arbitrary expressions in function annotations. The new proposal
is then considered incompatible with the specification of PEP 3107.

Our response to this is that, first of all, the current proposal does
not introduce any direct incompatibilities, so programs using
annotations in Python 3.4 will still work correctly and without
prejudice in Python 3.5.

We do hope that type hints will eventually become the sole use for
annotations, but this will require additional discussion and a
deprecation period after the initial roll-out of the typing module with
Python 3.5. The current PEP will have provisional status (see PEP 411)
until Python 3.6 is released. The fastest conceivable scheme would
introduce silent deprecation of non-type-hint annotations in 3.6, full
deprecation in 3.7, and declare type hints as the only allowed use of
annotations in Python 3.8. This should give authors of packages that use
annotations plenty of time to devise another approach, even if type
hints become an overnight success.

Another possible outcome would be that type hints will eventually become
the default meaning for annotations, but that there will always remain
an option to disable them. For this purpose the current proposal defines
a decorator ``@no_type_check`` which disables the default interpretation
of annotations as type hints in a given class or function. It also
defines a meta-decorator ``@no_type_check_decorator`` which can be used
to decorate a decorator (!), causing annotations in any function or
class decorated with the latter to be ignored by the type checker.

There are also ``# type: ignore`` comments, and static checkers should
support configuration options to disable type checking in selected
packages.

Despite all these options, proposals have been circulated to allow type
hints and other forms of annotations to coexist for individual
arguments. One proposal suggests that if an annotation for a given
argument is a dictionary literal, each key represents a different form
of annotation, and the key ``'type'`` would be used for type hints. The
problem with this idea and its variants is that the notation becomes
very "noisy" and hard to read. Also, in most cases where existing
libraries use annotations, there would be little need to combine them
with type hints. So the simpler approach of selectively disabling type
hints appears sufficient.

The problem of forward declarations
-----------------------------------

The current proposal is admittedly sub-optimal when type hints must
contain forward references. Python requires all names to be defined by
the time they are used. Apart from circular imports this is rarely a
problem: "use" here means "look up at runtime", and with most "forward"
references there is no problem in ensuring that a name is defined before
the function using it is called.
The problem with type hints is that annotations (per PEP 3107, and
similar to default values) are evaluated at the time a function is
defined, and thus any names used in an annotation must be already
defined when the function is being defined.

A common scenario is a class definition whose methods need to reference
the class itself in their annotations. (More generally, it can also
occur with mutually recursive classes.) This is natural for container
types, for example::

  class Node:
      """Binary tree node."""

      def __init__(self, left: Node, right: Node):
          self.left = left
          self.right = right

As written this will not work, because of the peculiarity in Python that
class names become defined once the entire body of the class has been
executed.

Our solution, which isn't particularly elegant, but gets the job done,
is to allow using string literals in annotations. Most of the time you
won't have to use this though -- most *uses* of type hints are expected
to reference builtin types or types defined in other modules.

A counterproposal would change the semantics of type hints so they
aren't evaluated at runtime at all (after all, type checking happens
off-line, so why would type hints need to be evaluated at runtime at
all?). This of course would run afoul of backwards compatibility, since
the Python interpreter doesn't actually know whether a particular
annotation is meant to be a type hint or something else.

A compromise is possible where a ``__future__`` import could enable
turning *all* annotations in a given module into string literals, as
follows::

  from __future__ import annotations

  class ImSet:
      def add(self, a: ImSet) -> List[ImSet]: ...

  assert ImSet.add.__annotations__ == {'a': 'ImSet', 'return': 'List[ImSet]'}

Such a ``__future__`` import statement will be proposed in a separate
PEP.

The double colon
----------------

A few creative souls have tried to invent solutions for this problem.
For example, it was proposed to use a double colon (``::``) for type
hints, solving two problems at once: disambiguating between type hints
and other annotations, and changing the semantics to preclude runtime
evaluation. There are several things wrong with this idea, however.

* It's ugly. The single colon in Python has many uses, and all of them
  look familiar because they resemble the use of the colon in English
  text. This is a general rule of thumb by which Python abides for most
  forms of punctuation; the exceptions are typically well known from
  other programming languages. But this use of ``::`` is unheard of in
  English, and in other languages (e.g. C++) it is used as a scoping
  operator, which is a very different beast. In contrast, the single
  colon for type hints reads naturally -- and no wonder, since it was
  carefully designed for this purpose (the idea long predates PEP 3107
  [gvr-artima]_). It is also used in the same fashion in other languages
  from Pascal to Swift.

* What would you do for return type annotations?

* It's actually a feature that type hints are evaluated at runtime.

  * Making type hints available at runtime allows runtime type checkers
    to be built on top of type hints.

  * It catches mistakes even when the type checker is not run. Since it
    is a separate program, users may choose not to run it (or even
    install it), but might still want to use type hints as a concise
    form of documentation. Broken type hints are no use even for
    documentation.

* Because it's new syntax, using the double colon for type hints would
  limit them to code that works with Python 3.5 only.
  By using existing syntax, the current proposal can easily work for
  older versions of Python 3. (And in fact mypy supports Python 3.2 and
  newer.)

* If type hints become successful we may well decide to add new syntax
  in the future to declare the type for variables, for example
  ``var age: int = 42``. If we were to use a double colon for argument
  type hints, for consistency we'd have to use the same convention for
  future syntax, perpetuating the ugliness.

Other forms of new syntax
-------------------------

A few other forms of alternative syntax have been proposed, e.g. the
introduction of a ``where`` keyword [roberge]_, and Cobra-inspired
``requires`` clauses. But these all share a problem with the double
colon: they won't work for earlier versions of Python 3. The same would
apply to a new ``__future__`` import.

Other backwards compatible conventions
--------------------------------------

The ideas put forward include:

* A decorator, e.g. ``@typehints(name=str, returns=str)``. This could
  work, but it's pretty verbose (an extra line, and the argument names
  must be repeated), and a far cry in elegance from the PEP 3107
  notation.

* Stub files. We do want stub files, but they are primarily useful for
  adding type hints to existing code that doesn't lend itself to adding
  type hints, e.g. 3rd party packages, code that needs to support both
  Python 2 and Python 3, and especially extension modules. For most
  situations, having the annotations in line with the function
  definitions makes them much more useful.

* Docstrings. There is an existing convention for docstrings, based on
  the Sphinx notation (``:type arg1: description``). This is pretty
  verbose (an extra line per parameter), and not very elegant. We could
  also make up something new, but the annotation syntax is hard to beat
  (because it was designed for this very purpose).

It's also been proposed to simply wait another release. But what problem
would that solve? It would just be procrastination.

PEP Development Process
=======================

A live draft for this PEP lives on GitHub [github]_. There is also an
issue tracker [issues]_, where much of the technical discussion takes
place.

The draft on GitHub is updated regularly in small increments. The
official PEPs repo [peps]_ is (usually) only updated when a new draft is
posted to python-dev.

Acknowledgements
================

This document could not be completed without valuable input,
encouragement and advice from Jim Baker, Jeremy Siek, Michael Matson
Vitousek, Andrey Vlasovskikh, Radomir Dopieralski, Peter Ludemann, and
the BDFL-Delegate, Mark Shannon.

Influences include existing languages, libraries and frameworks
mentioned in PEP 482. Many thanks to their creators, in alphabetical
order: Stefan Behnel, William Edwards, Greg Ewing, Larry Hastings,
Anders Hejlsberg, Alok Menghrajani, Travis E. Oliphant, Joe Pamer,
Raoul-Gabriel Urma, and Julien Verlaguet.

References
==========

.. [mypy] http://mypy-lang.org

.. [pyflakes] https://github.com/pyflakes/pyflakes/

.. [pylint] http://www.pylint.org

.. [gvr-artima] http://www.artima.com/weblogs/viewpost.jsp?thread=85551

.. [roberge] http://aroberge.blogspot.com/2015/01/type-hinting-in-python-focus-on.html

.. [github] https://github.com/ambv/typehinting

.. [issues] https://github.com/ambv/typehinting/issues

.. [peps] https://hg.python.org/peps/file/tip/pep-0484.txt

Copyright
=========

This document has been placed in the public domain.

..
Local Variables: mode: indented-text indent-tabs-mode: nil sentence-end-double-space: t fill-column: 70 coding: utf-8 End: -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rdmurray at bitdance.com Sat Apr 18 01:18:35 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Fri, 17 Apr 2015 19:18:35 -0400 Subject: [Python-Dev] Summary of Python tracker Issues In-Reply-To: <20150417160824.84FD356706@psf.upfronthosting.co.za> References: <20150417160824.84FD356706@psf.upfronthosting.co.za> Message-ID: <20150417231836.11A5DB30514@webabinitio.net> On Fri, 17 Apr 2015 18:08:24 +0200, Python tracker wrote: > > ACTIVITY SUMMARY (2015-04-10 - 2015-04-17) > Python tracker at http://bugs.python.org/ > > To view or respond to any of the issues listed below, click on the issue. > Do NOT respond to this message. > > Issues counts and deltas: > open 4792 (-31) > closed 30957 (+113) > total 35749 (+82) That's a successful sprint week :) Thanks everyone! --David From pmiscml at gmail.com Sat Apr 18 00:35:31 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Sat, 18 Apr 2015 01:35:31 +0300 Subject: [Python-Dev] async/await PEP In-Reply-To: <55315B32.5000205@gmail.com> References: <55315B32.5000205@gmail.com> Message-ID: <20150418013531.7802cdc6@x230> Hello, On Fri, 17 Apr 2015 15:12:50 -0400 Yury Selivanov wrote: > Hello, > > I've just posted a new PEP about adding async/await to python-ideas. > Maybe I should have posted it here instead.. For reference, python-ideas archive link is https://mail.python.org/pipermail/python-ideas/2015-April/033007.html > > Anyways, please take a look. > > Thanks, > Yury -- Best regards, Paul mailto:pmiscml at gmail.com From eric at trueblade.com Sat Apr 18 01:52:22 2015 From: eric at trueblade.com (Eric V. Smith) Date: Fri, 17 Apr 2015 19:52:22 -0400 Subject: [Python-Dev] Summary of Python tracker Issues In-Reply-To: <20150417231836.11A5DB30514@webabinitio.net> References: <20150417160824.84FD356706@psf.upfronthosting.co.za> <20150417231836.11A5DB30514@webabinitio.net> Message-ID: <0868BD19-C7D1-46F5-B069-BD24D1CB87EB@trueblade.com> Thank you, David, for all of your work. It's much appreciated. -- Eric. > On Apr 17, 2015, at 7:18 PM, R. David Murray wrote: > >> On Fri, 17 Apr 2015 18:08:24 +0200, Python tracker wrote: >> >> ACTIVITY SUMMARY (2015-04-10 - 2015-04-17) >> Python tracker at http://bugs.python.org/ >> >> To view or respond to any of the issues listed below, click on the issue. >> Do NOT respond to this message. >> >> Issues counts and deltas: >> open 4792 (-31) >> closed 30957 (+113) >> total 35749 (+82) > > That's a successful sprint week :) > > Thanks everyone! > > --David > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/eric%2Ba-python-dev%40trueblade.com > From 4kir4.1i at gmail.com Sat Apr 18 02:19:41 2015 From: 4kir4.1i at gmail.com (Akira Li) Date: Sat, 18 Apr 2015 03:19:41 +0300 Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones In-Reply-To: References: <877ftdp1b6.fsf@gmail.com> Message-ID: On Thu, Apr 16, 2015 at 1:14 AM, Alexander Belopolsky < alexander.belopolsky at gmail.com> wrote: > > On Wed, Apr 15, 2015 at 4:46 PM, Akira Li <4kir4.1i at gmail.com> wrote: > >> > Look what happened on July 1, 1990. 
At 2 AM, the clocks in Ukraine were >> > moved back one hour. So times like 01:30 AM happened twice there on >> that >> > day. Let's see how Python handles this situation >> > >> > $ TZ=Europe/Kiev python3 >> >>>> from email.utils import localtime >> >>>> from datetime import datetime >> >>>> localtime(datetime(1990,7,1,1,30)).strftime('%c %z %Z') >> > 'Sun Jul 1 01:30:00 1990 +0400 MSD' >> > >> > So far so good, I've got the first of the two 01:30AM's. But what if I >> > want the other 01:30AM? Well, >> > >> >>>> localtime(datetime(1990,7,1,1,30), isdst=0).strftime('%c %z %Z') >> > 'Sun Jul 1 01:30:00 1990 +0300 EEST' >> > >> > gives me "the other 01:30AM", but it is counter-intuitive: I have to ask >> > for the standard (winter) time to get the daylight savings (summer) >> time. >> > >> >> It looks incorrect. Here's the corresponding pytz code: >> >> from datetime import datetime >> import pytz >> >> tz = pytz.timezone('Europe/Kiev') >> print(tz.localize(datetime(1990, 7, 1, 1, 30), >> is_dst=False).strftime('%c %z %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0300 EEST >> print(tz.localize(datetime(1990, 7, 1, 1, 30), >> is_dst=True).strftime('%c %z %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0400 MSD >> >> See also "Enhance support for end-of-DST-like ambiguous time" [1] >> >> [1] https://bugs.launchpad.net/pytz/+bug/1378150 >> >> `email.utils.localtime()` is broken: >> > > If you think there is a bug in email.utils.localtime - please open an > issue at . > > Your question below suggests that you believe it is not a bug i.e., `email.utils.localtime()` is broken *by design* unless you think it is ok to ignore `+0400 MSD`. pytz works for me (I can get both `+0300 EEST` and `+0400 MSD`). I don't think `localtime()` can be fixed without the tz database. I don't know whether it should be fixed, let somebody else who can't use pytz to pioneer the issue. The purpose of the code example is to **inform** that `email.utils.localtime()` fails (it returns only +0300 EEST) in this case: >> from datetime import datetime >> from email.utils import localtime >> >> print(localtime(datetime(1990, 7, 1, 1, 30)).strftime('%c %z %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0300 EEST >> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=0).strftime('%c %z >> %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0300 EEST >> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=1).strftime('%c %z >> %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0300 EEST >> print(localtime(datetime(1990, 7, 1, 1, 30), isdst=-1).strftime('%c %z >> %Z')) >> # -> Sun Jul 1 01:30:00 1990 +0300 EEST >> >> >> Versions: >> >> $ ./python -V >> Python 3.5.0a3+ >> $ dpkg -s tzdata | grep -i version >> Version: 2015b-0ubuntu0.14.04 >> >> > The uncertainty about how to deal with the repeated hour was the reason >> why >> > email.utils.localtime-like interface did not make it to the datetime >> > module. >> >> "repeated hour" (time jumps back) can be treated like a end-of-DST >> transition, to resolve ambiguities [1]. > > > I don't understand what you are complaining about. It is quite possible > that pytz uses is_dst flag differently from the way email.utils.localtime > uses isdst. > > I was not able to find a good description of what is_dst means in pytz, > but localtime's isdst is documented as follows: > > a positive or zero value for *isdst* causes localtime to > presume initially that summer time (for example, Daylight Saving Time) > is or is not (respectively) in effect for the specified time. 
> > Can you demonstrate that email.utils.localtime does not behave as
> documented?
>

No need to be so defensive about it. *""repeated hour" (time jumps back)
can be treated like a end-of-DST transition, to resolve ambiguities
[1]."* is just *an example* of how to fix the problem in the same way it
is done in pytz:

>>> from datetime import datetime
>>> import pytz
>>> tz = pytz.timezone('Europe/Kiev')
>>> after = tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=False)
>>> before = tz.localize(datetime(1990, 7, 1, 1, 30), is_dst=True)
>>> before < after
True
>>> before
datetime.datetime(1990, 7, 1, 1, 30, tzinfo=)
>>> after
datetime.datetime(1990, 7, 1, 1, 30, tzinfo=)
>>> before.astimezone(pytz.utc)
datetime.datetime(1990, 6, 30, 21, 30, tzinfo=)
>>> after.astimezone(pytz.utc)
datetime.datetime(1990, 6, 30, 22, 30, tzinfo=)
>>> before.dst()
datetime.timedelta(0, 3600)
>>> after.dst()
datetime.timedelta(0, 3600)
>>> pytz.OLSON_VERSION
'2015b'

Here's "summer time" in both cases i.e., it is not a *true* end-of-DST
transition (that is why I've used the word *"like"* above). If we ignore
ambiguous time that may occur more than twice then a boolean flag such
as pytz's is_dst is *always* enough to resolve the ambiguity, assuming
we have access to the tz database.

And yes, the example demonstrates that the behavior of pytz's is_dst and
localtime()'s isdst is different. The example just shows that the
current behavior of localtime() doesn't allow getting `+0400 DST` (on my
system, see the software versions above) and how to get it (*adopt* the
pytz behavior -- you need zoneinfo for that) i.e., the message is a
problem and a possible solution -- no complaints.

[1] https://bugs.launchpad.net/pytz/+bug/1378150

--
Akira.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From alexander.belopolsky at gmail.com  Sat Apr 18 03:10:17 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Fri, 17 Apr 2015 21:10:17 -0400
Subject: [Python-Dev] Aware datetime from naive local time Was: Status on PEP-431 Timezones
In-Reply-To:
References: <877ftdp1b6.fsf@gmail.com>
Message-ID:

On Fri, Apr 17, 2015 at 8:19 PM, Akira Li <4kir4.1i at gmail.com> wrote:

> Can you demonstrate that email.utils.localtime does not behave as
>> documented?
>>
>
> No need to be so defensive about it.
>

There is nothing "defensive" in my question. I simply don't understand
what you are complaining about other than your code using pytz produces
different results from some other code of yours using
email.utils.localtime. If you think you found a bug in
email.utils.localtime - please explain it without referring to a third
party library. It will also help if you do it at the bug tracker.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From stefan_ml at behnel.de  Sat Apr 18 07:35:19 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 18 Apr 2015 07:35:19 +0200
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

Guido van Rossum schrieb am 17.04.2015 um 23:58:
> The ``typing`` module defines the ``Generator`` type for return values
> of generator functions. It is a subtype of ``Iterable`` and it has
> additional type variables for the type accepted by the ``send()``
> method and the return type of the generator:
>
> * Generator, used as ``Generator[yield_type, send_type, return_type]``

Is this meant to accept only Python generators, or any kind of object that
implements the coroutine protocol?
I'm asking because asyncio currently suffers from an annoying type
check here, so I'd like to avoid having this problem keep popping up
in other places.

Stefan

From stefan_ml at behnel.de  Sat Apr 18 07:58:28 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 18 Apr 2015 07:58:28 +0200
Subject: [Python-Dev] Changing PyModuleDef.m_reload to m_slots
In-Reply-To: <55311038.5080200@gmail.com>
References: <55311038.5080200@gmail.com>
Message-ID:

Petr Viktorin wrote on 17.04.2015 at 15:52:
> As a background, the PyModuleDef structure [1] is currently:
>
> struct PyModuleDef {
>     PyModuleDef_Base m_base;
>     const char* m_name;
>     const char* m_doc;
>     Py_ssize_t m_size;
>     PyMethodDef *m_methods;
>     inquiry m_reload;
>     traverseproc m_traverse;
>     inquiry m_clear;
>     freefunc m_free;
> };
>
> ... where the m_reload pointer is unused, and must be NULL.
> My proposal is to repurpose this pointer to hold an array of slots, in
> the style of PEP 384's PyType_Spec [2], which would allow adding
> extensions -- both those needed for PEP 489 and future ones.

FWIW, I'm +1 on this. It replaces a struct field that risks staying
unused basically forever with an extensible interface that massively
improves the current extension module protocol and allows future
extensions (including getting back the pointer we replaced).

The alternative of essentially duplicating all sorts of things to
accommodate a new metadata struct is way too cumbersome and ugly in
comparison.

Stefan

From guido at python.org  Sat Apr 18 18:39:30 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 18 Apr 2015 09:39:30 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

(Also, there might be some interaction with PEP 492 here, which also
tweaks the definition of generators.)

On Sat, Apr 18, 2015 at 9:38 AM, Guido van Rossum wrote:

> That's a good question. We *could* make it so that you can subclass
> Generator and instantiate the instances; or we could even make it do
> some structural type checking. (Please file a pull request or issue
> for this at github.com/ambv/typehinting .) But perhaps we should also
> change asyncio? What check are you talking about?

-- 
--Guido van Rossum (python.org/~guido)

From guido at python.org  Sat Apr 18 18:38:15 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 18 Apr 2015 09:38:15 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

That's a good question. We *could* make it so that you can subclass
Generator and instantiate the instances; or we could even make it do
some structural type checking. (Please file a pull request or issue
for this at github.com/ambv/typehinting .) But perhaps we should also
change asyncio? What check are you talking about?

On Fri, Apr 17, 2015 at 10:35 PM, Stefan Behnel wrote:

> Guido van Rossum wrote on 17.04.2015 at 23:58:
> > The ``typing`` module defines the ``Generator`` type for return values
> > of generator functions. It is a subtype of ``Iterable`` and it has
> > additional type variables for the type accepted by the ``send()``
> > method and the return type of the generator:
> >
> > * Generator, used as ``Generator[yield_type, send_type, return_type]``
>
> Is this meant to accept only Python generators, or any kind of object
> that implements the coroutine protocol? I'm asking because asyncio
> currently suffers from an annoying type check here, so I'd like to
> avoid having this problem keep popping up in other places.
>
> Stefan

-- 
--Guido van Rossum (python.org/~guido)

From stefan_ml at behnel.de  Sat Apr 18 19:39:51 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 18 Apr 2015 19:39:51 +0200
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

Guido van Rossum wrote on 18.04.2015 at 18:38:
> That's a good question. We *could* make it so that you can subclass
> Generator and instantiate the instances; or we could even make it do
> some structural type checking. (Please file a pull request or issue
> for this at github.com/ambv/typehinting .)

Done: https://github.com/ambv/typehinting/issues/89

> But perhaps we should also change asyncio?
> What check are you talking about?

https://hg.python.org/cpython/file/439517000aa2/Lib/asyncio/coroutines.py#l169

The current (3.5alpha) check in iscoroutine() is an instance check
against _COROUTINE_TYPES, which contains types.GeneratorType and
another wrapper type. It excludes objects that implement the coroutine
protocol without being Python generators, e.g. Cython compiled
generators, which mimic the Python generator interface without having
byte code in them (meaning, they are not C-level compatible with
Python generators). I'm sure the same applies to other compilers like
Numba or Nuitka. Supporting asyncio in recent Python releases thus
means monkey patching _COROUTINE_TYPES and adding another type to it.
Given that it's not a public interface, this seems a bit fragile.

It also seems that this code only appeared somewhere in the 3.4.x
release series. Older versions were even worse and did a straight call
to inspect.isgenerator() in their iscoroutine() function, which means
that the whole function needed to be replaced and wrapped (and even
that didn't fix all places where inspect was used).

I guess I should open a (CPython) ticket for this topic, too...
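
For reference, the workaround currently amounts to something like the
following rough sketch (it assumes _COROUTINE_TYPES is a plain tuple
of types in asyncio.coroutines, and CompiledGenerator is a made-up
stand-in for whatever type the compiler actually emits):

    import asyncio.coroutines

    class CompiledGenerator:
        # Hypothetical stand-in for a compiled generator-like type
        # that implements send()/throw()/close() without byte code.
        pass

    # Reach into the private module state so that iscoroutine()
    # starts recognizing the extra type.
    asyncio.coroutines._COROUTINE_TYPES = (
        asyncio.coroutines._COROUTINE_TYPES + (CompiledGenerator,)
    )
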
Stefan

From stefan_ml at behnel.de  Sat Apr 18 21:29:30 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 18 Apr 2015 21:29:30 +0200
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

Stefan Behnel wrote on 18.04.2015 at 19:39:
> Supporting asyncio in recent Python releases thus means monkey
> patching _COROUTINE_TYPES and adding another type to it. Given that
> it's not a public interface, this seems a bit fragile.
>
> It also seems that this code only appeared somewhere in the 3.4.x
> release series. Older versions were even worse and did a straight
> call to inspect.isgenerator() in their iscoroutine() function, which
> means that the whole function needed to be replaced and wrapped (and
> even that didn't fix all places where inspect was used).

Hmm, looks like I remembered it incorrectly. Monkey patching the
inspect module is required in both versions as it's being used in more
places, especially the coroutine wrappers. I'll see how I can get it
working with Cython and then open tickets for it.

Stefan

From guido at python.org  Sat Apr 18 22:28:35 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 18 Apr 2015 13:28:35 -0700
Subject: [Python-Dev] PEP 484 (Type Hints) -- the next draft is here
In-Reply-To:
References:
Message-ID:

Maybe the thing to fix then is the inspect module, not asyncio?
Anyway, let us know via tickets.

On Sat, Apr 18, 2015 at 12:29 PM, Stefan Behnel wrote:

> Hmm, looks like I remembered it incorrectly. Monkey patching the
> inspect module is required in both versions as it's being used in
> more places, especially the coroutine wrappers. I'll see how I can
> get it working with Cython and then open tickets for it.

-- 
--Guido van Rossum (python.org/~guido)

From larry at hastings.org  Sun Apr 19 10:19:35 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 19 Apr 2015 01:19:35 -0700
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <53E454E9.3030100@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org>
Message-ID: <55336517.6090702@hastings.org>

On 08/07/2014 09:41 PM, Larry Hastings wrote:
> Well! It's rare that the core dev community is so consistent in its
> opinion. I still think "nullable" is totally appropriate, but I'll
> change it to "allow_none".

(reviving eight-month-old thread)

In case anybody here is still interested in arguing about this: the
Clinic API may be shifting a bit here. What follows is a quick
refresher course on Argument Clinic, followed by a discussion of the
proposed new API.

Here's an Argument Clinic declaration of a parameter:

    s: str()

The parameter is called "s", and it's specifying a converter function
called "str" which handles converting string parameters. The str()
converter itself accepts parameters; since the parameters all have
default values, they're all optional. By default, str() maps directly
to the "s" format unit for PyArg_ParseTuple(), as it does here.

Currently str() (and a couple other converter functions) accepts a
parameter called "types". "types" is specified as a string, and
contains an unordered set of whitespace-separated strings representing
the Python types of the values this (Clinic) parameter should accept.
The default value of "types" for str() is "str"; the following
declaration is equivalent to the declaration above:

    s: str(types="str")

Other legal values for the "types" parameter for the str converter
include "bytes bytearray str" and "robuffer str". Internally the types
parameter is converted into a set of strings; passing it in as a
string is a nicety for the caller's benefit. (It also means that the
strings "robuffer str" and "str robuffer" are considered equivalent.)

There's a second parameter, currently called "nullable", but I was
supposed to rename it "allow_none", so I'll use that name here. If you
pass in "allow_none=True" to a converter, it means "this (Clinic)
parameter should accept the Python value None".

So, to map to the format unit "z", you would specify:

    s: str(allow_none=True)

And to map to the format unit "z#", you would specify:

    s: str(types="robuffer str", allow_none=True, length=True)

In hindsight this is all a bit silly. I propose what I think is a much
better API below.

We should rename "types" to "accept". "accept" should take a set of
types; these types specify the types of Python objects the Clinic
parameter should accept. For the funny pseudo-types needed in some
Clinic declarations ("buffer", "robuffer", and "rwbuffer"), Clinic
provides empty class declarations so these behave like types too.

accept={str} is the default for the str() converter. If you want to
map to format unit "z", you would write this:

    s: str(accept={str, NoneType})

(In case you haven't seen it before: NoneType = type(None). I don't
think the name is registered anywhere officially in the standard
library... but that's the name.)

The upside of this approach:

* Way, way more obvious to the casual reader. "types" was always meant
  as an unordered collection of types, but I felt specifying it with
  strings was unwieldy and made for poor reading ({'str', 'robuffer'}).
  Passing it in as a single string which I internally split and put in
  a set() was a bad compromise. But the semantics of this
  whitespace-delimited string were a bit unclear, even to the
  experienced Clinic hacker. This set-of-types version maps exactly to
  what the parameter was always meant to accept in the first place. As
  with any other code, people will read Clinic declarations far, far
  more often than they will write them, so optimizing for clarity is
  paramount.

* Zen: "There should be one (and preferably only one) obvious way to
  do it." We have a way of specifying the types this parameter should
  accept; "allow_none" adds a second.

* Zen: "Special cases aren't special enough to break the rules".
  "allow_none" was really just a special case of one possible type for
  "types".

The downside of this approach:

* You have to know what the default accept= set is for each converter.
  Luckily this is not onerous; there are only four converters that
  need an "accept" parameter, and their default values are all simple:

      int(accept={int})
      str(accept={str})
      Py_UNICODE(accept={str})
      Py_buffer(accept={buffer})

  I suggest this is only a (minor) problem when writing a Clinic
  declaration. It doesn't affect later readability, which is much more
  important.

* It means repeating yourself a little. If you just want to say "I
  want to accept None too", you have to redundantly specify the
  default type(s) accepted by the converter function. In practice,
  it's really only redundant for four or five format units, and
  they're not the frequently-used ones. Right now I only see three
  uses of nullable for the built-in format units (there are two more
  for my path_converter) and they're all for the str converter. Yes,
  we could create a set containing the default types accepted by each
  converter function, and just let the caller specify that and
  incrementally add +{NoneType}. But this would be far longer than
  simply redundantly respecifying the default (e.g.
  "accept=str_converter.accept_default + {NoneType}"). Sometimes the
  best thing is just to bite the bullet and accept a little redundancy.

Does "accept" sound good, including accepting "NoneType"?

FWIW I originally proposed this in an issue on the tracker:

    http://bugs.python.org/issue23920

But we should probably continue the discussion here.

Cheers,

//arry/

p.s. Yes, I never actually got around to renaming "nullable".
But I was going to get it done before beta, honest. And anyway,
remembering to do that is what led me down this path, which I think
has been fruitful in its own right.

From waultah at gmail.com  Sun Apr 19 21:03:27 2015
From: waultah at gmail.com (Riley Banks)
Date: Sun, 19 Apr 2015 20:03:27 +0100
Subject: [Python-Dev] [Issue 22619] Patch needs a review
Message-ID:

Greetings. Can someone review Serhiy's patch for the following issue?

    https://bugs.python.org/issue22619

I see Dmitry pinged the issue like 2 months ago, then 1 month later...

From v+python at g.nevcal.com  Sun Apr 19 22:26:13 2015
From: v+python at g.nevcal.com (Glenn Linderman)
Date: Sun, 19 Apr 2015 13:26:13 -0700
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <55336517.6090702@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
Message-ID: <55340F65.8050803@g.nevcal.com>

On 4/19/2015 1:19 AM, Larry Hastings wrote:
> * Zen: "There should be one (and preferably only one) obvious way to
>   do it." We have a way of specifying the types this parameter
>   should accept; "allow_none" adds a second.
> * Zen: "Special cases aren't special enough to break the rules".
>   "allow_none" was really just a special case of one possible type
>   for "types".

Is argument clinic a special case of type annotations? (Quoted and
worded to be provocative, intentionally but not maliciously.)

OK, I know that argument clinic applies to C code and I know that type
annotations apply to Python code. And I know that C code is a lot more
restrictive /a priori/, which clinic has to accommodate, and type
annotations are a way of adding (unenforced) restrictions on Python
code. Still, from a 50,000' view, there seems to be an overlap in
functionality... and both are aimed at Py 3.5... I find that
interesting... I guess describing parameter types is the latest Python
trend :)

From larry at hastings.org  Sun Apr 19 23:38:40 2015
From: larry at hastings.org (Larry Hastings)
Date: Sun, 19 Apr 2015 14:38:40 -0700
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <55340F65.8050803@g.nevcal.com>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
 <55340F65.8050803@g.nevcal.com>
Message-ID: <55342060.7010007@hastings.org>

On 04/19/2015 01:26 PM, Glenn Linderman wrote:
> Is argument clinic a special case of type annotations? (Quoted and
> worded to be provocative, intentionally but not maliciously.)

Argument Clinic and Python 3 type annotations are related concepts.
Argument Clinic's syntax is designed in such a way that we actually use
ast.parse() to parse it, and that includes using the type annotation
syntax. That's about all they have in common.

This discussion is off-topic and of limited interest; if you have
further questions along these lines please email me privately.

//arry/

From ericsnowcurrently at gmail.com  Mon Apr 20 01:03:32 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 20 Apr 2015 01:03:32 +0200
Subject: [Python-Dev] Should instances really be able to dictate the
 "existence" of special methods?
Message-ID:

_PyObject_LookupSpecial is used in place of obj.__getattribute__ for
looking up special methods. (As far as I recall it is not exposed in
the stdlib, e.g. inspect.getattr_special.) Correct me if I'm wrong
(please!), but there are two key reasons:

* access to special methods in spite of obj.__getattribute__
* speed

While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
call obj.__getattr__, it does resolve descriptors. This is important
particularly since special methods will nearly always be some kind of
descriptor. However, one consequence of this is that instances can
influence whether or not some capability, as relates to the special
method, is available. This is accomplished by the descriptor's __get__
raising AttributeError.

My question is: was this intentional? Considering the obscure bugs
that can result (e.g. where did the AttributeError come from?), it
seems more likely that it is an oversight of an obscure corner case.
If that is the case then it would be nice if we could fix
_PyObject_LookupSpecial to chain any AttributeError coming from
descr.__get__ into a RuntimeError. However, I doubt we could get away
with that at this point.

Also, while it may be appropriate in general to allow instances to
dictate the availability of attributes/methods (e.g. through
__getattribute__, __getattr__, or descriptors), I'm not convinced it
makes sense for special methods. We are already explicitly
disconnecting special methods from instances in
_PyObject_LookupSpecial (except in the case of descriptors...).

-eric

p.s. I also find it a bit strange that instances have any say at all
in which methods (i.e. behavior) are *available*. Certainly instances
influence behavior, but I always find their impact on method
availability to be surprising. Conceptually for me instances are all
about state and classes about behavior (driven by state). However, it
is very rare that I run into code that takes advantage of the
opportunity. :)

From guido at python.org  Mon Apr 20 02:20:59 2015
From: guido at python.org (Guido van Rossum)
Date: Sun, 19 Apr 2015 17:20:59 -0700
Subject: [Python-Dev] Should instances really be able to dictate the
 "existence" of special methods?
In-Reply-To:
References:
Message-ID:

(I suppose this new thread is a result of some research you did
regarding the thread complaining about callable()?)

On Sun, Apr 19, 2015 at 4:03 PM, Eric Snow wrote:

> _PyObject_LookupSpecial is used in place of obj.__getattribute__ for
> looking up special methods. (As far as I recall it is not exposed in
> the stdlib, e.g. inspect.getattr_special.) Correct me if I'm wrong
> (please!), but there are two key reasons:
>
> * access to special methods in spite of obj.__getattribute__
> * speed

Good question!
I don't have an easy pointer to the original discussion, but I do
recall that this was introduced in response to some issues with the
original behavior, which looked up dunder methods on the instance and
relied on the general mechanism for binding it to the instance. I
don't think the reason was to circumvent __getattribute__, but your
second bullet rings true: for every +, -, * etc. there would be a
(usually failing) lookup in the instance dict before searching the
class dict and then the base classes etc. There may also have been
some confusion where people would e.g. assign a function of two
arguments to x.__add__ and would be disappointed to find out it was
called with only one argument. I think there were some folks who
wanted to fix this by somehow "binding" such calls to the instance
(since there's no easy way otherwise to get the first argument) but I
thought the use case was sufficiently odd that it was better to avoid
it altogether.

In any case, it's not just an optimization -- it's an intentional
(though obscure) feature.

> While _PyObject_LookupSpecial does not do lookup on obj.__dict__ or
> call obj.__getattr__, it does resolve descriptors. This is important
> particularly since special methods will nearly always be some kind of
> descriptor. However, one consequence of this is that instances can
> influence whether or not some capability, as relates to the special
> method, is available. This is accomplished by the descriptor's
> __get__ raising AttributeError.

Well, it's not really the instance that raises AttributeError -- it's
the descriptor, which is a separate class (usually but not always a
builtin class, such as property or classmethod). And the descriptor is
"owned" by the class.

> My question is: was this intentional? Considering the obscure bugs
> that can result (e.g. where did the AttributeError come from?), it
> seems more likely that it is an oversight of an obscure corner case.

I'm not sure what you would do to avoid this. You can't very well
declare that a descriptor's __get__ method must not raise
AttributeError. It could be implemented in Python and it could just
hit a bug or something. But perhaps I'm misunderstanding the situation
you're describing?

> If that is the case then it would be nice if we could fix
> _PyObject_LookupSpecial to chain any AttributeError coming from
> descr.__get__ into a RuntimeError. However, I doubt we could get away
> with that at this point.

Yeah, I think that ship has sailed. It also seems to be hardly worth
trying to control "double fault" situations like this. (It's not
really a double fault, but it reeks like it.)

I wonder if maybe you're feeling inspired by PEP 479? But that's
really a much more special case, and I don't really want to start down
a whole cascade of trying to "fix" all cases where an AttributeError
could be raised due to a problem in the user's lookup code.

> Also, while it may be appropriate in general to allow instances to
> dictate the availability of attributes/methods (e.g. through
> __getattribute__, __getattr__, or descriptors), I'm not convinced it
> makes sense for special methods. We are already explicitly
> disconnecting special methods from instances in
> _PyObject_LookupSpecial (except in the case of descriptors...).

I'm still a little bit confused why you consider an error from the
descriptor as "dictated by the instance". I think what you're trying
to describe is that there is a method on the class but trying to bind
it to the instance fails. Well, all sorts of things may fail. (In
fact very few things cannot raise an exception in Python.)

> -eric
>
> p.s. I also find it a bit strange that instances have any say at all
> in which methods (i.e. behavior) are *available*. Certainly instances
> influence behavior, but I always find their impact on method
> availability to be surprising. Conceptually for me instances are all
> about state and classes about behavior (driven by state). However,
> it is very rare that I run into code that takes advantage of the
> opportunity. :)

If I understand what you're trying to say, what you're describing is
due to Python's unification of instance variables and methods into
attributes. It's pretty powerful that if x.foo(args) is a method call,
you can also write this as (x.foo)(args), and you can separate the
attribute access even further from the call and pass x.foo to some
other function that is eventually going to call it. Languages that
don't do this have to use a lambda at that point. Like every design
choice, each choice has its pluses and minuses, but this is how it's
done in Python, and it's not going to change.

-- 
--Guido van Rossum (python.org/~guido)

From ericsnowcurrently at gmail.com  Mon Apr 20 03:36:41 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 20 Apr 2015 03:36:41 +0200
Subject: [Python-Dev] Should instances really be able to dictate the
 "existence" of special methods?
In-Reply-To:
References:
Message-ID:

On Mon, Apr 20, 2015 at 2:20 AM, Guido van Rossum wrote:
> (I suppose this new thread is a result of some research you did
> regarding the thread complaining about callable()?)

Yep. :)

> Good question! I don't have an easy pointer to the original
> discussion, but I do recall that this was introduced in response to
> some issues with the original behavior, which looked up dunder
> methods on the instance and relied on the general mechanism for
> binding it to the instance. [...]
>
> In any case, it's not just an optimization -- it's an intentional
> (though obscure) feature.

Thanks for explaining.

> Well, it's not really the instance that raises AttributeError -- it's
> the descriptor, which is a separate class (usually but not always a
> builtin class, such as property or classmethod). And the descriptor
> is "owned" by the class.

Sure. That's what I meant. :) The instance can influence what the
descriptor returns.

> I'm not sure what you would do to avoid this. You can't very well
> declare that a descriptor's __get__ method must not raise
> AttributeError. It could be implemented in Python and it could just
> hit a bug or something. But perhaps I'm misunderstanding the
> situation you're describing?

Right. And such a bug will be misinterpreted and obscured and hard to
unravel. I ran into this a while back with pickle (which still does
lookup for special methods on the instance). Ultimately it's the same
old problem of not knowing how to interpret an exception that may have
bubbled up from some other layer.

Like I said, I don't think there's anything to be done about it either
way. I just got the feeling that in the case of special methods, the
descriptor part of lookup should not expect AttributeError to come out
of the getter. So I wanted to see if my intuition was correct even if
the point is essentially irrelevant. :) At this point, though, I think
my intuition wasn't quite right, though I still don't think a
descriptor's getter is the right place to raise AttributeError.

> Yeah, I think that ship has sailed. It also seems to be hardly worth
> trying to control "double fault" situations like this. (It's not
> really a double fault, but it reeks like it.)
>
> I wonder if maybe you're feeling inspired by PEP 479? But that's
> really a much more special case, and I don't really want to start
> down a whole cascade of trying to "fix" all cases where an
> AttributeError could be raised due to a problem in the user's lookup
> code.

Nah. It isn't about fixing all the cases nor directly related to PEP
479. Instead it is in response to one obscure corner case (the
behavior of callable).

> I'm still a little bit confused why you consider an error from the
> descriptor as "dictated by the instance".

Sorry, I should have been more clear. The descriptor part of attr
lookup involves a way for the instance to influence the outcome of the
lookup (at the discretion of the descriptor). I didn't mean to imply
that the instance has a direct role in special method lookup.
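
To make that influence concrete, here is a toy sketch of my own (not
the actual pickle case):

    class Gated:
        def __init__(self, sized):
            self.sized = sized

        @property
        def __len__(self):
            # The getter consults instance state and may raise
            # AttributeError, making the method look absent.
            if not self.sized:
                raise AttributeError('__len__')
            return lambda: 3

    print(hasattr(Gated(True), '__len__'))   # True
    print(hasattr(Gated(False), '__len__'))  # False -- the instance
                                             # effectively decides
    print(len(Gated(True)))                  # 3 -- the slot resolves
                                             # the descriptor, then
                                             # calls the result

And len(Gated(False)) lets the AttributeError escape, which is exactly
the "where did the AttributeError come from?" situation.
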
> I think what you're trying to describe is that there is a method on
> the class but trying to bind it to the instance fails. Well, all
> sorts of things may fail. (In fact very few things cannot raise an
> exception in Python.)
>
>> -eric
>>
>> p.s. I also find it a bit strange that instances have any say at all
>> in which methods (i.e. behavior) are *available*. Certainly
>> instances influence behavior, but I always find their impact on
>> method availability to be surprising. Conceptually for me instances
>> are all about state and classes about behavior (driven by state).
>> However, it is very rare that I run into code that takes advantage
>> of the opportunity. :)
>
> If I understand what you're trying to say, what you're describing is
> due to Python's unification of instance variables and methods into
> attributes. It's pretty powerful that if x.foo(args) is a method
> call, you can also write this as (x.foo)(args), and you can separate
> the attribute access even further from the call and pass x.foo to
> some other function that is eventually going to call it. Languages
> that don't do this have to use a lambda at that point. Like every
> design choice, each choice has its pluses and minuses, but this is
> how it's done in Python, and it's not going to change.

Again I wasn't very clear. Rather than the unification of attributes,
I'm referring to how descriptors can raise AttributeError in __get__
and it gets interpreted as "attribute missing" in
object.__getattribute__ (and similar lookup methods). However, just in
the context of descriptors it makes a little more sense, even if I
still consider it too clever. I think I was just focusing too much on
the influence of instances on descriptors. :)

-eric

From guido at python.org  Mon Apr 20 04:37:27 2015
From: guido at python.org (Guido van Rossum)
Date: Sun, 19 Apr 2015 19:37:27 -0700
Subject: [Python-Dev] Should instances really be able to dictate the
 "existence" of special methods?
In-Reply-To:
References:
Message-ID:

OK, so I think there isn't anything we can or should do here. Yes,
it's possible that type(x).__add__ succeeds but x.__add__ fails.
That's how you spell descriptor. :-) You could also use a random
number generator in __getattribute__...

-- 
--Guido van Rossum (python.org/~guido)

From ericsnowcurrently at gmail.com  Mon Apr 20 04:41:30 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 20 Apr 2015 04:41:30 +0200
Subject: [Python-Dev] Should instances really be able to dictate the
 "existence" of special methods?
In-Reply-To:
References:
Message-ID:

On Mon, Apr 20, 2015 at 4:37 AM, Guido van Rossum wrote:
> OK, so I think there isn't anything we can or should do here. Yes,
> it's possible that type(x).__add__ succeeds but x.__add__ fails.
> That's how you spell descriptor. :-) You could also use a random
> number generator in __getattribute__...

Cool. That's pretty much what I figured.

-eric

From larry at hastings.org  Mon Apr 20 10:16:00 2015
From: larry at hastings.org (Larry Hastings)
Date: Mon, 20 Apr 2015 01:16:00 -0700
Subject: [Python-Dev] [RELEASED] Python 3.5.0a4 is now available
Message-ID: <5534B5C0.2030800@hastings.org>

On behalf of the Python development community and the Python 3.5
release team, I'm thrilled to announce the availability of Python
3.5.0a4. Python 3.5.0a4 is the fourth alpha release of Python 3.5,
which will be the next major release of Python.

Python 3.5 is still under development, and is not yet complete. This
is a preview release, and its use is not recommended for production
settings.

The next release of Python 3.5 will be 3.5.0b1, the first beta
release. Python 3.5 will enter "feature freeze" at that point; no new
features will be added to 3.5 afterwards. Python 3.5.0b1 is scheduled
to be released May 22, 2015.

Three important notes for Windows users about Python 3.5.0a4:

* If you have previously installed Python 3.5.0a1, you may need to
  manually uninstall it before installing Python 3.5.0a4 (issue23612).
* If installing Python 3.5.0a4 as a non-privileged user, you may need
  to escalate to administrator privileges to install an update to your
  C runtime libraries.
* There is now a third type of Windows installer for Python 3.5. In
  addition to the conventional installer and the web-based installer,
  Python 3.5 now has an embeddable installer designed to be run as
  part of a larger application's installer for apps using or extending
  Python.

You can find Python 3.5.0a4 here:

    https://www.python.org/downloads/release/python-350a4/

Happy hacking,

//arry/

From barry at python.org  Mon Apr 20 15:54:04 2015
From: barry at python.org (Barry Warsaw)
Date: Mon, 20 Apr 2015 09:54:04 -0400
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <55336517.6090702@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
Message-ID: <20150420095404.29e76d21@anarchist.wooz.org>

On Apr 19, 2015, at 01:19 AM, Larry Hastings wrote:

>We should rename "types" to "accept". "accept" should take a set of
>types; these types specify the types of Python objects the Clinic
>parameter should accept. For the funny pseudo-types needed in some
>Clinic declarations ("buffer", "robuffer", and "rwbuffer"), Clinic
>provides empty class declarations so these behave like types too.

Having only followed the AC discussions tangentially, I have to say
that the above suggestion and the given examples make a lot more
intuitive sense to me.

I had the same initial thought as Glenn regarding type annotations.
It's fine that they're separate concepts, but it's also helpful that
Larry's suggestion above seems to align them better.

Cheers,
-Barry

From s.shanabrook at gmail.com  Mon Apr 20 15:52:39 2015
From: s.shanabrook at gmail.com (Saul Shanabrook)
Date: Mon, 20 Apr 2015 13:52:39 +0000
Subject: [Python-Dev] Starting CPython development w/ Docker
Message-ID:

I started trying some CPython development a week ago at PyCon and
first got testing working using Docker on my mac. This had the
advantage of not having to worry about installation and dependencies,
and also let me test on different Python versions easily.

If you are interested in trying it, I laid out all the steps here:

    http://www.saulshanabrook.com/cpython-dev-w-docker/

Saul Shanabrook

From brett at python.org  Mon Apr 20 16:52:54 2015
From: brett at python.org (Brett Cannon)
Date: Mon, 20 Apr 2015 14:52:54 +0000
Subject: [Python-Dev] Starting CPython development w/ Docker
In-Reply-To:
References:
Message-ID:

On Mon, Apr 20, 2015 at 10:44 AM Saul Shanabrook wrote:

> I started trying some CPython development a week ago at PyCon and
> first got testing working using Docker on my mac. This had the
> advantage of not having to worry about installation and dependencies,
> and also let me test on different Python versions easily.
>
> If you are interested in trying it, I laid out all the steps here:
> http://www.saulshanabrook.com/cpython-dev-w-docker/

Would you mind proposing this idea to core-mentorship at python.org
and seeing if anyone else would benefit? If others do try it out and
like it, then feel free to file an issue at bugs.python.org for
devinabox to add your script.

From christian at python.org  Mon Apr 20 17:15:38 2015
From: christian at python.org (Christian Heimes)
Date: Mon, 20 Apr 2015 17:15:38 +0200
Subject: [Python-Dev] Starting CPython development w/ Docker
In-Reply-To:
References:
Message-ID:

On 2015-04-20 15:52, Saul Shanabrook wrote:
> I started trying some CPython development a week ago at PyCon and
> first got testing working using Docker on my mac. This had the
> advantage of not having to worry about installation and dependencies,
> and also let me test on different Python versions easily.
>
> If you are interested in trying it, I laid out all the steps here:
> http://www.saulshanabrook.com/cpython-dev-w-docker/

Good work! May I suggest that you add ccache and the --config-cache
option to configure? Both speed up the build process a lot. For ccache
you have to install the package and put /usr/lib64/ccache in front of
PATH.

Christian

From s.shanabrook at gmail.com  Mon Apr 20 18:36:37 2015
From: s.shanabrook at gmail.com (Saul Shanabrook)
Date: Mon, 20 Apr 2015 16:36:37 +0000
Subject: [Python-Dev] Starting CPython development w/ Docker
In-Reply-To: <553525B7.3020703@willingconsulting.com>
References: <553525B7.3020703@willingconsulting.com>
Message-ID:

On Mon, Apr 20, 2015 at 12:13 PM Carol Willing
<willingc at willingconsulting.com> wrote:

> Saul, thanks for the steps, and I will be trying it out.

Let me know if you have any questions at all, I am happy to help, even
if they are just about getting Docker set up.

> Please do share your good work with core-mentorship :)

I posted to core-mentorship and am happy to help get this workflow
integrated anywhere, if it is useful.

From willingc at willingconsulting.com  Mon Apr 20 18:13:43 2015
From: willingc at willingconsulting.com (Carol Willing)
Date: Mon, 20 Apr 2015 09:13:43 -0700
Subject: [Python-Dev] Starting CPython development w/ Docker
In-Reply-To:
References:
Message-ID: <553525B7.3020703@willingconsulting.com>

On 4/20/15 7:52 AM, Brett Cannon wrote:
> On Mon, Apr 20, 2015 at 10:44 AM Saul Shanabrook wrote:
>> I started trying some CPython development a week ago at PyCon and
>> first got testing working using Docker on my mac. This had the
>> advantage of not having to worry about installation and
>> dependencies, and also let me test on different Python versions
>> easily.

Saul, thanks for the steps, and I will be trying it out.

>> If you are interested in trying it, I laid out all the steps here:
>> http://www.saulshanabrook.com/cpython-dev-w-docker/
>
> Would you mind proposing this idea to core-mentorship at python.org
> and seeing if anyone else would benefit? If others do try it out and
> like it, then feel free to file an issue at bugs.python.org for
> devinabox to add your script.

Please do share your good work with core-mentorship :)

Brett, Up-to-date docker development containers (devinabox in the
cloudy sky) would be helpful for new developers since there will be
less questioning and uncertainty if the dev environment is set up
correctly.

-- 
*Carol Willing*
Developer | Willing Consulting
https://willingconsulting.com

From hjwp2 at cantab.net  Mon Apr 20 20:30:39 2015
From: hjwp2 at cantab.net (Harry Percival)
Date: Mon, 20 Apr 2015 19:30:39 +0100
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
Message-ID:

Hi all,

tldr; type hints in python source are scary. Would reserving them for
stub files be better?

For people that don't know me (most of you, I think): I don't have
long experience of programming (perhaps 5 years, barring a bit of
messing about with BASIC in the 80s), I've never made any commits on
CPython, and so I don't speak from a great height of experience. I am
very much a "mediocre programmer" (TM). I wouldn't have felt it
appropriate to email python-dev about this, except that one of the
members encouraged me to do so -- perhaps they felt there hadn't been
enough "balance" in discussions to date. Also, I've had enough
positive reactions in person and on the twitters from people
well-known in the Python community to make me feel like maybe the
issue is at least worth discussing... And I do have some experience
with teaching Python to beginners, enough to worry about anything I
think might make their lives more difficult.

So, in outline, I think:

- type hints are ugly
- they make the language harder to understand
  - particularly for beginners
- the defense that "they're optional" doesn't really work, for several
  reasons
- but maybe there's a way to keep them and their benefits, without
  incurring the above costs

My first reaction to type hints was "yuck", and I'm sure I'm not the
only one to think that. viz (from some pycon slides):

    def zipmap(f: Callable[[int, int], int], xx: List[int],
               yy: List[int]) -> List[Tuple[int, int, int]]:

arg. and imagine it with default arguments.

Of course, part of this reaction is just a knee-jerk reaction to the
new and unfamiliar, and should be dismissed, entirely justifiably, as
mere irrationality. But I'm sure sensible people agree that they do
make our function definitions longer, more complex, and harder to
read.

No doubt this has occurred to everyone that's been working on them.
There is a cost. But the benefits make it worthwhile. I don't want to
spend too long debating the benefits -- Guido gave an outline of them
at Pycon, and I know y'all wouldn't be doing all this work for no
reason. All I will say is -- it sounds like the people that will
benefit are Google and other "Enterprise" users, IDE vendors, and the
people that will pay for it in sweat and confusion are beginners and
John Q. Mediocre Programmer.

But what I really want to dwell on are the costs. I've heard this
argument that, because the type hints are optional, they're not
something that beginners or the average user needs to worry about.
Beginners won't need to learn them, and they'll probably never see
them. And average users won't need them either, so they don't need to
worry about them either. So the costs are minimal! I should relax.

I'm not so sure. My worry is that once type hinting gets
standardised, then they will become a "best practice", and there's a
particular personality type out there that's going to start wanting to
add type hints to every function they write. Similarly to mindlessly
obeying PEP8 while ignoring its intentions,
hobgoblin-of-little-minds style, I think we're very likely to see type
hints appearing in a lot of python source, or a lot of pre-commit-hook
checkers. Pretty soon it will be hard to find any open source library
code that doesn't have type hints, or any project style guide that
doesn't require them.

It may not even be an irrational, cargo-cult thing -- they really may
be paying dividends. But it does mean we will all have to wade through
a bunch of type hints before we can understand any function. Not the
end of the world. The extra effort may even help us understand our
functions better. But it's just that little bit uglier, just that
little extra mental effort, just that little extra barrier that is
going to mean some people, at the margin, just give up on learning
programming, or switch to javascript, or whatever it'll be.

Now I'm aware that throwing out type hints altogether is unlikely to
be a popular proposal at this stage. So I'm casting around for an
alternative. And it seems to me that stub files might be the answer.

From what I understand, type hint files are files with the extension
.pyi that provide stub versions of the functions in the matching .py
files, but that contain the type hints, while the .py file doesn't.
In fact they're likely to be a very popular approach anyway, since
they allow type hints for codebases that need to work under both 2
and 3.

That sounds like the best of both worlds to me.

- .py files stay beautiful, concise, and easy to read.
- beginners don't have to worry about wading through type definitions
  when they find themselves browsing someone else's source
- type information is available to the linters and static file
  checkers, so we get all the benefits.

Sounds great right? Everybody will be happy! So let's nail it down!
If I was in charge, here's what I'd do:

* standardise the syntax for type hints in 3.5, as per PEP484
* but: recommend the use of stub files as the preferred place to store
  hints
* and: deprecate function annotations in the core language
* remove them from the core language altogether in 3.6

How about that?

HP

From barry at python.org  Mon Apr 20 20:41:06 2015
From: barry at python.org (Barry Warsaw)
Date: Mon, 20 Apr 2015 14:41:06 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID: <20150420144106.63513828@anarchist.wooz.org>

On Apr 20, 2015, at 07:30 PM, Harry Percival wrote:

>tldr; type hints in python source are scary. Would reserving them for
>stub files be better?

I think so. I think PEP 8 should require stub files for stdlib
modules and strongly encourage them for 3rd party code.

Cheers,
-Barry

From ckaynor at zindagigames.com  Mon Apr 20 21:03:08 2015
From: ckaynor at zindagigames.com (Chris Kaynor)
Date: Mon, 20 Apr 2015 12:03:08 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID:

On Mon, Apr 20, 2015 at 11:30 AM, Harry Percival wrote:
> My first reaction to type hints was "yuck", and I'm sure I'm not the
> only one to think that. viz (from some pycon slides):
>
>     def zipmap(f: Callable[[int, int], int], xx: List[int],
>                yy: List[int]) -> List[Tuple[int, int, int]]:

My opinion of type hints is: they are great inline, when they are
simple. Cases like this are fine (I know it's not a real-world case,
but there are plenty of similarly simple functions):

    def add(a: int, b: int) -> int:

When you get more complicated examples, especially ones that involve
complex types (list, tuple, callable, etc.), it may make sense to
predefine a type (I believe the typehints PEP supports some form of
typedef - I think just a simple assignment) and use that instead of
the full type name. This will generally make the function definition
MUCH easier to read.
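
For instance, Harry's zipmap example could be written along these
lines (my sketch; the alias names are invented, but PEP 484 does allow
creating an alias with a plain assignment):

    from typing import Callable, List, Tuple

    IntOp = Callable[[int, int], int]   # a simple assignment is the alias
    Triples = List[Tuple[int, int, int]]

    def zipmap(f: IntOp, xx: List[int], yy: List[int]) -> Triples:
        ...
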
> - beginners don't have to worry about wading through type definitions when > they find themselves browsing someone else's source > - type information is available to the linters and static file checkers, so > we get all the benefits. > > Sounds great right? Everybody will be happy! So let's nail it down! If I > was in charge, here's what I'd do: > > * standardise the syntax for type hints in 3.5, as per PEP484 > * but: recommend the use of stub files as the preferred place to store hints > * and: deprecate function annotations in the core language > * remove them from the core language altogether in 3.6 The main drawback of using a stub file is that it is much more likely to end up out-of-date, and thus useless at best, than if the types are defined inline. The same issue applies to various alternative proposals such as having the typehints in decorators, but to a lesser degree. The same issue also applies to general comments as well. I would say there are a few major reasons to use stub files: 1) You cannot put the annotations in the source, probably either because the module is a C library, or you must support versions which do not support annotations. 2) You must use annotations for other purposes. 3) The annotations are particularly complicated, and you cannot provide nice names for the various types (likely, because the types are just very difficult to name, such as in the zipmap example, though that one might be easier in a more specific context). Outside of those specific reasons, I would vastly prefer any annotations to be included in the source, where they are much more likely to be updated, rather than in a separate file, where they will likely be forgotten. From tymoteusz.jankowski at gmail.com Mon Apr 20 21:03:49 2015 From: tymoteusz.jankowski at gmail.com (Tymoteusz Jankowski) Date: Mon, 20 Apr 2015 19:03:49 +0000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150420144106.63513828@anarchist.wooz.org> References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: burn the witch.. More seriously.. +1 to Harry voice. Adding type hints to function code is so ugly that that i'm breaking silence and i'm expressing it here before you, so: It's ugly Perhaps this question was asked a million times, but why not docstrings, which seems to be more elegant and natural place for that? I will not use it. Damn, if i had reputation this could be a threat. ;) On Mon, Apr 20, 2015 at 8:43 PM Barry Warsaw wrote: > On Apr 20, 2015, at 07:30 PM, Harry Percival wrote: > > >tldr; type hints in python source are scary. Would reserving them for stub > >files be better? > > I think so. I think PEP 8 should require stub files for stdlib modules and > strongly encourage them for 3rd party code. > > Cheers, > -Barry > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/tymoteusz.jankowski%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... 
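(For readers who have never seen one, a hypothetical sketch of the .pyi split Harry proposes and Chris weighs above -- file names and function invented for illustration:)

    # vector_ops.py -- the implementation stays free of annotations
    def zipmap(f, xx, yy):
        return [(x, y, f(x, y)) for x, y in zip(xx, yy)]

    # vector_ops.pyi -- the stub file a type checker would read instead
    from typing import Callable, List, Tuple

    def zipmap(f: Callable[[int, int], int],
               xx: List[int],
               yy: List[int]) -> List[Tuple[int, int, int]]: ...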
From p.f.moore at gmail.com Mon Apr 20 21:09:00 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 20 Apr 2015 20:09:00 +0100
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150420144106.63513828@anarchist.wooz.org>
References: <20150420144106.63513828@anarchist.wooz.org>
Message-ID:

On 20 April 2015 at 19:41, Barry Warsaw wrote:
>>tldr; type hints in python source are scary. Would reserving them for stub
>>files be better?
>
> I think so. I think PEP 8 should require stub files for stdlib modules and
> strongly encourage them for 3rd party code.

Agreed. I have many of the same concerns as Harry, but I wouldn't have expressed them quite as well. I'm not too worried about actually removing annotations from the core language, but I agree that we should create a strong culture of "type hints go in stub files" to keep source files readable and clean.

On that note, I'm not sure "stub" files is a particularly good name. Maybe "type files" would be better? Something that emphasises that they are the correct place to put type hints, not a workaround.

Paul

From ethan at stoneleaf.us Mon Apr 20 21:26:25 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Mon, 20 Apr 2015 12:26:25 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150420144106.63513828@anarchist.wooz.org>
References: <20150420144106.63513828@anarchist.wooz.org>
Message-ID: <20150420192625.GA21797@stoneleaf.us>

On 04/20, Barry Warsaw wrote:
> On Apr 20, 2015, at 07:30 PM, Harry Percival wrote:
>
>>tldr; type hints in python source are scary. Would reserving them for stub
>>files be better?
>
> I think so. I think PEP 8 should require stub files for stdlib modules and
> strongly encourage them for 3rd party code.

+1

--
~Ethan~

From lukasz at langa.pl Mon Apr 20 21:35:33 2015
From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=)
Date: Mon, 20 Apr 2015 12:35:33 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID: <714505F2-5759-4F37-A99D-627699920972@langa.pl>

On Apr 20, 2015, at 11:30 AM, Harry Percival wrote:

> I think:
> - type hints are ugly

Making them work with the current Python syntax was a challenge. Granted, the end result is not perfect. It can be improved *if* type hints prove to be generally useful and popular. This might not happen.

> - they make the language harder to understand
> - particularly for beginners

A counter-point to that would be that good APIs already expose this information, but using informal language in docstrings, pure comments or straight API documentation. This only standardizes the notation in a way that:
- lets those annotations be accessed at runtime
- lets the entire Python ecosystem learn to read them statically and use them for REPL, IDE completion and type checking
- moves them to a place where it's less likely that they become outdated

>     def zipmap(f: Callable[[int, int], int], xx: List[int],
>                yy: List[int]) -> List[Tuple[int, int, int]]:

Yeah, so agreed, this is pretty busy. For such cases, reformatting makes it less confusing (see: Screenshot 1). As already stated, you can also name the types, in which case they're less busy. That's a compromise, too, since they require indirection when reading.

> No doubt this has occurred to everyone that's been working on them. There is a cost. But the benefits make it worthwhile.

Yes, they start to read more transparently after a while.

> It sounds like the people that will benefit are Google and other "Enterprise" users, IDE vendors, and the people that will pay for it in sweat and confusion are beginners and John Q. Mediocre Programmer.

Any startup that at the beginning bangs out code as fast as possible - and eventually *becomes successful* - will be happy to be able to introduce gradual typing as a measure of improving quality.

> My worry is that once type hinting gets standardised, then they will become a "best practice"

For library authors and APIs, they might. There's nothing alarming about that; as I said, for those use cases you already expect "type hints" in the form of docstrings, comments, API docs, tests, etc.

> there's a particular personality type out there that's going to start wanting to add type hints to every function they write. Similarly to mindlessly obeying PEP8 while ignoring its intentions, hobgoblin-of-little-minds style, I think we're very likely to see type hints appearing in a lot of python source, or a lot of pre-commit-hook checkers.

Yes, this is going to happen sometimes. If you can't fight a hobgoblin in your community/organization, you've got bigger problems than ugly hints. Ultimately, we can't optimize for the lowest common denominator.

> Pretty soon it will be hard to find any open source library code that doesn't have type hints, or any project style guide that doesn't require them.

Harry, you must be the biggest Python 3 supporter ever <3 More seriously, no, this is not going to happen fast.

> Sounds great right? Everybody will be happy! So let's nail it down!

Stub files have many downsides, too, unfortunately:
- we don't *want* to have them, but we *need* to have them (C extensions, third-party modules, Python 2, ...)
- they bring cognitive overhead of having to switch between two files
- they require the author to repeat himself quite a lot
- they might go out of date much easier than annotations in the function signature
- they can't help with local variable inference

Since it was mentioned in a different e-mail in this thread: yes, the standard library is not getting any type annotations. When we decide to ship type hints with Python 3.6, they will be added as stubs.

As for the actual proposal:

> * standardise the syntax for type hints in 3.5, as per PEP484
> * but: recommend the use of stub files as the preferred place to store hints
> * and: deprecate function annotations in the core language
> * remove them from the core language altogether in 3.6

That is still pretty radical. I'll leave the final words to the other authors of PEP 484 but from my perspective it's not type annotations that are ugly, it's the stubs. The annotation syntax might be improved for future releases. Stubs - by design - will always be inferior because of the reasons stated above.

--
Best regards,
Łukasz Langa

WWW: http://lukasz.langa.pl/
Twitter: @llanga
IRC: ambv on #python-dev
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screenshot 1.png
Type: image/png
Size: 2024 bytes
Desc: not available
URL:
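(Screenshot 1 did not survive the archive; presumably it showed something like the following two ways of taming the signature -- a hedged reconstruction, not the actual image:)

    from typing import Callable, List, Tuple

    # Reformatted, one parameter per line:
    def zipmap(
        f: Callable[[int, int], int],
        xx: List[int],
        yy: List[int],
    ) -> List[Tuple[int, int, int]]: ...

    # Or with named types, via PEP 484's alias-by-assignment:
    IntOp = Callable[[int, int], int]
    Rows = List[Tuple[int, int, int]]

    def zipmap(f: IntOp, xx: List[int], yy: List[int]) -> Rows: ...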
From p.f.moore at gmail.com Mon Apr 20 21:49:14 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 20 Apr 2015 20:49:14 +0100
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On 20 April 2015 at 20:35, Łukasz Langa wrote:
> Since it was mentioned in a different e-mail in this thread: yes, the
> standard library is not getting any type annotations. When we decide to
> ship type hints with Python 3.6, they will be added as stubs.

Why is this? Surely this is something where the stdlib should "eat its own dogfood" and include inline hints rather than stub files?

Paul
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From marky1991 at gmail.com Mon Apr 20 21:55:34 2015
From: marky1991 at gmail.com (Mark Young)
Date: Mon, 20 Apr 2015 15:55:34 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

Just another peanut from the gallery:

I pretty much agree with everything that Harry said. My current response to type annotations is "Yuck, that kills readability. I hope no code I ever have to read uses this."
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org Mon Apr 20 22:07:13 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 20 Apr 2015 13:07:13 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On Mon, Apr 20, 2015 at 12:49 PM, Paul Moore wrote:
>
> On 20 April 2015 at 20:35, Łukasz Langa wrote:
>
>> Since it was mentioned in a different e-mail in this thread: yes, the
>> standard library is not getting any type annotations. When we decide to
>> ship type hints with Python 3.6, they will be added as stubs.
>
> Why is this? Surely this is something where the stdlib should "eat its own
> dogfood" and include inline hints rather than stub files?

Actually, "eat your own dogfood" is not one of the goals of the stdlib -- nor is it supposed to be an example of how to code. This is often misunderstood. The stdlib contains a lot of Python code, and you can learn a lot from it, but good coding habits aren't generally something you learn there -- the code is crusty (some of the oldest Python code in existence lives in the stdlib!), often has to bend over backwards to support backward compatibility, and is riddled with performance hacks.

Based on some events in the distant past, there's actually an active ban against sweeping changes to the stdlib that attempt to "modernize" it or use new features -- because there is so much code in the stdlib, review of such sweeping (often near-mechanical) changes is inevitably less thorough than when a new feature is implemented, and even the best tests don't catch everything, so regressions in dark corners are often the result.

The only place in the stdlib where I expect inline type hints to be used is in brand new modules introduced in 3.6 or later, and then only when the author believes inline type hints to be clearer than wordy docstrings.

The situation is possibly even bleaker (or happier, depending on your position :-) for inline type hints in 3rd party packages -- few package authors will be satisfied with supporting only Python 3.5 and later. True, you can support Python 3.2 and up by declaring the 3rd party typing package as a dependency (unless Python 3.5+ is detected), but I don't expect this to become a popular approach overnight.

So I think the rumors of Python's death are greatly exaggerated.

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ericsnowcurrently at gmail.com Mon Apr 20 22:10:34 2015
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Mon, 20 Apr 2015 14:10:34 -0600
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On Mon, Apr 20, 2015 at 1:35 PM, Łukasz Langa wrote:
> Yeah, so agreed, this is pretty busy. For such cases, reformatting makes
> it less confusing (see: Screenshot 1).

While it helps, this sort of best-practice is still unsettled (and apparently not obvious). In the short term it would make more sense to recommend using stub files for all the reasons Harry enumerated. Once the best practices are nailed down through experience with stub files, then we can make recommendations regarding inline type hints.

-eric
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screenshot 1.png
Type: image/png
Size: 2024 bytes
Desc: not available
URL:

From robertc at robertcollins.net Mon Apr 20 22:15:13 2015
From: robertc at robertcollins.net (Robert Collins)
Date: Tue, 21 Apr 2015 08:15:13 +1200
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On 21 April 2015 at 08:07, Guido van Rossum wrote:

> The situation is possibly even bleaker (or happier, depending on your
> position :-) for inline type hints in 3rd party packages -- few package
> authors will be satisfied with supporting only Python 3.5 and later. True,
> you can support Python 3.2 and up by declaring the 3rd party typing package
> as a dependency (unless Python 3.5+ is detected), but I don't expect this to
> become a popular approach overnight.

mypy has a codec for 2.x which strips type annotations - https://github.com/JukkaL/mypy/tree/master/mypy/codec - while you can't run mypy under 2.x, you can run it under 3.x to perform the analysis, and one's code still runs under 2.x.

Another route - the one I've been experimenting with as I get familiar with mypy - is to just use type comments exclusively. Function type comments currently break, but that seems like a fairly shallow bug to me, rather than something that shouldn't work. The advantage of that route is that editors which render comments in subtle colours make the type hints unobtrusive, without needing specific syntax-colouring support.

-Rob

--
Robert Collins
Distinguished Technologist
HP Converged Cloud
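(A hedged sketch of the comment-only route Robert describes, using the type-comment notation PEP 484 documents for Python 2 compatibility -- modulo the function-comment bug he mentions; the function is invented for illustration:)

    from typing import Callable, List, Tuple

    def zipmap(f, xx, yy):
        # type: (Callable[[int, int], int], List[int], List[int]) -> List[Tuple[int, int, int]]
        rows = []  # type: List[Tuple[int, int, int]]
        for x, y in zip(xx, yy):
            rows.append((x, y, f(x, y)))
        return rows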
From robertc at robertcollins.net Mon Apr 20 22:17:07 2015
From: robertc at robertcollins.net (Robert Collins)
Date: Tue, 21 Apr 2015 08:17:07 +1200
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On 21 April 2015 at 08:10, Eric Snow wrote:
>
> While it helps, this sort of best-practice is still unsettled (and apparently not obvious). In the short term it would make more sense to recommend using stub files for all the reasons Harry enumerated. Once the best practices are nailed down through experience with stub files, then we can make recommendations regarding inline type hints.
>
> -eric

Forgive my ignorance, but is it true that stub files can't annotate variables within functions? E.g. AIUI if there is a stub file, it is used in the static analysis instead of the actual source. Likely I've got it modelled wrong in my head :)

-Rob

--
Robert Collins
Distinguished Technologist
HP Converged Cloud

From Nikolaus at rath.org Mon Apr 20 21:51:15 2015
From: Nikolaus at rath.org (Nikolaus Rath)
Date: Mon, 20 Apr 2015 12:51:15 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: (Harry Percival's message of "Mon, 20 Apr 2015 19:30:39 +0100")
References:
Message-ID: <87sibuy3ws.fsf@thinkpad.rath.org>

On Apr 20 2015, Harry Percival wrote:
> My first reaction to type hints was "yuck", and I'm sure I'm not the only
> one to think that. viz (from some pycon slides):
>
>     def zipmap(f: Callable[[int, int], int], xx: List[int],
>                yy: List[int]) -> List[Tuple[int, int, int]]:
>
> arg. and imagine it with default arguments.

This is indeed ugly as hell and I certainly would not want to see this in any Python file I'm working with.

However, I always assumed that whatever tools consume these annotations are expected to be good enough at automatic type inference that something like this would never be necessary?

In practice, I would hope that the above simplifies to

    def zipmap(f, xx: List[int], yy: List[int]):

because (just picking a probably buggy random implementation)

    zz = []                   # --> zz must be List[A]
    for i in range(min(len(xx), len(yy))):
        x = xx[i]             # --> x must be int
        y = yy[i]             # --> y must be int
        z = f(x, y)           # --> f must be Callable[[int, int], B]
        zz.append((x, y, z))  # --> A must be Tuple[int, int, B]

    return zz                 # --> return value must be List[Tuple[int, int, B]]

it doesn't catch that B = int, but I think that's acceptable.

Is this not the case? Are we really expecting people to write stuff like the above?

Best,
-Nikolaus

--
GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F
Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F

"Time flies like an arrow, fruit flies like a Banana."

From guido at python.org Mon Apr 20 22:29:11 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 20 Apr 2015 13:29:11 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On Mon, Apr 20, 2015 at 1:15 PM, Robert Collins wrote:
> On 21 April 2015 at 08:07, Guido van Rossum wrote:
>
> > The situation is possibly even bleaker (or happier, depending on your
> > position :-) for inline type hints in 3rd party packages -- few package
> > authors will be satisfied with supporting only Python 3.5 and later. True,
> > you can support Python 3.2 and up by declaring the 3rd party typing package
> > as a dependency (unless Python 3.5+ is detected), but I don't expect this to
> > become a popular approach overnight.
>
> mypy has a codec for 2.x which strips type annotations -
> https://github.com/JukkaL/mypy/tree/master/mypy/codec - while you
> can't run mypy under 2.x, you can run it under 3.x to perform the
> analysis, and one's code still runs under 2.x.

I know, it was my idea. :-) But I think stubs are usually better if you want to support PY2. The experience when the codec is not installed is really poor.

> Another route - the one I've been experimenting with as I get familiar
> with mypy - is to just use type comments exclusively. Function type
> comments currently break, but that seems like a fairly shallow bug to
> me, rather than something that shouldn't work. The advantage of that
> route is that editors which render comments in subtle colours make the
> type hints unobtrusive, without needing specific syntax-colouring
> support.

There are definitely some tooling issues around annotations (e.g. I've had to file at least one bug with "pep8"), but in cases like this the language must lead, and tooling will follow. The idea of using comments exclusively is interesting, although I think there are also some real downsides. (As an example: Jython is thinking of making some use of type hints for code generation, and they strongly prefer to have the hints in the AST, whereas comments don't show up in the AST at all.)

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From me+python at ixokai.io Mon Apr 20 22:29:51 2015
From: me+python at ixokai.io (Stephen Hansen)
Date: Mon, 20 Apr 2015 13:29:51 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID:

> Sounds great right? Everybody will be happy! So let's nail it down! If I
> was in charge, here's what I'd do:
>
> * standardise the syntax for type hints in 3.5, as per PEP484
> * but: recommend the use of stub files as the preferred place to store
> hints
> * and: deprecate function annotations in the core language
> * remove them from the core language altogether in 3.6

Personally, I'm not all that keen on the verbosity of the syntax; I'm sad that List[int] has to be how you spell things instead of just [int], but I'm sure there's reasons that can't work.

That said, I hate stub files. I understand why they may be needed sometimes, so I accept them as a necessary evil, but that doesn't make them a goal to me. Stub files become "optional headers" which I'll have to keep in sync with the actual code -- which in my opinion, just about guarantees a maintenance burden that will fall by the side of the road. If I have to look at another file to know or change the function arguments in the code I'm working on, that hurts readability, too.

Will it take some getting used to, this syntax? Yes. At one point I thought comprehensions and ternary expressions were unreadable. I use them all the time now and find them very elegant. I'm doubtful I'll ever find type hints /elegant/, but I'm pretty sure they won't be "ugly" forever. Ugly has a lot to do with familiarity. It's also deeply subjective.

But it's an objective reality, imho, that having to maintain and sync up function definitions in *two different files* is a burden. And that is a burden I really don't want to deal with.
--Stephen
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org Mon Apr 20 22:31:24 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 20 Apr 2015 13:31:24 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID:

On Mon, Apr 20, 2015 at 1:17 PM, Robert Collins wrote:
> On 21 April 2015 at 08:10, Eric Snow wrote:
> >
> > While it helps, this sort of best-practice is still unsettled (and
> apparently not obvious). In the short term it would make more sense to
> recommend using stub files for all the reasons Harry enumerated. Once the
> best practices are nailed down through experience with stub files, then we
> can make recommendations regarding inline type hints.
> >
> > -eric
>
> Forgive my ignorance, but is it true that stub files can't annotate
> variables within functions? E.g. AIUI if there is a stub file, it is used
> in the static analysis instead of the actual source. Likely I've got it
> modelled wrong in my head :)

Correct, stub files are only used to type-check *users* of a module. If you want a module itself to be type-checked you have to use inline type hints. (Though it has been suggested to combine the hints from the stub with the implementation and use this to type-check the implementation, and some tool chains may actually implement this.)

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From rdmurray at bitdance.com Mon Apr 20 22:37:04 2015
From: rdmurray at bitdance.com (R. David Murray)
Date: Mon, 20 Apr 2015 16:37:04 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
References: <714505F2-5759-4F37-A99D-627699920972@langa.pl>
Message-ID: <20150420203704.632B9250E59@webabinitio.net>

I wrote a longer response and then realized it didn't really add much to the discussion. So let me be short: type annotations do *not* appeal to me, and I am not looking forward to the cognitive overhead of dealing with them. Perhaps I will eventually grow to like them if the tools that use them really add value. You'll have to sell me on it, though.

On Mon, 20 Apr 2015 12:35:33 -0700, wrote:
> Stub files have many downsides, too, unfortunately:
> - we don't *want* to have them, but we *need* to have them (C extensions, third-party modules, Python 2, ...)
> - they bring cognitive overhead of having to switch between two files
> - they require the author to repeat himself quite a lot
> - they might go out of date much easier than annotations in the function signature

The whole point of type hints is for the linters/IDEs, so IMO it is perfectly reasonable to put the burden of making them useful onto the linters/IDEs. The UI for it can unify the two files into a single view...I know because way back in the dark ages I wrote a small editor-based IDE that did something very analogous on an IBM Mainframe, and it worked really well as a development environment.

--David

From guido at python.org Mon Apr 20 22:41:53 2015
From: guido at python.org (Guido van Rossum)
Date: Mon, 20 Apr 2015 13:41:53 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
Message-ID:

On Mon, Apr 20, 2015 at 12:51 PM, Nikolaus Rath wrote:
> On Apr 20 2015, Harry Percival wrote:
> > My first reaction to type hints was "yuck", and I'm sure I'm not the only
> > one to think that. viz (from some pycon slides):
> >
> >     def zipmap(f: Callable[[int, int], int], xx: List[int],
> >                yy: List[int]) -> List[Tuple[int, int, int]]:
> >
> > arg. and imagine it with default arguments.
>
> This is indeed ugly as hell and I certainly would not want to see this
> in any Python file I'm working with.
>
> However, I always assumed that whatever tools consume these annotations
> are expected to be good enough at automatic type inference that
> something like this would never be necessary?
>
> In practice, I would hope that the above simplifies to
>
>     def zipmap(f, xx: List[int], yy: List[int]):
>
> because (just picking a probably buggy random implementation)
>
>     zz = []                   # --> zz must be List[A]
>     for i in range(min(len(xx), len(yy))):
>         x = xx[i]             # --> x must be int
>         y = yy[i]             # --> y must be int
>         z = f(x, y)           # --> f must be Callable[[int, int], B]
>         zz.append((x, y, z))  # --> A must be Tuple[int, int, B]
>
>     return zz                 # --> return value must be List[Tuple[int, int, B]]
>
> it doesn't catch that B = int, but I think that's acceptable.
>
> Is this not the case? Are we really expecting people to write stuff like
> the above?

Jukka (mypy's author) believes that specifying full annotations will catch more bugs, and makes the error messages easier to understand -- in practice when you let the type inferencer run free it will often infer bizarre union types or other complex forms, and the errors may well appear in the wrong spot. This is a common issue with type inferencing, and anyone who has tried to learn Haskell (or C++ :-) has plenty of experience with such errors.

The good news is that I don't actually expect people to have to write or even read stuff like the above; the example was quoted out of context. In code that the typical (mediocre :-) programmer writes, argument types won't be more complex than some built-in primitive type (e.g. int or str) or a List of primitive types, or perhaps a user-defined class or List of such. And you can always leave the annotation out if you find yourself fighting the type checker -- it will default to Any and the type checker will just shut up. That's the beauty of gradual typing (and it differs greatly from strict typing as seen in Haskell or C++).

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ijmorlan at uwaterloo.ca Mon Apr 20 21:37:57 2015
From: ijmorlan at uwaterloo.ca (Isaac Morland)
Date: Mon, 20 Apr 2015 15:37:57 -0400 (EDT)
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <20150420144106.63513828@anarchist.wooz.org>
Message-ID:

On Mon, 20 Apr 2015, Paul Moore wrote:
> On 20 April 2015 at 19:41, Barry Warsaw wrote:
>>> tldr; type hints in python source are scary. Would reserving them for stub
>>> files be better?
>>
>> I think so. I think PEP 8 should require stub files for stdlib modules and
>> strongly encourage them for 3rd party code.
>
> Agreed. I have many of the same concerns as Harry, but I wouldn't have
> expressed them quite as well. I'm not too worried about actually
> removing annotations from the core language, but I agree that we
> should create a strong culture of "type hints go in stub files" to
> keep source files readable and clean.
>
> On that note, I'm not sure "stub" files is a particularly good name.
> Maybe "type files" would be better? Something that emphasises that
> they are the correct place to put type hints, not a workaround.

How about "header" files?

(ducks...)
Isaac Morland CSCF Web Guru DC 2619, x36650 WWW Software Specialist From harry.percival at gmail.com Mon Apr 20 22:48:55 2015 From: harry.percival at gmail.com (Harry Percival) Date: Mon, 20 Apr 2015 21:48:55 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: > "I hate stub files. [...] in my opinion, [it] just about guarantees a maintenance burden that will fall by the side of the road. I'm not so pessimistic. It's not like documentation or docstrings or comments -- the whole point is that it should be very easy to have an automated check for whether your stubs are in sync with your source, because both are in code. Unlike docs or comments which can easily become out of date, because there's no automated process to tell you they need updating... I'm thinking of it as a thing your editor will warn you of. Like pyflakes warnings about unused variables & co, I'm never happy until I've silenced them all in a file, and similarly, your editor will keep bugging you until you've got your stubs inline with your code... On 20 April 2015 at 20:37, Isaac Morland wrote: > On Mon, 20 Apr 2015, Paul Moore wrote: > > On 20 April 2015 at 19:41, Barry Warsaw wrote: >> >>> tldr; type hints in python source are scary. Would reserving them for >>>> stub >>>> files be better? >>>> >>> >>> I think so. I think PEP 8 should require stub files for stdlib modules >>> and >>> strongly encourage them for 3rd party code. >>> >> >> Agreed. I have many of the same concerns as Harry, but I wouldn't have >> expressed them quite as well. I'm not too worried about actually >> removing annotations from the core language, but I agree that we >> should create a strong culture of "type hints go in stub files" to >> keep source files readable and clean. >> >> On that note, I'm not sure "stub" files is a particularly good name. >> Maybe "type files" would be better? Something that emphasises that >> they are the correct place to put type hints, not a workaround. >> > > How about "header" files? > > (ducks...) > > Isaac Morland CSCF Web Guru > DC 2619, x36650 WWW Software Specialist > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net > -- ------------------------------ Harry J.W. Percival ------------------------------ Twitter: @hjwp Mobile: +44 (0) 78877 02511 Skype: harry.percival -------------- next part -------------- An HTML attachment was scrubbed... URL: From harry.percival at gmail.com Mon Apr 20 22:50:05 2015 From: harry.percival at gmail.com (Harry Percival) Date: Mon, 20 Apr 2015 21:50:05 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: > stub files are only used to type-check *users* of a module. If you want a module itself to be type-checked you have to use inline type hints is this a fundamental limitation, or just the current state of tooling? On 20 April 2015 at 21:48, Harry Percival wrote: > > "I hate stub files. [...] in my opinion, [it] just about guarantees a > maintenance burden that will fall by the side of the road. > > I'm not so pessimistic. 
It's not like documentation or docstrings or > comments -- the whole point is that it should be very easy to have an > automated check for whether your stubs are in sync with your source, > because both are in code. Unlike docs or comments which can easily become > out of date, because there's no automated process to tell you they need > updating... I'm thinking of it as a thing your editor will warn you of. > Like pyflakes warnings about unused variables & co, I'm never happy until > I've silenced them all in a file, and similarly, your editor will keep > bugging you until you've got your stubs inline with your code... > > > On 20 April 2015 at 20:37, Isaac Morland wrote: > >> On Mon, 20 Apr 2015, Paul Moore wrote: >> >> On 20 April 2015 at 19:41, Barry Warsaw wrote: >>> >>>> tldr; type hints in python source are scary. Would reserving them for >>>>> stub >>>>> files be better? >>>>> >>>> >>>> I think so. I think PEP 8 should require stub files for stdlib modules >>>> and >>>> strongly encourage them for 3rd party code. >>>> >>> >>> Agreed. I have many of the same concerns as Harry, but I wouldn't have >>> expressed them quite as well. I'm not too worried about actually >>> removing annotations from the core language, but I agree that we >>> should create a strong culture of "type hints go in stub files" to >>> keep source files readable and clean. >>> >>> On that note, I'm not sure "stub" files is a particularly good name. >>> Maybe "type files" would be better? Something that emphasises that >>> they are the correct place to put type hints, not a workaround. >>> >> >> How about "header" files? >> >> (ducks...) >> >> Isaac Morland CSCF Web Guru >> DC 2619, x36650 WWW Software Specialist >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net >> > > > > -- > ------------------------------ > Harry J.W. Percival > ------------------------------ > Twitter: @hjwp > Mobile: +44 (0) 78877 02511 > Skype: harry.percival > -- ------------------------------ Harry J.W. Percival ------------------------------ Twitter: @hjwp Mobile: +44 (0) 78877 02511 Skype: harry.percival -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Mon Apr 20 23:01:20 2015 From: robertc at robertcollins.net (Robert Collins) Date: Tue, 21 Apr 2015 09:01:20 +1200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 21 April 2015 at 08:50, Harry Percival wrote: >> stub files are only used to type-check *users* of a module. If you want a >> module itself to be type-checked you have to use inline type hints > > is this a fundamental limitation, or just the current state of tooling? AIUI its the fundamental design. Stubs don't annotate python code, they *are* annotated code themselves. They aren't merged with the observed code at all. Could they be? Possibly. I don't know how much work that would be. 
-Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From guido at python.org Mon Apr 20 23:02:22 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 20 Apr 2015 14:02:22 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Mon, Apr 20, 2015 at 1:50 PM, Harry Percival wrote: > > stub files are only used to type-check *users* of a module. If you want > a module itself to be type-checked you have to use inline type hints > > is this a fundamental limitation, or just the current state of tooling? > It's not fundamental, it's just more in line with the original purpose of stubs (to describe C extensions). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon Apr 20 22:59:26 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 20 Apr 2015 15:59:26 -0500 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: Only if you want Java users burning all written copies of the PEP... On Mon, Apr 20, 2015 at 2:37 PM, Isaac Morland wrote: > On Mon, 20 Apr 2015, Paul Moore wrote: > > On 20 April 2015 at 19:41, Barry Warsaw wrote: >> >>> tldr; type hints in python source are scary. Would reserving them for >>>> stub >>>> files be better? >>>> >>> >>> I think so. I think PEP 8 should require stub files for stdlib modules >>> and >>> strongly encourage them for 3rd party code. >>> >> >> Agreed. I have many of the same concerns as Harry, but I wouldn't have >> expressed them quite as well. I'm not too worried about actually >> removing annotations from the core language, but I agree that we >> should create a strong culture of "type hints go in stub files" to >> keep source files readable and clean. >> >> On that note, I'm not sure "stub" files is a particularly good name. >> Maybe "type files" would be better? Something that emphasises that >> they are the correct place to put type hints, not a workaround. >> > > How about "header" files? > > (ducks...) > > Isaac Morland CSCF Web Guru > DC 2619, x36650 WWW Software Specialist > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > -- Ryan [ERROR]: Your autotools build scripts are 200 lines longer than your program. Something?s wrong. http://kirbyfan64.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From harry.percival at gmail.com Mon Apr 20 23:00:57 2015 From: harry.percival at gmail.com (Harry Percival) Date: Mon, 20 Apr 2015 22:00:57 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: @Lukasz: Of course you're right, ugly is a matter of perspective, and I'm sure I could grow to love them, and they might evolve into a more polished direction > "they start to read more transparently after a while." But I'm still worried about beginners, and I'm even worried about me. I like to be able to scan through some code and see the essence of it. I learned Java at school, and I got it figured out, but i'm glad I left it behind. 
Every so often I read a TDD book and the examples are all in java and it just feels like obfuscation -- public void static private String[] class blabla... so many keywords and types getting in the way of *what the code is actually doing*. That's what's so appealing about Python, it strips it down to just the basics. And, to me, type hints are always going to be some unnecessary chaff that gets in the way of my understanding -- not useless, not that they don't have a purpose or that we should remove them completely. But if there was a way of just hiding them, so that I don't have to think about them, as a beginner, or as a normal programmer, most of the time, in the 90% of cases where I don't need to see them, I shouldn't have to... that's why i'm so keen on this stub files idea. One thing I don't understand is this "local variable inference" thing -- can that not be made to work in stub files? On 20 April 2015 at 21:50, Harry Percival wrote: > > stub files are only used to type-check *users* of a module. If you want > a module itself to be type-checked you have to use inline type hints > > is this a fundamental limitation, or just the current state of tooling? > > On 20 April 2015 at 21:48, Harry Percival > wrote: > >> > "I hate stub files. [...] in my opinion, [it] just about guarantees a >> maintenance burden that will fall by the side of the road. >> >> I'm not so pessimistic. It's not like documentation or docstrings or >> comments -- the whole point is that it should be very easy to have an >> automated check for whether your stubs are in sync with your source, >> because both are in code. Unlike docs or comments which can easily become >> out of date, because there's no automated process to tell you they need >> updating... I'm thinking of it as a thing your editor will warn you of. >> Like pyflakes warnings about unused variables & co, I'm never happy until >> I've silenced them all in a file, and similarly, your editor will keep >> bugging you until you've got your stubs inline with your code... >> >> >> On 20 April 2015 at 20:37, Isaac Morland wrote: >> >>> On Mon, 20 Apr 2015, Paul Moore wrote: >>> >>> On 20 April 2015 at 19:41, Barry Warsaw wrote: >>>> >>>>> tldr; type hints in python source are scary. Would reserving them for >>>>>> stub >>>>>> files be better? >>>>>> >>>>> >>>>> I think so. I think PEP 8 should require stub files for stdlib >>>>> modules and >>>>> strongly encourage them for 3rd party code. >>>>> >>>> >>>> Agreed. I have many of the same concerns as Harry, but I wouldn't have >>>> expressed them quite as well. I'm not too worried about actually >>>> removing annotations from the core language, but I agree that we >>>> should create a strong culture of "type hints go in stub files" to >>>> keep source files readable and clean. >>>> >>>> On that note, I'm not sure "stub" files is a particularly good name. >>>> Maybe "type files" would be better? Something that emphasises that >>>> they are the correct place to put type hints, not a workaround. >>>> >>> >>> How about "header" files? >>> >>> (ducks...) >>> >>> Isaac Morland CSCF Web Guru >>> DC 2619, x36650 WWW Software Specialist >>> >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net >>> >> >> >> >> -- >> ------------------------------ >> Harry J.W. 
Percival >> ------------------------------ >> Twitter: @hjwp >> Mobile: +44 (0) 78877 02511 >> Skype: harry.percival >> > > > > -- > ------------------------------ > Harry J.W. Percival > ------------------------------ > Twitter: @hjwp > Mobile: +44 (0) 78877 02511 > Skype: harry.percival > -- ------------------------------ Harry J.W. Percival ------------------------------ Twitter: @hjwp Mobile: +44 (0) 78877 02511 Skype: harry.percival -------------- next part -------------- An HTML attachment was scrubbed... URL: From graffatcolmingov at gmail.com Tue Apr 21 00:02:00 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Mon, 20 Apr 2015 17:02:00 -0500 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Mon, Apr 20, 2015 at 4:00 PM, Harry Percival wrote: > @Lukasz: > > Of course you're right, ugly is a matter of perspective, and I'm sure I > could grow to love them, and they might evolve into a more polished > direction > > > "they start to read more transparently after a while." > > But I'm still worried about beginners, and I'm even worried about me. I > like to be able to scan through some code and see the essence of it. I > learned Java at school, and I got it figured out, but i'm glad I left it > behind. Every so often I read a TDD book and the examples are all in java > and it just feels like obfuscation -- public void static private String[] > class blabla... so many keywords and types getting in the way of *what the > code is actually doing*. That's what's so appealing about Python, it > strips it down to just the basics. And, to me, type hints are always going > to be some unnecessary chaff that gets in the way of my understanding -- > not useless, not that they don't have a purpose or that we should remove > them completely. But if there was a way of just hiding them, so that I > don't have to think about them, as a beginner, or as a normal programmer, > most of the time, in the 90% of cases where I don't need to see them, I > shouldn't have to... that's why i'm so keen on this stub files idea. > > One thing I don't understand is this "local variable inference" thing -- > can that not be made to work in stub files? > > > > On 20 April 2015 at 21:50, Harry Percival > wrote: > >> > stub files are only used to type-check *users* of a module. If you want >> a module itself to be type-checked you have to use inline type hints >> >> is this a fundamental limitation, or just the current state of tooling? >> >> On 20 April 2015 at 21:48, Harry Percival >> wrote: >> >>> > "I hate stub files. [...] in my opinion, [it] just about guarantees a >>> maintenance burden that will fall by the side of the road. >>> >>> I'm not so pessimistic. It's not like documentation or docstrings or >>> comments -- the whole point is that it should be very easy to have an >>> automated check for whether your stubs are in sync with your source, >>> because both are in code. Unlike docs or comments which can easily become >>> out of date, because there's no automated process to tell you they need >>> updating... I'm thinking of it as a thing your editor will warn you of. >>> Like pyflakes warnings about unused variables & co, I'm never happy until >>> I've silenced them all in a file, and similarly, your editor will keep >>> bugging you until you've got your stubs inline with your code... 
>>>
>>> On 20 April 2015 at 20:37, Isaac Morland wrote:
>>>
>>>> On Mon, 20 Apr 2015, Paul Moore wrote:
>>>>
>>>>> On 20 April 2015 at 19:41, Barry Warsaw wrote:
>>>>>
>>>>>>> tldr; type hints in python source are scary. Would reserving them for stub
>>>>>>> files be better?
>>>>>>
>>>>>> I think so. I think PEP 8 should require stub files for stdlib modules and
>>>>>> strongly encourage them for 3rd party code.
>>>>>
>>>>> Agreed. I have many of the same concerns as Harry, but I wouldn't have
>>>>> expressed them quite as well. I'm not too worried about actually
>>>>> removing annotations from the core language, but I agree that we
>>>>> should create a strong culture of "type hints go in stub files" to
>>>>> keep source files readable and clean.
>>>>>
>>>>> On that note, I'm not sure "stub" files is a particularly good name.
>>>>> Maybe "type files" would be better? Something that emphasises that
>>>>> they are the correct place to put type hints, not a workaround.
>>>>
>>>> How about "header" files?
>>>>
>>>> (ducks...)
>>>>
>>>> Isaac Morland CSCF Web Guru
>>>> DC 2619, x36650 WWW Software Specialist
>>>>
>>>> _______________________________________________
>>>> Python-Dev mailing list
>>>> Python-Dev at python.org
>>>> https://mail.python.org/mailman/listinfo/python-dev
>>>> Unsubscribe:
>>>> https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net

So I've generally stayed out of this but I feel there is some context that people are missing in general.

First, allow me to provide some context: I maintain a /lot/ of Python code[1] and nearly all of it is designed to be compatible with Pythons 2.6, 2.7, 3.2, 3.3, 3.4 (and eventually 3.5) and sometimes 2.5 (depending on the project). If I want to improve a developer's experience with some of that code using Type Hints I will essentially have no way to do that unless I write the code with the annotations and ship versions with annotations stripped and other versions with annotations? That's a lot more overhead. If I could provide the annotations in stubs that means that only the people who care about using them will have to use them. Is it more overhead to manage twice the number of files? Yes. Do I feel it would be worth it to not overly complicate how these packages are released? Yes.

Further, there are far more reasons to make stubs the baseline (in my opinion): the biggest reason of all is that people want to provide stubs for popular yet unmaintained libraries as third party packages. Should everyone using PIL be using Pillow? Of course. Does that mean they'll migrate or be allowed to migrate? No. Should they be able to benefit from this? Yes they should. The only way for PIL users to be able to do that is if stub files can be packaged separately for PIL and distributed by someone else.

I think while the authors are currently seeing stubs as a necessary *evil* they're missing points where they're a better backwards compatible solution for people who want to give users with capable IDEs the ability to use stub (or hint) files.

Cheers,
Ian

[1]: I am a maintainer of flake8, requests, chardet, github3.py, uritemplate.py, rfc3986, sqlobject, the requests toolbelt, a bunch of the requests auth plugins, and I'm a core developer on a small number of openstack projects.
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From lukasz at langa.pl Tue Apr 21 00:14:29 2015
From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=)
Date: Mon, 20 Apr 2015 15:14:29 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <20150420144106.63513828@anarchist.wooz.org>
Message-ID:

On Apr 20, 2015, at 3:02 PM, Ian Cordasco wrote:
>
> I think while the authors are currently seeing stubs as a necessary *evil*
> they're missing points where they're a better backwards compatible solution
> for people who want to give users with capable IDEs the ability to use stub
> (or hint) files.

We might have not chosen the wording that makes this obvious to you but we definitely did consider all of the points you're referring to:

https://www.python.org/dev/peps/pep-0484/#stub-files

--
Best regards,
Łukasz Langa

WWW: http://lukasz.langa.pl/
Twitter: @llanga
IRC: ambv on #python-dev
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From robertc at robertcollins.net Tue Apr 21 00:37:00 2015
From: robertc at robertcollins.net (Robert Collins)
Date: Tue, 21 Apr 2015 10:37:00 +1200
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <20150420144106.63513828@anarchist.wooz.org>
Message-ID:

On 21 April 2015 at 10:02, Ian Cordasco wrote:
>
> So I've generally stayed out of this but I feel there is some context that
> people are missing in general.
>
> First, allow me to provide some context: I maintain a /lot/ of Python
> code[1] and nearly all of it is designed to be compatible with Pythons 2.6,
> 2.7, 3.2, 3.3, 3.4 (and eventually 3.5) and sometimes 2.5 (depending on the
> project). If I want to improve a developer's experience with some of that
> code using Type Hints I will essentially have no way to do that unless I
> write the code with the annotations and ship versions with annotations
> stripped and other versions with annotations? That's a lot more overhead. If
> I could provide the annotations in stubs that means that only the people who
> care about using them will have to use them.

2.5? I'm so sorry :).

Being in approximately the same boat, I definitely want to be able to improve the developer experience. That said, with one key exception (str/bytes/unicode) Python code generally has the same type on all versions. Sure it might be imported from somewhere else, and you're restricted to the common subset of APIs, but the types in use don't vary per-python. So - as long as your *developers* can run mypy on 3.2+, they can benefit from type checking. mypy itself requires 3.2+ to run, but programs with type annotations should be able to run on all those python versions you mention.

Now, what is the minimum barrier for entry? Nothing :) - at the moment every file can be analysed, and mypy ships with a bunch of stubs that describe the stdlib. So - you'll get some benefit immediately, where bad use of stdlib routines is happening.

Constraining the types of functions gets you better errors (because you are expressing intent rather than what-might-happen, which the inference has to work from otherwise). In particular, constraining the type of *inputs* can let bad callers be detected, rather than the engine assuming they are valid-until-a-contradiction-occurs.

You can do that with stub files: put them in repo A, and add them to the MYPYPATH when working on repo B which calls the code in repo A. You can also add those stubs to repo B, but I wouldn't do that because then they will skew vs repo A.
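(Robert's point about constraining the types of *inputs*, as a hypothetical sketch -- the function and values are invented here, and the checker's complaint is paraphrased in a comment:)

    from typing import List

    def mean(xs: List[float]) -> float:
        return sum(xs) / len(xs)

    mean([1.0, 2.0])          # fine: matches the declared input type
    # mean(["1.0", "2.0"])    # a checker can reject this caller outright:
    #                         # List[str] is not List[float]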
A further step up would be to annotate A in its code, rather than using stubs. That way, developers of repo A will be warned about bad uses of other repo A code. But - if you have stubs *or* annotations in-line in repo A, *everyone* changing repo A needs to care. Because if the code mismatches the stub, the folk that do care will now be unable to use repo A correctly - their type checker will complain about valid uses, and fail to complain about more invalid uses. I'm particularly interested in mypy for OpenStack because for some repos > 10% of reported bugs are type mismatch errors which mypy may well have avoided. > Is it more overhead to manage twice the number of files? Yes. Do I feel it > would be worth it to not overly complicate how these packages are released? > Yes. > Further, there are far more reasons to make stubs the baseline (in my > opinion) the biggest reason of all is that people want to provide stubs for > popular yet unmaintained libraries as third party packages. Should everyone > using PIL be using Pillow? Of course. Does that mean they'll migrate or be > allowed to migrate? No. Should they be able to benefit from this? Yes the > should. The only way for PIL users to be able to do that is if stub files > can be packaged separately for PIL and distributed by someone else. stubs can certainly be packaged and distributed separately. That doesn't make the case that we should use stubs for projects that are opting in. > I think while the authors are currently seeing stubs as a necessary *evil* > they're missing points where they're a better backwards compatible solution > for people who want to give users with capable IDEs the ability to use stub > (or hint) files. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From guido at python.org Tue Apr 21 01:12:31 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 20 Apr 2015 16:12:31 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Mon, Apr 20, 2015 at 2:01 PM, Robert Collins wrote: > On 21 April 2015 at 08:50, Harry Percival > wrote: > >> stub files are only used to type-check *users* of a module. If you want > a > >> module itself to be type-checked you have to use inline type hints > > > > is this a fundamental limitation, or just the current state of tooling? > > AIUI its the fundamental design. Stubs don't annotate python code, > they *are* annotated code themselves. They aren't merged with the > observed code at all. > > Could they be? Possibly. I don't know how much work that would be. > It's fundamental in the implementation of mypy. It doesn't have to be in the implementation of other type checkers (and IIRC the Google folks are planning to merge the two streams). However if you are using typing.get_type_hints(func) this is not intended to give you access to hints defined in stubs (it would require a huge amount of machinery to implement that right). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From harry.percival at gmail.com Mon Apr 20 23:09:18 2015 From: harry.percival at gmail.com (Harry Percival) Date: Mon, 20 Apr 2015 22:09:18 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: So I guess the main difference is that type annotations in stub files wouldn't be available at runtime? 
I.e., they wouldn't magically appear in __annotations__ (unless the python interpreter itself started to evaluate stub files too)

On 20 April 2015 at 22:02, Guido van Rossum wrote:
> On Mon, Apr 20, 2015 at 1:50 PM, Harry Percival wrote:
>> > stub files are only used to type-check *users* of a module. If you want
>> a module itself to be type-checked you have to use inline type hints
>>
>> is this a fundamental limitation, or just the current state of tooling?
>
> It's not fundamental, it's just more in line with the original purpose of
> stubs (to describe C extensions).
>
> --
> --Guido van Rossum (python.org/~guido)

--
------------------------------
Harry J.W. Percival
------------------------------
Twitter: @hjwp
Mobile: +44 (0) 78877 02511
Skype: harry.percival
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From harry.percival at gmail.com Tue Apr 21 00:34:51 2015 From: harry.percival at gmail.com (Harry Percival) Date: Mon, 20 Apr 2015 23:34:51 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: 

exactly. yay stub files! we all agree! everyone loves them!

boo type annotations inline in python source. only some people love them. and even then, only after a while, and only tentatively... and some people fear them, mightily...

On 20 April 2015 at 23:14, Łukasz Langa wrote:
> On Apr 20, 2015, at 3:02 PM, Ian Cordasco wrote:
>
> I think while the authors are currently seeing stubs as a necessary *evil*
> they're missing points where they're a better backwards compatible solution
> for people who want to give users with capable IDEs the ability to use stub
> (or hint) files.
>
> We might not have chosen the wording that makes this obvious to you, but we
> definitely did consider all of the points you're referring to:
>
> https://www.python.org/dev/peps/pep-0484/#stub-files
>
> --
> Best regards,
> Łukasz Langa
>
> WWW: http://lukasz.langa.pl/
> Twitter: @llanga
> IRC: ambv on #python-dev

--
------------------------------
Harry J.W. Percival
------------------------------
Twitter: @hjwp
Mobile: +44 (0) 78877 02511
Skype: harry.percival
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jackdied at gmail.com Tue Apr 21 01:41:00 2015 From: jackdied at gmail.com (Jack Diederich) Date: Mon, 20 Apr 2015 19:41:00 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: 

Twelve years ago a wise man said to me "I suggest that you also propose a new name for the resulting language"

I talked with many of you at PyCon about the costs of PEP 484. There are plenty of people who have done a fine job promoting the benefits.

* It is not optional. Please stop saying that. The people promoting it would prefer that everyone use it. If it is approved it will be optional in the way that PEP8 is optional. If I'm reading your annotated code it is certainly /not/ optional that I understand the annotations.

* Uploading stubs for other people's code is a terrible idea. Who do I contact when I update the interface to my library? The random Joe who "helped" by uploading annotations three months ago and then quit the internet? I don't even want to think about people maliciously adding stubs to PyPI.

* The cognitive load is very high. The average function signature will double in length.
This is not a small cost and telling me it is "optional" to pretend that every other word on the line doesn't exist is a farce. * Every company's style guide is about to get much longer. That in itself is an indicator that this is a MAJOR language change and not just some "optional" add-on. * People will screw it up. The same people who can't be trusted to program without type annotations are also going to be *writing* those type annotations. * Teaching python is about to get much less attractive. It will not be optional for teachers to say "just pretend all this stuff over here doesn't exist" * "No new syntax" is a lie. Or rather a red herring. There are lots of new things it will be required to know and just because the compiler doesn't have to change doesn't mean the language isn't undergoing a major change. If this wasn't in a PEP and it wasn't going to ship in the stdlib very few people would use it. If you told everyone they had to install a different python implementation they wouldn't. This is much worse than that - it is Python4 hidden away inside a PEP. There are many fine languages that have sophisticated type systems. And many bondage & discipline languages that make you type things three times to make really really sure you meant to type that. If you find those other languages appealing I invite you to go use them instead. -Jack https://mail.python.org/pipermail/python-dev/2003-February/033291.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Tue Apr 21 02:00:53 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 20 Apr 2015 19:00:53 -0500 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: Although I like the concept of type annotations and the PEP, I have to agree with this. If I saw these type annotations when learning Python (I'm self-taught), there's a 99% chance I would've freaked. It's the same issue as with teaching C++: it's wrong to say, "Hey, I taught you the basics, but there's other stuff that's going to confuse you to a ridiculous extent when you read it." People can't ignore it. It'll become a normal part of Python programs. At least now you can say, "I'm using the mypy type checker." Don't get me wrong; I like mypy. I helped with their documentation and am watching the GitHub repo. But this is dead-on. On Mon, Apr 20, 2015 at 6:41 PM, Jack Diederich wrote: > Twelve years ago a wise man said to me "I suggest that you also propose a > new name for the resulting language" > > I talked with many of you at PyCon about the costs of PEP 484. There are > plenty of people who have done a fine job promoting the benefits. > > * It is not optional. Please stop saying that. The people promoting it > would prefer that everyone use it. If it is approved it will be optional in > the way that PEP8 is optional. If I'm reading your annotated code it is > certainly /not/ optional that I understand the annotations. > > * Uploading stubs for other people's code is a terrible idea. Who do I > contact when I update the interface to my library? The random Joe who > "helped" by uploading annotations three months ago and then quit the > internet? I don't even want to think about people maliciously adding stubs > to PyPI. > > * The cognitive load is very high. The average function signature will > double in length. 
This is not a small cost and telling me it is "optional"
> to pretend that every other word on the line doesn't exist is a farce.
>
> [… the rest of Jack's message, quoted in full above …]
>
> -Jack
>
> https://mail.python.org/pipermail/python-dev/2003-February/033291.html
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com

--
Ryan
[ERROR]: Your autotools build scripts are 200 lines longer than your program. Something's wrong.
http://kirbyfan64.github.io/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From rdmurray at bitdance.com Tue Apr 21 02:43:38 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Mon, 20 Apr 2015 20:43:38 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <20150421004339.1DA5CB500F5@webabinitio.net>

+1 to this from me too. I'm afraid that means I'm -1 on the PEP.

I didn't write this in my earlier email because I wasn't sure about it, but my gut reaction after reading Harry's email was "if type annotations are used in the stdlib, I'll probably stop contributing". That doesn't mean that's *true*, but that's the first time I've ever had that thought, so it is probably worth sharing. Basically, it makes Python less fun to program in. That may be an emotional reaction and irrational, but I think it matters. And yes, I write production Python code for a living, though granted not at Google or Facebook or Dropbox scale.

On Mon, 20 Apr 2015 19:00:53 -0500, Ryan Gonzalez wrote:
> Although I like the concept of type annotations and the PEP, I have to
> agree with this. If I saw these type annotations when learning Python (I'm
> self-taught), there's a 99% chance I would've freaked.
>
> It's the same issue as with teaching C++: it's wrong to say, "Hey, I taught
> you the basics, but there's other stuff that's going to confuse you to a
> ridiculous extent when you read it." People can't ignore it. It'll become a
> normal part of Python programs.
>
> At least now you can say, "I'm using the mypy type checker."
> Don't get me wrong; I like mypy. I helped with their documentation and am
> watching the GitHub repo. But this is dead-on.
>
> On Mon, Apr 20, 2015 at 6:41 PM, Jack Diederich wrote:
> > Twelve years ago a wise man said to me "I suggest that you also propose a
> > new name for the resulting language"
> >
> > [… Jack's message, quoted in full a second time …]
> >
> > -Jack
> >
> > https://mail.python.org/pipermail/python-dev/2003-February/033291.html
>
> --
> Ryan
> [ERROR]: Your autotools build scripts are 200 lines longer than your
> program. Something's wrong.
> http://kirbyfan64.github.io/ > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/rdmurray%40bitdance.com From rosuav at gmail.com Tue Apr 21 02:45:47 2015 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 21 Apr 2015 10:45:47 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 9:41 AM, Jack Diederich wrote: > * It is not optional. Please stop saying that. The people promoting it would > prefer that everyone use it. If it is approved it will be optional in the > way that PEP8 is optional. If I'm reading your annotated code it is > certainly /not/ optional that I understand the annotations. > > * The cognitive load is very high. The average function signature will > double in length. This is not a small cost and telling me it is "optional" > to pretend that every other word on the line doesn't exist is a farce. > > * Every company's style guide is about to get much longer. That in itself is > an indicator that this is a MAJOR language change and not just some > "optional" add-on. Maybe I'm completely misreading everything here, but I would have thought that there are two completely different use-cases here: 1) Library authors 2) Application authors When you're writing a library, it can be a great help to provide type annotations, because every application that uses your library can benefit. When you're writing an application, you can completely ignore them, but still get the benefit of everyone else's. Most company style guides are going to spend most of their time dealing with application code. Maybe you'd put a few type annotations on some of your internal library-like routines, but you certainly don't need to adorn every single function parameter and return value in code that's called only from elsewhere. Say you have an application framework like Flask - you're going to be writing a bunch of functions that you never call from your own code, but which get called by the framework. Annotating their return values is almost completely useless - you're not going to be checking Flask itself for type errors! And the benefit of annotating their parameters is constrained to the functions themselves, so you can take your pick whether or not you annotate any particular function. Type hints are on par with all those other structured function signature tidbits - formatted docstrings, autodoc comments, etc, etc, etc. Has any one of those become mandatory for all code? Nope. And possibly the best answer to anyone who tries to demand type hints for all code is TheDailyWTF, where you can find JavaDoc like this: http://thedailywtf.com/articles/If_At_First_You_Don_0x27_t_Succeed http://thedailywtf.com/articles/SelfDocumenting I have no fears for my own code. Are you afraid for yours? ChrisA From ben+python at benfinney.id.au Tue Apr 21 02:52:52 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 21 Apr 2015 10:52:52 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <85sibumhej.fsf@benfinney.id.au> Chris Angelico writes: > On Tue, Apr 21, 2015 at 9:41 AM, Jack Diederich wrote: > > * It is not optional. Please stop saying that. The people promoting > > it would prefer that everyone use it. 
If it is approved it will be
> > optional in the way that PEP8 is optional. If I'm reading your
> > annotated code it is certainly /not/ optional that I understand the
> > annotations.

[…]

> Maybe I'm completely misreading everything here […]

I think you've misunderstood the complaint.

> When you're writing a library, it can be a great help to provide type
> annotations, because every application that uses your library can
> benefit. When you're writing an application, you can completely ignore
> them, but still get the benefit of everyone else's.

Jack is not complaining only about *writing* code. He's complaining about the effect this will have on code that we all are expected to *read*.

Programmers spend a great deal of time reading code written by other people. The costs of this proposal are only partly on the writers of the code; they are significantly borne by the people *reading* that code. For them, it is not optional.

> I have no fears for my own code. Are you afraid for yours?

Jack, if I understand correctly, fears for the code that will be written by others in conformance with this proposal, that he will then have to read and understand.

--
 \   "The opposite of a correct statement is a false statement. But |
  `\   the opposite of a profound truth may well be another profound |
_o__)  truth." -- Niels Bohr |
Ben Finney

From rosuav at gmail.com Tue Apr 21 03:07:02 2015 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 21 Apr 2015 11:07:02 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <85sibumhej.fsf@benfinney.id.au> References: <20150420144106.63513828@anarchist.wooz.org> <85sibumhej.fsf@benfinney.id.au> Message-ID: 

On Tue, Apr 21, 2015 at 10:52 AM, Ben Finney wrote:
> Chris Angelico writes:
>
>> On Tue, Apr 21, 2015 at 9:41 AM, Jack Diederich wrote:
>> > * It is not optional. Please stop saying that. [… as quoted above …]
>
> […]
>
> Jack is not complaining only about *writing* code. He's complaining
> about the effect this will have on code that we all are expected to
> *read*.

Ahh. Yes, that's a concern. When you go digging into that library to find out how it works, yes, you'd be face-to-face with their type annotations. I doubt the worst-case complex ones are going to come up all that often, though. Sure, you might declare that something returns a list of dictionaries mapping tuples of integers and strings to list of sets of atoms, but that's hardly common. (And chances are you can just declare that it returns List[Dict] and have done with it.)

Maybe it'd be of value to have a quick "code stripper" that takes away all the annotations, plus any other junk/framing that you're not interested in, and gives you something you can browse in a text editor? It could take away all the Sphinx adornments from docstrings, any source control versioning markers, all that kind of thing.
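A rough sketch of such a stripper for the annotations part, using only the ast module (this assumes ast.unparse, which is newer than the Pythons under discussion; an older setup would need a third-party unparser, and docstring/comment cleanup would be a separate pass):

    import ast

    def strip_annotations(source):
        # Parse the module, blank out parameter and return annotations,
        # then emit plain source text again.
        tree = ast.parse(source)
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                node.returns = None
                args = node.args
                for arg in args.args + args.kwonlyargs:
                    arg.annotation = None
                if args.vararg:
                    args.vararg.annotation = None
                if args.kwarg:
                    args.kwarg.annotation = None
        return ast.unparse(tree)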
Then you could read through the code in a simpler form, while still having type annotations there for you if you need them. ChrisA From ncoghlan at gmail.com Tue Apr 21 03:30:01 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 20 Apr 2015 21:30:01 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150420144106.63513828@anarchist.wooz.org> References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 20 Apr 2015 14:44, "Barry Warsaw" wrote: > > On Apr 20, 2015, at 07:30 PM, Harry Percival wrote: > > >tldr; type hints in python source are scary. Would reserving them for stub > >files be better? > > I think so. I think PEP 8 should require stub files for stdlib modules and > strongly encourage them for 3rd party code. +1 Having stub files take precedence over inline annotations would also neatly deal with the annotation compatibility problem (you can use a stub file to annotate code using annotations for another purpose). Another point in favour of that approach: stub files could be evaluated holistically rather than statement at a time, permitting implicit forward references. Cheers, Nick. > > Cheers, > -Barry > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Tue Apr 21 05:00:50 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 13:00:50 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <20150421030048.GH5663@ando.pearwood.info> On Mon, Apr 20, 2015 at 11:34:51PM +0100, Harry Percival wrote: > exactly. yay stub files! we all agree! everyone loves them! Not even close. -- Steve From steve at pearwood.info Tue Apr 21 05:29:34 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 13:29:34 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: <20150421032934.GI5663@ando.pearwood.info> On Mon, Apr 20, 2015 at 07:30:39PM +0100, Harry Percival wrote: > Hi all, > > tldr; type hints in python source are scary. Would reserving them for stub > files be better? No no no, a thousand times no it would not! Please excuse my extreme reaction, but over on the python-list mailing list (comp.lang.python if you prefer Usenet) we already had this discussion back in January. Anyone wishing to read those conversations should start here: https://mail.python.org/pipermail/python-list/2015-January/697202.html https://mail.python.org/pipermail/python-list/2015-January/697315.html Be prepared for a long, long, long read. Nothing in your post hasn't already been discussed (except for your proposal to deprecate annotations altogether). So if you feel that I'm giving any of your ideas or concerns short-shrift, I'm not, it's just that I've already given them more time than I can afford. And now I get to do it all again, yay. (When reading those threads, please excuse my occasional snark towards Rick, he is a notorious troll on the list and sometimes I let myself be goaded into somewhat less than professional responses.) While I sympathise with your thought that "it's scary", I think it is misguided and wrong. 
As a thought-experiment, let us say that we roll back the clock to 1993 or thereabouts, just as Python 1.0 (or so) was about to be released, and Guido proposed adding *default values* to function declarations [assuming they weren't there from the start]. If we were used to Python's clean syntax:

    def zipmap(f, xx, yy):

the thought of having to deal with default values:

    def zipmap(f=None, xx=(), yy=()):

might be scary. Especially since those defaults could be arbitrarily complex expressions. Twenty years on, what should we think about such fears?

Type hinting or declarations are extremely common in programming languages, and I'm not just talking about older languages like C and Java. New languages, both dynamic and static, like Cobra, Julia, Haskell, Go, Boo, D, F#, Fantom, Kotlin, Rust and many more include optional or mandatory type declarations. You cannot be a programmer without expecting to deal with type hints/declarations somewhere: you meet them as soon as you read code written in other languages (and surely you do that, don't you? you practically cannot escape Java and C code on the internet), and in my opinion Python cannot be a modern language without them.

I'm going to respond to your recommendation to use stub files in another post (replying to Barry Warsaw), here I will discuss your concerns first.

> My first reaction to type hints was "yuck", and I'm sure I'm not the only
> one to think that. viz (from some pycon slides):
>
> def zipmap(f: Callable[[int, int], int], xx: List[int],
>            yy: List[int]) -> List[Tuple[int, int, int]]:
>
> arg. and imagine it with default arguments.

You've picked a complex example and written it poorly. I'd say yuck too, but let's use something closer to PEP-8 formatting:

    def zipmap(f: Callable[[int, int], int],
               xx: List[int],
               yy: List[int]
               ) -> List[Tuple[int, int, int]]:

Not quite so bad with each parameter on its own line. It's actually quite readable, once you learn what the annotations mean. Like all new syntax, of course you need to learn it. But the type hints are just regular Python expressions.

> Of course, part of this reaction is just a knee-jerk reaction to the new
> and unfamiliar, and should be dismissed, entirely justifiably, as mere
> irrationality. But I'm sure sensible people agree that they do make our
> function definitions longer, more complex, and harder to read.

Everything has a cost and all features, or lack of features, are a trade-off. Function definitions could be even shorter and simpler and easier to read if we didn't have default values.

[...]

> I'm not so sure. My worry is that once type hinting gets standardised,
> then they will become a "best practice", and there's a particular
> personality type out there that's going to start wanting to add type hints
> to every function they write. Similarly to mindlessly obeying PEP8 while
> ignoring its intentions, hobgoblin-of-little-minds style, I think we're
> very likely to see type hints appearing in a lot of python source, or a lot
> of pre-commit-hook checkers. Pretty soon it will be hard to find any open
> source library code that doesn't have type hints, or any project style
> guide that doesn't require them.

I doubt that very much. I'm not a betting man, but if I were, I would put money on it.

Firstly: libraries tend to be multi-version, and these days they are often hybrid Python 2 & 3 code. Since annotations are 3 only, libraries cannot use these type hints until they drop support for Python 2.7, which will surely be *no less* than five years away. Probably more like ten.
So "annotations everywhere" are, at best, many years away. Secondly, more importantly, there are a lot of Python programmers who (like you) dislike type hinting. They are not going to suddenly jump on the band wagon and use them. Even if the libraries that they use have annotations, most people don't read the source and will never know. But the biggest reason that I know that there will not be a sudden rush to horrible enterprisey code in Python due to type annotations is that Python has allowed enterprisey code for 20+ years and it hasn't become common practice. We can overuse design patterns, and turn everything into a property (or worse, getter/setter methods) just like Java, and we simple don't. Well, perhaps some libraries do. Since we don't tend to read their source code, how would we know if they are full of properties and decorators and metaclasses doing the most horrid things imaginable? All we care about is that the code we write is ordinary readable Python code, and it works. (There may be exceptions, but they are *exceptions*.) There is no epidemic of style guides making properties mandatory, or git check-ins that reject any function that fails to have doctests, or any of the bad things that have been possible in Python for two decades. Type hints can be abused too, and just like all the other things that can be, they won't be abused to the degree you fear. My comments on stub files to follow. -- Steve From steve at pearwood.info Tue Apr 21 05:34:21 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 13:34:21 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150420144106.63513828@anarchist.wooz.org> References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <20150421033421.GJ5663@ando.pearwood.info> On Mon, Apr 20, 2015 at 02:41:06PM -0400, Barry Warsaw wrote: > On Apr 20, 2015, at 07:30 PM, Harry Percival wrote: > > >tldr; type hints in python source are scary. Would reserving them for stub > >files be better? > > I think so. I think PEP 8 should require stub files for stdlib modules and > strongly encourage them for 3rd party code. A very, very strong -1 to that. Stub files are a necessary evil. Except where absolutely necessary, they should be strongly discouraged. A quote from the Go FAQs: Dependency management is a big part of software development today but the ?header files? of languages in the C tradition are antithetical to clean dependency analysis?and fast compilation. http://golang.org/doc/faq#What_is_the_purpose_of_the_project Things that go together should be together. A function parameter and its type information (if any) go together: the type is as much a part of the parameter declaration as the name and the default. Putting them together is the best situation: def func(n: Integer): ... and should strongly be prefered as best practice for when you choose to use type hinting at all. Alternatives are not as good. Second best is to put them close by, as in a decorator: @typing(n=Integer) # Don't Repeat Yourself violation def func(n): ... A distant third best is a docstring. Not only does it also violate DRY, but it also increases the likelyhood of errors: def func(n): """Blah blah blah blah blah blah Arguments: m: Integer """ Keeping documentation and code in synch is hard, and such mistakes are not uncommon. Putting the type information in a stub file is an exponentially more distant fourth best, or to put it another way, *the worst* solution for where to put type hints. 
Not only do you Repeat Yourself with the name of the parameter, but also the name of the function (or method and class) AND module. The type information *isn't even in the same file*, which increases the chance of it being lost, forgotten, deleted, out of date, unmaintained, etc.

--
Steve

From guido at python.org Tue Apr 21 05:37:28 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 20 Apr 2015 20:37:28 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: 

On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich wrote:
> Twelve years ago a wise man said to me "I suggest that you also propose a
> new name for the resulting language"

The barrage of FUD makes me feel like the woman who asked her doctor for a second opinion and was told "you're ugly too."

> I talked with many of you at PyCon about the costs of PEP 484. There are
> plenty of people who have done a fine job promoting the benefits.

So excuse me if I also defend the PEP against your attack.

> * It is not optional. Please stop saying that. The people promoting it
> would prefer that everyone use it. If it is approved it will be optional in
> the way that PEP8 is optional. If I'm reading your annotated code it is
> certainly /not/ optional that I understand the annotations.

The PEP 8 analogy is actually great -- that PEP's style advice *is* optional and plenty of people ignore it happily. The benefits come out when you're sharing code but violations are no big deal. And as to requiring what the annotations mean: the CPython interpreter itself doesn't even use them when executing your code, so you can certainly pretend that the annotations aren't there and read the body of the function or the docstring instead.

> * Uploading stubs for other people's code is a terrible idea. Who do I
> contact when I update the interface to my library? The random Joe who
> "helped" by uploading annotations three months ago and then quit the
> internet? I don't even want to think about people maliciously adding stubs
> to PyPI.

We're certainly not planning to let arbitrary people upload stubs for arbitrary code to PyPI that will automatically be used by the type checkers. (We can't stop people from publishing their stubs, just as you can't stop people from writing blog posts or stackoverflow answers with examples for your library.)

The actual plan is to have a common repository of stubs (a prototype exists at https://github.com/JukkaL/typeshed) but we also plan some kind of submission review. I've proposed that when submitting stubs for a package you don't own, the typeshed owners ask the package owner what their position is, and we expect the answers to fall on the following spectrum:

- I don't want stubs uploaded for my package
- I'll write the stubs myself
- I want to review all stubs that are uploaded for my package before they are accepted
- Please go ahead and add stubs for my package and let me know when they're ready
- Go ahead, I trust you

This seems a reasonable due diligence policy that avoids the scenarios you're worried about. (Of course if you refuse stubs a black market for stubs might spring into existence. That sounds kind of exciting... :-)

> * The cognitive load is very high. The average function signature will
> double in length. This is not a small cost and telling me it is "optional"
> to pretend that every other word on the line doesn't exist is a farce.
Have you never found yourself trying to read someone else's code and being stopped in your tracks because you couldn't find out what kind of argument was expected somewhere? I'm personally quite tired of seeing that an argument is a "file" and not knowing whether it's meant to be an IO stream or a filename (and was that bytes or unicode? :-). Or seeing something documented as "creation time" without knowing whether that's a datetime or a POSIX timestamp (or maybe milliseconds or a string encoding a date). Etc., etc.

And yes, you can put the info in docstrings or comments (though it won't be less verbose, and it will be more likely wrong). One way to see type hints is as a way to save on docstrings and comments by using a standard notation that ties type info to the argument without the need to repeat the argument name. Yes, you need to use your brain. But don't tell me that all the Python code you read or write is self-explanatory, because it isn't.

> * Every company's style guide is about to get much longer. That in itself
> is an indicator that this is a MAJOR language change and not just some
> "optional" add-on.

Actually, standardization *reduces* the length of a company style guide, like PEP 8 did before. Would you rather have instructions for the specific linter your company uses, instead of the single phrase "use type hints (PEP 484)"?

> * People will screw it up. The same people who can't be trusted to program
> without type annotations are also going to be *writing* those type
> annotations.

I don't follow this argument. So what? People will screw up anything you give them: introspection, metaclasses, import hooks, reduce(), you name it.

> * Teaching python is about to get much less attractive. It will not be
> optional for teachers to say "just pretend all this stuff over here doesn't
> exist"

This argument is particularly weak. Teachers are actually *very* effective teaching only the parts of Python they need -- some start with just functions and variables. Others with just classes and methods. Few ever even teach operator overloading.

> * "No new syntax" is a lie. Or rather a red herring. There are lots of new
> things it will be required to know and just because the compiler doesn't
> have to change doesn't mean the language isn't undergoing a major change.

The language undergoes major changes all the time. There was a time before context managers. You've survived print() becoming a function. We're getting a matrix multiply operator. The sky isn't falling.

> If this wasn't in a PEP and it wasn't going to ship in the stdlib very few
> people would use it. If you told everyone they had to install a different
> python implementation they wouldn't. This is much worse than that - it is
> Python4 hidden away inside a PEP.

But the nice thing is that this time you really *can* mix Python 3 and Python 4 together in one app. :-P

> There are many fine languages that have sophisticated type systems. And
> many bondage & discipline languages that make you type things three times
> to make really really sure you meant to type that. If you find those other
> languages appealing I invite you to go use them instead.
>
> -Jack
> https://mail.python.org/pipermail/python-dev/2003-February/033291.html

So finally it's payback time? Well played sir. :-)

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From tjreedy at udel.edu Tue Apr 21 08:13:55 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 21 Apr 2015 02:13:55 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <85sibumhej.fsf@benfinney.id.au> Message-ID: On 4/20/2015 9:07 PM, Chris Angelico wrote: > On Tue, Apr 21, 2015 at 10:52 AM, Ben Finney wrote: >> Jack is not complaining only about *writing* code. He's complaining >> about the effect this will have on code that we all are expected to >> *read*. For reading, good function and parameter names and a good summary as the first line of the docstring are probably more useful *to humans* than formal type annotations. > Ahh. Yes, that's a concern. When you go digging into that library to > find out how it works, yes, you'd be face-to-face with their type > annotations. > Maybe it'd be of value to have a quick "code stripper" that takes away > all the annotations, I was already thinking about that for Idle. The file should then be either semi read-only or renamed. Also options to merge annotations in from stub files for editing and to split back out when writing. > plus any other junk/framing that you're not > interested in, and gives you something you can browse in a text > editor? It could take away all the Sphinx adornments from docstrings, > any source control versioning markers, all that kind of thing. Interesting idea. -- Terry Jan Reedy From mal at egenix.com Tue Apr 21 09:33:52 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Tue, 21 Apr 2015 09:33:52 +0200 Subject: [Python-Dev] typeshed for 3rd party packages (was: Type hints -- a mediocre programmer's reaction) In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <5535FD60.7020404@egenix.com> On 21.04.2015 05:37, Guido van Rossum wrote: > On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich wrote: >> * Uploading stubs for other people's code is a terrible idea. Who do I >> contact when I update the interface to my library? The random Joe who >> "helped" by uploading annotations three months ago and then quit the >> internet? I don't even want to think about people maliciously adding stubs >> to PyPI. >> > > We're certainly not planning to let arbitrary people upload stubs for > arbitrary code to PyPI that will automatically be used by the type > checkers. (We can't stop people from publishing their stubs, just as you > can't stop people from writing blog posts or stackoverflow answers with > examples for your library.) > > The actual plan is to have a common repository of stubs (a prototype exists > at https://github.com/JukkaL/typeshed) but we also plan some kind of > submission review. I've proposed that when submitting stubs for a package > you don't own, the typeshed owners ask the package owner what their > position is, and we expect the answers to fall on the following spectrum: > > - I don't want stubs uploaded for my package > - I'll write the stubs myself > - I want to review all stubs that are uploaded for my package before they > are accepted > - Please go ahead and add stubs for my package and let me know when they're > ready > - Go ahead, I trust you > > This seems a reasonable due diligence policy that avoids the scenarios > you're worried about. (Of course if you refuse stubs a black market for > stubs might spring into existence. That sounds kind of exciting... :-) Hmm, that's the first time I've heard about this. 
I agree with Jack that it's a terrible idea to allow this for 3rd party packages. If people want to contribute stubs, they should contribute them to the package in the usual ways, not in a side channel.

The important part missing in the above setup is maintenance and to some extent an external change of the API definitions. Both require active participation in the package project, not the separated setup proposed above, to be effective and actually work out in the long run.

For the stdlib, typeshed looks like a nice idea to spread the workload.

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source (#1, Apr 21 2015)
>>> Python Projects, Coaching and Consulting ... http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/
________________________________________________________________________

::::: Try our mxODBC.Connect Python Database Interface for free ! ::::::

eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/

From solipsis at pitrou.net Tue Apr 21 09:49:58 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 21 Apr 2015 09:49:58 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> Message-ID: <20150421094958.2960986c@fsol>

On Mon, 20 Apr 2015 20:43:38 -0400 "R. David Murray" wrote:
> +1 to this from me too. I'm afraid that means I'm -1 on the PEP.
>
> I didn't write this in my earlier email because I wasn't sure about it,
> but my gut reaction after reading Harry's email was "if type annotations
> are used in the stdlib, I'll probably stop contributing". That doesn't
> mean that's *true*, but that's the first time I've ever had that
> thought, so it is probably worth sharing.

I think it would be nice to know what the PEP means for daily stdlib development. If patches have to carry typing information each time they add/enhance an API that's an additional burden. If typing is done separately by interested people then it sounds like it wouldn't have much of an impact on everyone else's workflow.

Regards

Antoine.

From cory at lukasa.co.uk Tue Apr 21 10:58:45 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Tue, 21 Apr 2015 09:58:45 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: 

On 21 April 2015 at 01:45, Chris Angelico wrote:
> When you're writing a library, it can be a great help to provide type
> annotations, because every application that uses your library can
> benefit.

It can be a great help to whom? Not to me (the library author), because I can't use them in my library code, because I have to support 2.7. That's by no means a bad thing (after all, most libraries are written to help others), but I found it really unclear who was being advantaged here.

To show the downside, I decided to annotate requests' API a little bit. Then I stopped, because I really couldn't work out how to do it. For example, requests.get() takes a URL parameter. This can be an instance of basestring, or anything that provides a __str__ or __unicode__ method that provides us a string that looks like a URL (such as a URL abstraction class).
We don't know ahead of time what those objects may be, so there's no type signature I can provide here that is of any use beyond 'Any'. We also provide headers/params options, whose type signature would be (as far as I can tell) Union[Iterable[Tuple[basestring, basestring]], Mapping[basestring, basestring]], which is a fairly unpleasant type (and also limiting, as we totally accept two-element lists here as well as two-tuples. That's got *nothing* on the type of the `files` argument, which is the most incredibly polymorphic argument I've ever seen: the best I can work out it would be: Optional[ Union[ Mapping[ basestring, Union[ Tuple[basestring, Optional[Union[basestring, file]]], Tuple[basestring, Optional[Union[basestring, file]], Optional[basestring]], Tuple[basestring, Optional[Union[basestring, file]], Optional[basestring], Optional[Headers]] ] ], Iterable[ Tuple[ basestring, Union[ Tuple[basestring, Optional[Union[basestring, file]]], Tuple[basestring, Optional[Union[basestring, file]], Optional[basestring]], Tuple[basestring, Optional[Union[basestring, file]], Optional[basestring], Optional[Headers]] ] ] ] ] Headers = Union[ Mapping[basestring, basestring], Iterable[Tuple[basestring, basestring]], ] This includes me factoring out the Headers type for my own sanity, and does not include the fact that in practice almost all instances of 'basestring' above actually mean 'anything that can be safely cast to basestring', and 'file' means 'any file-like object including BytesIO and StringIO' (speaking of which, the PEP doesn't even begin to go into how to handle that kind of requirement: do we have some kind of Readable interface that can be implemented? Or do I have to manually union together all types someone might want to use?). Now, this can be somewhat improved: the three types of tuple (again, not just tuples, often lists!) can be factored out, as can the Union[basestring, file]. However, there's still really no way around the fact that this is a seriously complex type signature! Python has had years of people writing APIs that feel natural, even if that means weird contortions around 'types'. Writing an argument as 'clever' as the 'files' argument in requests in a statically typed language is an absolute nightmare, but Python makes it an easy and natural thing to do. Further, Python's type system is not sufficiently flexible to allow library authors to adequately specify the types their code actually works on. I need to be able to talk about interfaces, because interfaces are the contract around which APIs are build in Python, but I cannot do that with this system in a way that makes any sense at all. To even begin to be useful for library authors this PEP would need to allow some kind of type hierarchy that is *not* defined by inheritance, but instead by interfaces. We've been discouraging use of 'type' and 'isinstance' for years because they break duck typing, but that has *nothing* on what this PEP appears to do to duck typing. I suppose the TLDR of this email is that I think that libraries with 'pythonic' APIs are the least likely to take up this typing system because it will provide the least value to them. Maintaining signatures will be a pain (stub files are necessary), writing clear signatures will be a pain (see above), and writing signatures that are both sufficiently open to be back-compatible and sufficiently restrictive to be able to catch bugs seems like it might be very nearly impossible. I'd love for someone to tell me I'm wrong, though. 
I want to like this PEP, I really do. From rosuav at gmail.com Tue Apr 21 11:10:40 2015 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 21 Apr 2015 19:10:40 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 6:58 PM, Cory Benfield wrote: > On 21 April 2015 at 01:45, Chris Angelico wrote: >> When you're writing a library, it can be a great help to provide type >> annotations, because every application that uses your library can >> benefit. > > It can be a great help to whom? Not to me (the library author), > because I can't use them in my library code, because I have to support > 2.7. That's by no means a bad thing (after all, most libraries are > written to help others), but I found it really unclear who was being > advantaged here. Mainly application users get the benefit, I expect. But I may be wrong. > ... That's got *nothing* on the type of the `files` > argument, which is the most incredibly polymorphic argument I've ever > seen: the best I can work out it would be: > > Optional[ > Union[ > Mapping[ > basestring, > Union[ > Tuple[basestring, Optional[Union[basestring, file]]], > Tuple[basestring, Optional[Union[basestring, file]], > Optional[basestring]], > Tuple[basestring, Optional[Union[basestring, file]], > Optional[basestring], Optional[Headers]] > ] > ], > Iterable[ > Tuple[ > basestring, > Union[ > Tuple[basestring, Optional[Union[basestring, file]]], > Tuple[basestring, Optional[Union[basestring, > file]], Optional[basestring]], > Tuple[basestring, Optional[Union[basestring, > file]], Optional[basestring], Optional[Headers]] > ] > ] > ] > ] At this point, you may want to just stop caring about the exact type. Part of the point of gradual typing is that you can short-cut a lot of this. And quite frankly, this isn't really helping anything. Just skip it and say that it's Union[Mapping, Iterable, None]. ChrisA From cory at lukasa.co.uk Tue Apr 21 11:33:28 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Tue, 21 Apr 2015 10:33:28 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 21 April 2015 at 10:10, Chris Angelico wrote: > At this point, you may want to just stop caring about the exact type. > Part of the point of gradual typing is that you can short-cut a lot of > this. And quite frankly, this isn't really helping anything. Just skip > it and say that it's Union[Mapping, Iterable, None]. I think this is my problem with the proposal. This change doesn't catch any of the bugs anyone was going to hit (no-one is passing a callable to this parameter), and it doesn't catch any of the bugs someone might actually hit (getting the tuple elements in the wrong order, for example). At this point the type signature is worse than useless. It seems like the only place the type annotations will get used is in relatively trivial cases where the types are obvious anyway. I don't deny that *some* bugs will be caught, but I suspect they'll overwhelmingly be crass ones that would have been caught quickly anyway. The minute a type signature might actually help prevent a subtle bug we can't use them anymore because the toolchain isn't powerful enough to do it so we just put 'Any' and call it a day. 
And even then we haven't helped the trivial case much because anyone who wants to provide a duck-typed object that should be perfectly suitable in this case no longer can. What this proposal *needs* to be generally useful, IMO, is the ability to specify types that actually match the way we have encouraged people to write code: duck typed, favouring composition over inheritance, etc. At the very least that means 'type constraints' (see Haskell, Rust, really any language with a powerful and robust type system: hell, even see Go's interfaces) need to be something we can specify. From steve at pearwood.info Tue Apr 21 12:27:23 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 20:27:23 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <20150421102722.GA27773@ando.pearwood.info> On Mon, Apr 20, 2015 at 08:37:28PM -0700, Guido van Rossum wrote: > On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich wrote: > > > Twelve years ago a wise man said to me "I suggest that you also propose a > > new name for the resulting language" > > > > The barrage of FUD makes me feel like the woman who asked her doctor for a > second opinion and was told "you're ugly too." Don't worry Guido, some of us are very excited to see this coming to fruition :-) It's been over ten years since your first blog post on optional typing for Python. At least nobody can accuse you of rushing into this. http://www.artima.com/weblogs/viewpost.jsp?thread=85551 -- Steve From rob.cliffe at btinternet.com Tue Apr 21 12:56:15 2015 From: rob.cliffe at btinternet.com (Rob Cliffe) Date: Tue, 21 Apr 2015 11:56:15 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <55362CCF.3020809@btinternet.com> On 21/04/2015 10:33, Cory Benfield wrote: > On 21 April 2015 at 10:10, Chris Angelico wrote: >> At this point, you may want to just stop caring about the exact type. >> Part of the point of gradual typing is that you can short-cut a lot of >> this. And quite frankly, this isn't really helping anything. Just skip >> it and say that it's Union[Mapping, Iterable, None]. > I think this is my problem with the proposal. This change doesn't > catch any of the bugs anyone was going to hit (no-one is passing a > callable to this parameter), and it doesn't catch any of the bugs > someone might actually hit (getting the tuple elements in the wrong > order, for example). At this point the type signature is worse than > useless. Exactly. At this point putting the type signatures in seems to be a pointless exercise in masochism (for the author) and sadism (for the reader). The refreshing thing about Python is that it is a fantastic, *concise*, dynamically typed language, at the opposite end of the spectrum from C++ (ugh!). We have got used to the consequences (good and bad) of this: Duck typing, e.g. a function to sort numbers (sorry Tim Peters bad example) turns out to support any kind of object (e.g. strings) that supports comparison. Not to mention objects of some class that will be written in 5 years time. (Adding a type hint that restricted the argument to say a sequence of numbers turns out to be a mistake. And what is a number? Is Fraction? What about complex numbers, which can't be sorted? What if the function were written before the Decimal class?) 
Errors are often not caught until run time that would be caught at compile time in other languages (though static code checkers help). (Not much of a disadvantage because of Python's superb error diagnostics.) Python code typically says what it is doing, with the minimum of syntactic guff. (Well, apart from colons after if/while/try etc. :-) ) Which makes it easy to read. Now it seems as if this proposal wants to start turning Python in the C++ direction, encouraging adding ugly boilerplate code. (This may only be tangentially relevant, but I want to scream when I see some combination of public/private/protected/static/extern etc., most of which I don't understand.) Chris A makes the valid point (if I understand correctly) that Authors of libraries should make it as easy as possible to (i) know what object types can be passed to functions (ii) diagnose when the wrong type of object is passed Authors of apps are not under such obligation, they can basically do what they want. Well, (i) can be done with good documentation (docstrings etc.). (ii) can be done with appropriate runtime checks and good error messages. E.g. (Cory's example) I'm sure it is possible to test somehow if an object is file-like (if only by trying to access it like a file). Is thorough argument checking and provision of good diagnostics going to be easy for the library author? No. Is it going to be a lot of work to do thoroughly? Almost certainly yes. But what the hell, writing a production-quality library is not an exercise for newbies. It seems to me that type hints are attempting to be a silver bullet and to capture in a simple formula what is often, in practice, *not simple at all*, viz. "Is this passed object suitable?". Attempting - and failing, except in the simplest cases. Apologies, Guido, but: There was once a Chinese student who was a marvellous painter. He painted a perfect life-like picture of a snake. But he was unable to stop and leave it alone. In his zeal he went on to paint feet on the snake. Which of course completely spoiled the picture, as snakes don't have feet. Hence "to paint feet on the snake": to be unable to resist tinkering with something that is already good. (I suppose "If it ain't broke, don't fix it" is an approximate Western equivalent.) You see where I'm going with this - adding type hints to Python feels a bit like painting feet on the snake. Or at least turning it into a different language. Best wishes Rob Cliffe -------------- next part -------------- An HTML attachment was scrubbed... URL: From gjcarneiro at gmail.com Tue Apr 21 13:23:16 2015 From: gjcarneiro at gmail.com (Gustavo Carneiro) Date: Tue, 21 Apr 2015 12:23:16 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <55362CCF.3020809@btinternet.com> References: <20150420144106.63513828@anarchist.wooz.org> <55362CCF.3020809@btinternet.com> Message-ID: On 21 April 2015 at 11:56, Rob Cliffe wrote: > On 21/04/2015 10:33, Cory Benfield wrote: > > On 21 April 2015 at 10:10, Chris Angelico wrote: > > At this point, you may want to just stop caring about the exact type. > Part of the point of gradual typing is that you can short-cut a lot of > this. And quite frankly, this isn't really helping anything. Just skip > it and say that it's Union[Mapping, Iterable, None]. > > I think this is my problem with the proposal. 
This change doesn't > catch any of the bugs anyone was going to hit (no-one is passing a > callable to this parameter), and it doesn't catch any of the bugs > someone might actually hit (getting the tuple elements in the wrong > order, for example). At this point the type signature is worse than > useless. > > Exactly. At this point putting the type signatures in seems to be a > pointless exercise in masochism (for the author) and sadism (for the > reader). > > The refreshing thing about Python is that it is a fantastic, *concise*, > dynamically typed language, at the opposite end of the spectrum from C++ > (ugh!). > We have got used to the consequences (good and bad) of this: > Duck typing, e.g. a function to sort numbers (sorry Tim Peters bad > example) turns out to support any kind of object (e.g. strings) that > supports comparison. > Not to mention objects of some class that will be written in 5 > years time. > (Adding a type hint that restricted the argument to say a sequence > of numbers turns out to be a mistake. And what is a number? > Is Fraction? What about complex numbers, which can't be sorted? > What if the function were written before the Decimal class?) > Errors are often not caught until run time that would be caught at > compile time in other languages (though static code checkers help). > (Not much of a disadvantage because of Python's superb error > diagnostics.) > Python code typically says what it is doing, with the minimum of > syntactic guff. (Well, apart from colons after if/while/try etc. :-) ) > Which makes it easy to read. > Now it seems as if this proposal wants to start turning Python in the C++ > direction, encouraging adding ugly boilerplate code. (This may only be > tangentially relevant, but I want to scream when I see some combination of > public/private/protected/static/extern etc., most of which I don't > understand.) > > Chris A makes the valid point (if I understand correctly) that > Authors of libraries should make it as easy as possible to > (i) know what object types can be passed to functions > (ii) diagnose when the wrong type of object is passed > Authors of apps are not under such obligation, they can basically do > what they want. > > Well, > (i) can be done with good documentation (docstrings etc.). > Documentation is not checked. It often loses sync with the actual code. Docs say one thing, code does another. Documenting types in docstrings forces you to repeat yourself. You have to write the parameter names twice, once in the function definition line, another in the docstring. And you need to understand the syntax of whatever docstring format will be used to check the code. I'm sure I'm not the only one that thinks that type hints actually *improve readability*. I will not argue whether it makes the code uglier or prettier, but I am certain that most of the time the type hints make the code easier to read because you don't have to spend so much mental effort trying to figure out "gee, I wonder if this parameter is supposed to be bool, string, or int?" The type (or types) that a parameter is expected to have becomes explicit, otherwise you have to read the entire function body to understand. > (ii) can be done with appropriate runtime checks and good error > messages. > E.g. (Cory's example) I'm sure it is possible to test somehow if an object > is file-like (if only by trying to access it like a file). > Is thorough argument checking and provision of good diagnostics going to > be easy for the library author? No. 
Is it going to be a lot of work to do > thoroughly? Almost certainly yes. > But what the hell, writing a production-quality library is not an exercise > for newbies. > > It seems to me that type hints are attempting to be a silver bullet and to > capture in a simple formula what is often, in practice, *not simple at > all*, viz. "Is this passed object suitable?". Attempting - and failing, > except in the simplest cases. > Even if it only helps in the simplest cases, it saves the reader of the code a lot of effort if you add up all those little cases together. My suggestion, wherever an annotated type is too complex, don't annotate it. If you do /only/ the simple cases, the end result is easier to read. > > Apologies, Guido, but: > There was once a Chinese student who was a marvellous painter. He painted > a perfect life-like picture of a snake. > But he was unable to stop and leave it alone. In his zeal he went on to > paint feet on the snake. Which of course completely spoiled the picture, > as snakes don't have feet. > Hence "to paint feet on the snake": to be unable to resist tinkering with > something that is already good. (I suppose "If it ain't broke, don't fix > it" is an approximate Western equivalent.) > You see where I'm going with this - adding type hints to Python feels a > bit like painting feet on the snake. Or at least turning it into a > different language. > Oh, $DEITY, I really hope this is not going to be another excuse lazy programmers will use to avoid porting to Python 3 :-( > Best wishes > Rob Cliffe > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/gjcarneiro%40gmail.com > > -- Gustavo J. A. M. Carneiro Gambit Research "The universe is always one step beyond logic." -- Frank Herbert -------------- next part -------------- An HTML attachment was scrubbed... URL: From taleinat at gmail.com Tue Apr 21 13:50:25 2015 From: taleinat at gmail.com (Tal Einat) Date: Tue, 21 Apr 2015 14:50:25 +0300 Subject: [Python-Dev] Surely "nullable" is a reasonable name? In-Reply-To: <55336517.6090702@hastings.org> References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de> <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org> Message-ID: On Sun, Apr 19, 2015 at 11:19 AM, Larry Hastings wrote: > > > On 08/07/2014 09:41 PM, Larry Hastings wrote: > > Well! It's rare that the core dev community is so consistent in its > opinion. I still think "nullable" is totally appropriate, but I'll change > it to "allow_none". > > > (reviving eight-month-old thread) > > In case anybody here is still interested in arguing about this: the Clinic > API may be shifting a bit here. What follows is a quick refresher course on > Argument Clinic, followed by a discussion of the proposed new API. > > Here's an Argument Clinic declaration of a parameter: > s: str() > The parameter is called "s", and it's specifying a converter function called > "str" which handles converting string parameters. The str() converter > itself accepts parameters; since the parameters all have default values, > they're all optional. By default, str() maps directly to the "s" format > unit for PyArg_ParseTuple(), as it does here. > > Currently str() (and a couple other converter functions) accepts a parameter > called "types". 
"types" is specified as a string, and contains an unordered > set of whitespace-separated strings representing the Python types of the > values this (Clinic) parameter should accept. The default value of "types" > for str() is "str"; the following declaration is equivalent to the > declaration above: > s: str(types="str") > Other legal values for the "types" parameter for the str converter include > "bytes bytearray str" and "robuffer str". Internally the types parameter is > converted into a set of strings; passing it in as a string is a nicety for > the caller's benefit. (It also means that the strings "robuffer str" and > "str robuffer" are considered equivalent.) > > There's a second parameter, currently called "nullable", but I was supposed > to rename it "allow_none", so I'll use that name here. If you pass in > "allow_none=True" to a converter, it means "this (Clinic) parameter should > accept the Python value None". So, to map to the format unit "z", you would > specify: > s: str(allow_none=True) > > And to map to the format unit "z#", you would specify: > s: str(types="robuffer str", allow_none=True, length=True) > > > In hindsight this is all a bit silly. I propose what I think is a much > better API below. > > We should rename "types" to "accept". "accept" should takes a set of types; > these types specify the types of Python objects the Clinic parameter should > accept. For the funny pseudo-types needed in some Clinic declarations > ("buffer", "robuffer", and "rwbuffer"), Clinic provides empty class > declarations so these behave like types too. > > accept={str} is the default for the str() converter. If you want to map to > format unit "z", you would write this: > s: str(accept={str, NoneType}) > (In case you haven't seen it before: NoneType = type(None). I don't think > the name is registered anywhere officially in the standard library... but > that's the name.) > > The upside of this approach: > > Way, way more obvious to the casual reader. "types" was always meant as an > unordered collection of types, but I felt specifying it with strings was > unwieldy and made for poor reading ({'str', 'robuffer'}). Passing it in as > a single string which I internally split and put in a set() was a bad > compromise. But the semantics of this whitespace-delimited string were a > bit unclear, even to the experienced Clinic hacker. This set-of-types > version maps exactly to what the parameter was always meant to accept in the > first place. As with any other code, people will read Clinic declarations > far, far more often than they will write them, so optimizing for clarity is > paramount. > Zen: "There should be one (and preferably only one) obvious way to do it." > We have a way of specifying the types this parameter should accept; > "allow_none" adds a second. > Zen: "Special cases aren't special enough to break the rules". "allow_none" > was really just a special case of one possible type for "types". > > > The downside of this approach: > > You have to know what the default accept= set is for each converter. > Luckily this is not onerous; there are only four converters that need an > "accept" parameter, and their default values are all simple: > > int(accept={int}) > str(accept={str}) > Py_UNICODE(accept={str}) > Py_buffer(accept={buffer}) > > I suggest this is only a (minor) problem when writing a Clinic > declaration. It doesn't affect later readability, which is much more > important. > > It means repeating yourself a little. 
If you just want to say "I want to > accept None too", you have to redundantly specify the default type(s) > accepted by the converter function. In practice, it's really only redundant > for four or five format units, and they're not the frequently-used ones. > Right now I only see three uses of nullable for the built-in format units > (there are two more for my path_converter) and they're all for the str > converter. > > Yes, we could create a set containing the default types accepted by each > converter function, and just let the caller specify that and incrementally > add +{NoneType}. But this would be far longer than simply redundantly > respecifying the default (e.g. "accept=str_converter.accept_default + > {NoneType}"). > > Sometimes the best thing is just to bite the bullet and accept a little > redundancy. > > > Does "accept" sound good, including accepting "NoneType"? "Accept" sounds great to me. Allowing a parameter to be None by specifying NoneType is logical and so makes a certain amount of sense. It seems to me that this approach will make it very common to have parameter declarations of the form "int(accept={int, NoneType})" in the stdlib. I'm conflicted whether this is better or worse than "int(allow_none=True)". The former makes it clear that the parameter can accept more than a single type, necessitating additional processing to check what the actual type of the passed value is, which I like. So I'm +0.5 for this at the moment. As for the default set of accepted types for various convertors, if we could choose any syntax we liked, something like "accept=+{NoneType}" would be much better IMO. I'm definitely against repeating the default set of accepted types, since this would require very wide changes whenever the default set of types for a convertor is changed, as well as breaking compatibility for 3rd party libraries using AC. - Tal From chris at simplistix.co.uk Tue Apr 21 14:24:35 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Tue, 21 Apr 2015 13:24:35 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <55364183.4040606@simplistix.co.uk> On 20/04/2015 20:09, Paul Moore wrote: > On 20 April 2015 at 19:41, Barry Warsaw wrote: >>> tldr; type hints in python source are scary. Would reserving them for stub >>> files be better? >> I think so. I think PEP 8 should require stub files for stdlib modules and >> strongly encourage them for 3rd party code. > Agreed. I have many of the same concerns as Harry, but I wouldn't have > expressed them quite as well. I'm not too worried about actually > removing annotations from the core language, but I agree that we > should create a strong culture of "type hints go in stub files" to > keep source files readable and clean. > > On that note, I'm not sure "stub" files is a particularly good name. > Maybe "type files" would be better? Something that emphasises that > they are the correct place to put type hints, not a workaround. Or we could just be honest and admit that we're choosing to add header files to Python. It's a shame, as it's more complexity, and it's being inflicted on those who might be writing a library for the first time, or those becoming core committers for the first time, or those just trying to "do the right thing". Currently, the burden is a heavier one (type inference, rather than reading it from a file) but borne by people best placed to handle it (authors of IDEs, type checking software, etc). 
Chris From chris at simplistix.co.uk Tue Apr 21 14:25:34 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Tue, 21 Apr 2015 13:25:34 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: <553641BE.2000906@simplistix.co.uk> On 20/04/2015 19:30, Harry Percival wrote: > Hi all, > > tldr; type hints in python source are scary. Would reserving them for > stub files be better? I think Jack's summary of this is excellent and aligns well with where I think I'm coming from on this: https://mail.python.org/pipermail/python-dev/2015-April/139253.html Harry also makes just as many good points, so I'll reply here, with a note that while switching to (type/header/stub - my ordered preference for how to describe .pyi files) is preferable to type hints inside files, it's still a massive change to the language, not one I can say I've missed over the past 15 years, and one that, if asked to vote on it (I know that's not the case ;-)), I would choose to vote against. Anyway, I've not posted much to python-dev in quite a while, but this is a topic that I would be kicking myself in 5-10 years time when I've had to move to Javascript or because everyone else has drifted away from Python as it had become ugly... Chris From chris at simplistix.co.uk Tue Apr 21 14:28:24 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Tue, 21 Apr 2015 13:28:24 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <55362CCF.3020809@btinternet.com> Message-ID: <55364268.2000706@simplistix.co.uk> On 21/04/2015 12:23, Gustavo Carneiro wrote: > Well, > > (i) can be done with good documentation (docstrings etc.). > > > Documentation is not checked. It often loses sync with the actual > code. Docs say one thing, code does another. That's certainly something that could be fixed by formalising the documentation specification and having a checker check that, rather than changing the syntax of the language... Chris -------------- next part -------------- An HTML attachment was scrubbed... URL: From cory at lukasa.co.uk Tue Apr 21 14:47:11 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Tue, 21 Apr 2015 13:47:11 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <55362CCF.3020809@btinternet.com> Message-ID: On 21 April 2015 at 12:23, Gustavo Carneiro wrote: > Documentation is not checked. It often loses sync with the actual code. > Docs say one thing, code does another. Agreed. I don't think anyone would disagree here. I'm talking from the position of being a library author, where supporting versions of Python lower than 3.5 will be a reality for at least 5 more years. I will not be able to inline my type hints, so they'll have to go in stub files, and now we've got the same problem: type hints can go out of step just as easily as documentation can. > Documenting types in docstrings forces you to repeat yourself. You have to > write the parameter names twice, once in the function definition line, > another in the docstring. And you need to understand the syntax of whatever > docstring format will be used to check the code. Yes, but I'm repeating myself O(small integer) lines away from where I first wrote it. The maintenance burden of keeping those things in step is not very high. As for your last point, my code is not checked against my docstrings, primarily for the reasons I mentioned in my original email.
My docstrings are merely rendered into documentation. > I'm sure I'm not the only one that thinks that type hints actually *improve > readability*. In some cases, sure. No contest. > but I am certain that most of the time the type hints make the > code easier to read because you don't have to spend so much mental effort > trying to figure out "gee, I wonder if this parameter is supposed to be > bool, string, or int?" Yes. If you spend a lot of your time doing this, then type hints will help you, *assuming* the programmer who originally wrote the ambiguous code diligently maintains the type hints. I have no reason to believe that a programmer whose code is that ambiguous is going to do that, or even write type hints at all. The bigger problem is that, previously, Python functions never took bool, string, or int. They took 'object that can be evaluated in a boolean context', 'object that behaves like strings in some undefined way', and 'objects that behave like ints in some undefined way'. The type hint you provide is *not helpful*. What I *need* is a type hint that says 'object that has the .endswith() method' or 'object that can be safely cast to a string', so I can tell whether I can pass you a custom URL class that happens to render into a textual URL or whether it definitely has to be a string object. But you can't express that relationship in this model, so you will just say 'string', and now I will turn off type checking because the type hinter keeps shouting at me for using my URLBuilder object. If you only work with primitive types and only want your users to use primitive types, this proposal almost certainly works brilliantly. If you *don't*, this proposal begins to become extremely uncomfortable extremely fast. > Even if it only helps in the simplest cases, it saves the reader of the code > a lot of effort if you add up all those little cases together. > > My suggestion, wherever an annotated type is too complex, don't annotate it. > If you do /only/ the simple cases, the end result is easier to read. If that's genuinely the bar for success for this PEP, then fine, I wish it the best. I'd *like* a PEP that can deal with the complex cases too, but I freely admit to having no idea how such a thing may be implemented. All I can say is that I think this PEP will have limited uptake amongst many third-party libraries, particularly the complex ones. From steve at pearwood.info Tue Apr 21 14:47:23 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 22:47:23 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <55362CCF.3020809@btinternet.com> References: <55362CCF.3020809@btinternet.com> Message-ID: <20150421124720.GK5663@ando.pearwood.info> On Tue, Apr 21, 2015 at 11:56:15AM +0100, Rob Cliffe wrote: > (Adding a type hint that restricted the argument to say a > sequence of numbers turns out to be a mistake. Let's find out how big a mistake it is with a test run. py> def sorter(alist: List[int]) -> List[int]: ... return sorted(alist) ... py> data = (chr(i) + 'ay' for i in range(97, 107)) py> type(data) <class 'generator'> py> sorter(data) ['aay', 'bay', 'cay', 'day', 'eay', 'fay', 'gay', 'hay', 'iay', 'jay'] When we say that type checking is optional, we mean it. [Disclaimer: I had to fake the List object, since I don't have the typing module, but everything else is exactly as you see it.] Annotations will be available for type checking. If you don't want to type check, don't type check.
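The same demonstration as a self-contained, checkable file (a sketch, assuming Python 3.5+ where the real typing module exists):

    from typing import List

    def sorter(alist: List[int]) -> List[int]:
        return sorted(alist)

    # A generator of strings, not a list of ints: this runs fine,
    # but a static checker such as mypy would flag the call below.
    data = (chr(i) + 'ay' for i in range(97, 107))
    print(sorter(data))

Run directly, it prints the sorted list; only an out-of-band checker objects to the call.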
If you want to go against the type hints, you can go against the type hints, and get exactly the same runtime errors as you have now: py> sorter(None) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "<stdin>", line 2, in sorter TypeError: 'NoneType' object is not iterable There is no compile time checking unless you choose to run a type checker or linter -- and even if the type checker flags errors, you can ignore it and run the program regardless. Just like today with linters like PyFlakes, PyLint and similar. For those who choose not to run a type checker, the annotations will be nothing more than introspectable documentation. > And what is a number? > Is Fraction? What about complex numbers, which can't be > sorted? What if the function were written before the Decimal class?) I know that you are intending these as rhetorical questions, but Python has had a proper numeric tower since version 2.5 or 2.6. So: py> from numbers import Number py> from decimal import Decimal py> isinstance(Decimal("1.25"), Number) True py> isinstance(2+3j, Number) True > Errors are often not caught until run time that would be caught at > compile time in other languages (though static code checkers help). Yes they do help, which is exactly the point. > (Not much of a disadvantage because of Python's superb error > diagnostics.) That's certainly very optimistic of you. If I had to pick just one out of compile time type checking versus run time unit tests, I'd pick run time tests. But it is naive to deny the benefits of compile time checks in catching errors that you otherwise might not have found even with extensive unit tests (and let's face it, we never have enough unit tests). Ironically, type hinting will *reduce* the need for intrusive, anti-duck-testing explicit calls to isinstance() at runtime: def func(x: float): if isinstance(x, float): ... else: raise TypeError Why bother making that expensive isinstance call every single time the function is called, if the type checker can prove that x is always a float? > Python code typically says what it is doing, with the minimum of > syntactic guff. (Well, apart from colons after if/while/try etc. :-) ) > Which makes it easy to read. > Now it seems as if this proposal wants to start turning Python in the > C++ direction, encouraging adding ugly boilerplate code. (This may only > be tangentially relevant, but I want to scream when I see some > combination of public/private/protected/static/extern etc., most of > which I don't understand.) Perhaps if you understood it you would be less inclined to scream. > Chris A makes the valid point (if I understand correctly) that > Authors of libraries should make it as easy as possible to > (i) know what object types can be passed to functions > (ii) diagnose when the wrong type of object is passed > Authors of apps are not under such obligation, they can basically > do what they want. > > Well, > (i) can be done with good documentation (docstrings etc.). > (ii) can be done with appropriate runtime checks and good error > messages. How ironic. After singing the praises of duck-typing, now you are recommending runtime type checks. As far as good error messages go, they don't help you one bit when the application suddenly falls over in a totally unexpected place due to a bug in your code. I can't go into too many details due to commercial confidentiality, but we experienced something similar recently. A situation nobody foresaw, that wasn't guarded against, and wasn't tested for, came up after deployment.
There was a traceback, of course, but a failure in the field 200km away with a stressed customer and hundreds of angry users is not as useful as a compile-time failure during development. > You see where I'm going with this - adding type hints to Python feels a > bit like painting feet on the snake. Pythons are one of the few snakes which have vestigial legs: http://en.wikipedia.org/wiki/Pelvic_spur -- Steve From solipsis at pitrou.net Tue Apr 21 15:08:27 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 21 Apr 2015 15:08:27 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> Message-ID: <20150421150827.3c2a0af6@fsol> On Tue, 21 Apr 2015 22:47:23 +1000 Steven D'Aprano wrote: > > Ironically, type hinting will *reduce* the need for intrusive, > anti-duck-testing explicit calls to isinstance() at runtime: It won't, since as you pointed out yourself, type checks are purely optional and entirely separate from compilation and runtime evaluation. > Why bother making that expensive isinstance call every single time the > function is called, if the type checker can prove that x is always a > float? Because the user might not run the type checker, obviously. To quote you: """When we say that type checking is optional, we mean it.""" You can't at the same time point out that type checking has no power or control over runtime behaviour, and then claim that type checking makes runtime behaviour (for example, ability to accept or reject certain types) saner. It is a trivial contradiction. (and because of the former property, type hints can be entirely wrong - for example incomplete, or too strict, or too lax - without anyone noticing as long as they are self-consistent, since runtime behaviour is unaffected by them) Regards Antoine. From steve at pearwood.info Tue Apr 21 15:16:19 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 21 Apr 2015 23:16:19 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <553641BE.2000906@simplistix.co.uk> References: <553641BE.2000906@simplistix.co.uk> Message-ID: <20150421131618.GM5663@ando.pearwood.info> On Tue, Apr 21, 2015 at 01:25:34PM +0100, Chris Withers wrote: > Anyway, I've not posted much to python-dev in quite a while, but this is > a topic that I would be kicking myself in 5-10 years time when I've had > to move to Javascript or because everyone > else has drifted away from Python as it had become ugly... Facebook released Flow, a static typechecker for Javascript, to a very positive reaction. From their announcement: Flow's type checking is opt-in: you do not need to type check all your code at once. However, underlying the design of Flow is the assumption that most JavaScript code is implicitly statically typed; even though types may not appear anywhere in the code, they are in the developer's mind as a way to reason about the correctness of the code. Flow infers those types automatically wherever possible, which means that it can find type errors without needing any changes to the code at all. On the other hand, some JavaScript code, especially frameworks, makes heavy use of reflection that is often hard to reason about statically. For such inherently dynamic code, type checking would be too imprecise, so Flow provides a simple way to explicitly trust such code and move on.
This design is validated by our huge JavaScript codebase at Facebook: Most of our code falls in the implicitly statically typed category, where developers can check their code for type errors without having to explicitly annotate that code with types. Quoted here: http://blog.jooq.org/2014/12/11/the-inconvenient-truth-about-dynamic-vs-static-typing/ More about flow: http://flowtype.org/ Matz is interested in the same sort of gradual type checking for Ruby as Guido wants to add to Python: https://www.omniref.com/blog/blog/2014/11/17/matz-at-rubyconf-2014-will-ruby-3-dot-0-be-statically-typed/ Julia already includes this sort of hybrid dynamic+static type checking: http://julia.readthedocs.org/en/latest/manual/types/ I could keep going, but I hope I've made my point. Whatever language you are using in 5-10 years time, it will almost certainly be either mostly static with some dynamic features like Java, or dynamic with optional and gradual typing. -- Steven From p.f.moore at gmail.com Tue Apr 21 15:20:55 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 14:20:55 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421124720.GK5663@ando.pearwood.info> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> Message-ID: On 21 April 2015 at 13:47, Steven D'Aprano wrote: > On Tue, Apr 21, 2015 at 11:56:15AM +0100, Rob Cliffe wrote: > >> (Adding a type hint that restricted the argument to say a >> sequence of numbers turns out to be a mistake. > > Let's find out how big a mistake it is with an test run. > > py> def sorter(alist: List[int]) -> List[int]: > ... return sorted(alist) > ... > py> data = (chr(i) + 'ay' for i in range(97, 107)) > py> type(data) > > py> sorter(data) > ['aay', 'bay', 'cay', 'day', 'eay', 'fay', 'gay', 'hay', 'iay', 'jay'] > > > When we say that type checking is optional, we mean it. That's a very good point - although I have to say that I sort of feel like I'm cheating doing that. It doesn't matter how many times you say "optional", it does still feel like it's wrong, in basically the same way as using undocumented methods on a class, etc. Conceded, type checking is optional. But that message seems to be failing to get through. It's not a technical thing, it's a PR issue. Which is *very* close to meaning that the problem is FUD, but less in the sense of "people deliberately spreading FUD" as "people being worried, not understanding the proposal, and not being sure how type hints will work in practice". I'm reasonably willing to just accept that the PEP will be getting implemented, and to wait and see how things turn out. But my fear, uncertainty and doubt remain. I'll try not to spread them, though. >> You see where I'm going with this - adding type hints to Python feels a >> bit like painting feet on the snake. > > Pythons are one of the few snakes which have vestigal legs: > > http://en.wikipedia.org/wiki/Pelvic_spur LOL, instant win :-) Paul From solipsis at pitrou.net Tue Apr 21 15:42:28 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 21 Apr 2015 15:42:28 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction References: <553641BE.2000906@simplistix.co.uk> <20150421131618.GM5663@ando.pearwood.info> Message-ID: <20150421154228.4b38b7a5@fsol> On Tue, 21 Apr 2015 23:16:19 +1000 Steven D'Aprano wrote: > > I could keep going, but I hope I've made my point. I don't think so. 
Just because other languages are looking at it doesn't mean it will end up successful. It means it's an interesting idea, that's all. A litmus test for this PEP would be to apply it to a sizable code base and evaluate the results. Looking at isolated function examples doesn't tell us how it will scale (and I'm obviously not talking about performance, but the ability to still give useful diagnostics in the presence of typing "holes" - i.e. untyped functions - or imprecisions, when factored in a complex graph of call dependencies). > Whatever language you > are using in 5-10 years time, it will almost certainly be either mostly > static with some dynamic features like Java, or dynamic with optional > and gradual typing. Anybody can make predictions. As a data point, 6 years ago people were predicting that the average desktop CPU would have 16 cores nowadays... (we don't hear much from them anymore :-)) Regards Antoine. From lkb.teichmann at gmail.com Tue Apr 21 10:23:43 2015 From: lkb.teichmann at gmail.com (Martin Teichmann) Date: Tue, 21 Apr 2015 10:23:43 +0200 Subject: [Python-Dev] async/await PEP In-Reply-To: <55315B32.5000205@gmail.com> References: <55315B32.5000205@gmail.com> Message-ID: Hi Yury, Hi List, I certainly like the idea of PEP 492, just some small comments: why do we need two keywords? To me it is not necessarily intuitive when to use async and when to use await (why is it async for and not await for?), so wouldn't it be much simpler, and more symmetric, to just have one keyword? I personally prefer await for that, then it is "await def", and "await for" (and "await with", etc.). Greetings Martin
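For readers coming to PEP 492 fresh, here is a minimal sketch of how the two keywords divide the work in practice (asyncio.run arrived later, in Python 3.7, so this is not literally 2015-era code):

    import asyncio

    async def fetch(delay: float) -> str:    # 'async def' marks a coroutine
        await asyncio.sleep(delay)           # 'await' suspends inside it
        return "done"

    async def main() -> None:
        print(await fetch(0.1))

    asyncio.run(main())

'async for' and 'async with' likewise mark the asynchronous variants of iteration and context management inside such coroutines.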
From arnodel at gmail.com Tue Apr 21 11:56:07 2015 From: arnodel at gmail.com (Arnaud Delobelle) Date: Tue, 21 Apr 2015 09:56:07 +0000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, 21 Apr 2015 at 09:59 Cory Benfield wrote: [...] > Further, Python's type system is not sufficiently flexible to allow > library authors to adequately specify the types their code actually > works on. I need to be able to talk about interfaces, because > interfaces are the contract around which APIs are built in Python, but > I cannot do that with this system in a way that makes any sense at > all. To even begin to be useful for library authors this PEP would > need to allow some kind of type hierarchy that is *not* defined by > inheritance, but instead by interfaces. We've been discouraging use of > 'type' and 'isinstance' for years because they break duck typing, but > that has *nothing* on what this PEP appears to do to duck typing. > > This is why I feel like this PEP may be a real threat to duck typing.
URL: From harry.percival at gmail.com Tue Apr 21 13:16:05 2015 From: harry.percival at gmail.com (Harry Percival) Date: Tue, 21 Apr 2015 12:16:05 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <55362CCF.3020809@btinternet.com> References: <20150420144106.63513828@anarchist.wooz.org> <55362CCF.3020809@btinternet.com> Message-ID: Hey, I just wanted to say to everyone, thanks for being so patient and willing to engage with this discussion, despite my not having done my research and read the (substantial) prior discussion on the topic. Here it is (or at least, some of it!) for any other newcomers: https://mail.python.org/pipermail/python-list/2015-January/697202.html https://mail.python.org/pipermail/python-list/2015-January/697315.html https://github.com/ambv/typehinting/issues/55 I'm not sure any of it will radically change anyone's mind. Everyone (I think) agrees that type hints carry some cost in terms of legibility and simplicity of the language. Everyone agrees they have some benefits. The disagreement remains over whether it will be worth it, and whether stub files vs inline vs something-else would be the best place to hide them. Is "let's wait and see" an acceptable position? Probably the only real way to resolve the disagreement is to start seeing the type hints in use, and form opinions based on real-world practice. Some people will like them inline. Lots of people will use stub files, since they're the only way to maintain 2+3 compatibility, so it's my home that the community will standardise around that. Still other people will want to have another crack at using docstrings, but the standardisation on notation of types will still help them. And still other others will want no part in them, and will argue against them in their communities. Hopefully a best practice will evolve. And hopefully it'll be the stub file one, and then we really can deprecate function annotations in 3.6 ;-) On 21 April 2015 at 11:56, Rob Cliffe wrote: > On 21/04/2015 10:33, Cory Benfield wrote: > > On 21 April 2015 at 10:10, Chris Angelico wrote: > > At this point, you may want to just stop caring about the exact type. > Part of the point of gradual typing is that you can short-cut a lot of > this. And quite frankly, this isn't really helping anything. Just skip > it and say that it's Union[Mapping, Iterable, None]. > > I think this is my problem with the proposal. This change doesn't > catch any of the bugs anyone was going to hit (no-one is passing a > callable to this parameter), and it doesn't catch any of the bugs > someone might actually hit (getting the tuple elements in the wrong > order, for example). At this point the type signature is worse than > useless. > > Exactly. At this point putting the type signatures in seems to be a > pointless exercise in masochism (for the author) and sadism (for the > reader). > > The refreshing thing about Python is that it is a fantastic, *concise*, > dynamically typed language, at the opposite end of the spectrum from C++ > (ugh!). > We have got used to the consequences (good and bad) of this: > Duck typing, e.g. a function to sort numbers (sorry Tim Peters bad > example) turns out to support any kind of object (e.g. strings) that > supports comparison. > Not to mention objects of some class that will be written in 5 > years time. > (Adding a type hint that restricted the argument to say a sequence > of numbers turns out to be a mistake. And what is a number? > Is Fraction? 
What about complex numbers, which can't be sorted? > What if the function were written before the Decimal class?) > Errors are often not caught until run time that would be caught at > compile time in other languages (though static code checkers help). > (Not much of a disadvantage because of Python's superb error > diagnostics.) > Python code typically says what it is doing, with the minimum of > syntactic guff. (Well, apart from colons after if/while/try etc. :-) ) > Which makes it easy to read. > Now it seems as if this proposal wants to start turning Python in the C++ > direction, encouraging adding ugly boilerplate code. (This may only be > tangentially relevant, but I want to scream when I see some combination of > public/private/protected/static/extern etc., most of which I don't > understand.) > > Chris A makes the valid point (if I understand correctly) that > Authors of libraries should make it as easy as possible to > (i) know what object types can be passed to functions > (ii) diagnose when the wrong type of object is passed > Authors of apps are not under such obligation, they can basically do > what they want. > > Well, > (i) can be done with good documentation (docstrings etc.). > (ii) can be done with appropriate runtime checks and good error > messages. > E.g. (Cory's example) I'm sure it is possible to test somehow if an object > is file-like (if only by trying to access it like a file). > Is thorough argument checking and provision of good diagnostics going to > be easy for the library author? No. Is it going to be a lot of work to do > thoroughly? Almost certainly yes. > But what the hell, writing a production-quality library is not an exercise > for newbies. > > It seems to me that type hints are attempting to be a silver bullet and to > capture in a simple formula what is often, in practice, *not simple at > all*, viz. "Is this passed object suitable?". Attempting - and failing, > except in the simplest cases. > > Apologies, Guido, but: > There was once a Chinese student who was a marvellous painter. He painted > a perfect life-like picture of a snake. > But he was unable to stop and leave it alone. In his zeal he went on to > paint feet on the snake. Which of course completely spoiled the picture, > as snakes don't have feet. > Hence "to paint feet on the snake": to be unable to resist tinkering with > something that is already good. (I suppose "If it ain't broke, don't fix > it" is an approximate Western equivalent.) > You see where I'm going with this - adding type hints to Python feels a > bit like painting feet on the snake. Or at least turning it into a > different language. > > Best wishes > Rob Cliffe > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net > > -- ------------------------------ Harry J.W. Percival ------------------------------ Twitter: @hjwp Mobile: +44 (0) 78877 02511 Skype: harry.percival -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pmiscml at gmail.com Tue Apr 21 13:38:39 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 21 Apr 2015 14:38:39 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <55362CCF.3020809@btinternet.com> References: <55362CCF.3020809@btinternet.com> Message-ID: <20150421143839.2cc950fa@x230> Hello, On Tue, 21 Apr 2015 11:56:15 +0100 Rob Cliffe wrote: > On 21/04/2015 10:33, Cory Benfield wrote: > > On 21 April 2015 at 10:10, Chris Angelico wrote: > >> At this point, you may want to just stop caring about the exact > >> type. Part of the point of gradual typing is that you can > >> short-cut a lot of this. And quite frankly, this isn't really > >> helping anything. Just skip it and say that it's Union[Mapping, > >> Iterable, None]. > > I think this is my problem with the proposal. This change doesn't > > catch any of the bugs anyone was going to hit (no-one is passing a > > callable to this parameter), and it doesn't catch any of the bugs > > someone might actually hit (getting the tuple elements in the wrong > > order, for example). At this point the type signature is worse than > > useless. > Exactly. At this point putting the type signatures in seems to be a > pointless exercise in masochism (for the author) and sadism (for the > reader). Or an interesting and productive exercise, which will lead to further improvements and benefits - for other kinds of authors and readers. Those kinds have left your kind alone by letting you keep not writing annotations. Why do you feel so embarrassed about letting the other kind progress? Python is becoming a multi-paradigm language; what's so bad about that? Are you afraid that you will have discomfort reading others' code? My god, I love Python, and hate 50% of the code written in it; most of it needs to be rewritten to be satisfactory! Why does it bother you that there will be direction on how other people may (re)write it? > > The refreshing thing about Python is that it is a fantastic, > *concise*, dynamically typed language, at the opposite end of the > spectrum from C++ (ugh!). C++ is a fantastic, concise, statically typed language. You've just got used to seeing 90% of the code in it that needs to be rewritten. [] > Apologies, Guido, but: [] > You see where I'm going with this - adding type hints to Python feels > a bit like painting feet on the snake. Or at least turning it into a > different language. I'm sure type annotations could never have been discussed for inclusion if Guido didn't feel that they fit with Python's way. Sometimes it makes sense to trust the BDFL. It's a much less obtrusive and questionable change than forcing Unicode by default, making print a function, or keeping the 3-headed sys.exc_info() ugliness instead of throwing it out of Python 3. Because, well, type annotations *are optional*. And it's by now clear that there will be a --strip-type-annotations flag to pip and stickers "This system runs type annotations free".
> > Best wishes > Rob Cliffe -- Best regards, Paul mailto:pmiscml at gmail.com From barry at python.org Tue Apr 21 15:48:59 2015 From: barry at python.org (Barry Warsaw) Date: Tue, 21 Apr 2015 09:48:59 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421033421.GJ5663@ando.pearwood.info> References: <20150420144106.63513828@anarchist.wooz.org> <20150421033421.GJ5663@ando.pearwood.info> Message-ID: <20150421094859.3109b5e0@limelight.wooz.org> On Apr 21, 2015, at 01:34 PM, Steven D'Aprano wrote: >Putting the type information in a stub file is an exponentially more distant >fourth best, or to put it another way, *the worst* solution for where to put >type hints. Not only do you Repeat Yourself with the name of the parameter, >but also the name of the function (or method and class) AND module. The type >information *isn't even in the same file*, which increases the chance of it >being lost, forgotten, deleted, out of date, unmaintained, etc. All true, but the trade-off is the agility and ease of working on, reading, and understanding the stdlib, all of which IMHO will suffer if type hints are inlined there. What I don't want to have happen is for type hints to slowly infiltrate the stdlib to the point where no patch will be accepted unless it also has hints. I have the same gut reaction to this as RDM expressed a few posts back. One of the things I love most about Python is its dynamic typing. I'm all for giving linter developers a hook for experimenting with their tools, I just don't care and I don't want to *have* to care. Maybe some day they will make it so compelling that I will care, but I want to be convinced first. So I think stub files in the stdlib are the right compromise today. Cheers, -Barry From steve at pearwood.info Tue Apr 21 16:06:54 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 22 Apr 2015 00:06:54 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421150827.3c2a0af6@fsol> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> Message-ID: <20150421140653.GN5663@ando.pearwood.info> On Tue, Apr 21, 2015 at 03:08:27PM +0200, Antoine Pitrou wrote: > On Tue, 21 Apr 2015 22:47:23 +1000 > Steven D'Aprano wrote: > > > > Ironically, type hinting will *reduce* the need for intrusive, > > anti-duck-testing explicit calls to isinstance() at runtime: > > It won't, since as you pointed out yourself, type checks are purely > optional and entirely separate from compilation and runtime evaluation. Perhaps you are thinking of libraries, where the library function has to deal with whatever junk people throw at it. To such libraries, I believe that the major benefit of type hints is not so much in proving the library's correctness in the face of random arguments, but as documentation. In any case, of course you are correct that public library functions and methods will continue to need to check their arguments. (Private functions, perhaps not.) But for applications, the situation is different. If my application talks to a database and extracts a string which it passes on to its own function spam(), then it will be a string. Not a string-like object. Not something that quacks like a string. A string. Once the type checker is satisfied that spam() always receives a string, then further isinstance checks inside spam() are a waste of time.
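A sketch of that situation (the function names here are invented for illustration):

    def fetch_name(row_id: int) -> str:
        # Stand-in for the database lookup; annotated as returning str.
        return "row-%d" % row_id

    def spam(s: str) -> None:
        # No defensive isinstance(s, str) check needed here: a checker
        # that has verified every caller makes it redundant.
        print(s.upper())

    spam(fetch_name(1))  # the checker confirms a str flows into spam()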
If spam()'s caller changes and might return something which is not a string, then the type checker will flag that. Obviously to get this benefit you need to actually use a type checker. I didn't think I needed to mention that. -- Steve From antoine at python.org Tue Apr 21 16:11:51 2015 From: antoine at python.org (Antoine Pitrou) Date: Tue, 21 Apr 2015 16:11:51 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421165042.34f935ef@x230> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> Message-ID: <55365AA7.3060500@python.org> On 21/04/2015 15:50, Paul Sokolovsky wrote: > Hello, > > On Tue, 21 Apr 2015 15:08:27 +0200 > Antoine Pitrou wrote: > > [] > >> Because the user might not run the type checker, obviously. To quote >> you: """When we say that type checking is optional, we mean it.""" >> >> You can't at the same time point out that type checking has no >> power or control over runtime behaviour, and then claim that type >> checking makes runtime behaviour (for example, ability to accept or >> reject certain types) saner. It is a trivial contradiction. > > I suspected there's either short-sightedness, or just a word play > for the purpose of word play. > > Type annotations are NOT introduced for the purpose of static type > checking. I was replying to Steven's message. Did you read it? > Runtime type checking (run as via "make test", not > as "production") is much more interesting to catch errors. Obviously you're giving the word "runtime" a different meaning than I do. The type checker isn't supposed to actually execute the user's functions (it's not that it's forbidden, simply that it's not how it will work in all likelihood): therefore, it doesn't have any handle on what *actually* happens at runtime, vs. what is declared in the typing declarations. But if by "runtime" you mean any action that happens *out-of-band* during development (such as running pip or a syntax-checking linter), then sure, fine :-) > Even more interesting usage is to allow ahead-of-time, and thus > unbloated, optimization. There are a bunch of JITters and AOTters for > the Python language, each of which uses its own syntax (via decorators, > etc.) to annotate functions. As a developer of one of those tools, I've already said that I find it unlikely for the PEP to be useful for that purpose. The issue is that the vocabulary created in the PEP is not extensible enough. Note I'm not saying it's impossible. I'm just skeptical that in its current form it will help us. And apparently none of our "competitors" seems very enthusiastic either (feel free to prove me wrong: I might have missed something :-)). > Having language-specified type annotations > allows for portable syntax for such optimized code. Only if the annotations allow expressing the subtleties required by the specific optimizer. For example, "Float" is too vague for Numba: we would like to know if that is meant to be a single- or double-precision float. > Don't block the language if you're stuck > with an unimaginative implementation, there's much more to Python than > that. The Python language doesn't really have anything to do with that. It's just an additional library with a set of conventions. Which is also why a PEP wouldn't be required to make it alive, it's just there to make it an official standard. Regards Antoine.
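To make Antoine's precision point concrete, here is a hedged sketch contrasting the two vocabularies; it assumes Numba is installed and uses its documented string-signature form of @jit:

    from numba import jit

    # PEP 484 vocabulary: just "float", no single/double distinction.
    def scale(x: float) -> float:
        return x * 0.5

    # Numba's own vocabulary can pin the width to 32 bits.
    @jit("float32(float32)")
    def scale32(x):
        return x * 0.5

    print(scale(1.0), scale32(1.0))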
From rosuav at gmail.com Tue Apr 21 16:31:48 2015 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 22 Apr 2015 00:31:48 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 7:56 PM, Arnaud Delobelle wrote: > If people constantly get told by their editor / IDE that they are calling a > function with the wrong argument types, what are they going to do? They may > start adopting the same approach as in Java / C++ etc... where interfaces > must be explicitly defined and the practice of duck typing may become > forgotten because discouraged by the tools programmers use. Style guides should generally be recommending the use of concrete types for return values, but abstract types for parameters - you might well have a function that returns a list, but most functions that accept lists are really accepting sequences. Granted, there are some vague areas - how many functions take a "file-like object", and are they all the same? - but between MyPy types and the abstract base types that already exist, there are plenty of ways to formalize duck typing. And frankly, even with the uncertainties, I'd still rather have a function declared as taking a "file-like object" than "an object with a .read() method that takes an integer and returns up to that many bytes of data, and a .seek() method that blah blah blah blah". Sometimes, the precision is completely useless. What an editor would do with the hint that a function takes a file-like object I'm not sure, but if it issues a complaint because I'm passing it something that doesn't have a writelines() method, it's the fault of the editor, not the type hint. ChrisA From pmiscml at gmail.com Tue Apr 21 15:50:42 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 21 Apr 2015 16:50:42 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421150827.3c2a0af6@fsol> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> Message-ID: <20150421165042.34f935ef@x230> Hello, On Tue, 21 Apr 2015 15:08:27 +0200 Antoine Pitrou wrote: [] > Because the user might not run the type checker, obviously. To quote > you: """When we say that type checking is optional, we mean it.""" > > You can't at the same time point out that type checking has no > power or control over runtime behaviour, and then claim that type > checking makes runtime behaviour (for example, ability to accept or > reject certain types) saner. It is a trivial contradiction. I suspected there's either short-sightedness, or just a word play for the purpose of word play. Type annotations are NOT introduced for the purpose of static type checking. The current PEP just gives static type checking as an example usage, backed by the immediate availability of a tool which does that, MyPy. But granted, static type checking is the most boring of possible usages for type annotations. Runtime type checking (run as via "make test", not as "production") is much more interesting to catch errors. The tooling for that is yet to be written though. An even more interesting usage is to allow ahead-of-time, and thus unbloated, optimization. There are a bunch of JITters and AOTters for the Python language, each of which uses its own syntax (via decorators, etc.) to annotate functions. Having language-specified type annotations allows for portable syntax for such optimized code.
Granted, the most juicy usages described above won't be available in CPython out of the box. But there should be that motto: "CPython is the most boring of all Pythons". Don't block the language if you're stuck with an unimaginative implementation, there's much more to Python than that. -- Best regards, Paul mailto:pmiscml at gmail.com From tritium-list at sdamon.com Tue Apr 21 16:20:09 2015 From: tritium-list at sdamon.com (Alexander Walters) Date: Tue, 21 Apr 2015 10:20:09 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421030048.GH5663@ando.pearwood.info> References: <20150420144106.63513828@anarchist.wooz.org> <20150421030048.GH5663@ando.pearwood.info> Message-ID: <55365C99.4070106@sdamon.com> So. This is how you try and get me to care about Python 3. Can't speak for others, but this does the opposite for me. This makes me ecstatic that Python 2 has a nearly-frozen api. From cory at lukasa.co.uk Tue Apr 21 16:51:05 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Tue, 21 Apr 2015 15:51:05 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 21 April 2015 at 15:31, Chris Angelico wrote: > Granted, there are some > vague areas - how many functions take a "file-like object", and are > they all the same? - but between MyPy types and the abstract base > types that already exist, there are plenty of ways to formalize duck > typing. Are there? Can I have a link or an example, please? I feel like I don't know how I'm supposed to do this, and I'd like to see how that works. I'll even give a concrete use-case: I want to be able to take a file-like object that has a .read() method and a .seek() method. > And frankly, even with the uncertainties, I'd still rather > have a function declared as taking a "file-like object" than "an > object with a .read() method that takes an integer and returns up to > that many bytes of data, and a .seek() method that blah blah blah > blah". Sometimes, the precision is completely useless. It is completely useless. Happily, this is a strawman, and no-one was asking for it, so we can all live happily ever after! The correct specification is "read method with this type signature" and "seek method with this type signature". I would even be prepared to waive the type signatures on read and seek, given that enforcing the type hinting on others is not a good idea. I suspect I have a mismatch with several others in this discussion. My position is that if I'm going to have a type system, I'd like to have a powerful one: Haskell, not Java. Otherwise I'll get by with the duck typing that has worked just fine for us over the last 20 years. I suspect, however, that many others in this conversation want any type system at all, so long as they can have one. Is that an accurate characterisation of your position, Chris? 
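Cory's concrete use-case - accept anything with a .read() and a .seek() method - is the canonical structural-typing example. For later readers, a hedged sketch of the shape such a declaration eventually took: typing.Protocol (PEP 544, Python 3.8), which did not exist at the time of this thread; the method signatures are illustrative assumptions:

    # Requires Python 3.8+; not available to the 2015 discussants.
    from typing import Protocol

    class SeekableReadable(Protocol):
        def read(self, size: int = -1) -> bytes: ...
        def seek(self, offset: int, whence: int = 0) -> int: ...

    def read_prefix(f: SeekableReadable, size: int) -> bytes:
        # Any object whose read()/seek() match structurally is accepted;
        # no inheritance or registration is required.
        f.seek(0)
        return f.read(size)

A static checker matches f's methods by shape rather than by class, which is exactly the formalized duck typing being asked for here.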
From Nikolaus at rath.org Tue Apr 21 17:05:59 2015 From: Nikolaus at rath.org (Nikolaus Rath) Date: Tue, 21 Apr 2015 08:05:59 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: (Chris Angelico's message of "Tue, 21 Apr 2015 11:07:02 +1000") References: <20150420144106.63513828@anarchist.wooz.org> <85sibumhej.fsf@benfinney.id.au> Message-ID: <87618p5xns.fsf@thinkpad.rath.org> On Apr 20 2015, Chris Angelico wrote: > Maybe it'd be of value to have a quick "code stripper" that takes away > all the annotations, plus any other junk/framing that you're not > interested in, and gives you something you can browse in a text > editor? If you need to preprocess your source code to make it suitable for human consumption something is very wrong with your language design. I can't believe you're seriously suggesting this. Best, -Nikolaus -- GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F ?Time flies like an arrow, fruit flies like a Banana.? From rosuav at gmail.com Tue Apr 21 17:09:52 2015 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 22 Apr 2015 01:09:52 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Wed, Apr 22, 2015 at 12:51 AM, Cory Benfield wrote: > On 21 April 2015 at 15:31, Chris Angelico wrote: >> Granted, there are some >> vague areas - how many functions take a "file-like object", and are >> they all the same? - but between MyPy types and the abstract base >> types that already exist, there are plenty of ways to formalize duck >> typing. > > Are there? Can I have a link or an example, please? I feel like I > don't know how I'm supposed to do this, and I'd like to see how that > works. I'll even give a concrete use-case: I want to be able to take a > file-like object that has a .read() method and a .seek() method. Someone who's been more involved in the details of MyPy can probably say more specifically what ought to be done about file-like objects. >> And frankly, even with the uncertainties, I'd still rather >> have a function declared as taking a "file-like object" than "an >> object with a .read() method that takes an integer and returns up to >> that many bytes of data, and a .seek() method that blah blah blah >> blah". Sometimes, the precision is completely useless. > > It is completely useless. Happily, this is a strawman, and no-one was > asking for it, so we can all live happily ever after! > > The correct specification is "read method with this type signature" > and "seek method with this type signature". I would even be prepared > to waive the type signatures on read and seek, given that enforcing > the type hinting on others is not a good idea. > > I suspect I have a mismatch with several others in this discussion. My > position is that if I'm going to have a type system, I'd like to have > a powerful one: Haskell, not Java. Otherwise I'll get by with the duck > typing that has worked just fine for us over the last 20 years. I > suspect, however, that many others in this conversation want any type > system at all, so long as they can have one. > > Is that an accurate characterisation of your position, Chris? Pretty accurate, yeah. 
Here's how I see it:

    def incremental_parser(input: FileLike) -> List[Token]:
        tokens = []
        data = ""
        while True:
            if not data:
                data = input.read(64)
            token = Token(data[0]); data = data[1:]
            while token.wants_more():
                token.give_more(data[0]); data = data[1:]
            tokens.append(token)
            if token.is_end_of_stream(): break
        input.seek(-len(data), 1)
        return tokens

If you were to exhaustively stipulate the requirements on the file-like object, you'd have to say:

* Provides a read() method which takes an integer and returns up to that many bytes
* Provides a seek() method which takes two integers
* Is capable of relative seeking by at least 63 bytes backward
* Is open for reading
* Etcetera

That's not the type system's job. Not in Python. Maybe in Haskell, but not in Python. So how much _should_ go into the type hint? I'm happy with "FileLike" or however it's to be spelled; maybe separate readable files from writable ones, as those are two fairly clear variants, but that's about all you really need. If you provide incremental_parser() with an input file that's non-seekable, it's going to have problems - and your editor may or may not even be able to detect that (some files are seekable but only in the forward direction, but they'll have the exact same seek() method).

I might be wrong about where MyPy is trying to go with this, but doubtless someone will correct me on any inaccuracies above.

ChrisA

From pmiscml at gmail.com Tue Apr 21 17:27:50 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Tue, 21 Apr 2015 18:27:50 +0300
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <55365AA7.3060500@python.org>
References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org>
Message-ID: <20150421182750.3efcb597@x230>

Hello,

On Tue, 21 Apr 2015 16:11:51 +0200 Antoine Pitrou wrote:

[]
> >> You can't at the same time point out that type checking has no power or control over runtime behaviour, and then claim that type checking makes runtime behaviour (for example, ability to accept or reject certain types) saner. It is a trivial contradiction.
> >
> > I suspected there's either short-sightedness, or just a word play for for a purpose of word play.
> >
> > Type annotation are NOT introduced for the purpose of static type checking.
>
> I was replying to Steven's message. Did you read it?

Yes. And I try to follow general course of discussion, as its hard to follow individual sub-threads. And for example yesterday's big theme was people blackmailing that they stop contributing to stdlib if annotations are in, and today's theme appear to be people telling that static type checking won't be useful. And just your reply to Steven was a final straw which prompted me to point out that static type checking is not a crux of it, but just one (not the biggest IMHO) usage.

> > Runtime type checking (run as via "make test", not as "production") is much more interesting to catch errors.
>
> Obviously you're giving the word "runtime" a different meaning than I do. The type checker isn't supposed to actually execute the user's functions (it's not that it's forbidden, simply that it's not how it will work in all likelihood): therefore, it doesn't have any handle on what *actually* happens at runtime, vs. what is declared in the typing declarations.

Well, maybe "typechecker" is the wrong word then.
I'm talking about instrumented VM which actually interprets type annotation while running bytecode - for example, on entry to a function, it takes argument annotations, and executes sequence of equivalent isinstance() checks, etc., etc. One won't use such instrumented VM at "production" runtime, but it will be useful to enable while running testsuite (or integration tests), to catch more issues. > > Even more interesting usage is to allow ahead-of-time, and thus > > unbloated, optimization. There're bunch of JITters and AOTters for > > Python language, each of which uses own syntax (via decorators, > > etc.) to annotate functions. > > As a developer of one of those tools, I've already said that I find it > unlikely for the PEP to be useful for that purpose. The issue is that > the vocabulary created in the PEP is not extensible enough. How so, if it essentially allows you make typedefs? Syntax of such typedef follow basic conventions (so *any* tool should understand its *basic* semantics), but you're free to assign (implicit, particular for your tool, but of course documented for it) semantics. > Note I'm not saying it's impossible. I'm just skeptical that in its > current form it will help us. And apparently none of our "competitors" That's my biggest fear - that "JIT" Python community is not yet ready to receive and value this feature. > seems very enthusiastic either (feel free to prove me wrong: I might > have missed something :-)). Let me try: MicroPython already uses type annotations for statically typed functions. E.g. def add(x:int, y:int): return x + y will translate the function to just 2 machine instructions. And we'd prefer to use standard language syntax, instead of having our own conventions (e.g. above you see that return type "is inferred"). > > Having language-specified type annotations > > allows for portable syntax for such optimized code. > > Only if the annotations allow expressing the subtleties required by > the specific optimizer. For example, "Float" is too vague for Numba: > we would like to know if that is meant to be a single- or > double-precision float. Oh really, you care to support single-precisions in Numba? We have a lot of problems with sfloats in MicroPython, because whole Python stdlib API was written with implicit assumption that float is (at least) double. E.g., we cannot have time.time() which both returns fractional seconds and is based of Jan 1, 1970 - there's simply not enough bits in single-precision floats! It's on my backlog to bring up this and related issues before wider Python development community. Anyway, back to your example, it would be done like: SFloat = float DFloat = float For a random tool out there, "SFloat" and "DFloat" would be just aliases to floats, but Numba will know they have additional semantics behind them. (That assumes that typedefs like SFloat can be accessed in symbolic form - that's certainly possible if you have your own parser/VM, but might worth to think how to do it on "CPython" level). > > Don't block the language if you're stuck > > with an unimaginative implementation, there's much more to Python > > than that. > > The Python language doesn't really have anything to do with that. It's > just an additional library with a set of conventions. Which is also > why a PEP wouldn't be required to make it alive, it's just there to > make it an official standard. Yes, and IMHO "JIT/compiler" Python community is too fragmented and to unaware (ignorant?) 
of other projects' efforts, and having "official standard" would help both "JIT" community and to establish Python as a language with support for efficient compile-time optimizations. > > Regards > > Antoine. -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Tue Apr 21 17:39:21 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 21 Apr 2015 18:39:21 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <87618p5xns.fsf@thinkpad.rath.org> References: <20150420144106.63513828@anarchist.wooz.org> <85sibumhej.fsf@benfinney.id.au> <87618p5xns.fsf@thinkpad.rath.org> Message-ID: <20150421183921.534a37df@x230> Hello, On Tue, 21 Apr 2015 08:05:59 -0700 Nikolaus Rath wrote: > On Apr 20 2015, Chris Angelico wrote: > > Maybe it'd be of value to have a quick "code stripper" that takes > > away all the annotations, plus any other junk/framing that you're > > not interested in, and gives you something you can browse in a text > > editor? > > If you need to preprocess your source code to make it suitable for > human consumption something is very wrong with your language design. > I can't believe you're seriously suggesting this. I'm sure that was irony, d'oh. Just as my suggestion to have stickers "This system runs free of type annotations" for the most zealous anti-annotations folks. The proposed type annotations are very readable. Not as readable as C's type syntax, but ok. Sorry, irony again. I remember myself as a 10-years old kid, trying to wrap my head about C's types syntax, and not getting anything of it. So, please contrast it to C - PEP484 gives very readable syntax, and even both human- and machine-readable. If you feel like writing 3-story type annotation, just resist the temptation, "Any" is your best friend. -- Best regards, Paul mailto:pmiscml at gmail.com From cory at lukasa.co.uk Tue Apr 21 17:43:12 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Tue, 21 Apr 2015 16:43:12 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 21 April 2015 at 16:09, Chris Angelico wrote: > Pretty accurate, yeah. Here's how I see it: > > def incremental_parser(input: FileLike) -> List[Token]: > tokens = [] > data = "" > while True: > if not data: > data = input.read(64) > token = Token(data[0]); data = data[1:] > while token.wants_more(): > token.give_more(data[0]); data = data[1:] > tokens.append(token) > if token.is_end_of_stream(): break > input.seek(-len(data), 1) > return tokens > > If you were to exhaustively stipulate the requirements on the > file-like object, you'd have to say: > > * Provides a read() method which takes an integer and returns up to > that many bytes > * Provides a seek() method which takes two integers > * Is capable of relative seeking by at least 63 bytes backward > * Is open for reading > * Etcetera > > That's not the type system's job. Not in Python. Maybe in Haskell, but > not in Python. So how much _should_ go into the type hint? I'm happy > with "FileLike" or however it's to be spelled; maybe separate readable > files from writable ones, as those are two fairly clear variants, but > that's about all you really need. If you provide incremental_parser() > with an input file that's non-seekable, it's going to have problems - > and your editor may or may not even be able to detect that (some files > are seekable but only in the forward direction, but they'll have the > exact same seek() method). 
Ok, that makes sense to me. =) Looking at mypy's source code, I see shutil.copyfileobj has the following signature: def copyfileobj(fsrc: IO[AnyStr], fdst: IO[AnyStr], length: int = 16*1024) -> None: so it seems that mypy defines a general IO datatype here. I continue to be worried about the lack of a *general* solution to this problem, but I'm glad that some specific solutions exist. I still think library authors will not do much to take up this proposal. =) From jbaker at zyasoft.com Tue Apr 21 17:59:00 2015 From: jbaker at zyasoft.com (Jim Baker) Date: Tue, 21 Apr 2015 09:59:00 -0600 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 9:09 AM, Chris Angelico wrote: > ... > > Pretty accurate, yeah. Here's how I see it: > > def incremental_parser(input: FileLike) -> List[Token]: > tokens = [] > data = "" > while True: > if not data: > data = input.read(64) > token = Token(data[0]); data = data[1:] > while token.wants_more(): > token.give_more(data[0]); data = data[1:] > tokens.append(token) > if token.is_end_of_stream(): break > input.seek(-len(data), 1) > return tokens > > If you were to exhaustively stipulate the requirements on the > file-like object, you'd have to say: > > * Provides a read() method which takes an integer and returns up to > that many bytes > * Provides a seek() method which takes two integers > * Is capable of relative seeking by at least 63 bytes backward > * Is open for reading > * Etcetera > Potentially you could use io.RawIOBase as the ABC for the type you need for FileLike, including read and seek. See the mixins in https://docs.python.org/3/library/io.html#class-hierarchy > > > That's not the type system's job. Not in Python. Maybe in Haskell, Not in Haskell either FWIW in terms of what its type system can prove > but > not in Python. So how much _should_ go into the type hint? I'm happy > with "FileLike" or however it's to be spelled; maybe separate readable > files from writable ones, as those are two fairly clear variants, but > that's about all you really need. RawIOBase is also potential overkill... maybe you just wanted something that duck typed for a few methods you implemented. Any and dynamic typing is a good choice then. > If you provide incremental_parser() > with an input file that's non-seekable, it's going to have problems - > and your editor may or may not even be able to detect that (some files > are seekable but only in the forward direction, but they'll have the > exact same seek() method). > With what has been proposed, there are no static guarantees about how many bytes can be read, or that input is even seekable (does seekable() return True?) or it is open for reading. Instead we can only *prove* a limited amount in static type systems about the runtime dynamic behavior of code, and PEP 484 is weaker than other approaches (for very good reasons IMHO). Still useful however :) - Jim -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Tue Apr 21 18:01:32 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 21 Apr 2015 12:01:32 -0400 Subject: [Python-Dev] async/await PEP In-Reply-To: References: <55315B32.5000205@gmail.com> Message-ID: <5536745C.8090905@gmail.com> Hi Martin, On 2015-04-21 4:23 AM, Martin Teichmann wrote: > Hi Yury, Hi List, > > I do certainly like the idea of PEP 492, just some small comments: Thank you! 
> > why do we need two keywords? To me it is not necessarily intuitive > when to use async and when to use await (why is it async for and not > await for?), so wouldn't it be much simpler, and more symmetric, to > just have one keyword? I personally prefer await for that, then it is > "await def", and "await for" (and "await with", etc.). "await" is a verb - call for action, while "async" is an adjective (although non-existent in plain English). Hence "async" tells us that the statement after it is asynchronous. At least that's my reasoning about it ;) Yury From greg at krypto.org Tue Apr 21 18:01:46 2015 From: greg at krypto.org (Gregory P. Smith) Date: Tue, 21 Apr 2015 16:01:46 +0000 Subject: [Python-Dev] Surely "nullable" is a reasonable name? In-Reply-To: <20150420095404.29e76d21@anarchist.wooz.org> References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de> <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org> <20150420095404.29e76d21@anarchist.wooz.org> Message-ID: On Mon, Apr 20, 2015 at 6:55 AM Barry Warsaw wrote: > On Apr 19, 2015, at 01:19 AM, Larry Hastings wrote: > > >We should rename "types" to "accept". "accept" should takes a set of > types; > >these types specify the types of Python objects the Clinic parameter > should > >accept. For the funny pseudo-types needed in some Clinic declarations > >("buffer", "robuffer", and "rwbuffer"), Clinic provides empty class > >declarations so these behave like types too. > > Having only followed the AC discussions tangentially, I have to say that > the > above suggestion and the given examples make a lot more intuitive sense to > me. > +1 as well: gps(accept={NewlyProposedArgumentClinicSyntax, Cookies}) > I had the same initial thought as Glenn regarding type annotations. It's > fine > that they're separate concepts, but it's also helpful that Larry's > suggestion > above seems to align them better. > > Cheers, > -Barry > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/greg%40krypto.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Tue Apr 21 18:08:16 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 09:08:16 -0700 Subject: [Python-Dev] typeshed for 3rd party packages (was: Type hints -- a mediocre programmer's reaction) In-Reply-To: <5535FD60.7020404@egenix.com> References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> Message-ID: On Tue, Apr 21, 2015 at 12:33 AM, M.-A. Lemburg wrote: > On 21.04.2015 05:37, Guido van Rossum wrote: > > On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich > wrote: > >> * Uploading stubs for other people's code is a terrible idea. Who do I > >> contact when I update the interface to my library? The random Joe who > >> "helped" by uploading annotations three months ago and then quit the > >> internet? I don't even want to think about people maliciously adding > stubs > >> to PyPI. > >> > > > > We're certainly not planning to let arbitrary people upload stubs for > > arbitrary code to PyPI that will automatically be used by the type > > checkers. (We can't stop people from publishing their stubs, just as you > > can't stop people from writing blog posts or stackoverflow answers with > > examples for your library.) 
> > > > The actual plan is to have a common repository of stubs (a prototype > exists > > at https://github.com/JukkaL/typeshed) but we also plan some kind of > > submission review. I've proposed that when submitting stubs for a package > > you don't own, the typeshed owners ask the package owner what their > > position is, and we expect the answers to fall on the following spectrum: > > > > - I don't want stubs uploaded for my package > > - I'll write the stubs myself > > - I want to review all stubs that are uploaded for my package before they > > are accepted > > - Please go ahead and add stubs for my package and let me know when > they're > > ready > > - Go ahead, I trust you > > > > This seems a reasonable due diligence policy that avoids the scenarios > > you're worried about. (Of course if you refuse stubs a black market for > > stubs might spring into existence. That sounds kind of exciting... :-) > > Hmm, that's the first time I've heard about this. I agree with > Jack that it's a terrible idea to allow this for 3rd party > packages. > > If people want to contribute stubs, they should contribute them > to the package in the usual ways, not in a side channel. The important > part missing in the above setup is maintenance and to some extent > an external change of the API definitions. > > Both require active participation in the package project, > not the separated setup proposed above, to be effective and > actually work out in the long run. > > For the stdlib, typeshed looks like a nice idea to spread the > workload. > I hesitate to speak for others, but IIUC the reason why typeshed was started is that companies like PyCharm and Google (and maybe others) are *already* creating their own stubs for 3rd party packages, because they have a need to type-check code that *uses* 3rd party packages. Their use cases are otherwise quite different (the user code type-checked by PyCharm is that of PyCharm users, and the code type-checked by Google is their own proprietary code) but they both find themselves needing stubs for commonly used 3rd party packages. mypy finds itself in a similar position. Think of it this way. Suppose you wrote an app that downloaded some files from the web using the popular Requests package. Now suppose you wanted to run mypy over your app. You're willing to do the work of adding signatures to your own app, and presumably there are stubs for those parts of the stdlib that you're using, but without stubs for Requests, mypy won't do a very good job type-checking your calls into Requests. It's not rocket science to come up with stubs for Requests (there aren't that many classes and methods) but the Requests package is in maintenance mode, and while they respond quickly to security issues they might take their time to release a new version that includes your stub files, and until there are a lot of people clamoring for stubs for Requests, they might not care at all. So what does Requests have to lose if, instead of including the stubs in Requests, they let the typeshed people distribute stubs for Requests? Presumably having the stubs in typeshed means that PyCharm and mypy (and the 50 other type-checkers that are being written right now :-) can give better diagnostics for code using Requests, and once in a while this may save a user of Requests from doing something dumb and blaming Requests. The only downside would be if something was missing in the stubs and users would get incorrect error messages from their favorite type checker. 
But it's a long stretch to see this rain down on Requests' reputation -- more likely the type checker will be blamed, so type checker authors/distributors will be vigilant before distributing stubs. OTOH if you prefer to make and distribute your own stubs, type checkers will use those, and there won't be a need to include stubs in typeshed when a package already provides stubs. And if you really don't want anything to do with stubs for your package, just tell the typeshed owners and your wish will be respected. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rdmurray at bitdance.com Tue Apr 21 18:09:05 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 21 Apr 2015 12:09:05 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: <20150421160905.7B018250E5F@webabinitio.net> On Wed, 22 Apr 2015 01:09:52 +1000, Chris Angelico wrote: > def incremental_parser(input: FileLike) -> List[Token]: > tokens = [] > data = "" > while True: > if not data: > data = input.read(64) > token = Token(data[0]); data = data[1:] > while token.wants_more(): > token.give_more(data[0]); data = data[1:] > tokens.append(token) > if token.is_end_of_stream(): break > input.seek(-len(data), 1) > return tokens > > If you were to exhaustively stipulate the requirements on the > file-like object, you'd have to say: > > * Provides a read() method which takes an integer and returns up to > that many bytes > * Provides a seek() method which takes two integers > * Is capable of relative seeking by at least 63 bytes backward > * Is open for reading > * Etcetera > > That's not the type system's job. Not in Python. Maybe in Haskell, but Just a note that if I'm reading the high level description right, this kind of analysis is exactly the kind of thing that Flow does for javascript. --David From rdmurray at bitdance.com Tue Apr 21 18:17:01 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 21 Apr 2015 12:17:01 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421182750.3efcb597@x230> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> Message-ID: <20150421161701.E06C0B20095@webabinitio.net> On Tue, 21 Apr 2015 18:27:50 +0300, Paul Sokolovsky wrote: > > I was replying to Steven's message. Did you read it? > > Yes. And I try to follow general course of discussion, as its hard to > follow individual sub-threads. And for example yesterday's big theme > was people blackmailing that they stop contributing to stdlib if > annotations are in, and today's theme appear to be people telling that > static type checking won't be useful. And just your reply to Steven > was a final straw which prompted me to point out that static type > checking is not a crux of it, but just one (not the biggest IMHO) usage. Please be respectful rather than inflammatory. If you read what I wrote, I did not say that I was going to stop contributing, I specifically talked about that gut reaction being both emotional and illogical. That doesn't make the reaction any less real, and the fact that such reactions exist is a data point you should consider in conducting your PR campaign for this issue. 
(I don't mean that last as a negative: this issue *requires* an honest PR campaign.) --David From guido at python.org Tue Apr 21 18:28:45 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 09:28:45 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421094958.2960986c@fsol> References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On Tue, Apr 21, 2015 at 12:49 AM, Antoine Pitrou wrote: > On Mon, 20 Apr 2015 20:43:38 -0400 > "R. David Murray" wrote: > > +1 to this from me too. I'm afraid that means I'm -1 on the PEP. > > > > I didn't write this in my earlier email because I wasn't sure about it, > > but my gut reaction after reading Harry's email was "if type annotations > > are used in the stdlib, I'll probably stop contributing". That doesn't > > mean that's *true*, but that's the first time I've ever had that > > thought, so it is probably worth sharing. > > I think it would be nice to know what the PEP means for daily stdlib > development. If patches have to carry typing information each time they > add/enhance an API that's an addition burden. If typing is done > separately by interested people then it sounds like it wouldn't have > much of an impact on everyone else's workflow. > This point will be moot until new code appears in the stdlib whose author likes type hints. As I said, we won't be converting existing code to add type hints (I'm actively against that for the stdlib, for reasons I've explained already). *If* type hints prove useful, I expect that adding type hints **to code that deserves them** is treated no different in the workflow than adding tests or docs. I.e. something that is the right thing to do because it has obvious benefits for users and/or future maintainers. If at some point running a type checker over the stdlib as part of continuous integration become routine, type hints can also replace certain silly tests. Until some point in a possible but distant future when we're all thinking back fondly about the argument we're currently having, it will be the choice of the author of new (and *only* new) stdlib modules whether and how to use type hints. Such a hypothetical author would also be reviewing updates to "their" module and point out lack of type hints just like you might point out an incomplete docstring, an outdated comment, or a missing test. (The type checker would be responsible for pointing out bugs. :-P ) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Tue Apr 21 18:47:21 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 21 Apr 2015 18:47:21 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: <20150421184721.419cef10@fsol> On Tue, 21 Apr 2015 09:28:45 -0700 Guido van Rossum wrote: > On Tue, Apr 21, 2015 at 12:49 AM, Antoine Pitrou > wrote: > > > On Mon, 20 Apr 2015 20:43:38 -0400 > > "R. David Murray" wrote: > > > +1 to this from me too. I'm afraid that means I'm -1 on the PEP. > > > > > > I didn't write this in my earlier email because I wasn't sure about it, > > > but my gut reaction after reading Harry's email was "if type annotations > > > are used in the stdlib, I'll probably stop contributing". 
That doesn't > > > mean that's *true*, but that's the first time I've ever had that > > > thought, so it is probably worth sharing. > > > > I think it would be nice to know what the PEP means for daily stdlib > > development. If patches have to carry typing information each time they > > add/enhance an API that's an addition burden. If typing is done > > separately by interested people then it sounds like it wouldn't have > > much of an impact on everyone else's workflow. > > > > This point will be moot until new code appears in the stdlib whose author > likes type hints. As I said, we won't be converting existing code to add > type hints (I'm actively against that for the stdlib, for reasons I've > explained already). I was thinking of potential stub files. Or are you also putting a ban on those for existing stdlib code? Sorry if that has already been answered... Regards Antoine. From ethan at stoneleaf.us Tue Apr 21 18:50:59 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 21 Apr 2015 09:50:59 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421182750.3efcb597@x230> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> Message-ID: <20150421165059.GA4073@stoneleaf.us> On 04/21, Paul Sokolovsky wrote: > > And for example yesterday's big theme was people blackmailing that they > stop contributing to stdlib if annotations are in [...] A volunteer's honest reaction is not blackmail, and categorizing it as such is not helpful to the discussion. -- ~Ethan~ From greg at krypto.org Tue Apr 21 18:55:49 2015 From: greg at krypto.org (Gregory P. Smith) Date: Tue, 21 Apr 2015 16:55:49 +0000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421094958.2960986c@fsol> References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On Tue, Apr 21, 2015 at 12:50 AM Antoine Pitrou wrote: > On Mon, 20 Apr 2015 20:43:38 -0400 > "R. David Murray" wrote: > > +1 to this from me too. I'm afraid that means I'm -1 on the PEP. > > > > I didn't write this in my earlier email because I wasn't sure about it, > > but my gut reaction after reading Harry's email was "if type annotations > > are used in the stdlib, I'll probably stop contributing". That doesn't > > mean that's *true*, but that's the first time I've ever had that > > thought, so it is probably worth sharing. > > I think it would be nice to know what the PEP means for daily stdlib > development. If patches have to carry typing information each time they > add/enhance an API that's an addition burden. If typing is done > separately by interested people then it sounds like it wouldn't have > much of an impact on everyone else's workflow. > Separately by interested people. That won't change until tools appear and mature that help maintain the types for us. (if ever) Nobody wants unreadable code. Nobody is proposing to make unreadable code happen or encourage its creation. One thing I feel is often overlooked in the discussion on this PEP: It is about creating a unified type expression syntax for everyone working on Python typing to centralize around. Regardless of if the PEPs version falls short for some purposes. It allows for sharing work. 
There are multiple ongoing projects that are trying to make use of type information with Python, this allows them to all speak the same language. (MyPy, MicroPython, Cython, the static analyzer we are trying to create at Google, several others not on the top of my head I'm sure, etc.) We will not be putting type annotations anywhere in the stdlib or expecting anyone else to maintain them there. That would never happen until tools that are convincing enough in their utility for developers to _want_ to use are available and accepted. That'll be a developer workflow thing we could address with a later PEP. IF it happens at all. I view most of this thread as FUD. The fear is understandable, I'm trying to tell people to stop panicing. This PEP does not mean that Python is suddenly going to become unreadable. Just that a set of people working on a common goal have a way to communicate with one another. If that work bears fruit, great, it'll be shared and provided as tools that people want to use. If not, it won't matter in the slightest and the typing module and this PEP will be relegated to history. This is a 100% non-invasive PEP. No new keywords! Motivation behind static analysis and type checkers comes directly from the success of the type annotations and checking done to Javascript in Google's javascript Closure compiler that has been available for years. Steven mentioned Facebook's Flow which does a similar thing. These are both opt-in and by and large we've found that developers love using them in any decent sized code base. That model is the goal for any of our Python typing related projects to get to. If developers don't _want_ to use it in the end, we have failed and they can happily continue not using it because it was never required. The reason this PEP exists is for tool developers to be able to do their thing and prove to everyone that it is (a) possible and (b) genuinely useful. IF that proves successful, we can consider if we need a saner syntax for anyone to want to use it. For now we've got this PEP which is a bit of a hack using the Python 3 annotations and a typing module but at the same time doesn't involve any language changes we might regret. I call that a win! -gps -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Tue Apr 21 18:59:50 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 09:59:50 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 7:51 AM, Cory Benfield wrote: > The correct specification is "read method with this type signature" > and "seek method with this type signature". I would even be prepared > to waive the type signatures on read and seek, given that enforcing > the type hinting on others is not a good idea. > > I suspect I have a mismatch with several others in this discussion. My > position is that if I'm going to have a type system, I'd like to have > a powerful one: Haskell, not Java. Otherwise I'll get by with the duck > typing that has worked just fine for us over the last 20 years. I > suspect, however, that many others in this conversation want any type > system at all, so long as they can have one. > For me, PEP 484 is a stepping stone. Among the authors of PEP 484 there was much discussion about duck typing, and mypy even has some limited support for duck typing (I think you can still find it by searching the mypy code for "protocol"). 
But we ran out of time getting all the details written up and agreed upon, so we decided to punt -- for now. But duck typing still needs to have a way to talk about things like "seek method with this type signature" (something like `def seek(self, offset: int, whence: int=SEEK_SET) -> int`) so the current proposal gets us part of the way there. The hope is that once 3.5 is out (with PEP 484's typing.py included *provisional* mode) we can start working on the duck typing specification. The alternative would have been to wait until 3.6, but we didn't think that there would be much of an advantage to postponing the more basic type hinting syntax (it would be like refusing to include "import" until you've sorted out packages). During the run of 3.5 we'll hopefully get feedback on where duck typing is most needed and how to specify it -- valuable input that would be much harder to obtain of *no* part of the type hints notation were standardized. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From Nikolaus at rath.org Tue Apr 21 19:03:06 2015 From: Nikolaus at rath.org (Nikolaus Rath) Date: Tue, 21 Apr 2015 10:03:06 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421183921.534a37df@x230> (Paul Sokolovsky's message of "Tue, 21 Apr 2015 18:39:21 +0300") References: <20150420144106.63513828@anarchist.wooz.org> <85sibumhej.fsf@benfinney.id.au> <87618p5xns.fsf@thinkpad.rath.org> <20150421183921.534a37df@x230> Message-ID: <87pp6x4do5.fsf@thinkpad.rath.org> On Apr 21 2015, Paul Sokolovsky wrote: > Hello, > > On Tue, 21 Apr 2015 08:05:59 -0700 > Nikolaus Rath wrote: > >> On Apr 20 2015, Chris Angelico wrote: >> > Maybe it'd be of value to have a quick "code stripper" that takes >> > away all the annotations, plus any other junk/framing that you're >> > not interested in, and gives you something you can browse in a text >> > editor? >> >> If you need to preprocess your source code to make it suitable for >> human consumption something is very wrong with your language design. >> I can't believe you're seriously suggesting this. > > I'm sure that was irony, d'oh. That'd be a relief. It didn't sound ironic to me. > The proposed type annotations are very readable. [..] I don't have an informed opinion about that yet. I was just commenting on the general idea of stripping them away if they're not readable. Best, -Nikolaus -- GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F ?Time flies like an arrow, fruit flies like a Banana.? From guido at python.org Tue Apr 21 19:10:06 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 10:10:06 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421161701.E06C0B20095@webabinitio.net> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> Message-ID: On Tue, Apr 21, 2015 at 9:17 AM, R. David Murray wrote: > Please be respectful rather than inflammatory. If you read what I > wrote, I did not say that I was going to stop contributing, I > specifically talked about that gut reaction being both emotional and > illogical. 
> That doesn't make the reaction any less real, and the fact that such reactions exist is a data point you should consider in conducting your PR campaign for this issue. (I don't mean that last as a negative: this issue *requires* an honest PR campaign.)

Well, my own reactions at this point in the flame war are also quite emotional. :-( I have done my best in being honest in my PR campaign. But I feel like the opposition (not you, but definitely some others -- have you seen Twitter?) are spreading FUD based on an irrational conviction that this will destroy Python. It will not. It may not prove the solution to all Python's problems -- there's always 3.6. (Oh wait, Python 2.7 is perfect. I've heard that before -- Paul Everitt famously said the same of Python 1.5.2. Aren't you glad I didn't take him literally? :-P )

--
--Guido van Rossum (python.org/~guido)

From steve at pearwood.info Tue Apr 21 19:12:51 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Wed, 22 Apr 2015 03:12:51 +1000
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID: <20150421171250.GO5663@ando.pearwood.info>

On Tue, Apr 21, 2015 at 03:51:05PM +0100, Cory Benfield wrote:
> On 21 April 2015 at 15:31, Chris Angelico wrote:
> > Granted, there are some vague areas - how many functions take a "file-like object", and are they all the same? - but between MyPy types and the abstract base types that already exist, there are plenty of ways to formalize duck typing.
>
> Are there? Can I have a link or an example, please? I feel like I don't know how I'm supposed to do this, and I'd like to see how that works. I'll even give a concrete use-case: I want to be able to take a file-like object that has a .read() method and a .seek() method.

I've never done this before, so I might not quite have done it correctly, but this appears to work just fine:

    py> import abc
    py> class SeekableReadable(metaclass=abc.ABCMeta):
    ...     @classmethod
    ...     def __subclasshook__(cls, C):
    ...         if hasattr(C, 'seek') and hasattr(C, 'read'):
    ...             return True
    ...         return NotImplemented
    ...
    py> f = open('/tmp/foo')
    py> isinstance(f, SeekableReadable)
    True
    py> from io import StringIO
    py> issubclass(StringIO, SeekableReadable)
    True
    py> issubclass(int, SeekableReadable)
    False

That gives you your runtime check for an object with seek() and read() methods. For compile-time checking, I expect you would define SeekableReadable as above, then make the declaration:

    def read_from_start(f:SeekableReadable, size:int):
        f.seek(0)
        return f.read(size)

So now you have runtime interface checking via an ABC, plus documentation for the function parameter type via annotation. But will the static checker understand that annotation? My guess is, probably not as it stands. According to the docs, MyPy currently doesn't support this sort of duck typing, but will:

[quote]
There are also plans to support more Python-style "duck typing" in the type system. The details are still open.
[end quote]

http://mypy.readthedocs.org/en/latest/class_basics.html#abstract-base-classes-and-multiple-inheritance

I expect that dealing with duck typing will be very high on the list of priorities for the future. In the meantime, for this specific use-case, you're probably not going to be able to statically check this type hint.
Your choices would be:

- don't type check anything;
- don't type check the read_from_start() function, but type check everything else;
- don't type check the f parameter (remove the SeekableReadable annotation, or replace it with Any, but leave the size:int annotation);
- possibly some type checkers will infer from the function body that f must have seek() and read() methods, and you don't have to declare anything (structural typing instead of nominal?);
- (a bad idea, but just for the sake of completeness) leave the annotation in, and ignore false negatives.

Remember that there is no built-in Python type checker. If you have no checker, the annotations are just documentation and nothing else will have changed. If you don't like the checker you have, you'll be able to replace it with another.

--
Steve

From yselivanov.ml at gmail.com Tue Apr 21 19:26:48 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 21 Apr 2015 13:26:48 -0400
Subject: [Python-Dev] async/await in Python; v2
Message-ID: <55368858.4010007@gmail.com>

Hi python-dev,

I'm moving the discussion from python-ideas to here.

The updated version of the PEP should be available shortly at https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.

Updates:

1. CO_ASYNC flag was renamed to CO_COROUTINE;
2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper();
3. New function: sys.get_coroutine_wrapper();
4. types.async_def() renamed to types.coroutine();
5. New section highlighting differences from PEP 3152.
6. New AST node - AsyncFunctionDef; the proposal now is 100% backwards compatible;
7. A new section clarifying that coroutine-generators are not part of the current proposal;
8. Various small edits/typo fixes.

There is a bug tracker issue to track code review of the reference implementation (Victor Stinner is doing the review): http://bugs.python.org/issue24017 While the PEP isn't accepted, we want to make sure that the reference implementation is ready when such a decision is made.

Let's discuss some open questions:

1. Victor raised the question of whether we should move the coroutine() function from the 'types' module to 'functools'. My opinion is that 'types' module is a better place for 'coroutine()', since it adjusts the type of the passed generator. 'functools' is about 'partials', 'lru_cache' and 'wraps' kind of things.

2. I propose to disallow the use of 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc. on coroutines. It's possible by modifying PyObject_GetIter to raise an exception if it receives a coroutine-object. 'yield from' can also be modified to only accept coroutine objects if it is called from a generator with the CO_COROUTINE flag. This will further separate coroutines from generators, making it harder to screw something up by accident. (A short illustration of the confusion this guards against follows this message.) I have a branch of the reference implementation, https://github.com/1st1/cpython/tree/await_noiter , where this is implemented. I did not observe any performance drop.

There is just one possible backwards compatibility issue here: there will be an exception if some user of asyncio actually used to iterate over generators decorated with @coroutine. But I can't imagine why would someone do that, and even if they did -- it's probably a bug or wrong usage of asyncio.

That's it! I'd be happy to hear some feedback!

Thanks,
Yury
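As flagged in open question 2 above, a short illustration of the iteration confusion being guarded against - a minimal sketch, not from the thread, runnable on Python 3.3+; the plain generator stands in for a generator-based coroutine:

    def compute():
        # A generator-based coroutine in the pre-PEP world is just a
        # generator; the string below stands in for a yielded Future.
        yield "pretend-future"
        return 42

    # Nothing today distinguishes it from an ordinary generator, so a
    # caller can iterate it directly and silently lose the result:
    print(list(compute()))   # ['pretend-future'] -- the 42 is discarded
    # Under open question 2, list()/iter()/for..in over a coroutine
    # object would raise instead of passing silently.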
Thanks, Yury PEP: 492 Title: Coroutines with async and await syntax Version: $Revision$ Last-Modified: $Date$ Author: Yury Selivanov Status: Draft Type: Standards Track Content-Type: text/x-rst Created: 09-Apr-2015 Python-Version: 3.5 Post-History: 17-Apr-2015, 21-Apr-2015 Abstract ======== This PEP introduces new syntax for coroutines, asynchronous ``with`` statements and ``for`` loops. The main motivation behind this proposal is to streamline writing and maintaining asynchronous code, as well as to simplify previously hard to implement code patterns. Rationale and Goals =================== Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the ``yield from`` syntax introduced in PEP 380. This approach has a number of shortcomings: * it is easy to confuse coroutines with regular generators, since they share the same syntax; async libraries often attempt to alleviate this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); * it is not possible to natively define a coroutine which has no ``yield`` or ``yield from`` statements, again requiring the use of decorators to fix potential refactoring issues; * support for asynchronous calls is limited to expressions where ``yield`` is allowed syntactically, limiting the usefulness of syntactic features, such as ``with`` and ``for`` statements. This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring. Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new ``async with`` statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new ``async for`` statement makes it possible to perform asynchronous calls in iterators. Specification ============= This proposal introduces new syntax and semantics to enhance coroutine support in Python, it does not change the internal implementation of coroutines, which are still based on generators. It is strongly suggested that the reader understands how coroutines are implemented in Python (PEP 342 and PEP 380). It is also recommended to read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). From this point in this document we use the word *coroutine* to refer to functions declared using the new syntax. *generator-based coroutine* is used where necessary to refer to coroutines that are based on generator syntax. New Coroutine Declaration Syntax -------------------------------- The following new syntax is used to declare a coroutine:: async def read_data(db): pass Key properties of coroutines: * ``async def`` functions are always coroutines, even if they do not contain ``await`` expressions. * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` expressions in an ``async`` function. * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced to enable runtime detection of coroutines (and migrating existing code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` flags set. * Regular generators, when called, return a *generator object*; similarly, coroutines return a *coroutine object*. 
* ``StopIteration`` exceptions are not propagated out of coroutines, and are replaced with a ``RuntimeError``. For regular generators such behavior requires a future import (see PEP 479). types.coroutine() ----------------- A new function ``coroutine(gen)`` is added to the ``types`` module. It applies ``CO_COROUTINE`` flag to the passed generator-function's code object, making it to return a *coroutine object* when called. This feature enables an easy upgrade path for existing libraries. Await Expression ---------------- The following new ``await`` expression is used to obtain a result of coroutine execution:: async def read_data(db): data = await db.fetch('SELECT ...') ... ``await``, similarly to ``yield from``, suspends execution of ``read_data`` coroutine until ``db.fetch`` *awaitable* completes and returns the result data. It uses the ``yield from`` implementation with an extra step of validating its argument. ``await`` only accepts an *awaitable*, which can be one of: * A *coroutine object* returned from a *coroutine* or a generator decorated with ``types.coroutine()``. * An object with an ``__await__`` method returning an iterator. Any ``yield from`` chain of calls ends with a ``yield``. This is a fundamental mechanism of how *Futures* are implemented. Since, internally, coroutines are a special kind of generators, every ``await`` is suspended by a ``yield`` somewhere down the chain of ``await`` calls (please refer to PEP 3156 for a detailed explanation.) To enable this behavior for coroutines, a new magic method called ``__await__`` is added. In asyncio, for instance, to enable Future objects in ``await`` statements, the only change is to add ``__await__ = __iter__`` line to ``asyncio.Future`` class. Objects with ``__await__`` method are called *Future-like* objects in the rest of this PEP. Also, please note that ``__aiter__`` method (see its definition below) cannot be used for this purpose. It is a different protocol, and would be like using ``__iter__`` instead of ``__call__`` for regular callables. It is a ``SyntaxError`` to use ``await`` outside of a coroutine. Asynchronous Context Managers and "async with" ---------------------------------------------- An *asynchronous context manager* is a context manager that is able to suspend execution in its *enter* and *exit* methods. To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: ``__aenter__`` and ``__aexit__``. Both must return an *awaitable*. An example of an asynchronous context manager:: class AsyncContextManager: async def __aenter__(self): await log('entering context') async def __aexit__(self, exc_type, exc, tb): await log('exiting context') New Syntax '''''''''' A new statement for asynchronous context managers is proposed:: async with EXPR as VAR: BLOCK which is semantically equivalent to:: mgr = (EXPR) aexit = type(mgr).__aexit__ aenter = type(mgr).__aenter__(mgr) exc = True try: try: VAR = await aenter BLOCK except: exc = False exit_res = await aexit(mgr, *sys.exc_info()) if not exit_res: raise finally: if exc: await aexit(mgr, None, None, None) As with regular ``with`` statements, it is possible to specify multiple context managers in a single ``async with`` statement. It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine. 
Asynchronous Context Managers and "async with"
----------------------------------------------

An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.

To make this possible, a new protocol for asynchronous context managers
is proposed.  Two new magic methods are added: ``__aenter__`` and
``__aexit__``.  Both must return an *awaitable*.

An example of an asynchronous context manager::

    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')


New Syntax
''''''''''

A new statement for asynchronous context managers is proposed::

    async with EXPR as VAR:
        BLOCK

which is semantically equivalent to::

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        try:
            VAR = await aenter
            BLOCK
        except:
            exc = False
            exit_res = await aexit(mgr, *sys.exc_info())
            if not exit_res:
                raise
    finally:
        if exc:
            await aexit(mgr, None, None, None)

As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.

It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``.  It is a ``SyntaxError``
to use ``async with`` outside of a coroutine.


Example
'''''''

With asynchronous context managers it is easy to implement proper
database transaction managers for coroutines::

    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...

Code that needs locking also looks lighter::

    async with lock:
        ...

instead of::

    with (yield from lock):
        ...


Asynchronous Iterators and "async for"
--------------------------------------

An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method.  To support asynchronous
iteration:

1. An object must implement an ``__aiter__`` method returning an
   *awaitable* resulting in an *asynchronous iterator object*.

2. An *asynchronous iterator object* must implement an ``__anext__``
   method returning an *awaitable*.

3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
   exception.

An example of asynchronous iterable::

    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...


New Syntax
''''''''''

A new statement for iterating through asynchronous iterators is
proposed::

    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2

which is semantically equivalent to::

    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2

It is an error to pass a regular iterable without an ``__aiter__``
method to ``async for``.  It is a ``SyntaxError`` to use ``async for``
outside of a coroutine.

As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.


Example 1
'''''''''

With the asynchronous iteration protocol it is possible to
asynchronously buffer data during iteration::

    async for data in cursor:
        ...

Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.

The following code illustrates the new asynchronous iteration
protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()

then the ``Cursor`` class can be used as follows::

    async for row in Cursor():
        print(row)

which would be equivalent to the following code::

    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)


Example 2
'''''''''

The following is a utility class that transforms a regular iterable to
an asynchronous one.  While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.

::

    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for item in AsyncIteratorWrapper("abc"):
        print(item)
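Note that the trailing ``async for`` loop above must itself execute
inside a coroutine.  A minimal driver for it -- a sketch assuming
asyncio and the ``AsyncIteratorWrapper`` class just defined -- could
look like::

    import asyncio

    async def main():
        async for item in AsyncIteratorWrapper("abc"):
            print(item)     # prints 'a', 'b', 'c'

    # Run the coroutine to completion on the default event loop.
    asyncio.get_event_loop().run_until_complete(main())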
Why StopAsyncIteration?
'''''''''''''''''''''''

Coroutines are still based on generators internally.  So, before PEP
479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'

and

::

    def g2():
        yield from fut
        raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``

::

    async def a1():
        await fut
        raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``.  Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.

Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.


Debugging Features
------------------

One of the most frequent mistakes that people make when using
generators as coroutines is forgetting to use ``yield from``::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1) # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions with
a special object with a destructor logging a warning.  Whenever a
wrapped generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorator function
was defined, stack trace of where it was collected, etc.  The wrapper
object also provides a convenient ``__repr__`` function with detailed
information about the generator.

The only problem is how to enable these debug capabilities.  Since
debug facilities should be a no-op in production mode, the
``@coroutine`` decorator makes the decision of whether to wrap or not
to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``.
This way it is possible to run asyncio programs with asyncio's own
functions instrumented.  ``EventLoop.set_debug``, a different debug
facility, has no impact on the ``@coroutine`` decorator's behavior.

With this proposal, coroutines are a native concept, distinct from
generators.  New functions ``set_coroutine_wrapper`` and
``get_coroutine_wrapper`` are added to the ``sys`` module, with which
frameworks can provide advanced debugging facilities.

It is also important to make coroutines as fast and efficient as
possible, therefore there are no debug features enabled by default.

Example::

    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.AsyncDebugWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me() # <- this line will likely GC the coroutine object and
               # trigger AsyncDebugWrapper's code.

    assert isinstance(debug_me(), AsyncDebugWrapper)

    sys.set_coroutine_wrapper(None) # <- this unsets any
                                    #    previously set wrapper
    assert not isinstance(debug_me(), AsyncDebugWrapper)

If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper
replaces the previous wrapper.  ``sys.set_coroutine_wrapper(None)``
unsets the wrapper.
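As a sketch of what a framework could build on this hook, here is a
hypothetical wrapper (``CreationTracker`` is a made-up name) written
against the ``sys.set_coroutine_wrapper`` interface described above::

    import sys
    import traceback

    class CreationTracker:
        # Wraps every newly created coroutine object and remembers
        # where it was created, so that a "never awaited" report can
        # point at the offending call site.
        def __init__(self, coro):
            self.coro = coro
            self.created_at = ''.join(traceback.format_stack()[:-1])

        # Delegate the coroutine protocol to the wrapped object.
        def send(self, value):
            return self.coro.send(value)

        def throw(self, *args):
            return self.coro.throw(*args)

        def close(self):
            return self.coro.close()

        def __await__(self):
            return self.coro.__await__()

    async def job():
        return 1

    sys.set_coroutine_wrapper(CreationTracker)
    try:
        c = job()
        assert isinstance(c, CreationTracker)
        c.close()                     # avoid a "never awaited" report
    finally:
        sys.set_coroutine_wrapper(None)   # restore the default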
Glossary
========

:Coroutine:
    A coroutine function, or just "coroutine", is declared with
    ``async def``.  It uses ``await`` and ``return value``; see `New
    Coroutine Declaration Syntax`_ for details.

:Coroutine object:
    Returned from a coroutine function.  See `Await Expression`_ for
    details.

:Future-like object:
    An object with an ``__await__`` method.  Can be consumed by an
    ``await`` expression in a coroutine.  A coroutine waiting for a
    Future-like object is suspended until the Future-like object's
    ``__await__`` completes, and returns the result.  See `Await
    Expression`_ for details.

:Awaitable:
    A *Future-like* object or a *coroutine object*.  See `Await
    Expression`_ for details.

:Generator-based coroutine:
    Coroutines based on generator syntax.  The most common example is
    ``@asyncio.coroutine``.

:Asynchronous context manager:
    An asynchronous context manager has ``__aenter__`` and
    ``__aexit__`` methods and can be used with ``async with``.  See
    `Asynchronous Context Managers and "async with"`_ for details.

:Asynchronous iterable:
    An object with an ``__aiter__`` method, which must return an
    *asynchronous iterator* object.  Can be used with ``async for``.
    See `Asynchronous Iterators and "async for"`_ for details.

:Asynchronous iterator:
    An asynchronous iterator has an ``__anext__`` method.  See
    `Asynchronous Iterators and "async for"`_ for details.


List of functions and methods
=============================

================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================

Where:

* "async def func": coroutine;

* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined with the ``async`` keyword;

* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined without the ``async`` keyword, must return an
  *awaitable*;

* "def __await__": ``__await__`` method to implement *Future-like*
  objects;

* generator: a "regular" generator, function defined with ``def`` and
  which contains at least one ``yield`` or ``yield from`` expression.


Transition Plan
===============

To avoid backwards compatibility issues with ``async`` and ``await``
keywords, it was decided to modify ``tokenizer.c`` in such a way that
it:

* recognizes the ``async def`` name token combination (start of a
  coroutine);

* keeps track of regular functions and coroutines;

* replaces the ``'async'`` token with ``ASYNC`` and the ``'await'``
  token with ``AWAIT`` when in the process of yielding tokens for
  coroutines.

This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.

An example of having "async def" and "async" attribute in one piece of
code::

    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'


Backwards Compatibility
-----------------------

This proposal preserves 100% backwards compatibility.


Grammar Updates
---------------

Grammar changes are also fairly minimal::

    await_expr: AWAIT test
    await_stmt: await_expr

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt |
                    with_stmt | funcdef | classdef | decorated |
                    async_stmt)

    atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
           '[' [testlist_comp] ']' |
           '{' [dictorsetmaker] '}' |
           NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')

    expr_stmt: testlist_star_expr
               (augassign (yield_expr|await_expr|testlist) |
                ('=' (yield_expr|await_expr|testlist_star_expr))*)


Transition Period Shortcomings
------------------------------

There is just one.
Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** with the ``def`` keyword::

    # async and await will always be parsed as variables

    async def outer():                          # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)        # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.

To work around these issues, the above examples can be easily rewritten
to a more readable form::

    async def outer():                          # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                            # 2
        return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are
proper keywords.  Or if it's decided to use a future import for this
PEP.


Deprecation Plans
-----------------

``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6.  In 3.7 we will transform them to proper keywords.  Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.


asyncio
-------

The ``asyncio`` module was adapted and tested to work with coroutines
and new statements.  Backwards compatibility is 100% preserved.

The required changes are mainly:

1. Modify the ``@asyncio.coroutine`` decorator to use the new
   ``types.coroutine()`` function.

2. Add the ``__await__ = __iter__`` line to the ``asyncio.Future``
   class.

3. Add ``ensure_task()`` as an alias for the ``async()`` function.
   Deprecate the ``async()`` function.


Design Considerations
=====================

PEP 3152
--------

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions").  Some key points:

1. A new keyword ``codef`` to declare a *cofunction*.  A *cofunction*
   is always a generator, even if there are no ``cocall`` expressions
   inside it.  Maps to ``async def`` in this proposal.

2. A new keyword ``cocall`` to call a *cofunction*.  Can only be used
   inside a *cofunction*.  Maps to ``await`` in this proposal (with
   some differences, see below.)

3. It is not possible to call a *cofunction* without a ``cocall``
   keyword.

4. ``cocall`` grammatically requires parentheses after it::

       atom: cocall |
       cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
       cotrailer: '[' subscriptlist ']' | '.' NAME

5. ``cocall f(*args, **kwds)`` is semantically equivalent to
   ``yield from f.__cocall__(*args, **kwds)``.

Differences from this proposal:

1. There is no equivalent of ``__cocall__`` in this PEP, which is
   called and its result is passed to ``yield from`` in the ``cocall``
   expression.  The ``await`` keyword expects an *awaitable* object,
   validates the type, and executes ``yield from`` on it.  The
   ``__await__`` method is similar to ``__cocall__``, but is only used
   to define *Future-like* objects.

2. ``await`` is defined in almost the same way as ``yield from`` in the
   grammar (it is later enforced that ``await`` can only be inside
   ``async def``).  It is possible to simply write ``await future``,
   whereas ``cocall`` always requires parentheses.

3. To make asyncio work with PEP 3152 it would be required to modify
   the ``@asyncio.coroutine`` decorator to wrap all functions in an
   object with a ``__cocall__`` method.  To call *cofunctions* from
   existing generator-based coroutines it would be required to use the
   ``costart`` built-in.  In this proposal ``@asyncio.coroutine``
   simply sets ``CO_COROUTINE`` on the wrapped function's code object
   and everything works automatically.
4. Since it is impossible to call a *cofunction* without a ``cocall``
   keyword, it automatically prevents the common mistake of forgetting
   to use ``yield from`` on generator-based coroutines.  This proposal
   addresses this problem with a different approach, see `Debugging
   Features`_.

5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
   is that if it is decided to implement coroutine-generators --
   coroutines with ``yield`` or ``async yield`` expressions -- we
   wouldn't need a ``cocall`` keyword to call them.  So we'll end up
   having ``__cocall__`` and no ``__call__`` for regular coroutines,
   and having ``__call__`` and no ``__cocall__`` for
   coroutine-generators.

6. There are no equivalents of ``async for`` and ``async with`` in PEP
   3152.


Coroutine-generators
--------------------

With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions.  To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and ``async
yield from`` would raise a ``StopAsyncIteration`` exception.

While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal.  It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects.  This is a matter
for a separate PEP.


No implicit wrapping in Futures
-------------------------------

There is a proposal to add a similar mechanism to ECMAScript 7 [2]_.  A
key difference is that JavaScript "async functions" always return a
Promise.  While this approach has some advantages, it also implies that
a new Promise object is created on each "async function" invocation.

We could implement a similar functionality in Python, by wrapping all
coroutines in a Future object, but this has the following
disadvantages:

1. Performance.  A new Future object would be instantiated on each
   coroutine call.  Moreover, this makes the implementation of
   ``await`` expressions slower (disabling optimizations of
   ``yield from``).

2. A new built-in ``Future`` object would need to be added.

3. Coming up with a generic ``Future`` interface that is usable for any
   use case in any framework is a very hard problem to solve.

4. It is not a feature that is used frequently, when most of the code
   is coroutines.


Why "async" and "await" keywords
--------------------------------

async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* proposal to add async/await in ECMAScript 7 [2]_; see also the
  Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).


Why "__aiter__" is a coroutine
------------------------------

In principle, ``__aiter__`` could be a regular function.  There are
several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
  methods are coroutines, users would often make a mistake defining it
  as ``async`` anyway;

* there might be a need to run some asynchronous operations in
  ``__aiter__``, for instance to prepare DB queries or do some file
  operation.
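A short sketch of such asynchronous setup work (``Ticker`` is a
hypothetical class; ``asyncio.sleep`` stands in for a real setup
operation such as acquiring a database connection)::

    import asyncio

    class Ticker:
        # Yields 1..count asynchronously.  ``__aiter__`` is a coroutine
        # here, so it can await setup work before iteration starts.
        def __init__(self, count):
            self.count = count
            self.i = 0

        async def __aiter__(self):
            await asyncio.sleep(0)   # stand-in for asynchronous setup
            return self

        async def __anext__(self):
            if self.i >= self.count:
                raise StopAsyncIteration
            self.i += 1
            await asyncio.sleep(0)   # stand-in for an asynchronous fetch
            return self.i

    async def consume():
        async for value in Ticker(3):
            print(value)             # 1, 2, 3

    asyncio.get_event_loop().run_until_complete(consume())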
Importance of "async" keyword ----------------------------- While it is possible to just implement ``await`` expression and treat all functions with at least one ``await`` as coroutines, this approach makes APIs design, code refactoring and its long time support harder. Let's pretend that Python only has ``await`` keyword:: def useful(): ... await log(...) ... def important(): await useful() If ``useful()`` function is refactored and someone removes all ``await`` expressions from it, it would become a regular python function, and all code that depends on it, including ``important()`` would be broken. To mitigate this issue a decorator similar to ``@asyncio.coroutine`` has to be introduced. Why "async def" --------------- For some people bare ``async name(): pass`` syntax might look more appealing than ``async def name(): pass``. It is certainly easier to type. But on the other hand, it breaks the symmetry between ``async def``, ``async with`` and ``async for``, where ``async`` is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar. Why not a __future__ import --------------------------- ``__future__`` imports are inconvenient and easy to forget to add. Also, they are enabled for the whole source file. Consider that there is a big project with a popular module named "async.py". With future imports it is required to either import it using ``__import__()`` or ``importlib.import_module()`` calls, or to rename the module. The proposed approach makes it possible to continue using old code and modules without a hassle, while coming up with a migration plan for future python versions. Why magic methods start with "a" -------------------------------- New asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". An alternative proposal is to use "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``. However, to align new magic methods with the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided to use a shorter version. Why not reuse existing magic names ---------------------------------- An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations:: class CM: async def __enter__(self): # instead of __aenter__ ... This approach has the following downsides: * it would not be possible to create an object that works in both ``with`` and ``async with`` statements; * it would look confusing and would require some implicit magic behind the scenes in the interpreter; * one of the main points of this proposal is to make coroutines as simple and foolproof as possible. Comprehensions -------------- For the sake of restricting the broadness of this PEP there is no new syntax for asynchronous comprehensions. This should be considered in a separate PEP, if there is a strong demand for this feature. Async lambdas ------------- Lambda coroutines are not part of this proposal. In this proposal they would look like ``async lambda(parameters): expression``. Unless there is a strong demand to have them as part of this proposal, it is recommended to consider them later in a separate PEP. Performance =========== Overall Impact -------------- This proposal introduces no observable performance impact. 
Here is an output of Python's official set of benchmarks [4]_:

::

    python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
    x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show them:
    django_v2, 2to3, etree_generate, etree_parse, etree_process,
    fastpickle, fastunpickle, json_dump_v2, json_load, nbody, regex_v8,
    tornado_http.


Tokenizer modifications
-----------------------

There is no observable slowdown of parsing Python files with the
modified tokenizer: parsing of one 12Mb file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.


async/await
-----------

The following micro-benchmark was used to determine performance
difference between "async" functions and generators::

    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1-t0))

The result is that there is no observable performance difference.
Minimum timing of 3 runs

::

    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s

Note that depth of 19 means 1,048,575 calls.


Reference Implementation
========================

The reference implementation can be found here: [3]_.

List of high-level changes and new protocols
--------------------------------------------

1. New syntax for defining coroutines: ``async def`` and new ``await``
   keyword.

2. New ``__await__`` method for Future-like objects.

3. New syntax for asynchronous context managers: ``async with``.  And
   associated protocol with ``__aenter__`` and ``__aexit__`` methods.

4. New syntax for asynchronous iteration: ``async for``.  And
   associated protocol with ``__aiter__``, ``__anext__`` and new
   built-in exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
   ``Await``.

6. New functions: ``sys.set_coroutine_wrapper(callback)``,
   ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``.

7. New ``CO_COROUTINE`` bit flag for code objects.

While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.  It
is intended to be used in frameworks and libraries to provide users
with convenient-to-use and unambiguous APIs with ``async def``,
``await``, ``async for`` and ``async with`` syntax.


Working example
---------------

All concepts proposed in this PEP are implemented [3]_ and can be
tested.

::

    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection, 'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')
        while True:
            data = await reader.read(8192)
            if not data:
                break
            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()


References
==========
.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
.. [3] https://github.com/1st1/cpython/tree/await
.. [4] https://hg.python.org/benchmarks
.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
.. [6] http://docs.hhvm.com/manual/en/hack.async.php
.. [7] https://www.dartlang.org/articles/await-async/
.. [8] http://docs.scala-lang.org/sips/pending/async.html
.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental
.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)


Acknowledgments
===============

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.


Copyright
=========

This document has been placed in the public domain.

From larry at hastings.org  Tue Apr 21 19:31:39 2015
From: larry at hastings.org (Larry Hastings)
Date: Tue, 21 Apr 2015 10:31:39 -0700
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: 
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
	<53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
Message-ID: <5536897B.80006@hastings.org>

On 04/21/2015 04:50 AM, Tal Einat wrote:
> As for the default set of accepted types for various convertors, if we
> could choose any syntax we liked, something like "accept=+{NoneType}"
> would be much better IMO.

In theory Argument Clinic could use any syntax it likes.  In practice,
under the covers we tease out one or two bits of non-Python syntax, then
run ast.parse over it.  Saved us a lot of work.

"s: accept={str,NoneType}" is a legal Python parameter declaration;
"s: accept+={NoneType}" is not.  If I could figure out a clean way to
hack in support for += I'll support it.  Otherwise you'll be forced to
spell it out.


//arry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From guido at python.org  Tue Apr 21 19:33:48 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 21 Apr 2015 10:33:48 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150421184721.419cef10@fsol>
References: <20150420144106.63513828@anarchist.wooz.org>
	<20150421004339.1DA5CB500F5@webabinitio.net>
	<20150421094958.2960986c@fsol> <20150421184721.419cef10@fsol>
Message-ID: 

On Tue, Apr 21, 2015 at 9:47 AM, Antoine Pitrou wrote:

> On Tue, 21 Apr 2015 09:28:45 -0700
> Guido van Rossum wrote:
> > On Tue, Apr 21, 2015 at 12:49 AM, Antoine Pitrou
> > wrote:
> >
> > > On Mon, 20 Apr 2015 20:43:38 -0400
> > > "R. David Murray" wrote:
> > > > +1 to this from me too.  I'm afraid that means I'm -1 on the PEP.
> > > >
> > > > I didn't write this in my earlier email because I wasn't sure about it,
> > > > but my gut reaction after reading Harry's email was "if type annotations
> > > > are used in the stdlib, I'll probably stop contributing".  That doesn't
> > > > mean that's *true*, but that's the first time I've ever had that
> > > > thought, so it is probably worth sharing.
> > >
> > > I think it would be nice to know what the PEP means for daily stdlib
> > > development. If patches have to carry typing information each time they
> > > add/enhance an API that's an addition burden. If typing is done
> > > separately by interested people then it sounds like it wouldn't have
> > > much of an impact on everyone else's workflow.
> > > > This point will be moot until new code appears in the stdlib whose author > > likes type hints. As I said, we won't be converting existing code to add > > type hints (I'm actively against that for the stdlib, for reasons I've > > explained already). > > I was thinking of potential stub files. Or are you also putting a ban > on those for existing stdlib code? Sorry if that has already been > answered... > No ban, but for now the plan is to collect and manage those separately as part of typeshed. I imagine the typical failure case is where the stubs are more restrictive than the implementation -- after all if the implementation becomes *more* restrictive it's breaking backward compatibility. I think it's okay for the stubs to lag behind like that, so I don't expect pressure on stdlib contributors. If anything the pressure will be on typeshed. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From dholth at gmail.com Tue Apr 21 19:34:30 2015 From: dholth at gmail.com (Daniel Holth) Date: Tue, 21 Apr 2015 13:34:30 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> Message-ID: On Tue, Apr 21, 2015 at 1:10 PM, Guido van Rossum wrote: > On Tue, Apr 21, 2015 at 9:17 AM, R. David Murray > wrote: >> >> Please be respectful rather than inflammatory. If you read what I >> wrote, I did not say that I was going to stop contributing, I >> specifically talked about that gut reaction being both emotional and >> illogical. That doesn't make the reaction any less real, and the fact >> that such reactions exist is a data point you should consider in >> conducting your PR campaign for this issue. (I don't mean that last as >> a negative: this issue *requires* an honest PR campaign.) > > > Well, my own reactions at this point in the flame war are also quite > emotional. :-( > > I have done my best in being honest in my PR campaign. But I feel like the > opposition (not you, but definitely some others -- have you seen Twitter?) > are spreading FUD based on an irrational conviction that this will destroy > Python. It will not. It may not prove the solution to all Python's problems > -- there's always 3.6. (Oh wait, Python 2.7 is perfect. I've heard that > before -- Paul Everitt famously said the same of Python 1.5.2. Aren't you > glad I didn't take him literally? :-P ) > > -- > --Guido van Rossum (python.org/~guido) I am looking forward to using type annotations. I am not very good at remembering the operations that are available on the objects passed around in my code, but I am very good at typing CTRL-space. To get that I am happy to modify my code with weird docstrings or any other notation. Good support for completion, aided by standard annotations, eliminates a huge amount of cross-referencing while coding. I'm also hopeful that static type checking, aided with annotations, will help with unicode porting. Duck typing does not work very well when you are trying to differentiate between bytes and str. 
Also, Python 1.5.2 was pretty good :-) From solipsis at pitrou.net Tue Apr 21 19:36:31 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 21 Apr 2015 19:36:31 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> Message-ID: <20150421193631.3f96987c@fsol> On Tue, 21 Apr 2015 18:27:50 +0300 Paul Sokolovsky wrote: > Let me try: MicroPython already uses type annotations for statically > typed functions. E.g. > > def add(x:int, y:int): > return x + y > > will translate the function to just 2 machine instructions. That's quite nice. > Oh really, you care to support single-precisions in Numba? Numba is quite specialized. In our area, single-precision arithmetic can sometimes give double the speed on modern CPUs (especially when LLVM is able to vectorize the code). The speedup can be even larger on a GPU (where double-precision execution resources are scarce). I agree it doesn't make sense for a general-purpose Python compiler. > Anyway, back to your example, it would be done like: > > SFloat = float > DFloat = float > > For a random tool out there, "SFloat" and "DFloat" would be just > aliases to floats, but Numba will know they have additional semantics > behind them. (That assumes that typedefs like SFloat can be accessed in > symbolic form - that's certainly possible if you have your own > parser/VM, but might worth to think how to do it on "CPython" level). Fortunately, we don't have our own parser :-) We work from the CPython bytecode. Regards Antoine. From rdmurray at bitdance.com Tue Apr 21 19:49:35 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 21 Apr 2015 13:49:35 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: <20150421174936.4B861250E5B@webabinitio.net> On Tue, 21 Apr 2015 16:55:49 -0000, "Gregory P. Smith" wrote: > We will not be putting type annotations anywhere in the stdlib or expecting > anyone else to maintain them there. That would never happen until tools > that are convincing enough in their utility for developers to _want_ to use > are available and accepted. That'll be a developer workflow thing we could > address with a later PEP. IF it happens at all. This is the most reassuring statement I've heard so far ;) --David From donald at stufft.io Tue Apr 21 19:54:55 2015 From: donald at stufft.io (Donald Stufft) Date: Tue, 21 Apr 2015 13:54:55 -0400 Subject: [Python-Dev] Python 3.x Adoption for PyPI and PyPI Download Numbers Message-ID: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> Just thought I'd share this since it shows how what people are using to download things from PyPI have changed over the past year. Of particular interest to most people will be the final graphs showing what percentage of downloads from PyPI are for Python 3.x or 2.x. As always it's good to keep in mind, "Lies, Damn Lies, and Statistics". I've tried not to bias the results too much, but some bias is unavoidable. Of particular note is that a lot of these numbers come from pip, and as of version 6.0 of pip, pip will cache downloads by default. 
This would mean that older versions of pip are more likely to "inflate" the downloads than newer versions since they don't cache by default. In addition if a project has a file which is used for both 2.x and 3.x and they do a ``pip install`` on the 2.x version first then it will show up as counted under 2.x but not 3.x due to caching (and of course the inverse is true, if they install on 3.x first it won't show up on 2.x). Here's the link: https://caremad.io/2015/04/a-year-of-pypi-downloads/ Anyways, I'll have access to the data set for another day or two before I shut down the (expensive) server that I have to use to crunch the numbers so if there's anything anyone else wants to see before I shut it down, speak up soon. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From rdmurray at bitdance.com Tue Apr 21 19:59:33 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 21 Apr 2015 13:59:33 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> Message-ID: <20150421175933.96494250E5B@webabinitio.net> On Tue, 21 Apr 2015 10:10:06 -0700, Guido van Rossum wrote: > On Tue, Apr 21, 2015 at 9:17 AM, R. David Murray > wrote: > > > Please be respectful rather than inflammatory. If you read what I > > wrote, I did not say that I was going to stop contributing, I > > specifically talked about that gut reaction being both emotional and > > illogical. That doesn't make the reaction any less real, and the fact > > that such reactions exist is a data point you should consider in > > conducting your PR campaign for this issue. (I don't mean that last as > > a negative: this issue *requires* an honest PR campaign.) > > > > Well, my own reactions at this point in the flame war are also quite > emotional. :-( > > I have done my best in being honest in my PR campaign. But I feel like the I believe you have. My inclusion of the word 'honest' was meant to contrast the kind of PR we need with the kind of PR people typically think about, which is often not particularly honest. > opposition (not you, but definitely some others -- have you seen Twitter?) > are spreading FUD based on an irrational conviction that this will destroy No, I tend only to peek in there occasionally. > Python. It will not. It may not prove the solution to all Python's problems > -- there's always 3.6. (Oh wait, Python 2.7 is perfect. I've heard that > before -- Paul Everitt famously said the same of Python 1.5.2. Aren't you > glad I didn't take him literally? :-P ) Yes. But somewhere there or not long before was my introduction to Python, so I remember it fondly :) --David From guido at python.org Tue Apr 21 20:01:15 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 11:01:15 -0700 Subject: [Python-Dev] Python 3.x Adoption for PyPI and PyPI Download Numbers In-Reply-To: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> References: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> Message-ID: Thanks for the detailed research! 
On Tue, Apr 21, 2015 at 10:54 AM, Donald Stufft wrote: > Just thought I'd share this since it shows how what people are using to > download things from PyPI have changed over the past year. Of particular > interest to most people will be the final graphs showing what percentage of > downloads from PyPI are for Python 3.x or 2.x. > > As always it's good to keep in mind, "Lies, Damn Lies, and Statistics". > I've > tried not to bias the results too much, but some bias is unavoidable. Of > particular note is that a lot of these numbers come from pip, and as of > version > 6.0 of pip, pip will cache downloads by default. This would mean that older > versions of pip are more likely to "inflate" the downloads than newer > versions > since they don't cache by default. In addition if a project has a file > which > is used for both 2.x and 3.x and they do a ``pip install`` on the 2.x > version > first then it will show up as counted under 2.x but not 3.x due to caching > (and > of course the inverse is true, if they install on 3.x first it won't show > up > on 2.x). > > Here's the link: https://caremad.io/2015/04/a-year-of-pypi-downloads/ > > Anyways, I'll have access to the data set for another day or two before I > shut down the (expensive) server that I have to use to crunch the numbers > so if > there's anything anyone else wants to see before I shut it down, speak up > soon. > > --- > Donald Stufft > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From willingc at willingconsulting.com Tue Apr 21 19:03:32 2015 From: willingc at willingconsulting.com (Carol Willing) Date: Tue, 21 Apr 2015 10:03:32 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421161701.E06C0B20095@webabinitio.net> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> Message-ID: <553682E4.2020306@willingconsulting.com> On 4/21/15 9:17 AM, R. David Murray wrote: > Please be respectful rather than inflammatory. Thank you David. > If you read what I > wrote, I did not say that I was going to stop contributing, I > specifically talked about that gut reaction being both emotional and > illogical. That doesn't make the reaction any less real, and the fact > that such reactions exist is a data point you should consider in > conducting your PR campaign for this issue. (I don't mean that last as > a negative: this issue *requires* an honest PR campaign.) As David stated, gut reactions are real. These reactions have the potential, if listened to and respected, to lead toward an optimal (not ideal) solution. Likely, the implementation of optional static type checking will evolve from reasoned, respectful debate of the issues not inflammatory quotes. Quite frankly, belittling someone's understanding or knowledge does not serve the PEP or technical issues at hand. I like the option of static type checking for critical high availability and high reliability applications (i.e. air traffic control, financial transactions). 
I'm less interested in static type checking of inputs when prototyping
or developing less critical applications.

There have been good technical points made by many on this thread
especially given the different use cases. These use cases, and likely a
few more, are important to an honest, continued technical refinement of
the PEP.

Two areas of clarification would be helpful for me:

1. Optional: What does this really mean in practice? Am I opting in to
static type checking and type hints? Or must I opt out of type hints?
Having to opt out would probably put more burden on the educational use
cases than opting in would for a large corporate project.

2. Clearly, great thought has been put into this PEP. If anyone has a
good analysis of the potential impact on Python 3 adoption, please do
pass along. I would be interested in reading the information.

Warmly,

Carol

P.S. I do remember a time when tools to easily check for memory leaks
in C were new and magical. Looking at the Coverity scans, I'm glad that
the old magic is reaping real benefits.

-- 
*Carol Willing*
Developer | Willing Consulting
https://willingconsulting.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From guido at python.org  Tue Apr 21 20:23:57 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 21 Apr 2015 11:23:57 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <553682E4.2020306@willingconsulting.com>
References: <55362CCF.3020809@btinternet.com>
	<20150421124720.GK5663@ando.pearwood.info>
	<20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230>
	<55365AA7.3060500@python.org> <20150421182750.3efcb597@x230>
	<20150421161701.E06C0B20095@webabinitio.net>
	<553682E4.2020306@willingconsulting.com>
Message-ID: 

On Tue, Apr 21, 2015 at 10:03 AM, Carol Willing <
willingc at willingconsulting.com> wrote:

>
> Two areas of clarification would be helpful for me:
>
> 1. Optional: What does this really mean in practice? Am I opting in to
> static type checking and type hints? Or must I opt out of type hints?
> Having to opt out would probably put more burden on the educational use
> cases than opting in would for a large corporate project.
>

You're definitely opting in. You must do two things to opt in: (a) add
type hints to some of your signatures; (b) run a type checker over your
code. You will only get the benefits of type hints if you do both, and
there is no pressure to opt in.

If someone else adds type hints to their code, which you import, but you
don't run the type checker, nothing changes for you when you run your
code (you can still get away with things that happen to work even if the
type checker would reject them). When you are reading their source code,
you will see their type hints -- like every other aspect of writing good
code, some people's type hints will be readable, while others' less so.
At least nobody will be writing type hints in Cyrillic. :-)

>
> 2. Clearly, great thought has been put into this PEP. If anyone has a good
> analysis of the potential impact on Python 3 adoption, please do pass
> along. I would be interested in reading the information.
>

I wish I had a crystal ball, but this is hard to predict. Anecdotally,
some people believe this will be catnip, while others believe it to be
poison. The truth will surely be somewhere in the middle. At this point
we don't know what drives Python 3 adoption except time -- it's
definitely going up.
:-)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From pmiscml at gmail.com  Tue Apr 21 20:31:49 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Tue, 21 Apr 2015 21:31:49 +0300
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150421165059.GA4073@stoneleaf.us>
References: <55362CCF.3020809@btinternet.com>
	<20150421124720.GK5663@ando.pearwood.info>
	<20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230>
	<55365AA7.3060500@python.org> <20150421182750.3efcb597@x230>
	<20150421165059.GA4073@stoneleaf.us>
Message-ID: <20150421213149.7955c8b0@x230>

Hello,

On Tue, 21 Apr 2015 09:50:59 -0700
Ethan Furman wrote:

> On 04/21, Paul Sokolovsky wrote:
> >
> > And for example yesterday's big theme was people blackmailing that
> > they stop contributing to stdlib if annotations are in [...]
>
> A volunteer's honest reaction is not blackmail, and categorizing it
> as such is not helpful to the discussion.

Sure, that was a rather humoresque note. Still, one may wonder why the
"honest reaction" is like that, if from reading PEP484 it's clear that
it doesn't change the status quo:
https://www.python.org/dev/peps/pep-3107 added annotations long ago,
and PEP484 just provides default semantics for them. Note that
*default*, not the "only". PEP484 is full of conditions and long
transition windows.

Did PEP3107 shatter everyone's world? No. And PEP484 is nothing but a
logical continuation of PEP3107, coming forward with real use for
annotations, but not intended to shatter everyone's world.

Well, hopefully Guido now clarified what was already written in PEP484
(or not written, like nowhere it says "we add requirement of mandatory
typehints in version 3.x [of stdlib or otherwise]").

But now people try to come up with *anti*-patterns on how to use type
annotation and argue that these anti-patterns are not useful. Surely,
annotations are useful in some places and not useful in others.
requests doesn't need them? Good. But it's quite useful to annotate
FFT routines and subroutines as taking arrays of floats, we can't get
"faster than C"(tm) without that, like some other languages did (or
claim to have done, and now look attractive). (You don't do FFT in
Python? OMG, that's old.)


-- 
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From alexander.belopolsky at gmail.com  Tue Apr 21 20:37:32 2015
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Tue, 21 Apr 2015 14:37:32 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: 
References: <55362CCF.3020809@btinternet.com>
	<20150421124720.GK5663@ando.pearwood.info>
	<20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230>
	<55365AA7.3060500@python.org> <20150421182750.3efcb597@x230>
	<20150421161701.E06C0B20095@webabinitio.net>
	<553682E4.2020306@willingconsulting.com>
Message-ID: 

On Tue, Apr 21, 2015 at 2:23 PM, Guido van Rossum wrote:

> At least nobody will be writing type hints in Cyrillic. :-)

Why not?  It works just fine:

>>> список = list
>>> def sum(x: список):
...     pass
...
>>>

(See https://en.wikipedia.org/wiki/Rapira for some prior art.)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From taleinat at gmail.com  Tue Apr 21 20:44:16 2015
From: taleinat at gmail.com (Tal Einat)
Date: Tue, 21 Apr 2015 21:44:16 +0300
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <5536897B.80006@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
	<53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
	<5536897B.80006@hastings.org>
Message-ID: 

On Tue, Apr 21, 2015 at 8:31 PM, Larry Hastings wrote:
>
> On 04/21/2015 04:50 AM, Tal Einat wrote:
>
> As for the default set of accepted types for various convertors, if we
> could choose any syntax we liked, something like "accept=+{NoneType}"
> would be much better IMO.
>
>
> In theory Argument Clinic could use any syntax it likes.  In practice, under
> the covers we tease out one or two bits of non-Python syntax, then run
> ast.parse over it.  Saved us a lot of work.
>
> "s: accept={str,NoneType}" is a legal Python parameter declaration; "s:
> accept+={NoneType}" is not.  If I could figure out a clean way to hack in
> support for += I'll support it.  Otherwise you'll be forced to spell it out.

Actually, I wrote "accept=+{NoneType}" - note the plus is *after* the
equal sign. This is a valid Python assignment expression. (The unary
addition operator is not defined for sets, however.)

- Tal Einat

From lukasz at langa.pl  Tue Apr 21 20:55:49 2015
From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=)
Date: Tue, 21 Apr 2015 11:55:49 -0700
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: 
References: <55362CCF.3020809@btinternet.com>
	<20150421124720.GK5663@ando.pearwood.info>
	<20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230>
	<55365AA7.3060500@python.org> <20150421182750.3efcb597@x230>
	<20150421161701.E06C0B20095@webabinitio.net>
	<553682E4.2020306@willingconsulting.com>
Message-ID: <7E5349E5-3EC1-4E9B-ACB4-3319C84DF003@langa.pl>

> On Apr 21, 2015, at 11:23 AM, Guido van Rossum wrote:
>
> > 2. Clearly, great thought has been put into this PEP. If anyone has a
> > good analysis of the potential impact on Python 3 adoption, please do
> > pass along. I would be interested in reading the information.
>
> I wish I had a crystal ball, but this is hard to predict. Anecdotally,
> some people believe this will be catnip, while others believe it to be
> poison. The truth will surely be somewhere in the middle. At this point
> we don't know what drives Python 3 adoption except time -- it's
> definitely going up. :-)

Anecdotal evidence shows that some organizations perceive this feature
as one that justifies migration. Some of those organizations are pretty
serious about open-sourcing things. That makes me believe that by sheer
volume of the code they're producing, Python 3 adoption will continue
to increase.

As Gregory Smith rightfully pointed out, nobody wants ugly code. I
understand why people are afraid of that and it warms my heart that
they are. The community cares so much about aesthetics and readability,
it's great!

We will evolve this over time. This will be a learning process for
everybody but we can't learn to swim by only theorizing about it. We
thought of the evolving nature of the solution from Day 1, hence the
*provisional* nature of it. The wordy syntax is another example of
that. Not requiring changes to the interpreter and the standard library
was very high on the list of priorities. Once the *concept* proves
itself, then we can improve on the syntax.

Acknowledging PEP 484 being just the second step^ in a long journey is
why some "obvious" parts are left out for now (hello, duck typing;
hello, multiple dispatch; hello, runtime type checks in unit tests;
etc. etc.). Those are often big enough to warrant their own PEPs.
^ PEP 3107 being the first. -- Lukasz Langa | Facebook Production Engineer | Global Consistency (+1) 650-681-7811 -------------- next part -------------- An HTML attachment was scrubbed... URL: From pmiscml at gmail.com Tue Apr 21 20:58:20 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 21 Apr 2015 21:58:20 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421161701.E06C0B20095@webabinitio.net> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> Message-ID: <20150421215820.50e543b8@x230> Hello, On Tue, 21 Apr 2015 12:17:01 -0400 "R. David Murray" wrote: > On Tue, 21 Apr 2015 18:27:50 +0300, Paul Sokolovsky > wrote: > > > I was replying to Steven's message. Did you read it? > > > > Yes. And I try to follow general course of discussion, as its hard > > to follow individual sub-threads. And for example yesterday's big > > theme was people blackmailing that they stop contributing to stdlib > > if annotations are in, and today's theme appear to be people > > telling that static type checking won't be useful. And just your > > reply to Steven was a final straw which prompted me to point out > > that static type checking is not a crux of it, but just one (not > > the biggest IMHO) usage. > > Please be respectful rather than inflammatory. If you read what I > wrote, I did not say that I was going to stop contributing, I > specifically talked about that gut reaction being both emotional and > illogical. Sure, and I didn't mean you personally - your original message conveyed that it was "gut feeling" very well. More like following to your message, where people gave "+1". To a gut feeling? To the thought of stopping contribution? I dunno, and that "blackmail" note wasn't intended to be more serious than "burn the witch" and other funny stylistic devices which already appeared and make the discussion a bit more lively ;-). > That doesn't make the reaction any less real, and the fact > that such reactions exist is a data point you should consider in > conducting your PR campaign for this issue. (I don't mean that last > as a negative: this issue *requires* an honest PR campaign.) It does, and hope people won't be caught in "static typechecking" loop and consider other usages too. > > --David -- Best regards, Paul mailto:pmiscml at gmail.com From a.badger at gmail.com Tue Apr 21 21:15:00 2015 From: a.badger at gmail.com (Toshio Kuratomi) Date: Tue, 21 Apr 2015 12:15:00 -0700 Subject: [Python-Dev] [Distutils] Python 3.x Adoption for PyPI and PyPI Download Numbers In-Reply-To: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> References: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> Message-ID: <20150421191500.GA10283@roan.lan> On Tue, Apr 21, 2015 at 01:54:55PM -0400, Donald Stufft wrote: > > Anyways, I'll have access to the data set for another day or two before I > shut down the (expensive) server that I have to use to crunch the numbers so if > there's anything anyone else wants to see before I shut it down, speak up soon. > Where are curl and wget getting categorized in the User Agent graphs? Just morbidly curious as to whether they're in with Browser and therefore mostly unused or Unknown and therefore only slightly less unused ;-) -Toshio -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 181 bytes Desc: not available URL: From donald at stufft.io Tue Apr 21 21:21:49 2015 From: donald at stufft.io (Donald Stufft) Date: Tue, 21 Apr 2015 15:21:49 -0400 Subject: [Python-Dev] [Distutils] Python 3.x Adoption for PyPI and PyPI Download Numbers In-Reply-To: <20150421191500.GA10283@roan.lan> References: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> <20150421191500.GA10283@roan.lan> Message-ID: > On Apr 21, 2015, at 3:15 PM, Toshio Kuratomi wrote: > > On Tue, Apr 21, 2015 at 01:54:55PM -0400, Donald Stufft wrote: >> >> Anyways, I'll have access to the data set for another day or two before I >> shut down the (expensive) server that I have to use to crunch the numbers so if >> there's anything anyone else wants to see before I shut it down, speak up soon. >> > Where are curl and wget getting categorized in the User Agent graphs? > > Just morbidly curious as to whether they're in with Browser and therefore > mostly unused or Unknown and therefore only slightly less unused ;-) They get classified as Unknown, here?s the hacky script I use to parse the log files: https://bpaste.net/show/515017c78e32 --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From p.f.moore at gmail.com Tue Apr 21 21:23:56 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 20:23:56 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On 21 April 2015 at 17:55, Gregory P. Smith wrote: > I view most of this thread as FUD. The fear is understandable, I'm trying to > tell people to stop panicing. I think (hope!) everyone is clear that what's being expressed in this thread is honest (emotional) reactions. There's a negative connotation to the term FUD that's uncomfortable in this context, although it's understandable that the authors and supporters of the PEP feel frustrated about the feedback, so merely using terms with negative connotations is a pretty measured response, actually :-) We've got people expressing fear and others telling them not to panic. Realistically, just being told not to panic won't help. What will help is feeling that the fears are understood and being considered, even if ultimately the response is "we think it'll be alright - wait and see". Most people contributing to this thread have probably not been involved in the earlier discussions (I know I tuned out of the threads on python-ideas) so are now reacting at short notice to what looks like a big change. Things should calm down in due course, but "what the heck is all this?" reactions are to be expected. And honestly, the PEP is pretty difficult to read. That's probably the nature of the subject (my eyes glaze over when reading about type theory) but it doesn't help. When I read the PEP, after a few sections I found myself losing focus because I kept thinking "but what about X?" Result - I didn't manage to read the whole PEP and came away with a feeling that all those things I thought of "couldn't be handled". Not a positive reaction, unfortunately. 
From p.f.moore at gmail.com Tue Apr 21 21:23:56 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 20:23:56 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On 21 April 2015 at 17:55, Gregory P. Smith wrote: > I view most of this thread as FUD. The fear is understandable, I'm trying to > tell people to stop panicking. I think (hope!) everyone is clear that what's being expressed in this thread is honest (emotional) reactions. There's a negative connotation to the term FUD that's uncomfortable in this context, although it's understandable that the authors and supporters of the PEP feel frustrated about the feedback, so merely using terms with negative connotations is a pretty measured response, actually :-) We've got people expressing fear and others telling them not to panic. Realistically, just being told not to panic won't help. What will help is feeling that the fears are understood and being considered, even if ultimately the response is "we think it'll be alright - wait and see". Most people contributing to this thread have probably not been involved in the earlier discussions (I know I tuned out of the threads on python-ideas) so are now reacting at short notice to what looks like a big change. Things should calm down in due course, but "what the heck is all this?" reactions are to be expected. And honestly, the PEP is pretty difficult to read. That's probably the nature of the subject (my eyes glaze over when reading about type theory) but it doesn't help. When I read the PEP, after a few sections I found myself losing focus because I kept thinking "but what about X?" Result - I didn't manage to read the whole PEP and came away with a feeling that all those things I thought of "couldn't be handled". Not a positive reaction, unfortunately. That's the fault of the reader rather than the PEP, but maybe the PEP could include a "common concerns/FAQ" section, with discussions of the issues people are worried about? One final point. The overwhelming feeling I'm getting about the debate is that the main response to people with concerns is "you don't have to use it" and "it's optional". But that's not the point - we're talking past each other to an extent. I (and I presume most others) understand the fact that type hints are optional. We may or may not use them ourselves (I'm not actually ruling out that I might end up liking them!) But that's not the concern - what we're trying to understand is how we, as a community of programmers, and an ecosystem of interdependent codebases, deal with the possibility that there could be two differing philosophies (to hint or not to hint) trying to work together. Maybe it'll be a non-event, and we carry on as usual. Maybe nobody will use types in public code for years, because we all end up having to support 2.x whether we want to or not. Maybe we shouldn't try to solve big social issues like that and we should just trust in the fact that Python has a great community who *will* find a way to work together. I don't know, but that's my real question, and "it's optional" isn't really the response I'm looking for. (The response I probably deserve is likely "don't take it all so seriously", and it's probably fair. I apologise - it's been that sort of day :-)) Paul From rdmurray at bitdance.com Tue Apr 21 21:28:47 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 21 Apr 2015 15:28:47 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421213149.7955c8b0@x230> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421165059.GA4073@stoneleaf.us> <20150421213149.7955c8b0@x230> Message-ID: <20150421192848.14316B500F6@webabinitio.net> On Tue, 21 Apr 2015 21:31:49 +0300, Paul Sokolovsky wrote: > On Tue, 21 Apr 2015 09:50:59 -0700 Ethan Furman wrote: > > > On 04/21, Paul Sokolovsky wrote: > > > > > > And for example yesterday's big theme was people blackmailing that > > > they stop contributing to stdlib if annotations are in [...] > > > > A volunteer's honest reaction is not blackmail, and categorizing it > > as such is not helpful to the discussion. > > Sure, that was a rather humoresque note. Still, one may wonder why > "honest reaction" is like that, if from reading PEP484 it's clear that > it doesn't change the status quo: https://www.python.org/dev/peps/pep-3107 > added annotations long ago, and PEP484 just provides default But what concerned me as a Python core developer was the perception that as a core developer I would have to deal with type hints and type checking in the stdlib, because there was talk of including type hints for the stdlib (as stub files) in 3.6. *That* was the source of my concern, which is reduced by the statement that we'll only need to think about type checking in the stdlib once the tools are *clearly* adding value, in *our* (the stdlib module maintainers') opinion, so that we *want* to use the type hints. The discussion of stub files being maintained by interested parties in a separate repository is also helpful. Again, that means that wider adoption should mostly only happen if the developers see real benefit.
I still dislike the idea of having to read type hints (I agree with whoever it was who said a well written docstring is more helpful than abstract type hints when reading Python code), but perhaps I will get used to it if/when it shows up in libraries I use. I think it would serve your goal better if instead of dismissing concerns you were to strive to understand them and then figure out how to allay them. Overall, I suspect that those who are doubtful are put off by the "rah rah this is great you are fools for not wanting to use it" tone of (some) of the proponents (which makes us think this is going to get pushed on us willy-nilly), whereas hearing a consistent message that "this will *enable* interested parties to collaborate on type checking without impacting the way the rest of you code *unless it proves its value to you*" would be better received. Guido has done the latter, IMO, as have some others. --David From p.f.moore at gmail.com Tue Apr 21 22:02:45 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 21:02:45 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <7E5349E5-3EC1-4E9B-ACB4-3319C84DF003@langa.pl> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> <553682E4.2020306@willingconsulting.com> <7E5349E5-3EC1-4E9B-ACB4-3319C84DF003@langa.pl> Message-ID: (Gmail messed up the attributions - apologies if I didn't fix them up correctly). On 21 April 2015 at 19:55, Łukasz Langa wrote: >> >> On Apr 21, 2015, at 11:23 AM, Guido van Rossum wrote: >> >>> 2. Clearly, great thought has been put into this PEP. If anyone has a good >>> analysis of the potential impact on Python 3 adoption, please do pass along. >>> I would be interested in reading the information. >> >> I wish I had a crystal ball, but this is hard to predict. Anecdotally, some >> people believe this will be catnip, while others believe it to be poison. >> The truth will surely be somewhere in the middle. At this point we don't >> know what drives Python 3 adoption except time -- it's definitely going up. >> :-) >> > > Anecdotal evidence shows that some organizations perceive this feature as > one that justifies migration. Some of those organizations are pretty serious > about open-sourcing things. That makes me believe that by sheer volume of > the code they're producing, Python 3 adoption will continue to increase. > > As Gregory Smith rightfully pointed out, nobody wants ugly code. I > understand why people are afraid of that and it warms my heart that they > are. The community cares so much about aesthetics and readability, it's > great! > > We will evolve this over time. This will be a learning process for everybody > but we can't learn to swim by only theorizing about it. We thought of the > evolving nature of the solution from Day 1, hence the *provisional* nature > of it. The wordy syntax is another example of that. Not requiring changes to > the interpreter and the standard library was very high on the list of > priorities. Once the *concept* proves itself, then we can improve on the > syntax. > > Acknowledging PEP 484 being just the second step^ in a long journey is why > some "obvious" parts are left out for now (hello, duck typing; hello, > multiple dispatch; hello, runtime type checks in unit tests; etc. etc.).
> Those are often big enough to warrant their own PEPs. > > ^ PEP 3107 being the first. Thank you for this response. For some reason, it's reassured me a lot (I've no idea really why it struck a chord with me more than any of the other responses in the thread, just one of those things, I guess). Paul From p.f.moore at gmail.com Tue Apr 21 22:13:11 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 21:13:11 +0100 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: <5534B5C0.2030800@hastings.org> References: <5534B5C0.2030800@hastings.org> Message-ID: On 20 April 2015 at 09:16, Larry Hastings wrote: > There is now a third type of Windows installer for Python 3.5. In addition > to the conventional installer and the web-based installer, Python 3.5 now > has an embeddable installer designed to be run as part of a larger > application's installer for apps using or extending Python. Probably a question for Steve mostly, but how does this work, exactly? I see it's an exe - I was sort of expecting a zip file. Specifically, I was considering using this to make a build of Vim, with Python support, using a copy of Python included with Vim (so no reliance on there being a "system" install of Python). As Vim is open source, "build it yourself", I typically just compile vim and then zip up the application directory. So I'm not clear how an exe "embeddable" installer fits into that scheme. Apologies if the exe is just a self-extracting archive, and the answer is just "run it and tell it where to dump the files". I've not even had a chance to get to somewhere I can download the installer to try it yet :-) Paul From robertc at robertcollins.net Tue Apr 21 22:18:43 2015 From: robertc at robertcollins.net (Robert Collins) Date: Wed, 22 Apr 2015 08:18:43 +1200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On 22 April 2015 at 04:28, Guido van Rossum wrote: > On Tue, Apr 21, 2015 at 12:49 AM, Antoine Pitrou > wrote: >> >> On Mon, 20 Apr 2015 20:43:38 -0400 >> "R. David Murray" wrote: >> > +1 to this from me too. I'm afraid that means I'm -1 on the PEP. >> > >> > I didn't write this in my earlier email because I wasn't sure about it, >> > but my gut reaction after reading Harry's email was "if type annotations >> > are used in the stdlib, I'll probably stop contributing". That doesn't >> > mean that's *true*, but that's the first time I've ever had that >> > thought, so it is probably worth sharing. >> >> I think it would be nice to know what the PEP means for daily stdlib >> development. If patches have to carry typing information each time they >> add/enhance an API that's an additional burden. If typing is done >> separately by interested people then it sounds like it wouldn't have >> much of an impact on everyone else's workflow. > > This point will be moot until new code appears in the stdlib whose author > likes type hints. As I said, we won't be converting existing code to add > type hints (I'm actively against that for the stdlib, for reasons I've > explained already). > > *If* type hints prove useful, I expect that adding type hints **to code that > deserves them** is treated no different in the workflow than adding tests or > docs. I.e. something that is the right thing to do because it has obvious > benefits for users and/or future maintainers.
If at some point running a > type checker over the stdlib as part of continuous integration becomes > routine, type hints can also replace certain silly tests. > > Until some point in a possible but distant future when we're all thinking > back fondly about the argument we're currently having, it will be the choice > of the author of new (and *only* new) stdlib modules whether and how to use > type hints. Such a hypothetical author would also be reviewing updates to > "their" module and point out lack of type hints just like you might point > out an incomplete docstring, an outdated comment, or a missing test. (The > type checker would be responsible for pointing out bugs. :-P ) What about major changes to existing modules? I have a backlog of intended feature uplifts from testtools into unittest - if the type hints thing works out I am likely to put them into testtools. What's your view on type hints to such *new code* in existing modules? -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From guido at python.org Tue Apr 21 22:26:04 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 21 Apr 2015 13:26:04 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On Tue, Apr 21, 2015 at 1:18 PM, Robert Collins wrote: > On 22 April 2015 at 04:28, Guido van Rossum wrote: > > Until some point in a possible but distant future when we're all thinking > > back fondly about the argument we're currently having, it will be the > choice > > of the author of new (and *only* new) stdlib modules whether and how to > use > > type hints. Such a hypothetical author would also be reviewing updates to > > "their" module and point out lack of type hints just like you might point > > out an incomplete docstring, an outdated comment, or a missing test. (The > > type checker would be responsible for pointing out bugs. :-P ) > > What about major changes to existing modules? I have a backlog of > intended feature uplifts from testtools into unittest - if the type > hints thing works out I am likely to put them into testtools. What's > your view on type hints to such *new code* in existing modules? > In the end this should be up to you and the reviewers, but for such a venerable module like unittest I'd be hesitant to be an early adopter. I'd also expect that much of unittest is too dynamic in nature to benefit from type hints. But maybe you should just try to use them for testtools and see for yourself how beneficial or cumbersome they are in that particular case? -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From damien.p.george at gmail.com Tue Apr 21 23:20:59 2015 From: damien.p.george at gmail.com (Damien George) Date: Tue, 21 Apr 2015 22:20:59 +0100 Subject: [Python-Dev] async/await in Python; v2 Message-ID: Hi Yury, In your PEP 492 draft, in the Grammar section, I think you're missing the modifications to the "flow_stmt" line. Cheers, Damien.
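For readers skimming the PEP 492 subthread, a minimal sketch of the constructs under discussion (simplified, and based on the draft being debated here, which was still in flux at the time):

    import asyncio

    async def fetch(url):
        # 'async def' declares a coroutine; 'await' marks the points
        # where it may suspend until the awaited operation finishes.
        await asyncio.sleep(0.1)
        return 'payload from %s' % url

    async def main():
        print(await fetch('http://example.com'))

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

This division of labor (async marking the definition, await marking each suspension point) is the same one Ryan Hiebert spells out in the "why do we need two keywords" exchange further down.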
From p.f.moore at gmail.com Wed Apr 22 00:33:38 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 21 Apr 2015 23:33:38 +0100 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: <1429653904845.96438@microsoft.com> References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: On 21 April 2015 at 23:05, Steve Dower wrote: > It is indeed just a run-and-dump extractor. I haven't had a chance to write up any docs for it yet, but there are some open bugs I want to fix first (specifically http://bugs.python.org/issue23955) before this becomes too formalized. Thanks. See, I should have just run the random code I found on the internet and everything would have worked out alright :-) But seriously, thanks for doing this. I can see it being *really* useful. Paul From donald at stufft.io Wed Apr 22 00:35:59 2015 From: donald at stufft.io (Donald Stufft) Date: Tue, 21 Apr 2015 18:35:59 -0400 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: > On Apr 21, 2015, at 6:33 PM, Paul Moore wrote: > > On 21 April 2015 at 23:05, Steve Dower wrote: >> It is indeed just a run-and-dump extractor. I haven't had a chance to write up any docs for it yet, but there are some open bugs I want to fix first (specifically http://bugs.python.org/issue23955) before this becomes too formalized. > > Thanks. See, I should have just run the random code I found on the > internet and everything would have worked out alright :-) > > But seriously, thanks for doing this. I can see it being *really* useful. > Paul Is this version statically linked by any chance? --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From Steve.Dower at microsoft.com Wed Apr 22 00:05:04 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Tue, 21 Apr 2015 22:05:04 +0000 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org>, Message-ID: <1429653904845.96438@microsoft.com> Paul Moore wrote: > On 20 April 2015 at 09:16, Larry Hastings wrote: >> There is now a third type of Windows installer for Python 3.5. In addition >> to the conventional installer and the web-based installer, Python 3.5 now >> has an embeddable installer designed to be run as part of a larger >> application's installer for apps using or extending Python. > > Probably a question for Steve mostly, but how does this work, exactly? > I see it's an exe - I was sort of expecting a zip file. I made it a self-extracting RAR file so it could be signed, but I've already had multiple people query it so the next release will probably just be a plain ZIP file. I just need to figure out some reliable way of validating the download other than GPG, since I'd like installers to be able to do the download transparently and ideally without hard-coding hash values. I might add a CSV of SHA hashes to the zip too. > Specifically, I was considering using this to make a build of Vim, > with Python support, using a copy of Python included with Vim (so no > reliance on there being a "system" install of Python). 
As Vim is open > source, "build it yourself", I typically just compile vim and then zip > up the application directory. So I'm not clear how an exe "embeddable" > installer fits into that scheme. Apologies if the exe is just a > self-extracting archive, and the answer is just "run it and tell it > where to dump the files". I've not even had a chance to get to > somewhere I can download the installer to try it yet :-) It is indeed just a run-and-dump extractor. I haven't had a chance to write up any docs for it yet, but there are some open bugs I want to fix first (specifically http://bugs.python.org/issue23955) before this becomes too formalized. Cheers, Steve > Paul From chris.barker at noaa.gov Wed Apr 22 00:41:03 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 21 Apr 2015 15:41:03 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On Tue, Apr 21, 2015 at 2:33 AM, Cory Benfield wrote: > It seems like the only place the type annotations will get used is in > relatively trivial cases where the types are obvious anyway. I don't > deny that *some* bugs will be caught, but I suspect they'll > overwhelmingly be crass ones that would have been caught quickly > anyway. Well, it'll catch passing in a string instead of a sequence of strings -- one of the common and semi-insidious type errors I see a lot (at least with newbies). Oh wait, maybe it won't -- a string IS a sequence of strings. That's why this is an insidious bug in the first place: List[string] will work, yes -- but now I'm locked down the user to an actual, list, yes? So I try: Iterable[string] ah yes -- now users can use tuples of strings, or a custom class, or -- but wait, darn it, a string IS an iterable of strings.. SIGH. NOTE: I've thought for years that Python should have a character type for this reason, but that's a side note. Oh, and I'm a heavy numpy user. And, in fact, I write a lot of functions that are essentially statically typed -- i.e. they will only work with a Nx3 array of float64 for instance. However, in this case, I want run-time type checking, and I DON'T want the API to be statically typed, so I use something like: the_arr = np.asarray(input_object, dtype=np.float64).reshape(-1, 3) and in my docstring: :param input_object: the input data to compute something from :type input_object: (Nx3) numpy array of floats or something that can be turned into one. Is there any way for type hinting to help here??? - Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL:
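Chris's trap is easy to reproduce; a minimal sketch (the names are illustrative, not from any real codebase) shows why neither List[str] nor Iterable[str] rejects a lone string, and the runtime guard that remains the usual workaround:

    from typing import Iterable

    def join_names(names: Iterable[str]) -> str:
        # A str is itself an Iterable[str], so a static checker accepts
        # both calls below. PEP 484 as drafted has no way to say
        # "iterable of str, but not str itself", hence the runtime guard.
        if isinstance(names, str):
            raise TypeError('expected a collection of strings')
        return ', '.join(names)

    join_names(['Ada', 'Grace'])   # fine: 'Ada, Grace'
    join_names('Ada')              # type-checks, but raises at runtime

The numpy case is harder still: nothing in the proposal expresses dtype or shape, so the asarray/reshape normalization above stays a runtime affair.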
From chris.barker at noaa.gov Wed Apr 22 00:50:37 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 21 Apr 2015 15:50:37 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421215820.50e543b8@x230> References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> <20150421215820.50e543b8@x230> Message-ID: On Tue, Apr 21, 2015 at 11:58 AM, Paul Sokolovsky wrote: > It does, and hope people won't be caught in "static typechecking" > loop and consider other usages too. I'm confused -- from the bit I've been skimming the discussion, over on python-ideas, and now here, is that this is all about "static typechecking". It's not about run-time type checking. It's not about type-based performance optimization. It's not about any use of annotations other than types. What is it about other than static typechecking? -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed Apr 22 00:45:08 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 21 Apr 2015 15:45:08 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: Thank you Jack. Jack: "I hate code and I want as little of it as possible in our product" I love that quote -- and I ALWAYS use it when I teach newbies Python. It's kind of the point of Python -- you can get a lot done by writing very little code. I'm still confused about what all this type annotation is about -- yes, I certainly see lots of advantages to static typing -- but also many disadvantages. And one of them is a bunch more stuff to read and write. But key is: If you like typing, use a statically typed language. If you want the advantages of dynamic typing -- then why do you want all the extra clutter and "safety"? And it seems to me that optional/partial typing buys you much of the hassle and restrictions, and essentially NONE of the safety (type casting pointers in C, anyone?). And I've had this debate with proponents of JAVA and the like a bunch -- sure, type checking will catch bugs in your code -- but they are almost always shallow bugs -- and bugs that if your tests didn't catch them, you have really lame tests! On the other hand, you just can't get good performance doing low-level things with a dynamic language. So I use Cython, which, in a way, is Python with optional static typing. It gives you down to the metal performance, where, and only where, you need it, but not really much in the way of type checking. However, this is exactly the opposite of what everyone seems to be talking about using optional typing in Python for -- which is type checking, but not any run-time optimizations -- in fact, AFAICT, not even run-time type checking. So a whole lot of clutter for very little gain.
:-( NOTE: MyPy is out in the wild -- I'd be really interested to see how it all is really working out -- even on those "enterprise" code bases -- other than managers feeling better, are developers finding out that: "WOW -- this baby caught some nasty bugs that I would not have found with testing -- and would have been hard to debug after the fact!" But in the end, I agree with the OP here -- stub files let pre-run-time static type checking happen without cluttering up Python for the rest of us -- so a nice compromise. So I guess we'll see. - Chris On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich wrote: > Twelve years ago a wise man said to me "I suggest that you also propose a > new name for the resulting language" > > I talked with many of you at PyCon about the costs of PEP 484. There are > plenty of people who have done a fine job promoting the benefits. > > * It is not optional. Please stop saying that. The people promoting it > would prefer that everyone use it. If it is approved it will be optional in > the way that PEP8 is optional. If I'm reading your annotated code it is > certainly /not/ optional that I understand the annotations. > > * Uploading stubs for other people's code is a terrible idea. Who do I > contact when I update the interface to my library? The random Joe who > "helped" by uploading annotations three months ago and then quit the > internet? I don't even want to think about people maliciously adding stubs > to PyPI. > > * The cognitive load is very high. The average function signature will > double in length. This is not a small cost and telling me it is "optional" > to pretend that every other word on the line doesn't exist is a farce. > > * Every company's style guide is about to get much longer. That in itself > is an indicator that this is a MAJOR language change and not just some > "optional" add-on. > > * People will screw it up. The same people who can't be trusted to program > without type annotations are also going to be *writing* those type > annotations. > > * Teaching python is about to get much less attractive. It will not be > optional for teachers to say "just pretend all this stuff over here doesn't > exist" > > * "No new syntax" is a lie. Or rather a red herring. There are lots of new > things it will be required to know and just because the compiler doesn't > have to change doesn't mean the language isn't undergoing a major change. > > If this wasn't in a PEP and it wasn't going to ship in the stdlib very few > people would use it. If you told everyone they had to install a different > python implementation they wouldn't. This is much worse than that - it is > Python4 hidden away inside a PEP. > > There are many fine languages that have sophisticated type systems. And > many bondage & discipline languages that make you type things three times > to make really really sure you meant to type that. If you find those other > languages appealing I invite you to go use them instead. > > -Jack > > https://mail.python.org/pipermail/python-dev/2003-February/033291.html > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov > > -- Christopher Barker, Ph.D.
Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From jheiv at jheiv.com Wed Apr 22 00:23:28 2015 From: jheiv at jheiv.com (James Edwards) Date: Tue, 21 Apr 2015 18:23:28 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On Tue, Apr 21, 2015 at 8:47 AM, Cory Benfield wrote: > I'm talking from the position of being a library author, where supporting > versions of Python lower than 3.5 will be a reality for at least 5 more years. > I will not be able to inline my type hints, so they'll have to go in > stub files, and now we've got the same problem: type hints can go out > of step just as easily as documentation can. I imagine authors interested in both type hinting and backwards compatibility could write their code using the 3.0+ annotation syntax, while using the various backwards compatibility packages and schemes, then "split" the source file using some inevitably-available tool to generate both annotation-free source and associated stub files. The author maintains (directly, at least) files in a 1-to-1 ratio -- the fact that they are split prior to distribution is transparent (much like generating documentation from source / comments). Running such a preprocessing step may not be ideal, but it is certainly an option that the motivated package author would seem to have, if they were so inclined. -- FWIW, my gut reaction was the same as many on the list (the syntax can get verbose and negatively impact readability), but I think we're only seeing one side of the eventual coin, looking at individual, sometimes specifically chosen examples. In theory, if this "takes off", IDEs could do a lot to help reduce the burden. Maybe they provide a "read-only" view of functions/methods where annotations are hidden until you put the caret inside the block, at which point the hints are revealed. Maybe even inside the block they're hidden, until you attempt to edit the arguments. Maybe they're just always gathered up by the IDE and accessible through some panel widget. Some of these approaches may sound helpful to some, others not so much, but the point is that there are a lot of creative ways that IDEs *could* help ease the transition, if type hinting becomes of significant interest to the community.
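A sketch of the split James describes, with a hypothetical preprocessing tool (the file names and the tool itself are assumptions; only the .pyi stub convention comes from the PEP). The author maintains just the first file:

    # greeting.py, as maintained: 3.x annotation syntax throughout
    def greet(name: str, punctuation: str = '!') -> str:
        return 'Hello, ' + name + punctuation

    # What the hypothetical splitter would emit for distribution:
    #
    # greeting.py, annotation-free, importable on 2.x and 3.x:
    #     def greet(name, punctuation='!'):
    #         return 'Hello, ' + name + punctuation
    #
    # greeting.pyi, a stub read only by type checkers:
    #     def greet(name: str, punctuation: str = '!') -> str: ...

Since the stub is generated from the annotated source rather than written by hand, it cannot drift out of step the way separately maintained stubs (or docs) can.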
From ryan at ryanhiebert.com Wed Apr 22 00:30:06 2015 From: ryan at ryanhiebert.com (Ryan Hiebert) Date: Tue, 21 Apr 2015 17:30:06 -0500 Subject: [Python-Dev] async/await PEP In-Reply-To: References: <55315B32.5000205@gmail.com> Message-ID: <6D06D6D1-E476-4FAC-9A62-D3D05C3AAEBA@ryanhiebert.com> On Apr 21, 2015, at 3:23 AM, Martin Teichmann wrote: > > Hi Yury, Hi List, > > I do certainly like the idea of PEP 492, just some small comments: > > why do we need two keywords? To me it is not necessarily intuitive > when to use async and when to use await (why is it async for and not > await for?), so wouldn't it be much simpler, and more symmetric, to > just have one keyword? I personally prefer await for that, then it is > "await def", and "await for" (and "await with", etc.). Based on my understanding, you use ``async`` for blocks that can _use_ ``await``, and ``await`` to wait on an asynchronous action, such as calling an ``async`` function. From donald at stufft.io Wed Apr 22 01:27:47 2015 From: donald at stufft.io (Donald Stufft) Date: Tue, 21 Apr 2015 19:27:47 -0400 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: > On Apr 21, 2015, at 7:18 PM, Steve Dower wrote: > > Donald Stufft wrote: >> Is this version statically linked by any chance? > > No, there's no change to the compilation process, so you can get exactly the same result by using the regular installer and copying the files around. > > Not sure if this is what you're referring to, but on Windows DLLs are always loaded from alongside the executable before searching in other locations. I learned at PyCon that some platforms embed full paths for .so's (I'd always just assumed that there was a similar resolution process), but this is not the case here. Dropping all the DLLs and PYDs in the same location is sufficient to make sure you always load the right files, unless sys.path has been filled in with incorrect registry values (the bug I referenced). > > Cheers, > Steve My use case is using PyInstaller to attempt to create a single file executable. Doing it on Windows was giving a strange error message about msvcrt or something so I thought maybe I needed to statically compile Python. I was hoping maybe this thing would have done that for me so I didn't need to figure out how. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From ezio.melotti at gmail.com Wed Apr 22 01:36:42 2015 From: ezio.melotti at gmail.com (Ezio Melotti) Date: Wed, 22 Apr 2015 01:36:42 +0200 Subject: [Python-Dev] Starting CPython development w/ Docker In-Reply-To: References: Message-ID: On Mon, Apr 20, 2015 at 3:52 PM, Saul Shanabrook wrote: > I started trying some CPython development a week ago at PyCon and first got > testing working using Docker on my mac. This had the advantage of not having > to worry about installing and dependencies, and also let me test on > different Python versions easily. > > If you are interested in trying it, I laid out all the steps here: > http://www.saulshanabrook.com/cpython-dev-w-docker/ > Thanks for the link! I was just looking into the possibility of dockerizing bugs.python.org and what you wrote looks quite helpful. Best Regards, Ezio Melotti > Saul Shanabrook > From Steve.Dower at microsoft.com Wed Apr 22 01:18:52 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Tue, 21 Apr 2015 23:18:52 +0000 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: Donald Stufft wrote: > Is this version statically linked by any chance? No, there's no change to the compilation process, so you can get exactly the same result by using the regular installer and copying the files around. Not sure if this is what you're referring to, but on Windows DLLs are always loaded from alongside the executable before searching in other locations.
I learned at PyCon that some platforms embed full paths for .so's (I'd always just assumed that there was a similar resolution process), but this is not the case here. Dropping all the DLLs and PYDs in the same location is sufficient to make sure you always load the right files, unless sys.path has been filled in with incorrect registry values (the bug I referenced). Cheers, Steve From yselivanov.ml at gmail.com Wed Apr 22 02:53:05 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 21 Apr 2015 20:53:05 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: Message-ID: <5536F0F1.2090508@gmail.com> Hi Damien, Thanks for noticing! I pushed a fix to the peps repo. Thanks, Yury On 2015-04-21 5:20 PM, Damien George wrote: > Hi Yury, > > In your PEP 492 draft, in the Grammar section, I think you're missing > the modifications to the "flow_stmt" line. > > Cheers, > Damien. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com From i at introo.me Wed Apr 22 03:06:52 2015 From: i at introo.me (Shiyao Ma) Date: Tue, 21 Apr 2015 21:06:52 -0400 Subject: [Python-Dev] Tracker user registration problem In-Reply-To: References: <1428788375.4050095.252170001.6A567456@webmail.messagingengine.com> Message-ID: ISTM that for gmail accounts, the activation mail is always in the spam folder. > > -- http://introo.me -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Wed Apr 22 04:35:27 2015 From: robertc at robertcollins.net (Robert Collins) Date: Wed, 22 Apr 2015 14:35:27 +1200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <20150421004339.1DA5CB500F5@webabinitio.net> <20150421094958.2960986c@fsol> Message-ID: On 22 April 2015 at 08:26, Guido van Rossum wrote: > In the end this should be up to you and the reviewers, but for such a > venerable module like unittest I'd be hesitant to be an early adopter. I'd > also expect that much of unittest is too dynamic in nature to benefit from > type hints. But maybe you should just try to use them for testtools and see > for yourself how beneficial or cumbersome they are in that particular case? Exactly yes. I've been experimenting recently with mypy to see. So far I've regressed back through 4 repos (unittest2, testtools, traceback2, linecache2) to get something small enough to work and experiment with. -Rob -- Robert Collins Distinguished Technologist HP Converged Cloud From greg at krypto.org Wed Apr 22 05:35:21 2015 From: greg at krypto.org (Gregory P. Smith) Date: Wed, 22 Apr 2015 03:35:21 +0000 Subject: [Python-Dev] Python 3.x Adoption for PyPI and PyPI Download Numbers In-Reply-To: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> References: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> Message-ID: On Tue, Apr 21, 2015 at 10:55 AM Donald Stufft wrote: > Just thought I'd share this since it shows how what people are using to > download things from PyPI have changed over the past year. Of particular > interest to most people will be the final graphs showing what percentage of > downloads from PyPI are for Python 3.x or 2.x. > > As always it's good to keep in mind, "Lies, Damn Lies, and Statistics".
> I've > tried not to bias the results too much, but some bias is unavoidable. Of > particular note is that a lot of these numbers come from pip, and as of > version > 6.0 of pip, pip will cache downloads by default. This would mean that older > versions of pip are more likely to "inflate" the downloads than newer > versions > since they don't cache by default. In addition if a project has a file > which > is used for both 2.x and 3.x and they do a ``pip install`` on the 2.x > version > first then it will show up as counted under 2.x but not 3.x due to caching > (and > of course the inverse is true, if they install on 3.x first it won't show > up > on 2.x). > > Here's the link: https://caremad.io/2015/04/a-year-of-pypi-downloads/ > > Anyways, I'll have access to the data set for another day or two before I > shut down the (expensive) server that I have to use to crunch the numbers > so if > there's anything anyone else wants to see before I shut it down, speak up > soon. > Thanks! I like your focus on particular packages of note such as django and requests. How do CDNs influence these "lies"? I thought the download counts on PyPI were effectively meaningless due to CDN mirrors fetching and hosting things? Do we have user-agent logs from all PyPI package CDN mirrors or just from the master? -gps -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Wed Apr 22 05:46:27 2015 From: donald at stufft.io (Donald Stufft) Date: Tue, 21 Apr 2015 23:46:27 -0400 Subject: [Python-Dev] Python 3.x Adoption for PyPI and PyPI Download Numbers In-Reply-To: References: <70D6B7F8-DB34-4484-BB17-730CE94E3F0E@stufft.io> Message-ID: <8A88415D-8766-4C44-B3F8-81096718FC9D@stufft.io> > On Apr 21, 2015, at 11:35 PM, Gregory P. Smith wrote: > > > > On Tue, Apr 21, 2015 at 10:55 AM Donald Stufft > wrote: > Just thought I'd share this since it shows how what people are using to > download things from PyPI have changed over the past year. Of particular > interest to most people will be the final graphs showing what percentage of > downloads from PyPI are for Python 3.x or 2.x. > > As always it's good to keep in mind, "Lies, Damn Lies, and Statistics". I've > tried not to bias the results too much, but some bias is unavoidable. Of > particular note is that a lot of these numbers come from pip, and as of version > 6.0 of pip, pip will cache downloads by default. This would mean that older > versions of pip are more likely to "inflate" the downloads than newer versions > since they don't cache by default. In addition if a project has a file which > is used for both 2.x and 3.x and they do a ``pip install`` on the 2.x version > first then it will show up as counted under 2.x but not 3.x due to caching (and > of course the inverse is true, if they install on 3.x first it won't show up > on 2.x). > > Here's the link: https://caremad.io/2015/04/a-year-of-pypi-downloads/ > > Anyways, I'll have access to the data set for another day or two before I > shut down the (expensive) server that I have to use to crunch the numbers so if > there's anything anyone else wants to see before I shut it down, speak up soon. > > Thanks! > > I like your focus on particular packages of note such as django and requests. > > How do CDNs influence these "lies"? I thought the download counts on PyPI were effectively meaningless due to CDN mirrors fetching and hosting things? > > Do we have user-agent logs from all PyPI package CDN mirrors or just from the master? 
> > -gps We took the download counts offline for a while because of the CDN, however within a month or two (now almost two years ago) they enabled logs on our account to bring them back. So these numbers are from the CDN edge and they reflect the "true" traffic. I say "true" because although we have logs, logging isn't considered an essential service, so in times of problems logging can be reduced or disabled completely (you can see in the data set that some weeks had a massive drop; this was due to missing a day or two of logs). That being said though, on top of the Fastly provided CDN, there is also the ability to mirror PyPI (which shows up as bandersnatch or others in the logs) and if someone is installing from a mirror we don't see that data at all. On top of that, all versions of pip prior to 6.0 had an opt-in download cache which would mean that, on an opt-in basis, we wouldn't see downloads for those people and since 6.0 there is now an opt-out cache. Specifically to the mirror network itself, that represents about 20% of the total traffic on PyPI, however we can determine when it was a mirror and those downloads show up as "Unknown" in other charts since, with a mirror client, we don't know what the final target environment will be. This might mean that future snapshots will look at API accesses instead, or perhaps we try to implement some sort of optional popcon or maybe we continue to look at package installs and we just interpret the data with the knowledge that these things are at play. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: Message signed with OpenPGP using GPGMail URL: From tjreedy at udel.edu Wed Apr 22 08:01:20 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 22 Apr 2015 02:01:20 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <55362CCF.3020809@btinternet.com> <20150421124720.GK5663@ando.pearwood.info> <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230> <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230> <20150421161701.E06C0B20095@webabinitio.net> <20150421215820.50e543b8@x230> Message-ID: On 4/21/2015 6:50 PM, Chris Barker wrote: > On Tue, Apr 21, 2015 at 11:58 AM, Paul Sokolovsky > wrote: > > It does, and hope people won't be caught in "static typechecking" > loop and consider other usages too. I am interested in using type hints for automatic or at least semi-automatic generation of tests, in particular for parameters hinted as generic and abstract. A parameter "ints: Iterable(int)" should work with empty, length 1, and longer tuples, lists, sets, dicts (with int keys) and a custom class. Unless there is supplementary doc otherwise, it should handle negative, 0, and positive ints, small and large, in regular and random patterns. (Thinking about this more, some of this could be done without the PEP being approved, but there has to be *some* syntax for test generator input information, and using the standard vocabulary of typing instead of inventing something would be easier.)
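A rough sketch of the generator Terry has in mind might look like this (the sample table and helper are hypothetical, and extracting the element type out of the hint itself is hand-waved by passing it in directly):

    from typing import Iterable

    def mean(ints: Iterable[int]) -> float:
        total = count = 0
        for i in ints:
            total += i
            count += 1
        return total / count if count else 0.0

    # Hypothetical generator: given the hinted element type, emit empty,
    # length-1, and longer containers of several concrete kinds.
    SAMPLES = {int: [-10**9, -1, 0, 1, 10**9]}

    def iterable_cases(elem_type):
        values = SAMPLES[elem_type]
        for make in (tuple, list, set, frozenset):
            yield make([])
            yield make(values[:1])
            yield make(values)

    for case in iterable_cases(int):   # elem type read from the hint
        assert isinstance(mean(case), float)

None of this needs checker support; it only needs the standard vocabulary the PEP would supply, which is exactly the point of the parenthetical above.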
> I'm confused -- from the bit I've been skimming the discussion, over on > python-ideas, and now here, is that this is all about "static typechecking". The PEP is about a) defining type hints as the primary use of annotations* and b) adding a new module to standardize type hints instead of having multiple groups define multiple variations on the theme. The urgency of developing a standard now is that there *are* at least two or three independent efforts. * http://www.artima.com/weblogs/viewpost.jsp?thread=85551 (2004) makes it clear that Guido thought of static typing first and annotations second, in the form added to 3.0 years later, as an implementation thereof. The primary motivation of the people working on the new module is static typechecking by 3rd party programs. This seems to be Guido's primary interest and what he believes will be the primary use. It is also the primary promotional argument since type checking already exists. However, this does not stop others from having other motivations and making other uses of the information once available. > It's not about run-time type checking. The PEP does not prevent that, and will make it more feasible by standardizing on one system. > It's not about type-based performance optimization. Nothing is planned for CPython, but type hints are already being used by MicroPython for this purpose. (To me, this is one of the pluses of it having targeted 3.3.) > It's not about any use of annotations other than types. Right, but there are multiple possible uses for type information. > What is it about other than static typechecking? There is also documentation. Type hints are partially an alternative to stylized entries in doc strings. A program might combine the two or check consistency. -- Terry Jan Reedy From greg.ewing at canterbury.ac.nz Wed Apr 22 08:05:19 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Wed, 22 Apr 2015 18:05:19 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55368858.4010007@gmail.com> References: <55368858.4010007@gmail.com> Message-ID: <55373A1F.2020704@canterbury.ac.nz> Yury Selivanov wrote: > 1. CO_ASYNC flag was renamed to CO_COROUTINE; > > 2. sys.set_async_wrapper() was renamed to > sys.set_coroutine_wrapper(); > > 3. New function: sys.get_coroutine_wrapper(); > > 4. types.async_def() renamed to types.coroutine(); I still don't like the idea of hijacking the generic term "coroutine" and using it to mean this particular type of object. > 2. I propose to disallow using of 'for..in' loops, > and builtins like 'list()', 'iter()', 'next()', > 'tuple()' etc on coroutines. PEP 3152 takes care of this automatically from the fact that you can't make an ordinary call to a cofunction, and cocall combines a call and a yield-from. You have to go out of your way to get hold of the underlying iterator to use in a for-loop, etc. -- Greg From tjreedy at udel.edu Wed Apr 22 08:32:18 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 22 Apr 2015 02:32:18 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: On 4/21/2015 6:41 PM, Chris Barker wrote: > Well, it'll catch passing in a string instead of a sequence of strings > -- one of the common and semi-insidious type errors I see a lot (at > least with newbies). > > Oh wait, maybe it won't -- a string IS a sequence of strings. That's why > this is an insidious bug in the first place: > > List[string]
SIGH. I was just thinking today that for this, typing needs a subtraction (difference) operation in addition to an addition (union) operation: Difference(Iterable(str), str) -- Terry Jan Reedy From mal at egenix.com Wed Apr 22 12:46:16 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Wed, 22 Apr 2015 12:46:16 +0200 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> Message-ID: <55377BF8.8030402@egenix.com> On 21.04.2015 18:08, Guido van Rossum wrote: > On Tue, Apr 21, 2015 at 12:33 AM, M.-A. Lemburg wrote: > >> On 21.04.2015 05:37, Guido van Rossum wrote: >>> On Mon, Apr 20, 2015 at 4:41 PM, Jack Diederich >> wrote: >>>> * Uploading stubs for other people's code is a terrible idea. Who do I >>>> contact when I update the interface to my library? The random Joe who >>>> "helped" by uploading annotations three months ago and then quit the >>>> internet? I don't even want to think about people maliciously adding >> stubs >>>> to PyPI. >>>> >>> >>> We're certainly not planning to let arbitrary people upload stubs for >>> arbitrary code to PyPI that will automatically be used by the type >>> checkers. (We can't stop people from publishing their stubs, just as you >>> can't stop people from writing blog posts or stackoverflow answers with >>> examples for your library.) >>> >>> The actual plan is to have a common repository of stubs (a prototype >> exists >>> at https://github.com/JukkaL/typeshed) but we also plan some kind of >>> submission review. I've proposed that when submitting stubs for a package >>> you don't own, the typeshed owners ask the package owner what their >>> position is, and we expect the answers to fall on the following spectrum: >>> >>> - I don't want stubs uploaded for my package >>> - I'll write the stubs myself >>> - I want to review all stubs that are uploaded for my package before they >>> are accepted >>> - Please go ahead and add stubs for my package and let me know when >> they're >>> ready >>> - Go ahead, I trust you >>> >>> This seems a reasonable due diligence policy that avoids the scenarios >>> you're worried about. (Of course if you refuse stubs a black market for >>> stubs might spring into existence. That sounds kind of exciting... :-) >> >> Hmm, that's the first time I've heard about this. I agree with >> Jack that it's a terrible idea to allow this for 3rd party >> packages. >> >> If people want to contribute stubs, they should contribute them >> to the package in the usual ways, not in a side channel. The important >> part missing in the above setup is maintenance and to some extent >> an external change of the API definitions. >> >> Both require active participation in the package project, >> not the separated setup proposed above, to be effective and >> actually work out in the long run. >> >> For the stdlib, typeshed looks like a nice idea to spread the >> workload. >> > > I hesitate to speak for others, but IIUC the reason why typeshed was > started is that companies like PyCharm and Google (and maybe others) are > *already* creating their own stubs for 3rd party packages, because they > have a need to type-check code that *uses* 3rd party packages. Their use > cases are otherwise quite different (the user code type-checked by PyCharm > is that of PyCharm users, and the code type-checked by Google is their own > proprietary code) but they both find themselves needing stubs for commonly > used 3rd party packages. 
mypy finds itself in a similar position. > > Think of it this way. Suppose you wrote an app that downloaded some files > from the web using the popular Requests package. Now suppose you wanted to > run mypy over your app. You're willing to do the work of adding signatures > to your own app, and presumably there are stubs for those parts of the > stdlib that you're using, but without stubs for Requests, mypy won't do a > very good job type-checking your calls into Requests. It's not rocket > science to come up with stubs for Requests (there aren't that many classes > and methods) but the Requests package is in maintenance mode, and while > they respond quickly to security issues they might take their time to > release a new version that includes your stub files, and until there are a > lot of people clamoring for stubs for Requests, they might not care at all. For projects in maintenance mode, it does make sense indeed. For active ones, I think you'd quickly run into a situation similar to translation projects: there are always parts which haven't been translated yet or where the translation no longer matches the original intent. Unlike with translations, where missing or poor ones don't have much effect on the usefulness of the software, a type checker would complain loudly and probably show lots of false positives (if you read a type bug as "positive"), causing the usual complaints by users to the software authors. I don't really think that users would instead complain to the type checker authors or find the actual source of the problem, which is the broken stub files. OTOH, if the type checkers are written in a way where they can detect authoritative stubs compared to non-authoritative ones and point users to the possible type stub file problem, this could be resolved, I guess. The stub files would then need an "authoritative" flag and probably also be versioned to get this working. > So what does Requests have to lose if, instead of including the stubs in > Requests, they let the typeshed people distribute stubs for Requests? > Presumably having the stubs in typeshed means that PyCharm and mypy (and > the 50 other type-checkers that are being written right now :-) can give Hmm, with 50 type-checkers around it sounds like the above idea wouldn't work by simple convention. I guess a PEP would be needed to standardize it instead. > better diagnostics for code using Requests, and once in a while this may > save a user of Requests from doing something dumb and blaming Requests. The > only downside would be if something was missing in the stubs and users > would get incorrect error messages from their favorite type checker. But > it's a long stretch to see this rain down on Requests' reputation -- more > likely the type checker will be blamed, so type checker > authors/distributors will be vigilant before distributing stubs. As I've explained above, in my experience, people (*) often first go to the authors of the software and not do research to find out that the tool they were using has a problem (via the non-authoritative stub files it's using). (*) More experienced users of pylint-like tools will probably think twice due to the many false positives these tools tend to generate. I'm not sure whether people using type checkers would have the same approach, though, esp. not if they are coming from the land of statically typed languages.
> OTOH if you prefer to make and distribute your own stubs, type checkers > will use those, and there won't be a need to include stubs in typeshed when > a package already provides stubs. > > And if you really don't want anything to do with stubs for your package, > just tell the typeshed owners and your wish will be respected. While this sounds like a fair deal, I think you're underestimating the social pressure this can impose on the software authors, and this is really the main reason why I think we need to approach this carefully. I'm not really worried about the technical side of things with the approach, but more with the social side of forcing everyone to provide stubs or use type annotations in their code. As always, I'm probably too worried ;-), but if there's something we could do to avoid it, I believe we should. Another question: Will these stubs also work for closed-source software, i.e. commercial Python extensions such as the ones eGenix is selling? -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Apr 22 2015) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! :::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From cory at lukasa.co.uk Wed Apr 22 13:09:25 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Wed, 22 Apr 2015 12:09:25 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> Message-ID: On 21 April 2015 at 17:59, Guido van Rossum wrote: > For me, PEP 484 is a stepping stone. Among the authors of PEP 484 there was > much discussion about duck typing, and mypy even has some limited support > for duck typing (I think you can still find it by searching the mypy code > for "protocol"). But we ran out of time getting all the details written up > and agreed upon, so we decided to punt -- for now. But duck typing still > needs to have a way to talk about things like "seek method with this type > signature" (something like `def seek(self, offset: int, whence: > int=SEEK_SET) -> int`) so the current proposal gets us part of the way > there. > > The hope is that once 3.5 is out (with PEP 484's typing.py included in > *provisional* mode) we can start working on the duck typing specification. > The alternative would have been to wait until 3.6, but we didn't think that > there would be much of an advantage to postponing the more basic type > hinting syntax (it would be like refusing to include "import" until you've > sorted out packages). During the run of 3.5 we'll hopefully get feedback on > where duck typing is most needed and how to specify it -- valuable input > that would be much harder to obtain if *no* part of the type hints notation > were standardized. This makes a lot of sense. If PEP 484 is intended to be a stepping stone (or compromise, or beta, or whatever word one wants to use), then it is easy to forgive it its limitations, and I'm looking forward to seeing it improve.
From cory at lukasa.co.uk Wed Apr 22 13:13:55 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Wed, 22 Apr 2015 12:13:55 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150421171250.GO5663@ando.pearwood.info> References: <20150421171250.GO5663@ando.pearwood.info> Message-ID: On 21 April 2015 at 18:12, Steven D'Aprano wrote: > I expect that dealing with duck typing will be very high on the list > of priorities for the future. In the meantime, for this specific use-case, > you're probably not going to be able to statically check this type hint. > Your choices would be: > > - don't type check anything; To be clear, for the moment this is what requests will do, unless the other maintainers strongly disagree with me (they don't). I am quite convinced that PEP 484 is insufficiently powerful to make it worthwhile for requests to provide 'official' type hints. I suspect someone will provide hints to typeshed, and I certainly hope they're good, because if they're bad we'll definitely field bug reports about them (more on this in a different thread I think). From cory at lukasa.co.uk Wed Apr 22 13:22:18 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Wed, 22 Apr 2015 12:22:18 +0100 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: <55377BF8.8030402@egenix.com> References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On 22 April 2015 at 11:46, M.-A. Lemburg wrote: > Unlike with translations, where missing or poor ones don't have > much effect on the usefulness of the software, a type checker > would complain loudly and probably show lots of false positives > (if you read a type bug as "positive"), causing the usual complaints > by users to the software authors. > > I don't really think that users would instead complain to the type > checker authors or find the actual source of the problem which are > the broken stub files. This is my expectation as well. Requests receives bug reports for bugs that were introduced by downstream packagers, or for bugs that are outright in unrelated projects. I field IRC questions about 'requests bugs' that are actually bugs in the web application on the other end of the HTTP connection! I can *guarantee* that if a stub file is bad, I'll get told, not the author of the stub file. > OTOH, if the type checkers are written in a way where they can > detect authoritative stubs compared to non-authoritative ones > and point users to the possible type stub file problem, this > could be resolved, I guess. > > The stub files would then need an "authoritative" flag and > probably also be versioned to get this working. This would be great: +1. > As I've explained above, in my experience, people (*) often first go > to the authors of the software and not do research to find out > that the tool they were using has a problem (via the non-authoritative > stub files it's using). > > (*) More experienced users of pylint like tools will probably think > twice due to the many false positives these tools tend to generate. > I'm not sure whether people using type checkers would have the same > approach, though, esp. not if they are coming from the land of > statically typed languages. I can back this up, as can a search through Requests' past GitHub issues. pylint in particular has caused me pain due to at least one GitHub issue that was nothing more than a dump of pylint output when run over Requests', where *every single line* was a false positive. 
This ends up having negative network effects as well, because the next time someone opens a GitHub issue with the word 'pylint' in it I'm simply going to close it, rather than waste another 45 minutes visually confirming that every line is a false positive. I worry that stub files will have the same problems. From p.f.moore at gmail.com Wed Apr 22 14:45:31 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 22 Apr 2015 13:45:31 +0100 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: <1429653904845.96438@microsoft.com> References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: On 21 April 2015 at 23:05, Steve Dower wrote: > I made it a self-extracting RAR file so it could be signed, but I've already had multiple people query it so the next release will probably just be a plain ZIP file. I just need to figure out some reliable way of validating the download other than GPG, since I'd like installers to be able to do the download transparently and ideally without hard-coding hash values. I might add a CSV of SHA hashes to the zip too. You could probably just leave it as is (or make it a self-extracting zip file) and just describe it on the web page as "Windows amd64 embeddable self-extracting archive". People are (I think) pretty used to the idea that they can open a self-extracting archive in tools like 7-zip, so those who didn't want to run the exe could do that (and would know they could). Obviously extracting that way you don't get the signature check, but that's to be expected. Paul From p.f.moore at gmail.com Wed Apr 22 14:48:23 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 22 Apr 2015 13:48:23 +0100 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> Message-ID: On 22 April 2015 at 13:45, Paul Moore wrote: > On 21 April 2015 at 23:05, Steve Dower wrote: >> I made it a self-extracting RAR file so it could be signed, but I've already had multiple people query it so the next release will probably just be a plain ZIP file. I just need to figure out some reliable way of validating the download other than GPG, since I'd like installers to be able to do the download transparently and ideally without hard-coding hash values. I might add a CSV of SHA hashes to the zip too. > > You could probably just leave it as is (or make it a self-extracting > zip file) and just describe it on the web page as "Windows amd64 > embeddable self-extracting archive". People are (I think) pretty used > to the idea that they can open a self-extracting archive in tools like > 7-zip, so those who didn't want to run the exe could do that (and > would know they could). Obviously extracting that way you don't get > the signature check, but that's to be expected. Whoops, no - I changed my mind. If you double click on the downloaded file (which I just did) it unpacks it into the directory you downloaded the exe to, with no option to put it anywhere else, and no UI telling you what it's doing. That's going to annoy people badly. Better make it a simple zipfile in that case. 
Paul (off to tidy up his download directory :-() From yselivanov.ml at gmail.com Wed Apr 22 17:40:27 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 11:40:27 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55373A1F.2020704@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> Message-ID: <5537C0EB.7070300@gmail.com> Hi Greg, On 2015-04-22 2:05 AM, Greg Ewing wrote: > Yury Selivanov wrote: >> 1. CO_ASYNC flag was renamed to CO_COROUTINE; >> >> 2. sys.set_async_wrapper() was renamed to >> sys.set_coroutine_wrapper(); >> >> 3. New function: sys.get_coroutine_wrapper(); >> >> 4. types.async_def() renamed to types.coroutine(); > > I still don't like the idea of hijacking the generic > term "coroutine" and using it to mean this particular > type of object. In my opinion it's OK. We propose a new dedicated syntax to define coroutines. The old approach of using generators to implement coroutines will eventually be obsolete. That's why, in PEP 492, we call 'async def' functions "coroutines", and the ones that are defined with generators "generator-based coroutines". You can also have "greenlets-based coroutines" and "stackless coroutines", but those aren't native Python concepts. I'm not sure if you can apply the term "cofunctions" to coroutines in PEP 492. I guess we can call them "async functions". > >> 2. I propose to disallow using of 'for..in' loops, >> and builtins like 'list()', 'iter()', 'next()', >> 'tuple()' etc on coroutines. > > PEP 3152 takes care of this automatically from the fact > that you can't make an ordinary call to a cofunction, > and cocall combines a call and a yield-from. You have > to go out of your way to get hold of the underlying > iterator to use in a for-loop, etc. > On the one hand I like your idea of disallowing calling coroutines without a special keyword (await in the case of PEP 492). It has downsides, but there is some elegance in it. On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-Pythonic to me. I'd be happy to hear others' opinions on this topic. Thanks! Yury From guido at python.org Wed Apr 22 17:43:51 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 08:43:51 -0700 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: I definitely think that we shouldn't jump the gun here and should tread carefully. Both Marc-André and Cory brought up good things to watch out for. For closed-source software the only way to obtain stubs is presumably from the author, if they care. As Gregory Smith said in another thread, the tooling will have to prove itself to the point where library developers *want* to use it. For Requests, it looks like it may be better not to have stubs at all. You can assign any and all tracker issues asking for stubs to me, I'll gladly explain why in Requests' case it's better to live without stubs. On Wed, Apr 22, 2015 at 4:22 AM, Cory Benfield wrote: > On 22 April 2015 at 11:46, M.-A. Lemburg wrote: > > Unlike with translations, where missing or poor ones don't have > > much effect on the usefulness of the software, a type checker > > would complain loudly and probably show lots of false positives > > (if you read a type bug as "positive"), causing the usual complaints > > by users to the software authors.
> > > > I don't really think that users would instead complain to the type > > checker authors or find the actual source of the problem which are > > the broken stub files. > > This is my expectation as well. > > Requests receives bug reports for bugs that were introduced by > downstream packagers, or for bugs that are outright in unrelated > projects. I field IRC questions about 'requests bugs' that are > actually bugs in the web application on the other end of the HTTP > connection! I can *guarantee* that if a stub file is bad, I'll get > told, not the author of the stub file. > > > > OTOH, if the type checkers are written in a way where they can > > detect authoritative stubs compared to non-authoritative ones > > and point users to the possible type stub file problem, this > > could be resolved, I guess. > > > > The stub files would then need an "authoritative" flag and > > probably also be versioned to get this working. > > This would be great: +1. > > > As I've explained above, in my experience, people (*) often first go > > to the authors of the software and not do research to find out > > that the tool they were using has a problem (via the non-authoritative > > stub files it's using). > > > > (*) More experienced users of pylint like tools will probably think > > twice due to the many false positives these tools tend to generate. > > I'm not sure whether people using type checkers would have the same > > approach, though, esp. not if they are coming from the land of > > statically typed languages. > > I can back this up, as can a search through Requests' past GitHub > issues. pylint in particular has caused me pain due to at least one > GitHub issue that was nothing more than a dump of pylint output when > run over Requests', where *every single line* was a false positive. > > This ends up having negative network effects as well, because the next > time someone opens a GitHub issue with the word 'pylint' in it I'm > simply going to close it, rather than waste another 45 minutes > visually confirming that every line is a false positive. I worry that > stub files will have the same problems. > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Wed Apr 22 17:50:56 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 08:50:56 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537C0EB.7070300@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov wrote: > > On the one hand I like your idea to disallow calling > coroutines without a special keyword (await in case of > PEP 492). It has downsides, but there is some > elegance in it. On the other hand, I hate the idea > of grammatically requiring parentheses for 'await' > expressions. That feels non-pytonic to me. > > I'd be happy to hear others opinion on this topic. > I'm slowly warming up to Greg's notion that you can't call a coroutine (or whatever it's called) without a special keyword. This makes a whole class of bugs obvious the moment the code is executed. OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way its done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function (perhaps to be renamed to ensure_task() to make way for the async keyword). 
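For reference, here is a minimal sketch of the three spellings being discussed, as they exist in asyncio today (create_task() being the loop method added in 3.4.2):

    import asyncio

    @asyncio.coroutine
    def worker():
        yield from asyncio.sleep(0.1)
        return 'done'

    loop = asyncio.get_event_loop()

    t1 = asyncio.Task(worker())      # Task() constructor
    t2 = loop.create_task(worker())  # loop method
    t3 = asyncio.async(worker())     # module-level helper, may be renamed

    loop.run_until_complete(asyncio.wait([t1, t2, t3]))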
-- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip.montanaro at gmail.com Wed Apr 22 18:12:01 2015 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Wed, 22 Apr 2015 11:12:01 -0500 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum wrote: > For Requests, it looks like it may be better not to have stubs at all. Can you expand on this? Why would Requests be any different than any other module/package? As for versioning, I think stub files would absolutely have to declare the appropriate version(s) to which they apply (probably via embedded directives), so type checkers can ignore them when necessary. That also means that type checkers must be able to figure out the version of the package used by the application being analyzed. Not sure I'm being too clear, so I will provide an example. I have app "myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't yet provided type annotations, so someone else contributed a stub to the typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0. (Semantic versioning tells us the API has changed in an incompatible way because of the major version bump.) I decide I need some of its new features and update "myapp". There is no new stub file in the typeshed yet. When I run my fancy type checker (someone suggested I will shortly have 50 to choose from!), it needs to recognize that the stub no longer matches the version of "yourmod" I am using, and must ignore it. Does that suggest the typeshed needs some sort of structure which allows all versions of stubs for the same package to be gathered together? My apologies if I'm following along way behind the curve. Skip -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Apr 22 18:18:24 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 12:18:24 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> Message-ID: <5537C9D0.80505@gmail.com> Hi Guido, On 2015-04-22 11:50 AM, Guido van Rossum wrote: > On Wed, Apr 22, 2015 at 8:40 AM, Yury Selivanov > wrote: >> On the one hand I like your idea to disallow calling >> coroutines without a special keyword (await in case of >> PEP 492). It has downsides, but there is some >> elegance in it. On the other hand, I hate the idea >> of grammatically requiring parentheses for 'await' >> expressions. That feels non-pytonic to me. >> >> I'd be happy to hear others opinion on this topic. >> > I'm slowly warming up to Greg's notion that you can't call a coroutine (or > whatever it's called) without a special keyword. This makes a whole class > of bugs obvious the moment the code is executed. > > OTOH I'm still struggling with what you have to do to wrap a coroutine in a > Task, the way its done in asyncio by the Task() constructor, the > loop.create_task() method, and the async() function (perhaps to be renamed > to ensure_task() to make way for the async keyword). > If we apply Greg's ideas to PEP 492 we will have the following (Greg, correct me if I'm wrong): 1. '_typeobject' struct will get a new field 'tp_await'. We can reuse 'tp_reserved' for that probably. 2. 
We'll hack Gen(/ceval.c?) objects to raise an error if they are called directly and have a 'CO_COROUTINE' flag. 3. Task(), create_task() and async() will be modified to call 'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag. 4. 'await' will require parentheses grammatically. That will make it different from 'yield' expression. For instance, I still don't know what would 'await coro(123)()' mean. 5. 'await foo(*a, **k)' will be an equivalent to 'yield from type(coro).__await__(coro, *a, **k)' 6. If we ever decide to implement coroutine-generators -- async def functions with 'await' *and* some form of 'yield' -- we'll need to reverse the rule -- allow __call__ and disallow __await__ on such objects (so that you'll be able to write 'async for item in coro_gen()' instead of 'async for item in await coro_gen()'. To be honest, that's a lot of steps and hacks to make this concept work. I think that 'set_coroutine_wrapper()' solves all these problems while keeping the grammar and implementation simpler. Moreover, it allows to add some additional features to the wrapped coroutines, such as nicer repr() in debug mode (CoroWrapper in asyncio already does that) and other runtime checks. Thanks, Yury From graffatcolmingov at gmail.com Wed Apr 22 18:26:14 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Wed, 22 Apr 2015 11:26:14 -0500 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On Wed, Apr 22, 2015 at 11:12 AM, Skip Montanaro wrote: > > On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum > wrote: > >> For Requests, it looks like it may be better not to have stubs at all. > > > Can you expand on this? Why would Requests be any different than any other > module/package? > > On a separate thread Cory provided an example of what the hints would look like for *part* of one function in the requests public functional API. While our API is outwardly simple, the values we accept in certain cases are actually non-trivially represented. Getting the hints *exactly* correct would be extraordinarily difficult. > As for versioning, I think stub files would absolutely have to declare the > appropriate version(s) to which they apply (probably via embedded > directives), so type checkers can ignore them when necessary. That also > means that type checkers must be able to figure out the version of the > package used by the application being analyzed. > > Not sure I'm being too clear, so I will provide an example. I have app > "myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't > yet provided type annotations, so someone else contributed a stub to the > typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0. > (Semantic versioning tells us the API has changed in an incompatible way > because of the major version bump.) I decide I need some of its new > features and update "myapp". There is no new stub file in the typeshed yet. > When I run my fancy type checker (someone suggested I will shortly have 50 > to choose from!), it needs to recognize that the stub no longer matches the > version of "yourmod" I am using, and must ignore it. > > Which of course also assumes that the author of that library is even using Semantic Versioning (which is not a universal release strategy, even in the Ruby community). 
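The mechanical half of Skip's scenario is at least easy to sketch: a checker only needs to compare the installed version against whatever range a stub declares. A rough illustration using pkg_resources -- the premise that stubs declare a version range is an assumption, not anything that exists today:

    import pkg_resources

    def stub_applies(dist_name, declared_spec):
        """True if the installed *dist_name* satisfies the version
        range a stub declares, e.g. declared_spec='>=1.2,<2.0'."""
        try:
            dist = pkg_resources.get_distribution(dist_name)
        except pkg_resources.DistributionNotFound:
            return False  # package not installed; nothing to check against
        req = pkg_resources.Requirement.parse(dist_name + declared_spec)
        return dist.version in req  # Requirement supports 'in' checks

    # Once yourmod 2.0.0 is installed, a checker would skip the old stub:
    # stub_applies('yourmod', '>=1.2,<2.0')  ->  False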
I understand where you're coming from, but I think this is a reason why typeshed as a catch-all for third-party type hints may not be feasible. > > Does that suggest the typeshed needs some sort of structure which allows > all versions of stubs for the same package to be gathered together? > > My apologies if I'm following along way behind the curve. > No need to apologize. =) As the other maintainer of requests, I think having hints *might* help some developers, but looking at what Cory generated (which looks to be valid), I'm wondering about something else with Type Hints. I've heard several people say "Just create an aliased type for the hint so it's shorter!" but doesn't that mean we then have to document that alias for our users? I mean if the IDE suggests that the developer use XYZ for this parameter and there's no explanation of what XYZ actually is (in the IDE), doesn't this just add a lot more maintenance to adding hints? Maintainers now have to:

- Keep the stubs up-to-date
- Document the stubs (and if the stubs are in typeshed, does $MyPackage link to the docs in typeshed to avoid users filing bugs on $MyPackage's issue tracker?)
- Version the stubs (assuming they're maintained in a third-party location, e.g., typeshed)

Don't get me wrong. I really like the idea of moving towards Type Hints. I'm not even particularly against adding type hints for Requests to typeshed. I'm just hesitant that it will be easy to make them entirely accurate. Cheers, Ian -------------- next part -------------- An HTML attachment was scrubbed... URL: From graffatcolmingov at gmail.com Wed Apr 22 18:36:23 2015 From: graffatcolmingov at gmail.com (Ian Cordasco) Date: Wed, 22 Apr 2015 11:36:23 -0500 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On Wed, Apr 22, 2015 at 11:30 AM, Skip Montanaro wrote: > > > On Wed, Apr 22, 2015 at 11:26 AM, Ian Cordasco > wrote: > >> On a separate thread Cory provided an example of what the hints would >> look like for *part* of one function in the requests public functional API. >> > > Thanks. That encouraged me to look around for recent posts from Cory. > Wow... > You're welcome! And yeah. That union that Cory posted was for *one* parameter if I remember correctly. I won't speak for Cory, but I'm not against the type hints in PEP 484, but they will be difficult for us as a project. They'll be marginally less difficult for me in a different project of mine. I also wonder about importing type definitions from other packages. The Requests-Toolbelt adds a few features that are enhanced versions of what's already in Requests. I can think of a few type hints that we might create to represent certain parameters, but I don't want to have to copy those for the features in the Requests-Toolbelt. I would expect this to "Just Work", but I wonder if anyone else has considered the possibility of this being a need. Cheers, Ian -------------- next part -------------- An HTML attachment was scrubbed...
URL: From rajiv.kumar at gmail.com Wed Apr 22 18:53:39 2015 From: rajiv.kumar at gmail.com (Rajiv Kumar) Date: Wed, 22 Apr 2015 09:53:39 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537C9D0.80505@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> Message-ID: I'd like to suggest another way around some of the issues here, with apologies if this has already been discussed sometime in the past. >From the viewpoint of a Python programmer, there are two distinct reasons for wanting to suspend execution in a block of code: 1. To yield a value from an iterator, as Python generators do today. 2. To cede control to the event loop while waiting for an asynchronous task to make progress in a coroutine. As of today both of these reasons to suspend are supported by the same underlying mechanism, i.e. a "yield" at the end of the chain of "yield from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom of it all there's effectively still a yield as I understand it. I think that the fact that these two concepts use the same mechanism is what leads to the issues with coroutine-generators that Greg and Yury have raised. With that in mind, would it be possible to introduce a second form of suspension to Python to specifically handle the case of ceding to the event loop? I don't know what the implementation complexity of this would be, or if it's even feasible. But roughly speaking, the syntax for this could use "await", and code would look just like it does in the PEP. The semantics of "await " would be analogous to "yield from " today, with the difference that the Task would go up the chain of "await"s to the outermost caller, which would typically be asyncio, with some modifications from its form today. Progress would be made via __anext__ instead of __next__. Again, this might be impossible to do, but the mental model for the Python programmer becomes cleaner, I think. Most of the issues around combining generators and coroutines would go away - you could freely use "await" inside a generator since it cedes control to the event loop, not the caller of the generator. All of the "async def"/"await" examples in PEP 492 would work as is. It might also make it easier in the future to add support for async calls insider __getattr__ etc. Thanks for reading! Rajiv -------------- next part -------------- An HTML attachment was scrubbed... URL: From Steve.Dower at microsoft.com Wed Apr 22 18:59:53 2015 From: Steve.Dower at microsoft.com (Steve Dower) Date: Wed, 22 Apr 2015 16:59:53 +0000 Subject: [Python-Dev] [python-committers] [RELEASED] Python 3.5.0a4 is now available In-Reply-To: References: <5534B5C0.2030800@hastings.org> <1429653904845.96438@microsoft.com> , Message-ID: Whoops, sorry. Yeah, I knew about that behavior, but in hindsight it's obviously going to be a surprise for others :) One plain zip file coming up for the next release. All the binaries will be signed and paranoid people can check the GPG sig and embed their own hashes if not the files themselves. 
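The hash half of that is simple to sketch; assuming a CSV of "relative-path,sha256-hexdigest" rows shipped next to (or inside) the zip -- the layout here is a guess, not a spec:

    import csv
    import hashlib
    import os

    def verify_tree(root, manifest_csv):
        """Compare files under *root* against 'path,sha256' rows;
        return the list of paths whose digests don't match."""
        bad = []
        with open(manifest_csv, newline='') as f:
            for path, expected in csv.reader(f):
                with open(os.path.join(root, path), 'rb') as fp:
                    digest = hashlib.sha256(fp.read()).hexdigest()
                if digest != expected:
                    bad.append(path)
        return bad  # empty list means everything matched

    # e.g. verify_tree('python-3.5.0-embed-amd64', 'hashes.csv')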
Cheers, Steve Top-posted from my Windows Phone ________________________________ From: Paul Moore Sent: ?4/?22/?2015 5:48 To: Steve Dower Cc: Larry Hastings; Python Dev Subject: Re: [python-committers] [RELEASED] Python 3.5.0a4 is now available On 22 April 2015 at 13:45, Paul Moore wrote: > On 21 April 2015 at 23:05, Steve Dower wrote: >> I made it a self-extracting RAR file so it could be signed, but I've already had multiple people query it so the next release will probably just be a plain ZIP file. I just need to figure out some reliable way of validating the download other than GPG, since I'd like installers to be able to do the download transparently and ideally without hard-coding hash values. I might add a CSV of SHA hashes to the zip too. > > You could probably just leave it as is (or make it a self-extracting > zip file) and just describe it on the web page as "Windows amd64 > embeddable self-extracting archive". People are (I think) pretty used > to the idea that they can open a self-extracting archive in tools like > 7-zip, so those who didn't want to run the exe could do that (and > would know they could). Obviously extracting that way you don't get > the signature check, but that's to be expected. Whoops, no - I changed my mind. If you double click on the downloaded file (which I just did) it unpacks it into the directory you downloaded the exe to, with no option to put it anywhere else, and no UI telling you what it's doing. That's going to annoy people badly. Better make it a simple zipfile in that case. Paul (off to tidy up his download directory :-() -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Apr 22 19:13:24 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 13:13:24 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> Message-ID: <5537D6B4.8060204@gmail.com> Hi Rajiv, On 2015-04-22 12:53 PM, Rajiv Kumar wrote: > I'd like to suggest another way around some of the issues here, with > apologies if this has already been discussed sometime in the past. > > From the viewpoint of a Python programmer, there are two distinct reasons > for wanting to suspend execution in a block of code: > > 1. To yield a value from an iterator, as Python generators do today. > > 2. To cede control to the event loop while waiting for an asynchronous task > to make progress in a coroutine. > > As of today both of these reasons to suspend are supported by the same > underlying mechanism, i.e. a "yield" at the end of the chain of "yield > from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom > of it all there's effectively still a yield as I understand it. > > I think that the fact that these two concepts use the same mechanism is > what leads to the issues with coroutine-generators that Greg and Yury have > raised. > > With that in mind, would it be possible to introduce a second form of > suspension to Python to specifically handle the case of ceding to the event > loop? I don't know what the implementation complexity of this would be, or > if it's even feasible. But roughly speaking, the syntax for this could use > "await", and code would look just like it does in the PEP. 
The semantics of > "await " would be analogous to "yield from " today, with the > difference that the Task would go up the chain of "await"s to the outermost > caller, which would typically be asyncio, with some modifications from its > form today. Progress would be made via __anext__ instead of __next__. I think that what you propose is a great idea. However, its implementation will be far more invasive than what PEP 492 proposes. I doubt that we'll be able to make it in 3.5 if we choose this route. BUT: With my latest proposal to disallow for..in loops and iter()/list()-like builtins, the fact that coroutines are based internally on generators is just an implementation detail. There is no way users can exploit the underlying generator object. Coroutine-objects only provide 'send()' and 'throw()' methods, which they would also have with your implementation idea. This gives us freedom to consider your approach in 3.6 if we decide to add coroutine-generators. To make this work we might want to patch inspect.py to make isgenerator() family of functions to return False for coroutines/coroutine-objects. Thanks a lot for the feedback! Yury From guido at python.org Wed Apr 22 19:23:13 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 10:23:13 -0700 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On Wed, Apr 22, 2015 at 9:12 AM, Skip Montanaro wrote: > > On Wed, Apr 22, 2015 at 10:43 AM, Guido van Rossum > wrote: > >> For Requests, it looks like it may be better not to have stubs at all. > > > Can you expand on this? Why would Requests be any different than any other > module/package? > Did you see the crazy union that Cory came up with for one of the parameters of get() in another thread? Requests assumes duck typing for most of its arguments *and* it handles different forms of many arguments (e.g. you can specify an iterable or a mapping). Requests is different for a number of reasons; it has evolved for years to support many different use cases with a few methods, so it is using every trick in the book from dynamic typing. Type checking (using the current state of the tools) is probably more useful for "business logic" code than for "library" code. We'll iterate over the next few Python versions (and even while Python 3.5 is out we can tweak typing.py because it's provisional in the PEP 411 sense) and eventually it may be possible to provide stubs for Requests. But since PEP 484 doesn't address duck typing it's simply too early. (As I said elsewhere, duck typing is an important next step, but we need PEP 484 as the foundation first). > > As for versioning, I think stub files would absolutely have to declare the > appropriate version(s) to which they apply (probably via embedded > directives), so type checkers can ignore them when necessary. That also > means that type checkers must be able to figure out the version of the > package used by the application being analyzed. > > Not sure I'm being too clear, so I will provide an example. I have app > "myapp" which imports module "yourmod" v 1.2.7. The yourmod author hasn't > yet provided type annotations, so someone else contributed a stub to the > typeshed. Time passes and a new version of "yourmod" comes out, v 2.0.0. > (Semantic versioning tells us the API has changed in an incompatible way > because of the major version bump.) 
I decide I need some of its new > features and update "myapp". There is no new stub file in the typeshed yet. > When I run my fancy type checker (someone suggested I will shortly have 50 > to choose from!), it needs to recognize that the stub no longer matches the > version of "yourmod" I am using, and must ignore it. > > Does that suggest the typeshed needs some sort of structure which allows > all versions of stubs for the same package to be gathered together? > > My apologies if I'm following along way behind the curve. > No, I think you can start filing (or adding your view to) issues with the typeshed tracker: https://github.com/JukkaL/typeshed/issues -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Wed Apr 22 19:32:18 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 10:32:18 -0700 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <20150420144106.63513828@anarchist.wooz.org> <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: On Wed, Apr 22, 2015 at 9:26 AM, Ian Cordasco wrote: > As the other maintainer of requests, I think having hints *might* help > some developers, but looking at what Cory generated (which looks to be > valid), I'm wondering about something else with Type Hints. > > I've heard several people say "Just create an aliased type for the hint so > it's shorter!" but doesn't that mean we then have to document that alias > for our users? I mean if the IDE suggests that the developer use XYZ for > this parameter and there's no explanation of what XYZ actually is (in the IDE), > doesn't this just add a lot more maintenance to adding hints? Maintainers > now have to: > > - Keep the stubs up-to-date > - Document the stubs (and if the stubs are in typeshed, does $MyPackage > link to the docs in typeshed to avoid users filing bugs on $MyPackage's > issue tracker?) > - Version the stubs (assuming they're maintained in a third-party > location, e.g., typeshed) > > Don't get me wrong. I really like the idea of moving towards Type Hints. > I'm not even particularly against adding type hints for Requests to > typeshed. I'm just hesitant that it will be easy to make them entirely > accurate. > To be useful for the users of a package, type aliases need to be exported by the package, which means that the package itself grows a dependency on typing.py. You could probably make that a conditional dependency, e.g.

    try:
        from typing import Union, Tuple, AnyStr, Optional
        HeaderTuple = Union[Tuple[AnyStr, AnyStr],
                            Tuple[AnyStr, AnyStr, Optional[AnyStr]]]
        # etc.
    except ImportError:
        pass  # Don't define type aliases

and use a stub file for the actual signatures. User code that itself has a hard dependency on typing could import and use the type aliases unconditionally; user code with a conditional dependency on typing should stick to stubs (or similar import hacks). If you use type hints this way you should probably maintain the stubs as part of your package (as .pyi files living alongside the .py files) so that you don't have to deal with typeshed being out of date. There are many other possible workflows; we haven't discovered the best one(s) yet. It's a work in progress. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed...
URL: From andrew.svetlov at gmail.com Wed Apr 22 20:32:40 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Wed, 22 Apr 2015 21:32:40 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537D6B4.8060204@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> Message-ID: For now I can mix asyncio.coroutines and `async def` functions, I mean I can write `await f()` inside async def to call asyncio.coroutine `f` and vice versa: I can use `yield from g()` inside asyncio.coroutine to call `async def g(): ...`. If we forbid calling `async def` from regular code, how should asyncio work? I'd like to push `async def` everywhere in the asyncio API where asyncio.coroutine is required. On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov wrote: > Hi Rajiv, > > On 2015-04-22 12:53 PM, Rajiv Kumar wrote: >> >> I'd like to suggest another way around some of the issues here, with >> apologies if this has already been discussed sometime in the past. >> >> From the viewpoint of a Python programmer, there are two distinct reasons >> for wanting to suspend execution in a block of code: >> >> 1. To yield a value from an iterator, as Python generators do today. >> >> 2. To cede control to the event loop while waiting for an asynchronous task >> to make progress in a coroutine. >> >> As of today both of these reasons to suspend are supported by the same >> underlying mechanism, i.e. a "yield" at the end of the chain of "yield >> from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the bottom >> of it all there's effectively still a yield as I understand it. >> >> I think that the fact that these two concepts use the same mechanism is >> what leads to the issues with coroutine-generators that Greg and Yury have >> raised. >> >> With that in mind, would it be possible to introduce a second form of >> suspension to Python to specifically handle the case of ceding to the event >> loop? I don't know what the implementation complexity of this would be, or >> if it's even feasible. But roughly speaking, the syntax for this could use >> "await", and code would look just like it does in the PEP. The semantics of >> "await " would be analogous to "yield from " today, with the >> difference that the Task would go up the chain of "await"s to the outermost >> caller, which would typically be asyncio, with some modifications from its >> form today. Progress would be made via __anext__ instead of __next__. > > I think that what you propose is a great idea. However, its > implementation will be far more invasive than what PEP 492 > proposes. I doubt that we'll be able to make it in 3.5 if > we choose this route. > > BUT: With my latest proposal to disallow for..in loops and > iter()/list()-like builtins, the fact that coroutines are > based internally on generators is just an implementation > detail. > > There is no way users can exploit the underlying generator > object. Coroutine-objects only provide 'send()' and 'throw()' > methods, which they would also have with your implementation > idea. > > This gives us freedom to consider your approach in 3.6 if > we decide to add coroutine-generators. To make this work > we might want to patch inspect.py to make isgenerator() family > of functions to return False for coroutines/coroutine-objects. > > Thanks a lot for the feedback!
> > Yury > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From yselivanov.ml at gmail.com Wed Apr 22 20:45:00 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 14:45:00 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> Message-ID: <5537EC2C.4070105@gmail.com> Andrew, On 2015-04-22 2:32 PM, Andrew Svetlov wrote: > For now I can use mix asyncio.coroutines and `async def` functions, I > mean I can write `await f()` inside async def to call > asyncio.coroutine `f` and vise versa: I can use `yield from g()` > inside asyncio.coroutine to call `async def g(): ...`. That's another good point that I forgot to add to the list. Thanks for bringing this up. > > If we forbid to call `async def` from regualr code how asyncio should > work? I'd like to push `async def` everywhere in asyncio API where > asyncio.coroutine required. You'll have to use a wrapper that will do the following:

    async def foo():
        return 'spam'

    @asyncio.coroutine
    def bar():
        what = yield from foo.__await__(foo, *args, **kwargs)
        # OR:
        what = yield from await_call(foo, *args, **kwargs)

Thanks, Yury From andrew.svetlov at gmail.com Wed Apr 22 20:53:09 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Wed, 22 Apr 2015 21:53:09 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537EC2C.4070105@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> <5537EC2C.4070105@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov wrote: > Andrew, > > On 2015-04-22 2:32 PM, Andrew Svetlov wrote: >> >> For now I can use mix asyncio.coroutines and `async def` functions, I >> mean I can write `await f()` inside async def to call >> asyncio.coroutine `f` and vise versa: I can use `yield from g()` >> inside asyncio.coroutine to call `async def g(): ...`. > > That's another good point that I forgot to add to the list. > Thanks for bringing this up. > >> >> If we forbid to call `async def` from regualr code how asyncio should >> work? I'd like to push `async def` everywhere in asyncio API where >> asyncio.coroutine required. > > You'll have to use a wrapper that will do the following: > > async def foo(): > return 'spam' > > @asyncio.coroutine > def bar(): > what = yield from foo.__await__(foo, *args, **kwargs) > # OR: > what = yield from await_call(foo, *args, **kwargs) > If I cannot directly use `yield from f()` with `async def f():` then almost every `yield from` inside asyncio library should be wrapped in `await_call()`. Every third-party asyncio-based library should do the same. Also I expect a performance degradation on `await_call()` calls.
> Thanks, > Yury -- Thanks, Andrew Svetlov From yselivanov.ml at gmail.com Wed Apr 22 21:24:41 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 15:24:41 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> <5537EC2C.4070105@gmail.com> Message-ID: <5537F579.9020704@gmail.com> On 2015-04-22 2:53 PM, Andrew Svetlov wrote: > On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov wrote: [...] >> >>> If we forbid to call `async def` from regualr code how asyncio should >>> work? I'd like to push `async def` everywhere in asyncio API where >>> asyncio.coroutine required. >> >> You'll have to use a wrapper that will do the following: >> >> async def foo(): >> return 'spam' >> >> @asyncio.coroutine >> def bar(): >> what = yield from foo.__await__(foo, *args, **kwargs) >> # OR: >> what = yield from await_call(foo, *args, **kwargs) >> > If I cannot directly use `yield from f()` with `async def f():` then > almost every `yield from` inside asyncio library should be wrapped in > `await_call()`. Every third-party asyncio-based library should do the > same. > > Also I expect a performance degradation on `await_call()` calls. > I think there is another way... instead of pushing GET_ITER ... YIELD_FROM opcodes, we'll need to replace GET_ITER with another one: GET_ITER_SPECIAL ... YIELD_FROM Where "GET_ITER_SPECIAL (obj)" (just a working name) would check that if the current code object has CO_COROUTINE and the object that you will yield-from has it as well, it would push to the stack the result of (obj.__await__()) Yury From andrew.svetlov at gmail.com Wed Apr 22 21:37:16 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Wed, 22 Apr 2015 22:37:16 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537F579.9020704@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> <5537EC2C.4070105@gmail.com> <5537F579.9020704@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 10:24 PM, Yury Selivanov wrote: > > > On 2015-04-22 2:53 PM, Andrew Svetlov wrote: >> >> On Wed, Apr 22, 2015 at 9:45 PM, Yury Selivanov >> wrote: > > [...] >>> >>> >>>> If we forbid to call `async def` from regualr code how asyncio should >>>> work? I'd like to push `async def` everywhere in asyncio API where >>>> asyncio.coroutine required. >>> >>> >>> You'll have to use a wrapper that will do the following: >>> >>> async def foo(): >>> return 'spam' >>> >>> @asyncio.coroutine >>> def bar(): >>> what = yield from foo.__await__(foo, *args, **kwargs) >>> # OR: >>> what = yield from await_call(foo, *args, **kwargs) >>> >> If I cannot directly use `yield from f()` with `async def f():` then >> almost every `yield from` inside asyncio library should be wrapped in >> `await_call()`. Every third-party asyncio-based library should do the >> same. >> >> Also I expect a performance degradation on `await_call()` calls. >> > > I think there is another way... instead of pushing > > GET_ITER > ... > YIELD_FROM > > opcodes, we'll need to replace GET_ITER with another one: > > GET_ITER_SPECIAL > ... 
> YIELD_FROM > > > Where "GET_ITER_SPECIAL (obj)" (just a working name) would check > that if the current code object has CO_COROUTINE and the > object that you will yield-from has it as well, it would > push to the stack the result of (obj.__await__()) > GET_ITER_SPECIAL sounds better than wrapper for `coro.__await__()` call. > Yury -- Thanks, Andrew Svetlov From pje at telecommunity.com Wed Apr 22 21:44:46 2015 From: pje at telecommunity.com (PJ Eby) Date: Wed, 22 Apr 2015 15:44:46 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55368858.4010007@gmail.com> References: <55368858.4010007@gmail.com> Message-ID: On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov wrote: > It is an error to pass a regular context manager without ``__aenter__`` > and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` > to use ``async with`` outside of a coroutine. I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference? Not only would this simplify the syntax, it would also allow dropping the need for `async` to be a true keyword, since functions could be defined via "def async foo():" rather than "async def foo():" ...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously (as with "async with" and "async for"), rather than defining an asynchronous function. ISTM it should be "def async bar():" or even "def bar() async:". Also, even that seems suspect to me: if `await` looks for an __await__ method and simply returns the same object (synchronously) if the object doesn't have an await method, then your code sample that supposedly will fail if a function ceases to be a coroutine *will not actually fail*. In my experience working with coroutine systems, making a system polymorphic (do something appropriate with what's given) and idempotent (don't do anything if what's wanted is already done) makes it more robust. In particular, it eliminates the issue of mixing coroutines and non-coroutines. To sum up: I can see the use case for a new `await` distinguished from `yield`, but I don't see the need to create new syntax for everything; ISTM that adding the new asynchronous protocols and using them on demand is sufficient. Marking a function asynchronous so it can use asynchronous iteration and context management seems reasonably useful, but I don't think it's terribly important for the type of function result. Indeed, ISTM that the built-in `object` class could just implement `__await__` as a no-op returning self, and then *all* results are trivially asynchronous results and can be awaited idempotently, so that awaiting something that has already been waited for is a no-op. (Prior art: the Javascript Promise.resolve() method, which takes either a promise or a plain value and returns a promise, so that you can write code which is always-async in the presence of values that may already be known.) Finally, if the async for and with operations have to be distinguished by syntax at the point of use (vs. just always being used in coroutines), then ISTM that they should be `with async foo:` and `for async x in bar:`, since the asynchronousness is just an aspect of how the main keyword is executed. 
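A small sketch of the polymorphic/idempotent style described above, expressed with today's generator machinery; resolve() is a made-up helper in the spirit of JavaScript's Promise.resolve(), not an asyncio API:

    import asyncio

    @asyncio.coroutine
    def resolve(value):
        """Wait for *value* if it is awaitable; pass it through otherwise."""
        if isinstance(value, asyncio.Future) or asyncio.iscoroutine(value):
            value = yield from value  # suspend until the result is ready
        return value  # plain values (and finished results) pass through

    @asyncio.coroutine
    def demo():
        a = yield from resolve(42)                     # plain value: no-op
        b = yield from resolve(asyncio.sleep(0, 'x'))  # coroutine: awaited
        return a, b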
tl;dr: I like the overall ideas but hate the syntax and type segregation involved: declaring a function async at the top is OK to enable async with/for semantics and await expressions, but the rest seems unnecessary and bad for writing robust code. (e.g. note that requiring different syntax means a function must either duplicate code or restrict its input types more, and type changes in remote parts of the program will propagate syntax changes throughout.) From andrew.svetlov at gmail.com Wed Apr 22 22:10:44 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Wed, 22 Apr 2015 23:10:44 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby wrote: > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov wrote: >> It is an error to pass a regular context manager without ``__aenter__`` >> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` >> to use ``async with`` outside of a coroutine. > > I find this a little weird. Why not just have `with` and `for` inside > a coroutine dynamically check the iterator or context manager, and > either behave sync or async accordingly? Why must there be a > *syntactic* difference? IIRC Guido always like to have different syntax for calling regular functions and coroutines. That's why we need explicit syntax for asynchronous context managers and iterators. > > Not only would this simplify the syntax, it would also allow dropping > the need for `async` to be a true keyword, since functions could be > defined via "def async foo():" rather than "async def foo():" > > ...which, incidentally, highlights one of the things that's been > bothering me about all this "async foo" stuff: "async def" looks like > it *defines the function* asynchronously (as with "async with" and > "async for"), rather than defining an asynchronous function. ISTM it > should be "def async bar():" or even "def bar() async:". > > Also, even that seems suspect to me: if `await` looks for an __await__ > method and simply returns the same object (synchronously) if the > object doesn't have an await method, then your code sample that > supposedly will fail if a function ceases to be a coroutine *will not > actually fail*. > > In my experience working with coroutine systems, making a system > polymorphic (do something appropriate with what's given) and > idempotent (don't do anything if what's wanted is already done) makes > it more robust. In particular, it eliminates the issue of mixing > coroutines and non-coroutines. > > To sum up: I can see the use case for a new `await` distinguished from > `yield`, but I don't see the need to create new syntax for everything; > ISTM that adding the new asynchronous protocols and using them on > demand is sufficient. Marking a function asynchronous so it can use > asynchronous iteration and context management seems reasonably useful, > but I don't think it's terribly important for the type of function > result. Indeed, ISTM that the built-in `object` class could just > implement `__await__` as a no-op returning self, and then *all* > results are trivially asynchronous results and can be awaited > idempotently, so that awaiting something that has already been waited > for is a no-op. (Prior art: the Javascript Promise.resolve() method, > which takes either a promise or a plain value and returns a promise, > so that you can write code which is always-async in the presence of > values that may already be known.) 
> > Finally, if the async for and with operations have to be distinguished > by syntax at the point of use (vs. just always being used in > coroutines), then ISTM that they should be `with async foo:` and `for > async x in bar:`, since the asynchronousness is just an aspect of how > the main keyword is executed. > > tl;dr: I like the overall ideas but hate the syntax and type > segregation involved: declaring a function async at the top is OK to > enable async with/for semantics and await expressions, but the rest > seems unnecessary and bad for writing robust code. (e.g. note that > requiring different syntax means a function must either duplicate code > or restrict its input types more, and type changes in remote parts of > the program will propagate syntax changes throughout.) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From ethan at stoneleaf.us Wed Apr 22 22:13:01 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Wed, 22 Apr 2015 13:13:01 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: <20150422201301.GA24912@stoneleaf.us> On 04/22, PJ Eby wrote: > tl;dr: I like the overall ideas but hate the syntax and type > segregation involved: declaring a function async at the top is OK to > enable async with/for semantics and await expressions, but the rest > seems unnecessary and bad for writing robust code. (e.g. note that > requiring different syntax means a function must either duplicate code > or restrict its input types more, and type changes in remote parts of > the program will propagate syntax changes throughout.) Agreed. -- ~Ethan~ From gmludo at gmail.com Wed Apr 22 21:42:13 2015 From: gmludo at gmail.com (Ludovic Gasc) Date: Wed, 22 Apr 2015 21:42:13 +0200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> Message-ID: +1 about Andrew Svetlov proposition: please help to migrate as smoothly as possible to async/await. -- Ludovic Gasc (GMLudo) http://www.gmludo.eu/ 2015-04-22 20:32 GMT+02:00 Andrew Svetlov : > For now I can use mix asyncio.coroutines and `async def` functions, I > mean I can write `await f()` inside async def to call > asyncio.coroutine `f` and vise versa: I can use `yield from g()` > inside asyncio.coroutine to call `async def g(): ...`. > > If we forbid to call `async def` from regualr code how asyncio should > work? I'd like to push `async def` everywhere in asyncio API where > asyncio.coroutine required. > > On Wed, Apr 22, 2015 at 8:13 PM, Yury Selivanov > wrote: > > Hi Rajiv, > > > > On 2015-04-22 12:53 PM, Rajiv Kumar wrote: > >> > >> I'd like to suggest another way around some of the issues here, with > >> apologies if this has already been discussed sometime in the past. > >> > >> From the viewpoint of a Python programmer, there are two distinct > reasons > >> for wanting to suspend execution in a block of code: > >> > >> 1. To yield a value from an iterator, as Python generators do today. > >> > >> 2. To cede control to the event loop while waiting for an asynchronous > >> task > >> to make progress in a coroutine. 
> >> > >> As of today both of these reasons to suspend are supported by the same > >> underlying mechanism, i.e. a "yield" at the end of the chain of "yield > >> from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the > >> bottom > >> of it all there's effectively still a yield as I understand it. > >> > >> I think that the fact that these two concepts use the same mechanism is > >> what leads to the issues with coroutine-generators that Greg and Yury > have > >> raised. > >> > >> With that in mind, would it be possible to introduce a second form of > >> suspension to Python to specifically handle the case of ceding to the > >> event > >> loop? I don't know what the implementation complexity of this would be, > or > >> if it's even feasible. But roughly speaking, the syntax for this could > use > >> "await", and code would look just like it does in the PEP. The semantics > >> of > >> "await " would be analogous to "yield from " today, with the > >> difference that the Task would go up the chain of "await"s to the > >> outermost > >> caller, which would typically be asyncio, with some modifications from > its > >> form today. Progress would be made via __anext__ instead of __next__. > > > > > > I think that what you propose is a great idea. However, its > > implementation will be far more invasive than what PEP 492 > > proposes. I doubt that we'll be able to make it in 3.5 if > > we choose this route. > > > > BUT: With my latest proposal to disallow for..in loops and > > iter()/list()-like builtins, the fact that coroutines are > > based internally on generators is just an implementation > > detail. > > > > There is no way users can exploit the underlying generator > > object. Coroutine-objects only provide 'send()' and 'throw()' > > methods, which they would also have with your implementation > > idea. > > > > This gives us freedom to consider your approach in 3.6 if > > we decide to add coroutine-generators. To make this work > > we might want to patch inspect.py to make isgenerator() family > > of functions to return False for coroutines/coroutine-objects. > > > > Thanks a lot for the feedback! > > > > Yury > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com > > > > -- > Thanks, > Andrew Svetlov > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/gmludo%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Apr 22 22:25:38 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 16:25:38 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: <553803C2.6060902@gmail.com> Hi PJ, On 2015-04-22 3:44 PM, PJ Eby wrote: > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov wrote: >> It is an error to pass a regular context manager without ``__aenter__`` >> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` >> to use ``async with`` outside of a coroutine. > I find this a little weird. 
Why not just have `with` and `for` inside > a coroutine dynamically check the iterator or context manager, and > either behave sync or async accordingly? Why must there be a > *syntactic* difference? One of the things that we try to avoid is to have implicit places where code execution might be suspended. For that we use 'yield from' right now, and want to use 'await' with PEP 492. To have implicit context switches there are Stackless Python and greenlets, however, it's harder to reason about the code written in such a way. Having explicit 'yield from/await' is the selling point of asyncio and other frameworks that use generator-based coroutines. Hence, we want to stress that 'async with' and 'async for' do suspend the execution in their protocols. I don't want to lose control over what kind of iteration or context manager I'm using. I don't want to iterate through a cursor that doesn't do prefetching, I want to make sure that it does. This problem is solved by the PEP. > > Not only would this simplify the syntax, it would also allow dropping > the need for `async` to be a true keyword, since functions could be > defined via "def async foo():" rather than "async def foo():" > > ...which, incidentally, highlights one of the things that's been > bothering me about all this "async foo" stuff: "async def" looks like > it *defines the function* asynchronously (as with "async with" and > "async for"), rather than defining an asynchronous function. ISTM it > should be "def async bar():" or even "def bar() async:". If we keep 'async with', then we'll have to keep 'async def' to make it symmetric and easier to remember. But, in theory, I'd be OK with 'def async'. 'def name() async' is something that will be extremely hard to notice in the code. > > Also, even that seems suspect to me: if `await` looks for an __await__ > method and simply returns the same object (synchronously) if the > object doesn't have an await method, then your code sample that > supposedly will fail if a function ceases to be a coroutine *will not > actually fail*. It doesn't just do that. In the reference implementation, a single 'await o' compiles to: (o) # await arg on top of the stack GET_AWAITABLE LOAD_CONST None YIELD_FROM Where GET_AWAITABLE does the following: - If it's a coroutine-object -- return it - If it's an object with __await__, return iter(object.__await__()) - Raise a TypeError if the two steps above don't return If you had code like this: await coro() where coro is async def coro(): pass you can then certainly refactor coro to: def coro(): return future # or some awaitable, please refer to PEP 492 And it won't break anything. So I'm not sure I understand your remark about "*will not actually fail*". > > In my experience working with coroutine systems, making a system > polymorphic (do something appropriate with what's given) and > idempotent (don't do anything if what's wanted is already done) makes > it more robust. In particular, it eliminates the issue of mixing > coroutines and non-coroutines. Unfortunately, to completely eliminate the issue of reusing existing "non-coroutine" code, or of writing "coroutine" code that can be used with "non-coroutine" code, you have to use gevent-like libraries. > > To sum up: I can see the use case for a new `await` distinguished from > `yield`, but I don't see the need to create new syntax for everything; > ISTM that adding the new asynchronous protocols and using them on > demand is sufficient.
Marking a function asynchronous so it can use > asynchronous iteration and context management seems reasonably useful, > but I don't think it's terribly important for the type of function > result. Indeed, ISTM that the built-in `object` class could just > implement `__await__` as a no-op returning self, and then *all* > results are trivially asynchronous results and can be awaited > idempotently, so that awaiting something that has already been waited > for is a no-op. I see all objects implementing __await__ returning "self" as a very error-prone approach. It's totally OK to write code like this: async def coro(): return fut future = await coro() In the above example, if coro ceases to be a coroutine, 'future' will be the result of 'fut', not 'fut' itself. > (Prior art: the Javascript Promise.resolve() method, > which takes either a promise or a plain value and returns a promise, > so that you can write code which is always-async in the presence of > values that may already be known.) > > Finally, if the async for and with operations have to be distinguished > by syntax at the point of use (vs. just always being used in > coroutines), then ISTM that they should be `with async foo:` and `for > async x in bar:`, since the asynchronousness is just an aspect of how > the main keyword is executed. > > tl;dr: I like the overall ideas but hate the syntax and type > segregation involved: declaring a function async at the top is OK to > enable async with/for semantics and await expressions, but the rest > seems unnecessary and bad for writing robust code. (e.g. note that > requiring different syntax means a function must either duplicate code > or restrict its input types more, and type changes in remote parts of > the program will propagate syntax changes throughout.) Thanks, Yury From victor.stinner at gmail.com Wed Apr 22 22:26:08 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 22 Apr 2015 20:26:08 +0000 (UTC) Subject: [Python-Dev] async/await in Python; v2 References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> Message-ID: Hi, Guido van Rossum <guido at python.org> writes: > I'm slowly warming up to Greg's notion that you can't call a coroutine (or whatever it's called) without a special keyword. A huge part of the asyncio module is based on "yield from fut" where fut is a Future object. How do you write this using PEP 3152? Do you need to call an artificial method like "cocall fut.return_self()" where the return_self() method simply returns fut? When I discovered Python for the first time, I fell into the trap of trying to call a function without parentheses: "hello_world" instead of "hello_world()". I was very surprised that the language didn't protect me against such an obvious bug. But later, I used this feature in almost all my applications: passing a callback is just a must-have feature, and Python syntax for this is great! (just pass the function without parentheses) Would it be possible that creating a coroutine object by calling a coroutine function is a feature, and not a bug? I mean that it may be used in some cases. I worked a lot on asyncio and I saw a lot of hacks to solve some corner case issues, to be able to propose a nice API at the end. @asyncio.coroutine currently calls a function and *then* checks if it should yield from it or not: res = func(*args, **kw) if isinstance(res, futures.Future) or inspect.isgenerator(res): res = yield from res With PEP 3152, it's no longer possible to write such code.
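For concreteness, the dispatch described there can be written as a standalone helper under Python 3.4 asyncio -- a sketch only, with maybe_async() as an illustrative name rather than an asyncio API:

import asyncio
import inspect
from asyncio import futures

@asyncio.coroutine
def maybe_async(func, *args, **kw):
    # Call first, inspect the result afterwards: the pre-PEP 492 pattern.
    res = func(*args, **kw)
    if isinstance(res, futures.Future) or inspect.isgenerator(res):
        # A Future or coroutine object: suspend until it completes.
        res = yield from res
    return res  # a plain value: pass it through unchanged

Under PEP 3152 this helper has no direct equivalent, since a cofunction cannot be called without cocall and a plain function cannot be cocalled.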
I fear that we miss cases where it would be needed. maybeDeferred() is an important function in Twisted. As expected, users ask for a similar function in asyncio: http://stackoverflow.com/questions/20730248/maybedeferred-analog-with-asyncio Currently, it's possible to implement it using yield from. > OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way its done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function (perhaps to be renamed to ensure_task() to make way for the async keyword). Logging a warning when a coroutine object is not "consumed" ("yielded from"?) is only one of the asyncio.CoroWrapper features. It's now also used to remember where the coroutine object was created: it's very useful to rebuild the chain of function calls/tasks/coroutines to show where the bug comes from. (I still have a project to enhance debugging to create a full stack where a task was created. Currently, the stack stops at the last "Task._step()", but it's technically possible to go further (I have a PoC somewhere). I already introduced BaseEventLoop._current_handle as a first step.) Oh, and CoroWrapper also provides a better representation. But we might enhance repr(coroutine_object) directly in Python. Yury proposed to store the source (filename, line number) of the most recent frame where a coroutine object was created. But a single frame is not enough (usually, the interesting frame is at least the 3rd frame, not the most recent one). Storing more frames would kill performances in debug mode (and/or create reference cycles if we keep frame objects, not only filename+line number). For all these reasons, I'm in favor of keeping the ability of wrapping coroutine objects. It has a negligible impact in release mode and you can do whatever you want in debug mode which is very convenient. Victor From guido at python.org Wed Apr 22 22:31:18 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 13:31:18 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov wrote: > On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby wrote: > > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov > wrote: > >> It is an error to pass a regular context manager without ``__aenter__`` > >> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` > >> to use ``async with`` outside of a coroutine. > > > > I find this a little weird. Why not just have `with` and `for` inside > > a coroutine dynamically check the iterator or context manager, and > > either behave sync or async accordingly? Why must there be a > > *syntactic* difference? > > IIRC Guido always like to have different syntax for calling regular > functions and coroutines. > That's why we need explicit syntax for asynchronous context managers > and iterators. > To clarify: the philosophy behind asyncio coroutines is that you should be able to tell statically where a task may be suspended simply by looking for `yield from`. This means that *no* implicit suspend points may exist, and it rules out gevent, stackless and similar microthreading frameworks. In the new PEP this would become `await`, plus specific points dictated by `async for` and `async with` -- `async for` can suspend (block) at each iteration step, and `async with` can suspend at the enter and exit points. 
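In terms of the proposed protocols, that means roughly the following (a sketch only: __aiter__, __anext__, __aenter__ and __aexit__ are from the PEP, while the classes and their private methods are made up for illustration):

import asyncio

try:
    StopAsyncIteration
except NameError:
    # Not defined before the PEP lands; stub it out for the sketch.
    class StopAsyncIteration(Exception):
        pass

class QueryResults:
    # 'async for' suspends here on every iteration step.
    @asyncio.coroutine
    def __aiter__(self):
        return self

    @asyncio.coroutine
    def __anext__(self):
        row = yield from self._fetch_row()  # may block on network I/O
        if row is None:
            raise StopAsyncIteration
        return row

class Transaction:
    # 'async with' suspends on entry and exit.
    @asyncio.coroutine
    def __aenter__(self):
        yield from self._begin()
        return self

    @asyncio.coroutine
    def __aexit__(self, exc_type, exc, tb):
        yield from (self._rollback() if exc_type else self._commit())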
The use case for both is database drivers: `async for` may block for the next record to become available from the query, and `async with` may block in the implied `finally` clause in order to wait for a commit. (Both may also suspend at the top, but that's less important.) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Wed Apr 22 22:46:56 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 22 Apr 2015 20:46:56 +0000 (UTC) Subject: [Python-Dev] async/await in Python; v2 References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> Message-ID: Greg Ewing <greg.ewing at canterbury.ac.nz> writes: > I still don't like the idea of hijacking the generic > term "coroutine" and using it to mean this particular > type of object. There are only two hard things in Computer Science: cache invalidation and naming things. -- Phil Karlton :-) When reviewing Yury's PEP, I read Wikipedia's article on coroutines because I didn't know whether a "coroutine" was something new in Python or whether it was well defined. https://en.wikipedia.org/wiki/Coroutine Answer: it's not new, it's implemented in many languages, and it's well defined. But coroutines are not always directly called "coroutines" in other programming languages. Using a custom name like "cofunction" may confuse users coming from other programming languages. I prefer to keep "coroutine", but I agree that we should make some effort to define the different categories of "Python coroutines". Well, there are two kinds of coroutines: (A) asyncio coroutine in Python 3.4: use yield from, yield denied, decorated with @asyncio.coroutine (B) PEP 492 coroutine in Python 3.5: use await, yield & yield from denied, function definition prefixed by "async" Yury proposed "generator-based coroutine" for kind (A). Maybe not a great name, since we can learn in PEP 492 that kind (B) is also (internally) based on generators. I don't think that we should use distinct names for the two kinds in common cases. But when we need to clearly use distinct names, I propose the following names: Kind (A): - "yield-from coroutines" or "coroutines based on yield-from" - maybe "asyncio coroutines" - "legacy coroutines"? Kind (B): - "awaitable coroutines" or "coroutines based on await" - "asynchronous coroutine" to remember the "async" keyword even if it sounds wrong to repeat that a coroutine can be interrupted (it's almost the definition of a coroutine, no?) - or just "asynchronous function" (coroutine function) & "asynchronous object" (coroutine object) Victor From gmludo at gmail.com Wed Apr 22 23:00:44 2015 From: gmludo at gmail.com (Ludovic Gasc) Date: Wed, 22 Apr 2015 23:00:44 +0200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> Message-ID: 2015-04-22 22:46 GMT+02:00 Victor Stinner : > > Kind (A): > > - "yield-from coroutines" or "coroutines based on yield-from" > - maybe "asyncio coroutines" > - "legacy coroutines"? > The "legacy coroutines" name has the advantage of making it immediately clear that it isn't a good idea to write new source code with them. > Kind (B): > > - "awaitable coroutines" or "coroutines based on await" > - "asynchronous coroutine" to remember the "async" keyword even if it > sounds > wrong to repeat that a coroutine can be interrupted (it's almost the > definition of a coroutine, no?)
> - or just "asynchronous function" (coroutine function) & "asynchronous > object" (coroutine object) > Personally, if I've a vote right, "async coroutine" is just enough, even if it's a repetition. Or just "coroutine" ? I'm not fan for "new-style coroutines" like name. By the way, I hope you don't change a third time how to write async code in Python, because it will be harder to define a new name. Not related, but one of my coworkers asked me if with the new syntax it will be possible to write an async decorator for coroutines. If I understand correctly new grammar in PEP, it seems to be yes, but could you confirm ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Apr 22 23:08:03 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 17:08:03 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> Message-ID: <55380DB3.507@gmail.com> Ludovic, On 2015-04-22 5:00 PM, Ludovic Gasc wrote: > Not related, but one of my coworkers asked me if with the new syntax it > will be possible to write an async decorator for coroutines. > If I understand correctly new grammar in PEP, it seems to be yes, but could > you confirm ? There shouldn't be any problems with writing a decorator. Yury From facundobatista at gmail.com Wed Apr 22 23:45:00 2015 From: facundobatista at gmail.com (Facundo Batista) Date: Wed, 22 Apr 2015 18:45:00 -0300 Subject: [Python-Dev] Committing into which branches? Message-ID: Hola! I just commited a simple improvement to HTTPError repr, and checking in the source code page [0], I see that my commit has a small "default" besides it; and other commits don't have that, but have 2.7, or 3.4, etc... So, question: Did I commit in the correct branch? Should I have done anything different? Thanks! [0] https://hg.python.org/cpython/ -- . Facundo Blog: http://www.taniquetil.com.ar/plog/ PyAr: http://www.python.org/ar/ Twitter: @facundobatista From brett at python.org Wed Apr 22 23:46:57 2015 From: brett at python.org (Brett Cannon) Date: Wed, 22 Apr 2015 21:46:57 +0000 Subject: [Python-Dev] Committing into which branches? In-Reply-To: References: Message-ID: The default branch is going to become 3.5, so you're fine. On Wed, Apr 22, 2015 at 5:45 PM Facundo Batista wrote: > Hola! > > I just commited a simple improvement to HTTPError repr, and checking > in the source code page [0], I see that my commit has a small > "default" besides it; and other commits don't have that, but have 2.7, > or 3.4, etc... > > So, question: Did I commit in the correct branch? Should I have done > anything different? > > Thanks! > > [0] https://hg.python.org/cpython/ > > -- > . Facundo > > Blog: http://www.taniquetil.com.ar/plog/ > PyAr: http://www.python.org/ar/ > Twitter: @facundobatista > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From facundobatista at gmail.com Wed Apr 22 23:47:25 2015 From: facundobatista at gmail.com (Facundo Batista) Date: Wed, 22 Apr 2015 18:47:25 -0300 Subject: [Python-Dev] Committing into which branches? 
In-Reply-To: References: Message-ID: On Wed, Apr 22, 2015 at 6:46 PM, Brett Cannon wrote: > The default branch is going to become 3.5, so you're fine. Thanks!! -- . Facundo Blog: http://www.taniquetil.com.ar/plog/ PyAr: http://www.python.org/ar/ Twitter: @facundobatista From lukasz at langa.pl Wed Apr 22 23:49:11 2015 From: lukasz at langa.pl (Łukasz Langa) Date: Wed, 22 Apr 2015 14:49:11 -0700 Subject: [Python-Dev] Committing into which branches? In-Reply-To: References: Message-ID: <6F36A7A5-9FFD-4366-A719-F98A7B8407B5@langa.pl> > On Apr 22, 2015, at 2:45 PM, Facundo Batista wrote: > > Hola! > > I just committed a simple improvement to HTTPError repr, and checking > in the source code page [0], I see that my commit has a small > "default" besides it; and other commits don't have that, but have 2.7, > or 3.4, etc... > > So, question: Did I commit to the correct branch? Should I have done > anything different? This is an enhancement (http://bugs.python.org/issue23887 ) so it should go to default. You committed to the right branch. The mark on the website simply tells you that this is the newest commit on default. Otherwise it's, well, default. -- Best regards, Łukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From pmiscml at gmail.com Thu Apr 23 00:21:54 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 01:21:54 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: <20150423012154.3b991d63@x230> Hello, On Wed, 22 Apr 2015 13:31:18 -0700 Guido van Rossum wrote: > On Wed, Apr 22, 2015 at 1:10 PM, Andrew Svetlov > wrote: > > > On Wed, Apr 22, 2015 at 10:44 PM, PJ Eby > > wrote: > > > On Tue, Apr 21, 2015 at 1:26 PM, Yury Selivanov > > > > > wrote: > > >> It is an error to pass a regular context manager without > > >> ``__aenter__`` and ``__aexit__`` methods to ``async with``. It > > >> is a ``SyntaxError`` to use ``async with`` outside of a > > >> coroutine. > > > > > > I find this a little weird. Why not just have `with` and `for` > > > inside a coroutine dynamically check the iterator or context > > > manager, and either behave sync or async accordingly? Why must > > > there be a *syntactic* difference? > > > > IIRC Guido always like to have different syntax for calling regular > > functions and coroutines. > > That's why we need explicit syntax for asynchronous context managers > > and iterators. > > > > To clarify: the philosophy behind asyncio coroutines is that you > should be able to tell statically where a task may be suspended > simply by looking for `yield from`. This means that *no* implicit > suspend points may exist, and it rules out gevent, stackless and > similar microthreading frameworks. I always wanted to ask - does that mean that Python could have symmetric coroutines (in the sense that it would be a Pythonic feature), as long as the call syntax is different from a function call?
E.g.: sym def coro1(val): while True: val = coro2.corocall(val) sym def coro2(val): while True: val = coro1.corocall(val) coro1.call(1) -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Thu Apr 23 01:12:36 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 02:12:36 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> Message-ID: <20150423021236.2b2039ca@x230> Hello, On Wed, 22 Apr 2015 09:53:39 -0700 Rajiv Kumar wrote: > I'd like to suggest another way around some of the issues here, with > apologies if this has already been discussed sometime in the past. > > From the viewpoint of a Python programmer, there are two distinct > reasons for wanting to suspend execution in a block of code: > > 1. To yield a value from an iterator, as Python generators do today. > > 2. To cede control to the event loop while waiting for an > asynchronous task to make progress in a coroutine. > > As of today both of these reasons to suspend are supported by the same > underlying mechanism, i.e. a "yield" at the end of the chain of "yield > from"s. PEPs 492 and 3152 introduce "await" and "cocall", but at the > bottom of it all there's effectively still a yield as I understand it. > > I think that the fact that these two concepts use the same mechanism > is what leads to the issues with coroutine-generators that Greg and > Yury have raised. > > With that in mind, would it be possible to introduce a second form of > suspension to Python to specifically handle the case of ceding to the > event loop? Barring adding an ad hoc statement "yield_to_a_main_loop", there's a generic programming device to do it: symmetric coroutines. But it's unlikely to help with your sentiment that the same device is used for different purposes. At least with asymmetric coroutines as currently in Python, you have next() to "call" a coroutine and yield to "return" from it. With symmetric coroutines, you don't have a place to return to - you can only "call" another coroutine, and then you have the freedom to call any of them (including a main loop), but you need to take the trouble to always know whom you want to call. But I guess it already sounds confusing enough for folks who haven't heard about symmetric coroutines, whereas the call/return paradigm is much more familiar and understandable. That's certainly why Python implements the asymmetric model. And having both asymmetric and symmetric would be quite confusing, especially as symmetric coroutines are more powerful and asymmetric ones can be easily implemented in terms of symmetric ones using continuation-passing style. At the last occurrence of "easily", mere users of course start to run away, shouting that if they wanted to use Scheme, they'd have taken classes on it and used it long ago. So, the real problem with the dichotomy you describe above is not technical, but rather educational/documentational. And the current approach asyncio takes is "you should not care if a coroutine yields to the main loop, or how it is done". Actually, the current approach is to forbid and deny you knowledge of how it is done, quoting Victor Stinner from another mail: "(A) asyncio coroutine in Python 3.4: use yield from, yield denied". So, just pretend that there's no yield, only yield from, problem solved. But people know there's yield - they knew it for a long time before "yield from".
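asyncio even enforces the denial at runtime: a bare yield of an ordinary value out of a task's coroutine is rejected by the task machinery. A small demonstration on Python 3.4 (the exact error message may differ between versions):

import asyncio

@asyncio.coroutine
def bad():
    yield 42  # a plain value, not a Future

loop = asyncio.get_event_loop()
try:
    loop.run_until_complete(bad())
except RuntimeError as exc:
    print(exc)  # e.g. "Task got bad yield: 42"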
And there are valid uses for yield in a coroutine, like implementing your own application-level generation. Currently, any generation ability is usurped by asyncio's main loop. A much better approach IMHO is given in David Beazley's presentations on generators and coroutines, http://www.dabeaz.com/generators/ . He says that coroutines provided by a framework are essentially "system calls". And that's why you don't want to know how they work, and shouldn't care - because users usually don't care how the OS kernel implements system calls while sitting in the web browser. But if you want, you can, and you will discover that they're implemented by yield'ing objects of a special class. That's why you're *advised* not to use yields in coroutines - because if you want to catch your own, application-level, yields, you may at any time also get a system yield object. You would need to expect that possibility, filter such yields and pass them up (re-yield). But there's no forbidden magic in all that, and understanding it helps a lot IMHO. -- Best regards, Paul mailto:pmiscml at gmail.com From greg.ewing at canterbury.ac.nz Thu Apr 23 01:47:29 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 11:47:29 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537C0EB.7070300@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> Message-ID: <55383311.7060605@canterbury.ac.nz> Yury Selivanov wrote: > On the other hand, I hate the idea > of grammatically requiring parentheses for 'await' > expressions. That feels non-pytonic to me. How is it any different from grammatically requiring parens in an ordinary function call? Nobody ever complained about that. In the PEP 3152 way of thinking, a cocall is just a function call that happens to be suspendable. The fact that there is an iterator object involved behind the scenes is an implementation detail. You don't have to think about it or even know about it in order to write or understand suspendable code. It's possible to think about "yield from f(x)" or "await f(x)" that way, but only by exploiting a kind of pun in the code, where you think of f(x) as doing all the work and the rest as a syntactic marker indicating that the call is suspendable. PEP 3152 removes the pun by making this the *actual* interpretation of "cocall f(x)".
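The pun is easy to see with today's asyncio: the call itself does no work and merely builds a generator object, and nothing runs until the event loop iterates it (Python 3.4 code; a small demonstration, not part of either proposal):

import asyncio

@asyncio.coroutine
def f(x):
    yield from asyncio.sleep(0)  # a suspension point
    return x * 2

g = f(1)   # "calling" f does nothing yet; it only creates an object
print(g)   # <generator object f at 0x...>

loop = asyncio.get_event_loop()
print(loop.run_until_complete(f(1)))  # 2 -- work happens only here

Under PEP 3152 the two halves merge: "cocall f(1)" denotes one suspendable call rather than "build an object, then wait on it".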
-- Greg From chris.barker at noaa.gov Wed Apr 22 23:38:01 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 22 Apr 2015 14:38:01 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: > Oh wait, maybe it won't -- a string IS a sequence of strings. That's why > this is an insidious bug in the first place. On Tue, Apr 21, 2015 at 11:32 PM, Terry Reedy wrote: > I was just thinking today that for this, typing needs a subtraction > (difference) operation in addition to an addition (union) operation: > Difference(Iterable(str), str) > Yup -- that might solve, it, but it feels a bit odd -- I can take any Iterable of string, except a string. -- but what if there are others that won't work??? But I guess that's the core of putting type hints on a dynamic language. Still, I tend to think that this particular issue is really a limitation with Python's type system -- nothing to do with type hinting. I can see that a character type seems useless in Python, but there are lessons from other places: a numpy array is a collection of (usually) numbers that can be treated as a single entity -- much like a string is a collection of characters that is treated as a single entity -- in both cases, it's core to convenience and performance to do that. But with numpy, when you index an array, you get something back with one less dimension: index into a 3-d array, you get a 2-d array index into a 2-d array, you get a 1-d array index into a 1-d array, you get a scalar -- NOT a length-one 1-d array Sometimes this is a pain for generic code, but more often than not it's critical to writing dynamic code -- not because you couldn't do the operations you want, but because it's important to distinguish between a scalar and an array that happens to have only one value. Anyway, the point is that being able to say "all these types, except this one" would solve this particular problem -- but would it solve any others? Do we want this to work around a quirk in Pythons string type? NOTE: I know full well that adding a character type to Python is not worth it. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Thu Apr 23 02:35:18 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 17:35:18 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <553838F4.7070303@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> Message-ID: On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing wrote: > Guido van Rossum wrote: > >> On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to >> wrap a coroutine in a Task, the way its done in asyncio by the Task() >> constructor, the loop.create_task() method, and the async() function >> > > That's easy. You can always use costart() to adapt a cofunction > for use with something expecting a generator-based coroutine, > e.g. > > codef my_task_func(arg): > ... > > my_task = Task(costart(my_task_func, arg)) > > If you're willing to make changes, Task() et al could be made to > recognise cofunctions and apply costart() where needed. 
Hm, that feels backwards incompatible (since currently I can write Task(my_task_func(arg)) and also a step backwards in elegance (having to pass the args separately). OTOH the benefit is that it's much harder to accidentally forget to wait for a coroutine. And maybe the backward compatibility issue is not really a problem because you have to opt in by using codef or async def. So I'm still torn. :-) Somebody would need to take a mature asyncio app and see how often this is used (i.e. how many place would require adding costart() as in the above example). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Thu Apr 23 02:45:13 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 17:45:13 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: On Wed, Apr 22, 2015 at 2:38 PM, Chris Barker wrote: > > > Oh wait, maybe it won't -- a string IS a sequence of strings. That's why >> this is an insidious bug in the first place. > > On Tue, Apr 21, 2015 at 11:32 PM, Terry Reedy wrote: > > >> I was just thinking today that for this, typing needs a subtraction >> (difference) operation in addition to an addition (union) operation: >> Difference(Iterable(str), str) >> > > Yup -- that might solve, it, but it feels a bit odd -- I can take any > Iterable of string, except a string. -- but what if there are others that > won't work??? But I guess that's the core of putting type hints on a > dynamic language. > > Still, I tend to think that this particular issue is really a limitation > with Python's type system -- nothing to do with type hinting. > > I can see that a character type seems useless in Python, but there are > lessons from other places: a numpy array is a collection of (usually) > numbers that can be treated as a single entity -- much like a string is a > collection of characters that is treated as a single entity -- in both > cases, it's core to convenience and performance to do that. But with numpy, > when you index an array, you get something back with one less dimension: > > index into a 3-d array, you get a 2-d array > index into a 2-d array, you get a 1-d array > index into a 1-d array, you get a scalar -- NOT a length-one 1-d array > > Sometimes this is a pain for generic code, but more often than not it's > critical to writing dynamic code -- not because you couldn't do the > operations you want, but because it's important to distinguish between a > scalar and an array that happens to have only one value. > > Anyway, the point is that being able to say "all these types, except this > one" would solve this particular problem -- but would it solve any others? > Do we want this to work around a quirk in Pythons string type? > > NOTE: I know full well that adding a character type to Python is not worth > it. > If you switch to bytes the problem goes away. :-P More seriously, I doubt there are other important use cases for Difference. Given that even if Difference existed, and even if we had a predefined type alias for Difference[Iterable[str], str], you' still have to remember to mark up all those functions with that annotation. It almost sounds simpler to just predefine this function: def make_string_list(a: Union[str, Iterable[str]]) -> Iterable[str]: if isinstance(a, str): return [a] else: return a and call this in those functions that have an Interable[str] argument. 
Now instead of getting errors for all the places where a caller mistakenly passes a single str, you've *fixed* all those call sites. Isn't that more Pythonic? :-) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Thu Apr 23 02:55:54 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 20:55:54 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> Message-ID: <5538431A.1020709@gmail.com> On 2015-04-22 8:35 PM, Guido van Rossum wrote: > On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing > wrote: > >> Guido van Rossum wrote: >> >>> On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to >>> wrap a coroutine in a Task, the way its done in asyncio by the Task() >>> constructor, the loop.create_task() method, and the async() function >>> >> That's easy. You can always use costart() to adapt a cofunction >> for use with something expecting a generator-based coroutine, >> e.g. >> >> codef my_task_func(arg): >> ... >> >> my_task = Task(costart(my_task_func, arg)) >> >> If you're willing to make changes, Task() et al could be made to >> recognise cofunctions and apply costart() where needed. > > Hm, that feels backwards incompatible (since currently I can write > Task(my_task_func(arg)) and also a step backwards in elegance (having to > pass the args separately). > > OTOH the benefit is that it's much harder to accidentally forget to wait > for a coroutine. And maybe the backward compatibility issue is not really a > problem because you have to opt in by using codef or async def. > > So I'm still torn. :-) > > Somebody would need to take a mature asyncio app and see how often this is > used (i.e. how many place would require adding costart() as in the above > example). Somewhere in this thread Victor Stinner wrote: """A huge part of the asyncio module is based on "yield from fut" where fut is a Future object.""" So how would we do "await fut" if await requires parentheses? I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since. Yury From guido at python.org Thu Apr 23 03:04:57 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 22 Apr 2015 18:04:57 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538431A.1020709@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> Message-ID: On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov wrote: > On 2015-04-22 8:35 PM, Guido van Rossum wrote: > >> On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing >> wrote: >> >> Guido van Rossum wrote: >>> >>> On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to >>>> wrap a coroutine in a Task, the way its done in asyncio by the Task() >>>> constructor, the loop.create_task() method, and the async() function >>>> >>>> That's easy. You can always use costart() to adapt a cofunction >>> for use with something expecting a generator-based coroutine, >>> e.g. >>> >>> codef my_task_func(arg): >>> ... 
>>> >>> my_task = Task(costart(my_task_func, arg)) >>> >>> If you're willing to make changes, Task() et al could be made to >>> recognise cofunctions and apply costart() where needed. >>> >> >> Hm, that feels backwards incompatible (since currently I can write >> Task(my_task_func(arg)) and also a step backwards in elegance (having to >> pass the args separately). >> >> OTOH the benefit is that it's much harder to accidentally forget to wait >> for a coroutine. And maybe the backward compatibility issue is not really >> a >> problem because you have to opt in by using codef or async def. >> >> So I'm still torn. :-) >> >> Somebody would need to take a mature asyncio app and see how often this is >> used (i.e. how many place would require adding costart() as in the above >> example). >> > > Somewhere in this thread Victor Stinner wrote: > > """A huge part of the asyncio module is based on "yield from fut" where > fut is a Future object.""" > > So how would we do "await fut" if await requires parentheses? > We could make Future a valid co-callable object. > I think that the problem of forgetting 'yield from' is a bit exaggerated. > Yes, I myself forgot 'yield from' once or twice. But that's it, it has > never happened since. Maybe, but it *is* a part of everybody's learning curve. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Thu Apr 23 03:10:52 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 21:10:52 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> Message-ID: <5538469C.6050100@gmail.com> On 2015-04-22 9:04 PM, Guido van Rossum wrote: > On Wed, Apr 22, 2015 at 5:55 PM, Yury Selivanov > wrote: > >> On 2015-04-22 8:35 PM, Guido van Rossum wrote: >> >>> On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing >>> wrote: >>> >>> Guido van Rossum wrote: >>>> On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to >>>>> wrap a coroutine in a Task, the way its done in asyncio by the Task() >>>>> constructor, the loop.create_task() method, and the async() function >>>>> >>>>> That's easy. You can always use costart() to adapt a cofunction >>>> for use with something expecting a generator-based coroutine, >>>> e.g. >>>> >>>> codef my_task_func(arg): >>>> ... >>>> >>>> my_task = Task(costart(my_task_func, arg)) >>>> >>>> If you're willing to make changes, Task() et al could be made to >>>> recognise cofunctions and apply costart() where needed. >>>> >>> Hm, that feels backwards incompatible (since currently I can write >>> Task(my_task_func(arg)) and also a step backwards in elegance (having to >>> pass the args separately). >>> >>> OTOH the benefit is that it's much harder to accidentally forget to wait >>> for a coroutine. And maybe the backward compatibility issue is not really >>> a >>> problem because you have to opt in by using codef or async def. >>> >>> So I'm still torn. :-) >>> >>> Somebody would need to take a mature asyncio app and see how often this is >>> used (i.e. how many place would require adding costart() as in the above >>> example). 
>>> >> Somewhere in this thread Victor Stinner wrote: >> >> """A huge part of the asyncio module is based on "yield from fut" where >> fut is a Future object.""" >> >> So how would we do "await fut" if await requires parentheses? >> > We could make Future a valid co-callable object. So you would have to write 'await fut()'? This is non-intuitive. To make Greg's proposal work it'd be a *requirement* for 'await' (enforced by the grammar!) to have '()' after it. Yury From andrew.svetlov at gmail.com Thu Apr 23 03:16:01 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Thu, 23 Apr 2015 04:16:01 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> Message-ID: On Thu, Apr 23, 2015 at 3:35 AM, Guido van Rossum wrote: > On Wed, Apr 22, 2015 at 5:12 PM, Greg Ewing > wrote: >> >> Guido van Rossum wrote: >>> >>> On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to >>> wrap a coroutine in a Task, the way its done in asyncio by the Task() >>> constructor, the loop.create_task() method, and the async() function >> >> >> That's easy. You can always use costart() to adapt a cofunction >> for use with something expecting a generator-based coroutine, >> e.g. >> >> codef my_task_func(arg): >> ... >> >> my_task = Task(costart(my_task_func, arg)) >> >> If you're willing to make changes, Task() et al could be made to >> recognise cofunctions and apply costart() where needed. > > > Hm, that feels backwards incompatible (since currently I can write > Task(my_task_func(arg)) and also a step backwards in elegance (having to > pass the args separately). > > OTOH the benefit is that it's much harder to accidentally forget to wait for > a coroutine. And maybe the backward compatibility issue is not really a > problem because you have to opt in by using codef or async def. > > So I'm still torn. :-) > > Somebody would need to take a mature asyncio app and see how often this is > used (i.e. how many place would require adding costart() as in the above > example). > I have not found fresh patch for 3152 to play with, but at least aiohttp [1] library very often creates new tasks by `async(coro(...))` call. The same for aiozmq, aioredis, sockjs (aiohttp-based library for sock.js), aiokafka etc. My applications created for my job also has a `async(...)` calls or direct `Task(f(arg))` creations -- the numbers are between 3 and 10 usage lines per application. Not a big deal to fix them all but it's backward incompatibility. In opposite, I've finished experimental branch [2] of aiomysql library (asyncio driver for MySQL database) with support for `async for` and `async with`. The main problem with public released version is impossibility to handle transactions (requires async context manager) and iteration with async fetching data from cursor (required for server-side cursors for example). Now both problems are solved with keeping full backward compatibility. The library can be used with Python 3.3+ but obviously no new features are available for old Pythons. I use asyncio coroutines, not async functions, e.g.: class Cursor: # ... @asyncio.coroutine def __aiter__(self): return self @asyncio.coroutine def __anext__(self): ret = yield from self.fetchone() if ret is not None: return ret else: raise StopAsyncIteration The whole aiomysql code is correct from Python 3.3+ perspective. 
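For readers following along, this is roughly the machinery that `async for` would hide when driving a cursor like the one above (a sketch of the PEP's desugaring written in Python 3.4 terms; it assumes the StopAsyncIteration exception from the proposal is available):

import asyncio

@asyncio.coroutine
def consume(cursor):
    # approximately what "async for row in cursor: ..." expands to
    it = yield from cursor.__aiter__()
    while True:
        try:
            row = yield from it.__anext__()
        except StopAsyncIteration:
            break
        print(row['id'], row['name'])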
For testing new features I use new syntax of in separate test files, test runner will skip test modules with syntax errors on old Python but run those modules on python from PEP 492 branch. Usage example (table 'tbl' is pre-filled, DB engine is connected to server): async def go(engine): async with engine.connect() as conn: async with (await conn.begin()) as tr: await conn.execute("DELETE FROM tbl WHERE (id % 2) = 0") async for row in conn.execute("SELECT * FROM tbl"): print(row['id'], row['name']) [1] https://github.com/KeepSafe/aiohttp [2] https://github.com/aio-libs/aiomysql/tree/await > -- > --Guido van Rossum (python.org/~guido) > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com > -- Thanks, Andrew Svetlov From yselivanov.ml at gmail.com Thu Apr 23 03:27:06 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 22 Apr 2015 21:27:06 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55383311.7060605@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <55383311.7060605@canterbury.ac.nz> Message-ID: <55384A6A.2010002@gmail.com> Greg, On 2015-04-22 7:47 PM, Greg Ewing wrote: > Yury Selivanov wrote: > >> On the other hand, I hate the idea >> of grammatically requiring parentheses for 'await' >> expressions. That feels non-pytonic to me. > > How is it any different from grammatically requiring > parens in an ordinary function call? Nobody ever > complained about that. It is different. 1. Because 'await' keyword might be at a great distance from the object you're really calling: await foo.bar.baz['spam']() +-----------------------+ Can I chain the calls: await foo()() ? or await foo().bar()? 2. Because there is no other keyword in python with similar behaviour. 3. Moreover: unless I can write 'await future' - your proposal *won't* work with a lot of existing code and patterns. It's going to be radically different from all other languages that implement 'await' too. Yury From andrew.svetlov at gmail.com Thu Apr 23 03:50:48 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Thu, 23 Apr 2015 04:50:48 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55384A6A.2010002@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <55383311.7060605@canterbury.ac.nz> <55384A6A.2010002@gmail.com> Message-ID: I guess to raise exception on unwinded async generator in destructor even in non-debug mode. Debug mode may have more complex info with source_traceback included, as Victor Stinner does for CoroWrapper. On Thu, Apr 23, 2015 at 4:27 AM, Yury Selivanov wrote: > Greg, > > On 2015-04-22 7:47 PM, Greg Ewing wrote: >> >> Yury Selivanov wrote: >> >>> On the other hand, I hate the idea >>> of grammatically requiring parentheses for 'await' >>> expressions. That feels non-pytonic to me. >> >> >> How is it any different from grammatically requiring >> parens in an ordinary function call? Nobody ever >> complained about that. > > > It is different. > > 1. Because 'await' keyword might be at a great distance > from the object you're really calling: > > await foo.bar.baz['spam']() > +-----------------------+ > > Can I chain the calls: > > await foo()() ? > > or await foo().bar()? > > 2. 
Because there is no other keyword in python > with similar behaviour. > > 3. Moreover: unless I can write 'await future' - your > proposal *won't* work with a lot of existing code > and patterns. It's going to be radically different > from all other languages that implement 'await' too. > > Yury > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From greg.ewing at canterbury.ac.nz Thu Apr 23 04:58:37 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 14:58:37 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537C9D0.80505@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> Message-ID: <55385FDD.8080403@canterbury.ac.nz> On 04/23/2015 04:18 AM, Yury Selivanov wrote: > 2. We'll hack Gen(/ceval.c?) objects to raise an error if they > are called directly and have a 'CO_COROUTINE' flag. By "Gen", do you mean the generator-function or the generator-iterator? That flag has to be on the generator-function, not the generator-iterator, otherwise by the time ceval sees it, the call that should have been forbidden has already been made. To make this work without flagging the function, it would be necessary to check the result of every function call that wasn't immediately awaited and raise an exception if it were awaitable. But that would mean awaitable objects not being fully first-class citizens, since there would be some perfectly reasonable things that you can't do with them. I suspect it would make writing the kernel of a coroutine-scheduling system such as asyncio very awkward, perhaps impossible, to write in pure Python. > 3. Task(), create_task() and async() will be modified to call > 'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag. Or, as I pointed out earlier, the caller can wrap the argument in something equivalent to costart(). > 4. 'await' will require parentheses grammatically. That will > make it different from 'yield' expression. For instance, > I still don't know what would 'await coro(123)()' mean. In PEP 3152, cocall binds to the nearest set of function-calling parens, so 'cocall f()()' is parsed as '(cocall f())()'. If you want it the other way, you have to write it as 'cocall (f())()'. I know that's a somewhat arbitrary thing to remember, and it makes chained function calls a bit harder to write and read. But chaining calls like that is a fairly rare thing to do, in contrast with using a call expression as an argument to another call, which is very common. That's not the only case, either. Just about any unparenthesised use of yield-from other than the sole contents of the RHS of an assignment seems to be disallowed. All of these are currently syntax errors, for example: yield from f(x) + yield from g(x) x + yield from g(x) [yield from f(x)] > 5. 'await foo(*a, **k)' will be an equivalent to > 'yield from type(coro).__await__(coro, *a, **k)' Again, I'm not sure whether you're proposing to make the functions the await-able objects rather than the iterators (which would effectively be PEP 3152 with __cocall__ renamed to __await__) or something else. I won't comment further on this point until that's clearer. > 6. 
If we ever decide to implement coroutine-generators -- > async def functions with 'await' *and* some form of 'yield' -- > we'll need to reverse the rule -- allow __call__ and > disallow __await__ on such objects (so that you'll be able > to write 'async for item in coro_gen()' instead of > 'async for item in await coro_gen()'. Maybe. I haven't thought that idea through properly yet. Possibly the answer is that you define such a function using an ordinary "def", to match the way it's called. The fact that it's an async generator is then indicated by the fact that it contains "async yield". -- Greg From greg.ewing at canterbury.ac.nz Thu Apr 23 06:38:06 2015 From: greg.ewing at canterbury.ac.nz (Greg) Date: Thu, 23 Apr 2015 16:38:06 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> Message-ID: <5538772E.4030706@canterbury.ac.nz> On 23/04/2015 6:32 a.m., Andrew Svetlov wrote: > If we forbid to call `async def` from regualr code how asyncio should > work? I'd like to push `async def` everywhere in asyncio API where > asyncio.coroutine required. As I suggested earlier, a way could be provided to mark a function as callable using either yield from f() or await f(). That would water down the error catching ability a bit, but it would allow interoperability with existing asyncio code. -- Greg From greg.ewing at canterbury.ac.nz Thu Apr 23 08:16:17 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 18:16:17 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5537F579.9020704@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <5537D6B4.8060204@gmail.com> <5537EC2C.4070105@gmail.com> <5537F579.9020704@gmail.com> Message-ID: <55388E31.3090908@canterbury.ac.nz> Yury Selivanov wrote: > I think there is another way... instead of pushing > > GET_ITER > ... > YIELD_FROM > > opcodes, we'll need to replace GET_ITER with another one: > > GET_ITER_SPECIAL > ... > YIELD_FROM I'm lost. What Python code are you suggesting this would be generated from? -- Greg From greg.ewing at canterbury.ac.nz Thu Apr 23 08:37:28 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 18:37:28 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: <55389328.9040406@canterbury.ac.nz> PJ Eby wrote: > I find this a little weird. Why not just have `with` and `for` inside > a coroutine dynamically check the iterator or context manager, and > either behave sync or async accordingly? Why must there be a > *syntactic* difference? It depends on whether you think it's important to have a syntactic marker for points where the code can potentially be suspended. In my original vision for PEP 3152, there was no "cocall" syntax -- you just wrote an ordinary call, and whether to make a cocall or not was determined at run time. But Guido and others felt that it would be better for suspension points to be explicit, so I ended up with cocall. The same reasoning presumably applies to asynchronous 'for' and 'with'. If you think that it's important to make suspendable calls explicit, you probably want to mark them as well. 
> ...which, incidentally, highlights one of the things that's been
> bothering me about all this "async foo" stuff: "async def" looks like
> it *defines the function* asynchronously

That bothers me a bit, too, but my main problem with it is the way it
displaces the function name. "def f() async:" would solve both of those
problems.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 23 09:03:32 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2015 19:03:32 +1200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <553803C2.6060902@gmail.com>
References: <55368858.4010007@gmail.com> <553803C2.6060902@gmail.com>
Message-ID: <55389944.9080004@canterbury.ac.nz>

Yury Selivanov wrote:

> - If it's an object with __await__, return iter(object.__await__())

Is the iter() really needed? Couldn't the contract of __await__ be that
it always returns an iterator?

--
Greg

From agriff at tin.it Thu Apr 23 09:21:11 2015
From: agriff at tin.it (Andrea Griffini)
Date: Thu, 23 Apr 2015 09:21:11 +0200
Subject: [Python-Dev] Questionable TCP server example
Message-ID:

It's not the first time someone has been confused by the server example at

https://docs.python.org/3/library/socketserver.html

where the receiving side does not loop over recv(). Moreover, the
documentation contains a misleading description of what really happens:

"The difference is that the readline() call in the second handler will call
recv() multiple times until it encounters a newline character, while the
single recv() call in the first handler will just return what has been sent
from the client in one sendall() call."

Unless I'm missing something, there is no way to know on the client side
when all the data sent by "sendall" has been received (the TCP stream
protocol has no message boundaries), and the `recv`-based code handles
neither fragmentation nor clients that send more than 1024 bytes.

Am I missing something, or is that indeed an example of how NOT to write a
socket-based server?

Andrea
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From tjreedy at udel.edu Thu Apr 23 09:29:04 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 23 Apr 2015 03:29:04 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID:

On 4/22/2015 8:45 PM, Guido van Rossum wrote:
> On Wed, Apr 22, 2015 at 2:38 PM, Chris Barker wrote:
> On Tue, Apr 21, 2015 at 11:32 PM, Terry Reedy wrote:
>
> I was just thinking today that for this, typing needs a
> subtraction (difference) operation in addition to an addition
> (union) operation: Difference(Iterable(str), str)
>
> Anyway, the point is that being able to say "all these types, except
> this one" would solve this particular problem -- but would it solve
> any others? Do we want this to work around a quirk in Python's string
> type?
>
> More seriously, I doubt there are other important use cases for
> Difference.

I thought about Difference(numbers.Number, complex) to get ordered
numbers, but numbers.Real should probably work. I agree more real uses
are needed before adding Difference.

--
Terry Jan Reedy

From tds333+pydev at gmail.com Thu Apr 23 09:30:30 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Thu, 23 Apr 2015 09:30:30 +0200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55368858.4010007@gmail.com>
References: <55368858.4010007@gmail.com>
Message-ID:

Hi,

most of the time I am a silent reader, but in this discussion I must step in.
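On Andrea's recv() point above: one conventional fix is to loop until the peer
closes its side of the connection (or until an application-level length prefix
or delimiter is satisfied). A minimal sketch of the read-until-EOF variant,
not taken from the docs:

    def recv_all(sock, bufsize=1024):
        """Collect data until the peer shuts down its sending side."""
        chunks = []
        while True:
            chunk = sock.recv(bufsize)
            if not chunk:            # b'' means the peer closed/shut down
                break
            chunks.append(chunk)
        return b''.join(chunks)

This only works if the client calls shutdown(socket.SHUT_WR) (or closes) after
sendall(); otherwise the server needs explicit message framing to know where a
request ends.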
I use Twisted and async stuff a lot, and have followed the development of
asyncio closely for years.

First, it is good to differentiate async coroutines from generators, so
everyone can see the difference, keep it in mind, and not mix the two up.
It also makes things easier to explain to new users. Sometimes generators
blow their minds and it takes a while to get used to them. Async stuff is
even harder.

1. I am fine with using something special instead of "yield" or "yield
from" for this. C#'s "await" is ok. Everything else suggested complicates
the language and makes it harder to read.

2.
async def f(): is harder to read and something special; it also breaks the
symmetry in front (def indent). Also, every existing tool must be changed
to support it. The same goes for "def async" and "def f() async:". I think
a decorator is enough here:

@coroutine
def f():

is the best solution to mark something as a coroutine.

3.
async with and async for
A bad idea: we clutter the language even more, and it is one more thing
every newbie could get wrong.

for x in y:
    result = await f()

is enough; every 'async' framework has lived without it for years. The
same goes for the with statement. The main use case suggested was database
code, and that is also a case usually best handled by deferring the work
to a thread and keeping it non-async.

All together, it is very late in the 3.5 development cycle to incorporate
such a big change. Best is to give all this some more time, defer it to
3.6, and experiment with it in some alpha releases.

Regards,

Wolfgang

On Tue, Apr 21, 2015 at 7:26 PM, Yury Selivanov wrote:
> Hi python-dev,
>
> I'm moving the discussion from python-ideas to here.
>
> The updated version of the PEP should be available shortly
> at https://www.python.org/dev/peps/pep-0492
> and is also pasted in this email.
>
> Updates:
>
> 1. CO_ASYNC flag was renamed to CO_COROUTINE;
>
> 2. sys.set_async_wrapper() was renamed to
> sys.set_coroutine_wrapper();
>
> 3. New function: sys.get_coroutine_wrapper();
>
> 4. types.async_def() renamed to types.coroutine();
>
> 5. New section highlighting differences from
> PEP 3152.
>
> 6. New AST node - AsyncFunctionDef; the proposal
> now is 100% backwards compatible;
>
> 7. A new section clarifying that coroutine-generators
> are not part of the current proposal;
>
> 8. Various small edits/typos fixes.
>
>
> There is a bug tracker issue to track code review
> of the reference implementation (Victor Stinner is
> doing the review): http://bugs.python.org/issue24017
> While the PEP isn't accepted, we want to make sure
> that the reference implementation is ready when such
> a decision is made.
>
>
> Let's discuss some open questions:
>
> 1. Victor raised the question of whether we should move the
> coroutine() function from the 'types' module to
> 'functools'.
>
> My opinion is that the 'types' module is a better
> place for 'coroutine()', since it adjusts the
> type of the passed generator. 'functools' is
> about 'partials', 'lru_cache' and 'wraps' kinds
> of things.
>
> 2. I propose to disallow the use of 'for..in' loops,
> and builtins like 'list()', 'iter()', 'next()',
> 'tuple()' etc. on coroutines.
>
> It's possible by modifying PyObject_GetIter to
> raise an exception if it receives a coroutine object.
>
> 'yield from' can also be modified to only accept
> coroutine objects if it is called from a generator
> with the CO_COROUTINE flag.
>
> This will further separate coroutines from
> generators, making it harder to screw something
> up by accident.
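As a concrete illustration of the second open question, a tiny sketch (not
from the email, and assuming the behavior described for the await_noiter
branch mentioned below) of what disallowing iteration over coroutine objects
looks like:

    async def coro():
        return 1

    c = coro()
    try:
        iter(c)          # the proposal makes coroutine objects non-iterable
    except TypeError as exc:
        print('rejected:', exc)
    finally:
        c.close()        # avoid a "coroutine was never awaited" warning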
> > I have a branch of reference implementation > https://github.com/1st1/cpython/tree/await_noiter > where this is implemented. I did not observe > any performance drop. > > There is just one possible backwards compatibility > issue here: there will be an exception if some user > of asyncio actually used to iterate over generators > decorated with @coroutine. But I can't imagine > why would someone do that, and even if they did -- > it's probably a bug or wrong usage of asyncio. > > > That's it! I'd be happy to hear some feedback! > > Thanks, > Yury > > > > PEP: 492 > Title: Coroutines with async and await syntax > Version: $Revision$ > Last-Modified: $Date$ > Author: Yury Selivanov > Status: Draft > Type: Standards Track > Content-Type: text/x-rst > Created: 09-Apr-2015 > Python-Version: 3.5 > Post-History: 17-Apr-2015, 21-Apr-2015 > > > Abstract > ======== > > This PEP introduces new syntax for coroutines, asynchronous ``with`` > statements and ``for`` loops. The main motivation behind this proposal > is to streamline writing and maintaining asynchronous code, as well as > to simplify previously hard to implement code patterns. > > > Rationale and Goals > =================== > > Current Python supports implementing coroutines via generators (PEP > 342), further enhanced by the ``yield from`` syntax introduced in PEP > 380. This approach has a number of shortcomings: > > * it is easy to confuse coroutines with regular generators, since they > share the same syntax; async libraries often attempt to alleviate > this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); > > * it is not possible to natively define a coroutine which has no > ``yield`` or ``yield from`` statements, again requiring the use of > decorators to fix potential refactoring issues; > > * support for asynchronous calls is limited to expressions where > ``yield`` is allowed syntactically, limiting the usefulness of > syntactic features, such as ``with`` and ``for`` statements. > > This proposal makes coroutines a native Python language feature, and > clearly separates them from generators. This removes > generator/coroutine ambiguity, and makes it possible to reliably define > coroutines without reliance on a specific library. This also enables > linters and IDEs to improve static code analysis and refactoring. > > Native coroutines and the associated new syntax features make it > possible to define context manager and iteration protocols in > asynchronous terms. As shown later in this proposal, the new ``async > with`` statement lets Python programs perform asynchronous calls when > entering and exiting a runtime context, and the new ``async for`` > statement makes it possible to perform asynchronous calls in iterators. > > > Specification > ============= > > This proposal introduces new syntax and semantics to enhance coroutine > support in Python, it does not change the internal implementation of > coroutines, which are still based on generators. > > It is strongly suggested that the reader understands how coroutines are > implemented in Python (PEP 342 and PEP 380). It is also recommended to > read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). > > From this point in this document we use the word *coroutine* to refer > to functions declared using the new syntax. *generator-based > coroutine* is used where necessary to refer to coroutines that are > based on generator syntax. 
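To make the terminology above concrete, a minimal sketch (not part of the PEP
text) of the declaration syntax introduced next: calling a coroutine function
runs no body code, it only creates a coroutine object, exactly as calling a
generator function creates a generator object:

    async def read_data(db):
        pass

    coro = read_data(None)   # nothing in the body runs yet
    print(type(coro))        # a distinct coroutine object
    coro.close()             # unfinished coroutines should be closed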
> > > New Coroutine Declaration Syntax > -------------------------------- > > The following new syntax is used to declare a coroutine:: > > async def read_data(db): > pass > > Key properties of coroutines: > > * ``async def`` functions are always coroutines, even if they do not > contain ``await`` expressions. > > * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` > expressions in an ``async`` function. > > * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced > to enable runtime detection of coroutines (and migrating existing > code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` > flags set. > > * Regular generators, when called, return a *generator object*; > similarly, coroutines return a *coroutine object*. > > * ``StopIteration`` exceptions are not propagated out of coroutines, > and are replaced with a ``RuntimeError``. For regular generators > such behavior requires a future import (see PEP 479). > > > types.coroutine() > ----------------- > > A new function ``coroutine(gen)`` is added to the ``types`` module. It > applies ``CO_COROUTINE`` flag to the passed generator-function's code > object, making it to return a *coroutine object* when called. > > This feature enables an easy upgrade path for existing libraries. > > > Await Expression > ---------------- > > The following new ``await`` expression is used to obtain a result of > coroutine execution:: > > async def read_data(db): > data = await db.fetch('SELECT ...') > ... > > ``await``, similarly to ``yield from``, suspends execution of > ``read_data`` coroutine until ``db.fetch`` *awaitable* completes and > returns the result data. > > It uses the ``yield from`` implementation with an extra step of > validating its argument. ``await`` only accepts an *awaitable*, which > can be one of: > > * A *coroutine object* returned from a *coroutine* or a generator > decorated with ``types.coroutine()``. > > * An object with an ``__await__`` method returning an iterator. > > Any ``yield from`` chain of calls ends with a ``yield``. This is a > fundamental mechanism of how *Futures* are implemented. Since, > internally, coroutines are a special kind of generators, every > ``await`` is suspended by a ``yield`` somewhere down the chain of > ``await`` calls (please refer to PEP 3156 for a detailed > explanation.) > > To enable this behavior for coroutines, a new magic method called > ``__await__`` is added. In asyncio, for instance, to enable Future > objects in ``await`` statements, the only change is to add > ``__await__ = __iter__`` line to ``asyncio.Future`` class. > > Objects with ``__await__`` method are called *Future-like* objects in > the rest of this PEP. > > Also, please note that ``__aiter__`` method (see its definition > below) cannot be used for this purpose. It is a different protocol, > and would be like using ``__iter__`` instead of ``__call__`` for > regular callables. > > It is a ``SyntaxError`` to use ``await`` outside of a coroutine. > > > Asynchronous Context Managers and "async with" > ---------------------------------------------- > > An *asynchronous context manager* is a context manager that is able to > suspend execution in its *enter* and *exit* methods. > > To make this possible, a new protocol for asynchronous context managers > is proposed. Two new magic methods are added: ``__aenter__`` and > ``__aexit__``. Both must return an *awaitable*. 
> > An example of an asynchronous context manager:: > > class AsyncContextManager: > async def __aenter__(self): > await log('entering context') > > async def __aexit__(self, exc_type, exc, tb): > await log('exiting context') > > > New Syntax > '''''''''' > > A new statement for asynchronous context managers is proposed:: > > async with EXPR as VAR: > BLOCK > > > which is semantically equivalent to:: > > mgr = (EXPR) > aexit = type(mgr).__aexit__ > aenter = type(mgr).__aenter__(mgr) > exc = True > > try: > try: > VAR = await aenter > BLOCK > except: > exc = False > exit_res = await aexit(mgr, *sys.exc_info()) > if not exit_res: > raise > > finally: > if exc: > await aexit(mgr, None, None, None) > > > As with regular ``with`` statements, it is possible to specify multiple > context managers in a single ``async with`` statement. > > It is an error to pass a regular context manager without ``__aenter__`` > and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` > to use ``async with`` outside of a coroutine. > > > Example > ''''''' > > With asynchronous context managers it is easy to implement proper > database transaction managers for coroutines:: > > async def commit(session, data): > ... > > async with session.transaction(): > ... > await session.update(data) > ... > > Code that needs locking also looks lighter:: > > async with lock: > ... > > instead of:: > > with (yield from lock): > ... > > > Asynchronous Iterators and "async for" > -------------------------------------- > > An *asynchronous iterable* is able to call asynchronous code in its > *iter* implementation, and *asynchronous iterator* can call > asynchronous code in its *next* method. To support asynchronous > iteration: > > 1. An object must implement an ``__aiter__`` method returning an > *awaitable* resulting in an *asynchronous iterator object*. > > 2. An *asynchronous iterator object* must implement an ``__anext__`` > method returning an *awaitable*. > > 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` > exception. > > An example of asynchronous iterable:: > > class AsyncIterable: > async def __aiter__(self): > return self > > async def __anext__(self): > data = await self.fetch_data() > if data: > return data > else: > raise StopAsyncIteration > > async def fetch_data(self): > ... > > > New Syntax > '''''''''' > > A new statement for iterating through asynchronous iterators is > proposed:: > > async for TARGET in ITER: > BLOCK > else: > BLOCK2 > > which is semantically equivalent to:: > > iter = (ITER) > iter = await type(iter).__aiter__(iter) > running = True > while running: > try: > TARGET = await type(iter).__anext__(iter) > except StopAsyncIteration: > running = False > else: > BLOCK > else: > BLOCK2 > > > It is an error to pass a regular iterable without ``__aiter__`` method > to ``async for``. It is a ``SyntaxError`` to use ``async for`` outside > of a coroutine. > > As for with regular ``for`` statement, ``async for`` has an optional > ``else`` clause. > > > Example 1 > ''''''''' > > With asynchronous iteration protocol it is possible to asynchronously > buffer data during iteration:: > > async for data in cursor: > ... > > Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows > of data from a database after every ``N`` iterations. > > The following code illustrates new asynchronous iteration protocol:: > > class Cursor: > def __init__(self): > self.buffer = collections.deque() > > def _prefetch(self): > ... 
> > async def __aiter__(self): > return self > > async def __anext__(self): > if not self.buffer: > self.buffer = await self._prefetch() > if not self.buffer: > raise StopAsyncIteration > return self.buffer.popleft() > > then the ``Cursor`` class can be used as follows:: > > async for row in Cursor(): > print(row) > > which would be equivalent to the following code:: > > i = await Cursor().__aiter__() > while True: > try: > row = await i.__anext__() > except StopAsyncIteration: > break > else: > print(row) > > > Example 2 > ''''''''' > > The following is a utility class that transforms a regular iterable to > an asynchronous one. While this is not a very useful thing to do, the > code illustrates the relationship between regular and asynchronous > iterators. > > :: > > class AsyncIteratorWrapper: > def __init__(self, obj): > self._it = iter(obj) > > async def __aiter__(self): > return self > > async def __anext__(self): > try: > value = next(self._it) > except StopIteration: > raise StopAsyncIteration > return value > > async for item in AsyncIteratorWrapper("abc"): > print(item) > > > Why StopAsyncIteration? > ''''''''''''''''''''''' > > Coroutines are still based on generators internally. So, before PEP > 479, there was no fundamental difference between > > :: > > def g1(): > yield from fut > return 'spam' > > and > > :: > > def g2(): > yield from fut > raise StopIteration('spam') > > And since PEP 479 is accepted and enabled by default for coroutines, > the following example will have its ``StopIteration`` wrapped into a > ``RuntimeError`` > > :: > > async def a1(): > await fut > raise StopIteration('spam') > > The only way to tell the outside code that the iteration has ended is > to raise something other than ``StopIteration``. Therefore, a new > built-in exception class ``StopAsyncIteration`` was added. > > Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions > raised in coroutines are wrapped in ``RuntimeError``. > > > Debugging Features > ------------------ > > One of the most frequent mistakes that people make when using > generators as coroutines is forgetting to use ``yield from``:: > > @asyncio.coroutine > def useful(): > asyncio.sleep(1) # this will do noting without 'yield from' > > For debugging this kind of mistakes there is a special debug mode in > asyncio, in which ``@coroutine`` decorator wraps all functions with a > special object with a destructor logging a warning. Whenever a wrapped > generator gets garbage collected, a detailed logging message is > generated with information about where exactly the decorator function > was defined, stack trace of where it was collected, etc. Wrapper > object also provides a convenient ``__repr__`` function with detailed > information about the generator. > > The only problem is how to enable these debug capabilities. Since > debug facilities should be a no-op in production mode, ``@coroutine`` > decorator makes the decision of whether to wrap or not to wrap based on > an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is > possible to run asyncio programs with asyncio's own functions > instrumented. ``EventLoop.set_debug``, a different debug facility, has > no impact on ``@coroutine`` decorator's behavior. > > With this proposal, coroutines is a native, distinct from generators, > concept. New methods ``set_coroutine_wrapper`` and > ``get_coroutine_wrapper`` are added to the ``sys`` module, with which > frameworks can provide advanced debugging facilities. 
> > It is also important to make coroutines as fast and efficient as > possible, therefore there are no debug features enabled by default. > > Example:: > > async def debug_me(): > await asyncio.sleep(1) > > def async_debug_wrap(generator): > return asyncio.AsyncDebugWrapper(generator) > > sys.set_coroutine_wrapper(async_debug_wrap) > > debug_me() # <- this line will likely GC the coroutine object and > # trigger AsyncDebugWrapper's code. > > assert isinstance(debug_me(), AsyncDebugWrapper) > > sys.set_coroutine_wrapper(None) # <- this unsets any > # previously set wrapper > assert not isinstance(debug_me(), AsyncDebugWrapper) > > If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper > replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)`` > unsets the wrapper. > > > Glossary > ======== > > :Coroutine: > A coroutine function, or just "coroutine", is declared with ``async > def``. It uses ``await`` and ``return value``; see `New Coroutine > Declaration Syntax`_ for details. > > :Coroutine object: > Returned from a coroutine function. See `Await Expression`_ for > details. > > :Future-like object: > An object with an ``__await__`` method. Can be consumed by an > ``await`` expression in a coroutine. A coroutine waiting for a > Future-like object is suspended until the Future-like object's > ``__await__`` completes, and returns the result. See `Await > Expression`_ for details. > > :Awaitable: > A *Future-like* object or a *coroutine object*. See `Await > Expression`_ for details. > > :Generator-based coroutine: > Coroutines based in generator syntax. Most common example is > ``@asyncio.coroutine``. > > :Asynchronous context manager: > An asynchronous context manager has ``__aenter__`` and ``__aexit__`` > methods and can be used with ``async with``. See `Asynchronous > Context Managers and "async with"`_ for details. > > :Asynchronous iterable: > An object with an ``__aiter__`` method, which must return an > *asynchronous iterator* object. Can be used with ``async for``. > See `Asynchronous Iterators and "async for"`_ for details. > > :Asynchronous iterator: > An asynchronous iterator has an ``__anext__`` method. See > `Asynchronous Iterators and "async for"`_ for details. > > > List of functions and methods > ============================= > > ================= =================================== ================= > Method Can contain Can't contain > ================= =================================== ================= > async def func await, return value yield, yield from > async def __a*__ await, return value yield, yield from > def __a*__ return awaitable await > def __await__ yield, yield from, return iterable await > generator yield, yield from, return value await > ================= =================================== ================= > > Where: > > * "async def func": coroutine; > > * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, > ``__aexit__`` defined with the ``async`` keyword; > > * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, > ``__aexit__`` defined without the ``async`` keyword, must return an > *awaitable*; > > * "def __await__": ``__await__`` method to implement *Future-like* > objects; > > * generator: a "regular" generator, function defined with ``def`` and > which contains a least one ``yield`` or ``yield from`` expression. 
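The ``def __await__`` row of the table above can be made concrete with a
minimal Future-like object (a sketch of the protocol only, not asyncio's
actual Future; it is driven by hand instead of an event loop):

    class SimpleFuture:
        def __init__(self):
            self._result = None
            self._done = False

        def set_result(self, value):
            self._result = value
            self._done = True

        def __await__(self):
            if not self._done:
                yield self          # suspend; a scheduler resumes us later
            return self._result     # becomes the value of 'await fut'

    fut = SimpleFuture()

    async def waiter():
        return await fut

    c = waiter()
    c.send(None)                    # runs until the future suspends it
    fut.set_result(42)
    try:
        c.send(None)                # resume after the result is set
    except StopIteration as stop:
        print(stop.value)           # -> 42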
>
>
> Transition Plan
> ===============
>
> To avoid backwards compatibility issues with the ``async`` and ``await``
> keywords, it was decided to modify ``tokenizer.c`` in such a way that it:
>
> * recognizes the ``async def`` name token combination (start of a
>   coroutine);
>
> * keeps track of regular functions and coroutines;
>
> * replaces the ``'async'`` token with ``ASYNC`` and the ``'await'`` token
>   with ``AWAIT`` when in the process of yielding tokens for coroutines.
>
> This approach allows for seamless combination of new syntax features
> (all of them available only in ``async`` functions) with any existing
> code.
>
> An example of having "async def" and an "async" attribute in one piece of
> code::
>
>     class Spam:
>         async = 42
>
>     async def ham():
>         print(getattr(Spam, 'async'))
>
>     # The coroutine can be executed and will print '42'
>
>
> Backwards Compatibility
> -----------------------
>
> This proposal preserves 100% backwards compatibility.
>
>
> Grammar Updates
> ---------------
>
> Grammar changes are also fairly minimal::
>
>     await_expr: AWAIT test
>     await_stmt: await_expr
>
>     decorated: decorators (classdef | funcdef | async_funcdef)
>     async_funcdef: ASYNC funcdef
>
>     async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
>
>     compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt |
>                     with_stmt | funcdef | classdef | decorated |
>                     async_stmt)
>
>     atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
>            '[' [testlist_comp] ']' |
>            '{' [dictorsetmaker] '}' |
>            NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')
>
>     expr_stmt: testlist_star_expr
>                (augassign (yield_expr|await_expr|testlist) |
>                ('=' (yield_expr|await_expr|testlist_star_expr))*)
>
>
> Transition Period Shortcomings
> ------------------------------
>
> There is just one.
>
> Until ``async`` and ``await`` become proper keywords, it is not
> possible (or at least very hard) to fix ``tokenizer.c`` to recognize
> them on the **same line** with the ``def`` keyword::
>
>     # async and await will always be parsed as variables
>
>     async def outer():                           # 1
>         def nested(a=(await fut)):
>             pass
>
>     async def foo(): return (await fut)          # 2
>
> Since ``await`` and ``async`` in such cases are parsed as ``NAME``
> tokens, a ``SyntaxError`` will be raised.
>
> To work around these issues, the above examples can easily be rewritten
> to a more readable form::
>
>     async def outer():                           # 1
>         a_default = await fut
>         def nested(a=a_default):
>             pass
>
>     async def foo():                             # 2
>         return (await fut)
>
> This limitation will go away as soon as ``async`` and ``await`` are
> proper keywords. Or if it's decided to use a future import for this
> PEP.
>
>
> Deprecation Plans
> -----------------
>
> ``async`` and ``await`` names will be softly deprecated in CPython 3.5
> and 3.6. In 3.7 we will transform them to proper keywords. Making
> ``async`` and ``await`` proper keywords before 3.7 might make it harder
> for people to port their code to Python 3.
>
>
> asyncio
> -------
>
> The ``asyncio`` module was adapted and tested to work with coroutines and
> the new statements. Backwards compatibility is 100% preserved.
>
> The required changes are mainly:
>
> 1. Modify the ``@asyncio.coroutine`` decorator to use the new
>    ``types.coroutine()`` function.
>
> 2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class.
>
> 3. Add ``ensure_task()`` as an alias for the ``async()`` function.
>    Deprecate the ``async()`` function.
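To see the upgrade path these three changes enable, here is a minimal sketch
(assuming the ``types.coroutine()`` semantics described earlier in the PEP;
the coroutine is driven by hand instead of an event loop):

    import types

    @types.coroutine
    def legacy_sleep():        # an existing generator-based coroutine
        yield                  # stands in for waiting on the event loop

    async def new_style():
        await legacy_sleep()   # accepted thanks to the decorator
        return 'done'

    c = new_style()
    c.send(None)               # advances to the yield in legacy_sleep()
    try:
        c.send(None)           # resume; the coroutine then finishes
    except StopIteration as stop:
        print(stop.value)      # -> 'done'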
>
>
> Design Considerations
> =====================
>
> PEP 3152
> --------
>
> PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
> (called "cofunctions"). Some key points:
>
> 1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is
>    always a generator, even if there are no ``cocall`` expressions
>    inside it. Maps to ``async def`` in this proposal.
>
> 2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
>    inside a *cofunction*. Maps to ``await`` in this proposal (with
>    some differences, see below.)
>
> 3. It is not possible to call a *cofunction* without a ``cocall``
>    keyword.
>
> 4. ``cocall`` grammatically requires parentheses after it::
>
>     atom: cocall |
>     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
>     cotrailer: '[' subscriptlist ']' | '.' NAME
>
> 5. ``cocall f(*args, **kwds)`` is semantically equivalent to
>    ``yield from f.__cocall__(*args, **kwds)``.
>
> Differences from this proposal:
>
> 1. There is no equivalent of ``__cocall__`` in this PEP, which is
>    called and its result is passed to ``yield from`` in the ``cocall``
>    expression. The ``await`` keyword expects an *awaitable* object,
>    validates the type, and executes ``yield from`` on it. The
>    ``__await__`` method is similar to ``__cocall__``, but it is only
>    used to define *Future-like* objects.
>
> 2. ``await`` is defined in almost the same way as ``yield from`` in the
>    grammar (it is later enforced that ``await`` can only be inside
>    ``async def``). It is possible to simply write ``await future``,
>    whereas ``cocall`` always requires parentheses.
>
> 3. To make asyncio work with PEP 3152 it would be required to modify
>    the ``@asyncio.coroutine`` decorator to wrap all functions in an
>    object with a ``__cocall__`` method. To call *cofunctions* from
>    existing generator-based coroutines it would be required to use the
>    ``costart`` built-in. In this proposal ``@asyncio.coroutine`` simply
>    sets ``CO_COROUTINE`` on the wrapped function's code object and
>    everything works automatically.
>
> 4. Since it is impossible to call a *cofunction* without a ``cocall``
>    keyword, it automatically prevents the common mistake of forgetting
>    to use ``yield from`` on generator-based coroutines. This proposal
>    addresses this problem with a different approach, see `Debugging
>    Features`_.
>
> 5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
>    is that if it is decided to implement coroutine-generators --
>    coroutines with ``yield`` or ``async yield`` expressions -- we
>    wouldn't need a ``cocall`` keyword to call them. So we'll end up
>    having ``__cocall__`` and no ``__call__`` for regular coroutines,
>    and having ``__call__`` and no ``__cocall__`` for coroutine-
>    generators.
>
> 6. There are no equivalents of ``async for`` and ``async with`` in PEP
>    3152.
>
>
> Coroutine-generators
> --------------------
>
> With the ``async for`` keyword it is desirable to have a concept of a
> *coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
> expressions. To avoid any ambiguity with regular generators, we would
> likely require an ``async`` keyword before ``yield``, and
> ``async yield from`` would raise a ``StopAsyncIteration`` exception.
>
> While it is possible to implement coroutine-generators, we believe that
> they are out of the scope of this proposal. It is an advanced concept
> that should be carefully considered and balanced, with non-trivial
> changes in the implementation of current generator objects. This is a
> matter for a separate PEP.
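To ground item 5 of the comparison above, a rough sketch (an assumption built
only from the quoted expansion rule, not actual machinery from either PEP) of
an object usable with ``cocall``:

    class CoFunction:
        # Per item 5: 'cocall f(x)' expands to 'yield from f.__cocall__(x)',
        # so __cocall__ only needs to return something 'yield from' accepts.
        def __init__(self, genfunc):
            self._genfunc = genfunc

        def __cocall__(self, *args, **kwds):
            return self._genfunc(*args, **kwds)   # a generator-iterator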
> > > No implicit wrapping in Futures > ------------------------------- > > There is a proposal to add similar mechanism to ECMAScript 7 [2]_. A > key difference is that JavaScript "async functions" always return a > Promise. While this approach has some advantages, it also implies that > a new Promise object is created on each "async function" invocation. > > We could implement a similar functionality in Python, by wrapping all > coroutines in a Future object, but this has the following > disadvantages: > > 1. Performance. A new Future object would be instantiated on each > coroutine call. Moreover, this makes implementation of ``await`` > expressions slower (disabling optimizations of ``yield from``). > > 2. A new built-in ``Future`` object would need to be added. > > 3. Coming up with a generic ``Future`` interface that is usable for any > use case in any framework is a very hard to solve problem. > > 4. It is not a feature that is used frequently, when most of the code > is coroutines. > > > Why "async" and "await" keywords > -------------------------------- > > async/await is not a new concept in programming languages: > > * C# has it since long time ago [5]_; > > * proposal to add async/await in ECMAScript 7 [2]_; > see also Traceur project [9]_; > > * Facebook's Hack/HHVM [6]_; > > * Google's Dart language [7]_; > > * Scala [8]_; > > * proposal to add async/await to C++ [10]_; > > * and many other less popular languages. > > This is a huge benefit, as some users already have experience with > async/await, and because it makes working with many languages in one > project easier (Python with ECMAScript 7 for instance). > > > Why "__aiter__" is a coroutine > ------------------------------ > > In principle, ``__aiter__`` could be a regular function. There are > several good reasons to make it a coroutine: > > * as most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` > methods are coroutines, users would often make a mistake defining it > as ``async`` anyways; > > * there might be a need to run some asynchronous operations in > ``__aiter__``, for instance to prepare DB queries or do some file > operation. > > > Importance of "async" keyword > ----------------------------- > > While it is possible to just implement ``await`` expression and treat > all functions with at least one ``await`` as coroutines, this approach > makes APIs design, code refactoring and its long time support harder. > > Let's pretend that Python only has ``await`` keyword:: > > def useful(): > ... > await log(...) > ... > > def important(): > await useful() > > If ``useful()`` function is refactored and someone removes all > ``await`` expressions from it, it would become a regular python > function, and all code that depends on it, including ``important()`` > would be broken. To mitigate this issue a decorator similar to > ``@asyncio.coroutine`` has to be introduced. > > > Why "async def" > --------------- > > For some people bare ``async name(): pass`` syntax might look more > appealing than ``async def name(): pass``. It is certainly easier to > type. But on the other hand, it breaks the symmetry between ``async > def``, ``async with`` and ``async for``, where ``async`` is a modifier, > stating that the statement is asynchronous. It is also more consistent > with the existing grammar. > > > Why not a __future__ import > --------------------------- > > ``__future__`` imports are inconvenient and easy to forget to add. > Also, they are enabled for the whole source file. 
Consider that there > is a big project with a popular module named "async.py". With future > imports it is required to either import it using ``__import__()`` or > ``importlib.import_module()`` calls, or to rename the module. The > proposed approach makes it possible to continue using old code and > modules without a hassle, while coming up with a migration plan for > future python versions. > > > Why magic methods start with "a" > -------------------------------- > > New asynchronous magic methods ``__aiter__``, ``__anext__``, > ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". > An alternative proposal is to use "async" prefix, so that ``__aiter__`` > becomes ``__async_iter__``. However, to align new magic methods with > the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided > to use a shorter version. > > > Why not reuse existing magic names > ---------------------------------- > > An alternative idea about new asynchronous iterators and context > managers was to reuse existing magic methods, by adding an ``async`` > keyword to their declarations:: > > class CM: > async def __enter__(self): # instead of __aenter__ > ... > > This approach has the following downsides: > > * it would not be possible to create an object that works in both > ``with`` and ``async with`` statements; > > * it would look confusing and would require some implicit magic behind > the scenes in the interpreter; > > * one of the main points of this proposal is to make coroutines as > simple and foolproof as possible. > > > Comprehensions > -------------- > > For the sake of restricting the broadness of this PEP there is no new > syntax for asynchronous comprehensions. This should be considered in a > separate PEP, if there is a strong demand for this feature. > > > Async lambdas > ------------- > > Lambda coroutines are not part of this proposal. In this proposal they > would look like ``async lambda(parameters): expression``. Unless there > is a strong demand to have them as part of this proposal, it is > recommended to consider them later in a separate PEP. > > > Performance > =========== > > Overall Impact > -------------- > > This proposal introduces no observable performance impact. Here is an > output of python's official set of benchmarks [4]_: > > :: > > python perf.py -r -b default ../cpython/python.exe > ../cpython-aw/python.exe > > [skipped] > > Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: > Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 > x86_64 i386 > > Total CPU cores: 8 > > ### etree_iterparse ### > Min: 0.365359 -> 0.349168: 1.05x faster > Avg: 0.396924 -> 0.379735: 1.05x faster > Significant (t=9.71) > Stddev: 0.01225 -> 0.01277: 1.0423x larger > > The following not significant results are hidden, use -v to show them: > django_v2, 2to3, etree_generate, etree_parse, etree_process, > fastpickle, > fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http. > > > Tokenizer modifications > ----------------------- > > There is no observable slowdown of parsing python files with the > modified tokenizer: parsing of one 12Mb file > (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount > of time. 
> > > async/await > ----------- > > The following micro-benchmark was used to determine performance > difference between "async" functions and generators:: > > import sys > import time > > def binary(n): > if n <= 0: > return 1 > l = yield from binary(n - 1) > r = yield from binary(n - 1) > return l + 1 + r > > async def abinary(n): > if n <= 0: > return 1 > l = await abinary(n - 1) > r = await abinary(n - 1) > return l + 1 + r > > def timeit(gen, depth, repeat): > t0 = time.time() > for _ in range(repeat): > list(gen(depth)) > t1 = time.time() > print('{}({}) * {}: total {:.3f}s'.format( > gen.__name__, depth, repeat, t1-t0)) > > The result is that there is no observable performance difference. > Minimum timing of 3 runs > > :: > > abinary(19) * 30: total 12.985s > binary(19) * 30: total 12.953s > > Note that depth of 19 means 1,048,575 calls. > > > Reference Implementation > ======================== > > The reference implementation can be found here: [3]_. > > List of high-level changes and new protocols > -------------------------------------------- > > 1. New syntax for defining coroutines: ``async def`` and new ``await`` > keyword. > > 2. New ``__await__`` method for Future-like objects. > > 3. New syntax for asynchronous context managers: ``async with``. And > associated protocol with ``__aenter__`` and ``__aexit__`` methods. > > 4. New syntax for asynchronous iteration: ``async for``. And > associated protocol with ``__aiter__``, ``__aexit__`` and new built- > in exception ``StopAsyncIteration``. > > 5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, > ``Await``. > > 6. New functions: ``sys.set_coroutine_wrapper(callback)``, > ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``. > > 7. New ``CO_COROUTINE`` bit flag for code objects. > > While the list of changes and new things is not short, it is important > to understand, that most users will not use these features directly. > It is intended to be used in frameworks and libraries to provide users > with convenient to use and unambiguous APIs with ``async def``, > ``await``, ``async for`` and ``async with`` syntax. > > > Working example > --------------- > > All concepts proposed in this PEP are implemented [3]_ and can be > tested. > > :: > > import asyncio > > async def echo_server(): > print('Serving on localhost:8000') > await asyncio.start_server(handle_connection, > 'localhost', 8000) > > async def handle_connection(reader, writer): > print('New connection...') > > while True: > data = await reader.read(8192) > > if not data: > break > > print('Sending {:.10}... back'.format(repr(data))) > writer.write(data) > > loop = asyncio.get_event_loop() > loop.run_until_complete(echo_server()) > try: > loop.run_forever() > finally: > loop.close() > > > References > ========== > > .. [1] > https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine > > .. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions > > .. [3] https://github.com/1st1/cpython/tree/await > > .. [4] https://hg.python.org/benchmarks > > .. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx > > .. [6] http://docs.hhvm.com/manual/en/hack.async.php > > .. [7] https://www.dartlang.org/articles/await-async/ > > .. [8] http://docs.scala-lang.org/sips/pending/async.html > > .. [9] > https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental > > .. 
[10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf
> (PDF)
>
>
> Acknowledgments
> ===============
>
> I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
> Svetlov, and Łukasz Langa for their initial feedback.
>
>
> Copyright
> =========
>
> This document has been placed in the public domain.
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/tds333%2Bpydev%40gmail.com

--
bye by Wolfgang
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From greg.ewing at canterbury.ac.nz Thu Apr 23 09:36:23 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2015 19:36:23 +1200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com>
Message-ID: <5538A0F7.5080500@canterbury.ac.nz>

Victor Stinner wrote:

> A huge part of the asyncio module is based on "yield from fut" where fut is
> a Future object.
>
> How do you write this using PEP 3152? Do you need to call an artificial
> method like "cocall fut.return_self()" where the return_self() method simply
> returns fut?

In a PEP 3152 world, Future objects and the like would be expected to
implement __cocall__, just as in a PEP 492 world they would be expected
to implement __await__.

> @asyncio.coroutine currently calls a function and *then* checks if it
> should yield from it or not:
>
> res = func(*args, **kw)
> if isinstance(res, futures.Future) or inspect.isgenerator(res):
>     res = yield from res

To accommodate the possibility of func being a cofunction, you would
need to add something like

    if is_cofunction(func):
        res = yield from costart(func, *args, **kw)
    else:
        # as above

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 23 09:55:50 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2015 19:55:50 +1200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz>
Message-ID: <5538A586.7090407@canterbury.ac.nz>

Victor Stinner wrote:

> Using a custom name like "cofunction" may confuse users coming from other
> programming languages. I prefer to keep "coroutine", but I agree that we
> should make some effort to define the different categories of "Python
> coroutines".

I should perhaps point out that "cofunction" is not just an arbitrary
word I made up to replace "coroutine". It is literally a kind of
function, and is meant to be thought of that way.

As for confusing new users, I would think that, as an unfamiliar word,
it would point out that there is something they need to look up and
learn about. Whereas they may think they already know what a "coroutine"
is and not bother to look further.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 23 10:02:57 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2015 20:02:57 +1200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz>
Message-ID: <5538A731.5010408@canterbury.ac.nz>

Ludovic Gasc wrote:

> Not related, but one of my coworkers asked me if with the new syntax it
> will be possible to write an async decorator for coroutines.

This is certainly possible with PEP 3152.
The decorator just needs to be an ordinary function whose return value is
a cofunction.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 23 10:39:51 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2015 20:39:51 +1200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <20150423021236.2b2039ca@x230>
References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <20150423021236.2b2039ca@x230>
Message-ID: <5538AFD7.5010709@canterbury.ac.nz>

Paul Sokolovsky wrote:

> And having both asymmetric and symmetric
> would quite confusing, especially that symmetric are more powerful and
> asymmetric can be easily implemented in terms of symmetric using
> continuation-passing style.

You can also use a trampoline of some kind to relay values back and
forth between generators, to get something symmetric.

--
Greg

From tds333+pydev at gmail.com Thu Apr 23 10:43:52 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Thu, 23 Apr 2015 10:43:52 +0200
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References:
Message-ID:

Hi,

having a lot of experience with Python beginners, and with people who
program both Java and Python, I also have an opinion about this. ;-)

My first reaction was: oh, good. Then I read every thread and comment
about it, looked at a lot of internal code, gave it all some time, and
the result is: I found a lot of code written like Java, with assert and
isinstance used for type checks. Not very Pythonic, but someone coming
from Java thinks it is good, because Python has no type checks. A lot of
people think types help the IDE and tools do automatic checks and find
some errors earlier.

My conclusion:

1. typing.py is good as a standard for how to specify type hints. It can
also be used by tools and IDE's.

2. Using it in the language as part of the function signature: my first
thought was "oh, good", then I changed my mind to "oh, it can be very
ugly and unreadable; it is the wrong place". Now I am against it. If I
have to specify type signatures, it is best to do it in one place and
keep it up to date. Most of the time this is the documentation. Why not
use the docstring with a standard type specifier for this? Suggested
here: http://pydev.blogspot.de/2015/04/type-hinting-on-python.html

3. Stub files are good if someone needs this feature. They are completely
separate, and if someone doesn't want to write them it is no problem. But
it is good to have a standard way to name them and a place to find them.

IDE's parse the docstring already; it is optional to have one, and every
editor has folding support to hide it if it is too much information.
Checkers like mypy can use the type info from there to do checks, if
there is a standard way to specify types in docstrings. It is also
possible to specify a type hint for a variable this way, which is good
for documentation and can also be used for type checks. The same syntax
can be used in stub files, so there is no separate notation to learn.
Also, if someone wants it and a module has no documentation, the types
can be added from there.

For nearly every function I have written there is a docstring, and most
of the time a type is specified too. But if I must provide all this in a
second place, it is not the right way to go. Over time, one of the places
usually misses some changes and becomes wrong.

I am convinced something like:

def myfunction(a, b):
    """
    My special function.

    :param a: ...
    :type a: int
    :param b: second value
    :type b: int
    :rtype: None
    """

or the shorter form (also possible in Sphinx, if only one type per
parameter is specified):

def myfunction(a, b):
    """
    My special function.

    :param int a: ...
    :param int b: second value
    :rtype: None
    """

is easier to maintain, good to read, does not change how Python functions
are written, and is useful for IDE's and checkers. And best of all, most
editors and IDE's support folding of docstrings, so anyone who doesn't
want to see them can hide them today.

The docstring is a good place for type specification because, as stated,
Python will never do type checks on execution, so there is no need to
make this part of the language syntax. All of it can also be applied over
time to the standard library: if it makes sense to specify a type, it is
good to have it in the documentation. And Sphinx is the standard
documentation tool for Python now.

Also ask why almost no one has used type specifiers, even though they
have been possible since Python 3.0: because it is the wrong way for
Python.

Regards,

Wolfgang
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From solipsis at pitrou.net Thu Apr 23 10:54:43 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 23 Apr 2015 10:54:43 +0200
Subject: [Python-Dev] async/await in Python; v2
References: <55368858.4010007@gmail.com>
Message-ID: <20150423105443.7baa091f@fsol>

Hi,

I agree with most of Wolfgang's points below. As a data point, I haven't
used asyncio for anything real (despite having approved the PEP!), but I
have some extensive prior experience with Twisted and Tornado :-)

Regards

Antoine.

On Thu, 23 Apr 2015 09:30:30 +0200 Wolfgang Langner wrote:
> Hi,
>
> most of the time I am a silent reader but in this discussion I must step in.
> I use twisted and async stuff a lot over years followed development of
> asyncio closely.
>
> First it is good to do differentiate async coroutines from generators. So
> everyone can see it and have this in mind
> and don't mix up booth. It is also easier to explain for new users.
> Sometimes generators blows their mind and it takes
> a while to get used to them. Async stuff is even harder.
>
> 1. I am fine with using something special instead of "yield" or "yield
> from" for this. C# "await" is ok.
>
> Everything else suggested complicates the language and makes it harder to
> read.
>
> 2.
> async def f(): is harder to read and something special also it breaks the
> symmetry in front (def indent).
> Also every existing tooling must be changed to support it. Same for def
> async, def f() async:
> I thing a decorator is enough here
> @coroutine
> def f():
> is the best solution to mark something as a coroutine.
>
>
> 3.
> async with and async for
> Bead idea, we clutter the language even more and it is one more thing every
> newbie could do wrong.
> for x in y:
>     result = await f()
> is enough, every 'async' framework lived without it over years.
> Same for with statement.
>
> The main use case suggested was for database stuff and this is also where
> most are best with
> defer something to a thread and keep it none async.
>
>
> All together it is very late in the development cycle for 3.5 to
> incorporate such a big change.
> Best is to give all this some more time and defer it to 3.6 and some alpha
> releases to experiment with.
> > Regards, > > Wolfgang From solipsis at pitrou.net Thu Apr 23 10:59:23 2015 From: solipsis at pitrou.net (Antoine Pitrou) Date: Thu, 23 Apr 2015 10:59:23 +0200 Subject: [Python-Dev] Questionable TCP server example References: Message-ID: <20150423105923.685d253a@fsol> On Thu, 23 Apr 2015 09:21:11 +0200 Andrea Griffini wrote: > It's not the first time someone is confused by the server example of > > https://docs.python.org/3/library/socketserver.html > > where the receiving side is not making a loop over recv. This is a trivial example indeed. If you think something more realistic yet simple is desirable, we welcome contributions :-) You can take a look at https://docs.python.org/devguide/ to get started. > Moreover the documentation contains a misleading description of what really > happens: > > "The difference is that the readline() call in the second handler will call > recv() multiple times until it encounters a newline character, while the > single recv() call in the first handler will just return what has been sent > from the client in one sendall() call." > > Unless I'm missing something there's no way to know client side when all > data sent by "sendall" has been received (TCP stream protocol doesn't have > message boundaries) and the `recv` based code doesn't handle neither > fragmentation nor clients that send more than 1024 bytes. Indeed, the quoted paragraph is wrong. > Am I missing something or that is indeed an example of how NOT to write a > socket-based server? The problem is coming up with an example that reflects better practices while being simple enough :-) Regards Antoine. From greg.ewing at canterbury.ac.nz Thu Apr 23 11:16:55 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 21:16:55 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538431A.1020709@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> Message-ID: <5538B887.10804@canterbury.ac.nz> Yury Selivanov wrote: > So how would we do "await fut" if await requires parentheses? I've answered this with respect to PEP 3152 -- futures would implement __cocall__, so you would write 'cocall fut()'. I'm not sure what to say about PEP 492 here, because it depends on exactly what a version of await that "requires parentheses" would mean. It's not clear to me what you have in mind for that from what you've said so far. I'm not really in favour of just tweaking the existing PEP 492 notion of await so that it superficially resembles a PEP 3152 cocall. That misses the point, which is that a cocall is a special kind of function call, not a special kind of yield-from. -- Greg From andrew.svetlov at gmail.com Thu Apr 23 11:18:51 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Thu, 23 Apr 2015 12:18:51 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: On Thu, Apr 23, 2015 at 10:30 AM, Wolfgang Langner wrote: > Hi, > > most of the time I am a silent reader but in this discussion I must step in. > I use twisted and async stuff a lot over years followed development of > asyncio closely. > > First it is good to do differentiate async coroutines from generators. So > everyone can see it and have this in mind > and don't mix up booth. It is also easier to explain for new users. > Sometimes generators blows their mind and it takes > a while to get used to them. 
Async stuff is even harder.
>
> 1. I am fine with using something special instead of "yield" or "yield from"
> for this. C# "await" is ok.
>
> Everything else suggested complicates the language and makes it harder to
> read.
>
> 2.
> async def f(): is harder to read and something special also it breaks the
> symmetry in front (def indent).
> Also every existing tooling must be changed to support it. Same for def
> async, def f() async:
> I thing a decorator is enough here
> @coroutine
> def f():
> is the best solution to mark something as a coroutine.
>

Sorry, the `@coroutine` decorator is part of Python semantics, not Python
syntax. That means the Python compiler cannot rely on @coroutine during
parsing. `await`, `async for` and `async with` are available only inside
`async def`, not inside a regular `def`.

> 3.
> async with and async for
> Bead idea, we clutter the language even more and it is one more thing every
> newbie could do wrong.
> for x in y:
>     result = await f()
> is enough, every 'async' framework lived without it over years.

async for i in iterable:
    pass

is not equal to

for fut in iterable:
    i = yield from fut

`async for` is also suitable when the size of the iterated sequence is
unknown at iteration start. Look, for example, at the Redis SCAN command
(http://redis.io/commands/scan) for iterating over redis keys. It returns
a bulk of keys plus an opaque cursor value for the next SCAN call. In
current Python it must be written as

cur = 0
while True:
    bulk, cur = yield from redis.scan(cur)
    for key in bulk:
        process_key(key)
    if cur == 0:
        break

With the new syntax the iteration looks much more native:

async for key in redis.scan():
    process_key(key)

> Same for with statement.
>
> The main use case suggested was for database stuff and this is also where
> most are best with
> defer something to a thread and keep it none async.
>

`async with` is not only for transactions in relational databases, just as
plain `with` is used not only for databases but for many other things --
from files to decimal contexts.

As a realistic example not related to databases, consider RabbitMQ. The
processing of every message may be acknowledged. The native API for that
may be

while True:
    async with channel.consume("my queue") as msg:
        process_msg(msg)

with the acknowledgement sent only on successful message processing.
Acknowledgement requires network communication and must be an asynchronous
operation.

>
> All together it is very late in the development cycle for 3.5 to incorporate
> such a big change.
> Best is to give all this some more time and defer it to 3.6 and some alpha
> releases to experiment with.
>
> Regards,
>
> Wolfgang
>
>
>
> On Tue, Apr 21, 2015 at 7:26 PM, Yury Selivanov
> wrote:
>>
>> Hi python-dev,
>>
>> I'm moving the discussion from python-ideas to here.
>>
>> The updated version of the PEP should be available shortly
>> at https://www.python.org/dev/peps/pep-0492
>> and is also pasted in this email.
>>
>> Updates:
>>
>> 1. CO_ASYNC flag was renamed to CO_COROUTINE;
>>
>> 2. sys.set_async_wrapper() was renamed to
>> sys.set_coroutine_wrapper();
>>
>> 3. New function: sys.get_coroutine_wrapper();
>>
>> 4. types.async_def() renamed to types.coroutine();
>>
>> 5. New section highlighting differences from
>> PEP 3152.
>>
>> 6. New AST node - AsyncFunctionDef; the proposal
>> now is 100% backwards compatible;
>>
>> 7. A new section clarifying that coroutine-generators
>> are not part of the current proposal;
>>
>> 8. Various small edits/typos fixes.
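Returning to the Redis example above, the full async-iterator protocol from
the PEP maps onto SCAN-style pagination roughly like this (a sketch;
`redis.scan` is assumed to be a coroutine returning `(bulk, next_cursor)`,
as in Andrew's snippet):

    class ScanKeys:
        def __init__(self, redis):
            self._redis = redis
            self._cursor = 0
            self._bulk = []
            self._exhausted = False

        async def __aiter__(self):
            return self

        async def __anext__(self):
            while not self._bulk:
                if self._exhausted:
                    raise StopAsyncIteration
                self._bulk, self._cursor = await self._redis.scan(self._cursor)
                if self._cursor == 0:
                    self._exhausted = True   # last page fetched
            return self._bulk.pop(0)

    # usage: async for key in ScanKeys(redis): process_key(key)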
>> >> >> There's is a bug tracker issue to track code review >> of the reference implementation (Victor Stinner is >> doing the review): http://bugs.python.org/issue24017 >> While the PEP isn't accepted, we want to make sure >> that the reference implementation is ready when such >> a decision will be made. >> >> >> Let's discuss some open questions: >> >> 1. Victor raised a question if we should locate >> coroutine() function from 'types' module to >> 'functools'. >> >> My opinion is that 'types' module is a better >> place for 'corotine()', since it adjusts the >> type of the passed generator. 'functools' is >> about 'partials', 'lru_cache' and 'wraps' kind >> of things. >> >> 2. I propose to disallow using of 'for..in' loops, >> and builtins like 'list()', 'iter()', 'next()', >> 'tuple()' etc on coroutines. >> >> It's possible by modifying PyObject_GetIter to >> raise an exception if it receives a coroutine-object. >> >> 'yield from' can also be modified to only accept >> coroutine objects if it is called from a generator >> with CO_COROUTINE flag. >> >> This will further separate coroutines from >> generators, making it harder to screw something >> up by an accident. >> >> I have a branch of reference implementation >> https://github.com/1st1/cpython/tree/await_noiter >> where this is implemented. I did not observe >> any performance drop. >> >> There is just one possible backwards compatibility >> issue here: there will be an exception if some user >> of asyncio actually used to iterate over generators >> decorated with @coroutine. But I can't imagine >> why would someone do that, and even if they did -- >> it's probably a bug or wrong usage of asyncio. >> >> >> That's it! I'd be happy to hear some feedback! >> >> Thanks, >> Yury >> >> >> >> PEP: 492 >> Title: Coroutines with async and await syntax >> Version: $Revision$ >> Last-Modified: $Date$ >> Author: Yury Selivanov >> Status: Draft >> Type: Standards Track >> Content-Type: text/x-rst >> Created: 09-Apr-2015 >> Python-Version: 3.5 >> Post-History: 17-Apr-2015, 21-Apr-2015 >> >> >> Abstract >> ======== >> >> This PEP introduces new syntax for coroutines, asynchronous ``with`` >> statements and ``for`` loops. The main motivation behind this proposal >> is to streamline writing and maintaining asynchronous code, as well as >> to simplify previously hard to implement code patterns. >> >> >> Rationale and Goals >> =================== >> >> Current Python supports implementing coroutines via generators (PEP >> 342), further enhanced by the ``yield from`` syntax introduced in PEP >> 380. This approach has a number of shortcomings: >> >> * it is easy to confuse coroutines with regular generators, since they >> share the same syntax; async libraries often attempt to alleviate >> this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); >> >> * it is not possible to natively define a coroutine which has no >> ``yield`` or ``yield from`` statements, again requiring the use of >> decorators to fix potential refactoring issues; >> >> * support for asynchronous calls is limited to expressions where >> ``yield`` is allowed syntactically, limiting the usefulness of >> syntactic features, such as ``with`` and ``for`` statements. >> >> This proposal makes coroutines a native Python language feature, and >> clearly separates them from generators. This removes >> generator/coroutine ambiguity, and makes it possible to reliably define >> coroutines without reliance on a specific library. 
This also enables >> linters and IDEs to improve static code analysis and refactoring. >> >> Native coroutines and the associated new syntax features make it >> possible to define context manager and iteration protocols in >> asynchronous terms. As shown later in this proposal, the new ``async >> with`` statement lets Python programs perform asynchronous calls when >> entering and exiting a runtime context, and the new ``async for`` >> statement makes it possible to perform asynchronous calls in iterators. >> >> >> Specification >> ============= >> >> This proposal introduces new syntax and semantics to enhance coroutine >> support in Python, it does not change the internal implementation of >> coroutines, which are still based on generators. >> >> It is strongly suggested that the reader understands how coroutines are >> implemented in Python (PEP 342 and PEP 380). It is also recommended to >> read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). >> >> From this point in this document we use the word *coroutine* to refer >> to functions declared using the new syntax. *generator-based >> coroutine* is used where necessary to refer to coroutines that are >> based on generator syntax. >> >> >> New Coroutine Declaration Syntax >> -------------------------------- >> >> The following new syntax is used to declare a coroutine:: >> >> async def read_data(db): >> pass >> >> Key properties of coroutines: >> >> * ``async def`` functions are always coroutines, even if they do not >> contain ``await`` expressions. >> >> * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` >> expressions in an ``async`` function. >> >> * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced >> to enable runtime detection of coroutines (and migrating existing >> code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` >> flags set. >> >> * Regular generators, when called, return a *generator object*; >> similarly, coroutines return a *coroutine object*. >> >> * ``StopIteration`` exceptions are not propagated out of coroutines, >> and are replaced with a ``RuntimeError``. For regular generators >> such behavior requires a future import (see PEP 479). >> >> >> types.coroutine() >> ----------------- >> >> A new function ``coroutine(gen)`` is added to the ``types`` module. It >> applies ``CO_COROUTINE`` flag to the passed generator-function's code >> object, making it to return a *coroutine object* when called. >> >> This feature enables an easy upgrade path for existing libraries. >> >> >> Await Expression >> ---------------- >> >> The following new ``await`` expression is used to obtain a result of >> coroutine execution:: >> >> async def read_data(db): >> data = await db.fetch('SELECT ...') >> ... >> >> ``await``, similarly to ``yield from``, suspends execution of >> ``read_data`` coroutine until ``db.fetch`` *awaitable* completes and >> returns the result data. >> >> It uses the ``yield from`` implementation with an extra step of >> validating its argument. ``await`` only accepts an *awaitable*, which >> can be one of: >> >> * A *coroutine object* returned from a *coroutine* or a generator >> decorated with ``types.coroutine()``. >> >> * An object with an ``__await__`` method returning an iterator. >> >> Any ``yield from`` chain of calls ends with a ``yield``. This is a >> fundamental mechanism of how *Futures* are implemented. 
Since, >> internally, coroutines are a special kind of generators, every >> ``await`` is suspended by a ``yield`` somewhere down the chain of >> ``await`` calls (please refer to PEP 3156 for a detailed >> explanation.) >> >> To enable this behavior for coroutines, a new magic method called >> ``__await__`` is added. In asyncio, for instance, to enable Future >> objects in ``await`` statements, the only change is to add >> ``__await__ = __iter__`` line to ``asyncio.Future`` class. >> >> Objects with ``__await__`` method are called *Future-like* objects in >> the rest of this PEP. >> >> Also, please note that ``__aiter__`` method (see its definition >> below) cannot be used for this purpose. It is a different protocol, >> and would be like using ``__iter__`` instead of ``__call__`` for >> regular callables. >> >> It is a ``SyntaxError`` to use ``await`` outside of a coroutine. >> >> >> Asynchronous Context Managers and "async with" >> ---------------------------------------------- >> >> An *asynchronous context manager* is a context manager that is able to >> suspend execution in its *enter* and *exit* methods. >> >> To make this possible, a new protocol for asynchronous context managers >> is proposed. Two new magic methods are added: ``__aenter__`` and >> ``__aexit__``. Both must return an *awaitable*. >> >> An example of an asynchronous context manager:: >> >> class AsyncContextManager: >> async def __aenter__(self): >> await log('entering context') >> >> async def __aexit__(self, exc_type, exc, tb): >> await log('exiting context') >> >> >> New Syntax >> '''''''''' >> >> A new statement for asynchronous context managers is proposed:: >> >> async with EXPR as VAR: >> BLOCK >> >> >> which is semantically equivalent to:: >> >> mgr = (EXPR) >> aexit = type(mgr).__aexit__ >> aenter = type(mgr).__aenter__(mgr) >> exc = True >> >> try: >> try: >> VAR = await aenter >> BLOCK >> except: >> exc = False >> exit_res = await aexit(mgr, *sys.exc_info()) >> if not exit_res: >> raise >> >> finally: >> if exc: >> await aexit(mgr, None, None, None) >> >> >> As with regular ``with`` statements, it is possible to specify multiple >> context managers in a single ``async with`` statement. >> >> It is an error to pass a regular context manager without ``__aenter__`` >> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` >> to use ``async with`` outside of a coroutine. >> >> >> Example >> ''''''' >> >> With asynchronous context managers it is easy to implement proper >> database transaction managers for coroutines:: >> >> async def commit(session, data): >> ... >> >> async with session.transaction(): >> ... >> await session.update(data) >> ... >> >> Code that needs locking also looks lighter:: >> >> async with lock: >> ... >> >> instead of:: >> >> with (yield from lock): >> ... >> >> >> Asynchronous Iterators and "async for" >> -------------------------------------- >> >> An *asynchronous iterable* is able to call asynchronous code in its >> *iter* implementation, and *asynchronous iterator* can call >> asynchronous code in its *next* method. To support asynchronous >> iteration: >> >> 1. An object must implement an ``__aiter__`` method returning an >> *awaitable* resulting in an *asynchronous iterator object*. >> >> 2. An *asynchronous iterator object* must implement an ``__anext__`` >> method returning an *awaitable*. >> >> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` >> exception. 
>> >> An example of asynchronous iterable:: >> >> class AsyncIterable: >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> data = await self.fetch_data() >> if data: >> return data >> else: >> raise StopAsyncIteration >> >> async def fetch_data(self): >> ... >> >> >> New Syntax >> '''''''''' >> >> A new statement for iterating through asynchronous iterators is >> proposed:: >> >> async for TARGET in ITER: >> BLOCK >> else: >> BLOCK2 >> >> which is semantically equivalent to:: >> >> iter = (ITER) >> iter = await type(iter).__aiter__(iter) >> running = True >> while running: >> try: >> TARGET = await type(iter).__anext__(iter) >> except StopAsyncIteration: >> running = False >> else: >> BLOCK >> else: >> BLOCK2 >> >> >> It is an error to pass a regular iterable without ``__aiter__`` method >> to ``async for``. It is a ``SyntaxError`` to use ``async for`` outside >> of a coroutine. >> >> As for with regular ``for`` statement, ``async for`` has an optional >> ``else`` clause. >> >> >> Example 1 >> ''''''''' >> >> With asynchronous iteration protocol it is possible to asynchronously >> buffer data during iteration:: >> >> async for data in cursor: >> ... >> >> Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows >> of data from a database after every ``N`` iterations. >> >> The following code illustrates new asynchronous iteration protocol:: >> >> class Cursor: >> def __init__(self): >> self.buffer = collections.deque() >> >> def _prefetch(self): >> ... >> >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> if not self.buffer: >> self.buffer = await self._prefetch() >> if not self.buffer: >> raise StopAsyncIteration >> return self.buffer.popleft() >> >> then the ``Cursor`` class can be used as follows:: >> >> async for row in Cursor(): >> print(row) >> >> which would be equivalent to the following code:: >> >> i = await Cursor().__aiter__() >> while True: >> try: >> row = await i.__anext__() >> except StopAsyncIteration: >> break >> else: >> print(row) >> >> >> Example 2 >> ''''''''' >> >> The following is a utility class that transforms a regular iterable to >> an asynchronous one. While this is not a very useful thing to do, the >> code illustrates the relationship between regular and asynchronous >> iterators. >> >> :: >> >> class AsyncIteratorWrapper: >> def __init__(self, obj): >> self._it = iter(obj) >> >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> try: >> value = next(self._it) >> except StopIteration: >> raise StopAsyncIteration >> return value >> >> async for item in AsyncIteratorWrapper("abc"): >> print(item) >> >> >> Why StopAsyncIteration? >> ''''''''''''''''''''''' >> >> Coroutines are still based on generators internally. So, before PEP >> 479, there was no fundamental difference between >> >> :: >> >> def g1(): >> yield from fut >> return 'spam' >> >> and >> >> :: >> >> def g2(): >> yield from fut >> raise StopIteration('spam') >> >> And since PEP 479 is accepted and enabled by default for coroutines, >> the following example will have its ``StopIteration`` wrapped into a >> ``RuntimeError`` >> >> :: >> >> async def a1(): >> await fut >> raise StopIteration('spam') >> >> The only way to tell the outside code that the iteration has ended is >> to raise something other than ``StopIteration``. Therefore, a new >> built-in exception class ``StopAsyncIteration`` was added. 
>>
>> Moreover, with semantics from PEP 479, all ``StopIteration``
>> exceptions raised in coroutines are wrapped in ``RuntimeError``.
>>
>>
>> Debugging Features
>> ------------------
>>
>> One of the most frequent mistakes that people make when using
>> generators as coroutines is forgetting to use ``yield from``::
>>
>>     @asyncio.coroutine
>>     def useful():
>>         asyncio.sleep(1) # this will do nothing without 'yield from'
>>
>> For debugging this kind of mistake there is a special debug mode in
>> asyncio, in which the ``@coroutine`` decorator wraps all functions
>> with a special object with a destructor logging a warning. Whenever
>> a wrapped generator gets garbage collected, a detailed logging
>> message is generated with information about where exactly the
>> decorated function was defined, a stack trace of where it was
>> collected, etc. The wrapper object also provides a convenient
>> ``__repr__`` function with detailed information about the generator.
>>
>> The only problem is how to enable these debug capabilities. Since
>> debug facilities should be a no-op in production mode, the
>> ``@coroutine`` decorator makes the decision of whether to wrap or
>> not to wrap based on an OS environment variable
>> ``PYTHONASYNCIODEBUG``. This way it is possible to run asyncio
>> programs with asyncio's own functions instrumented.
>> ``EventLoop.set_debug``, a different debug facility, has no impact
>> on the ``@coroutine`` decorator's behavior.
>>
>> With this proposal, coroutines are a native concept, distinct from
>> generators. New functions ``set_coroutine_wrapper`` and
>> ``get_coroutine_wrapper`` are added to the ``sys`` module, with
>> which frameworks can provide advanced debugging facilities.
>>
>> It is also important to make coroutines as fast and efficient as
>> possible; therefore there are no debug features enabled by default.
>>
>> Example::
>>
>>     async def debug_me():
>>         await asyncio.sleep(1)
>>
>>     def async_debug_wrap(generator):
>>         return asyncio.AsyncDebugWrapper(generator)
>>
>>     sys.set_coroutine_wrapper(async_debug_wrap)
>>
>>     debug_me() # <- this line will likely GC the coroutine object and
>>                #    trigger AsyncDebugWrapper's code.
>>
>>     assert isinstance(debug_me(), AsyncDebugWrapper)
>>
>>     sys.set_coroutine_wrapper(None) # <- this unsets any
>>                                     #    previously set wrapper
>>     assert not isinstance(debug_me(), AsyncDebugWrapper)
>>
>> If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper
>> replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)``
>> unsets the wrapper.
>>
>>
>> Glossary
>> ========
>>
>> :Coroutine:
>>     A coroutine function, or just "coroutine", is declared with
>>     ``async def``. It uses ``await`` and ``return value``; see `New
>>     Coroutine Declaration Syntax`_ for details.
>>
>> :Coroutine object:
>>     Returned from a coroutine function. See `Await Expression`_ for
>>     details.
>>
>> :Future-like object:
>>     An object with an ``__await__`` method. Can be consumed by an
>>     ``await`` expression in a coroutine. A coroutine waiting for a
>>     Future-like object is suspended until the Future-like object's
>>     ``__await__`` completes, and returns the result. See `Await
>>     Expression`_ for details.
>>
>> :Awaitable:
>>     A *Future-like* object or a *coroutine object*. See `Await
>>     Expression`_ for details.
>>
>> :Generator-based coroutine:
>>     Coroutines based on generator syntax. The most common example is
>>     ``@asyncio.coroutine``.
>>
>> :Asynchronous context manager:
>>     An asynchronous context manager has ``__aenter__`` and
>>     ``__aexit__`` methods and can be used with ``async with``. See
>>     `Asynchronous Context Managers and "async with"`_ for details.
>>
>> :Asynchronous iterable:
>>     An object with an ``__aiter__`` method, which must return an
>>     *asynchronous iterator* object. Can be used with ``async for``.
>>     See `Asynchronous Iterators and "async for"`_ for details.
>>
>> :Asynchronous iterator:
>>     An asynchronous iterator has an ``__anext__`` method. See
>>     `Asynchronous Iterators and "async for"`_ for details.
>>
>>
>> List of functions and methods
>> =============================
>>
>> ================= =================================== =================
>> Method            Can contain                         Can't contain
>> ================= =================================== =================
>> async def func    await, return value                 yield, yield from
>> async def __a*__  await, return value                 yield, yield from
>> def __a*__        return awaitable                    await
>> def __await__     yield, yield from, return iterable  await
>> generator         yield, yield from, return value     await
>> ================= =================================== =================
>>
>> Where:
>>
>> * "async def func": coroutine;
>>
>> * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
>>   ``__aexit__`` defined with the ``async`` keyword;
>>
>> * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
>>   ``__aexit__`` defined without the ``async`` keyword, must return
>>   an *awaitable*;
>>
>> * "def __await__": ``__await__`` method to implement *Future-like*
>>   objects;
>>
>> * generator: a "regular" generator, a function defined with ``def``
>>   which contains at least one ``yield`` or ``yield from`` expression.
>>
>>
>> Transition Plan
>> ===============
>>
>> To avoid backwards compatibility issues with ``async`` and ``await``
>> keywords, it was decided to modify ``tokenizer.c`` in such a way
>> that it:
>>
>> * recognizes the ``async def`` name token combination (start of a
>>   coroutine);
>>
>> * keeps track of regular functions and coroutines;
>>
>> * replaces the ``'async'`` token with ``ASYNC`` and the ``'await'``
>>   token with ``AWAIT`` when in the process of yielding tokens for
>>   coroutines.
>>
>> This approach allows for seamless combination of new syntax features
>> (all of them available only in ``async`` functions) with any
>> existing code.
>>
>> An example of having "async def" and an "async" attribute in one
>> piece of code::
>>
>>     class Spam:
>>         async = 42
>>
>>     async def ham():
>>         print(getattr(Spam, 'async'))
>>
>>     # The coroutine can be executed and will print '42'
>>
>>
>> Backwards Compatibility
>> -----------------------
>>
>> This proposal preserves 100% backwards compatibility.
>>
>>
>> Grammar Updates
>> ---------------
>>
>> Grammar changes are also fairly minimal::
>>
>>     await_expr: AWAIT test
>>     await_stmt: await_expr
>>
>>     decorated: decorators (classdef | funcdef | async_funcdef)
>>     async_funcdef: ASYNC funcdef
>>
>>     async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
>>
>>     compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt |
>>                     with_stmt | funcdef | classdef | decorated |
>>                     async_stmt)
>>
>>     atom: ('(' [yield_expr|await_expr|testlist_comp] ')' |
>>            '[' [testlist_comp] ']' |
>>            '{' [dictorsetmaker] '}' |
>>            NAME | NUMBER | STRING+ | '...' | 'None' | 'True' | 'False')
>>
>>     expr_stmt: testlist_star_expr
>>                (augassign (yield_expr|await_expr|testlist) |
>>                 ('=' (yield_expr|await_expr|testlist_star_expr))*)
>>
>>
>> Transition Period Shortcomings
>> ------------------------------
>>
>> There is just one.
>>
>> Until ``async`` and ``await`` become proper keywords, it is not
>> possible (or at least very hard) to fix ``tokenizer.c`` to recognize
>> them on the **same line** as the ``def`` keyword::
>>
>>     # async and await will always be parsed as variables
>>
>>     async def outer():                           # 1
>>         def nested(a=(await fut)):
>>             pass
>>
>>     async def foo(): return (await fut)          # 2
>>
>> Since ``await`` and ``async`` in such cases are parsed as ``NAME``
>> tokens, a ``SyntaxError`` will be raised.
>>
>> To work around these issues, the above examples can be easily
>> rewritten to a more readable form::
>>
>>     async def outer():                           # 1
>>         a_default = await fut
>>         def nested(a=a_default):
>>             pass
>>
>>     async def foo():                             # 2
>>         return (await fut)
>>
>> This limitation will go away as soon as ``async`` and ``await`` are
>> proper keywords. Or if it's decided to use a future import for this
>> PEP.
>>
>>
>> Deprecation Plans
>> -----------------
>>
>> ``async`` and ``await`` names will be softly deprecated in CPython
>> 3.5 and 3.6. In 3.7 we will transform them to proper keywords.
>> Making ``async`` and ``await`` proper keywords before 3.7 might make
>> it harder for people to port their code to Python 3.
>>
>>
>> asyncio
>> -------
>>
>> The ``asyncio`` module was adapted and tested to work with
>> coroutines and the new statements. Backwards compatibility is 100%
>> preserved.
>>
>> The required changes are mainly:
>>
>> 1. Modify the ``@asyncio.coroutine`` decorator to use the new
>>    ``types.coroutine()`` function.
>>
>> 2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future``
>>    class.
>>
>> 3. Add ``ensure_task()`` as an alias for the ``async()`` function.
>>    Deprecate the ``async()`` function.
>>
>>
>> Design Considerations
>> =====================
>>
>> PEP 3152
>> --------
>>
>> PEP 3152 by Gregory Ewing proposes a different mechanism for
>> coroutines (called "cofunctions"). Some key points:
>>
>> 1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction*
>>    is always a generator, even if there are no ``cocall``
>>    expressions inside it. Maps to ``async def`` in this proposal.
>>
>> 2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
>>    inside a *cofunction*. Maps to ``await`` in this proposal (with
>>    some differences, see below.)
>>
>> 3. It is not possible to call a *cofunction* without a ``cocall``
>>    keyword.
>>
>> 4. ``cocall`` grammatically requires parentheses after it::
>>
>>     atom: cocall |
>>     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
>>     cotrailer: '[' subscriptlist ']' | '.' NAME
>>
>> 5. ``cocall f(*args, **kwds)`` is semantically equivalent to
>>    ``yield from f.__cocall__(*args, **kwds)``.
>>
>> Differences from this proposal:
>>
>> 1. There is no equivalent of ``__cocall__`` in this PEP, which is
>>    called and its result is passed to ``yield from`` in the
>>    ``cocall`` expression. The ``await`` keyword expects an
>>    *awaitable* object, validates the type, and executes ``yield
>>    from`` on it. The ``__await__`` method is similar to
>>    ``__cocall__``, but is only used to define *Future-like* objects.
>>
>> 2. ``await`` is defined in almost the same way as ``yield from`` in
>>    the grammar (it is later enforced that ``await`` can only be
>>    inside ``async def``).
>>    It is possible to simply write ``await future``, whereas
>>    ``cocall`` always requires parentheses.
>>
>> 3. To make asyncio work with PEP 3152 it would be required to modify
>>    the ``@asyncio.coroutine`` decorator to wrap all functions in an
>>    object with a ``__cocall__`` method. To call *cofunctions* from
>>    existing generator-based coroutines it would be required to use
>>    the ``costart`` built-in. In this proposal ``@asyncio.coroutine``
>>    simply sets ``CO_COROUTINE`` on the wrapped function's code
>>    object and everything works automatically.
>>
>> 4. Since it is impossible to call a *cofunction* without a
>>    ``cocall`` keyword, it automatically prevents the common mistake
>>    of forgetting to use ``yield from`` on generator-based
>>    coroutines. This proposal addresses this problem with a different
>>    approach, see `Debugging Features`_.
>>
>> 5. A shortcoming of requiring a ``cocall`` keyword to call a
>>    coroutine is that if it is decided to implement
>>    coroutine-generators -- coroutines with ``yield`` or ``async
>>    yield`` expressions -- we wouldn't need a ``cocall`` keyword to
>>    call them. So we'll end up having ``__cocall__`` and no
>>    ``__call__`` for regular coroutines, and having ``__call__`` and
>>    no ``__cocall__`` for coroutine-generators.
>>
>> 6. There are no equivalents of ``async for`` and ``async with`` in
>>    PEP 3152.
>>
>>
>> Coroutine-generators
>> --------------------
>>
>> With the ``async for`` keyword it is desirable to have a concept of
>> a *coroutine-generator* -- a coroutine with ``yield`` and ``yield
>> from`` expressions. To avoid any ambiguity with regular generators,
>> we would likely require an ``async`` keyword before ``yield``, and
>> ``async yield from`` would raise a ``StopAsyncIteration`` exception.
>>
>> While it is possible to implement coroutine-generators, we believe
>> that they are out of scope of this proposal. It is an advanced
>> concept that should be carefully considered and balanced, with
>> non-trivial changes in the implementation of current generator
>> objects. This is a matter for a separate PEP.
>>
>>
>> No implicit wrapping in Futures
>> -------------------------------
>>
>> There is a proposal to add a similar mechanism to ECMAScript 7
>> [2]_. A key difference is that JavaScript "async functions" always
>> return a Promise. While this approach has some advantages, it also
>> implies that a new Promise object is created on each "async
>> function" invocation.
>>
>> We could implement similar functionality in Python, by wrapping all
>> coroutines in a Future object, but this has the following
>> disadvantages:
>>
>> 1. Performance. A new Future object would be instantiated on each
>>    coroutine call. Moreover, this makes the implementation of
>>    ``await`` expressions slower (disabling optimizations of ``yield
>>    from``).
>>
>> 2. A new built-in ``Future`` object would need to be added.
>>
>> 3. Coming up with a generic ``Future`` interface that is usable for
>>    any use case in any framework is a very hard problem to solve.
>>
>> 4. It is not a feature that is used frequently, when most of the
>>    code is coroutines.
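To make the contrast with the JavaScript approach concrete, here is a
minimal, runnable sketch (an editorial illustration, not part of the
PEP) showing that under this proposal a coroutine call allocates no
Future, and wrapping into a Task/Future stays explicit and opt-in:

    import asyncio

    async def compute():
        await asyncio.sleep(0.1)
        return 42

    coro = compute()  # a plain coroutine object; no Future is allocated

    loop = asyncio.get_event_loop()
    task = loop.create_task(coro)  # wrapping in a Task/Future is explicit
    print(loop.run_until_complete(task))  # -> 42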
>> >> >> Why "async" and "await" keywords >> -------------------------------- >> >> async/await is not a new concept in programming languages: >> >> * C# has it since long time ago [5]_; >> >> * proposal to add async/await in ECMAScript 7 [2]_; >> see also Traceur project [9]_; >> >> * Facebook's Hack/HHVM [6]_; >> >> * Google's Dart language [7]_; >> >> * Scala [8]_; >> >> * proposal to add async/await to C++ [10]_; >> >> * and many other less popular languages. >> >> This is a huge benefit, as some users already have experience with >> async/await, and because it makes working with many languages in one >> project easier (Python with ECMAScript 7 for instance). >> >> >> Why "__aiter__" is a coroutine >> ------------------------------ >> >> In principle, ``__aiter__`` could be a regular function. There are >> several good reasons to make it a coroutine: >> >> * as most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` >> methods are coroutines, users would often make a mistake defining it >> as ``async`` anyways; >> >> * there might be a need to run some asynchronous operations in >> ``__aiter__``, for instance to prepare DB queries or do some file >> operation. >> >> >> Importance of "async" keyword >> ----------------------------- >> >> While it is possible to just implement ``await`` expression and treat >> all functions with at least one ``await`` as coroutines, this approach >> makes APIs design, code refactoring and its long time support harder. >> >> Let's pretend that Python only has ``await`` keyword:: >> >> def useful(): >> ... >> await log(...) >> ... >> >> def important(): >> await useful() >> >> If ``useful()`` function is refactored and someone removes all >> ``await`` expressions from it, it would become a regular python >> function, and all code that depends on it, including ``important()`` >> would be broken. To mitigate this issue a decorator similar to >> ``@asyncio.coroutine`` has to be introduced. >> >> >> Why "async def" >> --------------- >> >> For some people bare ``async name(): pass`` syntax might look more >> appealing than ``async def name(): pass``. It is certainly easier to >> type. But on the other hand, it breaks the symmetry between ``async >> def``, ``async with`` and ``async for``, where ``async`` is a modifier, >> stating that the statement is asynchronous. It is also more consistent >> with the existing grammar. >> >> >> Why not a __future__ import >> --------------------------- >> >> ``__future__`` imports are inconvenient and easy to forget to add. >> Also, they are enabled for the whole source file. Consider that there >> is a big project with a popular module named "async.py". With future >> imports it is required to either import it using ``__import__()`` or >> ``importlib.import_module()`` calls, or to rename the module. The >> proposed approach makes it possible to continue using old code and >> modules without a hassle, while coming up with a migration plan for >> future python versions. >> >> >> Why magic methods start with "a" >> -------------------------------- >> >> New asynchronous magic methods ``__aiter__``, ``__anext__``, >> ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". >> An alternative proposal is to use "async" prefix, so that ``__aiter__`` >> becomes ``__async_iter__``. However, to align new magic methods with >> the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided >> to use a shorter version. 
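As a small illustration of these "a"-prefixed methods, and of why a
coroutine ``__aiter__`` can be useful, here is a sketch of an iterator
that performs asynchronous setup before yielding rows. The ``db``
object and its ``execute()``/``fetchone()`` methods are hypothetical:

    class Rows:
        def __init__(self, db, query):
            self._db = db
            self._query = query
            self._cursor = None

        async def __aiter__(self):
            # Asynchronous setup is possible because __aiter__ is a
            # coroutine under this proposal.
            self._cursor = await self._db.execute(self._query)
            return self

        async def __anext__(self):
            row = await self._cursor.fetchone()
            if row is None:
                raise StopAsyncIteration
            return row

With this, ``async for row in Rows(db, 'SELECT ...')`` first awaits
the query, then awaits each row.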
>> >> >> Why not reuse existing magic names >> ---------------------------------- >> >> An alternative idea about new asynchronous iterators and context >> managers was to reuse existing magic methods, by adding an ``async`` >> keyword to their declarations:: >> >> class CM: >> async def __enter__(self): # instead of __aenter__ >> ... >> >> This approach has the following downsides: >> >> * it would not be possible to create an object that works in both >> ``with`` and ``async with`` statements; >> >> * it would look confusing and would require some implicit magic behind >> the scenes in the interpreter; >> >> * one of the main points of this proposal is to make coroutines as >> simple and foolproof as possible. >> >> >> Comprehensions >> -------------- >> >> For the sake of restricting the broadness of this PEP there is no new >> syntax for asynchronous comprehensions. This should be considered in a >> separate PEP, if there is a strong demand for this feature. >> >> >> Async lambdas >> ------------- >> >> Lambda coroutines are not part of this proposal. In this proposal they >> would look like ``async lambda(parameters): expression``. Unless there >> is a strong demand to have them as part of this proposal, it is >> recommended to consider them later in a separate PEP. >> >> >> Performance >> =========== >> >> Overall Impact >> -------------- >> >> This proposal introduces no observable performance impact. Here is an >> output of python's official set of benchmarks [4]_: >> >> :: >> >> python perf.py -r -b default ../cpython/python.exe >> ../cpython-aw/python.exe >> >> [skipped] >> >> Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: >> Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 >> x86_64 i386 >> >> Total CPU cores: 8 >> >> ### etree_iterparse ### >> Min: 0.365359 -> 0.349168: 1.05x faster >> Avg: 0.396924 -> 0.379735: 1.05x faster >> Significant (t=9.71) >> Stddev: 0.01225 -> 0.01277: 1.0423x larger >> >> The following not significant results are hidden, use -v to show them: >> django_v2, 2to3, etree_generate, etree_parse, etree_process, >> fastpickle, >> fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http. >> >> >> Tokenizer modifications >> ----------------------- >> >> There is no observable slowdown of parsing python files with the >> modified tokenizer: parsing of one 12Mb file >> (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount >> of time. >> >> >> async/await >> ----------- >> >> The following micro-benchmark was used to determine performance >> difference between "async" functions and generators:: >> >> import sys >> import time >> >> def binary(n): >> if n <= 0: >> return 1 >> l = yield from binary(n - 1) >> r = yield from binary(n - 1) >> return l + 1 + r >> >> async def abinary(n): >> if n <= 0: >> return 1 >> l = await abinary(n - 1) >> r = await abinary(n - 1) >> return l + 1 + r >> >> def timeit(gen, depth, repeat): >> t0 = time.time() >> for _ in range(repeat): >> list(gen(depth)) >> t1 = time.time() >> print('{}({}) * {}: total {:.3f}s'.format( >> gen.__name__, depth, repeat, t1-t0)) >> >> The result is that there is no observable performance difference. >> Minimum timing of 3 runs >> >> :: >> >> abinary(19) * 30: total 12.985s >> binary(19) * 30: total 12.953s >> >> Note that depth of 19 means 1,048,575 calls. >> >> >> Reference Implementation >> ======================== >> >> The reference implementation can be found here: [3]_. 
>>
>> List of high-level changes and new protocols
>> --------------------------------------------
>>
>> 1. New syntax for defining coroutines: ``async def`` and new
>>    ``await`` keyword.
>>
>> 2. New ``__await__`` method for Future-like objects.
>>
>> 3. New syntax for asynchronous context managers: ``async with``. And
>>    an associated protocol with ``__aenter__`` and ``__aexit__``
>>    methods.
>>
>> 4. New syntax for asynchronous iteration: ``async for``. And an
>>    associated protocol with ``__aiter__``, ``__anext__`` and a new
>>    built-in exception ``StopAsyncIteration``.
>>
>> 5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
>>    ``Await``.
>>
>> 6. New functions: ``sys.set_coroutine_wrapper(callback)``,
>>    ``sys.get_coroutine_wrapper()``, and ``types.coroutine(gen)``.
>>
>> 7. New ``CO_COROUTINE`` bit flag for code objects.
>>
>> While the list of changes and new things is not short, it is
>> important to understand that most users will not use these features
>> directly. They are intended to be used in frameworks and libraries
>> to provide users with convenient and unambiguous APIs with ``async
>> def``, ``await``, ``async for`` and ``async with`` syntax.
>>
>>
>> Working example
>> ---------------
>>
>> All concepts proposed in this PEP are implemented [3]_ and can be
>> tested.
>>
>> ::
>>
>>     import asyncio
>>
>>     async def echo_server():
>>         print('Serving on localhost:8000')
>>         await asyncio.start_server(handle_connection,
>>                                    'localhost', 8000)
>>
>>     async def handle_connection(reader, writer):
>>         print('New connection...')
>>
>>         while True:
>>             data = await reader.read(8192)
>>
>>             if not data:
>>                 break
>>
>>             print('Sending {:.10}... back'.format(repr(data)))
>>             writer.write(data)
>>
>>     loop = asyncio.get_event_loop()
>>     loop.run_until_complete(echo_server())
>>     try:
>>         loop.run_forever()
>>     finally:
>>         loop.close()
>>
>>
>> References
>> ==========
>>
>> .. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
>>
>> .. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
>>
>> .. [3] https://github.com/1st1/cpython/tree/await
>>
>> .. [4] https://hg.python.org/benchmarks
>>
>> .. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
>>
>> .. [6] http://docs.hhvm.com/manual/en/hack.async.php
>>
>> .. [7] https://www.dartlang.org/articles/await-async/
>>
>> .. [8] http://docs.scala-lang.org/sips/pending/async.html
>>
>> .. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental
>>
>> .. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf
>>    (PDF)
>>
>>
>> Acknowledgments
>> ===============
>>
>> I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus,
>> Andrew Svetlov, and Łukasz Langa for their initial feedback.
>>
>>
>> Copyright
>> =========
>>
>> This document has been placed in the public domain.
>> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/tds333%2Bpydev%40gmail.com > > > > > -- > bye by Wolfgang > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com > -- Thanks, Andrew Svetlov From greg.ewing at canterbury.ac.nz Thu Apr 23 11:25:02 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 21:25:02 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538431A.1020709@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> Message-ID: <5538BA6E.4080906@canterbury.ac.nz> Yury Selivanov wrote: > I think that the problem of forgetting 'yield from' is a bit > exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's > it, it has never happened since. I think it's more likely to happen when you start with an ordinary function, then discover that it needs to be suspendable, so you need to track down all the places that call it, and all the places that call those, etc. PEP 3152 ensures that you get clear diagnostics if you miss any. -- Greg From andrew.svetlov at gmail.com Thu Apr 23 11:39:48 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Thu, 23 Apr 2015 12:39:48 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538BA6E.4080906@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> Message-ID: Greg, how waiting for multiple cocalls should look and work? In asyncio when I need to wait for two and more coroutines/futures I use `asyncio.gather()`: yield from gather(coro1(a1, a2), coro2(), fut3) >From my understanding to use cofunctions I must wrap it with costart call: yield from gather(costart(coro1, a1, a2), costart(coro2), fut3) That looks weird. There are other places in asyncio API those accept coroutines or futures as parameters, not only Task() and async(). On Thu, Apr 23, 2015 at 12:25 PM, Greg Ewing wrote: > Yury Selivanov wrote: >> >> I think that the problem of forgetting 'yield from' is a bit exaggerated. >> Yes, I myself forgot 'yield from' once or twice. But that's it, it has never >> happened since. > > > I think it's more likely to happen when you start with > an ordinary function, then discover that it needs to > be suspendable, so you need to track down all the > places that call it, and all the places that call > those, etc. PEP 3152 ensures that you get clear > diagnostics if you miss any. 
> > -- > Greg > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From greg.ewing at canterbury.ac.nz Thu Apr 23 12:29:19 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 23 Apr 2015 22:29:19 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538469C.6050100@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538469C.6050100@gmail.com> Message-ID: <5538C97F.7010804@canterbury.ac.nz> Yury Selivanov wrote: > So you would have to write 'await fut()'? This is non-intuitive. That's because PEP 492 and its terminology encourage you to think of 'await f()' as a two-step process: evaluate f(), and then wait for the thing it returns to produce a result. PEP 3152 has a different philosophy. There, 'cocall f()' is a one-step process: call f and get back a result (while being prepared to get suspended in the meantime). The two-step approach has the advantage that you can get hold of the intermediate object and manipulate it. But I don't see much utility in being able to do that. Keep in mind that you can treat cofunctions themselves as objects to be manipulated, just like you can with ordinary functions, and all the usual techniques such as closures, * and ** parameters, etc. are available if you want to encapsulate one with some arguments. About the only thing you gain from being able to pass generator-iterators around instead of the functions that produce them is that you get to write t = Task(func(args)) instead of t = Task(func, args) which seems like a very minor thing to me. I would even argue that the latter is clearer, because it makes it very obvious that the body of func is *not* executed before the Task is constructed. The former makes it look as though the *result* of executing func with args is being passed to Task, rather than func itself. -- Greg From greg.ewing at canterbury.ac.nz Thu Apr 23 14:10:56 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 00:10:56 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> Message-ID: <5538E150.2050803@canterbury.ac.nz> Andrew Svetlov wrote: > From my understanding to use cofunctions I must wrap it with costart call: > > yield from gather(costart(coro1, a1, a2), costart(coro2), fut3) > > There are other places in asyncio API those accept coroutines or > futures as parameters, not only Task() and async(). In a PEP 3152 aware version of asyncio, they would all know about cofunctions and what to do with them. 
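(A sketch of one way such awareness could look -- an editorial
assumption, not something either PEP specifies. Functions like
gather() could normalize their arguments before scheduling them;
``costart`` here is PEP 3152's proposed built-in, approximated in
plain Python:)

    def costart(cofunc, *args, **kwds):
        # A PEP 3152 cofunction object would expose __cocall__, which
        # returns an object generator-based code can 'yield from'.
        return cofunc.__cocall__(*args, **kwds)

    def _adapt(arg):
        # Adapt a zero-argument cofunction; pass plain coroutine
        # objects and Futures through unchanged.
        if hasattr(arg, '__cocall__'):
            return costart(arg)
        return arg

    # gather(coro1, coro2, fut3) would then run each argument
    # through _adapt() before scheduling it.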
-- Greg From greg.ewing at canterbury.ac.nz Thu Apr 23 14:18:50 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 00:18:50 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423131357.73c954e1@x230> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <20150423021236.2b2039ca@x230> <5538AFD7.5010709@canterbury.ac.nz> <20150423131357.73c954e1@x230> Message-ID: <5538E32A.1000705@canterbury.ac.nz> Paul Sokolovsky wrote: > Greg Ewing wrote: > >>You can also use a trampoline of some kind to >>relay values back and forth between generators, >>to get something symmetric. > > Yes, that's of course how coroutine frameworks were done long before > "yield from" appeared and how Trollius works now. No, what I mean is that if you want to send stuff back and forth between two particular coroutines in a symmetric way, you can write a specialised scheduler that just handles those coroutines. If you want to do that at the same time that other things are going on, I think you're better off not trying to do it using yield. Use a general scheduler such as asyncio, and some traditional IPC mechanism such as a queue for communication. -- Greg From andrew.svetlov at gmail.com Thu Apr 23 14:24:28 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Thu, 23 Apr 2015 15:24:28 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538E150.2050803@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> Message-ID: On Thu, Apr 23, 2015 at 3:10 PM, Greg Ewing wrote: > Andrew Svetlov wrote: >> >> From my understanding to use cofunctions I must wrap it with costart call: >> >> yield from gather(costart(coro1, a1, a2), costart(coro2), fut3) >> >> There are other places in asyncio API those accept coroutines or >> futures as parameters, not only Task() and async(). > > > In a PEP 3152 aware version of asyncio, they would all > know about cofunctions and what to do with them. > But we already have asyncio and code based on asyncio coroutines. To make it work I should always use costart() in places where asyncio requires coroutine. Maybe your proposal is better than current asyncio practice. But now asyncio is built on top of two-step process, as you have mentioned: building coroutine and waiting for it's result. That's why I prefer `await` as replace for well-known `yield from`. > > -- > Greg > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From tds333+pydev at gmail.com Thu Apr 23 14:27:07 2015 From: tds333+pydev at gmail.com (Wolfgang Langner) Date: Thu, 23 Apr 2015 14:27:07 +0200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423133554.1fe32575@x230> References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> Message-ID: On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky wrote: > Hello, > > On Thu, 23 Apr 2015 12:18:51 +0300 > Andrew Svetlov wrote: > > [] > > > > 3. > > > async with and async for > > > Bead idea, we clutter the language even more and it is one more > > > thing every newbie could do wrong. 
> > > for x in y: > > > result = await f() > > > is enough, every 'async' framework lived without it over years. > > > > async for i in iterable: > > pass > > > > is not equal for > > > > for fut in iterable: > > i = yield from fut > > But people who used Twisted all their life don't know that! They just > know that "async for" is not needed and bad. > > I don't think it is bad nor not needed, but the syntax is not beautiful and for the 90% not doing async stuff irritating and one more thing to learn and do right/wrong. I had also a need for async loop. But there are other solutions like channels, not needing a new syntax. Also possible a function returning futures and yield in the loop with a sentinel. All this goes the road down to a producer consumer pattern. Nothing more. > I know I'm a bad guy to make such comments, too bad there's a bit of > truth in them, or everyone would just call me an a%$&ole right away. > > > Generally, I already provided feedback (on asyncio list) that asyncio > is based not on native Python concepts like a coroutine, but on > foreign concurrency concepts like callback or Future, and a coroutine > is fitted as a second-class citizen on top of that. I understand why > that was done - to not leave out all those twisteds from a shiny new > world of asyncio, but sometimes one may wonder if having a clear cut > would've helped (compat could then have been added as a clearly separate > subpackage, implemented in terms of coroutines). Now people coming from > non-coroutine frameworks who were promised compatibility see "bad" > things in asyncio (and related areas), and people lured by a promise of > native Python framework see bad things too. > > This has nothing to do with people using twisted or other async frameworks like tornado. I think a coroutine should be first class. But all this should be done in a way a beginner can handle and not design this stuff for experts only. If we do this we scare away new people. This can be done step by step. No need to hurry. And finally we have stackless Python but explicit. ;-) -- bye by Wolfgang -------------- next part -------------- An HTML attachment was scrubbed... URL: From tds333+pydev at gmail.com Thu Apr 23 14:48:58 2015 From: tds333+pydev at gmail.com (Wolfgang Langner) Date: Thu, 23 Apr 2015 14:48:58 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150423125957.631a8711@x230> References: <20150423125957.631a8711@x230> Message-ID: Hello, On Thu, Apr 23, 2015 at 11:59 AM, Paul Sokolovsky wrote: > Hello, > > On Thu, 23 Apr 2015 10:43:52 +0200 > Wolfgang Langner wrote: > > [] > > > Also ask why no one used type specifier, they are possible since > > Python 3.0 ? > > Because it is the wrong way for Python. > > That's an example of how perceptions differ. In my list, everyone(*) > uses them - MyPy, MicroPython, etc. Even more should use them (any JIT > module, which are many), but sit in the bushes, waiting for a kick, like > PEP484 provides. > > > (*) Everyone of those who needs them. Otherwise, let's throw out > metaclasses - noone uses them. > > They are there to be used and won't go away. But for most Libraries out there no one used it. JIT (Numba), Cython and other compilers/tools doing optimization have their own syntax and needs. They need even more information like i32, i64, different floats and so on. MyPy is really new and in development. And the docstring type spec way is still possible. MicroPython uses one annotation for their inline assembler stuff not type hints. 
For the library there are no type hints for function definitions. Remark: I use Metaclasses, seldom but if needed very useful. :-) -- bye by Wolfgang -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Thu Apr 23 15:01:31 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 09:01:31 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538E150.2050803@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> Message-ID: <5538ED2B.2070108@gmail.com> On 2015-04-23 8:10 AM, Greg Ewing wrote: > Andrew Svetlov wrote: >> From my understanding to use cofunctions I must wrap it with costart >> call: >> >> yield from gather(costart(coro1, a1, a2), costart(coro2), fut3) >> >> There are other places in asyncio API those accept coroutines or >> futures as parameters, not only Task() and async(). > > In a PEP 3152 aware version of asyncio, they would all > know about cofunctions and what to do with them. > What do you mean by that? In a PEP 3152 aware version of asyncio, it's just *not possible to write* cocall gather(coro1(1,2), coro(2,3)) you just have to use your 'costart' built-in: cocall gather(costart(coro1, 1, 2), costart(coro, 2,3)). That's all. That's PEP 3152-aware world. Somehow you think that it's OK to write cocall fut() # instead of just cocall fut() *But it's not*. A huge amount of existing code won't work. You won't be able to migrate it to new syntax easily. If you have a future object 'fut', it's not intuitive or pythonic to write 'cocall fut()'. PEP 3152 was created in pre-asyncio era, and it shows. It's just not gonna work. I know because I designed PEP 492 with a reference implementation at hand, tuning the proposal to make it backwards compatible and on the other hand to actually improve things. Your idea of syntaticaly forcing to use 'cocall' with parens is cute, but it breaks so many things and habits that it just doesn't worth it. Yury From yselivanov.ml at gmail.com Thu Apr 23 15:03:59 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 09:03:59 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538ED2B.2070108@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> <5538ED2B.2070108@gmail.com> Message-ID: <5538EDBF.7040302@gmail.com> On 2015-04-23 9:01 AM, Yury Selivanov wrote: > cocall fut() # instead of just cocall fut() Should be: cocall fut() # instead of just cocall fut Yury From me at the-compiler.org Thu Apr 23 11:01:23 2015 From: me at the-compiler.org (Florian Bruhin) Date: Thu, 23 Apr 2015 11:01:23 +0200 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: <20150423090123.GW429@tonks> * Wolfgang Langner [2015-04-23 10:43:52 +0200]: > 2. Using it in the language as part of the function signature, my first > thought was oh good, then I changed my mind > to: oh it can be very ugly and unreadable, it is the wrong place. > Now I am against it, best is, if I have to specify type signatures, do > it in one place, keep them up to date. > Most of the time this is the documentation. 
Why not use the docstring > with a standard type specifier for this. > Suggested here: > http://pydev.blogspot.de/2015/04/type-hinting-on-python.html While I happen to agree with you (but I'm happy with both variants really), I think that's a thing which has definitely been decided already. The idea is also listed in the PEP: https://www.python.org/dev/peps/pep-0484/#other-backwards-compatible-conventions Docstrings. There is an existing convention for docstrings, based on the Sphinx notation ( :type arg1: description ). This is pretty verbose (an extra line per parameter), and not very elegant. We could also make up something new, but the annotation syntax is hard to beat (because it was designed for this very purpose). > For nearly every function I have written, there is a docstring and most of > the time also a type specified. > But if I must provide all this in a second place it is not the right way to > go. Over time normally one place misses some changes and is wrong. It seems there's an extension for Sphinx already to use type annotations: https://pypi.python.org/pypi/sphinx-autodoc-annotation It seems to be older than PEP 484 (December 2013), so I hope it'll be updated or already work well with the ideas in the PEP. > Also ask why no one used type specifier, they are possible since Python 3.0 > ? > Because it is the wrong way for Python. Well, except that Sphinx extension, and MyPy, and MicroPython, and a thing which already exists for run-time type checking[1] and probably a whole lot more. The whole *reason* for PEP 484 (at least from my perspective) is to have a common base for the existing usages of type annotations. Florian [1] https://github.com/ceronman/typeannotations -- http://www.the-compiler.org | me at the-compiler.org (Mail/XMPP) GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc I love long mails! | http://email.is-not-s.ms/ -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: not available URL: From pmiscml at gmail.com Thu Apr 23 11:59:57 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 12:59:57 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: <20150423125957.631a8711@x230> Hello, On Thu, 23 Apr 2015 10:43:52 +0200 Wolfgang Langner wrote: [] > Also ask why no one used type specifier, they are possible since > Python 3.0 ? > Because it is the wrong way for Python. That's an example of how perceptions differ. In my list, everyone(*) uses them - MyPy, MicroPython, etc. Even more should use them (any JIT module, which are many), but sit in the bushes, waiting for a kick, like PEP484 provides. (*) Everyone of those who needs them. Otherwise, let's throw out metaclasses - noone uses them. 
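(To ground the claim, a minimal sketch of the kind of annotated code a
checker such as MyPy already consumes; the diagnostic wording below is
paraphrased, not verbatim mypy output:)

    def greet(name: str) -> str:
        return 'Hello, ' + name

    greet('world')  # accepted
    greet(42)       # rejected by a static checker: "int" is not "str"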
> Regards, > > Wolfgang -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Thu Apr 23 12:13:57 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 13:13:57 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538AFD7.5010709@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <5537C9D0.80505@gmail.com> <20150423021236.2b2039ca@x230> <5538AFD7.5010709@canterbury.ac.nz> Message-ID: <20150423131357.73c954e1@x230> Hello, On Thu, 23 Apr 2015 20:39:51 +1200 Greg Ewing wrote: > Paul Sokolovsky wrote: > > And having both asymmetric and symmetric > > would quite confusing, especially that symmetric are more powerful > > and asymmetric can be easily implemented in terms of symmetric using > > continuation-passing style. > > You can also use a trampoline of some kind to > relay values back and forth between generators, > to get something symmetric. Yes, that's of course how coroutine frameworks were done long before "yield from" appeared and how Trollius works now. But this just proves point given to the original subtopic starter - that Python already has powerful enough machinery to achieve functionality needed for "yield to asyncio main loop", and adding something specifically for that will only make situation more complicated (because what is asyncio main loop? Just a random user-level function/coroutine, if you need to support yielding "directly" to it, you need to supporting yielding directly to an arbitrary coroutine). > > -- > Greg -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Thu Apr 23 12:35:54 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 13:35:54 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> Message-ID: <20150423133554.1fe32575@x230> Hello, On Thu, 23 Apr 2015 12:18:51 +0300 Andrew Svetlov wrote: [] > > 3. > > async with and async for > > Bead idea, we clutter the language even more and it is one more > > thing every newbie could do wrong. > > for x in y: > > result = await f() > > is enough, every 'async' framework lived without it over years. > > async for i in iterable: > pass > > is not equal for > > for fut in iterable: > i = yield from fut But people who used Twisted all their life don't know that! They just know that "async for" is not needed and bad. I know I'm a bad guy to make such comments, too bad there's a bit of truth in them, or everyone would just call me an a%$&ole right away. Generally, I already provided feedback (on asyncio list) that asyncio is based not on native Python concepts like a coroutine, but on foreign concurrency concepts like callback or Future, and a coroutine is fitted as a second-class citizen on top of that. I understand why that was done - to not leave out all those twisteds from a shiny new world of asyncio, but sometimes one may wonder if having a clear cut would've helped (compat could then have been added as a clearly separate subpackage, implemented in terms of coroutines). Now people coming from non-coroutine frameworks who were promised compatibility see "bad" things in asyncio (and related areas), and people lured by a promise of native Python framework see bad things too. 
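(To illustrate the two layers being contrasted, a small runnable aside
using only stock asyncio APIs of that era:)

    import asyncio

    loop = asyncio.get_event_loop()

    # Callback/Future layer:
    loop.call_soon(print, 'hello from a callback')

    # Coroutine layer, fitted on top via the @coroutine decorator:
    @asyncio.coroutine
    def hello():
        yield from asyncio.sleep(0)
        print('hello from a coroutine')

    loop.run_until_complete(hello())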
From ldvc112 at gmail.com Thu Apr 23 14:27:37 2015
From: ldvc112 at gmail.com (Leonid Kokorev)
Date: Thu, 23 Apr 2015 15:27:37 +0300
Subject: [Python-Dev] Task Request
Message-ID: <5538E539.10306@gmail.com>

Hello.
My name is Leonid. I'm very interested in the Python programming language. I'm sending this letter to the python-dev mailing list in order to get a task that I could perform. I want to help my community and improve my skills. If anybody has a not-too-complicated task, please email me at ldvc112 at gmail.com.

From victor.stinner at gmail.com Thu Apr 23 10:43:11 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Thu, 23 Apr 2015 10:43:11 +0200
Subject: [Python-Dev] PEP 3152 and yield from Future()
Message-ID:

(I prefer to start a new thread, the previous one is too long for me :-))

Hi,

I'm still trying to understand how PEP 3152 would impact asyncio. Guido suggests to replace "yield from fut" with "cocall fut()" (add parentheses) and so add a __cocall__() method to asyncio.Future.

Problem: PEP 3152 says "A cofunction (...) does not contain any yield or yield from expressions". Currently, Future.__iter__() is implemented using yield:

    def __iter__(self):
        if not self.done():
            self._blocking = True
            yield self  # This tells Task to wait for completion.
        assert self.done(), "yield from wasn't used with future"
        return self.result()  # May raise too.

From my understanding, PEP 3152 simply does not support asyncio.Future. Am I right?

How is it possible to suspend a cofunction if it's not possible to use yield?

If waiting for a future in a cofunction is not supported, PEP 3152 is useless for asyncio, since asyncio completely relies on futures.

Victor

From dholth at gmail.com Thu Apr 23 15:15:44 2015
From: dholth at gmail.com (Daniel Holth)
Date: Thu, 23 Apr 2015 09:15:44 -0400
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150423125957.631a8711@x230>
References: <20150423125957.631a8711@x230>
Message-ID:

On Thu, Apr 23, 2015 at 5:59 AM, Paul Sokolovsky wrote:
> Hello,
>
> On Thu, 23 Apr 2015 10:43:52 +0200
> Wolfgang Langner wrote:
>
> []
>
>> Also ask why no one used type specifier, they are possible since
>> Python 3.0 ?
>> Because it is the wrong way for Python.
>
> That's an example of how perceptions differ. In my list, everyone(*)
> uses them - MyPy, MicroPython, etc. Even more should use them (any JIT
> module, which are many), but sit in the bushes, waiting for a kick, like
> PEP484 provides.

It's OK that type hints are only to assist the programmer. PyPy's FAQ has an explanation of why type hints are not for performance. http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance

From andrew.svetlov at gmail.com Thu Apr 23 15:44:15 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Thu, 23 Apr 2015 16:44:15 +0300
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230>
Message-ID:

On Thu, Apr 23, 2015 at 3:27 PM, Wolfgang Langner wrote:
>
>
> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky wrote:
>>
>> Hello,
>>
>> On Thu, 23 Apr 2015 12:18:51 +0300
>> Andrew Svetlov wrote:
>>
>> []
>>
>> > > 3.
>> > > async with and async for
>> > > Bead idea, we clutter the language even more and it is one more
>> > > thing every newbie could do wrong.
>> > > for x in y:
>> > >     result = await f()
>> > > is enough, every 'async' framework lived without it over years.
>> >
>> > async for i in iterable:
>> >     pass
>> >
>> > is not equal for
>> >
>> > for fut in iterable:
>> >     i = yield from fut
>>
>> But people who used Twisted all their life don't know that! They just
>> know that "async for" is not needed and bad.
>>
>
> I don't think it is bad nor not needed, but the syntax is not beautiful and
> for the 90% not doing async stuff irritating and one more thing to learn
> and do right/wrong.
>
> I had also a need for async loop. But there are other solutions like
> channels,
> not needing a new syntax.
>

By `channels` do you mean something like `asyncio.Queue`? It requires that producer and consumer be separate tasks. Often it works (with some performance cost), but creating two tasks is not always an obvious way to solve a problem.

> Also possible a function returning futures and yield in the loop with a
> sentinel.

The proposal looks like an attempt to avoid the `for` loop and use `while` everywhere. Just compare the `while` loop:

    it = iter(it)
    while True:
        try:
            i = next(it)
            process(i)
        except StopIteration:
            break

with the `for` alternative:

    for i in it:
        process(i)

>
> All this goes the road down to a producer consumer pattern. Nothing more.
>

I think one of the most convenient producer-consumer pattern implementations in Python is the `for` loop and the iterator concept. It's sometimes too limited, but it works pretty well in 95% of use cases.

>

--
Thanks,
Andrew Svetlov

From pmiscml at gmail.com Thu Apr 23 15:44:10 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Thu, 23 Apr 2015 16:44:10 +0300
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <20150423125957.631a8711@x230>
Message-ID: <20150423164410.6c8641e1@x230>

Hello,

On Thu, 23 Apr 2015 14:48:58 +0200
Wolfgang Langner wrote:

> Hello,
>
> On Thu, Apr 23, 2015 at 11:59 AM, Paul Sokolovsky
> wrote:
>
> > Hello,
> >
> > On Thu, 23 Apr 2015 10:43:52 +0200
> > Wolfgang Langner wrote:
> >
> > []
> >
> > > Also ask why no one used type specifier, they are possible since
> > > Python 3.0 ?
> > > Because it is the wrong way for Python.
> >
> > That's an example of how perceptions differ. In my list, everyone(*)
> > uses them - MyPy, MicroPython, etc. Even more should use them (any
> > JIT module, which are many), but sit in the bushes, waiting for a
> > kick, like PEP484 provides.
> >
> >
> > (*) Everyone of those who needs them. Otherwise, let's throw out
> > metaclasses - noone uses them.
> >
>
> They are there to be used and won't go away.
> But for most Libraries out there no one used it.
>
> JIT (Numba), Cython and other compilers/tools doing optimization have
> their own syntax and needs.

That's exactly what needs to change, and where this PEP helps, as was already pointed out: http://code.activestate.com/lists/python-dev/135659/

> They need even more information like i32, i64, different floats and
> so on.

That's a poor excuse for not trying to be a good member of the Python community (and work on a standard type annotation syntax, instead of digging one's own hole).

> MyPy is really new and in development. And the docstring type
> spec way is still possible.

Anything is possible. The talk is about what makes the most sense. Docstrings were always arbitrary strings intended for human consumption; how is (ab)using them for type annotations better than having clean language-grammar syntax?

> MicroPython uses one annotation for their inline assembler stuff not
> type hints.
Here's how MicroPython uses type annotations: https://github.com/micropython/micropython/blob/master/tests/micropython/viper_ptr8_load.py > For the library there are no type hints for function definitions. > > > Remark: I use Metaclasses, seldom but if needed very useful. :-) > > > -- > bye by Wolfgang -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Thu Apr 23 15:55:13 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 23 Apr 2015 16:55:13 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150423125957.631a8711@x230> Message-ID: <20150423165513.239317a3@x230> Hello, On Thu, 23 Apr 2015 09:15:44 -0400 Daniel Holth wrote: [] > >> Also ask why no one used type specifier, they are possible since > >> Python 3.0 ? > >> Because it is the wrong way for Python. > > > > That's an example of how perceptions differ. In my list, everyone(*) > > uses them - MyPy, MicroPython, etc. Even more should use them (any > > JIT module, which are many), but sit in the bushes, waiting for a > > kick, like PEP484 provides. > > It's OK that type hints are only to assist the programmer. Yes, it's OK to have a situation where type hints assist only a programmer. It's not OK to think that type hints may be useful only for programmer, instead of bunch more purposes, several of which were already shown in the long previous discussion. > PyPy's FAQ > has an explanation of why type hints are not for performance. > http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance You probably intended to write "why type hints are not for *PyPy's* performance". There're many other language implementations and modules for which it may be useful, please don't limit your imagination by a single case. And speaking of PyPy, it really should think how to improve its performance - not of generated programs, but of generation itself. If compilation of a trivial program on a pumpy hardware takes 5 minutes and gigabytes of RAM and diskspace, few people will use it for other purposes beyond curiosity. There's something very un-Pythonic in waiting 5 mins just to run 10-line script. Type hints can help here too ;-) (by not wasting resources propagating types thru the same old standard library for example). -- Best regards, Paul mailto:pmiscml at gmail.com From brett at python.org Thu Apr 23 16:07:45 2015 From: brett at python.org (Brett Cannon) Date: Thu, 23 Apr 2015 14:07:45 +0000 Subject: [Python-Dev] Task Request In-Reply-To: <5538E539.10306@gmail.com> References: <5538E539.10306@gmail.com> Message-ID: A better place to ask this question is the core-mentorship mailing list which was set up specifically to help people contribute to Python. On Thu, Apr 23, 2015 at 9:07 AM Leonid Kokorev wrote: > Hello. > My name is Leonid. I'm very interested in Python programming language. > I'm sending this letter to the python-dev mailing list in order to get > any task, that I could perform. > I want to help my community and improve my skills. > If anybody has not very complicated task, please email me at > ldvc112 at gmail.com. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dholth at gmail.com Thu Apr 23 16:12:04 2015 From: dholth at gmail.com (Daniel Holth) Date: Thu, 23 Apr 2015 10:12:04 -0400 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150423165513.239317a3@x230> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> Message-ID: On Thu, Apr 23, 2015 at 9:55 AM, Paul Sokolovsky wrote: > Hello, > > On Thu, 23 Apr 2015 09:15:44 -0400 > Daniel Holth wrote: > > [] > >> >> Also ask why no one used type specifier, they are possible since >> >> Python 3.0 ? >> >> Because it is the wrong way for Python. >> > >> > That's an example of how perceptions differ. In my list, everyone(*) >> > uses them - MyPy, MicroPython, etc. Even more should use them (any >> > JIT module, which are many), but sit in the bushes, waiting for a >> > kick, like PEP484 provides. >> >> It's OK that type hints are only to assist the programmer. > > Yes, it's OK to have a situation where type hints assist only a > programmer. It's not OK to think that type hints may be useful only for > programmer, instead of bunch more purposes, several of which > were already shown in the long previous discussion. > >> PyPy's FAQ >> has an explanation of why type hints are not for performance. >> http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance > > You probably intended to write "why type hints are not for *PyPy's* > performance". There're many other language implementations and modules > for which it may be useful, please don't limit your imagination by a > single case. > > And speaking of PyPy, it really should think how to improve its > performance - not of generated programs, but of generation itself. If > compilation of a trivial program on a pumpy hardware takes 5 minutes > and gigabytes of RAM and diskspace, few people will use it for other > purposes beyond curiosity. There's something very un-Pythonic in > waiting 5 mins just to run 10-line script. Type hints can help here > too ;-) (by not wasting resources propagating types thru the same old > standard library for example). Naturally, PyPy is very controversial. Type annotations can help to compile Python into a subset of Python. From harry.percival at gmail.com Thu Apr 23 16:25:30 2015 From: harry.percival at gmail.com (Harry Percival) Date: Thu, 23 Apr 2015 15:25:30 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150423164410.6c8641e1@x230> References: <20150423125957.631a8711@x230> <20150423164410.6c8641e1@x230> Message-ID: lol @ the fact that the type hints are breaking github's syntax highlighter :) On 23 April 2015 at 14:44, Paul Sokolovsky wrote: > Hello, > > On Thu, 23 Apr 2015 14:48:58 +0200 > Wolfgang Langner wrote: > > > Hello, > > > > On Thu, Apr 23, 2015 at 11:59 AM, Paul Sokolovsky > > wrote: > > > > > Hello, > > > > > > On Thu, 23 Apr 2015 10:43:52 +0200 > > > Wolfgang Langner wrote: > > > > > > [] > > > > > > > Also ask why no one used type specifier, they are possible since > > > > Python 3.0 ? > > > > Because it is the wrong way for Python. > > > > > > That's an example of how perceptions differ. In my list, everyone(*) > > > uses them - MyPy, MicroPython, etc. Even more should use them (any > > > JIT module, which are many), but sit in the bushes, waiting for a > > > kick, like PEP484 provides. > > > > > > > > > (*) Everyone of those who needs them. Otherwise, let's throw out > > > metaclasses - noone uses them. > > > > > > > > They are there to be used and won't go away. 
> > But for most Libraries out there no one used it. > > > > JIT (Numba), Cython and other compilers/tools doing optimization have > > their own syntax and needs. > > That's exactly what needs to change, and where this PEP helps, as was > already pointed out: > http://code.activestate.com/lists/python-dev/135659/ > > > They need even more information like i32, i64, different floats and > > so on. > > That's poor excuse for not trying to be a good member of Python > community (and work on standard type annotation syntax, instead of > digging own hole). > > > MyPy is really new and in development. And the docstring type > > spec way is still possible. > > Anything is possible. The talk is about what makes most of sense. > Docstrings were always arbitrary string designated at human's > consumption, how (ab)using them for type annotations is better than > having clean language-grammar syntax? > > > MicroPython uses one annotation for their inline assembler stuff not > > type hints. > > Here's how MicroPython uses type annotations: > > https://github.com/micropython/micropython/blob/master/tests/micropython/viper_ptr8_load.py > > > For the library there are no type hints for function definitions. > > > > > > Remark: I use Metaclasses, seldom but if needed very useful. :-) > > > > > > -- > > bye by Wolfgang > > > > -- > Best regards, > Paul mailto:pmiscml at gmail.com > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/hjwp2%40cantab.net > -- ------------------------------ Harry J.W. Percival ------------------------------ Twitter: @hjwp Mobile: +44 (0) 78877 02511 Skype: harry.percival -------------- next part -------------- An HTML attachment was scrubbed... URL: From ldvc112 at gmail.com Thu Apr 23 16:21:08 2015 From: ldvc112 at gmail.com (Leonid Kokorev) Date: Thu, 23 Apr 2015 17:21:08 +0300 Subject: [Python-Dev] Task Request In-Reply-To: References: <5538E539.10306@gmail.com> Message-ID: <5538FFD4.2000707@gmail.com> Thanks for advice. On 04/23/2015 05:07 PM, Brett Cannon wrote: > A better place to ask this question is the core-mentorship mailing > list which was set up specifically to help people contribute to Python. > > On Thu, Apr 23, 2015 at 9:07 AM Leonid Kokorev > wrote: > > Hello. > My name is Leonid. I'm very interested in Python programming language. > I'm sending this letter to the python-dev mailing list in order to get > any task, that I could perform. > I want to help my community and improve my skills. > If anybody has not very complicated task, please email me at > ldvc112 at gmail.com . > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From pmiscml at gmail.com Thu Apr 23 16:52:22 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Thu, 23 Apr 2015 17:52:22 +0300
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To:
References: <20150423125957.631a8711@x230> <20150423164410.6c8641e1@x230>
Message-ID: <20150423175222.70603d3c@x230>

Hello,

On Thu, 23 Apr 2015 15:25:30 +0100
Harry Percival wrote:

> lol @ the fact that the type hints are breaking github's syntax
> highlighter :)

What one can expect from software written in Ruby? ;-)

--
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From andrew.svetlov at gmail.com Thu Apr 23 17:22:27 2015
From: andrew.svetlov at gmail.com (andrew.svetlov at gmail.com)
Date: Thu, 23 Apr 2015 08:22:27 -0700 (PDT)
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To:
References:
Message-ID: <1429802547315.7259b888@Nodemailer>

I can live with `cocall fut()` but the difference between `data = yield from loop.sock_recv(sock, 1024)` and `data = cocall (loop.sock_recv(sock, 1024))()` frustrates me very much.

--
Sent from Mailbox

On Thu, Apr 23, 2015 at 4:09 PM, Victor Stinner wrote:

> (I prefer to start a new thread, the previous one is too long for me :-))
> Hi,
> I'm still trying to understand how the PEP 3152 would impact asyncio.
> Guido suggests to replace "yield from fut" with "cocall fut()" (add
> parenthesis) and so add a __cocall__() method to asyncio.Future.
> Problem: PEP 3152 says "A cofunction (...) does not contain any yield
> or yield from expressions". Currently, Future.__iter__() is
> implemented using yield:
>     def __iter__(self):
>         if not self.done():
>             self._blocking = True
>             yield self  # This tells Task to wait for completion.
>         assert self.done(), "yield from wasn't used with future"
>         return self.result()  # May raise too.
> From my understanding, PEP 3152 simply does not support
> asyncio.Future. Am I right?
> How is it possible to suspend a cofunction if it's not possible to use yield?
> If waiting for a future in a cofunction is not supported, PEP 3152
> is useless for asyncio, since asyncio completly relies on futures.
> Victor
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From lukasz at langa.pl Thu Apr 23 17:30:26 2015
From: lukasz at langa.pl (Łukasz Langa)
Date: Thu, 23 Apr 2015 08:30:26 -0700
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To: <1429802547315.7259b888@Nodemailer>
References: <1429802547315.7259b888@Nodemailer>
Message-ID:

> On Apr 23, 2015, at 8:22 AM, andrew.svetlov at gmail.com wrote:
>
> I can live with `cocall fut()` but the difference between `data = yield from loop.sock_recv(sock, 1024)` and `data = cocall (loop.sock_recv(sock, 1024))()` frustrates me very much.

This is unacceptable. None of the existing async/await implementations in other languages force you to write stuff like this.

--
Best regards,
Łukasz Langa

WWW: http://lukasz.langa.pl/
Twitter: @llanga
IRC: ambv on #python-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
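Since the three spellings are easy to mix up, here they are side by side for the same read. The first two are real Python (pre- and post-PEP 492); the third is PEP 3152's proposed syntax as Andrew reads it, and it was never implemented:

    # generator-based coroutine (asyncio at the time):
    data = yield from loop.sock_recv(sock, 1024)

    # PEP 492:
    data = await loop.sock_recv(sock, 1024)

    # PEP 3152: if sock_recv() stays an ordinary function returning a
    # future, the future itself must be cocalled, hence the extra
    # parentheses being objected to:
    data = cocall (loop.sock_recv(sock, 1024))()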
From yselivanov.ml at gmail.com Thu Apr 23 17:32:33 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 11:32:33 -0400
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230>
Message-ID: <55391091.9000507@gmail.com>

Hi Wolfgang,

On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky wrote:
>
>> Hello,
>>
>> On Thu, 23 Apr 2015 12:18:51 +0300
>> Andrew Svetlov wrote:
>>
>> []
>>
>>>> 3.
>>>> async with and async for
>>>> Bead idea, we clutter the language even more and it is one more
>>>> thing every newbie could do wrong.
>>>> for x in y:
>>>>     result = await f()
>>>> is enough, every 'async' framework lived without it over years.
>>> async for i in iterable:
>>>     pass
>>>
>>> is not equal for
>>>
>>> for fut in iterable:
>>>     i = yield from fut
>> But people who used Twisted all their life don't know that! They just
>> know that "async for" is not needed and bad.
>>
>
> I don't think it is bad nor not needed, but the syntax is not beautiful and
> for the 90% not doing async stuff irritating and one more thing to learn
> and do right/wrong.

There is no way to do things wrong in PEP 492. An object either has __aiter__ or it will be rejected by async for. An object either has __aenter__ or it will be rejected by async with.

    transaction = yield from connection.transaction()
    try:
        ...
    except:
        yield from transaction.rollback()
    else:
        yield from transaction.commit()

is certainly more irritating than

    async with connection.transaction():
        ...

> I had also a need for async loop. But there are other solutions like
> channels,
> not needing a new syntax.
>
> Also possible a function returning futures and yield in the loop with a
> sentinel.
>
> All this goes the road down to a producer consumer pattern. Nothing more.
>
>> I know I'm a bad guy to make such comments, too bad there's a bit of
>> truth in them, or everyone would just call me an a%$&ole right away.
>>
>>
>> Generally, I already provided feedback (on asyncio list) that asyncio
>> is based not on native Python concepts like a coroutine, but on
>> foreign concurrency concepts like callback or Future, and a coroutine
>> is fitted as a second-class citizen on top of that. I understand why
>> that was done - to not leave out all those twisteds from a shiny new
>> world of asyncio, but sometimes one may wonder if having a clear cut
>> would've helped (compat could then have been added as a clearly separate
>> subpackage, implemented in terms of coroutines). Now people coming from
>> non-coroutine frameworks who were promised compatibility see "bad"
>> things in asyncio (and related areas), and people lured by a promise of
>> native Python framework see bad things too.
>>
> This has nothing to do with people using twisted or other async frameworks
> like tornado.
> I think a coroutine should be first class. But all this should be done in a
> way a beginner
> can handle and not design this stuff for experts only.

I think that most async frameworks out there are for experts only. Precisely because of 'yield from', 'yield', inlineCallbacks, '@coroutine', channels and other stuff.

PEP 492 will make it all easier. And Twisted can use its features too.

> If we do this we
> scare away new people.

It doesn't scare away anyone. async/await were the most awaited features in Dart and JavaScript. One of the most popular features in C#.

Yury
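As a reference point for the __aiter__/__anext__ protocol mentioned above, here is a tiny async iterator. This is a sketch with invented names; note that the exact __aiter__ contract was still in flux between the PEP draft and later CPython releases (the form below is the one that survived):

    class Ticker:
        # Async-iterate over the numbers 0 .. n-1.
        def __init__(self, n):
            self.i, self.n = 0, n

        def __aiter__(self):
            return self

        async def __anext__(self):
            if self.i >= self.n:
                raise StopAsyncIteration
            self.i += 1
            return self.i - 1

    async def consume():
        async for value in Ticker(3):
            print(value)    # prints 0, 1, 2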
From victor.stinner at gmail.com Thu Apr 23 17:26:38 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Thu, 23 Apr 2015 17:26:38 +0200
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To: <1429802547315.7259b888@Nodemailer>
References: <1429802547315.7259b888@Nodemailer>
Message-ID:

2015-04-23 17:22 GMT+02:00 :
> I can live with `cocall fut()` but the difference between `data = yield from
> loop.sock_recv(sock, 1024)` and `data = cocall (loop.sock_recv(sock,
> 1024))()` frustrates me very much.

You didn't answer my question. My question is: is it possible to implement Future.__cocall__(), given that yield is forbidden in cofunctions? If it's possible, can you please show how? (Show me the code!)

Victor

From yselivanov.ml at gmail.com Thu Apr 23 17:38:02 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 11:38:02 -0400
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com>
Message-ID: <553911DA.40704@gmail.com>

Hi,

On 2015-04-23 3:30 AM, Wolfgang Langner wrote:
> Hi,
>
> most of the time I am a silent reader but in this discussion I must step in.
> I use twisted and async stuff a lot over years followed development of
> asyncio closely.
>
> First it is good to do differentiate async coroutines from generators. So
> everyone can see it and have this in mind
> and don't mix up booth. It is also easier to explain for new users.
> Sometimes generators blows their mind and it takes
> a while to get used to them. Async stuff is even harder.
>
> 1. I am fine with using something special instead of "yield" or "yield
> from" for this. C# "await" is ok.
>
> Everything else suggested complicates the language and makes it harder to
> read.
>
> 2.
> async def f(): is harder to read and something special also it breaks the
> symmetry in front (def indent).
> Also every existing tooling must be changed to support it. Same for def
> async, def f() async:
> I thing a decorator is enough here
> @coroutine
> def f():
> is the best solution to mark something as a coroutine.

You can't combine a keyword (await) with a runtime decorator. Also it's harder for tools to support @coroutine / @inlineCallbacks than "async".

>
>
> 3.
> async with and async for
> Bead idea, we clutter the language even more and it is one more thing every
> newbie could do wrong.
> for x in y:
>     result = await f()
> is enough, every 'async' framework lived without it over years.

I only lived without it because I used greenlets for async for's & with's. There must be a native language concept to do these things.

> Same for with statement.
>
> The main use case suggested was for database stuff and this is also where
> most are best with
> defer something to a thread and keep it none async.
>
>
> All together it is very late in the development cycle for 3.5 to
> incorporate such a big change.

The PEP isn't a result of some quick brainstorming. It's a result of long experience using asyncio and working around many pain points of async programming.

> Best is to give all this some more time and defer it to 3.6 and some alpha
> releases to experiment with.

There is a reference implementation. asyncio is fully ported; every package for asyncio should work. You can experiment right now and find a real issue with the PEP, if there is one.

Yury
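As background to Victor's question, here is a stripped-down sketch of the yield-based suspension he refers to. MiniFuture and the hand-driven "task step" are illustrative only, far simpler than asyncio's real Future and Task:

    class MiniFuture:
        def __init__(self):
            self.done, self.result = False, None

        def __iter__(self):
            if not self.done:
                yield self          # suspends the generator chain; the
                                    # task sees the future and waits
            return self.result

    def coro(fut):
        value = yield from fut      # resumes with fut.result once set
        print(value)

    # A toy "task" step: run until the coroutine yields a future.
    fut = MiniFuture()
    gen = coro(fut)
    pending = gen.send(None)        # -> the MiniFuture instance
    fut.done, fut.result = True, 42
    try:
        gen.send(None)              # resume; prints 42, then StopIteration
    except StopIteration:
        pass

The whole mechanism hinges on that yield inside __iter__, which is exactly what PEP 3152 rules out for cofunctions.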
From yselivanov.ml at gmail.com Thu Apr 23 17:54:06 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 11:54:06 -0400
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To:
References:
Message-ID: <5539159E.2070806@gmail.com>

Hi Victor,

On 2015-04-23 4:43 AM, Victor Stinner wrote:
[...]
> From my understanding, PEP 3152 simply does not support
> asyncio.Future. Am I right?
>

Greg wants to implement __cocall__ on futures. This way you'll be able to write

    cocall fut()  # instead of "await fut"

So you *will have to* use "()".

Another problem is functions that return a future:

    def do_something():
        ...
        return fut

With Greg's idea, to call it you would do:

    cocall (do_something())()

That means that you can't refactor your "do_something" function and make it a coroutine.

Yury

From tds333+pydev at gmail.com Thu Apr 23 17:57:21 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Thu, 23 Apr 2015 17:57:21 +0200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55391091.9000507@gmail.com>
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com>
Message-ID:

On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov wrote:

> Hi Wolfgang,
>
> On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
>
>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky
>> wrote:
>>
>> Hello,
>>>
>>> On Thu, 23 Apr 2015 12:18:51 +0300
>>> Andrew Svetlov wrote:
>>>
>>> []
>>>
>>> 3.
>>>>> async with and async for
>>>>> Bead idea, we clutter the language even more and it is one more
>>>>> thing every newbie could do wrong.
>>>>> for x in y:
>>>>>     result = await f()
>>>>> is enough, every 'async' framework lived without it over years.
>>>>>
>>>> async for i in iterable:
>>>>     pass
>>>>
>>>> is not equal for
>>>>
>>>> for fut in iterable:
>>>>     i = yield from fut
>>>>
>>> But people who used Twisted all their life don't know that! They just
>>> know that "async for" is not needed and bad.
>>>
>>>
>>> I don't think it is bad nor not needed, but the syntax is not beautiful
>> and
>> for the 90% not doing async stuff irritating and one more thing to learn
>> and do right/wrong.
>>
> There is no way to do things wrong in PEP 492. An object
> either has __aiter__ or it will be rejected by async for.
> An object either has __aenter__ or it will be rejected by
> async with.
>
> transaction = yield from connection.transaction()
> try:
>     ...
> except:
>     yield from transaction.rollback()
> else:
>     yield from transaction.commit()
>
> is certainly more irritating than
>
> async with connection.transcation():
>     ...
>
>
> Sorry till now I use async stuff and database access and do it in an extra
> thread in sync mode. No performance problems and can use all good
> maintained database libraries. Also twisteds RDBMS (adbapi) is enough
> here. First I thought it is not enough or to slow but this was not the
> case.
> https://twistedmatrix.com/documents/current/core/howto/rdbms.html
>
> Here I am on line with Mike Bayer:
> http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/
>
>> I know I'm a bad guy to make such comments, too bad there's a bit of
>>> truth in them, or everyone would just call me an a%$&ole right away.
>>>
>>>
>>> Generally, I already provided feedback (on asyncio list) that asyncio
>>> is based not on native Python concepts like a coroutine, but on
>>> foreign concurrency concepts like callback or Future, and a coroutine
>>> is fitted as a second-class citizen on top of that.
>>> I understand why
>>> that was done - to not leave out all those twisteds from a shiny new
>>> world of asyncio, but sometimes one may wonder if having a clear cut
>>> would've helped (compat could then have been added as a clearly separate
>>> subpackage, implemented in terms of coroutines). Now people coming from
>>> non-coroutine frameworks who were promised compatibility see "bad"
>>> things in asyncio (and related areas), and people lured by a promise of
>>> native Python framework see bad things too.
>>>
>>>
>> This has nothing to do with people using twisted or other async
>> frameworks
>> like tornado.
>> I think a coroutine should be first class. But all this should be done in
>> a
>> way a beginner
>> can handle and not design this stuff for experts only.
>>
>
> I think that most of async frameworks out there are for
> experts only. Precisely because of 'yield from', 'yield',
> inlineCallbacks, '@coroutine', channels and other stuff.
>
> PEP 492 will make it all easier. And Twisted can use
> its features too.
>

Yes, and it is good to make it easier. But not to complicate it for others. Beginners will be confronted with all this new syntax and may feel lost. Oh, I have two different for loops, one with async. Same for the with statement.

>
>> If we do this we
>> scare away new people.
>>
> It doesn't scare away anyone. async/await were the most
> awaited features in dart and javascript. One of the most
> popular features in c#.
>

I like it in C#. I like await for Python, but I don't like async there and how to specify it. I still think a decorator is enough, and no special for and with syntax.

async in JavaScript is for executing a whole script asynchronously, used in the script tag. Dart is for the Google universe, with little usage outside.

--
bye by Wolfgang

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From yselivanov.ml at gmail.com Thu Apr 23 18:08:43 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 12:08:43 -0400
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com>
Message-ID: <5539190B.5090508@gmail.com>

Hi Wolfgang,

On 2015-04-23 11:57 AM, Wolfgang Langner wrote:
> On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov
> wrote:
>
>> Hi Wolfgang,
>>
>> On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
>>
>>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky
>>> wrote:
>>>
>>> Hello,
>>>>
>>>> On Thu, 23 Apr 2015 12:18:51 +0300
>>>> Andrew Svetlov wrote:
>>>>
>>>> []
>>>>
>>>> 3.
>>>>>> async with and async for
>>>>>> Bead idea, we clutter the language even more and it is one more
>>>>>> thing every newbie could do wrong.
>>>>>> for x in y:
>>>>>>     result = await f()
>>>>>> is enough, every 'async' framework lived without it over years.
>>>>>>
>>>>> async for i in iterable:
>>>>>     pass
>>>>>
>>>>> is not equal for
>>>>>
>>>>> for fut in iterable:
>>>>>     i = yield from fut
>>>>>
>>>> But people who used Twisted all their life don't know that! They just
>>>> know that "async for" is not needed and bad.
>>>>
>>>>
>>>> I don't think it is bad nor not needed, but the syntax is not beautiful
>>> and
>>> for the 90% not doing async stuff irritating and one more thing to learn
>>> and do right/wrong.
>>>
>> There is no way to do things wrong in PEP 492. An object
>> either has __aiter__ or it will be rejected by async for.
>> An object either has __aenter__ or it will be rejected by
>> async with.
>> >> transaction = yield from connection.transaction() >> try: >> ... >> except: >> yield from transaction.rollback() >> else: >> yield from transaction.commit() >> >> is certainly more irritating than >> >> async with connection.transcation(): >> ... >> >> >>> I had also a need for async loop. But there are other solutions like >>> channels, >>> not needing a new syntax. >>> >>> Also possible a function returning futures and yield in the loop with a >>> sentinel. >>> >>> All this goes the road down to a producer consumer pattern. Nothing more. >>> >>> >>> >>> I know I'm a bad guy to make such comments, too bad there's a bit of >>>> truth in them, or everyone would just call me an a%$&ole right away. >>>> >>>> >>>> Generally, I already provided feedback (on asyncio list) that asyncio >>>> is based not on native Python concepts like a coroutine, but on >>>> foreign concurrency concepts like callback or Future, and a coroutine >>>> is fitted as a second-class citizen on top of that. I understand why >>>> that was done - to not leave out all those twisteds from a shiny new >>>> world of asyncio, but sometimes one may wonder if having a clear cut >>>> would've helped (compat could then have been added as a clearly separate >>>> subpackage, implemented in terms of coroutines). Now people coming from >>>> non-coroutine frameworks who were promised compatibility see "bad" >>>> things in asyncio (and related areas), and people lured by a promise of >>>> native Python framework see bad things too. >>>> >>>> >>>> This has nothing to do with people using twisted or other async >>> frameworks >>> like tornado. >>> I think a coroutine should be first class. But all this should be done in >>> a >>> way a beginner >>> can handle and not design this stuff for experts only. >>> >> I think that most of async frameworks out there are for >> experts only. Precisely because of 'yield from', 'yield', >> inlineCallbacks, '@coroutine', channels and other stuff. >> >> PEP 492 will make it all easier. And Twisted can use >> its features too. >> > Yes and it is good to make it easier. But not complicate it for others. > Beginners will be confronted with all this new syntax an my feel lost. > Oh I have to different for loops, one with async. Same for with statement. Absolute beginners don't write async code and http servers. And when they start doing things like that, they're not someone who can't understand 'async' and 'await'. It's not harder than '@coroutine' and 'yield from'. What is hard for even experienced users, is to understand what's the difference between 'yield from' and 'yield', and how asyncio works in the core. Why you should always use the former etc. Unfortunately I just can't agree with you here. Too much time was spent trying to explain people how to write asyncio code, and it's hard. PEP 492 will make it easier. > > >> If we do this we >>> scare away new people. >>> >> It doesn't scare away anyone. async/await were the most >> awaited features in dart and javascript. One of the most >> popular features in c#. >> > I like it in C#. > I like await for Python but I don't like async there and how to specify it. > I still think a decorator is enough and no special for and with syntax. Please read the "Debugging Features" section in the PEP. It explains why decorator is not enough. Also the "Rationale" section also stresses some points. Decorator solution is "enough", but it's far from being ideal. For IDEs and linters it's harder to support. What if you write "from asyncio import coroutine as coro"? 
You have to analyze the code statically to reason about what "@coroutine" is. Moreover, you need to enable good IDE support for other frameworks too.

>
> async in JavaScript is for execution a whole script asynchronously used in
> the script tag.
> dart is for the google universe with less usage outside.
>

Please read the proposal to add async/await in JS (referenced in the PEP). It's very likely to be accepted, because JS is asynchronous to its very core, and it's a pain to program in it with callbacks. yields in JS will mitigate the problem, but not completely, so it's only a matter of time. In the PEP there is a good list of other languages with async/await.

Thank you,
Yury

From tds333+pydev at gmail.com Thu Apr 23 18:12:37 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Thu, 23 Apr 2015 18:12:37 +0200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55391091.9000507@gmail.com>
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com>
Message-ID:

On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov wrote:

> Hi Wolfgang,
>
> On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
>
>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky
>> wrote:
>>
>> Hello,
>>>
>>> On Thu, 23 Apr 2015 12:18:51 +0300
>>> Andrew Svetlov wrote:
>>>
>>> []
>>>
>>> 3.
>>>>> async with and async for
>>>>> Bead idea, we clutter the language even more and it is one more
>>>>> thing every newbie could do wrong.
>>>>> for x in y:
>>>>>     result = await f()
>>>>> is enough, every 'async' framework lived without it over years.
>>>>>
>>>> async for i in iterable:
>>>>     pass
>>>>
>>>> is not equal for
>>>>
>>>> for fut in iterable:
>>>>     i = yield from fut
>>>>
>>> But people who used Twisted all their life don't know that! They just
>>> know that "async for" is not needed and bad.
>>>
>>>
>>> I don't think it is bad nor not needed, but the syntax is not beautiful
>> and
>> for the 90% not doing async stuff irritating and one more thing to learn
>> and do right/wrong.
>>
> There is no way to do things wrong in PEP 492. An object
> either has __aiter__ or it will be rejected by async for.
> An object either has __aenter__ or it will be rejected by
> async with.
>

I don't mean it can be done wrong at the execution or syntax level. I mean for a beginner it is not as easy any more, and some will try async in some places; yes, they will get an error. But there is a new possibility to get such errors if async is there for with and for statements. And the next beginner will then implement __aiter__ instead of __iter__ because he/she doesn't get it.

On the other side, I like "await" and the __await__ implementation. Symmetric, good, easy to explain, same as with "int" and "__int__" and all the others.

> transaction = yield from connection.transaction()
> try:
>     ...
> except:
>     yield from transaction.rollback()
> else:
>     yield from transaction.commit()
>
> is certainly more irritating than
>
> async with connection.transcation():
>     ...
>

Sorry, till now I have used async stuff with database access done in an extra thread in sync mode. No performance problems, and I can use all the well-maintained database libraries. Also Twisted's RDBMS support (adbapi) is enough here. First I thought it would not be enough or too slow, but this was not the case.
https://twistedmatrix.com/documents/current/core/howto/rdbms.html

Here I am in line with Mike Bayer:
http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From yselivanov.ml at gmail.com Thu Apr 23 18:13:14 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 12:13:14 -0400
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To:
References: <1429802547315.7259b888@Nodemailer>
Message-ID: <55391A1A.9070809@gmail.com>

On 2015-04-23 11:26 AM, Victor Stinner wrote:
> 2015-04-23 17:22 GMT+02:00 :
>> I can live with `cocall fut()` but the difference between `data = yield from
>> loop.sock_recv(sock, 1024)` and `data = cocall (loop.sock_recv(sock,
>> 1024))()` frustrates me very much.
> You didn't answer my question. My question is: is it possible to
> implement Future.__cocall__(), given that yield is forbidden in cofunctions?
> If it's possible, can you please show how? (Show me the code!)

I can do that.

    class Future:
        ...

        def __iter__(self):
            ...

        __cocall__ = __iter__

I've outlined the problem with this approach in a parallel email: https://mail.python.org/pipermail/python-dev/2015-April/139456.html

Yury

From yselivanov.ml at gmail.com Thu Apr 23 18:22:40 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 12:22:40 -0400
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To:
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com>
Message-ID: <55391C50.9030003@gmail.com>

Wolfgang,

On 2015-04-23 12:12 PM, Wolfgang Langner wrote:
> On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov
> wrote:
>
>> Hi Wolfgang,
>>
>> On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
>>
>>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky
>>> wrote:
>>>
>>> Hello,
>>>>
>>>> On Thu, 23 Apr 2015 12:18:51 +0300
>>>> Andrew Svetlov wrote:
>>>>
>>>> []
>>>>
>>>> 3.
>>>>>> async with and async for
>>>>>> Bead idea, we clutter the language even more and it is one more
>>>>>> thing every newbie could do wrong.
>>>>>> for x in y:
>>>>>>     result = await f()
>>>>>> is enough, every 'async' framework lived without it over years.
>>>>>>
>>>>> async for i in iterable:
>>>>>     pass
>>>>>
>>>>> is not equal for
>>>>>
>>>>> for fut in iterable:
>>>>>     i = yield from fut
>>>>>
>>>> But people who used Twisted all their life don't know that! They just
>>>> know that "async for" is not needed and bad.
>>>>
>>>>
>>>> I don't think it is bad nor not needed, but the syntax is not beautiful
>>> and
>>> for the 90% not doing async stuff irritating and one more thing to learn
>>> and do right/wrong.
>>>
>> There is no way to do things wrong in PEP 492. An object
>> either has __aiter__ or it will be rejected by async for.
>> An object either has __aenter__ or it will be rejected by
>> async with.
>>
>> transaction = yield from connection.transaction()
>> try:
>>     ...
>> except:
>>     yield from transaction.rollback()
>> else:
>>     yield from transaction.commit()
>>
>> is certainly more irritating than
>>
>> async with connection.transcation():
>>     ...
>>
>
> Sorry till now I use async stuff and database access and do it in an extra
> thread in sync mode.
> No performance problems and can use all good maintained database libraries.
> Also twisteds RDBMS (adbapi) is enough here. First I thought it is not
> enough or to slow but this was not the case.
> https://twistedmatrix.com/documents/current/core/howto/rdbms.html
>
> Here I am on line with Mike Bayer:
> http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/
>

It's a good article; I read it. The PEP will help those who don't want to use threads and want to use coroutines. There are no fundamental reasons why coroutines are slower than threads; it will only improve. There is a fundamental reason to avoid doing threads in Python though, because it can harm performance drastically.

There are some fundamental reasons why Mike will always love threads though -- we won't ever have asynchronous __getattr__, so SQLAlchemy won't be as elegant as it is in sync mode.

Thanks!
Yury

From yselivanov.ml at gmail.com Thu Apr 23 18:54:27 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 23 Apr 2015 12:54:27 -0400
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To:
References: <1429802547315.7259b888@Nodemailer>
Message-ID: <553923C3.5070608@gmail.com>

I've updated the PEP 492: https://hg.python.org/peps/rev/352d4f907266

Thanks,
Yury

From tds333+pydev at gmail.com Thu Apr 23 18:58:15 2015
From: tds333+pydev at gmail.com (Wolfgang Langner)
Date: Thu, 23 Apr 2015 18:58:15 +0200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55391C50.9030003@gmail.com>
References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com> <55391C50.9030003@gmail.com>
Message-ID:

On Thu, Apr 23, 2015 at 6:22 PM, Yury Selivanov wrote:

> Wolfgang,
>
> On 2015-04-23 12:12 PM, Wolfgang Langner wrote:
>
>> On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov
>> wrote:
>>
>> Hi Wolfgang,
>>>
>>> On 2015-04-23 8:27 AM, Wolfgang Langner wrote:
>>>
>>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky
>>>> wrote:
>>>>
>>>> Hello,
>>>>
>>>>> On Thu, 23 Apr 2015 12:18:51 +0300
>>>>> Andrew Svetlov wrote:
>>>>>
>>>>> []
>>>>>
>>>>> 3.
>>>>>
>>>>>> async with and async for
>>>>>>> Bead idea, we clutter the language even more and it is one more
>>>>>>> thing every newbie could do wrong.
>>>>>>> for x in y:
>>>>>>>     result = await f()
>>>>>>> is enough, every 'async' framework lived without it over years.
>>>>>>>
>>>>>>> async for i in iterable:
>>>>>>     pass
>>>>>>
>>>>>> is not equal for
>>>>>>
>>>>>> for fut in iterable:
>>>>>>     i = yield from fut
>>>>>>
>>>>>> But people who used Twisted all their life don't know that! They just
>>>>> know that "async for" is not needed and bad.
>>>>>
>>>>>
>>>>> I don't think it is bad nor not needed, but the syntax is not
>>>>> beautiful
>>>>>
>>>> and
>>>> for the 90% not doing async stuff irritating and one more thing to learn
>>>> and do right/wrong.
>>>>
>>>> There is no way to do things wrong in PEP 492. An object
>>> either has __aiter__ or it will be rejected by async for.
>>> An object either has __aenter__ or it will be rejected by
>>> async with.
>>>
>>> Don't mean it can be done wrong on execution or syntax level.
>> I mean for a beginner it is not as easy an more and some will try
>> async in some places, yes they will get an error. But there is a
>> new possibility to get such errors if async is there for with and for
>> statements.
>> And the next beginner will then implement __aiter__ instead of __iter__
>> because
>> he/she don't get it.
>>
> Sorry, Wolfgang, but I don't get your argument. Beginners
> shouldn't just randomly try to use statements. There is a
> documentation for that. Plus we can make exception messages
> better.
>

I had to coach a lot of new users to Python, and some to async stuff in Twisted. And what beginners should not do doesn't matter to them. ;-) They will do strange stuff and go other ways than you expect. I only want to make it as easy as possible for them. Nothing more. Fewer keywords, fewer ways to do something; at best only one way to do it.

Don't get me wrong, I like the PEP; it is well written and covers a lot of areas about async programming. I know Python must improve in this area and has a lot of potential. But don't hesitate, give people time to try it and mature it. If all this should be in 3.5, it is too early.

Also we can avoid the async keyword completely and do the same as for generators. If there is an await, it is a coroutine. IDEs can detect generators and must only be extended to detect coroutines.

    Function with yield -> return a generator
    Function with await -> return a coroutine

Move async for and async with to another PEP and handle it later or with a different syntax using the new await keyword. Only one new keyword, a good improvement for async programming.

Sorry, I still don't like the word async in a language, sprinkled everywhere. And I still have the feeling that if we provide an async for loop, next we must provide async generator expressions or list comprehensions or ... it never ends ;-)

The only downside of await converting a function to a coroutine is that it is not explicitly marked. But if we care about this, what about generators and yield?

--
bye by Wolfgang

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org Thu Apr 23 18:58:33 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 23 Apr 2015 09:58:33 -0700
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To: <553923C3.5070608@gmail.com>
References: <1429802547315.7259b888@Nodemailer> <553923C3.5070608@gmail.com>
Message-ID:

I think this is the nail in PEP 3152's coffin.

On Thu, Apr 23, 2015 at 9:54 AM, Yury Selivanov wrote:

> I've updated the PEP 492: https://hg.python.org/peps/rev/352d4f907266
>
> Thanks,
> Yury
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From solipsis at pitrou.net Thu Apr 23 19:07:46 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 23 Apr 2015 19:07:46 +0200
Subject: [Python-Dev] PEP 3152 and yield from Future()
References: <1429802547315.7259b888@Nodemailer> <553923C3.5070608@gmail.com>
Message-ID: <20150423190746.5463bf4e@fsol>

On Thu, 23 Apr 2015 09:58:33 -0700
Guido van Rossum wrote:
> I think this is the nail in PEP 3152's coffin.

If you only put one nail, it might manage to get out.

Regards

Antoine.
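Wolfgang's "a function with await is a coroutine" rule leans on the generator precedent, and that precedent also shows the refactoring hazard it carries, which comes up again later in the thread. A small, runnable illustration:

    # 'yield' alone flips the nature of a function:
    def numbers():
        yield 1
        yield 2

    print(list(numbers()))   # [1, 2]

    # Remove the yields during a refactor and the same name now returns
    # a list instead of a generator -- callers relying on generator
    # behaviour (send, throw, laziness) break silently.
    def numbers():
        return [1, 2]

    print(list(numbers()))   # still [1, 2] here, but numbers() is no
                             # longer a generator function

An implicit await-based rule would inherit exactly this failure mode, which is one argument the PEP makes for the explicit async def marker.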
From rymg19 at gmail.com Thu Apr 23 19:12:55 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 23 Apr 2015 12:12:55 -0500 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <20150423190746.5463bf4e@fsol> References: <1429802547315.7259b888@Nodemailer> <553923C3.5070608@gmail.com> <20150423190746.5463bf4e@fsol> Message-ID: Then blow it up like Duck Dynasty does. On April 23, 2015 12:07:46 PM CDT, Antoine Pitrou wrote: >On Thu, 23 Apr 2015 09:58:33 -0700 >Guido van Rossum wrote: >> I think this is the nail in PEP 3152's coffin. > >If you only put one nail, it might manage to get out. > >Regards > >Antoine. > > >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Android device with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Thu Apr 23 19:30:47 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 13:30:47 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <20150423133554.1fe32575@x230> <55391091.9000507@gmail.com> <55391C50.9030003@gmail.com> Message-ID: <55392C47.80006@gmail.com> Wolfgang, On 2015-04-23 12:58 PM, Wolfgang Langner wrote: > On Thu, Apr 23, 2015 at 6:22 PM, Yury Selivanov > wrote: > >> Wolfgang, >> >> >> On 2015-04-23 12:12 PM, Wolfgang Langner wrote: >> >>> On Thu, Apr 23, 2015 at 5:32 PM, Yury Selivanov >>> wrote: >>> >>> Hi Wolfgang, >>>> On 2015-04-23 8:27 AM, Wolfgang Langner wrote: >>>> >>>> On Thu, Apr 23, 2015 at 12:35 PM, Paul Sokolovsky >>>>> wrote: >>>>> >>>>> Hello, >>>>> >>>>>> On Thu, 23 Apr 2015 12:18:51 +0300 >>>>>> Andrew Svetlov wrote: >>>>>> >>>>>> [] >>>>>> >>>>>> 3. >>>>>> >>>>>>> async with and async for >>>>>>>> Bead idea, we clutter the language even more and it is one more >>>>>>>> thing every newbie could do wrong. >>>>>>>> for x in y: >>>>>>>> result = await f() >>>>>>>> is enough, every 'async' framework lived without it over years. >>>>>>>> >>>>>>>> async for i in iterable: >>>>>>> pass >>>>>>> >>>>>>> is not equal for >>>>>>> >>>>>>> for fut in iterable: >>>>>>> i = yield from fut >>>>>>> >>>>>>> But people who used Twisted all their life don't know that! They just >>>>>> know that "async for" is not needed and bad. >>>>>> >>>>>> >>>>>> I don't think it is bad nor not needed, but the syntax is not >>>>>> beautiful >>>>>> >>>>> and >>>>> for the 90% not doing async stuff irritating and one more thing to learn >>>>> and do right/wrong. >>>>> >>>>> There is no way to do things wrong in PEP 492. An object >>>> either has __aiter__ or it will be rejected by async for. >>>> An object either has __aenter__ or it will be rejected by >>>> async with. >>>> >>>> Don't mean it can be done wrong on execution or syntax level. >>> I mean for a beginner it is not as easy an more and some will try >>> async in some places, yes they will get an error. But there is a >>> new possibility to get such errors if async is there for with and for >>> statements. >>> And the next beginner will then implement __aiter__ instead of __iter__ >>> because >>> he/she don't get it. >>> >> Sorry, Wolfgang, but I don't get your argument. Beginners >> shouldn't just randomly try to use statements. There is a >> documentation for that. Plus we can make exception messages >> better. 
>> > Had to coach a lot of new users to Python and some to async stuff in
>> twisted.
>> > And what beginners not should do don't care them. ;-)
>> > They will do strange stuff and go other way's than you expected. I only
>> like to make
>> > it as easy as possible for them. Nothing more.
>> > Less keywords, less ways to do something, best only one way to do it.
>> >
>> > Don't get me wrong, I like the PEP it is well written and covers a lot of
>> > areas about async programming.
>> > I know Python must improve in this area and has a lot of potential.
>> > But don't hesitate, give the people time to try it and mature it. If all
>> > this should be in 3.5 it is to early.

The thing about this PEP is that it's built on existing concepts that were validated with asyncio. As for whether there is enough time to review it or not -- that's up to the BDFL to decide. I'm doing my best trying to get the reference implementation reviewed and to address all questions in the PEP, hoping that it will help.

I can only say that if it doesn't land in 3.5, we'll have to wait another *1.5 years*. And it's not that people will download CPython 3.6.alpha0 and start rewriting their code and playing with it. In the meantime, people want more good reasons to migrate to Python 3, and I strongly believe that this PEP is a great reason.

Moreover, it's several months before 3.5 is released. We will still be able to slightly alter the behaviour and gather feedback during the beta periods.

>
> Also we can avoid the async keyword completely and do the same as for
> generators.
> If there is an await, it is a coroutine.

Unfortunately there is a problem with this approach: refactoring of code becomes much harder. That's one of the corner cases that the @coroutine decorator solves; see https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword

Thanks,
Yury

From barry at python.org Thu Apr 23 19:51:52 2015
From: barry at python.org (Barry Warsaw)
Date: Thu, 23 Apr 2015 13:51:52 -0400
Subject: [Python-Dev] async/await in Python; v2
References: <55368858.4010007@gmail.com>
Message-ID: <20150423135152.36195a1e@limelight.wooz.org>

On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote:

>The updated version of the PEP should be available shortly at
>https://www.python.org/dev/peps/pep-0492 and is also pasted in this email.

There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on the proposed new syntax.

Why "async def" and not "def async"?

My concern is about existing tools that already know that "def" as the first non-whitespace on the line starts a function/method definition. Think of a regexp in an IDE that searches backwards from the current line to find the function it's defined in. Sure, tools can be updated, but is it *necessary* to choose a syntax that breaks tools?

    def async useful():

seems okay to me.

Probably the biggest impact on the PEP would be symmetry with asynchronous with and for. What about:

    with async lock:

and

    for async data in cursor:

That would also preserve at least some behavior of existing tools.

Anyway, since the PEP doesn't explicitly describe this as an alternative, I want to bring it up.

(I have mild concerns about __a*__ magic methods, since I think they'll be harder to visually separate, but here the PEP does describe the __async_*__ alternatives.)

Cheers,
-Barry
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From yselivanov.ml at gmail.com Thu Apr 23 20:02:32 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 14:02:32 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423135152.36195a1e@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> Message-ID: <553933B8.8070306@gmail.com> Hi Barry, On 2015-04-23 1:51 PM, Barry Warsaw wrote: > On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote: > >> The updated version of the PEP should be available shortly at >> https://www.python.org/dev/peps/pep-0492 and is also pasted in this email. > There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on > the proposed new syntax. Thanks! > > Why "async def" and not "def async"? To my eye 'async def name()', 'async with', 'async for' look better than 'def async name()', 'with async' and 'for async'. But that's highly subjective. I also read "for async item in iter:" as "I'm iterating iter with async item". > > My concern is about existing tools that already know that "def" as the first > non-whitespace on the line starts a function/method definition. Think of a > regexp in an IDE that searches backwards from the current line to find the > function its defined on. Sure, tools can be updated but it is it *necessary* > to choose a syntax that breaks tools? > > def async useful(): > > seems okay to me. > > Probably the biggest impact on the PEP would be symmetry with asynchronous > with and for. What about: > > with async lock: > > and > > for async data in cursor: > > That would also preserve at least some behavior of existing tools. > > Anyway, since the PEP doesn't explicitly describe this as an alternative, I > want to bring it up. > > (I have mild concerns about __a*__ magic methods, since I think they'll be > harder to visually separate, but here the PEP does describe the __async_*__ > alternatives.) > > Anyways, I'm open to change the order of keywords if most people like it that way. Same for __async_*__ naming. Thanks! Yury From barry at python.org Thu Apr 23 20:12:19 2015 From: barry at python.org (Barry Warsaw) Date: Thu, 23 Apr 2015 14:12:19 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <553933B8.8070306@gmail.com> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <553933B8.8070306@gmail.com> Message-ID: <20150423141219.298be99b@anarchist.wooz.org> On Apr 23, 2015, at 02:02 PM, Yury Selivanov wrote: >To my eye 'async def name()', 'async with', 'async for' look >better than 'def async name()', 'with async' and 'for async'. >But that's highly subjective. Would you be willing to add this as an alternative to the PEP, under the "Why async def" section probably? As with all such bikesheds, Guido will pick the color and we'll ooh and ahh. 
:) Cheers, -Barry From python at mrabarnett.plus.com Thu Apr 23 20:21:08 2015 From: python at mrabarnett.plus.com (MRAB) Date: Thu, 23 Apr 2015 19:21:08 +0100 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423135152.36195a1e@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> Message-ID: <55393814.50601@mrabarnett.plus.com> On 2015-04-23 18:51, Barry Warsaw wrote: > On Apr 21, 2015, at 01:26 PM, Yury Selivanov wrote: > >>The updated version of the PEP should be available shortly at >>https://www.python.org/dev/peps/pep-0492 and is also pasted in this email. > > There's a lot to like about PEP 492. I only want to mildly bikeshed a bit on > the proposed new syntax. > > Why "async def" and not "def async"? > > My concern is about existing tools that already know that "def" as the first > non-whitespace on the line starts a function/method definition. Think of a > regexp in an IDE that searches backwards from the current line to find the > function its defined on. Sure, tools can be updated but it is it *necessary* > to choose a syntax that breaks tools? > > def async useful(): > > seems okay to me. > > Probably the biggest impact on the PEP would be symmetry with asynchronous > with and for. What about: > > with async lock: > > and > > for async data in cursor: > > That would also preserve at least some behavior of existing tools. > > Anyway, since the PEP doesn't explicitly describe this as an alternative, I > want to bring it up. > > (I have mild concerns about __a*__ magic methods, since I think they'll be > harder to visually separate, but here the PEP does describe the __async_*__ > alternatives.) > On the other hand, existing tools might be expecting "def" and "for" to be followed by a name. From yselivanov.ml at gmail.com Thu Apr 23 20:25:31 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 14:25:31 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423141219.298be99b@anarchist.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <553933B8.8070306@gmail.com> <20150423141219.298be99b@anarchist.wooz.org> Message-ID: <5539391B.307@gmail.com> Barry, On 2015-04-23 2:12 PM, Barry Warsaw wrote: > On Apr 23, 2015, at 02:02 PM, Yury Selivanov wrote: > >> To my eye 'async def name()', 'async with', 'async for' look >> better than 'def async name()', 'with async' and 'for async'. >> But that's highly subjective. > Would you be willing to add this as an alternative to the PEP, under the "Why > async def" section probably? > > As with all such bikesheds, Guido will pick the color and we'll ooh and > ahh. :) Sure! I think it's a great idea: https://hg.python.org/peps/rev/8cb4c0ab0931 https://hg.python.org/peps/rev/ec319bf4c86e Thanks! Yury From yselivanov.ml at gmail.com Thu Apr 23 20:39:48 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 14:39:48 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <55389944.9080004@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <553803C2.6060902@gmail.com> <55389944.9080004@canterbury.ac.nz> Message-ID: <55393C74.9030807@gmail.com> On 2015-04-23 3:03 AM, Greg Ewing wrote: > Yury Selivanov wrote: >> - If it's an object with __await__, return iter(object.__await__()) > > Is the iter() really needed? Couldn't the contract of > __await__ be that it always returns an iterator? > I wrote it the wrong way. iter() isn't needed, you're right. 
This is a quote from the ref implementation: if (!PyIter_Check(await_obj)) { PyErr_Format(PyExc_TypeError, "__await__ must return an iterator, %.100s", Py_TYPE(await_obj)->tp_name); Yury From lukasz at langa.pl Thu Apr 23 20:55:44 2015 From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=) Date: Thu, 23 Apr 2015 11:55:44 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423135152.36195a1e@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> Message-ID: <2DF635B1-62C6-4A00-9262-8A36709A3199@langa.pl> > On Apr 23, 2015, at 10:51 AM, Barry Warsaw wrote: > > (I have mild concerns about __a*__ magic methods, since I think they'll be > harder to visually separate, but here the PEP does describe the __async_*__ > alternatives.) Has it been historically a problem with __iadd__ vs __radd__ vs __add__, __ior__ vs __ror__ vs __or__, etc.? -- Best regards, ?ukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev From victor.stinner at gmail.com Thu Apr 23 22:29:26 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 23 Apr 2015 22:29:26 +0200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <5539159E.2070806@gmail.com> References: <5539159E.2070806@gmail.com> Message-ID: Hi, 2015-04-23 17:54 GMT+02:00 Yury Selivanov : > Greg wants to implement __cocall__ on futures. This way > you'll be able to write > > cocall fut() # instead of "await fut" Oh, I missed something in the PEP 3152: a obj__cocall__() method can be an iterator/generator, it can be something different than a cofunction. So a __cocall__() *can* use yield and yield from. But to use cocall, it must be a cofunction. It's not easy to understand the whole puzzle. IMO the PEP 492 better explains how pieces are put together ;-) Victor From greg.ewing at canterbury.ac.nz Fri Apr 24 02:02:06 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 12:02:06 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: References: Message-ID: <553987FE.6080500@canterbury.ac.nz> Victor Stinner wrote: > I'm still trying to understand how the PEP 3152 would impact asyncio. > Guido suggests to replace "yield from fut" with "cocall fut()" (add > parenthesis) and so add a __cocall__() method to asyncio.Future. > Problem: PEP 3152 says "A cofunction (...) does not contain any yield > or yield from expressions". A __cocall__ method doesn't have to be implemented with a cofunction. Any method that returns an iterator will do, including a generator. So a Future.__cocall__ that just invokes Future.__iter__ should work fine. > How is it possible to suspend a cofunction if it's not possible to use yield? The currently published version of PEP 3152 is not really complete. A few things would need to be added to it, one of them being a suspend() builtin that has the same effect as yield in a generator-based coroutine. -- Greg From chris.barker at noaa.gov Fri Apr 24 02:03:02 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 23 Apr 2015 17:03:02 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: Message-ID: On Wed, Apr 22, 2015 at 5:45 PM, Guido van Rossum wrote: > Given that even if Difference existed, and even if we had a predefined > type alias for Difference[Iterable[str], str], you' still have to remember > to mark up all those functions with that annotation. 
It almost sounds > simpler to just predefine this function: > > def make_string_list(a: Union[str, Iterable[str]]) -> Iterable[str]: > if isinstance(a, str): > return [a] > else: > return a > fair enough -- and I do indeed have that code in various places already. Somehow, I've always been uncomfortable with checking specifically for the str type -- guess I want everything to be fully duck-typable. But then I wouldn't be doing type hints, either, would I? -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Fri Apr 24 02:14:49 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 12:14:49 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> Message-ID: <55398AF9.40608@canterbury.ac.nz> Andrew Svetlov wrote: > But we already have asyncio and code based on asyncio coroutines. > To make it work I should always use costart() in places where asyncio > requires coroutine. As I understand it, asyncio would require changes to make it work seamlessly with PEP 492 as well, since an object needs to have either a special flag or an __await__ method before it can have 'await' applied to it. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 02:33:26 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 12:33:26 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <1429802547315.7259b888@Nodemailer> References: <1429802547315.7259b888@Nodemailer> Message-ID: <55398F56.50202@canterbury.ac.nz> andrew.svetlov at gmail.com wrote: > I can live with `cocall fut()` but the difference between `data = yield > from loop.sock_recv(sock, 1024)` and `data = cocall > (loop.sock_recv(sock, 1024))()` frustrates me very much. That's not the way it would be done. In a PEP-3152-ified version of asyncio, sock_recv would be a cofunction, so that would be just data = cocall loop.sock_recv(sock, 1024) -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 02:49:05 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 12:49:05 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: References: <1429802547315.7259b888@Nodemailer> Message-ID: <55399301.1010808@canterbury.ac.nz> Victor Stinner wrote: > You didn't answer to my question. My question is: is it possible to > implement Future.__cocall__() since yield is defined in cofunctions. > If it's possible, can you please show how? (Show me the code!) The implementation of a __cocall__ method is *not* a cofunction, it's an *ordinary* function that returns an iterator. In the case of Future, what it needs to do is identical to Future.__iter__. So the code can be just def __cocall__(self): return iter(self) or equivalent. 
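Spelled out as a self-contained sketch -- this is illustrative only,
not asyncio's actual Future code, so the details are made up:

class Future:
    """Minimal future-like object, just enough to show the idea."""

    _MISSING = object()

    def __init__(self):
        self._result = self._MISSING

    def done(self):
        return self._result is not self._MISSING

    def set_result(self, value):
        self._result = value

    def __iter__(self):
        # The usual 'yield from fut' dance: suspend until a result is set.
        if not self.done():
            yield self  # the scheduler resumes us once done() is true
        return self._result

    # PEP 3152 only asks for an ordinary method that returns an
    # iterator, so the __iter__ generator can double as __cocall__.
    __cocall__ = __iter__
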
-- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 03:05:45 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 13:05:45 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <5539159E.2070806@gmail.com> References: <5539159E.2070806@gmail.com> Message-ID: <553996E9.8010204@canterbury.ac.nz> Yury Selivanov wrote: > Another problem is functions that return future: > > def do_something(): > ... > return fut > > With Greg's idea to call it you would do: > > cocall (do_something())() > > That means that you can't refactor your "do_something" > function and make it a coroutine. There's no fundamental problem with a cofunction returning another cofunction: codef do_something(): return fut f = cocall do_something() result = cocall f() Combining those last two lines into one would require some extra parenthesisation, but I don't think that's something you're going to be doing much in practice. If you're just going to immediately call the result, there's no point in returning a future -- just do it all in do_something(). -- Greg From yselivanov.ml at gmail.com Fri Apr 24 03:15:44 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 23 Apr 2015 21:15:44 -0400 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <553996E9.8010204@canterbury.ac.nz> References: <5539159E.2070806@gmail.com> <553996E9.8010204@canterbury.ac.nz> Message-ID: <55399940.5000207@gmail.com> On 2015-04-23 9:05 PM, Greg Ewing wrote: > Combining those last two lines into one would > require some extra parenthesisation, but I don't > think that's something you're going to be doing > much in practice. It's a common pattern in asyncio when functions return futures. It's OK later to refactor those functions to coroutines *and* vice-versa. This is a fundamental problem for PEP 3152 approach. Yury From stephen at xemacs.org Fri Apr 24 04:45:05 2015 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Fri, 24 Apr 2015 11:45:05 +0900 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <553933B8.8070306@gmail.com> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <553933B8.8070306@gmail.com> Message-ID: <87vbgmqm6m.fsf@uwakimon.sk.tsukuba.ac.jp> Yury Selivanov writes: > To my eye 'async def name()', 'async with', 'async for' look > better than 'def async name()', 'with async' and 'for async'. > But that's highly subjective. I'm with Barry on this one as far as looks go. (But count that as a +0, since I'm just a literary critic, I don't use coroutines in anger at present.) > I also read "for async item in iter:" as "I'm iterating iter > with async item". I thought that was precisely the intended semantics: item is available asynchronously. Again, count as a +0. FWIW, etc. From steve at pearwood.info Fri Apr 24 05:34:45 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 24 Apr 2015 13:34:45 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: References: <20150423125957.631a8711@x230> <20150423164410.6c8641e1@x230> Message-ID: <20150424033445.GU5663@ando.pearwood.info> On Thu, Apr 23, 2015 at 03:25:30PM +0100, Harry Percival wrote: > lol @ the fact that the type hints are breaking github's syntax highlighter > :) That just tells us that Github's syntax highlighter has been broken for over five years. Function annotations go back to Python 3.0, more than five years ago. 
The only thing which is new about type hinting is that we're adding a standard *use* for those annotations. I just tested a version of kwrite from 2005, ten years old, and it highlights the following annotated function perfectly: def func(a:str='hello', b:int=int(x+1)) -> None: print(a + b) Of course, I'm hoping that any decent type checker won't need the type hints. It should be able to infer from the default values that a is a string and b an int, and only require a type hint if you want to accept other types as well. (It should also highlight that a+b cannot succeed.) -- Steve From greg.ewing at canterbury.ac.nz Fri Apr 24 10:03:28 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:03:28 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423135152.36195a1e@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> Message-ID: <5539F8D0.5050503@canterbury.ac.nz> Barry Warsaw wrote: > Sure, tools can be updated but it is it *necessary* > to choose a syntax that breaks tools? > > def async useful(): > > seems okay to me. That will break any tool that assumes the word following 'def' is the name of the function being defined. Putting it at the end would seem least likely to cause breakage: def useful() async: -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:13:11 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:13:11 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538ED2B.2070108@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> <5538ED2B.2070108@gmail.com> Message-ID: <5539FB17.2000909@canterbury.ac.nz> Yury Selivanov wrote: > If you have a future object 'fut', it's not intuitive > or pythonic to write 'cocall fut()'. Another way to approach that would be to provide a cofunction await() used like this: cocall await(fut) That would read more naturally and wouldn't require modifying fut at all. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:13:24 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:13:24 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5538ED2B.2070108@gmail.com> References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz> <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz> <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz> <5538E150.2050803@canterbury.ac.nz> <5538ED2B.2070108@gmail.com> Message-ID: <5539FB24.4030605@canterbury.ac.nz> Yury Selivanov wrote: > In a PEP 3152 aware version of asyncio, it's just *not > possible to write* > > cocall gather(coro1(1,2), coro(2,3)) > > you just have to use your 'costart' built-in: > > cocall gather(costart(coro1, 1, 2), costart(coro, 2,3)). Another way to write that would be cocall gather(Task(coro1, 1, 2), Task(coro, 2, 3)) I think that actually reads quite nicely, and makes it very clear that parallel tasks are being spawned, rather than invoked sequentially. With the current way, that's not clear at all. It's not quite as convenient, because you don't get currying for free the way you do with generators. But I feel that such implicit currying is detrimental to readability. 
It looks like you're passing the results returned by coro1 and coro2 to gather, rather than coro1 and coro2 themselves. Yes, it will require some code to be changed, but if you're turning all your coroutines into cofunctions or async defs, you're changing quite a lot of things already. > PEP 3152 was created in pre-asyncio era, and it shows. I would say that asyncio was created in a pre-PEP-3152 world. Or at least it was developed without allowing for the possibility of adopting something like PEP 3152 in the future. Asyncio was based on generators and yield-from because it was the best thing we had at the time. I'll be disappointed if we've raced so far ahead with those ideas that it's now impossible to replace them with anything better. PEP 3152 is designed to present a *simpler* model of coroutine programming, by having only one concept, the suspendable function, instead of two -- generator functions on the one hand, and iterators/futures/awaitables/ whatever you want to call them on the other. PEP 492 doesn't do that. It adds some things and changes some things, but it doesn't simplify anything. > Your idea of syntaticaly forcing to use 'cocall' with > parens is cute, You say that as though "forcing" the use of parens were a goal in itself. It's not -- it's a *consequence* of what a cocall is. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:13:35 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:13:35 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: References: <1429802547315.7259b888@Nodemailer> <553923C3.5070608@gmail.com> Message-ID: <5539FB2F.3070505@canterbury.ac.nz> Guido van Rossum wrote: > I think this is the nail in PEP 3152's coffin. Seems more like a small tack to me. :-) I've addressed all the issues raised there in earlier posts. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:21:03 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:21:03 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: References: <5539159E.2070806@gmail.com> Message-ID: <5539FCEF.8090101@canterbury.ac.nz> Victor Stinner wrote: > Oh, I missed something in the PEP 3152: a obj__cocall__() method can > be an iterator/generator, it can be something different than a > cofunction. In fact, it *can't* be cofunction. It's part of the machinery for implementing cofunctions. > It's not easy to understand the whole puzzle. IMO the PEP 492 better > explains how pieces are put together ;-) Yes, it's written in a rather minimal style, sorry about that. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:34:26 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:34:26 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <55399940.5000207@gmail.com> References: <5539159E.2070806@gmail.com> <553996E9.8010204@canterbury.ac.nz> <55399940.5000207@gmail.com> Message-ID: <553A0012.6030809@canterbury.ac.nz> Yury Selivanov wrote: > It's a common pattern in asyncio when functions > return futures. It's OK later to refactor those > functions to coroutines *and* vice-versa. This > is a fundamental problem for PEP 3152 approach. Hmmm. So you have an ordinary function that returns a future, and you want to turn it into a coroutine function, but still have it return a future in order to keep the API the same, is that right? Turning it into a coroutine means you're going to have to change every site that calls it, so its API has already changed. 
Given that, I'm not sure what advantage there is in keeping the future- returning part of the API. However, if we use the await()-cofunction idea, then a call to the initial version looks like cocall await(f(x)) and after the refactoring it becomes cocall await(cocall f(x)) That doesn't look so bad to me. -- Greg From greg.ewing at canterbury.ac.nz Fri Apr 24 10:39:28 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 20:39:28 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <87vbgmqm6m.fsf@uwakimon.sk.tsukuba.ac.jp> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <553933B8.8070306@gmail.com> <87vbgmqm6m.fsf@uwakimon.sk.tsukuba.ac.jp> Message-ID: <553A0140.2060002@canterbury.ac.nz> Stephen J. Turnbull wrote: > Yury Selivanov writes: > > > I also read "for async item in iter:" as "I'm iterating iter > > with async item". > > I thought that was precisely the intended semantics: item is available > asynchronously. The async-at-the-end idea could be used here as well. for item in iter async: ... with something as x async: ... -- Greg From p.f.moore at gmail.com Fri Apr 24 10:44:47 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Fri, 24 Apr 2015 09:44:47 +0100 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <553A0012.6030809@canterbury.ac.nz> References: <5539159E.2070806@gmail.com> <553996E9.8010204@canterbury.ac.nz> <55399940.5000207@gmail.com> <553A0012.6030809@canterbury.ac.nz> Message-ID: On 24 April 2015 at 09:34, Greg Ewing wrote: > and after the refactoring it becomes > > cocall await(cocall f(x)) > > That doesn't look so bad to me. I've not been following this discussion (and coroutines make my head hurt) but this idiom looks like it's bound to result in people getting the idea that you scatter "cocall" throughout an expression until you get it to work. Paul From greg.ewing at canterbury.ac.nz Fri Apr 24 11:07:33 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Fri, 24 Apr 2015 21:07:33 +1200 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: References: <5539159E.2070806@gmail.com> <553996E9.8010204@canterbury.ac.nz> <55399940.5000207@gmail.com> <553A0012.6030809@canterbury.ac.nz> Message-ID: <553A07D5.3040208@canterbury.ac.nz> Paul Moore wrote: > On 24 April 2015 at 09:34, Greg Ewing wrote: > >> cocall await(cocall f(x)) >> >>That doesn't look so bad to me. > > I've not been following this discussion (and coroutines make my head > hurt) but this idiom looks like it's bound to result in people getting > the idea that you scatter "cocall" throughout an expression until you > get it to work. They won't need to do that, because they'll get told exactly where they've left one out, or put one in that they shouldn't have. Also, the places you need to put cocall are exactly the same as the places you need yield-from currently, or await under PEP 492. 
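To make that concrete, here is one and the same suspension point in all
three spellings (the PEP 3152 line is left as a comment, since cocall
is not valid syntax in any released Python):

import asyncio

@asyncio.coroutine
def handler(loop, sock):
    data = yield from loop.sock_recv(sock, 1024)  # today (PEP 380 / asyncio)
    # data = await loop.sock_recv(sock, 1024)     # PEP 492 spelling
    # data = cocall loop.sock_recv(sock, 1024)    # PEP 3152 spelling
    return data

The marker moves in lockstep; only its spelling changes.
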
-- 
Greg

From andrew.svetlov at gmail.com  Fri Apr 24 14:04:30 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Fri, 24 Apr 2015 15:04:30 +0300
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55398AF9.40608@canterbury.ac.nz>
References: <55368858.4010007@gmail.com> <55373A1F.2020704@canterbury.ac.nz>
 <5537C0EB.7070300@gmail.com> <553838F4.7070303@canterbury.ac.nz>
 <5538431A.1020709@gmail.com> <5538BA6E.4080906@canterbury.ac.nz>
 <5538E150.2050803@canterbury.ac.nz> <55398AF9.40608@canterbury.ac.nz>
Message-ID:

On Fri, Apr 24, 2015 at 3:14 AM, Greg Ewing wrote:
> Andrew Svetlov wrote:
>
>> But we already have asyncio and code based on asyncio coroutines.
>> To make it work I should always use costart() in places where asyncio
>> requires coroutine.
>
> As I understand it, asyncio would require changes to
> make it work seamlessly with PEP 492 as well, since
> an object needs to have either a special flag or
> an __await__ method before it can have 'await'
> applied to it.
>

PEP 492 requires a change to asyncio.Future only. PEP 3152 requires
changes in every asyncio-based library; this is the difference.

> -- 
> Greg
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com

-- 
Thanks,
Andrew Svetlov

From andrew.svetlov at gmail.com  Fri Apr 24 14:55:22 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Fri, 24 Apr 2015 15:55:22 +0300
Subject: [Python-Dev] PEP 3152 and yield from Future()
In-Reply-To: <553A0012.6030809@canterbury.ac.nz>
References: <5539159E.2070806@gmail.com> <553996E9.8010204@canterbury.ac.nz>
 <55399940.5000207@gmail.com> <553A0012.6030809@canterbury.ac.nz>
Message-ID:

On Fri, Apr 24, 2015 at 11:34 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>
>> It's a common pattern in asyncio when functions
>> return futures. It's OK later to refactor those
>> functions to coroutines *and* vice-versa. This
>> is a fundamental problem for PEP 3152 approach.
>
> Hmmm. So you have an ordinary function that returns
> a future, and you want to turn it into a coroutine
> function, but still have it return a future in
> order to keep the API the same, is that right?

No. In asyncio there is no difference between a coroutine and a
regular function returning a future.

From the caller's side, the following two are equal:

@asyncio.coroutine
def f():
    return 1

def g():
    fut = asyncio.Future()
    fut.set_result(1)
    return fut

Both may be called via `yield from`:

ret1 = yield from f()
ret2 = yield from g()

> Turning it into a coroutine means you're going
> to have to change every site that calls it, so
> its API has already changed. Given that, I'm not
> sure what advantage there is in keeping the future-
> returning part of the API.
>
> However, if we use the await()-cofunction idea,
> then a call to the initial version looks like
>
> cocall await(f(x))
>
> and after the refactoring it becomes
>
> cocall await(cocall f(x))
>
> That doesn't look so bad to me.
> > -- > Greg > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From steve at pearwood.info Fri Apr 24 15:17:54 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 24 Apr 2015 23:17:54 +1000 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150423135152.36195a1e@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> Message-ID: <20150424131754.GW5663@ando.pearwood.info> On Thu, Apr 23, 2015 at 01:51:52PM -0400, Barry Warsaw wrote: > Why "async def" and not "def async"? > > My concern is about existing tools that already know that "def" as the first > non-whitespace on the line starts a function/method definition. Think of a > regexp in an IDE that searches backwards from the current line to find the > function its defined on. Sure, tools can be updated but it is it *necessary* > to choose a syntax that breaks tools? Surely its the other way? If I'm searching for the definition of a function manually, I search for "def spam". `async def spam` will still be found, while `def async spam` will not. It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are going to break whichever choice is made, while a less pedantic search like r"def\s+spam\s*\(" will work only if async comes first. -- Steve From barry at python.org Fri Apr 24 15:32:51 2015 From: barry at python.org (Barry Warsaw) Date: Fri, 24 Apr 2015 09:32:51 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150424131754.GW5663@ando.pearwood.info> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <20150424131754.GW5663@ando.pearwood.info> Message-ID: <20150424093251.2461871d@limelight.wooz.org> On Apr 24, 2015, at 11:17 PM, Steven D'Aprano wrote: >It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which will hit "def async spam" but not "async def". Cheers, -Barry From barry at python.org Fri Apr 24 15:34:25 2015 From: barry at python.org (Barry Warsaw) Date: Fri, 24 Apr 2015 09:34:25 -0400 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <5539F8D0.5050503@canterbury.ac.nz> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <5539F8D0.5050503@canterbury.ac.nz> Message-ID: <20150424093425.4743ec7c@limelight.wooz.org> On Apr 24, 2015, at 08:03 PM, Greg Ewing wrote: >Putting it at the end would seem least likely to >cause breakage: > > def useful() async: That's not bad IMHO. I wonder how crazy it is in the face of, ahem, function annotations. Cheers, -Barry From yselivanov.ml at gmail.com Fri Apr 24 15:54:52 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 24 Apr 2015 09:54:52 -0400 Subject: [Python-Dev] PEP 3152 and yield from Future() In-Reply-To: <5539FB2F.3070505@canterbury.ac.nz> References: <1429802547315.7259b888@Nodemailer> <553923C3.5070608@gmail.com> <5539FB2F.3070505@canterbury.ac.nz> Message-ID: <553A4B2C.7080603@gmail.com> Greg, On 2015-04-24 4:13 AM, Greg Ewing wrote: > Guido van Rossum wrote: >> I think this is the nail in PEP 3152's coffin. > > Seems more like a small tack to me. :-) > I've addressed all the issues raised there in > earlier posts. > I'm sorry, but this is ridiculous. 
You haven't addressed the issues. We raise issues, saying that your PEP isn't backwards compatible and is breaking existing idioms. You say - there is a workaround for that; or that we can rewrite that and make it like that; or something else that doesn't make any sense for existing asyncio developers. Your PEP isn't backwards compatible. Period. It *will* be harder for people to get, as it *does* introduce a new calling grammar that isn't obvious for at least some people. We, asyncio developers, who write asyncio code, *don't* want to write 'cocall fut()'. I don't understand *why* I'm required to put parentheses there (besides someone just requiring me to do so, because they failed to solve some problem in backwards compatible way). You avoid confusion in one place, but you introduce it in other places. I'm sorry, but your current way of handling the discussion isn't really productive. You don't listen to arguments by Victor Stinner, Andrew Svetlov, and me. At this point, this whole PEP 3152 related discussion isn't helping anyone. Yury P.S. I'm sorry if this sounded harsh, this wasn't my intent. From steve at pearwood.info Fri Apr 24 16:21:19 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 25 Apr 2015 00:21:19 +1000 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> Message-ID: <20150424142118.GY5663@ando.pearwood.info> On Wed, Apr 22, 2015 at 11:26:14AM -0500, Ian Cordasco wrote: > On a separate thread Cory provided an example of what the hints would look > like for *part* of one function in the requests public functional API. > While our API is outwardly simple, the values we accept in certain cases > are actually non-trivially represented. Getting the hints *exactly* correct > would be extraordinarily difficult. I don't think you need to get them exactly correct. The type-checker does two things: (1) catch type errors involving types which should not be allowed; (2) allow code which involves types which should be allowed. If the type hints are wrong, there are two errors: false positives, when code which should be allowed is flagged as a type error; and false negatives, when code which should be flagged as an error is not. Ideally, there should be no false positives. But false negatives are not so important, since you will still be doing runtime checks. All that means is that the static type-checker will be a little less capable of picking up type errors at compile time. -- Steve From cory at lukasa.co.uk Fri Apr 24 16:44:45 2015 From: cory at lukasa.co.uk (Cory Benfield) Date: Fri, 24 Apr 2015 15:44:45 +0100 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: <20150424142118.GY5663@ando.pearwood.info> References: <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> <20150424142118.GY5663@ando.pearwood.info> Message-ID: On 24 April 2015 at 15:21, Steven D'Aprano wrote: > If the type hints are wrong, there are two errors: false positives, when > code which should be allowed is flagged as a type error; and false > negatives, when code which should be flagged as an error is not. > Ideally, there should be no false positives. But false negatives are not > so important, since you will still be doing runtime checks. All that > means is that the static type-checker will be a little less capable of > picking up type errors at compile time. I think that's a rational view that will not be shared as widely as I'd like. 
Given that the purpose of a type checker is to catch bugs caused by
passing incorrectly typed objects to a function, it seems entirely
reasonable to me to raise a bug against a type hint that allows code
that was of an incorrect type where that incorrectness *could* have
been caught by the type hint. Extending from that into the general
ratio of "reports that are actually bugs" versus "reports that are
errors on the part of the reporter", I can assume that plenty of
people will raise bug reports for incorrect cases as well.

From the perspective of sustainable long-term maintenance, I think the
only way to do type hints is to have them be sufficiently exhaustive
that a user would have to actively *try* to hit an edge case false
negative. I believe that requests' API is too dynamically-typed to fit
into that category at this time.

PS: I should mention that, as Gary Bernhardt pointed out at PyCon,
people often believe (incorrectly) that types are a replacement for
tests. For that reason I feel like underspecified type hints are
something of an attractive nuisance. Again, I really think this is a
case of do it properly or not at all.

From status at bugs.python.org  Fri Apr 24 18:08:19 2015
From: status at bugs.python.org (Python tracker)
Date: Fri, 24 Apr 2015 18:08:19 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20150424160819.CEE7556234@psf.upfronthosting.co.za>

ACTIVITY SUMMARY (2015-04-17 - 2015-04-24)
Python tracker at http://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    4814 (+22)
  closed  31000 (+43)
  total   35814 (+65)

Open issues with patches: 2249

Issues opened (46)
==================

#15582: Enhance inspect.getdoc to follow inheritance chains
http://bugs.python.org/issue15582  reopened by serhiy.storchaka

#23991: ZipFile sanity checks
http://bugs.python.org/issue23991  opened by Antony.Lee

#23992: multiprocessing: MapResult shouldn't fail fast upon exception
http://bugs.python.org/issue23992  opened by neologix

#23993: Use surrogateescape error handler by default in open() if the
http://bugs.python.org/issue23993  opened by haypo

#23994: argparse fails to detect program name when there is a slash at
http://bugs.python.org/issue23994  opened by boramalper

#23995: msvcrt could not be imported
http://bugs.python.org/issue23995  opened by petrikas

#23996: _PyGen_FetchStopIterationValue() crashes on unnormalised excep
http://bugs.python.org/issue23996  opened by scoder

#23997: unicodedata_UCD_lookup() has theoretical buffer overflow
http://bugs.python.org/issue23997  opened by christian.heimes

#23999: Undefined behavior in dtoa.c (rshift 32 of 32bit data type)
http://bugs.python.org/issue23999  opened by christian.heimes

#24000: More fixes for the Clinic mapping of converters to format unit
http://bugs.python.org/issue24000  opened by larry

#24001: Clinic: use raw types in types= set
http://bugs.python.org/issue24001  opened by larry

#24004: avoid explicit generator type check in asyncio
http://bugs.python.org/issue24004  opened by scoder

#24009: Get rid of rare format units in PyArg_Parse*
http://bugs.python.org/issue24009  opened by serhiy.storchaka

#24010: Add error checks to PyInit__locale()
http://bugs.python.org/issue24010  opened by christian.heimes

#24011: Add error checks to PyInit_signal()
http://bugs.python.org/issue24011  opened by christian.heimes

#24012: Add error checks to PyInit_pyexpat()
http://bugs.python.org/issue24012  opened by christian.heimes
#24013: Improve os.scandir() and DirEntry documentation http://bugs.python.org/issue24013 opened by benhoyt #24015: timeit should start with 1 loop, not 10 http://bugs.python.org/issue24015 opened by nomeata #24016: Add a Sprints organization/preparation section to devguide http://bugs.python.org/issue24016 opened by willingc #24017: Implemenation of the PEP 492 - Coroutines with async and await http://bugs.python.org/issue24017 opened by haypo #24018: add a Generator ABC http://bugs.python.org/issue24018 opened by scoder #24021: document urllib.urlretrieve http://bugs.python.org/issue24021 opened by krichter #24022: Python heap corruption issue http://bugs.python.org/issue24022 opened by benjamin.peterson #24024: str.__doc__ needs an update http://bugs.python.org/issue24024 opened by lemburg #24026: Python hangs forever in wait() of threading.py http://bugs.python.org/issue24026 opened by appidman #24027: IMAP library lacks documentation about expected parameter type http://bugs.python.org/issue24027 opened by pmoleri #24028: Idle: add doc subsection on calltips http://bugs.python.org/issue24028 opened by terry.reedy #24030: IMAP library encoding enhancement http://bugs.python.org/issue24030 opened by pmoleri #24032: urlparse.urljoin does not add query part http://bugs.python.org/issue24032 opened by albertsmuktupavels #24033: Update _test_multiprocessing.py to use script helpers http://bugs.python.org/issue24033 opened by bobcatfish #24034: Make fails Objects/typeslots.inc http://bugs.python.org/issue24034 opened by masamoto #24035: When Caps Locked, + alpha-character still displayed as http://bugs.python.org/issue24035 opened by principia1687 #24036: GB2312 codec is using a wrong covert table http://bugs.python.org/issue24036 opened by Ma Lin #24037: Argument Clinic: add the boolint converter http://bugs.python.org/issue24037 opened by serhiy.storchaka #24039: Minimize option doesn't work on Search Dialog box for idle http://bugs.python.org/issue24039 opened by prince09cs #24040: plistlib assumes dict_type is descendent of dict http://bugs.python.org/issue24040 opened by Behdad.Esfahbod #24041: Implement Mac East Asian encodings properly http://bugs.python.org/issue24041 opened by Behdad.Esfahbod #24042: Convert os._getfullpathname() and os._isdir() to Argument Clin http://bugs.python.org/issue24042 opened by serhiy.storchaka #24043: Implement mac_romanian and mac_croatian encodings http://bugs.python.org/issue24043 opened by Behdad.Esfahbod #24045: Behavior of large returncodes (sys.exit(nn)) http://bugs.python.org/issue24045 opened by ethan.furman #24046: Incomplete build on AIX http://bugs.python.org/issue24046 opened by aixtools at gmail.com #24048: remove_module() needs to save/restore exception state http://bugs.python.org/issue24048 opened by herring #24050: Segmentation fault (core dumped) http://bugs.python.org/issue24050 opened by nivin #24051: Argument Clinic no longer works with single optional argument http://bugs.python.org/issue24051 opened by serhiy.storchaka #24052: sys.exit(code) returns "success" to the OS for some values of http://bugs.python.org/issue24052 opened by belopolsky #24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys http://bugs.python.org/issue24053 opened by belopolsky Most recent 15 issues with no replies (15) ========================================== #24053: Define EXIT_SUCCESS and EXIT_FAILURE constants in sys http://bugs.python.org/issue24053 #24052: sys.exit(code) returns "success" to the OS for some values of 
http://bugs.python.org/issue24052 #24051: Argument Clinic no longer works with single optional argument http://bugs.python.org/issue24051 #24048: remove_module() needs to save/restore exception state http://bugs.python.org/issue24048 #24045: Behavior of large returncodes (sys.exit(nn)) http://bugs.python.org/issue24045 #24039: Minimize option doesn't work on Search Dialog box for idle http://bugs.python.org/issue24039 #24032: urlparse.urljoin does not add query part http://bugs.python.org/issue24032 #24030: IMAP library encoding enhancement http://bugs.python.org/issue24030 #24028: Idle: add doc subsection on calltips http://bugs.python.org/issue24028 #24024: str.__doc__ needs an update http://bugs.python.org/issue24024 #24016: Add a Sprints organization/preparation section to devguide http://bugs.python.org/issue24016 #24013: Improve os.scandir() and DirEntry documentation http://bugs.python.org/issue24013 #24012: Add error checks to PyInit_pyexpat() http://bugs.python.org/issue24012 #24011: Add error checks to PyInit_signal() http://bugs.python.org/issue24011 #24010: Add error checks to PyInit__locale() http://bugs.python.org/issue24010 Most recent 15 issues waiting for review (15) ============================================= #24042: Convert os._getfullpathname() and os._isdir() to Argument Clin http://bugs.python.org/issue24042 #24037: Argument Clinic: add the boolint converter http://bugs.python.org/issue24037 #24036: GB2312 codec is using a wrong covert table http://bugs.python.org/issue24036 #24034: Make fails Objects/typeslots.inc http://bugs.python.org/issue24034 #24018: add a Generator ABC http://bugs.python.org/issue24018 #24017: Implemenation of the PEP 492 - Coroutines with async and await http://bugs.python.org/issue24017 #24013: Improve os.scandir() and DirEntry documentation http://bugs.python.org/issue24013 #24012: Add error checks to PyInit_pyexpat() http://bugs.python.org/issue24012 #24011: Add error checks to PyInit_signal() http://bugs.python.org/issue24011 #24010: Add error checks to PyInit__locale() http://bugs.python.org/issue24010 #24009: Get rid of rare format units in PyArg_Parse* http://bugs.python.org/issue24009 #24001: Clinic: use raw types in types= set http://bugs.python.org/issue24001 #24000: More fixes for the Clinic mapping of converters to format unit http://bugs.python.org/issue24000 #23997: unicodedata_UCD_lookup() has theoretical buffer overflow http://bugs.python.org/issue23997 #23996: _PyGen_FetchStopIterationValue() crashes on unnormalised excep http://bugs.python.org/issue23996 Top 10 most discussed issues (10) ================================= #24018: add a Generator ABC http://bugs.python.org/issue24018 17 msgs #23496: Steps for Android Native Build of Python 3.4.2 http://bugs.python.org/issue23496 16 msgs #14376: sys.exit documents argument as "integer" but actually requires http://bugs.python.org/issue14376 12 msgs #20182: Derby #13: Convert 50 sites to Argument Clinic across 5 files http://bugs.python.org/issue20182 9 msgs #23993: Use surrogateescape error handler by default in open() if the http://bugs.python.org/issue23993 9 msgs #24000: More fixes for the Clinic mapping of converters to format unit http://bugs.python.org/issue24000 9 msgs #24001: Clinic: use raw types in types= set http://bugs.python.org/issue24001 9 msgs #20180: Derby #11: Convert 50 sites to Argument Clinic across 9 files http://bugs.python.org/issue20180 8 msgs #23985: Crash when deleting slices from duplicated bytearray http://bugs.python.org/issue23985 7 msgs #9517: 
Make test.script_helper more comprehensive, and use it in the http://bugs.python.org/issue9517 6 msgs Issues closed (43) ================== #6824: help for a module should list supported platforms http://bugs.python.org/issue6824 closed by terry.reedy #11344: Add os.path.splitpath(path) function http://bugs.python.org/issue11344 closed by serhiy.storchaka #11587: METH_KEYWORDS alone gives "METH_OLDARGS is no longer supported http://bugs.python.org/issue11587 closed by berker.peksag #12712: weave build_tools library identification http://bugs.python.org/issue12712 closed by tim.golden #15183: it should be made clear that the statement in the --setup opti http://bugs.python.org/issue15183 closed by akuchling #15566: tarfile.TarInfo.frombuf documentation is out of date http://bugs.python.org/issue15566 closed by berker.peksag #16574: clarify policy on updates to final peps http://bugs.python.org/issue16574 closed by berker.peksag #17445: Handle bytes comparisons in difflib.Differ http://bugs.python.org/issue17445 closed by gward #17475: Better doc on using python-gdb.py http://bugs.python.org/issue17475 closed by willingc #17846: Building Python on Windows - Supplementary info http://bugs.python.org/issue17846 closed by willingc #21483: Skip os.utime() test on NFS? http://bugs.python.org/issue21483 closed by berker.peksag #22501: Optimise PyLong division by 1 or -1 http://bugs.python.org/issue22501 closed by akuchling #22785: range docstring is less useful than in python 2 http://bugs.python.org/issue22785 closed by python-dev #23008: pydoc enum.{,Int}Enum fails http://bugs.python.org/issue23008 closed by serhiy.storchaka #23536: Add explicit information on config file format not supporting http://bugs.python.org/issue23536 closed by python-dev #23713: intermittent failure of multiprocessing unit test test_imap_un http://bugs.python.org/issue23713 closed by serhiy.storchaka #23728: binascii.crc_hqx() can return negative integer http://bugs.python.org/issue23728 closed by serhiy.storchaka #23842: SystemError: ../Objects/longobject.c:998: bad argument to inte http://bugs.python.org/issue23842 closed by serhiy.storchaka #23887: HTTPError doesn't have a good "repr" representation http://bugs.python.org/issue23887 closed by facundobatista #23893: Forward-port future_builtins http://bugs.python.org/issue23893 closed by brett.cannon #23917: please fall back to sequential compilation when concurrent doe http://bugs.python.org/issue23917 closed by berker.peksag #23954: Idle autocomplete: enter or clicking does not select the optio http://bugs.python.org/issue23954 closed by terry.reedy #23982: Tkinter in Python 3.4 for Windows incorrectly renders certain http://bugs.python.org/issue23982 closed by serhiy.storchaka #23989: Add recommendation to use requests to the documentation, per s http://bugs.python.org/issue23989 closed by python-dev #23990: Callable builtin doesn't respect descriptors http://bugs.python.org/issue23990 closed by christian.heimes #23998: PyImport_ReInitLock() doesn't check for allocation error http://bugs.python.org/issue23998 closed by christian.heimes #24002: Add un-parse function to ast http://bugs.python.org/issue24002 closed by larry #24003: variable naming http://bugs.python.org/issue24003 closed by steven.daprano #24005: Documentation Error: Extra line Break http://bugs.python.org/issue24005 closed by willingc #24006: Multiprocessing fails when using functions defined in interact http://bugs.python.org/issue24006 closed by r.david.murray #24007: Write PyArg_Parse* format in 
a line with a function http://bugs.python.org/issue24007 closed by serhiy.storchaka #24008: inspect.getsource is failing for sys.excepthook http://bugs.python.org/issue24008 closed by Claudiu.Popa #24014: Second pass of PyCode_Optimize http://bugs.python.org/issue24014 closed by llllllllll #24019: str/unicode encoding kwarg causes exceptions http://bugs.python.org/issue24019 closed by benjamin.peterson #24020: threading.local() must be run at module level (doc improvement http://bugs.python.org/issue24020 closed by eric.snow #24023: Django tutorial 2 not able to create a superuser on Windows 7 http://bugs.python.org/issue24023 closed by r.david.murray #24025: str(bytes_obj) should raise an error http://bugs.python.org/issue24025 closed by pitrou #24029: Surprising name binding behavior of submodule imports needs do http://bugs.python.org/issue24029 closed by barry #24031: Add git support to make patchcheck http://bugs.python.org/issue24031 closed by christian.heimes #24038: Missing cleanup in list.sort() with key function http://bugs.python.org/issue24038 closed by benjamin.peterson #24044: NULL pointer dereference in listsort() with key function http://bugs.python.org/issue24044 closed by python-dev #24047: str.startswith and str.endswith should accept multiple argumen http://bugs.python.org/issue24047 closed by serhiy.storchaka #24049: Remove unused code in symtable.c and fix docs for import * che http://bugs.python.org/issue24049 closed by python-dev From guido at python.org Fri Apr 24 19:03:48 2015 From: guido at python.org (Guido van Rossum) Date: Fri, 24 Apr 2015 10:03:48 -0700 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round Message-ID: I've tried to catch up with the previous threads. A summary of issues brought up: 1. precise syntax of `async def` (or do we need it at all) 2. do we need `async for` and `async with` (and how to spell them) 3. syntactic priority of `await` 4. `cocall` vs. `await` 5. do we really need `__aiter__` and friends 6. StopAsyncException 7. compatibility with asyncio and existing users of it (I've added a few myself.) I'll try to take them one by one. *1. precise syntax of `async def`* Of all the places to put `async` I still like *before* the `def` the best. I often do "imprecise search" for e.g. /def foo/ and would be unhappy if this didn't find async defs. Putting it towards the end (`def foo async()` or `def foo() async`) makes it easier to miss. A decorator makes it hard to make the syntactic distinctions required to reject `await` outside an async function. So I still prefer *`async def`*. *2. do we need `async for` and `async with`* Yes we do. Most of you are too young to remember, but once upon a time you couldn't loop over the lines of a file with a `for` loop like you do now. The amount of code that was devoted to efficiently iterate over files was tremendous. We're back in that stone age with the asyncio `StreamReader` class -- it supports `read()`, `readline()` and so on, but you can't use it with `for`, so you have to write a `while True` loop. `asyncio for` makes it possible to add a simple `__anext__` to the `StreamReader` class, as follows: ``` async def __anext__(self): line = await self.readline() if not line: raise StopAsyncIteration return line ``` A similar argument can be made for `async with`; the transaction commit is pretty convincing, but it also helps to be able to wait e.g. for a transport to drain upon closing a write stream. 
As for how to spell these, I think having `async` at the front makes it most clear that this is a special form. (Though maybe we should consider `await for` and `await with`? That would have the advantage of making it easy to scan for all suspension points by searching for /await/. But being a verb it doesn't read very well.) *3. syntactic priority of `await`* Yury, could you tweak the syntax for `await` so that we can write the most common usages without parentheses? In particular I'd like to be able to write ``` return await foo() with await foo() as bar: ... foo(await bar(), await bletch()) ``` (I don't care about `await foo() + await bar()` but it would be okay.) ``` I think this is reasonable with some tweaks of the grammar (similar to what Greg did for cocall, but without requiring call syntax at the end). *4. `cocall` vs. `await`* Python evolves. We couldn't have PEP 380 (`yield from`) without prior experience with using generators as coroutines (PEP 342), which in turn required basic generators (PEP 255), and those were a natural evolution of Python's earlier `for` loop. We couldn't PEP 3156 (asyncio) without PEP 380 and all that came before. The asyncio library is getting plenty of adoption and it has the concept of separating the *getting* of a future[1] from *waiting* for it. IIUC this is also how `await` works in C# (it just requires something with an async type). This has enabled a variety of operations that take futures and produce more futures. [1] I write `future` with a lowercase 'f' to include concepts like coroutine generator objects. *I just can't get used to this aspect of PEP 3152, so I'm rejecting it.* Sorry Greg, but that's the end. We must see `await` as a refinement of `yield from`, not as an alternative. (Yury: PEP 492 is not accepted yet, but you're getting closer.) One more thing: this separation is "Pythonic" in the sense that it's similar to the way *getting* a callable object is a separate act from *calling* it. While this is a cause for newbie bugs (forgetting to call an argument-less function) it has also enabled the concept of "callable" as more general and more powerful in Python: any time you need to pass a callable, you can pass e.g. a bound method or a class or something you got from `functools.partial`, and that's a useful thing (other languages require you to introduce something like a lambda in such cases, which can be painful if the thing you wrap has a complex signature -- or they don't support function parameters at all, like Java). I know that Greg defends it by explaining that `cocal f(args)` is not a `cocall` operator applied to `f(args)`, it is the *single* operator `cocall ...(args)` applied to `f`. But this is too subtle, and it just doesn't jive with the long tradition of using `yield from f` where f is some previously obtained future. *5. do we really need `__aiter__` and friends* There's a lot of added complexity, but I think it's worth it. I don't think we need to make the names longer, the 'a' prefix is fine for these methods. I think it's all in the protocols: regular `with` uses `__enter__` and `__exit__`; `async with` uses `__aenter__` and `__aexit__` (which must return futures). Ditto for `__aiter__` and `__anext__`. I guess this means that the async equivalent to obtaining an iterator through `it = iter(xs)` followed by `for x over it` will have to look like `ait = await aiter(xs)` followed by `for x over ait`, where an iterator is required to have an `__aiter__` method that's an async function and returns self immediately. 
But what if you left out the `await` from the first call? I.e. can this work? ``` ait = aiter(xs) async for x in ait: print(x) ``` The question here is whether the object returned by aiter(xs) has an `__aiter__` method. Since it was intended to be the target of `await`, it has an `__await__` method. But that itself is mostly an alias for `__iter__`, not `__aiter__`. I guess it can be made to work, the object just has to implement a bunch of different protocols. *6. StopAsyncException* I'm not sure about this. The motivation given in the PEP seems to focus on the need for `__anext__` to be async. But is this really the right pattern? What if we required `ait.__anext__()` to return a future, which can either raise good old `StopIteration` or return the next value from the iteration when awaited? I'm wondering if there are a few alternatives to be explored around the async iterator protocol still. *7. compatibility with asyncio and existing users of it* This is just something I want to stress. On the one hand it should be really simple to take code written for pre-3.5 asyncio and rewrite it to PEP 492 -- simple change `@asyncio.coroutine` to `async def` and change `yield from` to `await`. (Everything else is optional; existing patterns for loops and context managers should continue to work unchanged, even if in some cases you may be able to simplify the code by using `async for` and `async with`.) But it's also important that *even if you don't switch* (i.e. if you want to keep your code compatible with pre-3.5 asyncio) you can still use the PEP 492 version of asyncio -- i.e. the asyncio library that comes with 3.5 must seamlessly support mixing code that uses `await` and code that uses `yield from`. And this should go both ways -- if you have some code that uses PEP 492 and some code that uses pre-3.5 asyncio, they should be able to pass their coroutines to each other and wait for each other's coroutines. That's all I have for now. Enjoy! -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Fri Apr 24 19:07:27 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 25 Apr 2015 03:07:27 +1000 Subject: [Python-Dev] typeshed for 3rd party packages In-Reply-To: References: <5535FD60.7020404@egenix.com> <55377BF8.8030402@egenix.com> <20150424142118.GY5663@ando.pearwood.info> Message-ID: <20150424170726.GZ5663@ando.pearwood.info> On Fri, Apr 24, 2015 at 03:44:45PM +0100, Cory Benfield wrote: > On 24 April 2015 at 15:21, Steven D'Aprano wrote: > > > If the type hints are wrong, there are two errors: false positives, when > > code which should be allowed is flagged as a type error; and false > > negatives, when code which should be flagged as an error is not. > > Ideally, there should be no false positives. But false negatives are not > > so important, since you will still be doing runtime checks. All that > > means is that the static type-checker will be a little less capable of > > picking up type errors at compile time. > > I think that's a rational view that will not be shared as widely as I'd like. I can't tell if you are agreeing with me, or disagreeing. The above sentence seems to be agreeing with me, but you later end your message with "do it properly or not at all" which disagrees. So I'm confused. 
> Given that the purpose of a type checker is to catch bugs caused by > passing incorrectly typed objects to a function, it seems entirely > reasonable to me to raise a bug against a type hint that allows code > that was of an incorrect type where that incorrectness *could* have > been caught by the type hint. Of course it is reasonable for people to submit bug reports to do with the type hints. And it is also reasonable for the package maintainer to reject the bug report as "Won't Fix" if it makes the type hint too complex. The beauty of gradual typing is that unlike Java or Haskell, you can choose to have as little or as much type checking as works for you. You don't have to satisfy the type checker over the entire program before the code will run, you only need check the parts you want to check. > Extending from that into the general > ratio of "reports that are actually bugs" versus "reports that are > errors on the part of the reporter", I can assume that plenty of > people will raise bug reports for incorrect cases as well. Okay. Do you get many false positive bug reports for your tests too? > From the perspective of sustainable long-term maintenance, I think the > only way to do type hints is to have them be sufficiently exhaustive > that a user would have to actively *try* to hit an edge case false > negative. I believe that requests' API is too dynamically-typed to fit > into that category at this time. I think we agree that, static type checks or no static type checks, requests is going to need to do runtime type checks. So why does it matter if it misses a few type errors at compile time? I think we're all in agreement that for extremely dynamic code like requests, you may not get as much value from static type checks as some other libraries or applications. You might even decide that you get no value at all. Okay, that's fine. I'm just suggesting that you don't have just two choices, "all or nothing". The whole point of gradual typing is to give developers more options. > PS: I should mention that, as Gary Bernhardt pointed out at PyCon, > people often believe (incorrectly) that types are a replacement for > tests. They *can* be a replacement for tests. You don't see Java or Haskell programmers writing unit tests to check that their code never tries to add a string to a float. Even if they could write such as test, they don't bother because the type checker will catch that sort of error. The situation in Python is a bit different, and as Antoine points out, libraries cannot rely on their callers obeying the type restrictions of the public API. (Private functions are different -- if you call my private function with the wrong type and blow up your computer, it's your own fault.) For libraries, I see type checks as complementing tests, not replacing them. But for application code, type checks may replace unit tests, provided that nobody checks in production code until both the type checker and the unit tests pass. If you work under that rule, there's no point in having the unit tests check what the type checker already tested. > For that reason I feel like underspecified type hints are > something of an attractive nuisance. Again, I really think this is a > case of do it properly or not at all. In my opinion, underspecified type hints are no more of an attractive nuisance than a test suite which doesn't test enough. Full coverage is great, but 10% coverage is better than 5% coverage, which is better than nothing. 
That applies whether we are talking about tests, type checks, or documentation. -- Steve From ronan.lamy at gmail.com Fri Apr 24 19:27:29 2015 From: ronan.lamy at gmail.com (Ronan Lamy) Date: Fri, 24 Apr 2015 18:27:29 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150423165513.239317a3@x230> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> Message-ID: <553A7D01.6010100@gmail.com> Le 23/04/15 14:55, Paul Sokolovsky a ?crit : > Hello, > > On Thu, 23 Apr 2015 09:15:44 -0400 > Daniel Holth wrote: > > [] > >>>> Also ask why no one used type specifier, they are possible since >>>> Python 3.0 ? >>>> Because it is the wrong way for Python. >>> >>> That's an example of how perceptions differ. In my list, everyone(*) >>> uses them - MyPy, MicroPython, etc. Even more should use them (any >>> JIT module, which are many), but sit in the bushes, waiting for a >>> kick, like PEP484 provides. >> >> It's OK that type hints are only to assist the programmer. > > Yes, it's OK to have a situation where type hints assist only a > programmer. It's not OK to think that type hints may be useful only for > programmer, instead of bunch more purposes, several of which > were already shown in the long previous discussion. > >> PyPy's FAQ >> has an explanation of why type hints are not for performance. >> http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance > > You probably intended to write "why type hints are not for *PyPy's* > performance". There're many other language implementations and modules > for which it may be useful, please don't limit your imagination by a > single case. Those points apply to basically any compliant implementation of Python relying on speculative optimisation. Python is simply too dynamic for PEP484-style hints to provide any useful performance improvement targets. > And speaking of PyPy, it really should think how to improve its > performance - not of generated programs, but of generation itself. If > compilation of a trivial program on a pumpy hardware takes 5 minutes > and gigabytes of RAM and diskspace, few people will use it for other > purposes beyond curiosity. There's something very un-Pythonic in > waiting 5 mins just to run 10-line script. Type hints can help here > too ;-) (by not wasting resources propagating types thru the same old > standard library for example). Sorry, but that's nonsense. PyPy would be a seriously useless interpreter if running a 10-line script required such a lengthy compilation, so, obviously, that's not what happens. You seem to misunderstand what PyPy is: it's an interpreter with a just-in-time compiler, not a static compiler. It doesn't generate programs in any meaningful sense. Instead, it interprets the program, and when it detects a hot code path, it compiles it to machine code based on the precise types it sees. No resources are wasted on code that isn't actually executed. From yselivanov.ml at gmail.com Fri Apr 24 19:50:41 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 24 Apr 2015 13:50:41 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: Message-ID: <553A8271.7030407@gmail.com> Guido, On 2015-04-24 1:03 PM, Guido van Rossum wrote: > > *3. syntactic priority of `await`* > > Yury, could you tweak the syntax for `await` so that we can write the most > common usages without parentheses? In particular I'd like to be able to > write > ``` > return await foo() > with await foo() as bar: ... 
> foo(await bar(), await bletch())
> ```
> (I don't care about `await foo() + await bar()` but it would be okay.)
> I think this is reasonable with some tweaks of the grammar (similar to what
> Greg did for cocall, but without requiring call syntax at the end).

I don't remember the reason why yield requires parentheses in
expressions, hopefully it's not something fundamental. This has always
annoyed me, so let's try to fix that for await. I'll experiment.

>
> Ditto for `__aiter__` and `__anext__`. I guess this means that the async
> equivalent to obtaining an iterator through `it = iter(xs)` followed by
> `for x in it` will have to look like `ait = await aiter(xs)` followed by
> `async for x in ait`, where an iterator is required to have an `__aiter__`
> method that's an async function and returns self immediately. But what if
> you left out the `await` from the first call? I.e. can this work?
> ```
> ait = aiter(xs)
> async for x in ait:
>     print(x)

With the current semantics that PEP 492 proposes, "await" for
"aiter()" is mandatory.

You have to write

    ait = await aiter(xs)
    async for x in ait:
        print(x)

We can add some logic that will check that the iterator passed to
'async for' is not an unresolved awaitable and resolve it (instead of
immediately checking if it has an __anext__ method), but that will
complicate the implementation. It will also introduce more than one way
of doing things.

I think that users will recognize "async builtins" (when we add them)
by the first letter "a" and use them in "await" expressions
consistently.

> ```
> The question here is whether the object returned by aiter(xs) has an
> `__aiter__` method. Since it was intended to be the target of `await`, it
> has an `__await__` method. But that itself is mostly an alias for
> `__iter__`, not `__aiter__`. I guess it can be made to work, the object
> just has to implement a bunch of different protocols.

Correct. And yes, we address this all by having iteration protocols
clearly separated.

>
> *6. StopAsyncException*
>
> I'm not sure about this. The motivation given in the PEP seems to focus on
> the need for `__anext__` to be async. But is this really the right pattern?
> What if we required `ait.__anext__()` to return a future, which can either
> raise good old `StopIteration` or return the next value from the iteration
> when awaited? I'm wondering if there are a few alternatives to be explored
> around the async iterator protocol still.

__anext__ should return an awaitable (following the terminology of the
PEP), which can be a coroutine-object. I'm not sure that with the
semantics of PEP 479 it can actually raise StopIteration (without some
hacks in genobject).

I'm also trying to think forward about how we can add
generator-coroutines (the ones that combine 'await' and some form of
'yield') to make writing asynchronous iterators easier. I think that
reusing StopIteration on that level will be a very hard thing to
understand and implement.

I'll experiment with the reference implementation and update the PEP.

Thank you,
Yury

From ethan at stoneleaf.us  Fri Apr 24 20:03:14 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Fri, 24 Apr 2015 11:03:14 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553A8271.7030407@gmail.com>
References: <553A8271.7030407@gmail.com>
Message-ID: <20150424180314.GC32422@stoneleaf.us>

On 04/24, Yury Selivanov wrote:
> On 2015-04-24 1:03 PM, Guido van Rossum wrote:

>> Ditto for `__aiter__` and `__anext__`. I guess this means that the async
>> equivalent to obtaining an iterator through `it = iter(xs)` followed by
>> `for x in it` will have to look like `ait = await aiter(xs)` followed by
>> `async for x in ait`, where an iterator is required to have an `__aiter__`
>> method that's an async function and returns self immediately. But what if
>> you left out the `await` from the first call? I.e. can this work?
>> ```
>> ait = aiter(xs)
>> async for x in ait:
>>     print(x)
>
> With the current semantics that PEP 492 proposes, "await"
> for "aiter()" is mandatory.
>
> You have to write
>
>     ait = await aiter(xs)
>     async for x in ait:
>         print(x)

As a new user to asyncio and this type of programming in general,
'await aiter' feels terribly redundant.

--
~Ethan~

From guido at python.org  Fri Apr 24 20:08:03 2015
From: guido at python.org (Guido van Rossum)
Date: Fri, 24 Apr 2015 11:08:03 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <20150424180314.GC32422@stoneleaf.us>
References: <553A8271.7030407@gmail.com>
 <20150424180314.GC32422@stoneleaf.us>
Message-ID:

On Fri, Apr 24, 2015 at 11:03 AM, Ethan Furman wrote:

> On 04/24, Yury Selivanov wrote:
> > On 2015-04-24 1:03 PM, Guido van Rossum wrote:
>
> >> Ditto for `__aiter__` and `__anext__`. I guess this means that the async
> >> equivalent to obtaining an iterator through `it = iter(xs)` followed by
> >> `for x in it` will have to look like `ait = await aiter(xs)` followed by
> >> `async for x in ait`, where an iterator is required to have an `__aiter__`
> >> method that's an async function and returns self immediately. But what if
> >> you left out the `await` from the first call? I.e. can this work?
> >> ```
> >> ait = aiter(xs)
> >> async for x in ait:
> >>     print(x)
> >
> > With the current semantics that PEP 492 proposes, "await"
> > for "aiter()" is mandatory.
> >
> > You have to write
> >
> >     ait = await aiter(xs)
> >     async for x in ait:
> >         print(x)
>
> As a new user to asyncio and this type of programming in general, 'await
> aiter' feels terribly redundant.
>

Yeah, but normally you would never do that. You'd just use `async for x
in xs`. I'm just bickering over the exact expansion of that.

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From steve at pearwood.info  Fri Apr 24 20:30:41 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Sat, 25 Apr 2015 04:30:41 +1000
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <20150424093251.2461871d@limelight.wooz.org>
References: <55368858.4010007@gmail.com>
 <20150423135152.36195a1e@limelight.wooz.org>
 <20150424131754.GW5663@ando.pearwood.info>
 <20150424093251.2461871d@limelight.wooz.org>
Message-ID: <20150424183040.GA5663@ando.pearwood.info>

On Fri, Apr 24, 2015 at 09:32:51AM -0400, Barry Warsaw wrote:
> On Apr 24, 2015, at 11:17 PM, Steven D'Aprano wrote:
>
> >It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are
>
> They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which
> will hit "def async spam" but not "async def".

Unless somebody wants to do a survey of editors and IDEs and other
tools, arguments about what regex they may or may not use to search for
function definitions are an exercise in futility. They may use regexes
anchored to the start of the line. They may not. They may deal with
"def async" better than "async def", or the other way around.
Either way, it's a pretty thin argument for breaking the invariant that the token following `def` is the name of the function. Whatever new syntax is added, something is going to break. -- Steve From pmiscml at gmail.com Fri Apr 24 20:45:21 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Fri, 24 Apr 2015 21:45:21 +0300 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <553A7D01.6010100@gmail.com> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com> Message-ID: <20150424214521.6fd74baf@x230> Hello, On Fri, 24 Apr 2015 18:27:29 +0100 Ronan Lamy wrote: > >>>> Also ask why no one used type specifier, they are possible since > >>>> Python 3.0 ? > >>>> Because it is the wrong way for Python. > >>> > >>> That's an example of how perceptions differ. In my list, > >>> everyone(*) uses them - MyPy, MicroPython, etc. Even more should > >>> use them (any JIT module, which are many), but sit in the bushes, > >>> waiting for a kick, like PEP484 provides. > >> > >> It's OK that type hints are only to assist the programmer. > > > > Yes, it's OK to have a situation where type hints assist only a > > programmer. It's not OK to think that type hints may be useful only > > for programmer, instead of bunch more purposes, several of which > > were already shown in the long previous discussion. > > > >> PyPy's FAQ > >> has an explanation of why type hints are not for performance. > >> http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance > > > > You probably intended to write "why type hints are not for *PyPy's* > > performance". There're many other language implementations and > > modules for which it may be useful, please don't limit your > > imagination by a single case. > > Those points apply to basically any compliant implementation of > Python relying on speculative optimisation. Python is simply too > dynamic for PEP484-style hints to provide any useful performance > improvement targets. What's your point - saying that type annotations alone not enough to achieve the best ("C-like") performance, which is true, or saying that if they are alone not enough, then they are not needed at all, which is ... strange ? > > And speaking of PyPy, it really should think how to improve its > > performance - not of generated programs, but of generation itself. > > If compilation of a trivial program on a pumpy hardware takes 5 > > minutes and gigabytes of RAM and diskspace, few people will use it > > for other purposes beyond curiosity. There's something very > > un-Pythonic in waiting 5 mins just to run 10-line script. Type > > hints can help here too ;-) (by not wasting resources propagating > > types thru the same old standard library for example). > > Sorry, but that's nonsense. PyPy would be a seriously useless > interpreter if running a 10-line script required such a lengthy > compilation, so, obviously, that's not what happens. > > You seem to misunderstand what PyPy is: it's an interpreter with a > just-in-time compiler, not a static compiler. It doesn't generate > programs in any meaningful sense. Instead, it interprets the program, > and when it detects a hot code path, it compiles it to machine code > based on the precise types it sees. No resources are wasted on code > that isn't actually executed. Regardless of whether I understood that meta-meta stuff, I just followed couple of tutorials, each of them warning of memory and disk space issues, and both running long to get results. 
Everyone else following tutorials will get the same message I did - PyPy is a slow-to-work-with bloat. As for uber-meta stuff PyPy offers - I'm glad that's all done in my favorite language, leaving all other languages behind. I'm saddened there's no mundane JIT or static compiler usable and accepted by all community - many other languages have that. This all goes pretty offtopic wrt to the original discussion, so again, what's your point - you say that all these things can't be done in Python, or there's no need for it to be done? That people should look somewhere else? I submitted a bug to jinja2 project and posted message on its mailing list - I didn't get reply for 3 months. Why? Because its maintainer went hacking another language, how was it called, alGOl, or something. You want me and other folks to go too? Sorry, I'm staying so far, and keep dreaming of better Python's future (where for example if I need to get more performance from existing app, I can gradually optimize it based on need, not rewrite it in another language or be hitting "not implemented" in uber-meta stuff). -- Best regards, Paul mailto:pmiscml at gmail.com From lukasz at langa.pl Fri Apr 24 21:04:27 2015 From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=) Date: Fri, 24 Apr 2015 12:04:27 -0700 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150424093251.2461871d@limelight.wooz.org> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <20150424131754.GW5663@ando.pearwood.info> <20150424093251.2461871d@limelight.wooz.org> Message-ID: > On Apr 24, 2015, at 6:32 AM, Barry Warsaw wrote: > > On Apr 24, 2015, at 11:17 PM, Steven D'Aprano wrote: > >> It seems to me that tools that search for r"^\s*def\s+spam\s*\(" are > > They would likely search for something like r"^\s*def\s+[a-zA-Z0-9_]+" which > will hit "def async spam" but not "async def?. Realistically that can?t be what they?re doing because of multiple string literals, internal-scope functions, etc. But I agree with Steven that guessing here is pointless. More importantly, consider: - if we optimize for some unproven backwards compatibility with tools, we?re sacrificing better readability of ?async def foo()? - if that tool wants to work with Python 3.5, it?ll still have to support ?await? so we?re going to be incompatible anyway; let alone ?async for? and ?async with? So all in all, I don?t buy this argument. -- Best regards, ?ukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev From pmiscml at gmail.com Fri Apr 24 21:11:56 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Fri, 24 Apr 2015 22:11:56 +0300 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <20150424131754.GW5663@ando.pearwood.info> <20150424093251.2461871d@limelight.wooz.org> Message-ID: <20150424221156.59138199@x230> Hello, On Fri, 24 Apr 2015 12:04:27 -0700 ?ukasz Langa wrote: [] > > > > They would likely search for something like > > r"^\s*def\s+[a-zA-Z0-9_]+" which will hit "def async spam" but not > > "async def?. > > Realistically that can?t be what they?re doing because of multiple > string literals, internal-scope functions, etc. But I agree with > Steven that guessing here is pointless. More importantly, consider: > > - if we optimize for some unproven backwards compatibility with > tools, we?re sacrificing better readability of ?async def foo()? 
Yes, so hopefully another argument prevails: the sooner they break, the
sooner they're fixed (no irony here, I really consider it strange to
optimize language syntax based on background auxiliary utilities'
features or misfeatures).

[]

--
Best regards,
 Paul                          mailto:pmiscml at gmail.com

From yselivanov.ml at gmail.com  Fri Apr 24 23:36:49 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 24 Apr 2015 17:36:49 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To:
References:
Message-ID: <553AB771.6090906@gmail.com>

Victor,

On 2015-04-24 5:32 PM, Victor Stinner wrote:
>> 7. compatibility with asyncio and existing users of it
> The current state of the PEP makes types.coroutine() mandatory. If a
> generator-based coroutine is not modified with types.coroutine, await
> cannot be used on it. To be more concrete: asyncio coroutines not
> declared with @asyncio.coroutine cannot be used with await.
>
> Would it be crazy to allow waiting on a generator-based coroutine
> (current asyncio coroutines) without having to call types.coroutine()
> on it?

I'd be big -1 on that. The current PEP design is all about strictly
prohibiting users from calling regular generators with the 'await'
expression. And if a generator isn't decorated with @coroutine -- then
it's a regular generator for us.

>
> Maybe I just missed the purpose of disallowing this.
>
> It's also possible to modify asyncio to detect at runtime when an
> asyncio coroutine is not decorated by @asyncio.coroutine (emit a
> warning or even raise an exception).

I'd be +1 to add a warning to Task and other places where we accept
generator-based coroutines.

Thanks!
Yury

From lukasz at langa.pl  Fri Apr 24 23:37:04 2015
From: lukasz at langa.pl (Łukasz Langa)
Date: Fri, 24 Apr 2015 14:37:04 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To:
References:
Message-ID: <8B9C4FD4-FB2D-460E-94AF-F4EDB4CF92EF@langa.pl>

> On Apr 24, 2015, at 10:03 AM, Guido van Rossum wrote:
>
> 1. precise syntax of `async def`
>
> So I still prefer `async def`.

Me too. Also, we would use a similar vocabulary to existing users of
the feature. This is exactly how Hack does it:
http://docs.hhvm.com/manual/en/hack.async.php , how ECMAScript 7
proposes it:
http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
and similarly to how C# does it (async comes after the public/private
modifier but before the return type):
https://msdn.microsoft.com/en-us/library/hh156513.aspx

> 2. do we need `async for` and `async with`
>
> Yes we do.

+1

> (Though maybe we should consider `await for` and `await with`? That
> would have the advantage of making it easy to scan for all suspension
> points by searching for /await/. But being a verb it doesn't read very
> well.)

I'm on the fence here.

OT1H, I think "await for something in a_container" and "await with
a_context_manager():" also read pretty well. It's also more consistent
with being the one way of finding "yield points".

OTOH, "await with a_context_manager():" suggests the entire statement
is awaited on, which is not true. "async with" is more opaque in this
way, it simply states "there are going to be implementation-specific
awaits inside". So it's more consistent with "async def foo()".

All in all I think I'm leaning towards "async for" and "async with".

More importantly though, I'm wondering how obvious the failure mode
will be when somebody uses a bare "for" instead of an "async for".
Ditto for "with" vs. "async with".
How much debugging will be necessary to find that it's only a missing
"async" before the loop?

Side note: to add to the confusion about syntax, Hack's equivalent for
"async for" - which doesn't really translate well to Python - uses
"await" and ties it to the iterable:

  foreach ($list await as $input) {
    ...
  }

The equivalent in Python would be:

  for input in list await:

Notably, this is ugly and would be confused with `for input in await
list:` which means something different. Also, this particular construct
represents less than 0.01% of all "await" occurrences in Facebook code,
suggesting it's not performance critical.

> 3. syntactic priority of `await`
>
> Yury, could you tweak the syntax for `await` so that we can write the
> most common usages without parentheses?

+1

Yury points out there was likely a reason this wasn't the case for
`yield` in the first place. It would be good to revisit that. Maybe for
yield itself, too?

> 4. `cocall` vs. `await`
>
> I just can't get used to this aspect of PEP 3152, so I'm rejecting it.

+1

> (Yury: PEP 492 is not accepted yet, but you're getting closer.)

May I suggest using the bat-signal to summon Glyph to confirm this is
going to be helpful/usable with Twisted as well?

> 5. do we really need `__aiter__` and friends
>
> There's a lot of added complexity, but I think it's worth it. I don't
> think we need to make the names longer, the 'a' prefix is fine for
> these methods.

+1

> 6. StopAsyncException
>
> I'm not sure about this. The motivation given in the PEP seems to
> focus on the need for `__anext__` to be async. But is this really the
> right pattern? What if we required `ait.__anext__()` to return a
> future, which can either raise good old `StopIteration` or return the
> next value from the iteration when awaited? I'm wondering if there are
> a few alternatives to be explored around the async iterator protocol
> still.

So are you suggesting to pass the returned value in a future? In this
case the future would need to be passed to __anext__, so the Cursor
example from the PEP would look like this:

  class Cursor:
      def __init__(self):
          self.buffer = collections.deque()

      def _prefetch(self):
          ...

      async def __aiter__(self):
          return self

      async def __anext__(self, fut):
          if not self.buffer:
              self.buffer = await self._prefetch()
          if self.buffer:
              fut.set_result(self.buffer.popleft())
          else:
              fut.set_exception(StopIteration)

While this is elegant, my concern is that one-future-per-iteration-step
might be bad for performance.

Maybe consider the following. The `async def` syntax decouples the
concept of a coroutine from the implementation. While it's still based
on generators under the hood, the user no longer considers his "async
function" to be a generator or conforming to the generator protocol.
From the user's perspective, it's obvious that the return below means
something different than the exception:

  async def __anext__(self):
      if not self.buffer:
          self.buffer = await self._prefetch()
      if not self.buffer:
          raise StopIteration
      return self.buffer.popleft()

So, the same way we added wrapping in RuntimeErrors for generators in
PEP 479, we might add transparent wrapping in a _StopAsyncIteration for
CO_COROUTINE.

> 7. compatibility with asyncio and existing users of it

+1, this is really important.

--
Best regards,
Łukasz Langa

WWW: http://lukasz.langa.pl/
Twitter: @llanga
IRC: ambv on #python-dev

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From victor.stinner at gmail.com  Fri Apr 24 23:32:20 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Fri, 24 Apr 2015 23:32:20 +0200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To:
References:
Message-ID:

Hi,

2015-04-24 19:03 GMT+02:00 Guido van Rossum :
> 1. precise syntax of `async def`
>
> Of all the places to put `async` I still like *before* the `def` the best.

So do I.

> 2. do we need `async for` and `async with`
>
> Yes we do.

I agree.

> 3. syntactic priority of `await`
>
> Yury, could you tweak the syntax for `await` so that we can write the most
> common usages without parentheses?

IMO another point must be discussed, corner cases in the grammar and
parser of the current PEP & implementation:
https://www.python.org/dev/peps/pep-0492/#transition-period-shortcomings

And the fact that Python 3.7 will make async & await keywords without
providing a way to prepare the code for this major change in the
Python syntax.

According to the PEP, patching the parser to detect async & await as
keywords is almost a hack, and there are corner cases where it doesn't
work "as expected".

That's why I suggest to reconsider the idea of supporting an
*optional* "from __future__ import async" to get async and await as
keywords in the current file. This import would allow all crazy
syntax. The parser might suggest to use the import when it fails to
parse an async or await keyword :-)

If you don't like the compromise of a parser with corner cases and an
optional __future__ to "work around" these cases, I'm also ok to
reproduce what we did with the introduction of the with keyword. I
mean not supporting async nor await by default, and making __future__
mandatory to get the new feature.

=> 0 risk of backward compatibility issues
=> no more crazy hacks in the parser

> 4. `cocall` vs. `await`
>
> The asyncio library is getting plenty of adoption and it has the concept of
> separating the *getting* of a future[1] from *waiting* for it.

My rationale in my other email was similar (ability to get a function
without calling it, as you wrote, like bound methods), so obviously I
agree with it :-)

I accept the compromise of creating a coroutine object without waiting
for it (an obvious and common bug when learning asyncio). Hopefully, we
keep the coroutine wrapper feature (ok, maybe I suggested this idea to
Yury because I suffered so much when I learnt how to use asyncio ;-)),
so it will still be easy to emit a warning in debug mode.

On Stackoverflow, when someone posts code with a bug, I'm now replying
"please rerun your code with asyncio debug mode enabled" ;-) I also
mentioned it at the *beginning* of the asyncio doc (it's documented at
the end of the asyncio doc!).

> 5. do we really need `__aiter__` and friends
>
> There's a lot of added complexity, but I think it's worth it.

I agree. I don't see how to keep the PEP consistent without having new
dedicated protocols.

> 6. StopAsyncException

(Sorry, I have no opinion on this point.)

> 7. compatibility with asyncio and existing users of it

The current state of the PEP makes types.coroutine() mandatory. If a
generator-based coroutine is not modified with types.coroutine, await
cannot be used on it. To be more concrete: asyncio coroutines not
declared with @asyncio.coroutine cannot be used with await.

Would it be crazy to allow waiting on a generator-based coroutine
(current asyncio coroutines) without having to call types.coroutine()
on it?

Maybe I just missed the purpose of disallowing this.
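For concreteness, a minimal sketch of the rule Victor describes,
assuming the `types.coroutine` decorator proposed by the PEP (the
function names are made up for illustration):

```
import types

@types.coroutine
def legacy_sleep():
    # A pre-3.5, generator-based coroutine in the asyncio style.
    yield

def plain_gen():
    yield

async def user_code():
    await legacy_sleep()   # OK: marked via types.coroutine
    await plain_gen()      # TypeError under the PEP: unmarked generator
```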
It's also possible to modify asyncio to detect at runtime when an asyncio coroutine is not decorated by @asyncio.coroutine (emit a warning or even raise an exception). Victor From yselivanov.ml at gmail.com Fri Apr 24 23:51:30 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 24 Apr 2015 17:51:30 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <8B9C4FD4-FB2D-460E-94AF-F4EDB4CF92EF@langa.pl> References: <8B9C4FD4-FB2D-460E-94AF-F4EDB4CF92EF@langa.pl> Message-ID: <553ABAE2.8070804@gmail.com> Lukasz, On 2015-04-24 5:37 PM, ?ukasz Langa wrote: > >> (Though maybe we should consider `await for` and `await with`? That would have the advantage of making it easy to scan for all suspension points by searching for /await/. But being a verb it doesn't read very well.) > I?m on the fence here. > > OT1H, I think ?await for something in a_container? and ?await with a_context_manager():? also read pretty well. It?s also more consistent with being the one way of finding ?yield points?. > > OTOH, ?await with a_context_manager():? suggests the entire statement is awaited on, which is not true. ?async with? is more opaque in this way, it simply states ?there are going to be implementation-specific awaits inside?. So it?s more consistent with ?async def foo()?. This. I also think of 'await with' as I'm awaiting on the whole statement. And I don't even know how to interpret that. >> 6. StopAsyncException >> >> I'm not sure about this. The motivation given in the PEP seems to focus on the need for `__anext__` to be async. But is this really the right pattern? What if we required `ait.__anext__()` to return a future, which can either raise good old `StopIteration` or return the next value from the iteration when awaited? I'm wondering if there are a few alternatives to be explored around the async iterator protocol still. > So are you suggesting to pass the returned value in a future? In this case the future would need to be passed to __anext__, so the Cursor example from the PEP would look like this: > > class Cursor: > def __init__(self): > self.buffer = collections.deque() > > def _prefetch(self): > ... > > async def __aiter__(self): > return self > > async def __anext__(self, fut): > if not self.buffer: > self.buffer = await self._prefetch() > if self.buffer: > fut.set_result(self.buffer.popleft()) > else: > fut.set_exception(StopIteration) > > While this is elegant, my concern is that one-future-per-iteration-step might be bad for performance. > > Maybe consider the following. The `async def` syntax decouples the concept of a coroutine from the implementation. While it?s still based on generators under the hood, the user no longer considers his ?async function? to be a generator or conforming to the generator protocol. From the user?s perpective, it?s obvious that the return below means something different than the exception: > > async def __anext__(self): > if not self.buffer: > self.buffer = await self._prefetch() > if not self.buffer: > raise StopIteration > return self.buffer.popleft() > > So, the same way we added wrapping in RuntimeErrors for generators in PEP 479, we might add transparent wrapping in a _StopAsyncIteration for CO_COROUTINE. FWIW I have to experiment more with the reference implementation, but at the moment I'm big -1 on touching StopIteration for coroutines. It's used for too many things. Thanks! 
Yury From lukasz at langa.pl Sat Apr 25 00:17:44 2015 From: lukasz at langa.pl (=?utf-8?Q?=C5=81ukasz_Langa?=) Date: Fri, 24 Apr 2015 15:17:44 -0700 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: Message-ID: > On Apr 24, 2015, at 10:03 AM, Guido van Rossum wrote: > > 6. StopAsyncException > > What if we required `ait.__anext__()` to return a future? On top of my previous response, one more thing to consider is that this idea brings a builtin Future back to the proposal, which has already been rejected in the "No implicit wrapping in Futures? section of the PEP. PEP 492 manages to solve all issues without introducing a built-in Future. -- Best regards, ?ukasz Langa WWW: http://lukasz.langa.pl/ Twitter: @llanga IRC: ambv on #python-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sat Apr 25 00:23:07 2015 From: guido at python.org (Guido van Rossum) Date: Fri, 24 Apr 2015 15:23:07 -0700 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: Message-ID: Sorry, when I wrote "future" (lower-case 'f') I really meant what Yury calls *awaitable*. That's either a coroutine or something with an __await__ emthod. On Fri, Apr 24, 2015 at 3:17 PM, ?ukasz Langa wrote: > > On Apr 24, 2015, at 10:03 AM, Guido van Rossum wrote: > > *6. StopAsyncException* > > What if we required `ait.__anext__()` to return a future? > > > On top of my previous response, one more thing to consider is that this > idea brings a builtin Future back to the proposal, which has already been > rejected in the "No implicit wrapping in Futures? section of the PEP. > > PEP 492 manages to solve all issues without introducing a built-in Future. > > -- > Best regards, > ?ukasz Langa > > WWW: http://lukasz.langa.pl/ > Twitter: @llanga > IRC: ambv on #python-dev > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From ronan.lamy at gmail.com Sat Apr 25 03:05:15 2015 From: ronan.lamy at gmail.com (Ronan Lamy) Date: Sat, 25 Apr 2015 02:05:15 +0100 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <20150424214521.6fd74baf@x230> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com> <20150424214521.6fd74baf@x230> Message-ID: <553AE84B.1090602@gmail.com> Le 24/04/15 19:45, Paul Sokolovsky a ?crit : > Hello, > > On Fri, 24 Apr 2015 18:27:29 +0100 > Ronan Lamy wrote: > >>>> PyPy's FAQ >>>> has an explanation of why type hints are not for performance. >>>> http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance >>> >>> You probably intended to write "why type hints are not for *PyPy's* >>> performance". There're many other language implementations and >>> modules for which it may be useful, please don't limit your >>> imagination by a single case. >> >> Those points apply to basically any compliant implementation of >> Python relying on speculative optimisation. Python is simply too >> dynamic for PEP484-style hints to provide any useful performance >> improvement targets. > > What's your point - saying that type annotations alone not enough to > achieve the best ("C-like") performance, which is true, or saying that > if they are alone not enough, then they are not needed at all, which > is ... strange ? 
My point is that the arguments in the PyPy FAQ aren't actually specific to PyPy, and therefore that the conclusion, that hints are almost entirely useless if you?re looking at performance, holds in general. So let me restate these arguments in terms of a generic, performance-minded implementation of the full Python language spec: * Hints have no run-time effect. The interpreter cannot assume that they are obeyed. * PEP484 hints are too high-level. Replacing an 'int' object with a single machine word would be useful, but an 'int' annotation gives no guarantee that it's correct (because Python 3 ints can have arbitrary size and because subclasses of 'int' can override any operation to invoke arbitrary code). * A lot more information is needed to produce good code (e.g. ?this f() called here really means this function there, and will never be monkey-patched? ? same with len() or list(), btw). * Most of this information cannot easily be expressed as a type * If the interpreter gathers all that information, it'll probably have gathered a superset of what PEP484 can provide anyway. >>> And speaking of PyPy, it really should think how to improve its >>> performance - not of generated programs, but of generation itself. >>> If compilation of a trivial program on a pumpy hardware takes 5 >>> minutes and gigabytes of RAM and diskspace, few people will use it >>> for other purposes beyond curiosity. There's something very >>> un-Pythonic in waiting 5 mins just to run 10-line script. Type >>> hints can help here too ;-) (by not wasting resources propagating >>> types thru the same old standard library for example). >> >> Sorry, but that's nonsense. PyPy would be a seriously useless >> interpreter if running a 10-line script required such a lengthy >> compilation, so, obviously, that's not what happens. >> >> You seem to misunderstand what PyPy is: it's an interpreter with a >> just-in-time compiler, not a static compiler. It doesn't generate >> programs in any meaningful sense. Instead, it interprets the program, >> and when it detects a hot code path, it compiles it to machine code >> based on the precise types it sees. No resources are wasted on code >> that isn't actually executed. > > Regardless of whether I understood that meta-meta stuff, I just > followed couple of tutorials, each of them warning of memory and disk > space issues, and both running long to get results. Everyone else > following tutorials will get the same message I did - PyPy is a > slow-to-work-with bloat. Ah, I suppose you're talking about the RPython tool chain, which is used to build PyPy. Though it's an interesting topic in itself (and is pretty much comparable to Cython wrt. type hints), it has about as much relevance to PyPy users as the inner workings of GCC have to CPython users. Well, the thing is that people don't seem to want to write PyPy tutorials, because it's boring. However, I can give you the definitive 3-line version: 1. Download and install PyPy [http://pypy.org/download.html] 2. Launch the 'pypy' executable. 3. Go read https://docs.python.org/2/tutorial/ > As for uber-meta stuff PyPy offers - I'm glad that's all done in > my favorite language, leaving all other languages behind. I'm saddened > there's no mundane JIT or static compiler usable and accepted by all > community - many other languages have that. > > This all goes pretty offtopic wrt to the original discussion, so again, > what's your point - you say that all these things can't be done in > Python, or there's no need for it to be done? 
That people should look > somewhere else? I submitted a bug to jinja2 project and posted message > on its mailing list - I didn't get reply for 3 months. Why? Because its > maintainer went hacking another language, how was it called, alGOl, or > something. You want me and other folks to go too? Sorry, I'm staying so > far, and keep dreaming of better Python's future (where for example if > I need to get more performance from existing app, I can gradually > optimize it based on need, not rewrite it in another language or be > hitting "not implemented" in uber-meta stuff). "If you want your code magically to run faster, you should probably just use PyPy" - Guido van Rossum - https://www.youtube.com/watch?feature=player_detailpage&v=2wDvzy6Hgxg#t=1010 From kmod at dropbox.com Sat Apr 25 04:15:42 2015 From: kmod at dropbox.com (Kevin Modzelewski) Date: Fri, 24 Apr 2015 19:15:42 -0700 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <553AE84B.1090602@gmail.com> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com> <20150424214521.6fd74baf@x230> <553AE84B.1090602@gmail.com> Message-ID: On Fri, Apr 24, 2015 at 6:05 PM, Ronan Lamy wrote: > Le 24/04/15 19:45, Paul Sokolovsky a ?crit : > >> Hello, >> >> On Fri, 24 Apr 2015 18:27:29 +0100 >> Ronan Lamy wrote: >> >> PyPy's FAQ >>>>> has an explanation of why type hints are not for performance. >>>>> >>>>> http://pypy.readthedocs.org/en/latest/faq.html#would-type-annotations-help-pypy-s-performance >>>>> >>>> >>>> You probably intended to write "why type hints are not for *PyPy's* >>>> performance". There're many other language implementations and >>>> modules for which it may be useful, please don't limit your >>>> imagination by a single case. >>>> >>> >>> Those points apply to basically any compliant implementation of >>> Python relying on speculative optimisation. Python is simply too >>> dynamic for PEP484-style hints to provide any useful performance >>> improvement targets. >>> >> >> What's your point - saying that type annotations alone not enough to >> achieve the best ("C-like") performance, which is true, or saying that >> if they are alone not enough, then they are not needed at all, which >> is ... strange ? >> > > My point is that the arguments in the PyPy FAQ aren't actually specific to > PyPy, and therefore that the conclusion, that hints are almost entirely > useless if you?re looking at performance, holds in general. > So let me restate these arguments in terms of a generic, > performance-minded implementation of the full Python language spec: > > * Hints have no run-time effect. The interpreter cannot assume that they > are obeyed. > * PEP484 hints are too high-level. Replacing an 'int' object with a single > machine word would be useful, but an 'int' annotation gives no guarantee > that it's correct (because Python 3 ints can have arbitrary size and > because subclasses of 'int' can override any operation to invoke arbitrary > code). > * A lot more information is needed to produce good code (e.g. ?this f() > called here really means this function there, and will never be > monkey-patched? ? same with len() or list(), btw). > * Most of this information cannot easily be expressed as a type > * If the interpreter gathers all that information, it'll probably have > gathered a superset of what PEP484 can provide anyway. I'm with the PyPy folks here -- I don't see any use for PEP 484 type hints from a code generation perspective. 
Even if the hints were guaranteed to be correct, the PEP 484 type system doesn't follow substitutability. I don't mean that as a critique, I think it's a decision that makes it more useful by keeping it in line with the majority of type usage in Python, but it means that even if the hints are correct they don't really end up providing any guarantees to the JIT. > > > And speaking of PyPy, it really should think how to improve its >>>> performance - not of generated programs, but of generation itself. >>>> If compilation of a trivial program on a pumpy hardware takes 5 >>>> minutes and gigabytes of RAM and diskspace, few people will use it >>>> for other purposes beyond curiosity. There's something very >>>> un-Pythonic in waiting 5 mins just to run 10-line script. Type >>>> hints can help here too ;-) (by not wasting resources propagating >>>> types thru the same old standard library for example). >>>> >>> >>> Sorry, but that's nonsense. PyPy would be a seriously useless >>> interpreter if running a 10-line script required such a lengthy >>> compilation, so, obviously, that's not what happens. >>> >>> You seem to misunderstand what PyPy is: it's an interpreter with a >>> just-in-time compiler, not a static compiler. It doesn't generate >>> programs in any meaningful sense. Instead, it interprets the program, >>> and when it detects a hot code path, it compiles it to machine code >>> based on the precise types it sees. No resources are wasted on code >>> that isn't actually executed. >>> >> >> Regardless of whether I understood that meta-meta stuff, I just >> followed couple of tutorials, each of them warning of memory and disk >> space issues, and both running long to get results. Everyone else >> following tutorials will get the same message I did - PyPy is a >> slow-to-work-with bloat. >> > > Ah, I suppose you're talking about the RPython tool chain, which is used > to build PyPy. Though it's an interesting topic in itself (and is pretty > much comparable to Cython wrt. type hints), it has about as much relevance > to PyPy users as the inner workings of GCC have to CPython users. > > Well, the thing is that people don't seem to want to write PyPy tutorials, > because it's boring. However, I can give you the definitive 3-line version: > 1. Download and install PyPy [http://pypy.org/download.html] > 2. Launch the 'pypy' executable. > 3. Go read https://docs.python.org/2/tutorial/ > > As for uber-meta stuff PyPy offers - I'm glad that's all done in >> my favorite language, leaving all other languages behind. I'm saddened >> there's no mundane JIT or static compiler usable and accepted by all >> community - many other languages have that. >> >> This all goes pretty offtopic wrt to the original discussion, so again, >> what's your point - you say that all these things can't be done in >> Python, or there's no need for it to be done? That people should look >> somewhere else? I submitted a bug to jinja2 project and posted message >> on its mailing list - I didn't get reply for 3 months. Why? Because its >> maintainer went hacking another language, how was it called, alGOl, or >> something. You want me and other folks to go too? Sorry, I'm staying so >> far, and keep dreaming of better Python's future (where for example if >> I need to get more performance from existing app, I can gradually >> optimize it based on need, not rewrite it in another language or be >> hitting "not implemented" in uber-meta stuff). 
>> > > "If you want your code magically to run faster, you should probably just > use PyPy" - Guido van Rossum - > https://www.youtube.com/watch?feature=player_detailpage&v=2wDvzy6Hgxg#t=1010 > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/kmod%40dropbox.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sat Apr 25 05:07:06 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 25 Apr 2015 13:07:06 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <553AE84B.1090602@gmail.com> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com> <20150424214521.6fd74baf@x230> <553AE84B.1090602@gmail.com> Message-ID: <20150425030706.GE5663@ando.pearwood.info> On Sat, Apr 25, 2015 at 02:05:15AM +0100, Ronan Lamy wrote: > * Hints have no run-time effect. The interpreter cannot assume that they > are obeyed. I know what you mean, but just for the record, annotations are runtime inspectable, so people can (and probably have already started) to write runtime argument checking decorators or frameworks which rely on the type hints. > * PEP484 hints are too high-level. Replacing an 'int' object with a > single machine word would be useful, but an 'int' annotation gives no > guarantee that it's correct (because Python 3 ints can have arbitrary > size and because subclasses of 'int' can override any operation to > invoke arbitrary code). Then create your own int16, uint64 etc types. > * A lot more information is needed to produce good code (e.g. ?this f() > called here really means this function there, and will never be > monkey-patched? ? same with len() or list(), btw). > * Most of this information cannot easily be expressed as a type > * If the interpreter gathers all that information, it'll probably have > gathered a superset of what PEP484 can provide anyway. All this is a red herring. If type hints are useful to PyPy, that's a bonus. Cython uses its own system of type hints, a future version may be able to use PEP 484 hints instead. But any performance benefit is a bonus. PEP 484 is for increasing correctness, not speed. -- Steve From ncoghlan at gmail.com Sat Apr 25 06:31:12 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 25 Apr 2015 14:31:12 +1000 Subject: [Python-Dev] How to behave regarding commiting In-Reply-To: References: Message-ID: On 17 April 2015 at 11:41, Berker Peksa? wrote: > On Fri, Apr 17, 2015 at 4:32 AM, Facundo Batista > wrote: >> Hola! >> >> I'm asking this because quite some time passed since I was active in >> the development of our beloved language. >> >> I'm trying to not break any new rule not known by me. >> >> I opened a bug recently [0], somebody else proposed a patch that I >> like. However, that patch has no test. I will do a test for that code, >> but then what? >> >> Shall I just commit and push? Or the whole branch should be proposed >> for further reviewing? > > Hi, > > Since writing a test for that patch is simple, I'd just commit it to > the default branch. (Catching up on several days of python-dev email) The ideal case is getting pre-commit reviews, but as a matter of practicality, we each get to decide whether or not we're OK with just pushing a particular change and relying on the buildbots and post-commit review. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat Apr 25 06:34:34 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 25 Apr 2015 14:34:34 +1000 Subject: [Python-Dev] Changing PyModuleDef.m_reload to m_slots In-Reply-To: References: <55311038.5080200@gmail.com> Message-ID: On 18 April 2015 at 15:58, Stefan Behnel wrote: > Petr Viktorin schrieb am 17.04.2015 um 15:52: >> As a background, the PyModuleDef structure [1] is currently: >> >> struct PyModuleDef{ >> PyModuleDef_Base m_base; >> const char* m_name; >> const char* m_doc; >> Py_ssize_t m_size; >> PyMethodDef *m_methods; >> inquiry m_reload; >> traverseproc m_traverse; >> inquiry m_clear; >> freefunc m_free; >> }; >> >> ... where the m_reload pointer is unused, and must be NULL. >> My proposal is to repurpose this pointer to hold an array of slots, in the >> style of PEP 384's PyType_Spec [2], which would allow adding extensions -- >> both those needed for PEP 489 and future ones. > > FWIW, I'm +1 on this. It replaces a struct field that risks staying unused > basically forever with an extensible interface that massively improves the > current extension module protocol and allows future extensions (including > getting back the pointer we replaced). > > The alternative of essentially duplicating all sorts of things to > accommodate for a new metadata struct is way too cumbersome and ugly in > comparison. Sorry for the delayed reply (post-PyCon travel), but +1 from me as well for the reasons Stefan gives. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From greg.ewing at canterbury.ac.nz Sat Apr 25 06:39:37 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Sat, 25 Apr 2015 16:39:37 +1200 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: Message-ID: <553B1A89.8060209@canterbury.ac.nz> Guido van Rossum wrote: > Yury, could you tweak the syntax for `await` so that we can write the > most common usages without parentheses? In particular I'd like to be > able to write > ``` > return await foo() > with await foo() as bar: ... > foo(await bar(), await bletch()) > ``` Making 'await' a prefix operator with the same precedence as unary minus would allow most reasonable usages, I think. The only reason "yield from" has such a constrained syntax is that it starts with "yield", which is similarly constrained. Since 'await' is a brand new keyword isn't bound by those constraints. -- Greg From ncoghlan at gmail.com Sat Apr 25 06:44:50 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 25 Apr 2015 14:44:50 +1000 Subject: [Python-Dev] Surely "nullable" is a reasonable name? In-Reply-To: <5536897B.80006@hastings.org> References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de> <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org> <5536897B.80006@hastings.org> Message-ID: On 22 April 2015 at 03:31, Larry Hastings wrote: > > On 04/21/2015 04:50 AM, Tal Einat wrote: > > As for the default set of accepted types for various convertors, if we > could choose any syntax we liked, something like "accept=+{NoneType}" > would be much better IMO. > > > In theory Argument Clinic could use any syntax it likes. In practice, under > the covers we tease out one or two bits of non-Python syntax, then run > ast.parse over it. Saved us a lot of work. > > "s: accept={str,NoneType}" is a legal Python parameter declaration; "s: > accept+={NoneType}" is not. 
If I could figure out a clean way to hack in > support for += I'll support it. Otherwise you'll be forced to spell it out. Ellipsis seems potentially useful here to mean "whatever the default accepted types are": "s: accept={...,NoneType}" My other question would be whether we can use "None" in preference to NoneType, as PEP 484 does: https://www.python.org/dev/peps/pep-0484/#using-none Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat Apr 25 06:45:49 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 25 Apr 2015 14:45:49 +1000 Subject: [Python-Dev] Surely "nullable" is a reasonable name? In-Reply-To: References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de> <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org> <5536897B.80006@hastings.org> Message-ID: On 25 April 2015 at 14:44, Nick Coghlan wrote: > On 22 April 2015 at 03:31, Larry Hastings wrote: >> >> On 04/21/2015 04:50 AM, Tal Einat wrote: >> >> As for the default set of accepted types for various convertors, if we >> could choose any syntax we liked, something like "accept=+{NoneType}" >> would be much better IMO. >> >> >> In theory Argument Clinic could use any syntax it likes. In practice, under >> the covers we tease out one or two bits of non-Python syntax, then run >> ast.parse over it. Saved us a lot of work. >> >> "s: accept={str,NoneType}" is a legal Python parameter declaration; "s: >> accept+={NoneType}" is not. If I could figure out a clean way to hack in >> support for += I'll support it. Otherwise you'll be forced to spell it out. > > Ellipsis seems potentially useful here to mean "whatever the default > accepted types are": "s: accept={...,NoneType}" Ah, I misread Tal's suggestion. Using unary + is an even neater approach. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From greg.ewing at canterbury.ac.nz Sat Apr 25 06:47:33 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Sat, 25 Apr 2015 16:47:33 +1200 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <20150424183040.GA5663@ando.pearwood.info> References: <55368858.4010007@gmail.com> <20150423135152.36195a1e@limelight.wooz.org> <20150424131754.GW5663@ando.pearwood.info> <20150424093251.2461871d@limelight.wooz.org> <20150424183040.GA5663@ando.pearwood.info> Message-ID: <553B1C65.3050204@canterbury.ac.nz> Wild idea: Let "@" mean "async" when it's directly in front of a keyword. Then we would have: @def f(): ... @for x in iter: ... @with context as thing: ... -- Greg From greg.ewing at canterbury.ac.nz Sat Apr 25 07:02:25 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Sat, 25 Apr 2015 17:02:25 +1200 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: Message-ID: <553B1FE1.3050301@canterbury.ac.nz> Victor Stinner wrote: > That's why I suggest to reconsider the idea of supporting an > *optional* "from __future__ import async" to get async and await as > keywords in the current file. This import would allow all crazy > syntax. The parser might suggest to use the import when it fails to > parse an async or await keyword :-) To me, these new features *obviously* should require a __future__ import. Anything else would be crazy. > I accept the compromise of creating a coroutine object without wait > for it (obvious and common bug when learning asyncio). 
Hopefully, we
> keep the coroutine wrapper feature (ok, maybe I suggested this idea to
> Yury because I suffered so much when I learnt how to use asyncio ;-)),
> so it will still be easy to emit a warning in debug mode.

I'm disappointed that there will *still* be no direct and
reliable way to detect and clearly report this kind of
error, and that what there is will only be active in
a special debug mode.

-- 
Greg

From ncoghlan at gmail.com Sat Apr 25 07:24:49 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 25 Apr 2015 15:24:49 +1000
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <553682E4.2020306@willingconsulting.com>
References: <55362CCF.3020809@btinternet.com>
 <20150421124720.GK5663@ando.pearwood.info>
 <20150421150827.3c2a0af6@fsol> <20150421165042.34f935ef@x230>
 <55365AA7.3060500@python.org> <20150421182750.3efcb597@x230>
 <20150421161701.E06C0B20095@webabinitio.net>
 <553682E4.2020306@willingconsulting.com>
Message-ID: 

On 22 April 2015 at 03:03, Carol Willing wrote:
> 2. Clearly, great thought has been put into this PEP. If anyone has a good
> analysis of the potential impact on Python 3 adoption, please do pass along.
> I would be interested in reading the information.

I don't have hard data, but I do get to see quite a few of the
challenges that dynamically typed languages (including Python) have
scaling to large development teams. When you look at the way folks
actually *use* languages like Java and C# in practice, you find that
they're almost universally "tool mavens" (in Oliver Steele's sense:
http://blog.osteele.com/posts/2004/11/ides/) where automated tools are
taking care of most of the boilerplate for them. Find me a C#
developer, and I'll bet you they're a Visual Studio user; find me a
Java developer, and I'll bet you they're an Eclipse or IntelliJ user.

This approach actually offers a lot of benefits in putting a "skill
floor" under a development team - while you can't make people think,
you can at least automatically rule out broad categories of mundane
errors, and focus on the business logic of the problem you're aiming
to solve.

As a result, my main reaction to PEP 484 in a Python 3 adoption
context is that "Python 3 offers all the agility and flexibility of
Python 2, with all the structural assurances of Java or C#" is
actually a huge selling point for anyone in an institutional context
attempting to persuade their management chain to back a migration
effort from Python 2 to Python 3.

Another factor to consider here is that my understanding is that one
of the *reasons* folks want better structural analysis (by annotating
the Python 2 stdlib and key third-party libraries in typeshed) is to
help automate Python 2 -> Python 3 conversions in the absence of
comprehensive test coverage. While to some degree this works against
the previous point, there's a difference between having this as an
add-on vs having it as a standard feature (and unlike the network
security changes and bundling pip, this is one where I'm entirely
happy leaving it as the kind of carrot that can take pride of place
in a corporate business case).

The legitimate concerns that arise are around what happens to
*community* code bases, including the standard library itself, as
well as what folks are likely to see if they run
"inspect.getsource()" on standard library components.
For that, I think there's a lot of value in continuing to have
explicit type hints be the exception rather than the rule in the
upstream community, so the idea of the typeshed project is enormously
appealing to me. If anyone doesn't want to deal with type hints
themselves, but has a contributor that really wants to annotate their
library, then "take it to typeshed" will hopefully become a recurring
refrain :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From ncoghlan at gmail.com Sat Apr 25 08:04:10 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 25 Apr 2015 16:04:10 +1000
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <8B9C4FD4-FB2D-460E-94AF-F4EDB4CF92EF@langa.pl>
References: <8B9C4FD4-FB2D-460E-94AF-F4EDB4CF92EF@langa.pl>
Message-ID: 

On 25 April 2015 at 07:37, Łukasz Langa wrote:
> On Apr 24, 2015, at 10:03 AM, Guido van Rossum wrote:
> 3. syntactic priority of `await`
>
> Yury, could you tweak the syntax for `await` so that we can write the most
> common usages without parentheses?
>
> +1
>
> Yury points out there was likely a reason this wasn't the case for `yield`
> in the first place. It would be good to revisit that. Maybe for yield
> itself, too?

yield requires parentheses in most cases for readability purposes.
For example, does this pass one argument or two to "f"?:

    f(yield a, b)

Answer: neither, it's a syntax error, as you have to supply the
parentheses to say whether you mean "f((yield a, b))" or "f((yield
a), b)". Requiring parentheses except in a few pre-approved cases
(specifically, as a standalone statement and as the RHS of assignment
statements) eliminated all that potential ambiguity.

Allowing "return yield a, b" and "return yield from a, b" by analogy
with assignment statements would be fine, but getting more permissive
than that with yield expressions doesn't make sense due to the
ambiguity between yielding a tuple and other uses of commas as
separators in various parts of the grammar.

PEP 492's "await" is a different case, as asynchronously waiting for
a tuple doesn't make any sense. However, I'm not sure the grammar
will let you reasonably express "anything except a tuple literal" as
the subexpression after the await keyword, so it will likely still
need special case handling in the affected statements in order to
avoid "await a, b" being interpreted as waiting for a tuple.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From greg.ewing at canterbury.ac.nz Sat Apr 25 08:23:20 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 25 Apr 2015 18:23:20 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: 
References: 
Message-ID: <553B32D8.3050707@canterbury.ac.nz>

Guido van Rossum wrote:
> Sorry, when I wrote "future" (lower-case 'f') I really meant what Yury
> calls *awaitable*. That's either a coroutine or something with an
> __await__ method.

But how is an awaitable supposed to raise StopIteration
if it's implemented by a generator or async def[*] function?
Those things use StopIteration to wrap return values.

I like the idea of allowing StopIteration to be raised
in an async def function and wrapping it somehow. I'd
add that it could also be unwrapped automatically when
it emerges from 'await', so that code manually invoking
__anext__ can catch StopIteration as usual.

I don't think this could conflict with any existing
uses of StopIteration, since raising it inside generators
is currently forbidden.
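To make the wrapping concrete, here is a minimal sketch (an
illustration, not taken from the PEP) of how a generator's return
value travels inside StopIteration -- the same machinery that
generator-based coroutines rely on, and the reason a StopIteration
raised by the body itself would be indistinguishable from a normal
return:

    def coro():
        yield               # a suspension point, e.g. waiting for I/O
        return 42           # PEP 380 turns this into StopIteration(42)

    g = coro()
    g.send(None)            # run to the first suspension point
    try:
        g.send(None)        # resume; the return statement fires
    except StopIteration as exc:
        print(exc.value)    # 42 -- the result rides on the exception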
[*] I'm still struggling with what to call those things.
Calling them just "coroutines" seems far too ambiguous.
(There should be a Zen item something along the lines
of "If you can't think of a concise and unambiguous name
for it, it's probably a bad idea".)

-- 
Greg

From ncoghlan at gmail.com Sat Apr 25 08:30:33 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 25 Apr 2015 16:30:33 +1000
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553B32D8.3050707@canterbury.ac.nz>
References: <553B32D8.3050707@canterbury.ac.nz>
Message-ID: 

On 25 April 2015 at 16:23, Greg Ewing wrote:
> Guido van Rossum wrote:
>>
>> Sorry, when I wrote "future" (lower-case 'f') I really meant what Yury
>> calls *awaitable*. That's either a coroutine or something with an __await__
>> method.
>
>
> But how is an awaitable supposed to raise StopIteration
> if it's implemented by a generator or async def[*] function?
> Those things use StopIteration to wrap return values.
>
> I like the idea of allowing StopIteration to be raised
> in an async def function and wrapping it somehow. I'd
> add that it could also be unwrapped automatically when
> it emerges from 'await', so that code manually invoking
> __anext__ can catch StopIteration as usual.
>
> I don't think this could conflict with any existing
> uses of StopIteration, since raising it inside generators
> is currently forbidden.
>
> [*] I'm still struggling with what to call those things.
> Calling them just "coroutines" seems far too ambiguous.
> (There should be a Zen item something along the lines
> of "If you can't think of a concise and unambiguous name
> for it, it's probably a bad idea".)

I think "async function" is fine - "async function" is to "coroutine"
as "function" is to "callable".

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From larry at hastings.org Sat Apr 25 09:58:58 2015
From: larry at hastings.org (Larry Hastings)
Date: Sat, 25 Apr 2015 00:58:58 -0700
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: 
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
 <5536897B.80006@hastings.org>
Message-ID: <553B4942.1030403@hastings.org>

On 04/24/2015 09:45 PM, Nick Coghlan wrote:
> Ah, I misread Tal's suggestion. Using unary + is an even neater approach.

Not exactly. The way I figure it, the best way to achieve this with
unary plus is to ast.parse it (as we currently do) and then modify the
parse tree. That works but it's kind of messy.

My main objection to this notation is that set objects don't support +.
The union operator for sets is |.

I've prototyped a hack allowing

    str(accept|={NoneType})

I used the tokenize module to tokenize, modify, and untokenize the
converter invocation. Works fine. And since augmented assignment is
(otherwise) illegal in expressions, it's totally unambiguous. I think
if we do it at all it should be with that notation.


/arry
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From taleinat at gmail.com Sat Apr 25 10:06:44 2015
From: taleinat at gmail.com (Tal Einat)
Date: Sat, 25 Apr 2015 11:06:44 +0300
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <553B4942.1030403@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
 <5536897B.80006@hastings.org> <553B4942.1030403@hastings.org>
Message-ID: 

On Sat, Apr 25, 2015 at 10:58 AM, Larry Hastings wrote:
>
> On 04/24/2015 09:45 PM, Nick Coghlan wrote:
>
> Ah, I misread Tal's suggestion. Using unary + is an even neater approach.
>
>
> Not exactly. The way I figure it, the best way to achieve this with unary plus is to ast.parse it (as we currently do) and then modify the parse tree. That works but it's kind of messy.
>
> My main objection to this notation is that set objects don't support +. The union operator for sets is |.
>
> I've prototyped a hack allowing
>     str(accept|={NoneType})
> I used the tokenize module to tokenize, modify, and untokenize the converter invocation. Works fine. And since augmented assignment is (otherwise) illegal in expressions, it's totally unambiguous. I think if we do it at all it should be with that notation.

We're deep into bike-shedding territory at this point, but I prefer
Nick's suggestion of using the Ellipsis for this. It's the simplest
and most obvious syntax suggested so far.

- Tal

From ncoghlan at gmail.com Sat Apr 25 10:07:05 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 25 Apr 2015 18:07:05 +1000
Subject: [Python-Dev] Surely "nullable" is a reasonable name?
In-Reply-To: <553B4942.1030403@hastings.org>
References: <53DF326F.9030908@hastings.org> <53E0F488.1090105@v.loewis.de>
 <53E454E9.3030100@hastings.org> <55336517.6090702@hastings.org>
 <5536897B.80006@hastings.org> <553B4942.1030403@hastings.org>
Message-ID: 

On 25 April 2015 at 17:58, Larry Hastings wrote:
>
> On 04/24/2015 09:45 PM, Nick Coghlan wrote:
>
> Ah, I misread Tal's suggestion. Using unary + is an even neater approach.
>
>
> Not exactly. The way I figure it, the best way to achieve this with unary
> plus is to ast.parse it (as we currently do) and then modify the parse tree.
> That works but it's kind of messy.
>
> My main objection to this notation is that set objects don't support +.
> The union operator for sets is |.

Good point.

> I've prototyped a hack allowing
>     str(accept|={NoneType})
> I used the tokenize module to tokenize, modify, and untokenize the converter
> invocation. Works fine. And since augmented assignment is (otherwise)
> illegal in expressions, it's totally unambiguous. I think if we do it at
> all it should be with that notation.

I'd say start without it, but if it gets annoying, then we have this
in our back pocket as a potential fix.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From victor.stinner at gmail.com Sat Apr 25 10:29:19 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Sat, 25 Apr 2015 10:29:19 +0200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553B1FE1.3050301@canterbury.ac.nz>
References: 
 <553B1FE1.3050301@canterbury.ac.nz>
Message-ID: 

Hi Greg,

2015-04-25 7:02 GMT+02:00 Greg Ewing :
>> I accept the compromise of creating a coroutine object without wait
>> for it (obvious and common bug when learning asyncio). Hopefully, we
>> keep the coroutine wrapper feature (ok, maybe I suggested this idea to
>> Yury because I suffered so much when I learnt how to use asyncio ;-)),
>> so it will still be easy to emit a warning in debug mode.
>
>
> I'm disappointed that there will *still* be no direct and
> reliable way to detect and clearly report this kind of
> error, and that what there is will only be active in
> a special debug mode.

It looks like our BDFL, Guido, made a choice. This point was discussed
enough and most of us now agree on this point. There is no need to
repeat yourself multiple times; we are well aware that you strongly
disagree with this compromise ;-)

It's probably the major difference between PEP 492 and PEP 3152, and
Guido decided to reject PEP 3152.

It's now time to focus our good energy on discussing the remaining
questions on PEP 492 to make it the best PEP ever! Guido did a great
job summarizing these questions.

Thank you,
Victor

From victor.stinner at gmail.com Sat Apr 25 10:36:22 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Sat, 25 Apr 2015 10:36:22 +0200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553B32D8.3050707@canterbury.ac.nz>
References: 
 <553B32D8.3050707@canterbury.ac.nz>
Message-ID: 

2015-04-25 8:23 GMT+02:00 Greg Ewing :
> But how is an awaitable supposed to raise StopIteration
> if it's implemented by a generator or async def[*] function?
> Those things use StopIteration to wrap return values.
>
> I like the idea of allowing StopIteration to be raised
> in an async def function and wrapping it somehow. I'd
> add that it could also be unwrapped automatically when
> it emerges from 'await', so that code manually invoking
> __anext__ can catch StopIteration as usual.

Hum, wrap and then unwrap is not cheap. Exception handling in CPython
is known to be slow (it's one of the favorite topics for
microbenchmarks ;-)).

I would prefer to not make wrapping/unwrapping a feature and instead
use different exceptions. Again, I don't fully understand the problem,
so I'm not sure that my remark makes sense. But please be careful
about performance.

I already saw microbenchmarks on function call vs consuming a
generator to justify that asyncio is way too slow and must not be
used... See for example the "2. AsyncIO uses appealing, but
relatively inefficient Python paradigms" section of the following
article:
http://techspot.zzzeek.org/2015/02/15/asynchronous-python-and-databases/

The microbenchmark shows that yield from is 6x as slow as function
calls (and function calls are also known to be slow in CPython).

(I don't think that it makes sense to compare function calls versus
consuming a generator to benchmark asyncio, but other developers have
a different opinion on that.)

Victor

From andrew.svetlov at gmail.com Sat Apr 25 11:01:19 2015
From: andrew.svetlov at gmail.com (Andrew Svetlov)
Date: Sat, 25 Apr 2015 12:01:19 +0300
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <553B1C65.3050204@canterbury.ac.nz>
References: <55368858.4010007@gmail.com>
 <20150423135152.36195a1e@limelight.wooz.org>
 <20150424131754.GW5663@ando.pearwood.info>
 <20150424093251.2461871d@limelight.wooz.org>
 <20150424183040.GA5663@ando.pearwood.info>
 <553B1C65.3050204@canterbury.ac.nz>
Message-ID: 

I used to think in the same way but found the result looks like Perl
(or Haskell), not Python.

On Sat, Apr 25, 2015 at 7:47 AM, Greg Ewing wrote:
> Wild idea:
>
> Let "@" mean "async" when it's directly in front
> of a keyword.
>
> Then we would have:
>
> @def f():
>     ...
>
> @for x in iter:
>     ...
>
> @with context as thing:
>     ...
>
>
> --
> Greg
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com

-- 
Thanks,
Andrew Svetlov

From greg.ewing at canterbury.ac.nz Sat Apr 25 11:15:14 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 25 Apr 2015 21:15:14 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: 
References: 
 <553B1FE1.3050301@canterbury.ac.nz>
Message-ID: <553B5B22.7090506@canterbury.ac.nz>

Victor Stinner wrote:
> It's now time to focus our
> good energy on discussing remaining questions on the PEP 492 to make
> it the best PEP ever!

That's what I'm trying to do. I just think it would be
even better if it could be made to address that issue
somehow. I haven't thought of a way to do that yet,
though.

-- 
Greg

From ronan.lamy at gmail.com Sat Apr 25 18:01:04 2015
From: ronan.lamy at gmail.com (Ronan Lamy)
Date: Sat, 25 Apr 2015 17:01:04 +0100
Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction
In-Reply-To: <20150425030706.GE5663@ando.pearwood.info>
References: <20150423125957.631a8711@x230>
 <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com>
 <20150424214521.6fd74baf@x230> <553AE84B.1090602@gmail.com>
 <20150425030706.GE5663@ando.pearwood.info>
Message-ID: <553BBA40.5000106@gmail.com>

Le 25/04/15 04:07, Steven D'Aprano a écrit :
> On Sat, Apr 25, 2015 at 02:05:15AM +0100, Ronan Lamy wrote:
>
>> * Hints have no run-time effect. The interpreter cannot assume that they
>> are obeyed.
>
> I know what you mean, but just for the record, annotations are runtime
> inspectable, so people can (and probably have already started) to write
> runtime argument checking decorators or frameworks which rely on the
> type hints.
>
>
>> * PEP484 hints are too high-level. Replacing an 'int' object with a
>> single machine word would be useful, but an 'int' annotation gives no
>> guarantee that it's correct (because Python 3 ints can have arbitrary
>> size and because subclasses of 'int' can override any operation to
>> invoke arbitrary code).
>
> Then create your own int16, uint64 etc types.

The PEP doesn't explain how to do that. And even if it's possible, such
types wouldn't be very useful as they're not stable under any arithmetic
operation (e.g. + doesn't necessarily fit in int16).

>> * A lot more information is needed to produce good code (e.g. "this f()
>> called here really means this function there, and will never be
>> monkey-patched" -- same with len() or list(), btw).
>> * Most of this information cannot easily be expressed as a type
>> * If the interpreter gathers all that information, it'll probably have
>> gathered a superset of what PEP484 can provide anyway.
>
> All this is a red herring. If type hints are useful to PyPy, that's a
> bonus. Cython uses its own system of type hints, a future version may be
> able to use PEP 484 hints instead. But any performance benefit is a
> bonus. PEP 484 is for increasing correctness, not speed.

Yes, talking about performance in the context of PEP 484 is a red
herring, that's what I'm saying.

From yselivanov.ml at gmail.com Sat Apr 25 22:18:08 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sat, 25 Apr 2015 16:18:08 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: 
References: 
Message-ID: <553BF680.10304@gmail.com>

Hi Guido,

On 2015-04-24 1:03 PM, Guido van Rossum wrote:
> *3.
syntactic priority of `await`*
>
> Yury, could you tweak the syntax for `await` so that we can write the most
> common usages without parentheses? In particular I'd like to be able to
> write
> ```
> return await foo()
> with await foo() as bar: ...
> foo(await bar(), await bletch())
> ```
> (I don't care about `await foo() + await bar()` but it would be okay.)
> ```
> I think this is reasonable with some tweaks of the grammar (similar to what
> Greg did for cocall, but without requiring call syntax at the end).

I've done some experiments with grammar, and it looks like
we indeed can parse await quite differently from yield. Three
different options:


Option #1. Parse 'await' exactly like we parse 'yield'.
This is how we parse 'await' in the latest version of the reference
implementation.

Required grammar changes:
https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-1-patch

Repo to play with:
https://github.com/1st1/cpython/tree/await

Syntax:

await a()
res = (await a()) + (await b())
res = (await (await a()))
if (await a()): pass
return (await a())
print((await a()))
func(arg=(await a()))
await a() * b()


Option #2. Keep 'await_expr' in 'atom' terminal, but don't
require parentheses.

Required grammar changes:
https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-2-patch

Repo to play with (parser module is broken atm):
https://github.com/1st1/cpython/tree/await_noparens

Syntax:

await a()
res = (await a()) + await b() # w/o parens (a() + await b())
res = await await a()
if await a(): pass
return await a()
print(await a())
func(arg=await a())
await a() * b()


Option #3. Create a new terminal for await expression between
'atom' and 'power'.

Required grammar changes:
https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-3-patch

Repo to play with (parser module is broken atm):
https://github.com/1st1/cpython/tree/await_noparens2

Syntax:

await a()
res = await a() + await b()
res = await (await a()) # invalid syntax w/o parens
if await a(): pass
return await a()
print(await a())
func(arg=await a())
await (a() * b()) # w/o parens '(await a() * b())


I think that Option #3 is a clear winner.

Thanks,
Yury

From arnodel at gmail.com Sat Apr 25 22:47:38 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Sat, 25 Apr 2015 21:47:38 +0100
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <55368858.4010007@gmail.com>
References: <55368858.4010007@gmail.com>
Message-ID: 

On Tue, 21 Apr 2015 at 18:27 Yury Selivanov wrote:
>
> Hi python-dev,
>
> I'm moving the discussion from python-ideas to here.
>
> The updated version of the PEP should be available shortly
> at https://www.python.org/dev/peps/pep-0492
> and is also pasted in this email.
>

Hi Yury,

Having read this very interesting PEP I would like to make two
remarks. I apologise in advance if they are points which have already
been discussed.

1. About the 'async for' construct. Each iteration will create a new
coroutine object (the one returned by Cursor.__anext__()) and it seems
to me that it can be wasteful. In the example given of an 'aiterable'
Cursor class, probably a large number of rows will fill the cursor
buffer in one call of cursor._prefetch(). However each row that is
iterated over will involve the creation and execution of a new
coroutine object. It seems to me that what is desirable in that case
is that all the buffered rows will be iterated over as in a plain for
loop.

2. I think the semantics of the new coroutine objects could be
defined more clearly in the PEP.
Of course they are pretty obvious
when you know that the coroutines are meant to replace
asyncio.coroutine as described in [1]. I understand that this PEP is
mostly for the benefit of asyncio, hence mainly of interest to people
who know it. However I think it would be good for it to be more
self-contained. I have often read a PEP as an introduction to a new
feature of Python. I feel that if I was not familiar with yield from
and asyncio I would not be able to understand this PEP, even though
potentially one could use the new constructs without knowing anything
about them.

Cheers,

-- 
Arnaud Delobelle

[1] https://docs.python.org/3/library/asyncio-task.html#coroutines

From yselivanov.ml at gmail.com Sat Apr 25 23:02:06 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sat, 25 Apr 2015 17:02:06 -0400
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: 
References: <55368858.4010007@gmail.com>
Message-ID: <553C00CE.8010601@gmail.com>

Hi Arnaud,

On 2015-04-25 4:47 PM, Arnaud Delobelle wrote:
> On Tue, 21 Apr 2015 at 18:27 Yury Selivanov wrote:
>> Hi python-dev,
>>
>> I'm moving the discussion from python-ideas to here.
>>
>> The updated version of the PEP should be available shortly
>> at https://www.python.org/dev/peps/pep-0492
>> and is also pasted in this email.
>>
> Hi Yury,
>
> Having read this very interesting PEP I would like to make two
> remarks. I apologise in advance if they are points which have already
> been discussed.
>
> 1. About the 'async for' construct. Each iteration will create a new
> coroutine object (the one returned by Cursor.__anext__()) and it seems
> to me that it can be wasteful. In the example given of an 'aiterable'
> Cursor class, probably a large number of rows will fill the cursor
> buffer in one call of cursor._prefetch(). However each row that is
> iterated over will involve the creation and execution of a new
> coroutine object. It seems to me that what is desirable in that case
> is that all the buffered rows will be iterated over as in a plain for
> loop.

I agree that creating a new coroutine object is a little bit
wasteful.

However, the proposed iteration protocol was designed to:

1. Resemble already existing __iter__/__next__/StopIteration
protocol;

2. Pave the road to introduce coroutine-generators in the
future.

We could, in theory, design the protocol to make the __anext__
awaitable return a regular iterator (and raise
StopAsyncIteration at the end) to make things more
efficient, but that would complicate the protocol
tremendously, and make it very hard to program and debug.

My opinion is that this has to be addressed in 3.6
with coroutine-generators if there is enough interest
from Python users.

>
> 2. I think the semantics of the new coroutine objects could be
> defined more clearly in the PEP. Of course they are pretty obvious
> when you know that the coroutines are meant to replace
> asyncio.coroutine as described in [1]. I understand that this PEP is
> mostly for the benefit of asyncio, hence mainly of interest to people
> who know it. However I think it would be good for it to be more
> self-contained. I have often read a PEP as an introduction to a new
> feature of Python. I feel that if I was not familiar with yield from
> and asyncio I would not be able to understand this PEP, even though
> potentially one could use the new constructs without knowing anything
> about them.
>
>

I agree.
I plan to update the PEP with some new semantics (prohibit passing coroutine-objects to iter(), tuple() and other builtins, as well as using them in 'for .. in coro()' loops). I'll add a section with a more detailed explanation of coroutine-objects. Best, Yury From rosuav at gmail.com Sun Apr 26 00:55:26 2015 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 26 Apr 2015 08:55:26 +1000 Subject: [Python-Dev] Type hints -- a mediocre programmer's reaction In-Reply-To: <553BBA40.5000106@gmail.com> References: <20150423125957.631a8711@x230> <20150423165513.239317a3@x230> <553A7D01.6010100@gmail.com> <20150424214521.6fd74baf@x230> <553AE84B.1090602@gmail.com> <20150425030706.GE5663@ando.pearwood.info> <553BBA40.5000106@gmail.com> Message-ID: On Sun, Apr 26, 2015 at 2:01 AM, Ronan Lamy wrote: >>> * PEP484 hints are too high-level. Replacing an 'int' object with a >>> single machine word would be useful, but an 'int' annotation gives no >>> guarantee that it's correct (because Python 3 ints can have arbitrary >>> size and because subclasses of 'int' can override any operation to >>> invoke arbitrary code). >> >> >> Then create your own int16, uint64 etc types. > > > The PEP doesn't explain how to do that. And even if it's possible, such > types wouldn't be very useful as they're not stable under any arithmetic > operation (e.g. + doesn't necessarily fit in int16). If you define a function that adds two integers and want it to be optimized, it's going to have to check for overflow anyway. That said, though, I'd much rather a machine-word optimization be a feature of "int" than a special adornment that you give to some of your data. CPython 3.x doesn't do it, CPython 2.x kinda did it (an int would turn into a long if it got too big, but a long would never turn into an int), but MicroPython would be most welcome to do the job fully. (I would guess that PyPy already does this kind of thing.) Yes, that means integer addition can't become a single machine language instruction, but on the other hand, it means the Python code doesn't need to concern itself with any boundaries. ChrisA From guido at python.org Sun Apr 26 06:27:42 2015 From: guido at python.org (Guido van Rossum) Date: Sat, 25 Apr 2015 21:27:42 -0700 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <553BF680.10304@gmail.com> References: <553BF680.10304@gmail.com> Message-ID: +1 for option 3. On Sat, Apr 25, 2015 at 1:18 PM, Yury Selivanov wrote: > Hi Guido, > > On 2015-04-24 1:03 PM, Guido van Rossum wrote: > >> *3. syntactic priority of `await`* >> >> Yury, could you tweak the syntax for `await` so that we can write the most >> common usages without parentheses? In particular I'd like to be able to >> write >> ``` >> return await foo() >> with await foo() as bar: ... >> foo(await bar(), await bletch()) >> ``` >> (I don't care about `await foo() + await bar()` but it would be okay.) >> ``` >> I think this is reasonable with some tweaks of the grammar (similar to >> what >> Greg did for cocall, but without requiring call syntax at the end). >> > > I've done some experiments with grammar, and it looks like > we indeed can parse await quite differently from yield. Three > different options: > > > Option #1. Parse 'await' exactly like we parse 'yield'. > This how we parse 'await' in the latest version of reference > implementation. 
> > Required grammar changes:
> > https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-1-patch
> >
> > Repo to play with:
> > https://github.com/1st1/cpython/tree/await
> >
> > Syntax:
> >
> > await a()
> > res = (await a()) + (await b())
> > res = (await (await a()))
> > if (await a()): pass
> > return (await a())
> > print((await a()))
> > func(arg=(await a()))
> > await a() * b()
> >
> >
> > Option #2. Keep 'await_expr' in 'atom' terminal, but don't
> > require parentheses.
> >
> > Required grammar changes:
> > https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-2-patch
> >
> > Repo to play with (parser module is broken atm):
> > https://github.com/1st1/cpython/tree/await_noparens
> >
> > Syntax:
> >
> > await a()
> > res = (await a()) + await b() # w/o parens (a() + await b())
> > res = await await a()
> > if await a(): pass
> > return await a()
> > print(await a())
> > func(arg=await a())
> > await a() * b()
> >
> >
> > Option #3. Create a new terminal for await expression between
> > 'atom' and 'power'.
> >
> > Required grammar changes:
> > https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-3-patch
> >
> > Repo to play with (parser module is broken atm):
> > https://github.com/1st1/cpython/tree/await_noparens2
> >
> > Syntax:
> >
> > await a()
> > res = await a() + await b()
> > res = await (await a()) # invalid syntax w/o parens
> > if await a(): pass
> > return await a()
> > print(await a())
> > func(arg=await a())
> > await (a() * b()) # w/o parens '(await a() * b())
> >
> >
> > I think that Option #3 is a clear winner.
> >
> > Thanks,
> > Yury
> >

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ncoghlan at gmail.com Sun Apr 26 07:23:04 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 26 Apr 2015 15:23:04 +1000
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553BF680.10304@gmail.com>
References: 
 <553BF680.10304@gmail.com>
Message-ID: 

On 26 April 2015 at 06:18, Yury Selivanov wrote:
> Option #3. Create a new terminal for await expression between
> 'atom' and 'power'.
>
> Required grammar changes:
> https://gist.github.com/1st1/cb0bd257b04adb87e167#file-option-3-patch
>
> Repo to play with (parser module is broken atm):
> https://github.com/1st1/cpython/tree/await_noparens2
>
> Syntax:
>
> await a()
> res = await a() + await b()
> res = await (await a()) # invalid syntax w/o parens
> if await a(): pass
> return await a()
> print(await a())
> func(arg=await a())
> await (a() * b()) # w/o parens '(await a() * b())
>
> I think that Option #3 is a clear winner.

Very nice! How would the following statements parse with this option?

    res = await a(), b()
    res = [await a(), b()]
    with await a(), b: pass
    f(await a(), b)

I think it's OK if these end up requiring parentheses in order to do
the right thing (as that will be helpful for humans regardless), but
the PEP should be clear as to whether or not they do:

    res = (await a()), b()
    res = [(await a()), b()]
    with (await a()), b: pass
    f((await a()), b)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From victor.stinner at gmail.com Sun Apr 26 11:43:03 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Sun, 26 Apr 2015 11:43:03 +0200
Subject: [Python-Dev] async/await in Python; v2
In-Reply-To: <553C00CE.8010601@gmail.com>
References: <55368858.4010007@gmail.com>
 <553C00CE.8010601@gmail.com>
Message-ID: 

Le 25 avr. 2015 23:02, "Yury Selivanov" a écrit :
> I agree.
I plan to update the PEP with some new > semantics (prohibit passing coroutine-objects to iter(), > tuple() and other builtins, as well as using them in > 'for .. in coro()' loops). I'll add a section with > a more detailed explanation of coroutine-objects. Guido rejected the PEP 3152 because it disallow some use cases (create a coroutine and then wait for it). But careful of not introducing similar limitation in the PEP 492. There is an open issue to change how to check if an object is a coroutine: see issues 24004 and 24018. Well, the issue doesn't propose to remove checks (they are useful to detect common bugs), but to accept generators implemented in other types for Cython. I spend some time in asyncio to valdiate input type because it was to easy to misuse the API. Example: .call_soon(coro_func) was allowed whereas it's an obvious bug See asyncio tests and type checks for more cases. Victor -------------- next part -------------- An HTML attachment was scrubbed... URL: From mark at hotpy.org Sun Apr 26 22:21:19 2015 From: mark at hotpy.org (Mark Shannon) Date: Sun, 26 Apr 2015 21:21:19 +0100 Subject: [Python-Dev] PEP 492: No new syntax is required Message-ID: <553D48BF.1070600@hotpy.org> Hi, I was looking at PEP 492 and it seems to me that no new syntax is required. Looking at the code, it does four things; all of which, or a functional equivalent, could be done with no new syntax. 1. Make a normal function into a generator or coroutine. This can be done with a decorator. 2. Support a parallel set of special methods starting with 'a' or 'async'. Why not just use the current set of special methods? 3. "await". "await" is an operator that takes one argument and produces a single result, without altering flow control and can thus be replaced by an function. 4. Asynchronous with statement. The PEP lists the equivalent as "with (yield from xxx)" which doesn't seem so bad. Please don't add unnecessary new syntax. Cheers, Mark. P.S. I'm not objecting to any of the other new features proposed, just the new syntax. From yoavglazner at gmail.com Sun Apr 26 22:26:15 2015 From: yoavglazner at gmail.com (yoav glazner) Date: Sun, 26 Apr 2015 23:26:15 +0300 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D48BF.1070600@hotpy.org> References: <553D48BF.1070600@hotpy.org> Message-ID: How do you implement "async for"? On Sun, Apr 26, 2015 at 11:21 PM, Mark Shannon wrote: > Hi, > > I was looking at PEP 492 and it seems to me that no new syntax is required. > > Looking at the code, it does four things; all of which, or a functional > equivalent, could be done with no new syntax. > 1. Make a normal function into a generator or coroutine. This can be done > with a decorator. > 2. Support a parallel set of special methods starting with 'a' or 'async'. > Why not just use the current set of special methods? > 3. "await". "await" is an operator that takes one argument and produces a > single result, without altering flow control and can thus be replaced by an > function. > 4. Asynchronous with statement. The PEP lists the equivalent as "with > (yield from xxx)" which doesn't seem so bad. > > Please don't add unnecessary new syntax. > > Cheers, > Mark. > > P.S. I'm not objecting to any of the other new features proposed, just the > new syntax. 
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/yoavglazner%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From yselivanov.ml at gmail.com Sun Apr 26 22:40:03 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Sun, 26 Apr 2015 16:40:03 -0400
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To: <553D48BF.1070600@hotpy.org>
References: <553D48BF.1070600@hotpy.org>
Message-ID: <553D4D23.1060500@gmail.com>

Hi Mark,

On 2015-04-26 4:21 PM, Mark Shannon wrote:
> Hi,
>
> I was looking at PEP 492 and it seems to me that no new syntax is
> required.

Mark, all your points are explained in the PEP in great detail:

>
> Looking at the code, it does four things; all of which, or a
> functional equivalent, could be done with no new syntax.

Yes, everything that the PEP proposes can be done without new syntax.
That's how people use asyncio right now, with only what we have in 3.4.

But it's hard. Iterating through something asynchronously? Write a
'while True' loop. Instead of 1 line you now have 5 or 6. Want to
commit your database transaction? Instead of 'async with' you will
write a 'try..except..finally' block, with a very high probability of
introducing a bug, because you don't rollback or commit properly or
propagate the exception.

> 1. Make a normal function into a generator or coroutine. This can be
> done with a decorator.

https://www.python.org/dev/peps/pep-0492/#rationale-and-goals
https://www.python.org/dev/peps/pep-0492/#debugging-features
https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword

> 2. Support a parallel set of special methods starting with 'a' or
> 'async'. Why not just use the current set of special methods?

Because you can't reuse them.

https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-for-and-with-statements
https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names

> 3. "await". "await" is an operator that takes one argument and
> produces a single result, without altering flow control and can thus
> be replaced by a function.

It can't be replaced by a function. Only if you use greenlets or
Stackless Python.

> 4. Asynchronous with statement. The PEP lists the equivalent as "with
> (yield from xxx)" which doesn't seem so bad.

There is no equivalent to 'async with'. "with (yield from xxx)" only
allows you to suspend execution in __enter__ (and it's not actually in
__enter__, but in a coroutine that returns a context manager).

https://www.python.org/dev/peps/pep-0492/#asynchronous-context-managers-and-async-with
see "New Syntax" section to see what 'async with' is equivalent to.

>
> Please don't add unnecessary new syntax.

It is necessary. Perhaps you haven't spent a lot of time maintaining
huge code-bases developed with frameworks like asyncio, so I understand
why it does look unnecessary to you.
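To make the transaction example above concrete, here is a rough
sketch against a hypothetical database API (conn, tr, and the method
names are all made up for illustration):

    # Today, inside an @asyncio.coroutine:
    tr = yield from conn.begin()
    try:
        yield from conn.execute(query)
    except Exception:
        yield from tr.rollback()
        raise
    else:
        yield from tr.commit()

    # With this PEP:
    async with conn.transaction():
        await conn.execute(query)

Forget the rollback() or the bare 'raise' in the first version and
you get exactly the kind of bug described above; the 'async with'
form makes that mistake impossible.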
Thanks, Yury From mark at hotpy.org Sun Apr 26 23:48:53 2015 From: mark at hotpy.org (Mark Shannon) Date: Sun, 26 Apr 2015 22:48:53 +0100 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D4D23.1060500@gmail.com> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> Message-ID: <553D5D45.2040909@hotpy.org> On 26/04/15 21:40, Yury Selivanov wrote: > Hi Mark, > > On 2015-04-26 4:21 PM, Mark Shannon wrote: >> Hi, >> >> I was looking at PEP 492 and it seems to me that no new syntax is >> required. > > Mark, all your points are explained in the PEP in a great detail: I did read the PEP. I do think that clarifying the distinction between coroutines and 'normal' generators is a good idea. Adding stuff to the standard library to help is fine. I just don't think that any new syntax is necessary. > >> >> Looking at the code, it does four things; all of which, or a >> functional equivalent, could be done with no new syntax. > > Yes, everything that the PEP proposes can be done without new syntax. > That's how people use asyncio right now, with only what we have in 3.4. > > But it's hard. Iterating through something asynchronously? Write a > 'while True' loop. Instead of 1 line you now have 5 or 6. Want to > commit your database transaction? Instead of 'async with' you will > write 'try..except..finally' block, with a very high probability to > introduce a bug, because you don't rollback or commit properly or > propagate exception. I don't see why you can't do transactions using a 'with' statement. > >> 1. Make a normal function into a generator or coroutine. This can be >> done with a decorator. > > https://www.python.org/dev/peps/pep-0492/#rationale-and-goals states that """ it is not possible to natively define a coroutine which has no yield or yield from statement """ which is just not true. > https://www.python.org/dev/peps/pep-0492/#debugging-features Requires the addition of the CO_COROUTINE flag, not any new keywords. > https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword Seems to be repeating the above. > >> 2. Support a parallel set of special methods starting with 'a' or >> 'async'. Why not just use the current set of special methods? > > Because you can't reuse them. > > https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-for-and-with-statements Which seems back to front. The argument is that existing syntax constructs cannot be made to work with asynchronous objects. Why not make the asynchronous objects work with the existing syntax? > > https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names The argument here relies on the validity of the previous points. > > >> 3. "await". "await" is an operator that takes one argument and >> produces a single result, without altering flow control and can thus >> be replaced by an function. > > It can't be replaced by a function. Only if you use greenlets or > Stackless Python. Why not? The implementation of await is here: https://github.com/python/cpython/compare/master...1st1:await#diff-23c87bfada1d01335a3019b9321502a0R642 which clearly could be made into a function. > >> 4. Asynchronous with statement. The PEP lists the equivalent as "with >> (yield from xxx)" which doesn't seem so bad. > > There is no equivalent to 'async with'. "with (yield from xxx)" only > allows you to suspend execution > in __enter__ (and it's not actually in __enter__, but in a coroutine > that returns a context manager). 
> > https://www.python.org/dev/peps/pep-0492/#asynchronous-context-managers-and-async-with > see "New Syntax" section to see what 'async with' is equivalent too. Which, by comparing with PEP 343, can be translated as: with expr as e: e = await(e) ... > >> >> Please don't add unnecessary new syntax. > > > It is necessary. This isn't an argument, it's just contradiction ;) Perhaps you haven't spent a lot of time maintaining > huge code-bases developed with frameworks like asyncio, so I understand > why it does look unnecessary to you. This is a good reason for clarifying the distinction between 'normal' generators and coroutines. It is not, IMO, justification for burdening the language (and everyone porting Python 2 code) with extra syntax. Cheers, Mark. From brett at python.org Mon Apr 27 00:09:57 2015 From: brett at python.org (Brett Cannon) Date: Sun, 26 Apr 2015 22:09:57 +0000 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D5D45.2040909@hotpy.org> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <553D5D45.2040909@hotpy.org> Message-ID: On Sun, Apr 26, 2015, 17:49 Mark Shannon wrote: On 26/04/15 21:40, Yury Selivanov wrote: > Hi Mark, > > On 2015-04-26 4:21 PM, Mark Shannon wrote: >> Hi, >> >> I was looking at PEP 492 and it seems to me that no new syntax is >> required. > > Mark, all your points are explained in the PEP in a great detail: I did read the PEP. I do think that clarifying the distinction between coroutines and 'normal' generators is a good idea. Adding stuff to the standard library to help is fine. I just don't think that any new syntax is necessary. > >> >> Looking at the code, it does four things; all of which, or a >> functional equivalent, could be done with no new syntax. > > Yes, everything that the PEP proposes can be done without new syntax. > That's how people use asyncio right now, with only what we have in 3.4. > > But it's hard. Iterating through something asynchronously? Write a > 'while True' loop. Instead of 1 line you now have 5 or 6. Want to > commit your database transaction? Instead of 'async with' you will > write 'try..except..finally' block, with a very high probability to > introduce a bug, because you don't rollback or commit properly or > propagate exception. I don't see why you can't do transactions using a 'with' statement. > >> 1. Make a normal function into a generator or coroutine. This can be >> done with a decorator. > > https://www.python.org/dev/peps/pep-0492/#rationale-and-goals states that """ it is not possible to natively define a coroutine which has no yield or yield from statement """ which is just not true. > https://www.python.org/dev/peps/pep-0492/#debugging-features Requires the addition of the CO_COROUTINE flag, not any new keywords. > https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword Seems to be repeating the above. > >> 2. Support a parallel set of special methods starting with 'a' or >> 'async'. Why not just use the current set of special methods? > > Because you can't reuse them. > > https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-for-and-with-statements Which seems back to front. The argument is that existing syntax constructs cannot be made to work with asynchronous objects. Why not make the asynchronous objects work with the existing syntax? > > https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names The argument here relies on the validity of the previous points. > > >> 3. "await". 
"await" is an operator that takes one argument and >> produces a single result, without altering flow control and can thus >> be replaced by an function. > > It can't be replaced by a function. Only if you use greenlets or > Stackless Python. Why not? The implementation of await is here: https://github.com/python/cpython/compare/master...1st1:await#diff-23c87bfada1d01335a3019b9321502a0R642 which clearly could be made into a function. > >> 4. Asynchronous with statement. The PEP lists the equivalent as "with >> (yield from xxx)" which doesn't seem so bad. > > There is no equivalent to 'async with'. "with (yield from xxx)" only > allows you to suspend execution > in __enter__ (and it's not actually in __enter__, but in a coroutine > that returns a context manager). > > https://www.python.org/dev/peps/pep-0492/#asynchronous-context-managers-and-async-with > see "New Syntax" section to see what 'async with' is equivalent too. Which, by comparing with PEP 343, can be translated as: with expr as e: e = await(e) ... > >> >> Please don't add unnecessary new syntax. > > > It is necessary. This isn't an argument, it's just contradiction ;) Perhaps you haven't spent a lot of time maintaining > huge code-bases developed with frameworks like asyncio, so I understand > why it does look unnecessary to you. This is a good reason for clarifying the distinction between 'normal' generators and coroutines. It is not, IMO, justification for burdening the language (and everyone porting Python 2 code) with extra syntax. How is it a burden for people porting Python 2 code? Because they won't get to name anything 'async' just like anyone supporting older Python 3 versions? Otherwise I don't see how it is of any consequence to people maintaining 2/3 code as it will just be another thing they can't use until they drop Python 2 support. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Mon Apr 27 00:12:19 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Sun, 26 Apr 2015 18:12:19 -0400 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D5D45.2040909@hotpy.org> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <553D5D45.2040909@hotpy.org> Message-ID: <553D62C3.3030903@gmail.com> On 2015-04-26 5:48 PM, Mark Shannon wrote: > > > On 26/04/15 21:40, Yury Selivanov wrote: >> Hi Mark, >> >> On 2015-04-26 4:21 PM, Mark Shannon wrote: >>> Hi, >>> >>> I was looking at PEP 492 and it seems to me that no new syntax is >>> required. >> >> Mark, all your points are explained in the PEP in a great detail: > I did read the PEP. I do think that clarifying the distinction between > coroutines and 'normal' generators is a good idea. Adding stuff to the > standard library to help is fine. I just don't think that any new > syntax is necessary. Well, unfortunately, I can't explain why the new syntax is necessary better than it is already explained in the PEP, sorry. > >> >>> >>> Looking at the code, it does four things; all of which, or a >>> functional equivalent, could be done with no new syntax. >> >> Yes, everything that the PEP proposes can be done without new syntax. >> That's how people use asyncio right now, with only what we have in 3.4. >> >> But it's hard. Iterating through something asynchronously? Write a >> 'while True' loop. Instead of 1 line you now have 5 or 6. Want to >> commit your database transaction? 
Instead of 'async with' you will >> write 'try..except..finally' block, with a very high probability to >> introduce a bug, because you don't rollback or commit properly or >> propagate exception. > I don't see why you can't do transactions using a 'with' statement. Because you can't use 'yield from' in __exit__. And allowing it there isn't a very good idea. >> >>> 1. Make a normal function into a generator or coroutine. This can be >>> done with a decorator. >> >> https://www.python.org/dev/peps/pep-0492/#rationale-and-goals > states that """ > it is not possible to natively define a coroutine which has no yield > or yield from statement > """ > which is just not true. It *is* true. Please note the word "natively". See coroutine decorator from asyncio: https://github.com/python/cpython/blob/master/Lib/asyncio/coroutines.py#L130 To turn a regular function into a coroutine you have three options: 1. wrap the function; 2. use "if 0: yield" terrible hack; 3. flip CO_GENERATOR flag for the code object (which is CPython implementation detail); also terrible hack. @coroutine decorator is great because we can use asyncio even in 3.3. But it's a) easy to forget b) hard to statically analyze c) hard to explain why functions decorated with it must only contain 'yield from' instead of 'yield' d) few more reasons listed in the PEP > >> https://www.python.org/dev/peps/pep-0492/#debugging-features > Requires the addition of the CO_COROUTINE flag, not any new keywords. True. But the importance of new keywords is covered in other sections. > >> https://www.python.org/dev/peps/pep-0492/#importance-of-async-keyword > Seems to be repeating the above. >> >>> 2. Support a parallel set of special methods starting with 'a' or >>> 'async'. Why not just use the current set of special methods? >> >> Because you can't reuse them. >> >> https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-for-and-with-statements >> > Which seems back to front. The argument is that existing syntax > constructs cannot be made to work with asynchronous objects. Why not > make the asynchronous objects work with the existing syntax? Because a) it's not possible; b) the point is to make suspend points visible in the code. That's one of the key principles of asyncio and other frameworks. > >> >> https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names >> > The argument here relies on the validity of the previous points. >> >> >>> 3. "await". "await" is an operator that takes one argument and >>> produces a single result, without altering flow control and can thus >>> be replaced by an function. >> >> It can't be replaced by a function. Only if you use greenlets or >> Stackless Python. > Why not? The implementation of await is here: > https://github.com/python/cpython/compare/master...1st1:await#diff-23c87bfada1d01335a3019b9321502a0R642 > > which clearly could be made into a function. Implementation of 'await' requires YIELD_FROM opcode. As far as I know functions in Python can't push opcodes to the eval loop while running. The only way to do what you propose is to use greenlets or merge stackless into cpython. >> >>> 4. Asynchronous with statement. The PEP lists the equivalent as "with >>> (yield from xxx)" which doesn't seem so bad. >> >> There is no equivalent to 'async with'. "with (yield from xxx)" only >> allows you to suspend execution >> in __enter__ (and it's not actually in __enter__, but in a coroutine >> that returns a context manager). 
>> >> https://www.python.org/dev/peps/pep-0492/#asynchronous-context-managers-and-async-with >> >> see "New Syntax" section to see what 'async with' is equivalent too. > Which, by comparing with PEP 343, can be translated as: > with expr as e: > e = await(e) > ... >> >>> >>> Please don't add unnecessary new syntax. >> >> >> It is necessary. > This isn't an argument, it's just contradiction ;) > > Perhaps you haven't spent a lot of time maintaining >> huge code-bases developed with frameworks like asyncio, so I understand >> why it does look unnecessary to you. > This is a good reason for clarifying the distinction between 'normal' > generators and coroutines. It is not, IMO, justification for burdening > the language (and everyone porting Python 2 code) with extra syntax. > Everybody loves that Python is minimalistic, extremely easy to learn, and joyful to use. I agree that adding new syntax shouldn't be done without careful consideration. However, when new patterns and approaches evolve and the language isn't ready for them, we have to make using them easier. Otherwise we wouldn't have 'with' statements and many other useful concepts in Python at all. Thanks for the feedback, Yury From yselivanov.ml at gmail.com Mon Apr 27 00:16:26 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Sun, 26 Apr 2015 18:16:26 -0400 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <553D5D45.2040909@hotpy.org> Message-ID: <553D63BA.8090705@gmail.com> Brett, On 2015-04-26 6:09 PM, Brett Cannon wrote: > How is it a burden for people porting Python 2 code? Because they won't > get to name anything 'async' just like anyone supporting older Python 3 > versions? Otherwise I don't see how it is of any consequence to people > maintaining 2/3 code as it will just be another thing they can't use until > they drop Python 2 support. Agree. Also, the PEP states that we'll either keep a modified tokenizer or use __future__ imports till at least 3.7. It's *3 or more years*. And if necessary, we can wait till 3.8 (python 2 won't be supported at that point). Thanks! Yury From ncoghlan at gmail.com Mon Apr 27 00:24:20 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 27 Apr 2015 08:24:20 +1000 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D5D45.2040909@hotpy.org> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <553D5D45.2040909@hotpy.org> Message-ID: On 27 Apr 2015 07:50, "Mark Shannon" wrote: > On 26/04/15 21:40, Yury Selivanov wrote: >> >> But it's hard. Iterating through something asynchronously? Write a >> 'while True' loop. Instead of 1 line you now have 5 or 6. Want to >> commit your database transaction? Instead of 'async with' you will >> write 'try..except..finally' block, with a very high probability to >> introduce a bug, because you don't rollback or commit properly or >> propagate exception. > > I don't see why you can't do transactions using a 'with' statement. Because you need to pass control back to the event loop from the *__exit__* method in order to wait for the commit/rollback operation without blocking the scheduler. The "with (yield from cm())" formulation doesn't allow either __enter__ *or* __exit__ to suspend the coroutine to wait for IO, so you have to do the IO up front and return a fully synchronous (but still non-blocking) CM as the result. 
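Concretely, the kind of manager being discussed has to suspend in *both*
hooks. A sketch under PEP 492's proposed protocol -- 'conn' here is a
hypothetical connection object whose begin()/commit()/rollback() are
coroutines doing real I/O:

    class Transaction:
        # Sketch only, not an existing API.
        def __init__(self, conn):
            self.conn = conn

        async def __aenter__(self):
            await self.conn.begin()        # may suspend on I/O
            return self

        async def __aexit__(self, exc_type, exc, tb):
            if exc_type is None:
                await self.conn.commit()   # the exit step itself suspends --
            else:                          # impossible with a plain __exit__
                await self.conn.rollback()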
We knew about these problems going into PEP 3156 ( http://python-notes.curiousefficiency.org/en/latest/pep_ideas/async_programming.html#using-special-methods-in-explicitly-asynchronous-code) so it's mainly a matter of having enough experience with asyncio now to be able to suggest specific syntactic sugar to make the right way and the easy way the same way. Cheers, Nick. -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Mon Apr 27 00:49:43 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Sun, 26 Apr 2015 18:49:43 -0400 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <20150427012512.6919d1eb@x230> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> Message-ID: <553D6B87.3040500@gmail.com> Paul, On 2015-04-26 6:25 PM, Paul Sokolovsky wrote: > Ok, so here're 3 points this link gives, with my concerns/questions: > >> >An alternative idea about new asynchronous iterators and context >> >managers was to reuse existing magic methods, by adding an async >> >keyword to their declarations: >> >[But:] >> > - it would not be possible to create an object that works in both >> >with and async with statements; > Yes, and I would say, for good. Behavior of sync and async code is > different enough to warrant separate classes (if not libraries!) to > implement each paradigm. What if, in otherwise async code, someone will > miss "async" in "async with" and call sync version of context manager? > So, losing ability stated above isn't big practical loss. That's debatable. It might be useful to create hybrid objects that can work in 'with (yield from lock)' and 'async with lock' statements. > > >> >- it would look confusing > Sorry, "async def __enter__" doesn't look more confusing than > "__aenter__" (vs "__enter__"). I'll update the PEP. The argument shouldn't be that it's confusing, the argument is that __aenter__ returns an 'awaitable', which is either a coroutine-object or a future. You can't reuse __enter__, because you'd break backwards compatibility -- it's perfectly normal for context managers in python to return any object from their __enter__. If we assign some special meaning to futures -- we'll break existing code. > >> >and would require some implicit magic behind the scenes in the >> >interpreter; > Interpreter already does a lot of "implicit magic". Can you please > elaborate what exactly would need to be done? Async with implies using YIELD_FROM opcodes. If you want to make regular with statements compatible you'd need to add a lot of opcodes that will at runtime make a decision whether to use YIELD_FROM or not. The other point is that you can't just randomly start using YIELD_FROM, you can only do so from generators/coroutines. > >> >one of the main points of this proposal is to make coroutines as >> >simple and foolproof as possible. > And it's possible to agree that "to separate notions, create a > dichotomy" is a simple principle on its own. But just taking bunch of > stuff - special methods, exceptions - and duplicating it is a bloat a > violates another principle - DRY. I'd say that a lot of terrible mistakes has happened in software development precisely because someone followed DRY religiously. Following it here will break backwards compatibility and/or make it harder to understand what is actually going on with your code. > You argue that this will make > coroutine writing simple. 
But coroutines are part of the language, and > duplicating notions makes language more complex/complicated. Again, it makes reasoning about your code simpler. That what matters. Anyways, I really doubt that you can convince anyone to reuse existing dunder methods for async stuff. Thanks, Yury From guido at python.org Mon Apr 27 01:13:43 2015 From: guido at python.org (Guido van Rossum) Date: Sun, 26 Apr 2015 16:13:43 -0700 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D48BF.1070600@hotpy.org> References: <553D48BF.1070600@hotpy.org> Message-ID: But new syntax is the whole point of the PEP. I want to be able to *syntactically* tell where the suspension points are in coroutines. Currently this means looking for yield [from]; PEP 492 just adds looking for await and async [for|with]. Making await() a function defeats the purpose because now aliasing can hide its presence, and we're back in the land of gevent or stackless (where *anything* can potentially suspend the current task). I don't want to live in that land. On Sun, Apr 26, 2015 at 1:21 PM, Mark Shannon wrote: > Hi, > > I was looking at PEP 492 and it seems to me that no new syntax is required. > > Looking at the code, it does four things; all of which, or a functional > equivalent, could be done with no new syntax. > 1. Make a normal function into a generator or coroutine. This can be done > with a decorator. > 2. Support a parallel set of special methods starting with 'a' or 'async'. > Why not just use the current set of special methods? > 3. "await". "await" is an operator that takes one argument and produces a > single result, without altering flow control and can thus be replaced by an > function. > 4. Asynchronous with statement. The PEP lists the equivalent as "with > (yield from xxx)" which doesn't seem so bad. > > Please don't add unnecessary new syntax. > > Cheers, > Mark. > > P.S. I'm not objecting to any of the other new features proposed, just the > new syntax. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Mon Apr 27 01:45:30 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Sun, 26 Apr 2015 19:45:30 -0400 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <20150427023206.13999dc8@x230> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> <20150427023206.13999dc8@x230> Message-ID: <553D789A.1000303@gmail.com> Paul, On 2015-04-26 7:32 PM, Paul Sokolovsky wrote: > Hello, > > On Sun, 26 Apr 2015 18:49:43 -0400 > Yury Selivanov wrote: > > [] > >>>>> - it would look confusing >>> Sorry, "async def __enter__" doesn't look more confusing than >>> "__aenter__" (vs "__enter__"). >> I'll update the PEP. >> >> The argument shouldn't be that it's confusing, the argument >> is that __aenter__ returns an 'awaitable', which is either >> a coroutine-object or a future. >> >> You can't reuse __enter__, because you'd break backwards >> compatibility -- it's perfectly normal for context >> managers in python to return any object from their __enter__. >> If we assign some special meaning to futures -- we'll break >> existing code. 
> So, again to make sure I (and hopefully other folks) understand it > right. You say "it's perfectly normal for context managers in python to > return any object from their __enter__". That's true, but we talk about > async context managers. There're no such at all, they yet need to be > written. And whoever writes them, would need to return from __enter__ > awaitable, because that's the requirement for an async context manager, > and it is error to return something else. > > Then, is the only logic for proposing __aenter__ is to reinsure against > a situation that someone starts to write async context manager, forgets > that they write async context manager, and make an __enter__ method > there. It's to make sure that it's impossible to accidentally use existing regular context manager that returns a future object from its __enter__ / __exit__ (nobody prohibits you to return a future object from __exit__, although it's pointless) in an 'async with' block. I really don't understand the desire to reuse existing magic methods. Even if it was decided to reuse them, it wouldn't even simplify the implementation in CPython; the code there is already DRY (I don't re-implement opcodes for 'with' statement; I reuse them). > Then your implementation will announce that "async context > manager lacks __aenter__", whereas "my" approach would announce > "Async's manager __enter__ did not return awaitable value". > > Again, is that the distinction you're shooting for, or do I miss > something? > > [] > >> Anyways, I really doubt that you can convince anyone to >> reuse existing dunder methods for async stuff. > Yeah, but it would be nice to understand why "everyone" and "so easily" > agrees to them, after pretty thorough discussion of other aspects. > NP :) FWIW, I wasn't trying to dodge the question, but rather stressing that the DRY argument is weak. And thanks for pointing out that this isn't covered in the PEP in enough detail. I'll update the PEP soon. Thanks! Yury From yselivanov.ml at gmail.com Mon Apr 27 02:39:55 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Sun, 26 Apr 2015 20:39:55 -0400 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <20150427031722.3dfa5026@x230> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> <20150427023206.13999dc8@x230> <553D789A.1000303@gmail.com> <20150427031722.3dfa5026@x230> Message-ID: <553D855B.4050104@gmail.com> Paul, On 2015-04-26 8:17 PM, Paul Sokolovsky wrote: > Hello, > > On Sun, 26 Apr 2015 19:45:30 -0400 > Yury Selivanov wrote: > > [] > >>> Then, is the only logic for proposing __aenter__ is to reinsure >>> against a situation that someone starts to write async context >>> manager, forgets that they write async context manager, and make an >>> __enter__ method there. >> It's to make sure that it's impossible to accidentally use >> existing regular context manager that returns a future object >> from its __enter__ / __exit__ (nobody prohibits you to return a >> future object from __exit__, although it's pointless) in an >> 'async with' block. > I see, so it's just to close the final loophole, unlikely to be hit in > real life (unless you can say that there're cases of doing just that in > existing asyncio libs). Well, given that Greg Ewing wanted even > stricter error-proofness, and you rejected it as such strict as to > disallow useful behavior, I've just got to trust you that in this > case, you're as strict as needed. 
Well, backwards compatibility is something that better be preserved. Especially with things like context managers. Things with __aiter__/__iter__ are also different. It's easier for everyone to just clearly separate the protocols to avoid all kind of risks. > >> I really don't understand the desire to reuse existing magic >> methods. Even if it was decided to reuse them, it wouldn't >> even simplify the implementation in CPython; the code there >> is already DRY (I don't re-implement opcodes for 'with' >> statement; I reuse them). > Well, there're 3 levels of this stuff: > > 1. How "mere" people write their code - everyone would use async def and > await, this should be bullet- (and fool-) proof. > 2. How "library" code is written - async iterators won't be written by > everyone, and only few will write async context managers; it's fair to > expect that people doing these know what they do and don't do stupid > mistakes. > 3. How it all is coded in particular Python implementation. > > It's clear that __enter__ vs __aenter__ distinction is 1st kind of > issue in your list. It is. > > As for 3rd point, I'd like to remind that CPython is only one Python > implementation. And with my MicroPython hat on, I'd like to know if > (some of) these new features are "bloat" or "worthy" for the space > constraints we have. OT: MicroPython is an amazing project. Kudos for doing it. I really hope that addition of few new magic methods won't make it too hard for you guys to implement PEP 492 in MicroPython one day. Thanks! Yury From mark at hotpy.org Mon Apr 27 09:48:49 2015 From: mark at hotpy.org (Mark Shannon) Date: Mon, 27 Apr 2015 08:48:49 +0100 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: References: <553D48BF.1070600@hotpy.org> Message-ID: <553DE9E1.4040403@hotpy.org> On 27/04/15 00:13, Guido van Rossum wrote: > But new syntax is the whole point of the PEP. I want to be able to > *syntactically* tell where the suspension points are in coroutines. Doesn't "yield from" already do that? > Currently this means looking for yield [from]; PEP 492 just adds looking > for await and async [for|with]. Making await() a function defeats the > purpose because now aliasing can hide its presence, and we're back in > the land of gevent or stackless (where *anything* can potentially > suspend the current task). I don't want to live in that land. I don't think I was clear enough. I said that "await" *is* a function, not that is should be disguised as one. Reading the code, "GetAwaitableIter" would be a better name for that element of the implementation. It is a straightforward non-blocking function. > > On Sun, Apr 26, 2015 at 1:21 PM, Mark Shannon > wrote: > > Hi, > > I was looking at PEP 492 and it seems to me that no new syntax is > required. > > Looking at the code, it does four things; all of which, or a > functional equivalent, could be done with no new syntax. > 1. Make a normal function into a generator or coroutine. This can be > done with a decorator. > 2. Support a parallel set of special methods starting with 'a' or > 'async'. Why not just use the current set of special methods? > 3. "await". "await" is an operator that takes one argument and > produces a single result, without altering flow control and can thus > be replaced by an function. > 4. Asynchronous with statement. The PEP lists the equivalent as > "with (yield from xxx)" which doesn't seem so bad. > > Please don't add unnecessary new syntax. > > Cheers, > Mark. > > P.S. 
I'm not objecting to any of the other new features proposed, > just the new syntax. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > > -- > --Guido van Rossum (python.org/~guido ) From mark at hotpy.org Mon Apr 27 10:09:53 2015 From: mark at hotpy.org (Mark Shannon) Date: Mon, 27 Apr 2015 09:09:53 +0100 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <553D5D45.2040909@hotpy.org> Message-ID: <553DEED1.5010500@hotpy.org> On 26/04/15 23:24, Nick Coghlan wrote: > > On 27 Apr 2015 07:50, "Mark Shannon" > wrote: > > On 26/04/15 21:40, Yury Selivanov wrote: > >> > >> But it's hard. Iterating through something asynchronously? Write a > >> 'while True' loop. Instead of 1 line you now have 5 or 6. Want to > >> commit your database transaction? Instead of 'async with' you will > >> write 'try..except..finally' block, with a very high probability to > >> introduce a bug, because you don't rollback or commit properly or > >> propagate exception. > > > > I don't see why you can't do transactions using a 'with' statement. > > Because you need to pass control back to the event loop from the > *__exit__* method in order to wait for the commit/rollback operation > without blocking the scheduler. The "with (yield from cm())" formulation > doesn't allow either __enter__ *or* __exit__ to suspend the coroutine to > wait for IO, so you have to do the IO up front and return a fully > synchronous (but still non-blocking) CM as the result. True. The 'with' statement cannot support this use case, but try-except can do the job: trans = yield from db_conn.transaction() try: ... except: yield from trans.roll_back() raise yield from trans.commit() Admittedly not as elegant as the 'with' statement, but perfectly readable. > > We knew about these problems going into PEP 3156 > (http://python-notes.curiousefficiency.org/en/latest/pep_ideas/async_programming.html#using-special-methods-in-explicitly-asynchronous-code) > so it's mainly a matter of having enough experience with asyncio now to > be able to suggest specific syntactic sugar to make the right way and > the easy way the same way. asyncio is just one module amongst thousands, does it really justify special syntax? Cheers, Mark. From greg.ewing at canterbury.ac.nz Mon Apr 27 14:52:25 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 28 Apr 2015 00:52:25 +1200 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <553BF680.10304@gmail.com> References: <553BF680.10304@gmail.com> Message-ID: <553E3109.2050106@canterbury.ac.nz> Yury Selivanov wrote: > I've done some experiments with grammar, and it looks like > we indeed can parse await quite differently from yield. Three > different options: You don't seem to have tried what I suggested, which is to make 'await' a unary operator with the same precedence as '-', i.e. 
replace

   factor: ('+'|'-'|'~') factor | power

with

   factor: ('+'|'-'|'~'|'await') factor | power

That would allow

   await a()
   res = await a() + await b()
   res = await await a()
   if await a(): pass
   return await a()
   print(await a())
   func(arg=await a())
   await a() * b()

--
Greg

From yselivanov.ml at gmail.com  Mon Apr 27 15:44:00 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Mon, 27 Apr 2015 09:44:00 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <553E3109.2050106@canterbury.ac.nz>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
Message-ID: <553E3D20.2020008@gmail.com>

Hi Greg,

I don't want "await a() * b()" to be parsed; it's not meaningful.

Likely you'll see "await await a()" only once in your life, so I'm fine
with using parens for it (moreover, I think it reads better with parens).

Yury

On 2015-04-27 8:52 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>> I've done some experiments with grammar, and it looks like
>> we indeed can parse await quite differently from yield. Three
>> different options:
>
> You don't seem to have tried what I suggested, which is
> to make 'await' a unary operator with the same precedence
> as '-', i.e. replace
>
> factor: ('+'|'-'|'~') factor | power
>
> with
>
> factor: ('+'|'-'|'~'|'await') factor | power
>
> That would allow
>
> await a()
> res = await a() + await b()
> res = await await a()
> if await a(): pass
> return await a()
> print(await a())
> func(arg=await a())
> await a() * b()
>

From yselivanov.ml at gmail.com  Tue Apr 28 05:07:49 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Mon, 27 Apr 2015 23:07:49 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
Message-ID: <553EF985.2070808@gmail.com>

Hi python-dev,

Another round of updates.  The reference implementation has been
updated: https://github.com/1st1/cpython/tree/await
(includes all things from the below summary of updates + tests).

Summary:

1. "PyTypeObject.tp_await" slot.  Replaces "tp_reserved".  This is to
enable implementation of Futures with the C API.  Must return an
iterator if implemented.

2. New grammar for "await" expressions, see
'Syntax of "await" expression' section.

3. inspect.iscoroutine() and inspect.iscoroutinefunction() functions.

4. Full separation of coroutines and generators.
This is a big one; let's discuss.

a) Coroutine objects raise TypeError (is NotImplementedError better?)
in their __iter__ and __next__.  Therefore it's not possible to pass
them to iter(), tuple(), next() and other similar functions that work
with iterables.

b) Because of (a), for..in iteration also does not work on coroutines
anymore.

c) 'yield from' only accepts coroutine objects from generators
decorated with 'types.coroutine'.  That means that existing asyncio
generator-based coroutines will happily yield from both coroutines and
generators.  *But* every generator-based coroutine *must* be decorated
with `asyncio.coroutine()`.  This is potentially a backwards
incompatible change.

d) inspect.isgenerator() and inspect.isgeneratorfunction() return
`False` for coroutine objects & coroutine functions.

e) Should we add a coroutine ABC (for cython etc)?

I, personally, think this is highly necessary.  First, separation of
coroutines from generators is extremely important.  One day there
won't be generator-based coroutines, and we want to avoid any kind of
confusion.  Second, we can only do this in 3.5; this kind of semantics
change won't ever be possible later.
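To make the separation in point 4 concrete, here is a short sketch of
the intended behavior (assuming the reference implementation linked
above; an illustration, not part of the PEP):

    import types

    async def coro():
        return 'spam'

    @types.coroutine
    def gen_based():
        # (c): a decorated generator-based coroutine may still
        # 'yield from' a coroutine object.
        return (yield from coro())

    c = coro()
    try:
        iter(c)    # (a): coroutine objects are not iterable
    except TypeError:
        pass
    c.close()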
asyncio recommends using @coroutine decorator, and most projects that I've seen do use it. Also there is no reason for people to use iter() and next() functions on coroutines when writing asyncio code. I doubt that this will cause serious backwards compatibility problems (asyncio also has provisional status). Thank you, Yury PEP: 492 Title: Coroutines with async and await syntax Version: $Revision$ Last-Modified: $Date$ Author: Yury Selivanov Status: Draft Type: Standards Track Content-Type: text/x-rst Created: 09-Apr-2015 Python-Version: 3.5 Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015 Abstract ======== This PEP introduces new syntax for coroutines, asynchronous ``with`` statements and ``for`` loops. The main motivation behind this proposal is to streamline writing and maintaining asynchronous code, as well as to simplify previously hard to implement code patterns. Rationale and Goals =================== Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the ``yield from`` syntax introduced in PEP 380. This approach has a number of shortcomings: * it is easy to confuse coroutines with regular generators, since they share the same syntax; async libraries often attempt to alleviate this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); * it is not possible to natively define a coroutine which has no ``yield`` or ``yield from`` statements, again requiring the use of decorators to fix potential refactoring issues; * support for asynchronous calls is limited to expressions where ``yield`` is allowed syntactically, limiting the usefulness of syntactic features, such as ``with`` and ``for`` statements. This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring. Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new ``async with`` statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new ``async for`` statement makes it possible to perform asynchronous calls in iterators. Specification ============= This proposal introduces new syntax and semantics to enhance coroutine support in Python, it does not change the internal implementation of coroutines, which are still based on generators. It is strongly suggested that the reader understands how coroutines are implemented in Python (PEP 342 and PEP 380). It is also recommended to read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). From this point in this document we use the word *coroutine* to refer to functions declared using the new syntax. *generator-based coroutine* is used where necessary to refer to coroutines that are based on generator syntax. New Coroutine Declaration Syntax -------------------------------- The following new syntax is used to declare a coroutine:: async def read_data(db): pass Key properties of coroutines: * ``async def`` functions are always coroutines, even if they do not contain ``await`` expressions. * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` expressions in an ``async`` function. 
* Internally, a new code object flag - ``CO_COROUTINE`` - is introduced
  to enable runtime detection of coroutines (and migrating existing
  code).  All coroutines have both ``CO_COROUTINE`` and
  ``CO_GENERATOR`` flags set.

* Regular generators, when called, return a *generator object*;
  similarly, coroutines return a *coroutine object*.

* ``StopIteration`` exceptions are not propagated out of coroutines,
  and are replaced with a ``RuntimeError``.  For regular generators
  such behavior requires a future import (see PEP 479).


types.coroutine()
-----------------

A new function ``coroutine(gen)`` is added to the ``types`` module.  It
applies the ``CO_COROUTINE`` flag to the passed generator-function's
code object, making it return a *coroutine object* when called.

This feature enables an easy upgrade path for existing libraries.


Await Expression
----------------

The following new ``await`` expression is used to obtain a result of
coroutine execution::

    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...

``await``, similarly to ``yield from``, suspends execution of the
``read_data`` coroutine until the ``db.fetch`` *awaitable* completes
and returns the result data.

It uses the ``yield from`` implementation with an extra step of
validating its argument.  ``await`` only accepts an *awaitable*, which
can be one of:

* A *coroutine object* returned from a *coroutine* or a generator
  decorated with ``types.coroutine()``.

* An object with an ``__await__`` method returning an iterator.

  Any ``yield from`` chain of calls ends with a ``yield``.  This is a
  fundamental mechanism of how *Futures* are implemented.  Since,
  internally, coroutines are a special kind of generators, every
  ``await`` is suspended by a ``yield`` somewhere down the chain of
  ``await`` calls (please refer to PEP 3156 for a detailed
  explanation.)

  To enable this behavior for coroutines, a new magic method called
  ``__await__`` is added.  In asyncio, for instance, to enable Future
  objects in ``await`` statements, the only change is to add an
  ``__await__ = __iter__`` line to the ``asyncio.Future`` class.

  Objects with an ``__await__`` method are called *Future-like* objects
  in the rest of this PEP.

  Also, please note that the ``__aiter__`` method (see its definition
  below) cannot be used for this purpose.  It is a different protocol,
  and would be like using ``__iter__`` instead of ``__call__`` for
  regular callables.

  It is a ``TypeError`` if ``__await__`` returns anything but an
  iterator.

* Objects defined with the CPython C API with a ``tp_await`` function,
  returning an iterator (similar to the ``__await__`` method).

It is a ``SyntaxError`` to use ``await`` outside of a coroutine.

It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.


Syntax of "await" expression
''''''''''''''''''''''''''''

The ``await`` keyword is defined differently from ``yield`` and
``yield from``.  The main difference is that *await expressions* do not
require parentheses around them most of the time.
Examples:: ================================== ================================== Expression Will be parsed as ================================== ================================== ``if await fut: pass`` ``if (await fut): pass`` ``if await fut + 1: pass`` ``if (await fut) + 1: pass`` ``pair = await fut, 'spam'`` ``pair = (await fut), 'spam'`` ``with await fut, open(): pass`` ``with (await fut), open(): pass`` ``await foo()['spam'].baz()()`` ``await ( foo()['spam'].baz()() )`` ``return await coro()`` ``return ( await coro() )`` ``res = await coro() ** 2`` ``res = (await coro()) ** 2`` ``func(a1=await coro(), a2=0)`` ``func(a1=(await coro()), a2=0)`` ================================== ================================== See `Grammar Updates`_ section for details. Asynchronous Context Managers and "async with" ---------------------------------------------- An *asynchronous context manager* is a context manager that is able to suspend execution in its *enter* and *exit* methods. To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: ``__aenter__`` and ``__aexit__``. Both must return an *awaitable*. An example of an asynchronous context manager:: class AsyncContextManager: async def __aenter__(self): await log('entering context') async def __aexit__(self, exc_type, exc, tb): await log('exiting context') New Syntax '''''''''' A new statement for asynchronous context managers is proposed:: async with EXPR as VAR: BLOCK which is semantically equivalent to:: mgr = (EXPR) aexit = type(mgr).__aexit__ aenter = type(mgr).__aenter__(mgr) exc = True try: try: VAR = await aenter BLOCK except: exc = False exit_res = await aexit(mgr, *sys.exc_info()) if not exit_res: raise finally: if exc: await aexit(mgr, None, None, None) As with regular ``with`` statements, it is possible to specify multiple context managers in a single ``async with`` statement. It is an error to pass a regular context manager without ``__aenter__`` and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to use ``async with`` outside of a coroutine. Example ''''''' With asynchronous context managers it is easy to implement proper database transaction managers for coroutines:: async def commit(session, data): ... async with session.transaction(): ... await session.update(data) ... Code that needs locking also looks lighter:: async with lock: ... instead of:: with (yield from lock): ... Asynchronous Iterators and "async for" -------------------------------------- An *asynchronous iterable* is able to call asynchronous code in its *iter* implementation, and *asynchronous iterator* can call asynchronous code in its *next* method. To support asynchronous iteration: 1. An object must implement an ``__aiter__`` method returning an *awaitable* resulting in an *asynchronous iterator object*. 2. An *asynchronous iterator object* must implement an ``__anext__`` method returning an *awaitable*. 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` exception. An example of asynchronous iterable:: class AsyncIterable: async def __aiter__(self): return self async def __anext__(self): data = await self.fetch_data() if data: return data else: raise StopAsyncIteration async def fetch_data(self): ... 
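Such an iterable is consumed with the ``async for`` statement defined
in the next section.  For illustration only, a minimal consumer (a
sketch; ``handle`` is any per-item callable)::

    async def consume(aiterable, handle):
        # Drain an asynchronous iterable with 'async for'
        # (introduced below); 'handle' processes each item.
        async for data in aiterable:
            handle(data)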
New Syntax
''''''''''

A new statement for iterating through asynchronous iterators is
proposed::

    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2

which is semantically equivalent to::

    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2

It is a ``TypeError`` to pass a regular iterable without an
``__aiter__`` method to ``async for``.  It is a ``SyntaxError`` to use
``async for`` outside of a coroutine.

As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.


Example 1
'''''''''

With the asynchronous iteration protocol it is possible to
asynchronously buffer data during iteration::

    async for data in cursor:
        ...

Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.

The following code illustrates the new asynchronous iteration
protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()

then the ``Cursor`` class can be used as follows::

    async for row in Cursor():
        print(row)

which would be equivalent to the following code::

    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)


Example 2
'''''''''

The following is a utility class that transforms a regular iterable to
an asynchronous one.  While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.

::

    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for letter in AsyncIteratorWrapper("abc"):
        print(letter)


Why StopAsyncIteration?
'''''''''''''''''''''''

Coroutines are still based on generators internally.  So, before PEP
479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'

and

::

    def g2():
        yield from fut
        raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``

::

    async def a1():
        await fut
        raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``.  Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.

Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.


Debugging Features
------------------

One of the most frequent mistakes that people make when using
generators as coroutines is forgetting to use ``yield from``::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1) # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions with
a special object with a destructor logging a warning.  Whenever a
wrapped generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorator function
was defined, a stack trace of where it was collected, etc.
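In rough outline, such a wrapper could look like the following
simplified sketch (for illustration; not asyncio's actual
``CoroWrapper``)::

    import traceback
    import warnings

    class CoroWrapperSketch:
        # Simplified sketch: wrap a generator-based coroutine and warn
        # if it is garbage collected without ever being started.
        def __init__(self, gen):
            self.gen = gen
            self.created_at = ''.join(traceback.format_stack())

        def __iter__(self):
            # 'yield from wrapper' delegates to the wrapped generator.
            return self.gen

        def __repr__(self):
            return '<CoroWrapperSketch of %r>' % (self.gen,)

        def __del__(self):
            frame = getattr(self.gen, 'gi_frame', None)
            if frame is not None and frame.f_lasti == -1:
                # Never started: almost certainly a missing 'yield from'.
                warnings.warn('%r was never yielded from; created at:\n%s'
                              % (self.gen, self.created_at))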
The wrapper object also provides a convenient ``__repr__`` function
with detailed information about the generator.

The only problem is how to enable these debug capabilities.  Since
debug facilities should be a no-op in production mode, the
``@coroutine`` decorator makes the decision of whether to wrap or not
to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``.
This way it is possible to run asyncio programs with asyncio's own
functions instrumented.  ``EventLoop.set_debug``, a different debug
facility, has no impact on the ``@coroutine`` decorator's behavior.

With this proposal, coroutines become a native concept, distinct from
generators.  New methods ``set_coroutine_wrapper`` and
``get_coroutine_wrapper`` are added to the ``sys`` module, with which
frameworks can provide advanced debugging facilities.

It is also important to make coroutines as fast and efficient as
possible, therefore there are no debug features enabled by default.

Example::

    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.CoroWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me()  # <- this line will likely GC the coroutine object and
                # trigger asyncio.CoroWrapper's code.

    assert isinstance(debug_me(), asyncio.CoroWrapper)

    sys.set_coroutine_wrapper(None)  # <- this unsets any
                                     #    previously set wrapper

    assert not isinstance(debug_me(), asyncio.CoroWrapper)

If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper
replaces the previous wrapper.  ``sys.set_coroutine_wrapper(None)``
unsets the wrapper.


inspect.iscoroutine() and inspect.iscoroutinefunction()
-------------------------------------------------------

Two new functions are added to the ``inspect`` module:

* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
  coroutine object.

* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
  coroutine function.


Differences between coroutines and generators
----------------------------------------------

A great effort has been made to make sure that coroutines and
generators are separate concepts:

1. Coroutine objects do not implement ``__iter__`` and ``__next__``
   methods.  Therefore they cannot be iterated over or passed to
   ``iter()``, ``list()``, ``tuple()`` and other built-ins.  They also
   cannot be used in a ``for..in`` loop.

2. ``yield from`` does not accept coroutine objects (unless it is used
   in a generator-based coroutine decorated with ``types.coroutine``.)

3. ``yield from`` does not accept coroutine objects from plain Python
   generators (*not* generator-based coroutines.)

4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
   return ``False`` for coroutine objects and coroutine functions.


Coroutine objects
-----------------

Coroutines are based on generators internally, thus they share the
implementation.  Similarly to generator objects, coroutine objects have
``throw``, ``send`` and ``close`` methods.  ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines).


Glossary
========

:Coroutine:
    A coroutine function, or just "coroutine", is declared with
    ``async def``.  It uses ``await`` and ``return value``; see `New
    Coroutine Declaration Syntax`_ for details.

:Coroutine object:
    Returned from a coroutine function.  See `Await Expression`_ for
    details.

:Future-like object:
    An object with an ``__await__`` method, or a C object with a
    ``tp_await`` function, returning an iterator.
A coroutine waiting for a Future-like object is suspended until the Future-like object's ``__await__`` completes, and returns the result. See `Await Expression`_ for details. :Awaitable: A *Future-like* object or a *coroutine object*. See `Await Expression`_ for details. :Generator-based coroutine: Coroutines based in generator syntax. Most common example is ``@asyncio.coroutine``. :Asynchronous context manager: An asynchronous context manager has ``__aenter__`` and ``__aexit__`` methods and can be used with ``async with``. See `Asynchronous Context Managers and "async with"`_ for details. :Asynchronous iterable: An object with an ``__aiter__`` method, which must return an *asynchronous iterator* object. Can be used with ``async for``. See `Asynchronous Iterators and "async for"`_ for details. :Asynchronous iterator: An asynchronous iterator has an ``__anext__`` method. See `Asynchronous Iterators and "async for"`_ for details. List of functions and methods ============================= ================= =================================== ================= Method Can contain Can't contain ================= =================================== ================= async def func await, return value yield, yield from async def __a*__ await, return value yield, yield from def __a*__ return awaitable await def __await__ yield, yield from, return iterable await generator yield, yield from, return value await ================= =================================== ================= Where: * "async def func": coroutine; * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined with the ``async`` keyword; * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, ``__aexit__`` defined without the ``async`` keyword, must return an *awaitable*; * "def __await__": ``__await__`` method to implement *Future-like* objects; * generator: a "regular" generator, function defined with ``def`` and which contains a least one ``yield`` or ``yield from`` expression. Transition Plan =============== To avoid backwards compatibility issues with ``async`` and ``await`` keywords, it was decided to modify ``tokenizer.c`` in such a way, that it: * recognizes ``async def`` name tokens combination (start of a coroutine); * keeps track of regular functions and coroutines; * replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with ``AWAIT`` when in the process of yielding tokens for coroutines. This approach allows for seamless combination of new syntax features (all of them available only in ``async`` functions) with any existing code. An example of having "async def" and "async" attribute in one piece of code:: class Spam: async = 42 async def ham(): print(getattr(Spam, 'async')) # The coroutine can be executed and will print '42' Backwards Compatibility ----------------------- This proposal preserves 100% backwards compatibility. Grammar Updates --------------- Grammar changes are also fairly minimal:: decorated: decorators (classdef | funcdef | async_funcdef) async_funcdef: ASYNC funcdef compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt | funcdef | classdef | decorated | async_stmt) async_stmt: ASYNC (funcdef | with_stmt | for_stmt) power: atom_expr ['**' factor] atom_expr: [AWAIT] atom trailer* Transition Period Shortcomings ------------------------------ There is just one. 
Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** with the ``def`` keyword::

    # async and await will always be parsed as variables

    async def outer():                             # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)            # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.

To work around these issues, the above examples can be easily rewritten
to a more readable form::

    async def outer():                             # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                               # 2
        return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are
proper keywords.  Or if it's decided to use a future import for this
PEP.


Deprecation Plans
-----------------

``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6.  In 3.7 we will transform them to proper keywords.  Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.


asyncio
-------

The ``asyncio`` module was adapted and tested to work with coroutines
and the new statements.  Backwards compatibility is 100% preserved.

The required changes are mainly:

1. Modify the ``@asyncio.coroutine`` decorator to use the new
   ``types.coroutine()`` function.

2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future``
   class.

3. Add ``ensure_task()`` as an alias for the ``async()`` function.
   Deprecate the ``async()`` function.


Design Considerations
=====================

PEP 3152
--------

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions").  Some key points:

1. A new keyword ``codef`` to declare a *cofunction*.  A *cofunction*
   is always a generator, even if there are no ``cocall`` expressions
   inside it.  Maps to ``async def`` in this proposal.

2. A new keyword ``cocall`` to call a *cofunction*.  Can only be used
   inside a *cofunction*.  Maps to ``await`` in this proposal (with
   some differences, see below.)

3. It is not possible to call a *cofunction* without a ``cocall``
   keyword.

4. ``cocall`` grammatically requires parentheses after it::

    atom: cocall | <existing alternatives for atom>
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME

5. ``cocall f(*args, **kwds)`` is semantically equivalent to
   ``yield from f.__cocall__(*args, **kwds)``.

Differences from this proposal:

1. There is no equivalent of ``__cocall__`` in this PEP, which is
   called and its result is passed to ``yield from`` in the ``cocall``
   expression.  The ``await`` keyword expects an *awaitable* object,
   validates the type, and executes ``yield from`` on it.  Although the
   ``__await__`` method is similar to ``__cocall__``, it is only used
   to define *Future-like* objects.

2. ``await`` is defined in almost the same way as ``yield from`` in the
   grammar (it is later enforced that ``await`` can only be inside
   ``async def``).  It is possible to simply write ``await future``,
   whereas ``cocall`` always requires parentheses.

3. To make asyncio work with PEP 3152 it would be required to modify
   the ``@asyncio.coroutine`` decorator to wrap all functions in an
   object with a ``__cocall__`` method, or to implement ``__cocall__``
   on generators.  To call *cofunctions* from existing generator-based
   coroutines it would be required to use the
   ``costart(cofunc, *args, **kwargs)`` built-in.

4.
   Since it is impossible to call a *cofunction* without a ``cocall``
   keyword, it automatically prevents the common mistake of forgetting
   to use ``yield from`` on generator-based coroutines.  This proposal
   addresses this problem with a different approach, see `Debugging
   Features`_.

5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
   is that if it is decided to implement coroutine-generators --
   coroutines with ``yield`` or ``async yield`` expressions -- we
   wouldn't need a ``cocall`` keyword to call them.  So we'll end up
   having ``__cocall__`` and no ``__call__`` for regular coroutines,
   and having ``__call__`` and no ``__cocall__`` for coroutine-
   generators.

6. Requiring parentheses grammatically also introduces a whole lot of
   new problems.  The following code::

       await fut
       await function_returning_future()
       await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))

   would look like::

       cocall fut()  # or cocall costart(fut)
       cocall (function_returning_future())()
       cocall asyncio.gather(costart(coro1, arg1, arg2),
                             costart(coro2, arg1, arg2))

7. There are no equivalents of ``async for`` and ``async with`` in
   PEP 3152.


Coroutine-generators
--------------------

With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions.  To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and ``async
yield from`` would raise a ``StopAsyncIteration`` exception.

While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal.  It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects.  This is a matter
for a separate PEP.


No implicit wrapping in Futures
-------------------------------

There is a proposal to add a similar mechanism to ECMAScript 7 [2]_.  A
key difference is that JavaScript "async functions" always return a
Promise.  While this approach has some advantages, it also implies that
a new Promise object is created on each "async function" invocation.

We could implement similar functionality in Python, by wrapping all
coroutines in a Future object, but this has the following
disadvantages:

1. Performance.  A new Future object would be instantiated on each
   coroutine call.  Moreover, this makes implementation of ``await``
   expressions slower (disabling optimizations of ``yield from``).

2. A new built-in ``Future`` object would need to be added.

3. Coming up with a generic ``Future`` interface that is usable for any
   use case in any framework is a very hard problem to solve.

4. It is not a feature that is used frequently, when most of the code
   is coroutines.


Why "async" and "await" keywords
--------------------------------

async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* proposal to add async/await in ECMAScript 7 [2]_;
  see also Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).


Why "__aiter__" is a coroutine
------------------------------

In principle, ``__aiter__`` could be a regular function.
There are several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
  methods are coroutines, users would often make a mistake defining it
  as ``async`` anyway;

* there might be a need to run some asynchronous operations in
  ``__aiter__``, for instance to prepare DB queries or do some file
  operation.


Importance of "async" keyword
-----------------------------

While it is possible to just implement the ``await`` expression and
treat all functions with at least one ``await`` as coroutines, this
approach makes API design, code refactoring, and long-term support
harder.

Let's pretend that Python only has the ``await`` keyword::

    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()

If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken.  To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced.


Why "async def"
---------------

For some people the bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``.  It is certainly easier to
type.  But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous.  It is also more consistent
with the existing grammar.


Why "async for/with" instead of "await for/with"
------------------------------------------------

``async`` is an adjective, and hence it is a better choice for a
*statement qualifier* keyword.  ``await for/with`` would imply that
something is awaiting the completion of a ``for`` or ``with``
statement.


Why "async def" and not "def async"
-----------------------------------

The ``async`` keyword is a *statement qualifier*.  Good analogies for
it are the "static", "public", and "unsafe" keywords from other
languages.  "async for" is an asynchronous "for" statement, "async
with" is an asynchronous "with" statement, "async def" is an
asynchronous function.

Having "async" after the main statement keyword might introduce some
confusion, as "for async item in iterator" can be read as "for each
asynchronous item in iterator".

Having the ``async`` keyword before ``def``, ``with`` and ``for`` also
makes the language grammar simpler.  And "async def" better separates
coroutines from regular functions visually.


Why not a __future__ import
---------------------------

``__future__`` imports are inconvenient and easy to forget to add.
Also, they are enabled for the whole source file.  Consider that there
is a big project with a popular module named "async.py".  With future
imports it is required to either import it using ``__import__()`` or
``importlib.import_module()`` calls, or to rename the module.  The
proposed approach makes it possible to continue using old code and
modules without a hassle, while coming up with a migration plan for
future Python versions.


Why magic methods start with "a"
--------------------------------

New asynchronous magic methods ``__aiter__``, ``__anext__``,
``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
An alternative proposal is to use the "async" prefix, so that
``__aiter__`` becomes ``__async_iter__``.  However, to align new magic
methods with the existing ones, such as ``__radd__`` and ``__iadd__``,
it was decided to use a shorter version.
Why not reuse existing magic names ---------------------------------- An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations:: class CM: async def __enter__(self): # instead of __aenter__ ... This approach has the following downsides: * it would not be possible to create an object that works in both ``with`` and ``async with`` statements; * it would break backwards compatibility, as nothing prohibits from returning a Future-like objects from ``__enter__`` and/or ``__exit__`` in Python <= 3.4; * one of the main points of this proposal is to make coroutines as simple and foolproof as possible, hence the clear separation of the protocols. Why not reuse existing "for" and "with" statements -------------------------------------------------- The vision behind existing generator-based coroutines and this proposal is to make it easy for users to see where the code might be suspended. Making existing "for" and "with" statements to recognize asynchronous iterators and context managers will inevitably create implicit suspend points, making it harder to reason about the code. Comprehensions -------------- For the sake of restricting the broadness of this PEP there is no new syntax for asynchronous comprehensions. This should be considered in a separate PEP, if there is a strong demand for this feature. Async lambdas ------------- Lambda coroutines are not part of this proposal. In this proposal they would look like ``async lambda(parameters): expression``. Unless there is a strong demand to have them as part of this proposal, it is recommended to consider them later in a separate PEP. Performance =========== Overall Impact -------------- This proposal introduces no observable performance impact. Here is an output of python's official set of benchmarks [4]_: :: python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe [skipped] Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64 i386 Total CPU cores: 8 ### etree_iterparse ### Min: 0.365359 -> 0.349168: 1.05x faster Avg: 0.396924 -> 0.379735: 1.05x faster Significant (t=9.71) Stddev: 0.01225 -> 0.01277: 1.0423x larger The following not significant results are hidden, use -v to show them: django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle, fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http. Tokenizer modifications ----------------------- There is no observable slowdown of parsing python files with the modified tokenizer: parsing of one 12Mb file (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount of time. async/await ----------- The following micro-benchmark was used to determine performance difference between "async" functions and generators:: import sys import time def binary(n): if n <= 0: return 1 l = yield from binary(n - 1) r = yield from binary(n - 1) return l + 1 + r async def abinary(n): if n <= 0: return 1 l = await abinary(n - 1) r = await abinary(n - 1) return l + 1 + r def timeit(gen, depth, repeat): t0 = time.time() for _ in range(repeat): list(gen(depth)) t1 = time.time() print('{}({}) * {}: total {:.3f}s'.format( gen.__name__, depth, repeat, t1-t0)) The result is that there is no observable performance difference. Minimum timing of 3 runs :: abinary(19) * 30: total 12.985s binary(19) * 30: total 12.953s Note that depth of 19 means 1,048,575 calls. 
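(That is, 2**20 - 1 recursive calls per traversal.)  The timings above
correspond to driver calls along the following lines; the exact
invocation is not shown in the benchmark source, so this is an
assumption inferred from the output format::

    # Assumed driver for the micro-benchmark above: times the
    # coroutine and generator variants at depth 19, 30 repetitions.
    timeit(abinary, 19, 30)
    timeit(binary, 19, 30)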
Reference Implementation
========================

The reference implementation can be found here: [3]_.

List of high-level changes and new protocols
--------------------------------------------

1. New syntax for defining coroutines: ``async def`` and new ``await``
   keyword.

2. New ``__await__`` method for Future-like objects, and new
   ``tp_await`` slot in ``PyTypeObject``.

3. New syntax for asynchronous context managers: ``async with``. And
   associated protocol with ``__aenter__`` and ``__aexit__`` methods.

4. New syntax for asynchronous iteration: ``async for``. And
   associated protocol with ``__aiter__``, ``__anext__`` and new
   built-in exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
   ``Await``.

6. New functions: ``sys.set_coroutine_wrapper(callback)``,
   ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
   ``inspect.iscoroutinefunction()``, and ``inspect.iscoroutine()``.

7. New ``CO_COROUTINE`` bit flag for code objects.

While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide
users with convenient and unambiguous APIs built on the ``async def``,
``await``, ``async for`` and ``async with`` syntax.

Working example
---------------

All concepts proposed in this PEP are implemented [3]_ and can be
tested.

::

    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()

References
==========

.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine

.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions

.. [3] https://github.com/1st1/cpython/tree/await

.. [4] https://hg.python.org/benchmarks

.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx

.. [6] http://docs.hhvm.com/manual/en/hack.async.php

.. [7] https://www.dartlang.org/articles/await-async/

.. [8] http://docs.scala-lang.org/sips/pending/async.html

.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental

.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)

Acknowledgments
===============

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.

Copyright
=========

This document has been placed in the public domain.

..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End:

From stefan_ml at behnel.de Tue Apr 28 07:43:31 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Tue, 28 Apr 2015 07:43:31 +0200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <553EF985.2070808@gmail.com>
References: <553EF985.2070808@gmail.com>
Message-ID:

Yury Selivanov schrieb am 28.04.2015 um 05:07:
> e) Should we add a coroutine ABC (for cython etc)?

Sounds like the right thing to do, yes. IIUC, a Coroutine would be a new stand-alone ABC with send, throw and close methods.
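Roughly something like this, I imagine (just a sketch to fix the idea; the names and details here are my own assumptions, not the actual patch):

    from abc import ABCMeta, abstractmethod

    class Coroutine(metaclass=ABCMeta):
        @abstractmethod
        def send(self, value):
            """Resume the coroutine, sending 'value' into it."""

        @abstractmethod
        def throw(self, typ, val=None, tb=None):
            """Raise an exception inside the coroutine."""

        def close(self):
            """Request that the coroutine cleans up and exits."""
            try:
                self.throw(GeneratorExit)
            except (GeneratorExit, StopIteration):
                pass
            else:
                raise RuntimeError('coroutine ignored GeneratorExit')

        @classmethod
        def __subclasshook__(cls, C):
            # structural check: anything with send/throw/close qualifies
            if cls is Coroutine:
                if all(any(m in B.__dict__ for B in C.__mro__)
                       for m in ('send', 'throw', 'close')):
                    return True
            return NotImplemented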
Should a Generator then inherit from both Iterator and Coroutine, or would that counter your intention to separate coroutines from generators as a concept? I mean, they do share the same interface ... It seems you're already aware of https://bugs.python.org/issue24018 Stefan From arnodel at gmail.com Sun Apr 26 10:48:49 2015 From: arnodel at gmail.com (Arnaud Delobelle) Date: Sun, 26 Apr 2015 09:48:49 +0100 Subject: [Python-Dev] async/await in Python; v2 In-Reply-To: <553C00CE.8010601@gmail.com> References: <55368858.4010007@gmail.com> <553C00CE.8010601@gmail.com> Message-ID: On 25 April 2015 at 22:02, Yury Selivanov wrote: [...] > On 2015-04-25 4:47 PM, Arnaud Delobelle wrote: [...] >> 1. About the 'async for' construct. Each iteration will create a new >> coroutine object (the one returned by Cursor.__anext__()) and it seems >> to me that it can be wasteful. In the example given of an 'aiterable' >> Cursor class, probably a large number of rows will fill the cursor >> buffer in one call of cursor._prefetch(). However each row that is >> iterated over will involve the creation execution of a new coroutine >> object. It seems to me that what is desirable in that case is that >> all the buffered rows will be iterated over as in a plain for loop. > > > I agree that creating a new coroutine object is a little bit > wasteful. > > However, the proposed iteration protocol was designed to: > > 1. Resemble already existing __iter__/__next__/StopIteration > protocol; > > 2. Pave the road to introduce coroutine-generators in the > future. Do you mean that __aiter__() would return a 'coroutine-generator'? I'm not sure what such an object is but if it is a suspendable generator in the same way that a coroutine is a suspendable function, then this is a strong argument to make the __aiter__() magic method a coroutine rather than a plain function. I.e. __aiter__() would return either an 'aiterator' or a 'coroutine generator object'. I think this could be mentioned in the section 'Why "__aiter__" is a coroutine' [1]. > We could, in theory, design the protocol to make __anext__ > awaitable return a regular iterators (and raise > StopAsyncIteration at the end) to make things more > efficient, but that would complicate the protocol > tremendously, and make it very hard to program and debug. > > My opinion is that this has to be addressed in 3.6 with > coroutine-generators if there is enough interest from > Python users. True, but to me this is bound to happen. I feel like the semantics of __anext__() is tied to the behaviour of this yet to be defined coroutine generator object and that if it turns out that the natural bevaviour of a coroutine generator is not consistent with the semantics of __anext__() then it would be a shame. I must say I have no evidence that this will happen! >> >> 2. I think the semantics of the new coroutine objects could be >> defined more clearly in the PEP. Of course they are pretty obvious >> when you know that the coroutines are meant to replace >> asyncio.coroutine as described in [1]. I understand that this PEP is >> mostly for the benefit of asyncio, hence mainly of interest of people >> who know it. However I think it would be good for it to be more >> self-contained. I have often read a PEP as an introduction to a new >> feature of Python. I feel that if I was not familiar with yield from >> and asyncio I would not be able to understand this PEP, even though >> potentially one could use the new constructs without knowing anything >> about them. >> > > I agree. 
I plan to update the PEP with some new semantics (prohibit passing coroutine-objects to iter(), tuple() and other builtins, as well as using them in 'for .. in coro()' loops). I'll add a section with a more detailed explanation of coroutine-objects.

Great!

Thanks,

--
Arnaud

PS: there's a slight asymmetry in the terminology between coroutines and generators. 'Generator functions' are to 'generators' what 'coroutines' are to 'coroutine objects', which makes it difficult to know what one is talking about when referring to a 'coroutine generator'.

[1] https://www.python.org/dev/peps/pep-0492/#id52

From encukou at gmail.com Mon Apr 27 16:02:02 2015
From: encukou at gmail.com (Petr Viktorin)
Date: Mon, 27 Apr 2015 16:02:02 +0200
Subject: [Python-Dev] A macro for easier rich comparisons
Message-ID:

It seems the discussion on python-ideas, and also the patch review, died down. So I'm posting to python-dev.

A macro like this would reduce boilerplate in the stdlib and in third-party C extensions. It would ease porting C extensions to Python 3, where rich comparison is mandatory.

#define Py_RETURN_RICHCOMPARE(val1, val2, op) \
do { \
    switch (op) { \
    case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
    } \
    Py_RETURN_NOTIMPLEMENTED; \
} while (0)

Is any of the core devs interested in this macro? Anything I can do to help get it in?

http://bugs.python.org/issue23699

From pmiscml at gmail.com Mon Apr 27 00:25:12 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Mon, 27 Apr 2015 01:25:12 +0300
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To: <553D4D23.1060500@gmail.com>
References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com>
Message-ID: <20150427012512.6919d1eb@x230>

Hello,

On Sun, 26 Apr 2015 16:40:03 -0400 Yury Selivanov wrote:

> Hi Mark,
> On 2015-04-26 4:21 PM, Mark Shannon wrote:
> > Hi,
> > I was looking at PEP 492 and it seems to me that no new syntax is required.
> Mark, all your points are explained in the PEP in great detail:

Indeed, they are. The very basic counter-argument against this PEP is that everything it proposes is already doable. But the PEP makes very clear that the only reason it exists is to make coroutine programming easier and less error-prone. However, given that there're questions even like that and you're great at keeping up discussion, I'd appreciate additional argumentation on:

> > 2. Support a parallel set of special methods starting with 'a' or 'async'. Why not just use the current set of special methods?
> Because you can't reuse them.
> https://www.python.org/dev/peps/pep-0492/#why-not-reuse-existing-magic-names

Ok, so here are the 3 points this link gives, with my concerns/questions:

> An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an async keyword to their declarations:
> [But:]
> - it would not be possible to create an object that works in both with and async with statements;

Yes, and I would say, for good. Behavior of sync and async code is different enough to warrant separate classes (if not libraries!) to implement each paradigm.
What if, in otherwise async code, someone will miss "async" in "async with" and call sync version of context manager? So, losing ability stated above isn't big practical loss. > - it would look confusing Sorry, "async def __enter__" doesn't look more confusing than "__aenter__" (vs "__enter__"). > and would require some implicit magic behind the scenes in the > interpreter; Interpreter already does a lot of "implicit magic". Can you please elaborate what exactly would need to be done? > one of the main points of this proposal is to make coroutines as > simple and foolproof as possible. And it's possible to agree that "to separate notions, create a dichotomy" is a simple principle on its own. But just taking bunch of stuff - special methods, exceptions - and duplicating it is a bloat a violates another principle - DRY. You argue that this will make coroutine writing simple. But coroutines are part of the language, and duplicating notions makes language more complex/complicated. Can you please argue that in this case it's worth duplicating hierarchies instead of reusing existing lower-level concepts, given that higher-level concepts are already distinguished well enough (async and await keywords separate "old" and "new" things very visibly). -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Mon Apr 27 01:32:06 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Mon, 27 Apr 2015 02:32:06 +0300 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D6B87.3040500@gmail.com> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> Message-ID: <20150427023206.13999dc8@x230> Hello, On Sun, 26 Apr 2015 18:49:43 -0400 Yury Selivanov wrote: [] > >> >- it would look confusing > > Sorry, "async def __enter__" doesn't look more confusing than > > "__aenter__" (vs "__enter__"). > > I'll update the PEP. > > The argument shouldn't be that it's confusing, the argument > is that __aenter__ returns an 'awaitable', which is either > a coroutine-object or a future. > > You can't reuse __enter__, because you'd break backwards > compatibility -- it's perfectly normal for context > managers in python to return any object from their __enter__. > If we assign some special meaning to futures -- we'll break > existing code. So, again to make sure I (and hopefully other folks) understand it right. You say "it's perfectly normal for context managers in python to return any object from their __enter__". That's true, but we talk about async context managers. There're no such at all, they yet need to be written. And whoever writes them, would need to return from __enter__ awaitable, because that's the requirement for an async context manager, and it is error to return something else. Then, is the only logic for proposing __aenter__ is to reinsure against a situation that someone starts to write async context manager, forgets that they write async context manager, and make an __enter__ method there. Then your implementation will announce that "async context manager lacks __aenter__", whereas "my" approach would announce "Async's manager __enter__ did not return awaitable value". Again, is that the distinction you're shooting for, or do I miss something? [] > Anyways, I really doubt that you can convince anyone to > reuse existing dunder methods for async stuff. Yeah, but it would be nice to understand why "everyone" and "so easily" agrees to them, after pretty thorough discussion of other aspects. 
> Thanks, > Yury -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Mon Apr 27 01:50:06 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Mon, 27 Apr 2015 02:50:06 +0300 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: References: <553D48BF.1070600@hotpy.org> Message-ID: <20150427025006.6e1d17ef@x230> Hello, On Sun, 26 Apr 2015 16:13:43 -0700 Guido van Rossum wrote: > But new syntax is the whole point of the PEP. I want to be able to > *syntactically* tell where the suspension points are in coroutines. > Currently this means looking for yield [from]; PEP 492 just adds > looking for await and async [for|with]. Making await() a function > defeats the purpose because now aliasing can hide its presence, and > we're back in the land of gevent or stackless (where *anything* can > potentially suspend the current task). I don't want to live in that > land. And someone in this thread should have link in a signature to the page on why there's not wanting to live in that land. I don't have it handy, so may offer only reminiscence: if you don't know where there're suspension points, you essentially should assume that any function call can be a suspension point. And then you dropped almost as low as when using threads - you can lose control anytime, data may change anytime, you need to extensively use locks, etc. Oh, and btw, there was related claim earlier that some things are better done in thread pools, so async context managers can be done away with. That's another faulty point - if asyncio is to be feature-complete, it should work [well] with own functionality, without crutches of "foreign" paradigms. Otherwise you may be just happy to use threads for everything at all, like everyone did 5 years ago. (And yeah, there's even usages for that, for example, MicroPython is proudly GIL-free, translating to not supporting those thread thingies and relying on native Python concurrency primitives). > > On Sun, Apr 26, 2015 at 1:21 PM, Mark Shannon wrote: > > > Hi, > > > > I was looking at PEP 492 and it seems to me that no new syntax is > > required. > > > > Looking at the code, it does four things; all of which, or a > > functional equivalent, could be done with no new syntax. > > 1. Make a normal function into a generator or coroutine. This can > > be done with a decorator. > > 2. Support a parallel set of special methods starting with 'a' or > > 'async'. Why not just use the current set of special methods? > > 3. "await". "await" is an operator that takes one argument and > > produces a single result, without altering flow control and can > > thus be replaced by an function. > > 4. Asynchronous with statement. The PEP lists the equivalent as > > "with (yield from xxx)" which doesn't seem so bad. > > > > Please don't add unnecessary new syntax. > > > > Cheers, > > Mark. > > > > P.S. I'm not objecting to any of the other new features proposed, > > just the new syntax. 
> > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > > > -- > --Guido van Rossum (python.org/~guido) -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Mon Apr 27 02:17:22 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Mon, 27 Apr 2015 03:17:22 +0300 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D789A.1000303@gmail.com> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> <20150427023206.13999dc8@x230> <553D789A.1000303@gmail.com> Message-ID: <20150427031722.3dfa5026@x230> Hello, On Sun, 26 Apr 2015 19:45:30 -0400 Yury Selivanov wrote: [] > > Then, is the only logic for proposing __aenter__ is to reinsure > > against a situation that someone starts to write async context > > manager, forgets that they write async context manager, and make an > > __enter__ method there. > > It's to make sure that it's impossible to accidentally use > existing regular context manager that returns a future object > from its __enter__ / __exit__ (nobody prohibits you to return a > future object from __exit__, although it's pointless) in an > 'async with' block. I see, so it's just to close the final loophole, unlikely to be hit in real life (unless you can say that there're cases of doing just that in existing asyncio libs). Well, given that Greg Ewing wanted even stricter error-proofness, and you rejected it as such strict as to disallow useful behavior, I've just got to trust you that in this case, you're as strict as needed. > I really don't understand the desire to reuse existing magic > methods. Even if it was decided to reuse them, it wouldn't > even simplify the implementation in CPython; the code there > is already DRY (I don't re-implement opcodes for 'with' > statement; I reuse them). Well, there're 3 levels of this stuff: 1. How "mere" people write their code - everyone would use async def and await, this should be bullet- (and fool-) proof. 2. How "library" code is written - async iterators won't be written by everyone, and only few will write async context managers; it's fair to expect that people doing these know what they do and don't do stupid mistakes. 3. How it all is coded in particular Python implementation. It's clear that __enter__ vs __aenter__ distinction is 1st kind of issue in your list. As for 3rd point, I'd like to remind that CPython is only one Python implementation. And with my MicroPython hat on, I'd like to know if (some of) these new features are "bloat" or "worthy" for the space constraints we have. I appreciate the answers you gave on all accounts! > > Thanks! 
> Yury -- Best regards, Paul mailto:pmiscml at gmail.com From pmiscml at gmail.com Tue Apr 28 00:48:49 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Tue, 28 Apr 2015 01:48:49 +0300 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <553D855B.4050104@gmail.com> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> <20150427023206.13999dc8@x230> <553D789A.1000303@gmail.com> <20150427031722.3dfa5026@x230> <553D855B.4050104@gmail.com> Message-ID: <20150428014849.79fce431@x230> Hello, On Sun, 26 Apr 2015 20:39:55 -0400 Yury Selivanov wrote: [] > > As for 3rd point, I'd like to remind that CPython is only one Python > > implementation. And with my MicroPython hat on, I'd like to know if > > (some of) these new features are "bloat" or "worthy" for the space > > constraints we have. > > OT: MicroPython is an amazing project. Kudos for doing it. > > I really hope that addition of few new magic methods won't > make it too hard for you guys to implement PEP 492 in > MicroPython one day. Thanks! Damien George, MicroPython's author, actually already made a basic implementation of async def/await: https://github.com/micropython/micropython/commit/81afa7e098634605c04597d34a51ca2e59a87d7c So we surely do hope this PEP will be accepted, and soonish! ;-) -- Best regards, Paul mailto:pmiscml at gmail.com From victor.stinner at gmail.com Tue Apr 28 11:13:56 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 28 Apr 2015 11:13:56 +0200 Subject: [Python-Dev] A macro for easier rich comparisons In-Reply-To: References: Message-ID: Hi, 2015-04-27 16:02 GMT+02:00 Petr Viktorin : > A macro like this would reduce boilerplate in stdlib and third-party C > extensions. It would ease porting C extensions to Python 3, where rich > comparison is mandatory. It would be nice to have a six module for C extensions. I'm quite sure that many projects are already full of #ifdef PYTHON3 ... #else ... #endif macros. > #define Py_RETURN_RICHCOMPARE(val1, val2, op) \ > do { \ > switch (op) { \ > case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ > } \ > Py_RETURN_NOTIMPLEMENTED; \ > } while (0) I would prefer a function for that: PyObject *Py_RichCompare(long val1, long2, int op); You should also handle invalid operator. PyUnicode_RichCompare() calls PyErr_BadArgument() in this case. Anyway, please open an issue for this idea. Victor From andrew.svetlov at gmail.com Tue Apr 28 14:22:56 2015 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Tue, 28 Apr 2015 15:22:56 +0300 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <553E3D20.2020008@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> Message-ID: I prefer option #3. On Mon, Apr 27, 2015 at 4:44 PM, Yury Selivanov wrote: > Hi Greg, > > I don't want this: "await a() * b()" to be parsed, it's not meaningful. 
> > Likely you'll see "await await a()" only once in your life, so I'm fine to > use parens for it (moreover, I think it reads better with parens) > > Yury > > > On 2015-04-27 8:52 AM, Greg Ewing wrote: >> >> Yury Selivanov wrote: >>> >>> I've done some experiments with grammar, and it looks like >>> we indeed can parse await quite differently from yield. Three >>> different options: >> >> >> You don't seem to have tried what I suggested, which is >> to make 'await' a unary operator with the same precedence >> as '-', i.e. replace >> >> factor: ('+'|'-'|'~') factor | power >> >> with >> >> factor: ('+'|'-'|'~'|'await') factor | power >> >> That would allow >> >> await a() >> res = await a() + await b() >> res = await await a() >> if await a(): pass >> return await a() >> print(await a()) >> func(arg=await a()) >> await a() * b() >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/andrew.svetlov%40gmail.com -- Thanks, Andrew Svetlov From walter at livinglogic.de Tue Apr 28 16:23:31 2015 From: walter at livinglogic.de (Walter =?utf-8?q?D=C3=B6rwald?=) Date: Tue, 28 Apr 2015 16:23:31 +0200 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <553EF985.2070808@gmail.com> References: <553EF985.2070808@gmail.com> Message-ID: <37F01B37-63F0-421D-81D5-F08427C74C59@livinglogic.de> On 28 Apr 2015, at 5:07, Yury Selivanov wrote: > Hi python-dev, > > Another round of updates. Reference implementation > has been updated: https://github.com/1st1/cpython/tree/await > (includes all things from the below summary of updates + > tests). > > [...] > New Coroutine Declaration Syntax > -------------------------------- > > The following new syntax is used to declare a coroutine:: > > async def read_data(db): > pass > > Key properties of coroutines: > > * ``async def`` functions are always coroutines, even if they do not > contain ``await`` expressions. > > * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` > expressions in an ``async`` function. Does this mean it's not possible to implement an async version of os.walk() if we had an async version of os.listdir()? I.e. for async code we're back to implementing iterators "by hand" instead of using generators for it. > [...] Servus, Walter From barry at python.org Tue Apr 28 16:59:17 2015 From: barry at python.org (Barry Warsaw) Date: Tue, 28 Apr 2015 10:59:17 -0400 Subject: [Python-Dev] A macro for easier rich comparisons In-Reply-To: References: Message-ID: <20150428105917.77c6f3df@anarchist.wooz.org> On Apr 28, 2015, at 11:13 AM, Victor Stinner wrote: >It would be nice to have a six module for C extensions. I'm quite sure >that many projects are already full of #ifdef PYTHON3 ... #else ... >#endif macros. Maybe encapsulating some of the recommendations here: https://wiki.python.org/moin/PortingToPy3k/BilingualQuickRef#Python_extension_modules (We really need to collect all this information in on place.) >> #define Py_RETURN_RICHCOMPARE(val1, val2, op) I think this macro would make a nice addition to the C API. It might read better as `Py_RETURN_RICHCOMPARE(val1, op, val2)`. 
Cheers, -Barry From yselivanov.ml at gmail.com Tue Apr 28 17:21:13 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 28 Apr 2015 11:21:13 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <37F01B37-63F0-421D-81D5-F08427C74C59@livinglogic.de> References: <553EF985.2070808@gmail.com> <37F01B37-63F0-421D-81D5-F08427C74C59@livinglogic.de> Message-ID: <553FA569.70003@gmail.com> Hi Walter, On 2015-04-28 10:23 AM, Walter D?rwald wrote: >> Key properties of coroutines: >> >> * ``async def`` functions are always coroutines, even if they do not >> contain ``await`` expressions. >> >> * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` >> expressions in an ``async`` function. > > Does this mean it's not possible to implement an async version of > os.walk() if we had an async version of os.listdir()? > > I.e. for async code we're back to implementing iterators "by hand" > instead of using generators for it. For now yes. Unfortunately, we don't have time to implement coroutine-generators properly in 3.5. Thanks, Yury From yselivanov.ml at gmail.com Tue Apr 28 17:22:39 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 28 Apr 2015 11:22:39 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: References: <553EF985.2070808@gmail.com> Message-ID: <553FA5BF.9010007@gmail.com> Hi Stefan, On 2015-04-28 1:43 AM, Stefan Behnel wrote: > Should a Generator then inherit from both Iterator and Coroutine, or would > that counter your intention to separate coroutines from generators as a > concept? I mean, they do share the same interface ... Them sharing the same interface depends on how the discussion goes :) But all in all, I think that it should be totally separate classes, even if they share some methods. Thanks, Yury From ethan at stoneleaf.us Tue Apr 28 17:29:17 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 28 Apr 2015 08:29:17 -0700 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <553FA5BF.9010007@gmail.com> References: <553EF985.2070808@gmail.com> <553FA5BF.9010007@gmail.com> Message-ID: <20150428152917.GM32422@stoneleaf.us> On 04/28, Yury Selivanov wrote: > On 2015-04-28 1:43 AM, Stefan Behnel wrote: >> Should a Generator then inherit from both Iterator and Coroutine, or would >> that counter your intention to separate coroutines from generators as a >> concept? I mean, they do share the same interface ... > > Them sharing the same interface depends on how the discussion goes > :) But all in all, I think that it should be totally separate > classes, even if they share some methods. For those of us who like to meddle, would it be possible to create an object that supports __iter__, __next__, __aiter__, and __anext__? -- ~Ethan~ From yselivanov.ml at gmail.com Tue Apr 28 17:36:35 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 28 Apr 2015 11:36:35 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <20150428152917.GM32422@stoneleaf.us> References: <553EF985.2070808@gmail.com> <553FA5BF.9010007@gmail.com> <20150428152917.GM32422@stoneleaf.us> Message-ID: <553FA903.70806@gmail.com> Ethan, On 2015-04-28 11:29 AM, Ethan Furman wrote: > On 04/28, Yury Selivanov wrote: >> On 2015-04-28 1:43 AM, Stefan Behnel wrote: >>> Should a Generator then inherit from both Iterator and Coroutine, or would >>> that counter your intention to separate coroutines from generators as a >>> concept? I mean, they do share the same interface ... 
>> Them sharing the same interface depends on how the discussion goes >> :) But all in all, I think that it should be totally separate >> classes, even if they share some methods. > For those of us who like to meddle, would it be possible to create an object > that supports __iter__, __next__, __aiter__, and __anext__? > Sure, there is nothing that could prevent you from doing that. Thanks, Yury From v+python at g.nevcal.com Tue Apr 28 18:46:53 2015 From: v+python at g.nevcal.com (Glenn Linderman) Date: Tue, 28 Apr 2015 09:46:53 -0700 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <20150427023206.13999dc8@x230> References: <553D48BF.1070600@hotpy.org> <553D4D23.1060500@gmail.com> <20150427012512.6919d1eb@x230> <553D6B87.3040500@gmail.com> <20150427023206.13999dc8@x230> Message-ID: <553FB97D.3040109@g.nevcal.com> On 4/26/2015 4:32 PM, Paul Sokolovsky wrote: > Then, is the only logic for proposing __aenter__ is to reinsure against > a situation that someone starts to write async context manager, forgets > that they write async context manager, and make an __enter__ method > there. Then your implementation will announce that "async context > manager lacks __aenter__", whereas "my" approach would announce > "Async's manager __enter__ did not return awaitable value". > > Again, is that the distinction you're shooting for, or do I miss > something? Seems like the missing __aenter__ can easily be detected by the interpreter at compile time, but the wrong type returned would be at run time, or after a complex type-analysis done at compile time (unlikely to be practical). So I think you've nailed the distinction... but I'm not the expert. -------------- next part -------------- An HTML attachment was scrubbed... URL: From v+python at g.nevcal.com Tue Apr 28 19:50:22 2015 From: v+python at g.nevcal.com (Glenn Linderman) Date: Tue, 28 Apr 2015 10:50:22 -0700 Subject: [Python-Dev] A macro for easier rich comparisons In-Reply-To: References: Message-ID: <553FC85E.5010609@g.nevcal.com> On 4/28/2015 2:13 AM, Victor Stinner wrote: >> #define Py_RETURN_RICHCOMPARE(val1, val2, op) \ >> > do { \ >> > switch (op) { \ >> > case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \ >> > } \ >> > Py_RETURN_NOTIMPLEMENTED; \ >> > } while (0) > I would prefer a function for that: > > PyObject *Py_RichCompare(long val1, long2, int op); Why would you prefer a function? As a macro, when the op is a constant, most of the code would be optimized away by a decent compiler. I suppose when the op is not a constant, then a function would save code space. So I suppose it depends on the predominant use cases. > You should also handle invalid operator. PyUnicode_RichCompare() calls > PyErr_BadArgument() in this case. One can quibble over the correct error return, but the above code does handle invalid operators after the switch. Glenn -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From dmalcolm at redhat.com Tue Apr 28 20:12:10 2015
From: dmalcolm at redhat.com (David Malcolm)
Date: Tue, 28 Apr 2015 14:12:10 -0400
Subject: [Python-Dev] A macro for easier rich comparisons
In-Reply-To: <553FC85E.5010609@g.nevcal.com>
References: <553FC85E.5010609@g.nevcal.com>
Message-ID: <1430244730.32584.215.camel@surprise>

On Tue, 2015-04-28 at 10:50 -0700, Glenn Linderman wrote:
> On 4/28/2015 2:13 AM, Victor Stinner wrote:
> > > > #define Py_RETURN_RICHCOMPARE(val1, val2, op) \
> > > > do { \
> > > > switch (op) { \
> > > > case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
> > > > } \
> > > > Py_RETURN_NOTIMPLEMENTED; \
> > > > } while (0)
> > I would prefer a function for that:
> >
> > PyObject *Py_RichCompare(long val1, long2, int op);
>
> Why would you prefer a function? As a macro, when the op is a constant, most of the code would be optimized away by a decent compiler.
>
> I suppose when the op is not a constant, then a function would save code space.
>
> So I suppose it depends on the predominant use cases.

There's also the possibility of wrapping C++ code that uses overloaded operators: having it as a macro could allow those C++ operators to be mapped into Python.

Hope this is constructive,
Dave

From mark at hotpy.org Tue Apr 28 20:44:53 2015
From: mark at hotpy.org (Mark Shannon)
Date: Tue, 28 Apr 2015 19:44:53 +0100
Subject: [Python-Dev] Issues with PEP 482 (1)
Message-ID: <553FD525.1030202@hotpy.org>

Hi,

I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :)

"async"

The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings.
The second of which is:
"""It is not possible to natively define a coroutine which has no yield or yield from statements."""
This is incorrect, although what is meant by 'natively' is unclear.

A coroutine without a yield statement can be defined simply and concisely, thus:

@coroutine
def f():
    return 1

This is only a few characters longer than the proposed new syntax, perfectly explicit and requires no modification to the language whatsoever.
A pure-python definition of the "coroutine" decorator is given below.

So could the "Rationale and Goals" be corrected accordingly, please.
Also, either the "async def" syntax should be dropped, or a new justification is required.

Cheers,
Mark.

#coroutine.py

from types import FunctionType, CodeType

CO_COROUTINE = 0x0080
CO_GENERATOR = 0x0020

def coroutine(f):
    'Converts a function to a generator function'
    old_code = f.__code__
    new_code = CodeType(
        old_code.co_argcount,
        old_code.co_kwonlyargcount,
        old_code.co_nlocals,
        old_code.co_stacksize,
        old_code.co_flags | CO_GENERATOR | CO_COROUTINE,
        old_code.co_code,
        old_code.co_consts,
        old_code.co_names,
        old_code.co_varnames,
        old_code.co_filename,
        old_code.co_name,
        old_code.co_firstlineno,
        old_code.co_lnotab,
        old_code.co_freevars,
        old_code.co_cellvars)
    return FunctionType(new_code, f.__globals__)

P.S. The reverse of this decorator, which unsets the flags, converts a generator function into a normal function. :?
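For concreteness, that reverse decorator would be the same construction with the flags cleared -- an untested sketch, and whether running generator bytecode as a plain function behaves sensibly is another matter, hence the ":?" above:

def plain_function(f):
    'Converts a generator function back to a normal function'
    old_code = f.__code__
    new_code = CodeType(
        old_code.co_argcount,
        old_code.co_kwonlyargcount,
        old_code.co_nlocals,
        old_code.co_stacksize,
        old_code.co_flags & ~(CO_GENERATOR | CO_COROUTINE),
        old_code.co_code,
        old_code.co_consts,
        old_code.co_names,
        old_code.co_varnames,
        old_code.co_filename,
        old_code.co_name,
        old_code.co_firstlineno,
        old_code.co_lnotab,
        old_code.co_freevars,
        old_code.co_cellvars)
    return FunctionType(new_code, f.__globals__)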
From stefan_ml at behnel.de Tue Apr 28 20:51:33 2015
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Tue, 28 Apr 2015 20:51:33 +0200
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To: <553DE9E1.4040403@hotpy.org>
References: <553D48BF.1070600@hotpy.org> <553DE9E1.4040403@hotpy.org>
Message-ID:

Mark Shannon schrieb am 27.04.2015 um 09:48:
> On 27/04/15 00:13, Guido van Rossum wrote:
>> Currently this means looking for yield [from]; PEP 492 just adds looking for await and async [for|with]. Making await() a function defeats the purpose because now aliasing can hide its presence, and we're back in the land of gevent or stackless (where *anything* can potentially suspend the current task). I don't want to live in that land.
>
> I don't think I was clear enough. I said that "await" *is* a function, not that it should be disguised as one. Reading the code, "GetAwaitableIter" would be a better name for that element of the implementation. It is a straightforward non-blocking function.

1) it's not like people commonly alias "repr()" or "len()", so why would they alias an "await()" builtin? Unless, obviously, there's an actual reason to do so, in which case having it as a function comes in handy. :) We had the same line of reasoning with "print()" back in the days of Py3k.

2) an "await()" builtin function that calls an "__await__()" special method on its input object sounds very pythonic.

Stefan

From guido at python.org Tue Apr 28 20:57:04 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 28 Apr 2015 11:57:04 -0700
Subject: [Python-Dev] Looking for someone to audit PEP 3156 (asyncio) against current version
Message-ID:

PEP 3156 (Asynchronous IO Support Rebooted: the "asyncio" Module) was accepted 18 months ago, and since then we've aggressively test-driven it in the 3.4 stdlib. I think that in some cases we've changed or added APIs or changed semantics, and I would like to update the PEP to match reality (so that people reading the PEP as a specification aren't going to be confused). If you're good at reading and editing technical documentation I'd love for you to help me with this! (Another form of help I could use might be to check the docs on docs.python.org against the PEP and the implementation.)

--
--Guido van Rossum (python.org/~guido)

From yselivanov.ml at gmail.com Tue Apr 28 21:19:52 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 28 Apr 2015 15:19:52 -0400
Subject: [Python-Dev] Issues with PEP 482 (1)
In-Reply-To: <553FD525.1030202@hotpy.org>
References: <553FD525.1030202@hotpy.org>
Message-ID: <553FDD58.9040802@gmail.com>

Mark,

I'm sorry but you have to view the proposal as a whole. Discussing it point by point in isolation doesn't make any sense, as with any complex subject.

Thanks,
Yury

On 2015-04-28 2:44 PM, Mark Shannon wrote:
> Hi,
>
> I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :)
>
> "async"
>
> The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings.
> The second of which is:
> """It is not possible to natively define a coroutine which has no yield or yield from statements."""
> This is incorrect, although what is meant by 'natively' is unclear.
> > A coroutine without a yield statement can be defined simply and > concisely, thus: > > @coroutine > def f(): > return 1 > > This is only a few character longer than the proposed new syntax, > perfectly explicit and requires no modification the language whatsoever. > A pure-python definition of the "coroutine" decorator is given below. > > So could the "Rationale and Goals" be correctly accordingly, please. > Also, either the "async def" syntax should be dropped, or a new > justification is required. > > Cheers, > Mark. > > > #coroutine.py > > from types import FunctionType, CodeType > > CO_COROUTINE = 0x0080 > CO_GENERATOR = 0x0020 > > def coroutine(f): > 'Converts a function to a generator function' > old_code = f.__code__ > new_code = CodeType( > old_code.co_argcount, > old_code.co_kwonlyargcount, > old_code.co_nlocals, > old_code.co_stacksize, > old_code.co_flags | CO_GENERATOR | CO_COROUTINE, > old_code.co_code, > old_code.co_consts, > old_code.co_names, > old_code.co_varnames, > old_code.co_filename, > old_code.co_name, > old_code.co_firstlineno, > old_code.co_lnotab, > old_code.co_freevars, > old_code.co_cellvars) > return FunctionType(new_code, f.__globals__) > > > P.S. The reverse of this decorator, which unsets the flags, converts a > generator function into a normal function. :? > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com From mark at hotpy.org Tue Apr 28 21:59:18 2015 From: mark at hotpy.org (Mark Shannon) Date: Tue, 28 Apr 2015 20:59:18 +0100 Subject: [Python-Dev] PEP 492: No new syntax is required In-Reply-To: <20150428222422.54d1884d@x230> References: <553D48BF.1070600@hotpy.org> <553DE9E1.4040403@hotpy.org> <20150428222422.54d1884d@x230> Message-ID: <553FE696.8060806@hotpy.org> On 28/04/15 20:24, Paul Sokolovsky wrote: > Hello, > [snip] > Based on all this passage, my guess is that you miss difference > between C and Python functions. This is rather patronising, almost to the point of being insulting. Please keep the debate civil. [snip] Cheers, Mark. From mark at hotpy.org Tue Apr 28 22:00:17 2015 From: mark at hotpy.org (Mark Shannon) Date: Tue, 28 Apr 2015 21:00:17 +0100 Subject: [Python-Dev] Issues with PEP 482 (1) In-Reply-To: <20150428223941.60c9ba55@x230> References: <553FD525.1030202@hotpy.org> <20150428223941.60c9ba55@x230> Message-ID: <553FE6D1.4030801@hotpy.org> On 28/04/15 20:39, Paul Sokolovsky wrote: > Hello, > > On Tue, 28 Apr 2015 19:44:53 +0100 > Mark Shannon wrote: > > [] > >> A coroutine without a yield statement can be defined simply and >> concisely, thus: >> >> @coroutine >> def f(): >> return 1 > > [] > >> A pure-python definition of the "coroutine" decorator is >> given below. >> > > [] > >> from types import FunctionType, CodeType >> >> CO_COROUTINE = 0x0080 >> CO_GENERATOR = 0x0020 >> >> def coroutine(f): >> 'Converts a function to a generator function' >> old_code = f.__code__ >> new_code = CodeType( >> old_code.co_argcount, >> old_code.co_kwonlyargcount, > > > This is joke right? Well it was partly for entertainment value, although it works on PyPy. The point is that something that can be done with a decorator, whether in pure Python or as builtin, does not require new syntax. Cheers, Mark. 
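For anyone who wants to try it, a quick demonstration (assuming the coroutine.py module given earlier is importable; on Python 3.4 the extra flag bit should simply be carried along unused):

from coroutine import coroutine

@coroutine
def f():
    return 1

g = f()                  # a generator object, despite having no yield
try:
    g.send(None)         # run it to completion
except StopIteration as e:
    print(e.value)       # prints: 1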
From guido at python.org Tue Apr 28 22:00:26 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 28 Apr 2015 13:00:26 -0700
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To:
References: <553D48BF.1070600@hotpy.org> <553DE9E1.4040403@hotpy.org>
Message-ID:

On Tue, Apr 28, 2015 at 11:51 AM, Stefan Behnel wrote:

> Mark Shannon schrieb am 27.04.2015 um 09:48:
> > On 27/04/15 00:13, Guido van Rossum wrote:
> >> Currently this means looking for yield [from]; PEP 492 just adds looking for await and async [for|with]. Making await() a function defeats the purpose because now aliasing can hide its presence, and we're back in the land of gevent or stackless (where *anything* can potentially suspend the current task). I don't want to live in that land.
> >
> > I don't think I was clear enough. I said that "await" *is* a function, not that it should be disguised as one. Reading the code, "GetAwaitableIter" would be a better name for that element of the implementation. It is a straightforward non-blocking function.
>
> 1) it's not like people commonly alias "repr()" or "len()", so why would they alias an "await()" builtin? Unless, obviously, there's an actual reason to do so, in which case having it as a function comes in handy. :) We had the same line of reasoning with "print()" back in the days of Py3k.
>
> 2) an "await()" builtin function that calls an "__await__()" special method on its input object sounds very pythonic.

This sounds confused. The await expression must be recognized by the parser so it can generate different code for it (the code to suspend the stack). A builtin function cannot generate different code -- to the compiler all functions look the same. I know we could change that rule, but that would be a really big deviation from Python's philosophy: currently the code generator never needs to know the type of any variables -- and a builtin function 'await' would just be another variable to the code generator.

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From guido at python.org Tue Apr 28 22:06:47 2015
From: guido at python.org (Guido van Rossum)
Date: Tue, 28 Apr 2015 13:06:47 -0700
Subject: [Python-Dev] Issues with PEP 482 (1)
In-Reply-To: <553FD525.1030202@hotpy.org>
References: <553FD525.1030202@hotpy.org>
Message-ID:

On Tue, Apr 28, 2015 at 11:44 AM, Mark Shannon wrote:

> Hi,
>
> I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :)
>
> "async"
>
> The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings.
> The second of which is:
> """It is not possible to natively define a coroutine which has no yield or yield from statements."""
> This is incorrect, although what is meant by 'natively' is unclear.
>
> A coroutine without a yield statement can be defined simply and concisely, thus:
>
>     @coroutine
>     def f():
>         return 1
>
> This is only a few characters longer than the proposed new syntax, perfectly explicit and requires no modification to the language whatsoever.
> A pure-python definition of the "coroutine" decorator is given below.
>
> So could the "Rationale and Goals" be corrected accordingly, please.
> Also, either the "async def" syntax should be dropped, or a new justification is required.

So here's *my* motivation for this. I don't want the code generator to have to understand decorators.
To the code generator, a decorator is just an expression, and it shouldn't be required to understand decorators in sufficient detail to know that *this* particular decorator means to generate different code.

And it's not just generating different code -- it's also the desire to issue static errors (SyntaxError) when await (or async for/with) is used outside a coroutine, or when yield [from] is used inside one.

The motivation is clear enough to me (and AFAIR I'm the BDFL for this PEP :-).

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From drekin at gmail.com Tue Apr 28 22:20:00 2015
From: drekin at gmail.com (=?UTF-8?B?QWRhbSBCYXJ0b8Wh?=)
Date: Tue, 28 Apr 2015 22:20:00 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
Message-ID:

Hello, is it possible to somehow tell Python 2.7 to compile code entered in the interactive session with the flag PyCF_SOURCE_IS_UTF8 set? I'm considering adding support for Python 2 in my package (https://github.com/Drekin/win-unicode-console) and I have run into the fact that when u"α" is entered in the interactive session, it results in u"\xce\xb1" rather than u"\u03b1". As this seems to be a highly specialized question, I'm asking it here.

Regards, Drekin
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mark at hotpy.org Tue Apr 28 22:22:39 2015
From: mark at hotpy.org (Mark Shannon)
Date: Tue, 28 Apr 2015 21:22:39 +0100
Subject: [Python-Dev] Issues with PEP 482 (1)
In-Reply-To:
References: <553FD525.1030202@hotpy.org>
Message-ID: <553FEC0F.7070701@hotpy.org>

On 28/04/15 21:06, Guido van Rossum wrote:
> On Tue, Apr 28, 2015 at 11:44 AM, Mark Shannon wrote:
>
>     Hi,
>
>     I still think that there are several issues that need addressing with PEP 492. This time, one issue at a time :)
>
>     "async"
>
>     The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 shortcomings.
>     The second of which is:
>     """It is not possible to natively define a coroutine which has no yield or yield from statements."""
>     This is incorrect, although what is meant by 'natively' is unclear.
>
>     A coroutine without a yield statement can be defined simply and concisely, thus:
>
>         @coroutine
>         def f():
>             return 1
>
>     This is only a few characters longer than the proposed new syntax, perfectly explicit and requires no modification to the language whatsoever.
>     A pure-python definition of the "coroutine" decorator is given below.
>
>     So could the "Rationale and Goals" be corrected accordingly, please.
>     Also, either the "async def" syntax should be dropped, or a new justification is required.
>
> So here's *my* motivation for this. I don't want the code generator to have to understand decorators. To the code generator, a decorator is just an expression, and it shouldn't be required to understand decorators in sufficient detail to know that *this* particular decorator means to generate different code.

The code generator knows nothing about it. The generated bytecode is identical, only the flags are changed. The decorator can just return a copy of the function with modified co_flags.

> And it's not just generating different code -- it's also the desire to issue static errors (SyntaxError) when await (or async for/with) is used outside a coroutine, or when yield [from] is used inside one.

Would raising a TypeError at runtime be sufficient to catch the sort of errors that you are worried about?
> > The motivation is clear enough to me (and AFAIR I'm the BDFL for this > PEP :-). Can't argue with that. Cheers, Mark. From guido at python.org Tue Apr 28 22:25:57 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 28 Apr 2015 13:25:57 -0700 Subject: [Python-Dev] Issues with PEP 482 (1) In-Reply-To: <553FEC0F.7070701@hotpy.org> References: <553FD525.1030202@hotpy.org> <553FEC0F.7070701@hotpy.org> Message-ID: On Tue, Apr 28, 2015 at 1:22 PM, Mark Shannon wrote: > > > On 28/04/15 21:06, Guido van Rossum wrote: > >> On Tue, Apr 28, 2015 at 11:44 AM, Mark Shannon > > wrote: >> >> Hi, >> >> I still think that there are several issues that need addressing >> with PEP 492. This time, one issue at a time :) >> >> "async" >> >> The "Rationale and Goals" of PEP 492 states that PEP 380 has 3 >> shortcomings. >> The second of which is: >> """It is not possible to natively define a coroutine which has >> no yield or yield from statements.""" >> This is incorrect, although what is meant by 'natively' is >> unclear. >> >> A coroutine without a yield statement can be defined simply and >> concisely, thus: >> >> @coroutine >> def f(): >> return 1 >> >> This is only a few character longer than the proposed new syntax, >> perfectly explicit and requires no modification the language >> whatsoever. >> A pure-python definition of the "coroutine" decorator is given below. >> >> So could the "Rationale and Goals" be correctly accordingly, please. >> Also, either the "async def" syntax should be dropped, or a new >> justification is required. >> >> >> So here's *my* motivation for this. I don't want the code generator to >> have to understand decorators. To the code generator, a decorator is >> just an expression, and it shouldn't be required to understand >> decorators in sufficient detail to know that *this* particular decorator >> means to generate different code. >> > The code generator knows nothing about it. The generated bytecode is > identical, only the flags are changed. The decorator can just return a copy > of the function with modified co_flags. > The situation may be different for other Python implementations though. The minimal changes to the code object are an implementation tactic -- the syntactic marking of coroutines is fundamental (like in the past the choice to recognize generators syntactically, albeit in that case by the presence of yield in their body). > > >> And it's not just generating different code -- it's also the desire to >> issue static errors (SyntaxError) when await (or async for/with) is used >> outside a coroutine, or when yield [from] is use inside one. >> > Would raising a TypeError at runtime be sufficient to catch the sort of > errors that you are worried about? > No. > > >> The motivation is clear enough to me (and AFAIR I'm the BDFL for this >> PEP :-). >> > Can't argue with that. > > Cheers, > Mark. > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Tue Apr 28 23:49:56 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 28 Apr 2015 14:49:56 -0700 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <553EF985.2070808@gmail.com> References: <553EF985.2070808@gmail.com> Message-ID: Inline comments below... On Mon, Apr 27, 2015 at 8:07 PM, Yury Selivanov wrote: > Hi python-dev, > > Another round of updates. 
Reference implementation > has been updated: https://github.com/1st1/cpython/tree/await > (includes all things from the below summary of updates + > tests). > > > Summary: > > > 1. "PyTypeObject.tp_await" slot. Replaces "tp_reserved". > This is to enable implementation of Futures with C API. > Must return an iterator if implemented. > > That's fine (though I didn't follow this closely). > > 2. New grammar for "await" expressions, see > 'Syntax of "await" expression' section > > I like it. > > 3. inspect.iscoroutine() and inspect.iscoroutineobjects() > functions. > > What's the use case for these? I wonder if it makes more sense to have a check for a generalized awaitable rather than specifically a coroutine. > > 4. Full separation of coroutines and generators. > This is a big one; let's discuss. > > a) Coroutine objects raise TypeError (is NotImplementedError > better?) in their __iter__ and __next__. Therefore it's > not not possible to pass them to iter(), tuple(), next() and > other similar functions that work with iterables. > > I think it should be TypeError -- what you *really* want is not to define these methods at all but given the implementation tactic for coroutines that may not be possible, so the nearest approximation is TypeError. (Also, NotImplementedError is typically to indicate that a subclass should implement it.) > b) Because of (a), for..in iteration also does not work > on coroutines anymore. > > Sounds good. > c) 'yield from' only accept coroutine objects from > generators decorated with 'types.coroutine'. That means > that existing asyncio generator-based coroutines will > happily yield from both coroutines and generators. > *But* every generator-based coroutine *must* be > decorated with `asyncio.coroutine()`. This is > potentially a backwards incompatible change. > > See below. I worry about backward compatibility. A lot. Are you saying that asycio-based code that doesn't use @coroutine will break in 3.5? > d) inspect.isgenerator() and inspect.isgeneratorfunction() > return `False` for coroutine objects & coroutine functions. > > Makes sense. > e) Should we add a coroutine ABC (for cython etc)? > > I, personally, think this is highly necessary. First, > separation of coroutines from generators is extremely > important. One day there won't be generator-based > coroutines, and we want to avoid any kind of confusion. > Second, we only can do this in 3.5. This kind of > semantics change won't be ever possible. > Sounds like Stefan agrees. Are you aware of http://bugs.python.org/issue24018 (Generator ABC)? > asyncio recommends using @coroutine decorator, and most > projects that I've seen do use it. Also there is no > reason for people to use iter() and next() functions > on coroutines when writing asyncio code. I doubt that > this will cause serious backwards compatibility problems > (asyncio also has provisional status). > > I wouldn't count too much on asyncio's provisional status. What are the consequences for code that is written to work with asyncio but doesn't use @coroutine? Such code will work with 3.4 and (despite the provisional status and the recommendation to use @coroutine) I don't want that code to break in 3.5 (though maybe a warning would be fine). I also hope that if someone has their own (renamed) copy of asyncio that works with 3.4, it will all still work with 3.5. Even if asyncio itself is provisional, none of the primitives (e.g. yield from) that it is built upon are provisional, so there should be no reason for it to break in 3.5. 
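(To be concrete, the kind of code I mean is the undecorated 3.4-era style -- a hypothetical sketch:

    import asyncio

    def fetch_data():              # note: no @asyncio.coroutine
        yield from asyncio.sleep(0.1)
        return 'data'

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(fetch_data()))

That runs fine on 3.4 today, and it shouldn't silently break in 3.5.)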
>
> Thank you,
> Yury
>

Some more inline comments directly on the PEP below.

> PEP: 492
> Title: Coroutines with async and await syntax
> Version: $Revision$
> Last-Modified: $Date$
> Author: Yury Selivanov 
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 09-Apr-2015
> Python-Version: 3.5
> Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015
>
>
> Abstract
> ========
>
> This PEP introduces new syntax for coroutines, asynchronous ``with``
> statements and ``for`` loops. The main motivation behind this proposal
> is to streamline writing and maintaining asynchronous code, as well as
> to simplify previously hard-to-implement code patterns.
>
>
> Rationale and Goals
> ===================
>
> Current Python supports implementing coroutines via generators (PEP
> 342), further enhanced by the ``yield from`` syntax introduced in PEP
> 380. This approach has a number of shortcomings:
>
> * it is easy to confuse coroutines with regular generators, since they
>   share the same syntax; async libraries often attempt to alleviate
>   this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
>
> * it is not possible to natively define a coroutine which has no
>   ``yield`` or ``yield from`` statements, again requiring the use of
>   decorators to fix potential refactoring issues;
>

(I have to agree with Mark that this point is pretty weak. :-)

> * support for asynchronous calls is limited to expressions where
>   ``yield`` is allowed syntactically, limiting the usefulness of
>   syntactic features, such as ``with`` and ``for`` statements.
>
> This proposal makes coroutines a native Python language feature, and
> clearly separates them from generators. This removes
> generator/coroutine ambiguity, and makes it possible to reliably define
> coroutines without reliance on a specific library. This also enables
> linters and IDEs to improve static code analysis and refactoring.
>
> Native coroutines and the associated new syntax features make it
> possible to define context manager and iteration protocols in
> asynchronous terms. As shown later in this proposal, the new ``async
> with`` statement lets Python programs perform asynchronous calls when
> entering and exiting a runtime context, and the new ``async for``
> statement makes it possible to perform asynchronous calls in iterators.
>

I wonder if you could add some adaptation of the explanation I have posted
(a few times now, I feel) for the reason why I prefer to suspend only at
syntactically recognizable points (yield [from] in the past, await and
async for/with in this PEP). Unless you already have it in the rationale
(though it seems Mark didn't think it was enough :-).

>
> Specification
> =============
>
> This proposal introduces new syntax and semantics to enhance coroutine
> support in Python; it does not change the internal implementation of
> coroutines, which are still based on generators.
>

It's actually a separate issue whether any implementation changes.
Implementation changes don't need to go through the PEP process, unless
they're really also interface changes.

> It is strongly suggested that the reader understands how coroutines are
> implemented in Python (PEP 342 and PEP 380). It is also recommended to
> read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions).
>
> From this point in this document we use the word *coroutine* to refer
> to functions declared using the new syntax. *generator-based
> coroutine* is used where necessary to refer to coroutines that are
> based on generator syntax.
>

Despite reading this I still get confused when reading the PEP (probably
because asyncio uses "coroutine" in the latter sense). Maybe it would make
sense to write "native coroutine" for the new concept, to distinguish the
two concepts more clearly? (You could even change "awaitable" to
"coroutine". Though I like "awaitable" too.)

>
> New Coroutine Declaration Syntax
> --------------------------------
>
> The following new syntax is used to declare a coroutine::
>
>     async def read_data(db):
>         pass
>
> Key properties of coroutines:
>
> * ``async def`` functions are always coroutines, even if they do not
>   contain ``await`` expressions.
>
> * It is a ``SyntaxError`` to have ``yield`` or ``yield from``
>   expressions in an ``async`` function.
>

For Mark's benefit, might add that this is similar to how ``return`` and
``yield`` are disallowed syntactically outside functions (as are the
syntactic constraints on ``await`` and ``async for|def``).

> * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced
>   to enable runtime detection of coroutines (and migrating existing
>   code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR``
>   flags set.
>
> * Regular generators, when called, return a *generator object*;
>   similarly, coroutines return a *coroutine object*.
>
> * ``StopIteration`` exceptions are not propagated out of coroutines,
>   and are replaced with a ``RuntimeError``. For regular generators
>   such behavior requires a future import (see PEP 479).
>
>
> types.coroutine()
> -----------------
>
> A new function ``coroutine(gen)`` is added to the ``types`` module. It
> applies the ``CO_COROUTINE`` flag to the passed generator-function's
> code object, making it return a *coroutine object* when called.
>

Clarify that this is a decorator that modifies the function object in
place.

> This feature enables an easy upgrade path for existing libraries.
>
>
> Await Expression
> ----------------
>
> The following new ``await`` expression is used to obtain a result of
> coroutine execution::
>
>     async def read_data(db):
>         data = await db.fetch('SELECT ...')
>         ...
>
> ``await``, similarly to ``yield from``, suspends execution of the
> ``read_data`` coroutine until the ``db.fetch`` *awaitable* completes
> and returns the result data.
>
> It uses the ``yield from`` implementation with an extra step of
> validating its argument. ``await`` only accepts an *awaitable*, which
> can be one of:
>
> * A *coroutine object* returned from a *coroutine* or a generator
>   decorated with ``types.coroutine()``.
>
> * An object with an ``__await__`` method returning an iterator.
>
>   Any ``yield from`` chain of calls ends with a ``yield``. This is a
>   fundamental mechanism of how *Futures* are implemented. Since,
>   internally, coroutines are a special kind of generators, every
>   ``await`` is suspended by a ``yield`` somewhere down the chain of
>   ``await`` calls (please refer to PEP 3156 for a detailed
>   explanation.)
>
>   To enable this behavior for coroutines, a new magic method called
>   ``__await__`` is added. In asyncio, for instance, to enable Future
>   objects in ``await`` statements, the only change is to add an
>   ``__await__ = __iter__`` line to the ``asyncio.Future`` class.
>
>   Objects with an ``__await__`` method are called *Future-like*
>   objects in the rest of this PEP.
>
>   Also, please note that the ``__aiter__`` method (see its definition
>   below) cannot be used for this purpose. It is a different protocol,
>   and would be like using ``__iter__`` instead of ``__call__`` for
>   regular callables.
>
> It is a ``TypeError`` if ``__await__`` returns anything but an
> iterator.
>
> * Objects defined with CPython C API with a ``tp_await`` function,
>   returning an iterator (similar to the ``__await__`` method).
>
> It is a ``SyntaxError`` to use ``await`` outside of a coroutine.
>
> It is a ``TypeError`` to pass anything other than an *awaitable* object
> to an ``await`` expression.
>
>
> Syntax of "await" expression
> ''''''''''''''''''''''''''''
>
> The ``await`` keyword is defined differently from ``yield`` and ``yield
> from``. The main difference is that *await expressions* do not require
> parentheses around them most of the time.
>
> Examples::
>
>   ================================== ==================================
>   Expression                         Will be parsed as
>   ================================== ==================================
>   ``if await fut: pass``             ``if (await fut): pass``
>   ``if await fut + 1: pass``         ``if (await fut) + 1: pass``
>   ``pair = await fut, 'spam'``       ``pair = (await fut), 'spam'``
>   ``with await fut, open(): pass``   ``with (await fut), open(): pass``
>   ``await foo()['spam'].baz()()``    ``await ( foo()['spam'].baz()() )``
>   ``return await coro()``            ``return ( await coro() )``
>   ``res = await coro() ** 2``        ``res = (await coro()) ** 2``
>   ``func(a1=await coro(), a2=0)``    ``func(a1=(await coro()), a2=0)``
>   ================================== ==================================
>
> See the `Grammar Updates`_ section for details.
>
>
> Asynchronous Context Managers and "async with"
> ----------------------------------------------
>
> An *asynchronous context manager* is a context manager that is able to
> suspend execution in its *enter* and *exit* methods.
>
> To make this possible, a new protocol for asynchronous context managers
> is proposed. Two new magic methods are added: ``__aenter__`` and
> ``__aexit__``. Both must return an *awaitable*.
>
> An example of an asynchronous context manager::
>
>     class AsyncContextManager:
>         async def __aenter__(self):
>             await log('entering context')
>
>         async def __aexit__(self, exc_type, exc, tb):
>             await log('exiting context')
>
>
> New Syntax
> ''''''''''
>
> A new statement for asynchronous context managers is proposed::
>
>     async with EXPR as VAR:
>         BLOCK
>
>
> which is semantically equivalent to::
>
>     mgr = (EXPR)
>     aexit = type(mgr).__aexit__
>     aenter = type(mgr).__aenter__(mgr)
>     exc = True
>
>     try:
>         try:
>             VAR = await aenter
>             BLOCK
>         except:
>             exc = False
>             exit_res = await aexit(mgr, *sys.exc_info())
>             if not exit_res:
>                 raise
>
>     finally:
>         if exc:
>             await aexit(mgr, None, None, None)
>

I realize you copied this from PEP 343, but I wonder, do we really need
two nested try statements and the 'exc' flag? Why can't the finally step
be placed in an 'else' clause on the inner try? (There may well be a
reason but I can't figure what it is, and PEP 343 doesn't seem to explain
it.)

Also, it's a shame we're perpetuating the sys.exc_info() triple in the API
here, but I agree making __exit__ and __aexit__ different also isn't a
great idea. :-(

PS. With the new tighter syntax for ``await`` you don't need the
``exit_res`` variable any more.

>
> As with regular ``with`` statements, it is possible to specify multiple
> context managers in a single ``async with`` statement.
>
> It is an error to pass a regular context manager without ``__aenter__``
> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
> to use ``async with`` outside of a coroutine.
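(Returning to my question above about the nested try statements: with a
single try and an ``else`` clause the expansion could look roughly like
the sketch below. Note that this is only a sketch -- it ignores
non-local exits such as ``return`` or ``break`` from BLOCK, which skip
an ``else`` clause but not a ``finally``, so a real expansion would
still have to route those through ``__aexit__``:)

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)

    VAR = await aenter
    try:
        BLOCK
    except:
        if not await aexit(mgr, *sys.exc_info()):
            raise
    else:
        await aexit(mgr, None, None, None)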
>
> Example
> '''''''
>
> With asynchronous context managers it is easy to implement proper
> database transaction managers for coroutines::
>
>     async def commit(session, data):
>         ...
>
>         async with session.transaction():
>             ...
>             await session.update(data)
>             ...
>
> Code that needs locking also looks lighter::
>
>     async with lock:
>         ...
>
> instead of::
>
>     with (yield from lock):
>         ...
>

(Also, the implementation of the latter is problematic -- check
asyncio/locks.py and notice that __enter__ is empty...)

>
> Asynchronous Iterators and "async for"
> --------------------------------------
>
> An *asynchronous iterable* is able to call asynchronous code in its
> *iter* implementation, and an *asynchronous iterator* can call
> asynchronous code in its *next* method. To support asynchronous
> iteration:
>
> 1. An object must implement an ``__aiter__`` method returning an
>    *awaitable* resulting in an *asynchronous iterator object*.
>

Have you considered making __aiter__ not an awaitable? It's not strictly
necessary I think, one could do all the awaiting in __anext__. Though
perhaps there are use cases that are more naturally expressed by awaiting
in __aiter__? (Your examples all use ``async def __aiter__(self): return
self`` suggesting this would be no great loss.)

> 2. An *asynchronous iterator object* must implement an ``__anext__``
>    method returning an *awaitable*.
>
> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
>    exception.
>
> An example of an asynchronous iterable::
>
>     class AsyncIterable:
>         async def __aiter__(self):
>             return self
>
>         async def __anext__(self):
>             data = await self.fetch_data()
>             if data:
>                 return data
>             else:
>                 raise StopAsyncIteration
>
>         async def fetch_data(self):
>             ...
>
>
> New Syntax
> ''''''''''
>
> A new statement for iterating through asynchronous iterators is
> proposed::
>
>     async for TARGET in ITER:
>         BLOCK
>     else:
>         BLOCK2
>
> which is semantically equivalent to::
>
>     iter = (ITER)
>     iter = await type(iter).__aiter__(iter)
>     running = True
>     while running:
>         try:
>             TARGET = await type(iter).__anext__(iter)
>         except StopAsyncIteration:
>             running = False
>         else:
>             BLOCK
>     else:
>         BLOCK2
>
>
> It is a ``TypeError`` to pass a regular iterable without an
> ``__aiter__`` method to ``async for``. It is a ``SyntaxError`` to use
> ``async for`` outside of a coroutine.
>
> As with the regular ``for`` statement, ``async for`` has an optional
> ``else`` clause.
>

(Not because we're particularly fond of it, but because its absence would
just introduce more special cases. :-)

> Example 1
> '''''''''
>
> With the asynchronous iteration protocol it is possible to
> asynchronously buffer data during iteration::
>
>     async for data in cursor:
>         ...
>
> Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
> of data from a database after every ``N`` iterations.
>
> The following code illustrates the new asynchronous iteration
> protocol::
>
>     class Cursor:
>         def __init__(self):
>             self.buffer = collections.deque()
>
>         def _prefetch(self):
>             ...
>
>         async def __aiter__(self):
>             return self
>
>         async def __anext__(self):
>             if not self.buffer:
>                 self.buffer = await self._prefetch()
>                 if not self.buffer:
>                     raise StopAsyncIteration
>             return self.buffer.popleft()
>
> then the ``Cursor`` class can be used as follows::
>
>     async for row in Cursor():
>         print(row)
>
> which would be equivalent to the following code::
>
>     i = await Cursor().__aiter__()
>     while True:
>         try:
>             row = await i.__anext__()
>         except StopAsyncIteration:
>             break
>         else:
>             print(row)
>
>
> Example 2
> '''''''''
>
> The following is a utility class that transforms a regular iterable to
> an asynchronous one. While this is not a very useful thing to do, the
> code illustrates the relationship between regular and asynchronous
> iterators.
>
> ::
>
>     class AsyncIteratorWrapper:
>         def __init__(self, obj):
>             self._it = iter(obj)
>
>         async def __aiter__(self):
>             return self
>
>         async def __anext__(self):
>             try:
>                 value = next(self._it)
>             except StopIteration:
>                 raise StopAsyncIteration
>             return value
>
>     async for letter in AsyncIteratorWrapper("abc"):
>         print(letter)
>
>
> Why StopAsyncIteration?
> '''''''''''''''''''''''
>

I keep wanting to propose to rename this to AsyncStopIteration. I know
it's about stopping an async iteration, but in my head I keep referring to
it as AsyncStopIteration, probably because in other places we use async
(or 'a') as a prefix.

> Coroutines are still based on generators internally. So, before PEP
> 479, there was no fundamental difference between
>
> ::
>
>     def g1():
>         yield from fut
>         return 'spam'
>
> and
>
> ::
>
>     def g2():
>         yield from fut
>         raise StopIteration('spam')
>
> And since PEP 479 is accepted and enabled by default for coroutines,
> the following example will have its ``StopIteration`` wrapped into a
> ``RuntimeError``
>
> ::
>
>     async def a1():
>         await fut
>         raise StopIteration('spam')
>
> The only way to tell the outside code that the iteration has ended is
> to raise something other than ``StopIteration``. Therefore, a new
> built-in exception class ``StopAsyncIteration`` was added.
>
> Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
> raised in coroutines are wrapped in ``RuntimeError``.
>
>
> Debugging Features
> ------------------
>
> One of the most frequent mistakes that people make when using
> generators as coroutines is forgetting to use ``yield from``::
>
>     @asyncio.coroutine
>     def useful():
>         asyncio.sleep(1) # this will do nothing without 'yield from'
>

Might be useful to point out that this was the one major advantage of PEP
3152 -- although it wasn't enough to save that PEP, and in your response
you pointed out that this mistake is not all that common. Although you
seem to disagree with that here ("One of the most frequent mistakes ...").

> For debugging these kinds of mistakes there is a special debug mode in
> asyncio, in which ``@coroutine`` decorator wraps all functions with a
> special object with a destructor logging a warning. Whenever a wrapped
> generator gets garbage collected, a detailed logging message is
> generated with information about where exactly the decorator function
> was defined, stack trace of where it was collected, etc. The wrapper
> object also provides a convenient ``__repr__`` function with detailed
> information about the generator.
>
> The only problem is how to enable these debug capabilities. Since
> debug facilities should be a no-op in production mode, the
> ``@coroutine`` decorator makes the decision of whether to wrap or not
> to wrap based on an OS environment variable ``PYTHONASYNCIODEBUG``.
> This way it is possible to run asyncio programs with asyncio's own
> functions instrumented. ``EventLoop.set_debug``, a different debug
> facility, has no impact on ``@coroutine`` decorator's behavior.
>
> With this proposal, coroutines are a native concept, distinct from
> generators. New methods ``set_coroutine_wrapper`` and
> ``get_coroutine_wrapper`` are added to the ``sys`` module, with which
> frameworks can provide advanced debugging facilities.
>

These two appear to be unspecified except by example.

> It is also important to make coroutines as fast and efficient as
> possible; therefore there are no debug features enabled by default.
>
> Example::
>
>     async def debug_me():
>         await asyncio.sleep(1)
>
>     def async_debug_wrap(generator):
>         return asyncio.CoroWrapper(generator)
>
>     sys.set_coroutine_wrapper(async_debug_wrap)
>
>     debug_me()  # <- this line will likely GC the coroutine object and
>                 # trigger asyncio.CoroWrapper's code.
>
>     assert isinstance(debug_me(), asyncio.CoroWrapper)
>
>     sys.set_coroutine_wrapper(None)  # <- this unsets any
>                                      #    previously set wrapper
>     assert not isinstance(debug_me(), asyncio.CoroWrapper)
>
> If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper
> replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)``
> unsets the wrapper.
>
>
> inspect.iscoroutine() and inspect.iscoroutinefunction()
> -------------------------------------------------------
>
> Two new functions are added to the ``inspect`` module:
>
> * ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
>   coroutine object.
>
> * ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
>   coroutine function.
>

Maybe isawaitable() and isawaitablefunction() are also useful? (Or only
isawaitable()?)

> Differences between coroutines and generators
> ---------------------------------------------
>
> A great effort has been made to make sure that coroutines and
> generators are separate concepts:
>
> 1. Coroutine objects do not implement ``__iter__`` and ``__next__``
>    methods. Therefore they cannot be iterated over or passed to
>    ``iter()``, ``list()``, ``tuple()`` and other built-ins. They
>    also cannot be used in a ``for..in`` loop.
>
> 2. ``yield from`` does not accept coroutine objects (unless it is used
>    in a generator-based coroutine decorated with ``types.coroutine``.)
>

How does ``yield from`` know that it is occurring in a generator-based
coroutine?

> 3. ``yield from`` does not accept coroutine objects from plain Python
>    generators (*not* generator-based coroutines.)
>

I am worried about this. PEP 380 gives clear semantics to "yield from
" and I don't think you can revert that here. Or maybe I am
misunderstanding what you meant here? (What exactly are "coroutine objects
from plain Python generators"?)

> 4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
>    return ``False`` for coroutine objects and coroutine functions.
>
>
> Coroutine objects
> -----------------
>
> Coroutines are based on generators internally, thus they share the
> implementation. Similarly to generator objects, coroutine objects have
> ``throw``, ``send`` and ``close`` methods. ``StopIteration`` and
> ``GeneratorExit`` play the same role for coroutine objects (although
> PEP 479 is enabled by default for coroutines).
>

Does send() make sense for a native coroutine? Check PEP 380. I think
the only way to access the send() argument is by using ``yield`` but
that's disallowed. Or is this about send() being passed to the ``yield``
that ultimately suspends the chain of coroutines? (You may just have to
rewrite the section about that -- it seems a bit hidden now.)

>
> Glossary
> ========
>
> :Coroutine:
>     A coroutine function, or just "coroutine", is declared with ``async
>     def``. It uses ``await`` and ``return value``; see `New Coroutine
>     Declaration Syntax`_ for details.
>
> :Coroutine object:
>     Returned from a coroutine function. See `Await Expression`_ for
>     details.
>
> :Future-like object:
>     An object with an ``__await__`` method, or a C object with
>     ``tp_await`` function, returning an iterator. Can be consumed by
>     an ``await`` expression in a coroutine. A coroutine waiting for a
>     Future-like object is suspended until the Future-like object's
>     ``__await__`` completes, and returns the result. See `Await
>     Expression`_ for details.
>
> :Awaitable:
>     A *Future-like* object or a *coroutine object*. See `Await
>     Expression`_ for details.
>
> :Generator-based coroutine:
>     Coroutines based on generator syntax. The most common example is
>     ``@asyncio.coroutine``.
>
> :Asynchronous context manager:
>     An asynchronous context manager has ``__aenter__`` and ``__aexit__``
>     methods and can be used with ``async with``. See `Asynchronous
>     Context Managers and "async with"`_ for details.
>
> :Asynchronous iterable:
>     An object with an ``__aiter__`` method, which must return an
>     *asynchronous iterator* object. Can be used with ``async for``.
>     See `Asynchronous Iterators and "async for"`_ for details.
>
> :Asynchronous iterator:
>     An asynchronous iterator has an ``__anext__`` method. See
>     `Asynchronous Iterators and "async for"`_ for details.
>
>
> List of functions and methods
> =============================
>

(I'm not sure of the utility of this section.)

> ================= =================================== =================
> Method            Can contain                         Can't contain
> ================= =================================== =================
> async def func    await, return value                 yield, yield from
> async def __a*__  await, return value                 yield, yield from
>

(This line seems redundant.)

> def __a*__        return awaitable                    await
> def __await__     yield, yield from, return iterable  await
> generator         yield, yield from, return value     await
> ================= =================================== =================
>
> Where:
>
> * "async def func": coroutine;
>
> * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
>   ``__aexit__`` defined with the ``async`` keyword;
>
> * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
>   ``__aexit__`` defined without the ``async`` keyword, must return an
>   *awaitable*;
>
> * "def __await__": ``__await__`` method to implement *Future-like*
>   objects;
>
> * generator: a "regular" generator, function defined with ``def`` and
>   which contains at least one ``yield`` or ``yield from`` expression.
>
>
> Transition Plan
> ===============
>

This may need to be pulled forward or at least mentioned earlier (in the
Abstract or near the top of the Specification).
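(To illustrate my send() question above with a concrete sketch: the one
place where send() clearly still matters is the outermost driver, e.g.
an event loop, which resumes the chain of awaits by send()ing a value
into the yield that suspended it. The Future class here is a toy
stand-in, not asyncio's:)

    class Future:
        def __await__(self):
            # The bare yield below is what ultimately suspends the
            # whole chain of awaits; send() delivers the result here.
            return (yield self)

    async def coro(fut):
        return await fut

    fut = Future()
    c = coro(fut)
    pending = c.send(None)   # start; suspends at the yield, yields fut
    assert pending is fut
    try:
        c.send('result')     # resume the whole chain with a result
    except StopIteration as exc:
        print(exc.value)     # prints 'result'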
> To avoid backwards compatibility issues with ``async`` and ``await``
> keywords, it was decided to modify ``tokenizer.c`` in such a way that
> it:
>
> * recognizes ``async def`` name tokens combination (start of a
>   coroutine);
>
> * keeps track of regular functions and coroutines;
>
> * replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with
>   ``AWAIT`` when in the process of yielding tokens for coroutines.
>
> This approach allows for seamless combination of new syntax features
> (all of them available only in ``async`` functions) with any existing
> code.
>
> An example of having "async def" and "async" attribute in one piece of
> code::
>
>     class Spam:
>         async = 42
>
>     async def ham():
>         print(getattr(Spam, 'async'))
>
>     # The coroutine can be executed and will print '42'
>
>
> Backwards Compatibility
> -----------------------
>
> This proposal preserves 100% backwards compatibility.
>

Is this still true with the proposed restrictions on what ``yield from``
accepts? (Hopefully I'm the one who is confused. :-)

>
> Grammar Updates
> ---------------
>
> Grammar changes are also fairly minimal::
>
>     decorated: decorators (classdef | funcdef | async_funcdef)
>     async_funcdef: ASYNC funcdef
>
>     compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
>                     | funcdef | classdef | decorated | async_stmt)
>
>     async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
>
>     power: atom_expr ['**' factor]
>     atom_expr: [AWAIT] atom trailer*
>
>
> Transition Period Shortcomings
> ------------------------------
>
> There is just one.
>
> Until ``async`` and ``await`` become proper keywords, it is not
> possible (or at least very hard) to fix ``tokenizer.c`` to recognize
> them on the **same line** with the ``def`` keyword::
>
>     # async and await will always be parsed as variables
>
>     async def outer():                             # 1
>         def nested(a=(await fut)):
>             pass
>
>     async def foo(): return (await fut)            # 2
>
> Since ``await`` and ``async`` in such cases are parsed as ``NAME``
> tokens, a ``SyntaxError`` will be raised.
>
> To work around these issues, the above examples can be easily rewritten
> to a more readable form::
>
>     async def outer():                             # 1
>         a_default = await fut
>         def nested(a=a_default):
>             pass
>
>     async def foo():                               # 2
>         return (await fut)
>
> This limitation will go away as soon as ``async`` and ``await`` are
> proper keywords. Or if it's decided to use a future import for this
> PEP.
>
>
> Deprecation Plans
> -----------------
>
> ``async`` and ``await`` names will be softly deprecated in CPython 3.5
> and 3.6. In 3.7 we will transform them to proper keywords. Making
> ``async`` and ``await`` proper keywords before 3.7 might make it harder
> for people to port their code to Python 3.
>
>
> asyncio
> -------
>
> The ``asyncio`` module was adapted and tested to work with coroutines
> and new statements. Backwards compatibility is 100% preserved.
>
> The required changes are mainly:
>
> 1. Modify ``@asyncio.coroutine`` decorator to use new
>    ``types.coroutine()`` function.
>
> 2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class.
>
> 3. Add ``ensure_task()`` as an alias for ``async()`` function.
>    Deprecate ``async()`` function.
>
>
> Design Considerations
> =====================
>
> PEP 3152
> --------
>
> PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
> (called "cofunctions"). Some key points:
>
> 1. A new keyword ``codef`` to declare a *cofunction*. *Cofunction* is
>    always a generator, even if there are no ``cocall`` expressions
>    inside it. Maps to ``async def`` in this proposal.
>
> 2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
>    inside a *cofunction*. Maps to ``await`` in this proposal (with
>    some differences, see below.)
>
> 3. It is not possible to call a *cofunction* without a ``cocall``
>    keyword.
>
> 4. ``cocall`` grammatically requires parentheses after it::
>
>     atom: cocall | 
>     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
>     cotrailer: '[' subscriptlist ']' | '.' NAME
>
> 5. ``cocall f(*args, **kwds)`` is semantically equivalent to
>    ``yield from f.__cocall__(*args, **kwds)``.
>
> Differences from this proposal:
>
> 1. There is no equivalent of ``__cocall__`` in this PEP, which is
>    called and its result is passed to ``yield from`` in the ``cocall``
>    expression. ``await`` keyword expects an *awaitable* object,
>    validates the type, and executes ``yield from`` on it. The
>    ``__await__`` method is similar to ``__cocall__``, but it is only
>    used to define *Future-like* objects.
>
> 2. ``await`` is defined in almost the same way as ``yield from`` in the
>    grammar (it is later enforced that ``await`` can only be inside
>    ``async def``). It is possible to simply write ``await future``,
>    whereas ``cocall`` always requires parentheses.
>
> 3. To make asyncio work with PEP 3152 it would be required to modify
>    ``@asyncio.coroutine`` decorator to wrap all functions in an object
>    with a ``__cocall__`` method, or to implement ``__cocall__`` on
>    generators. To call *cofunctions* from existing generator-based
>    coroutines it would be required to use ``costart(cofunc, *args,
>    **kwargs)`` built-in.
>
> 4. Since it is impossible to call a *cofunction* without a ``cocall``
>    keyword, it automatically prevents the common mistake of forgetting
>    to use ``yield from`` on generator-based coroutines. This proposal
>    addresses this problem with a different approach, see `Debugging
>    Features`_.
>
> 5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
>    is that if it is decided to implement coroutine-generators --
>    coroutines with ``yield`` or ``async yield`` expressions -- we
>    wouldn't need a ``cocall`` keyword to call them. So we'll end up
>    having ``__cocall__`` and no ``__call__`` for regular coroutines,
>    and having ``__call__`` and no ``__cocall__`` for coroutine-
>    generators.
>
> 6. Requiring parentheses grammatically also introduces a whole lot
>    of new problems.
>
>    The following code::
>
>        await fut
>        await function_returning_future()
>        await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))
>
>    would look like::
>
>        cocall fut()  # or cocall costart(fut)
>        cocall (function_returning_future())()
>        cocall asyncio.gather(costart(coro1, arg1, arg2),
>                              costart(coro2, arg1, arg2))
>
> 7. There are no equivalents of ``async for`` and ``async with`` in PEP
>    3152.
>
>
> Coroutine-generators
> --------------------
>
> With the ``async for`` keyword it is desirable to have a concept of a
> *coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
> expressions. To avoid any ambiguity with regular generators, we would
> likely require an ``async`` keyword before ``yield``, and ``async
> yield from`` would raise a ``StopAsyncIteration`` exception.
>
> While it is possible to implement coroutine-generators, we believe that
> they are out of scope of this proposal. It is an advanced concept that
> should be carefully considered and balanced, with non-trivial changes
> in the implementation of current generator objects. This is a matter
> for a separate PEP.
>
>
> No implicit wrapping in Futures
> -------------------------------
>
> There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A
> key difference is that JavaScript "async functions" always return a
> Promise. While this approach has some advantages, it also implies that
> a new Promise object is created on each "async function" invocation.
>
> We could implement a similar functionality in Python, by wrapping all
> coroutines in a Future object, but this has the following
> disadvantages:
>
> 1. Performance. A new Future object would be instantiated on each
>    coroutine call. Moreover, this makes implementation of ``await``
>    expressions slower (disabling optimizations of ``yield from``).
>
> 2. A new built-in ``Future`` object would need to be added.
>
> 3. Coming up with a generic ``Future`` interface that is usable for any
>    use case in any framework is a very hard problem to solve.
>
> 4. It is not a feature that is used frequently, when most of the code
>    is coroutines.
>
>
> Why "async" and "await" keywords
> --------------------------------
>
> async/await is not a new concept in programming languages:
>
> * C# has had it for a long time [5]_;
>
> * proposal to add async/await in ECMAScript 7 [2]_;
>   see also Traceur project [9]_;
>
> * Facebook's Hack/HHVM [6]_;
>
> * Google's Dart language [7]_;
>
> * Scala [8]_;
>
> * proposal to add async/await to C++ [10]_;
>
> * and many other less popular languages.
>
> This is a huge benefit, as some users already have experience with
> async/await, and because it makes working with many languages in one
> project easier (Python with ECMAScript 7 for instance).
>
>
> Why "__aiter__" is a coroutine
> ------------------------------
>
> In principle, ``__aiter__`` could be a regular function. There are
> several good reasons to make it a coroutine:
>
> * as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
>   methods are coroutines, users would often make a mistake defining it
>   as ``async`` anyway;
>
> * there might be a need to run some asynchronous operations in
>   ``__aiter__``, for instance to prepare DB queries or do some file
>   operation.
>
>
> Importance of "async" keyword
> -----------------------------
>
> While it is possible to just implement ``await`` expression and treat
> all functions with at least one ``await`` as coroutines, this approach
> makes API design, code refactoring and long-term support harder.
>
> Let's pretend that Python only has the ``await`` keyword::
>
>     def useful():
>         ...
>         await log(...)
>         ...
>
>     def important():
>         await useful()
>
> If the ``useful()`` function is refactored and someone removes all
> ``await`` expressions from it, it would become a regular Python
> function, and all code that depends on it, including ``important()``
> would be broken. To mitigate this issue a decorator similar to
> ``@asyncio.coroutine`` has to be introduced.
>
>
> Why "async def"
> ---------------
>
> For some people bare ``async name(): pass`` syntax might look more
> appealing than ``async def name(): pass``. It is certainly easier to
> type. But on the other hand, it breaks the symmetry between ``async
> def``, ``async with`` and ``async for``, where ``async`` is a modifier,
> stating that the statement is asynchronous. It is also more consistent
> with the existing grammar.
>
>
> Why "async for/with" instead of "await for/with"
> ------------------------------------------------
>
> ``async`` is an adjective, and hence it is a better choice for a
> *statement qualifier* keyword. ``await for/with`` would imply that
> something is awaiting the completion of a ``for`` or ``with``
> statement.
>
>
> Why "async def" and not "def async"
> -----------------------------------
>
> The ``async`` keyword is a *statement qualifier*. Good analogies are
> the "static", "public", "unsafe" keywords from other languages. "async
> for" is an asynchronous "for" statement, "async with" is an
> asynchronous "with" statement, "async def" is an asynchronous function.
>
> Having "async" after the main statement keyword might introduce some
> confusion, like "for async item in iterator" can be read as "for each
> asynchronous item in iterator".
>
> Having ``async`` keyword before ``def``, ``with`` and ``for`` also
> makes the language grammar simpler. And "async def" better separates
> coroutines from regular functions visually.
>
>
> Why not a __future__ import
> ---------------------------
>
> ``__future__`` imports are inconvenient and easy to forget to add.
> Also, they are enabled for the whole source file. Consider that there
> is a big project with a popular module named "async.py". With future
> imports it is required to either import it using ``__import__()`` or
> ``importlib.import_module()`` calls, or to rename the module. The
> proposed approach makes it possible to continue using old code and
> modules without a hassle, while coming up with a migration plan for
> future Python versions.
>
>
> Why magic methods start with "a"
> --------------------------------
>
> New asynchronous magic methods ``__aiter__``, ``__anext__``,
> ``__aenter__``, and ``__aexit__`` all start with the same prefix "a".
> An alternative proposal is to use "async" prefix, so that ``__aiter__``
> becomes ``__async_iter__``. However, to align new magic methods with
> the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided
> to use a shorter version.
>
>
> Why not reuse existing magic names
> ----------------------------------
>
> An alternative idea about new asynchronous iterators and context
> managers was to reuse existing magic methods, by adding an ``async``
> keyword to their declarations::
>
>     class CM:
>         async def __enter__(self):  # instead of __aenter__
>             ...
>
> This approach has the following downsides:
>
> * it would not be possible to create an object that works in both
>   ``with`` and ``async with`` statements;
>
> * it would break backwards compatibility, as nothing prohibits
>   returning Future-like objects from ``__enter__`` and/or
>   ``__exit__`` in Python <= 3.4;
>
> * one of the main points of this proposal is to make coroutines as
>   simple and foolproof as possible, hence the clear separation of the
>   protocols.
>
>
> Why not reuse existing "for" and "with" statements
> --------------------------------------------------
>
> The vision behind existing generator-based coroutines and this proposal
> is to make it easy for users to see where the code might be suspended.
> Making the existing "for" and "with" statements recognize asynchronous
> iterators and context managers will inevitably create implicit suspend
> points, making it harder to reason about the code.
>
>
> Comprehensions
> --------------
>
> For the sake of restricting the broadness of this PEP there is no new
> syntax for asynchronous comprehensions. This should be considered in a
> separate PEP, if there is a strong demand for this feature.
>
>
> Async lambdas
> -------------
>
> Lambda coroutines are not part of this proposal. In this proposal they
> would look like ``async lambda(parameters): expression``. Unless there
> is a strong demand to have them as part of this proposal, it is
> recommended to consider them later in a separate PEP.
>
>
> Performance
> ===========
>
> Overall Impact
> --------------
>
> This proposal introduces no observable performance impact. Here is an
> output of Python's official set of benchmarks [4]_:
>
> ::
>
>     python perf.py -r -b default ../cpython/python.exe
>     ../cpython-aw/python.exe
>
>     [skipped]
>
>     Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
>     Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
>     x86_64 i386
>
>     Total CPU cores: 8
>
>     ### etree_iterparse ###
>     Min: 0.365359 -> 0.349168: 1.05x faster
>     Avg: 0.396924 -> 0.379735: 1.05x faster
>     Significant (t=9.71)
>     Stddev: 0.01225 -> 0.01277: 1.0423x larger
>
>     The following not significant results are hidden, use -v to show them:
>     django_v2, 2to3, etree_generate, etree_parse, etree_process,
>     fastpickle,
>     fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
>
>
> Tokenizer modifications
> -----------------------
>
> There is no observable slowdown of parsing Python files with the
> modified tokenizer: parsing of one 12Mb file
> (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
> of time.
>
>
> async/await
> -----------
>
> The following micro-benchmark was used to determine the performance
> difference between "async" functions and generators::
>
>     import sys
>     import time
>
>     def binary(n):
>         if n <= 0:
>             return 1
>         l = yield from binary(n - 1)
>         r = yield from binary(n - 1)
>         return l + 1 + r
>
>     async def abinary(n):
>         if n <= 0:
>             return 1
>         l = await abinary(n - 1)
>         r = await abinary(n - 1)
>         return l + 1 + r
>
>     def timeit(gen, depth, repeat):
>         t0 = time.time()
>         for _ in range(repeat):
>             list(gen(depth))
>         t1 = time.time()
>         print('{}({}) * {}: total {:.3f}s'.format(
>             gen.__name__, depth, repeat, t1-t0))
>
> The result is that there is no observable performance difference.
> Minimum timing of 3 runs
>
> ::
>
>     abinary(19) * 30: total 12.985s
>     binary(19) * 30: total 12.953s
>
> Note that depth of 19 means 1,048,575 calls.
>
>
> Reference Implementation
> ========================
>
> The reference implementation can be found here: [3]_.
>
> List of high-level changes and new protocols
> --------------------------------------------
>
> 1. New syntax for defining coroutines: ``async def`` and new ``await``
>    keyword.
>
> 2. New ``__await__`` method for Future-like objects, and new
>    ``tp_await`` slot in ``PyTypeObject``.
>
> 3. New syntax for asynchronous context managers: ``async with``. And
>    associated protocol with ``__aenter__`` and ``__aexit__`` methods.
>
> 4. New syntax for asynchronous iteration: ``async for``. And
>    associated protocol with ``__aiter__``, ``__anext__`` and new built-
>    in exception ``StopAsyncIteration``.
>
> 5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
>    ``Await``.
>
> 6. New functions: ``sys.set_coroutine_wrapper(callback)``,
>    ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
>    ``inspect.iscoroutinefunction()``, and ``inspect.iscoroutine()``.
>
> 7. New ``CO_COROUTINE`` bit flag for code objects.
>
> While the list of changes and new things is not short, it is important
> to understand that most users will not use these features directly.
> It is intended to be used in frameworks and libraries to provide users
> with convenient-to-use and unambiguous APIs with ``async def``,
> ``await``, ``async for`` and ``async with`` syntax.
>
>
> Working example
> ---------------
>
> All concepts proposed in this PEP are implemented [3]_ and can be
> tested.
>
> ::
>
>     import asyncio
>
>     async def echo_server():
>         print('Serving on localhost:8000')
>         await asyncio.start_server(handle_connection,
>                                    'localhost', 8000)
>
>     async def handle_connection(reader, writer):
>         print('New connection...')
>
>         while True:
>             data = await reader.read(8192)
>
>             if not data:
>                 break
>
>             print('Sending {:.10}... back'.format(repr(data)))
>             writer.write(data)
>
>     loop = asyncio.get_event_loop()
>     loop.run_until_complete(echo_server())
>     try:
>         loop.run_forever()
>     finally:
>         loop.close()
>
>
> References
> ==========
>
> .. [1]
>    https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
>
> .. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
>
> .. [3] https://github.com/1st1/cpython/tree/await
>
> .. [4] https://hg.python.org/benchmarks
>
> .. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx
>
> .. [6] http://docs.hhvm.com/manual/en/hack.async.php
>
> .. [7] https://www.dartlang.org/articles/await-async/
>
> .. [8] http://docs.scala-lang.org/sips/pending/async.html
>
> .. [9]
>    https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental
>
> .. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf
>    (PDF)
>
>
> Acknowledgments
> ===============
>
> I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
> Svetlov, and Łukasz Langa for their initial feedback.
>
>
> Copyright
> =========
>
> This document has been placed in the public domain.
>
> ..
>    Local Variables:
>    mode: indented-text
>    indent-tabs-mode: nil
>    sentence-end-double-space: t
>    fill-column: 70
>    coding: utf-8
>    End:
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org
>

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From yselivanov.ml at gmail.com  Wed Apr 29 01:26:57 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Tue, 28 Apr 2015 19:26:57 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: 
References: <553EF985.2070808@gmail.com>
Message-ID: <55401741.9040703@gmail.com>

Hi Guido,

Thank you for a very detailed review. Comments below:

On 2015-04-28 5:49 PM, Guido van Rossum wrote:
> Inline comments below...
>
> On Mon, Apr 27, 2015 at 8:07 PM, Yury Selivanov 
> wrote:
>
>> Hi python-dev,
>>
>> Another round of updates. Reference implementation
>> has been updated: https://github.com/1st1/cpython/tree/await
>> (includes all things from the below summary of updates +
>> tests).
>>
>>
>> Summary:
>>
>>
>> 1. "PyTypeObject.tp_await" slot. Replaces "tp_reserved".
>> This is to enable implementation of Futures with C API.
>> Must return an iterator if implemented.
>>
> That's fine (though I didn't follow this closely).

My main question here is: is it OK to reuse 'tp_reserved' (former
tp_compare)?
I had to remove this check:
https://github.com/1st1/cpython/commit/4be6d0a77688b63b917ad88f09d446ac3b7e2ce9#diff-c3cf251f16d5a03a9e7d4639f2d6f998L4906

On the other hand I think that it's a slightly better solution than
adding a new slot.

>> 2. New grammar for "await" expressions, see
>> 'Syntax of "await" expression' section
>>
> I like it.

Great!

The current grammar requires parentheses for consequent await
expressions:

     await (await coro())

I can change this (in theory), but I kind of like the parens in
this case -- better readability. And it'll be a very rare case.

>> 3. inspect.iscoroutine() and inspect.iscoroutinefunction()
>> functions.
>>
> What's the use case for these? I wonder if it makes more sense to have a
> check for a generalized awaitable rather than specifically a coroutine.

It's important to at least have 'iscoroutinefunction' -- to check that
the object is a coroutine function. A typical use-case would be a web
framework that lets you bind coroutines to specific http methods/paths:

     @http.get('/spam')
     async def handle_spam(request):
         ...

'http.get' decorator will need a way to raise an error if it's
applied to a regular function (while the code is being imported,
not at runtime). A rough sketch of such a check follows below, after
my notes on (d).

The idea here is to cover all kinds of Python objects in the inspect
module; it's Python's reflection API.

The other thing is that it's easy to implement this function for
CPython: just check for the CO_COROUTINE flag. For other Python
implementations it might be a different story.

(More arguments for isawaitable() below)

>
>> 4. Full separation of coroutines and generators.
>> This is a big one; let's discuss.
>>
>> a) Coroutine objects raise TypeError (is NotImplementedError
>> better?) in their __iter__ and __next__. Therefore it's
>> not possible to pass them to iter(), tuple(), next() and
>> other similar functions that work with iterables.
>>
> I think it should be TypeError -- what you *really* want is not to define
> these methods at all but given the implementation tactic for coroutines
> that may not be possible, so the nearest approximation is TypeError. (Also,
> NotImplementedError is typically to indicate that a subclass should
> implement it.)

Agree.

>> b) Because of (a), for..in iteration also does not work
>> on coroutines anymore.
>>
> Sounds good.

>> c) 'yield from' only accepts coroutine objects from
>> generators decorated with 'types.coroutine'. That means
>> that existing asyncio generator-based coroutines will
>> happily yield from both coroutines and generators.
>> *But* every generator-based coroutine *must* be
>> decorated with `asyncio.coroutine()`. This is
>> potentially a backwards incompatible change.
>>
> See below. I worry about backward compatibility. A lot. Are you saying
> that asyncio-based code that doesn't use @coroutine will break in 3.5?

I'll experiment with replacing (c) with a warning. We can disable
__iter__ and __next__ for coroutines, but allow using 'yield from' on
them. Would it be a better approach?

>
>> d) inspect.isgenerator() and inspect.isgeneratorfunction()
>> return `False` for coroutine objects & coroutine functions.
>>
> Makes sense.

(d) can also break something (hypothetically). I'm not sure why
someone would use isgenerator() and isgeneratorfunction() on
generator-based coroutines in code based on asyncio, but there is
a chance that someone did (it should be trivial to fix the code).
Same for iter() and next(). The chance is slim, but we may break
some obscure code. Are you OK with this?
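Here is the rough sketch I mentioned above. The framework names
('routes', 'http.get') are hypothetical; only
inspect.iscoroutinefunction() is taken from the PEP:

     import inspect

     routes = {}  # hypothetical registry: (method, path) -> handler

     def get(path):
         def decorator(func):
             # Fail at import time, not per-request, if a plain
             # function is registered where a coroutine is required.
             if not inspect.iscoroutinefunction(func):
                 raise TypeError(
                     '{!r} must be declared with "async def"'.format(func))
             routes[('GET', path)] = func
             return func
         return decorator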
> >> e) Should we add a coroutine ABC (for cython etc)? >> >> I, personally, think this is highly necessary. First, >> separation of coroutines from generators is extremely >> important. One day there won't be generator-based >> coroutines, and we want to avoid any kind of confusion. >> Second, we only can do this in 3.5. This kind of >> semantics change won't be ever possible. >> > Sounds like Stefan agrees. Are you aware of > http://bugs.python.org/issue24018 (Generator ABC)? Yes, I saw the issue. I'll review it in more detail before thinking about Coroutine ABC for the next PEP update. > > >> asyncio recommends using @coroutine decorator, and most >> projects that I've seen do use it. Also there is no >> reason for people to use iter() and next() functions >> on coroutines when writing asyncio code. I doubt that >> this will cause serious backwards compatibility problems >> (asyncio also has provisional status). >> >> I wouldn't count too much on asyncio's provisional status. What are the > consequences for code that is written to work with asyncio but doesn't use > @coroutine? Such code will work with 3.4 and (despite the provisional > status and the recommendation to use @coroutine) I don't want that code to > break in 3.5 (though maybe a warning would be fine). > > I also hope that if someone has their own (renamed) copy of asyncio that > works with 3.4, it will all still work with 3.5. Even if asyncio itself is > provisional, none of the primitives (e.g. yield from) that it is built upon > are provisional, so there should be no reason for it to break in 3.5. I agree. I'll try warnings for yield-fromming coroutines from regular generators (so that we can disable it in 3.7/3.6). *If that doesn't work*, I think we need a compromise (not ideal, but breaking things is worse): - yield from would always accept coroutine-objects - iter(), next(), tuple(), etc won't work on coroutine-objects - for..in won't work on coroutine-objects > > >> Thank you, >> Yury >> >> Some more inline comments directly on the PEP below. > >> PEP: 492 >> Title: Coroutines with async and await syntax >> Version: $Revision$ >> Last-Modified: $Date$ >> Author: Yury Selivanov >> Status: Draft >> Type: Standards Track >> Content-Type: text/x-rst >> Created: 09-Apr-2015 >> Python-Version: 3.5 >> Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015 >> >> >> Abstract >> ======== >> >> This PEP introduces new syntax for coroutines, asynchronous ``with`` >> statements and ``for`` loops. The main motivation behind this proposal >> is to streamline writing and maintaining asynchronous code, as well as >> to simplify previously hard to implement code patterns. >> >> >> Rationale and Goals >> =================== >> >> Current Python supports implementing coroutines via generators (PEP >> 342), further enhanced by the ``yield from`` syntax introduced in PEP >> 380. This approach has a number of shortcomings: >> >> * it is easy to confuse coroutines with regular generators, since they >> share the same syntax; async libraries often attempt to alleviate >> this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); >> >> * it is not possible to natively define a coroutine which has no >> ``yield`` or ``yield from`` statements, again requiring the use of >> decorators to fix potential refactoring issues; >> >> (I have to agree with Mark that this point is pretty weak. 
:-) > >> * support for asynchronous calls is limited to expressions where >> ``yield`` is allowed syntactically, limiting the usefulness of >> syntactic features, such as ``with`` and ``for`` statements. >> >> This proposal makes coroutines a native Python language feature, and >> clearly separates them from generators. This removes >> generator/coroutine ambiguity, and makes it possible to reliably define >> coroutines without reliance on a specific library. This also enables >> linters and IDEs to improve static code analysis and refactoring. >> >> Native coroutines and the associated new syntax features make it >> possible to define context manager and iteration protocols in >> asynchronous terms. As shown later in this proposal, the new ``async >> with`` statement lets Python programs perform asynchronous calls when >> entering and exiting a runtime context, and the new ``async for`` >> statement makes it possible to perform asynchronous calls in iterators. >> > I wonder if you could add some adaptation of the explanation I have posted > (a few times now, I feel) for the reason why I prefer to suspend only at > syntactically recognizable points (yield [from] in the past, await and > async for/with in this PEP). Unless you already have it in the rationale > (though it seems Mark didn't think it was enough :-). I'll see what I can do. > >> >> Specification >> ============= >> >> This proposal introduces new syntax and semantics to enhance coroutine >> support in Python, it does not change the internal implementation of >> coroutines, which are still based on generators. >> >> It's actually a separate issue whether any implementation changes. > Implementation changes don't need to go through the PEP process, unless > they're really also interface changes. > > >> It is strongly suggested that the reader understands how coroutines are >> implemented in Python (PEP 342 and PEP 380). It is also recommended to >> read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). >> >> From this point in this document we use the word *coroutine* to refer >> to functions declared using the new syntax. *generator-based >> coroutine* is used where necessary to refer to coroutines that are >> based on generator syntax. >> > Despite reading this I still get confused when reading the PEP (probably > because asyncio uses "coroutine" in the latter sense). Maybe it would make > sense to write "native coroutine" for the new concept, to distinguish the > two concepts more clearly? (You could even change "awaitable" to > "coroutine". Though I like "awaitable" too.) "awaitable" is a more generic term... It can be a future, or it can be a coroutine. Mixing them in one may create more confusion. Also, "awaitable" is more of an interface, or a trait, which means that the object won't be rejected by the 'await' expression. I like your 'native coroutine' suggestion. I'll update the PEP. > >> >> New Coroutine Declaration Syntax >> -------------------------------- >> >> The following new syntax is used to declare a coroutine:: >> >> async def read_data(db): >> pass >> >> Key properties of coroutines: >> >> * ``async def`` functions are always coroutines, even if they do not >> contain ``await`` expressions. >> >> * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` >> expressions in an ``async`` function. 
>> >> For Mark's benefit, might add that this is similar to how ``return`` and > ``yield`` are disallowed syntactically outside functions (as are the > syntactic constraints on ``await` and ``async for|def``). OK > > >> * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced >> to enable runtime detection of coroutines (and migrating existing >> code). All coroutines have both ``CO_COROUTINE`` and ``CO_GENERATOR`` >> flags set. >> >> * Regular generators, when called, return a *generator object*; >> similarly, coroutines return a *coroutine object*. >> >> * ``StopIteration`` exceptions are not propagated out of coroutines, >> and are replaced with a ``RuntimeError``. For regular generators >> such behavior requires a future import (see PEP 479). >> >> >> types.coroutine() >> ----------------- >> >> A new function ``coroutine(gen)`` is added to the ``types`` module. It >> applies ``CO_COROUTINE`` flag to the passed generator-function's code >> object, making it to return a *coroutine object* when called. >> >> Clarify that this is a decorator that modifies the function object in > place. Good catch. > > >> This feature enables an easy upgrade path for existing libraries. >> >> >> Await Expression >> ---------------- >> >> The following new ``await`` expression is used to obtain a result of >> coroutine execution:: >> >> async def read_data(db): >> data = await db.fetch('SELECT ...') >> ... >> >> ``await``, similarly to ``yield from``, suspends execution of >> ``read_data`` coroutine until ``db.fetch`` *awaitable* completes and >> returns the result data. >> >> It uses the ``yield from`` implementation with an extra step of >> validating its argument. ``await`` only accepts an *awaitable*, which >> can be one of: >> >> * A *coroutine object* returned from a *coroutine* or a generator >> decorated with ``types.coroutine()``. >> >> * An object with an ``__await__`` method returning an iterator. >> >> Any ``yield from`` chain of calls ends with a ``yield``. This is a >> fundamental mechanism of how *Futures* are implemented. Since, >> internally, coroutines are a special kind of generators, every >> ``await`` is suspended by a ``yield`` somewhere down the chain of >> ``await`` calls (please refer to PEP 3156 for a detailed >> explanation.) >> >> To enable this behavior for coroutines, a new magic method called >> ``__await__`` is added. In asyncio, for instance, to enable Future >> objects in ``await`` statements, the only change is to add >> ``__await__ = __iter__`` line to ``asyncio.Future`` class. >> >> Objects with ``__await__`` method are called *Future-like* objects in >> the rest of this PEP. >> >> Also, please note that ``__aiter__`` method (see its definition >> below) cannot be used for this purpose. It is a different protocol, >> and would be like using ``__iter__`` instead of ``__call__`` for >> regular callables. >> >> It is a ``TypeError`` if ``__await__`` returns anything but an >> iterator. >> >> * Objects defined with CPython C API with a ``tp_await`` function, >> returning an iterator (similar to ``__await__`` method). >> >> It is a ``SyntaxError`` to use ``await`` outside of a coroutine. >> >> It is a ``TypeError`` to pass anything other than an *awaitable* object >> to an ``await`` expression. >> >> >> Syntax of "await" expression >> '''''''''''''''''''''''''''' >> >> ``await`` keyword is defined differently from ``yield`` and ``yield >> from``. The main difference is that *await expressions* do not require >> parentheses around them most of the times. 
>> >> Examples:: >> >> ================================== ================================== >> Expression Will be parsed as >> ================================== ================================== >> ``if await fut: pass`` ``if (await fut): pass`` >> ``if await fut + 1: pass`` ``if (await fut) + 1: pass`` >> ``pair = await fut, 'spam'`` ``pair = (await fut), 'spam'`` >> ``with await fut, open(): pass`` ``with (await fut), open(): pass`` >> ``await foo()['spam'].baz()()`` ``await ( foo()['spam'].baz()() )`` >> ``return await coro()`` ``return ( await coro() )`` >> ``res = await coro() ** 2`` ``res = (await coro()) ** 2`` >> ``func(a1=await coro(), a2=0)`` ``func(a1=(await coro()), a2=0)`` >> ================================== ================================== >> >> See `Grammar Updates`_ section for details. >> >> >> Asynchronous Context Managers and "async with" >> ---------------------------------------------- >> >> An *asynchronous context manager* is a context manager that is able to >> suspend execution in its *enter* and *exit* methods. >> >> To make this possible, a new protocol for asynchronous context managers >> is proposed. Two new magic methods are added: ``__aenter__`` and >> ``__aexit__``. Both must return an *awaitable*. >> >> An example of an asynchronous context manager:: >> >> class AsyncContextManager: >> async def __aenter__(self): >> await log('entering context') >> >> async def __aexit__(self, exc_type, exc, tb): >> await log('exiting context') >> >> >> New Syntax >> '''''''''' >> >> A new statement for asynchronous context managers is proposed:: >> >> async with EXPR as VAR: >> BLOCK >> >> >> which is semantically equivalent to:: >> >> mgr = (EXPR) >> aexit = type(mgr).__aexit__ >> aenter = type(mgr).__aenter__(mgr) >> exc = True >> >> try: >> try: >> VAR = await aenter >> BLOCK >> except: >> exc = False >> exit_res = await aexit(mgr, *sys.exc_info()) >> if not exit_res: >> raise >> >> finally: >> if exc: >> await aexit(mgr, None, None, None) >> >> I realize you copied this from PEP 343, but I wonder, do we really need > two nested try statements and the 'exc' flag? Why can't the finally step be > placed in an 'else' clause on the inner try? (There may well be a reason > but I can't figure what it is, and PEP 343 doesn't seem to explain it.) > > Also, it's a shame we're perpetuating the sys.exc_info() triple in the API > here, but I agree making __exit__ and __aexit__ different also isn't a > great idea. :-( > > PS. With the new tighter syntax for ``await`` you don't need the > ``exit_res`` variable any more. Yes, this can be simplified. It was indeed copied from PEP 343. > > >> As with regular ``with`` statements, it is possible to specify multiple >> context managers in a single ``async with`` statement. >> >> It is an error to pass a regular context manager without ``__aenter__`` >> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` >> to use ``async with`` outside of a coroutine. >> >> >> Example >> ''''''' >> >> With asynchronous context managers it is easy to implement proper >> database transaction managers for coroutines:: >> >> async def commit(session, data): >> ... >> >> async with session.transaction(): >> ... >> await session.update(data) >> ... >> >> Code that needs locking also looks lighter:: >> >> async with lock: >> ... >> >> instead of:: >> >> with (yield from lock): >> ... >> > (Also, the implementation of the latter is problematic -- check > asyncio/locks.py and notice that __enter__ is empty...) 
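Regarding the simplification discussed above: a rough sketch of what the ``async with`` expansion could look like with an ``else`` clause instead of the nested ``try`` and the ``exc`` flag (illustrative only, not the PEP's official wording):

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)

    VAR = await aenter
    try:
        BLOCK
    except:
        # exceptional path: pass the exception details to __aexit__
        # and re-raise unless it returns a true value
        if not await aexit(mgr, *sys.exc_info()):
            raise
    else:
        # normal path: call __aexit__ with None arguments
        await aexit(mgr, None, None, None)

This keeps the behavior of the original expansion while dropping one level of nesting and the flag variable.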
> >> >> Asynchronous Iterators and "async for" >> -------------------------------------- >> >> An *asynchronous iterable* is able to call asynchronous code in its >> *iter* implementation, and an *asynchronous iterator* can call >> asynchronous code in its *next* method. To support asynchronous >> iteration: >> >> 1. An object must implement an ``__aiter__`` method returning an >> *awaitable* resulting in an *asynchronous iterator object*. >> > Have you considered making __aiter__ not an awaitable? It's not strictly > necessary I think, one could do all the awaiting in __anext__. Though > perhaps there are use cases that are more naturally expressed by awaiting > in __aiter__? (Your examples all use ``async def __aiter__(self): return > self`` suggesting this would be no great loss.) There is a section in Design Considerations about this. I should add a reference to it. > > >> 2. An *asynchronous iterator object* must implement an ``__anext__`` >> method returning an *awaitable*. >> >> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` >> exception. >> >> An example of an asynchronous iterable:: >> >> class AsyncIterable: >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> data = await self.fetch_data() >> if data: >> return data >> else: >> raise StopAsyncIteration >> >> async def fetch_data(self): >> ... >> >> >> New Syntax >> '''''''''' >> >> A new statement for iterating through asynchronous iterators is >> proposed:: >> >> async for TARGET in ITER: >> BLOCK >> else: >> BLOCK2 >> >> which is semantically equivalent to:: >> >> iter = (ITER) >> iter = await type(iter).__aiter__(iter) >> running = True >> while running: >> try: >> TARGET = await type(iter).__anext__(iter) >> except StopAsyncIteration: >> running = False >> else: >> BLOCK >> else: >> BLOCK2 >> >> >> It is a ``TypeError`` to pass a regular iterable without an ``__aiter__`` >> method to ``async for``. It is a ``SyntaxError`` to use ``async for`` >> outside of a coroutine. >> >> As with the regular ``for`` statement, ``async for`` has an optional >> ``else`` clause. >> >> (Not because we're particularly fond of it, but because its absence would > just introduce more special cases. :-) > > >> Example 1 >> ''''''''' >> >> With the asynchronous iteration protocol it is possible to asynchronously >> buffer data during iteration:: >> >> async for data in cursor: >> ... >> >> Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows >> of data from a database after every ``N`` iterations. >> >> The following code illustrates the new asynchronous iteration protocol:: >> >> class Cursor: >> def __init__(self): >> self.buffer = collections.deque() >> >> def _prefetch(self): >> ... >> >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> if not self.buffer: >> self.buffer = await self._prefetch() >> if not self.buffer: >> raise StopAsyncIteration >> return self.buffer.popleft() >> >> then the ``Cursor`` class can be used as follows:: >> >> async for row in Cursor(): >> print(row) >> >> which would be equivalent to the following code:: >> >> i = await Cursor().__aiter__() >> while True: >> try: >> row = await i.__anext__() >> except StopAsyncIteration: >> break >> else: >> print(row) >> >> >> Example 2 >> ''''''''' >> >> The following is a utility class that transforms a regular iterable to >> an asynchronous one. While this is not a very useful thing to do, the >> code illustrates the relationship between regular and asynchronous >> iterators.
>> >> :: >> >> class AsyncIteratorWrapper: >> def __init__(self, obj): >> self._it = iter(obj) >> >> async def __aiter__(self): >> return self >> >> async def __anext__(self): >> try: >> value = next(self._it) >> except StopIteration: >> raise StopAsyncIteration >> return value >> >> async for letter in AsyncIteratorWrapper("abc"): >> print(letter) >> >> >> Why StopAsyncIteration? >> ''''''''''''''''''''''' >> >> I keep wanting to propose to rename this to AsyncStopIteration. I know > it's about stopping an async iteration, but in my head I keep referring to > it as AsyncStopIteration, probably because in other places we use async (or > 'a') as a prefix. I'd be totally OK with that. Should I rename it? > > >> Coroutines are still based on generators internally. So, before PEP >> 479, there was no fundamental difference between >> >> :: >> >> def g1(): >> yield from fut >> return 'spam' >> >> and >> >> :: >> >> def g2(): >> yield from fut >> raise StopIteration('spam') >> >> And since PEP 479 is accepted and enabled by default for coroutines, >> the following example will have its ``StopIteration`` wrapped into a >> ``RuntimeError`` >> >> :: >> >> async def a1(): >> await fut >> raise StopIteration('spam') >> >> The only way to tell the outside code that the iteration has ended is >> to raise something other than ``StopIteration``. Therefore, a new >> built-in exception class ``StopAsyncIteration`` was added. >> >> Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions >> raised in coroutines are wrapped in ``RuntimeError``. >> >> >> Debugging Features >> ------------------ >> >> One of the most frequent mistakes that people make when using >> generators as coroutines is forgetting to use ``yield from``:: >> >> @asyncio.coroutine >> def useful(): >> asyncio.sleep(1) # this will do nothing without 'yield from' >> >> Might be useful to point out that this was the one major advantage of PEP > 3152 -- although it wasn't enough to save that PEP, and in your response > you pointed out that this mistake is not all that common. Although you seem > to disagree with that here ("One of the most frequent mistakes ..."). I think it's a mistake that a lot of beginners may make at some point (and in this sense it's frequent). I really doubt that you would make it again once it has bitten you more than a couple of times. This is a small wart, but we have to have a solution for it. > > >> For debugging this kind of mistake there is a special debug mode in >> asyncio, in which the ``@coroutine`` decorator wraps all functions with a >> special object with a destructor logging a warning. Whenever a wrapped >> generator gets garbage collected, a detailed logging message is >> generated with information about where exactly the decorator function >> was defined, stack trace of where it was collected, etc. The wrapper >> object also provides a convenient ``__repr__`` function with detailed >> information about the generator. >> >> The only problem is how to enable these debug capabilities. Since >> debug facilities should be a no-op in production mode, the ``@coroutine`` >> decorator makes the decision of whether to wrap or not to wrap based on >> an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is >> possible to run asyncio programs with asyncio's own functions >> instrumented. ``EventLoop.set_debug``, a different debug facility, has >> no impact on the ``@coroutine`` decorator's behavior. >> >> With this proposal, coroutines are a native concept, distinct from >> generators.
New methods ``set_coroutine_wrapper`` and >> ``get_coroutine_wrapper`` are added to the ``sys`` module, with which >> frameworks can provide advanced debugging facilities. >> >> These two appear to be unspecified except by example. Will add a subsection specifically for them. > >> It is also important to make coroutines as fast and efficient as >> possible, therefore there are no debug features enabled by default. >> >> Example:: >> >> async def debug_me(): >> await asyncio.sleep(1) >> >> def async_debug_wrap(generator): >> return asyncio.CoroWrapper(generator) >> >> sys.set_coroutine_wrapper(async_debug_wrap) >> >> debug_me() # <- this line will likely GC the coroutine object and >> # trigger asyncio.CoroWrapper's code. >> >> assert isinstance(debug_me(), asyncio.CoroWrapper) >> >> sys.set_coroutine_wrapper(None) # <- this unsets any >> # previously set wrapper >> assert not isinstance(debug_me(), asyncio.CoroWrapper) >> >> If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper >> replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)`` >> unsets the wrapper. >> >> >> inspect.iscoroutine() and inspect.iscoroutinefunction() >> ----------------------------------------------------- >> >> Two new functions are added to the ``inspect`` module: >> >> * ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a >> coroutine object. >> >> * ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a >> coroutine function. >> >> Maybe isawaitable() and isawaitablefunction() are also useful? (Or only > isawaitable()?) I think that isawaitable would be really useful. Especially to check whether an object implemented with the C API has a tp_await function. isawaitablefunction() looks a bit confusing to me: def foo(): return fut is awaitable, but there is no way to detect that. def foo(arg): if arg == 'spam': return fut is awaitable sometimes. > > >> Differences between coroutines and generators >> --------------------------------------------- >> >> A great effort has been made to make sure that coroutines and >> generators are separate concepts: >> >> 1. Coroutine objects do not implement ``__iter__`` and ``__next__`` >> methods. Therefore they cannot be iterated over or passed to >> ``iter()``, ``list()``, ``tuple()`` and other built-ins. They >> also cannot be used in a ``for..in`` loop. >> >> 2. ``yield from`` does not accept coroutine objects (unless it is used >> in a generator-based coroutine decorated with ``types.coroutine``.) >> > How does ``yield from`` know that it is occurring in a generator-based > coroutine? I check that in 'ceval.c' in the implementation of the YIELD_FROM opcode. If the current code object doesn't have a CO_COROUTINE flag and the opcode arg is a generator-object with CO_COROUTINE -- we raise an error. > >> 3. ``yield from`` does not accept coroutine objects from plain Python >> generators (*not* generator-based coroutines.) >> > I am worried about this. PEP 380 gives clear semantics to "yield from > " and I don't think you can revert that here. Or maybe I am > misunderstanding what you meant here? (What exactly are "coroutine objects > from plain Python generators"?) # *Not* decorated with @coroutine def some_algorithm_impl(): yield 1 yield from native_coroutine() # <- this is a bug "some_algorithm_impl" is a regular generator. By mistake someone could try to use "yield from" on a native coroutine (which in 99.9% of cases is a bug).
So we can rephrase it to: ``yield from`` does not accept *native coroutine objects* from regular Python generators. I also agree that raising an exception in this case in 3.5 might break too much existing code. I'll try warnings, and if it doesn't work we might want to just let this restriction slip. > >> 4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()`` >> return ``False`` for coroutine objects and coroutine functions. >> >> >> Coroutine objects >> ----------------- >> >> Coroutines are based on generators internally, thus they share the >> implementation. Similarly to generator objects, coroutine objects have >> ``throw``, ``send`` and ``close`` methods. ``StopIteration`` and >> ``GeneratorExit`` play the same role for coroutine objects (although >> PEP 479 is enabled by default for coroutines). >> > Does send() make sense for a native coroutine? Check PEP 380. I think the > only way to access the send() argument is by using ``yield`` but that's > disallowed. Or is this about send() being passed to the ``yield`` that > ultimately suspends the chain of coroutines? (You may just have to rewrite > the section about that -- it seems a bit hidden now.) Yes, 'send()' is needed to push values to the 'yield' statement somewhere (future) down the chain of coroutines (suspension point). This has to be articulated in a clear way; I'll think about how to rewrite this section without replicating PEP 380 and the Python documentation on generators. (A toy illustration follows below.) > >> >> Glossary >> ======== >> >> :Coroutine: >> A coroutine function, or just "coroutine", is declared with ``async >> def``. It uses ``await`` and ``return value``; see `New Coroutine >> Declaration Syntax`_ for details. >> >> :Coroutine object: >> Returned from a coroutine function. See `Await Expression`_ for >> details. >> >> :Future-like object: >> An object with an ``__await__`` method, or a C object with >> ``tp_await`` function, returning an iterator. Can be consumed by >> an ``await`` expression in a coroutine. A coroutine waiting for a >> Future-like object is suspended until the Future-like object's >> ``__await__`` completes, and returns the result. See `Await >> Expression`_ for details. >> >> :Awaitable: >> A *Future-like* object or a *coroutine object*. See `Await >> Expression`_ for details. >> >> :Generator-based coroutine: >> Coroutines based on generator syntax. The most common example is >> ``@asyncio.coroutine``. >> >> :Asynchronous context manager: >> An asynchronous context manager has ``__aenter__`` and ``__aexit__`` >> methods and can be used with ``async with``. See `Asynchronous >> Context Managers and "async with"`_ for details. >> >> :Asynchronous iterable: >> An object with an ``__aiter__`` method, which must return an >> *asynchronous iterator* object. Can be used with ``async for``. >> See `Asynchronous Iterators and "async for"`_ for details. >> >> :Asynchronous iterator: >> An asynchronous iterator has an ``__anext__`` method. See >> `Asynchronous Iterators and "async for"`_ for details. >> >> >> List of functions and methods >> ============================= >> > (I'm not sure of the utility of this section.) It's a little bit hard to understand that "awaitable" is a general term that includes native coroutine objects, so it's OK to write both: def __aenter__(): return fut async def __aenter__(): ... We (Victor and I) decided that it might be useful to have an additional section that explains it.
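Here is the toy illustration of the send() mechanics mentioned above (made-up names, not asyncio code): the bottom-most ``yield`` inside a Future-like object is the actual suspension point, and the value pushed with ``send()`` becomes the result of the ``await`` expression:

    class Answer:
        def __await__(self):
            # bottom-most yield: the suspension point; the value
            # passed to send() becomes the result of 'await Answer()'
            return (yield 'suspended')

    async def outer():
        return await Answer()

    coro = outer()
    print(coro.send(None))    # starts the coroutine; prints 'suspended'
    try:
        coro.send(42)         # resumes the yield inside __await__
    except StopIteration as exc:
        print(exc.value)      # 42 -- the result of 'await Answer()'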
> >> ================= =================================== ================= >> Method Can contain Can't contain >> ================= =================================== ================= >> async def func await, return value yield, yield from >> async def __a*__ await, return value yield, yield from >> > (This line seems redundant.) > >> def __a*__ return awaitable await >> > def __await__ yield, yield from, return iterable await >> generator yield, yield from, return value await >> ================= =================================== ================= >> >> Where: >> >> * "async def func": coroutine; >> >> * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, >> ``__aexit__`` defined with the ``async`` keyword; >> >> * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, >> ``__aexit__`` defined without the ``async`` keyword, must return an >> *awaitable*; >> >> * "def __await__": ``__await__`` method to implement *Future-like* >> objects; >> >> * generator: a "regular" generator, function defined with ``def`` and >> which contains at least one ``yield`` or ``yield from`` expression. >> >> >> Transition Plan >> =============== >> >> This may need to be pulled forward or at least mentioned earlier (in the > Abstract or near the top of the Specification). > > >> To avoid backwards compatibility issues with the ``async`` and ``await`` >> keywords, it was decided to modify ``tokenizer.c`` in such a way that >> it: >> >> * recognizes the ``async def`` name token combination (start of a >> coroutine); >> >> * keeps track of regular functions and coroutines; >> >> * replaces the ``'async'`` token with ``ASYNC`` and the ``'await'`` token with >> ``AWAIT`` when in the process of yielding tokens for coroutines. >> >> This approach allows for seamless combination of new syntax features >> (all of them available only in ``async`` functions) with any existing >> code. >> >> An example of having "async def" and an "async" attribute in one piece of >> code:: >> >> class Spam: >> async = 42 >> >> async def ham(): >> print(getattr(Spam, 'async')) >> >> # The coroutine can be executed and will print '42' >> >> >> Backwards Compatibility >> ----------------------- >> >> This proposal preserves 100% backwards compatibility. >> > Is this still true with the proposed restrictions on what ``yield from`` > accepts? (Hopefully I'm the one who is confused. :-) True for the code that uses @coroutine decorators properly. I'll see what I can do with warnings, but I'll update the section anyways. > >> >> Grammar Updates >> --------------- >> >> Grammar changes are also fairly minimal:: >> >> decorated: decorators (classdef | funcdef | async_funcdef) >> async_funcdef: ASYNC funcdef >> >> compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt >> | funcdef | classdef | decorated | async_stmt) >> >> async_stmt: ASYNC (funcdef | with_stmt | for_stmt) >> >> power: atom_expr ['**' factor] >> atom_expr: [AWAIT] atom trailer* >> >> >> Transition Period Shortcomings >> ------------------------------ >> >> There is just one. Are you OK with this thing?
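To make the tokenizer-based transition plan above concrete, here is a sketch of hypothetical user code that stays valid: outside of ``async def`` both words remain ordinary ``NAME`` tokens, and only inside a coroutine are they turned into ``ASYNC``/``AWAIT`` tokens:

    await = 'spam'          # a plain variable outside a coroutine
    def async(x):           # also just a regular name
        return x

    async def coro():
        return await fut    # keywords only inside 'async def'
                            # ('fut' stands for some awaitable)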
>> >> Until ``async`` and ``await`` become proper keywords, it is not >> possible (or at least very hard) to fix ``tokenizer.c`` to recognize >> them on the **same line** with the ``def`` keyword:: >> >> # async and await will always be parsed as variables >> >> async def outer(): # 1 >> def nested(a=(await fut)): >> pass >> >> async def foo(): return (await fut) # 2 >> >> Since ``await`` and ``async`` in such cases are parsed as ``NAME`` >> tokens, a ``SyntaxError`` will be raised. >> >> To work around these issues, the above examples can be easily rewritten >> to a more readable form:: >> >> async def outer(): # 1 >> a_default = await fut >> def nested(a=a_default): >> pass >> >> async def foo(): # 2 >> return (await fut) >> >> This limitation will go away as soon as ``async`` and ``await`` are >> proper keywords. Or if it's decided to use a future import for this >> PEP. >> >> >> Deprecation Plans >> ----------------- >> >> ``async`` and ``await`` names will be softly deprecated in CPython 3.5 >> and 3.6. In 3.7 we will transform them to proper keywords. Making >> ``async`` and ``await`` proper keywords before 3.7 might make it harder >> for people to port their code to Python 3. >> >> >> asyncio >> ------- >> >> The ``asyncio`` module was adapted and tested to work with coroutines and >> the new statements. Backwards compatibility is 100% preserved. >> >> The required changes are mainly: >> >> 1. Modify the ``@asyncio.coroutine`` decorator to use the new >> ``types.coroutine()`` function. >> >> 2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future`` class. >> >> 3. Add ``ensure_task()`` as an alias for the ``async()`` function. >> Deprecate the ``async()`` function. >> >> >> Design Considerations >> ===================== >> >> PEP 3152 >> -------- >> >> PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines >> (called "cofunctions"). Some key points: >> >> 1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is >> always a generator, even if there are no ``cocall`` expressions >> inside it. Maps to ``async def`` in this proposal. >> >> 2. A new keyword ``cocall`` to call a *cofunction*. Can only be used >> inside a *cofunction*. Maps to ``await`` in this proposal (with >> some differences, see below.) >> >> 3. It is not possible to call a *cofunction* without a ``cocall`` >> keyword. >> >> 4. ``cocall`` grammatically requires parentheses after it:: >> >> atom: cocall | >> cocall: 'cocall' atom cotrailer* '(' [arglist] ')' >> cotrailer: '[' subscriptlist ']' | '.' NAME >> >> 5. ``cocall f(*args, **kwds)`` is semantically equivalent to >> ``yield from f.__cocall__(*args, **kwds)``. >> >> Differences from this proposal: >> >> 1. There is no equivalent of ``__cocall__`` in this PEP, which is >> called and its result is passed to ``yield from`` in the ``cocall`` >> expression. The ``await`` keyword expects an *awaitable* object, >> validates the type, and executes ``yield from`` on it. The >> ``__await__`` method is similar to ``__cocall__``, but it is only used >> to define *Future-like* objects. >> >> 2. ``await`` is defined in almost the same way as ``yield from`` in the >> grammar (it is later enforced that ``await`` can only be inside >> ``async def``). It is possible to simply write ``await future``, >> whereas ``cocall`` always requires parentheses. >> >> 3. To make asyncio work with PEP 3152 it would be required to modify >> the ``@asyncio.coroutine`` decorator to wrap all functions in an object >> with a ``__cocall__`` method, or to implement ``__cocall__`` on >> generators.
To call *cofunctions* from existing generator-based >> coroutines it would be required to use the ``costart(cofunc, *args, >> **kwargs)`` built-in. >> >> 4. Since it is impossible to call a *cofunction* without a ``cocall`` >> keyword, it automatically prevents the common mistake of forgetting >> to use ``yield from`` on generator-based coroutines. This proposal >> addresses this problem with a different approach, see `Debugging >> Features`_. >> >> 5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine >> is that if it is decided to implement coroutine-generators -- >> coroutines with ``yield`` or ``async yield`` expressions -- we >> wouldn't need a ``cocall`` keyword to call them. So we'll end up >> having ``__cocall__`` and no ``__call__`` for regular coroutines, >> and having ``__call__`` and no ``__cocall__`` for coroutine- >> generators. >> >> 6. Requiring parentheses grammatically also introduces a whole lot >> of new problems. >> >> The following code:: >> >> await fut >> await function_returning_future() >> await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2)) >> >> would look like:: >> >> cocall fut() # or cocall costart(fut) >> cocall (function_returning_future())() >> cocall asyncio.gather(costart(coro1, arg1, arg2), >> costart(coro2, arg1, arg2)) >> >> 7. There are no equivalents of ``async for`` and ``async with`` in PEP >> 3152. >> >> >> Coroutine-generators >> -------------------- >> >> With the ``async for`` keyword it is desirable to have a concept of a >> *coroutine-generator* -- a coroutine with ``yield`` and ``yield from`` >> expressions. To avoid any ambiguity with regular generators, we would >> likely require an ``async`` keyword before ``yield``, and >> ``async yield from`` would raise a ``StopAsyncIteration`` exception. >> >> While it is possible to implement coroutine-generators, we believe that >> they are out of the scope of this proposal. It is an advanced concept that >> should be carefully considered and balanced, with non-trivial changes >> in the implementation of current generator objects. This is a matter >> for a separate PEP. >> >> >> No implicit wrapping in Futures >> ------------------------------- >> >> There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A >> key difference is that JavaScript "async functions" always return a >> Promise. While this approach has some advantages, it also implies that >> a new Promise object is created on each "async function" invocation. >> >> We could implement similar functionality in Python, by wrapping all >> coroutines in a Future object, but this has the following >> disadvantages: >> >> 1. Performance. A new Future object would be instantiated on each >> coroutine call. Moreover, this makes the implementation of ``await`` >> expressions slower (disabling optimizations of ``yield from``). >> >> 2. A new built-in ``Future`` object would need to be added. >> >> 3. Coming up with a generic ``Future`` interface that is usable for any >> use case in any framework is a very hard problem to solve. >> >> 4. It is not a feature that is used frequently, when most of the code >> is coroutines.
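A small sketch of the explicit alternative in asyncio terms (using the ``async()`` function, which this PEP proposes to alias as ``ensure_task()``): a plain coroutine call allocates no Future, and a Future (a Task) is created only when explicitly requested:

    import asyncio

    async def fetch():
        return 42

    async def main():
        r1 = await fetch()               # no Future object is created here
        task = asyncio.async(fetch())    # explicit wrapping into a Task
        r2 = await task
        return r1 + r2

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(main()))   # 84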
>> >> >> Why "async" and "await" keywords >> -------------------------------- >> >> async/await is not a new concept in programming languages: >> >> * C# has it since long time ago [5]_; >> >> * proposal to add async/await in ECMAScript 7 [2]_; >> see also Traceur project [9]_; >> >> * Facebook's Hack/HHVM [6]_; >> >> * Google's Dart language [7]_; >> >> * Scala [8]_; >> >> * proposal to add async/await to C++ [10]_; >> >> * and many other less popular languages. >> >> This is a huge benefit, as some users already have experience with >> async/await, and because it makes working with many languages in one >> project easier (Python with ECMAScript 7 for instance). >> >> >> Why "__aiter__" is a coroutine >> ------------------------------ >> >> In principle, ``__aiter__`` could be a regular function. There are >> several good reasons to make it a coroutine: >> >> * as most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` >> methods are coroutines, users would often make a mistake defining it >> as ``async`` anyways; >> >> * there might be a need to run some asynchronous operations in >> ``__aiter__``, for instance to prepare DB queries or do some file >> operation. >> >> >> Importance of "async" keyword >> ----------------------------- >> >> While it is possible to just implement ``await`` expression and treat >> all functions with at least one ``await`` as coroutines, this approach >> makes APIs design, code refactoring and its long time support harder. >> >> Let's pretend that Python only has ``await`` keyword:: >> >> def useful(): >> ... >> await log(...) >> ... >> >> def important(): >> await useful() >> >> If ``useful()`` function is refactored and someone removes all >> ``await`` expressions from it, it would become a regular python >> function, and all code that depends on it, including ``important()`` >> would be broken. To mitigate this issue a decorator similar to >> ``@asyncio.coroutine`` has to be introduced. >> >> >> Why "async def" >> --------------- >> >> For some people bare ``async name(): pass`` syntax might look more >> appealing than ``async def name(): pass``. It is certainly easier to >> type. But on the other hand, it breaks the symmetry between ``async >> def``, ``async with`` and ``async for``, where ``async`` is a modifier, >> stating that the statement is asynchronous. It is also more consistent >> with the existing grammar. >> >> >> Why "async for/with" instead of "await for/with" >> ------------------------------------------------ >> >> ``async`` is an adjective, and hence it is a better choice for a >> *statement qualifier* keyword. ``await for/with`` would imply that >> something is awaiting for a completion of a ``for`` or ``with`` >> statement. >> >> >> Why "async def" and not "def async" >> ----------------------------------- >> >> ``async`` keyword is a *statement qualifier*. A good analogy to it are >> "static", "public", "unsafe" keywords from other languages. "async >> for" is an asynchronous "for" statement, "async with" is an >> asynchronous "with" statement, "async def" is an asynchronous function. >> >> Having "async" after the main statement keyword might introduce some >> confusion, like "for async item in iterator" can be read as "for each >> asynchronous item in iterator". >> >> Having ``async`` keyword before ``def``, ``with`` and ``for`` also >> makes the language grammar simpler. And "async def" better separates >> coroutines from regular functions visually. 
>> >> >> Why not a __future__ import >> --------------------------- >> >> ``__future__`` imports are inconvenient and easy to forget to add. >> Also, they are enabled for the whole source file. Consider that there >> is a big project with a popular module named "async.py". With future >> imports it is required to either import it using ``__import__()`` or >> ``importlib.import_module()`` calls, or to rename the module. The >> proposed approach makes it possible to continue using old code and >> modules without a hassle, while coming up with a migration plan for >> future Python versions. >> >> >> Why magic methods start with "a" >> -------------------------------- >> >> New asynchronous magic methods ``__aiter__``, ``__anext__``, >> ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". >> An alternative proposal is to use an "async" prefix, so that ``__aiter__`` >> becomes ``__async_iter__``. However, to align new magic methods with >> the existing ones, such as ``__radd__`` and ``__iadd__``, it was decided >> to use a shorter version. >> >> >> Why not reuse existing magic names >> ---------------------------------- >> >> An alternative idea about new asynchronous iterators and context >> managers was to reuse existing magic methods, by adding an ``async`` >> keyword to their declarations:: >> >> class CM: >> async def __enter__(self): # instead of __aenter__ >> ... >> >> This approach has the following downsides: >> >> * it would not be possible to create an object that works in both >> ``with`` and ``async with`` statements; >> >> * it would break backwards compatibility, as nothing prohibits >> returning Future-like objects from ``__enter__`` and/or >> ``__exit__`` in Python <= 3.4; >> >> * one of the main points of this proposal is to make coroutines as >> simple and foolproof as possible, hence the clear separation of the >> protocols. >> >> >> Why not reuse existing "for" and "with" statements >> -------------------------------------------------- >> >> The vision behind existing generator-based coroutines and this proposal >> is to make it easy for users to see where the code might be suspended. >> Making the existing "for" and "with" statements recognize asynchronous >> iterators and context managers will inevitably create implicit suspend >> points, making it harder to reason about the code. >> >> >> Comprehensions >> -------------- >> >> For the sake of restricting the broadness of this PEP there is no new >> syntax for asynchronous comprehensions. This should be considered in a >> separate PEP, if there is a strong demand for this feature. >> >> >> Async lambdas >> ------------- >> >> Lambda coroutines are not part of this proposal. In this proposal they >> would look like ``async lambda(parameters): expression``. Unless there >> is a strong demand to have them as part of this proposal, it is >> recommended to consider them later in a separate PEP. >> >> >> Performance >> =========== >> >> Overall Impact >> -------------- >> >> This proposal introduces no observable performance impact.
Here is an >> output of Python's official set of benchmarks [4]_: >> >> :: >> >> python perf.py -r -b default ../cpython/python.exe >> ../cpython-aw/python.exe >> >> [skipped] >> >> Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: >> Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 >> x86_64 i386 >> >> Total CPU cores: 8 >> >> ### etree_iterparse ### >> Min: 0.365359 -> 0.349168: 1.05x faster >> Avg: 0.396924 -> 0.379735: 1.05x faster >> Significant (t=9.71) >> Stddev: 0.01225 -> 0.01277: 1.0423x larger >> >> The following not significant results are hidden, use -v to show them: >> django_v2, 2to3, etree_generate, etree_parse, etree_process, >> fastpickle, >> fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http. >> >> >> Tokenizer modifications >> ----------------------- >> >> There is no observable slowdown of parsing Python files with the >> modified tokenizer: parsing of one 12Mb file >> (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount >> of time. >> >> >> async/await >> ----------- >> >> The following micro-benchmark was used to determine the performance >> difference between "async" functions and generators:: >> >> import sys >> import time >> >> def binary(n): >> if n <= 0: >> return 1 >> l = yield from binary(n - 1) >> r = yield from binary(n - 1) >> return l + 1 + r >> >> async def abinary(n): >> if n <= 0: >> return 1 >> l = await abinary(n - 1) >> r = await abinary(n - 1) >> return l + 1 + r >> >> def timeit(gen, depth, repeat): >> t0 = time.time() >> for _ in range(repeat): >> list(gen(depth)) >> t1 = time.time() >> print('{}({}) * {}: total {:.3f}s'.format( >> gen.__name__, depth, repeat, t1-t0)) >> >> The result is that there is no observable performance difference. >> Minimum timing of 3 runs >> >> :: >> >> abinary(19) * 30: total 12.985s >> binary(19) * 30: total 12.953s >> >> Note that a depth of 19 means 1,048,575 calls. >> >> >> Reference Implementation >> ======================== >> >> The reference implementation can be found here: [3]_. >> >> List of high-level changes and new protocols >> -------------------------------------------- >> >> 1. New syntax for defining coroutines: ``async def`` and new ``await`` >> keyword. >> >> 2. New ``__await__`` method for Future-like objects, and new >> ``tp_await`` slot in ``PyTypeObject``. >> >> 3. New syntax for asynchronous context managers: ``async with``. And >> associated protocol with ``__aenter__`` and ``__aexit__`` methods. >> >> 4. New syntax for asynchronous iteration: ``async for``. And >> associated protocol with ``__aiter__``, ``__anext__`` and new built- >> in exception ``StopAsyncIteration``. >> >> 5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, >> ``Await``. >> >> 6. New functions: ``sys.set_coroutine_wrapper(callback)``, >> ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``, >> ``inspect.iscoroutinefunction()``, and ``inspect.iscoroutine()``. >> >> 7. New ``CO_COROUTINE`` bit flag for code objects. >> >> While the list of changes and new things is not short, it is important >> to understand that most users will not use these features directly. >> It is intended to be used in frameworks and libraries to provide users >> with convenient-to-use and unambiguous APIs with ``async def``, >> ``await``, ``async for`` and ``async with`` syntax. >> >> >> Working example >> --------------- >> >> All concepts proposed in this PEP are implemented [3]_ and can be >> tested.
>> >> :: >> >> import asyncio >> >> async def echo_server(): >> print('Serving on localhost:8000') >> await asyncio.start_server(handle_connection, >> 'localhost', 8000) >> >> async def handle_connection(reader, writer): >> print('New connection...') >> >> while True: >> data = await reader.read(8192) >> >> if not data: >> break >> >> print('Sending {:.10}... back'.format(repr(data))) >> writer.write(data) >> >> loop = asyncio.get_event_loop() >> loop.run_until_complete(echo_server()) >> try: >> loop.run_forever() >> finally: >> loop.close() >> >> >> References >> ========== >> >> .. [1] >> https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine >> >> .. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions >> >> .. [3] https://github.com/1st1/cpython/tree/await >> >> .. [4] https://hg.python.org/benchmarks >> >> .. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx >> >> .. [6] http://docs.hhvm.com/manual/en/hack.async.php >> >> .. [7] https://www.dartlang.org/articles/await-async/ >> >> .. [8] http://docs.scala-lang.org/sips/pending/async.html >> >> .. [9] >> https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental >> >> .. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf >> (PDF) >> >> >> Acknowledgments >> =============== >> >> I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew >> Svetlov, and Łukasz Langa for their initial feedback. >> >> >> Copyright >> ========= >> >> This document has been placed in the public domain. >> >> .. >> Local Variables: >> mode: indented-text >> indent-tabs-mode: nil >> sentence-end-double-space: t >> fill-column: 70 >> coding: utf-8 >> End: >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > From ethan at stoneleaf.us Wed Apr 29 01:55:31 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 28 Apr 2015 16:55:31 -0700 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <55401741.9040703@gmail.com> References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> Message-ID: <20150428235531.GP32422@stoneleaf.us> On 04/28, Yury Selivanov wrote: >>> This limitation will go away as soon as ``async`` and ``await`` are >>> proper keywords. Or if it's decided to use a future import for this >>> PEP. `async` and `await` need to be proper keywords, and __future__ imports is how we do that (see, e.g., PEP 355 and PEP 343) -- ~Ethan~ From guido at python.org Wed Apr 29 02:04:36 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 28 Apr 2015 17:04:36 -0700 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <20150428235531.GP32422@stoneleaf.us> References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <20150428235531.GP32422@stoneleaf.us> Message-ID: On Tue, Apr 28, 2015 at 4:55 PM, Ethan Furman wrote: > On 04/28, Yury Selivanov wrote: > > >>> This limitation will go away as soon as ``async`` and ``await`` are > >>> proper keywords. Or if it's decided to use a future import for this > >>> PEP. > > `async` and `await` need to be proper keywords, and __future__ imports > is how we do that (see, e.g., PEP 355 and PEP 343) > You could at least provide an explanation about how the current proposal falls short. What code will break? There's a cost to __future__ imports too.
The current proposal is a pretty clever hack -- and we've done similar hacks in the past (last I remember when "import ... as ..." was introduced but we didn't want to make 'as' a keyword right away). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Apr 29 03:51:51 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 28 Apr 2015 21:51:51 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <553E3D20.2020008@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> Message-ID: <55403937.7040409@gmail.com> Looking at the grammar -- the only downside of the current approach is that you can't do 'await await fut'. I still think that it reads better with parens. If we put 'await' into the 'factor' terminal we would allow await -fut # await (-fut) I think something like power: atom_expr ['**' factor] atom_expr: [AWAIT] atom_expr | atom_trailer atom_trailer: atom trailer* will fix the 'await await' situation, but does it really need to be fixed? Yury On 2015-04-27 9:44 AM, Yury Selivanov wrote: > Hi Greg, > > I don't want this: "await a() * b()" to be parsed, it's not meaningful.
>> >> Likely you'll see "await await a()" only once in your life, so I'm fine >> to use parens for it (moreover, I think it reads better with parens) >> >> Yury >> >> On 2015-04-27 8:52 AM, Greg Ewing wrote: >>> Yury Selivanov wrote: >>>> I've done some experiments with grammar, and it looks like >>>> we indeed can parse await quite differently from yield. Three >>>> different options: >>> >>> You don't seem to have tried what I suggested, which is >>> to make 'await' a unary operator with the same precedence >>> as '-', i.e. replace >>> >>> factor: ('+'|'-'|'~') factor | power >>> >>> with >>> >>> factor: ('+'|'-'|'~'|'await') factor | power >>> >>> That would allow >>> >>> await a() >>> res = await a() + await b() >>> res = await await a() >>> if await a(): pass >>> return await a() >>> print(await a()) >>> func(arg=await a()) >>> await a() * b() >>> >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephen at xemacs.org Wed Apr 29 05:03:06 2015 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Wed, 29 Apr 2015 12:03:06 +0900 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <553EF985.2070808@gmail.com> References: <553EF985.2070808@gmail.com> Message-ID: <87oam7prf9.fsf@uwakimon.sk.tsukuba.ac.jp> Literary critic here. In section "Specification" > It is strongly suggested that the reader understands how coroutines > are implemented in Python (PEP 342 and PEP 380). It is also > recommended to read PEP 3156 (asyncio framework) and PEP 3152 > (Cofunctions). The usual phrasing of "strongly suggested" in specifications is "presumes knowledge". Some people think "strongly suggesting" is presumptuous and condescending, YMMV. Also, the relationship to PEP 3152 should be mentioned IMO. I propose: This specification presumes knowledge of the implementation of coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax changes proposed here comes from the asyncio framework (PEP 3156) and the "Cofunctions" proposal (PEP 3152, now rejected in favor of this specification). I'm not entirely happy with my phrasing, because there are at least four more or less different concepts that might claim the bare word "coroutine": - this specification - the implementation of this specification - the syntax used to define coroutines via PEPs 342 and 380 - the semantics of PEP 342/380 coroutines In both your original and my rephrasing, the use of "coroutine" violates your convention that it refers to the PEP's proposed syntax for coroutines. Instead it refers to the semantics of coroutines implemented via PEP 342/380. This is probably the same concern that motivated Guido's suggestion to use "native coroutines" for the PEP 492 syntax (but I'm not Dutch, so maybe they're not the same :-). I feel this is a real hindrance to understanding for someone coming to the PEP for the first time. You know which meaning of coroutine you mean, but the new reader needs to think hard enough to disambiguate every time the word occurs. If people agree with me, I could go through the PEP and revise mentions of "coroutine" in "disambiguated" style. In section "Comprehensions": > For the sake of restricting the broadness of this PEP there is no > new syntax for asynchronous comprehensions.
This should be > considered in a separate PEP, if there is a strong demand for this > feature. Don't invite trouble. How about: Syntax for asynchronous comprehensions could be provided, but this construct is outside of the scope of this PEP. In section "Async lambdas" > Lambda coroutines are not part of this proposal. In this proposal they > would look like ``async lambda(parameters): expression``. Unless there > is a strong demand to have them as part of this proposal, it is > recommended to consider them later in a separate PEP. Same recommendation as for "Comprehensions". I wouldn't mention the tentative syntax, it is both obvious and inviting to trouble. > Acknowledgments > =============== > > I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew > Svetlov, and Łukasz Langa for their initial feedback. A partial list of commentators I've found to be notable, YMMV: Greg Ewing for PEP 3152 and his Loyal Opposition to this PEP. Mark Shannon's comments have led to substantial clarifications of motivation for syntax, at least in my mind. Paul Sokolovsky for information about the MicroPython implementation. From yselivanov.ml at gmail.com Wed Apr 29 05:23:46 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 28 Apr 2015 23:23:46 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <87oam7prf9.fsf@uwakimon.sk.tsukuba.ac.jp> References: <553EF985.2070808@gmail.com> <87oam7prf9.fsf@uwakimon.sk.tsukuba.ac.jp> Message-ID: <55404EC2.6040004@gmail.com> Hi Stephen, Thanks a lot for the feedback and suggestions. I'll apply them to the PEP. On 2015-04-28 11:03 PM, Stephen J. Turnbull wrote: > Literary critic here. > > In section "Specification" > > > It is strongly suggested that the reader understands how coroutines > > are implemented in Python (PEP 342 and PEP 380). It is also > > recommended to read PEP 3156 (asyncio framework) and PEP 3152 > > (Cofunctions). > > The usual phrasing of "strongly suggested" in specifications is > "presumes knowledge". Some people think "strongly suggesting" is > presumptuous and condescending, YMMV. Also, the relationship to PEP > 3152 should be mentioned IMO. I propose: > > This specification presumes knowledge of the implementation of > coroutines in Python (PEP 342 and PEP 380). Motivation for the > syntax changes proposed here comes from the asyncio framework (PEP > 3156) and the "Cofunctions" proposal (PEP 3152, now rejected in > favor of this specification). Your wording is 100% better and it's time to mention PEP 3152 too. > > I'm not entirely happy with my phrasing, because there are at least > four more or less different concepts that might claim the bare word > "coroutine": > > - this specification > - the implementation of this specification > - the syntax used to define coroutines via PEPs 342 and 380 > - the semantics of PEP 342/380 coroutines > > In both your original and my rephrasing, the use of "coroutine" > violates your convention that it refers to the PEP's proposed syntax > for coroutines. Instead it refers to the semantics of coroutines > implemented via PEP 342/380. This is probably the same concern that > motivated Guido's suggestion to use "native coroutines" for the PEP > 492 syntax (but I'm not Dutch, so maybe they're not the same :-). > > I feel this is a real hindrance to understanding for someone coming to > the PEP for the first time. You know which meaning of coroutine you > mean, but the new reader needs to think hard enough to disambiguate > every time the word occurs.
If people agree with me, I could go > through the PEP and revise mentions of "coroutine" in "disambiguated" > style. I also like Guido's suggestion to use the "native coroutine" term. I'll update the PEP (I have several branches of it in the repo that I need to merge before the rename). > > In section "Comprehensions": > > > For the sake of restricting the broadness of this PEP there is no > > new syntax for asynchronous comprehensions. This should be > > considered in a separate PEP, if there is a strong demand for this > > feature. > > Don't invite trouble. How about: > > Syntax for asynchronous comprehensions could be provided, but this > construct is outside of the scope of this PEP. > > In section "Async lambdas" > > > Lambda coroutines are not part of this proposal. In this proposal they > > would look like ``async lambda(parameters): expression``. Unless there > > is a strong demand to have them as part of this proposal, it is > > recommended to consider them later in a separate PEP. > > Same recommendation as for "Comprehensions". I wouldn't mention the > tentative syntax, it is both obvious and inviting to trouble. Agree. Do you think it'd be better to combine comprehensions and async lambdas in one section? > > > > Acknowledgments > > =============== > > > > I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew > > Svetlov, and Łukasz Langa for their initial feedback. > > A partial list of commentators I've found to be notable, YMMV: Sure! I was going to add everybody after the PEP is accepted/rejected/postponed. > > Greg Ewing for PEP 3152 and his Loyal Opposition to this PEP. > Mark Shannon's comments have led to substantial clarifications of > motivation for syntax, at least in my mind. > Paul Sokolovsky for information about the MicroPython implementation. > Thanks! Yury From greg.ewing at canterbury.ac.nz Wed Apr 29 05:59:15 2015 From: greg.ewing at canterbury.ac.nz (Greg) Date: Wed, 29 Apr 2015 15:59:15 +1200 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: References: <553EF985.2070808@gmail.com> Message-ID: <55405713.3050001@canterbury.ac.nz> On 29/04/2015 9:49 a.m., Guido van Rossum wrote: > c) 'yield from' only accept coroutine objects from > generators decorated with 'types.coroutine'.
That means > that existing asyncio generator-based coroutines will > happily yield from both coroutines and generators. > *But* every generator-based coroutine *must* be > decorated with `asyncio.coroutine()`. This is > potentially a backwards incompatible change. > > See below. I worry about backward compatibility. A lot. Are you saying > that asyncio-based code that doesn't use @coroutine will break in 3.5? That seems unavoidable if the goal is for 'await' to only work on generators that are intended to implement coroutines, and not on generators that are intended to implement iterators. Because there's no way to tell them apart without marking them in some way. -- Greg From yselivanov.ml at gmail.com Wed Apr 29 06:10:45 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 00:10:45 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <55405713.3050001@canterbury.ac.nz> References: <553EF985.2070808@gmail.com> <55405713.3050001@canterbury.ac.nz> Message-ID: <554059C5.9040303@gmail.com> On 2015-04-28 11:59 PM, Greg wrote: > On 29/04/2015 9:49 a.m., Guido van Rossum wrote: >> c) 'yield from' only accept coroutine objects from >> generators decorated with 'types.coroutine'.
> > I had to remove this check: > https://github.com/1st1/cpython/commit/4be6d0a77688b63b917ad88f09d446ac3b7e2ce9#diff-c3cf251f16d5a03a9e7d4639f2d6f998L4906 > > > On the other hand I think that it's a slightly better > solution than adding a new slot. > > >>> 2. New grammar for "await" expressions, see >>> 'Syntax of "await" expression' section >>> >>> I like it. > > Great! > > The current grammar requires parentheses for consequent > await expressions: > > await (await coro()) > > I can change this (in theory), but I kind of like the parens > in this case -- better readability. And it'll be a very > rare case. > > >>> 3. inspect.iscoroutine() and inspect.iscoroutineobjects() >>> functions. >>> >>> What's the use case for these? I wonder if it makes more sense to >>> have a >> check for a generalized awaitable rather than specifically a coroutine. > > It's important to at least have 'iscoroutine' -- to check that > the object is a coroutine function. A typical use-case would be > a web framework that lets you to bind coroutines to specific > http methods/paths: > > @http.get('/spam') > async def handle_spam(request): > ... > > 'http.get' decorator will need a way to raise an error if it's > applied to a regular function (while the code is being imported, > not in runtime). > > > The idea here is to cover all kinds of python objects in > inspect module, it's Python's reflection API. > > The other thing is that it's easy to implement this function > for CPython: just check for CO_COROUTINE flag. For other > Python implementations it might be a different story. > > > (More arguments for isawaitable() below) > > >> >>> 4. Full separation of coroutines and generators. >>> This is a big one; let's discuss. >>> >>> a) Coroutine objects raise TypeError (is NotImplementedError >>> better?) in their __iter__ and __next__. Therefore it's >>> not not possible to pass them to iter(), tuple(), next() and >>> other similar functions that work with iterables. >>> >>> I think it should be TypeError -- what you *really* want is not to >>> define >> these methods at all but given the implementation tactic for coroutines >> that may not be possible, so the nearest approximation is TypeError. >> (Also, >> NotImplementedError is typically to indicate that a subclass should >> implement it.) >> > > Agree. > >>> b) Because of (a), for..in iteration also does not work >>> on coroutines anymore. >>> >>> Sounds good. >> >>> c) 'yield from' only accept coroutine objects from >>> generators decorated with 'types.coroutine'. That means >>> that existing asyncio generator-based coroutines will >>> happily yield from both coroutines and generators. >>> *But* every generator-based coroutine *must* be >>> decorated with `asyncio.coroutine()`. This is >>> potentially a backwards incompatible change. >>> >>> See below. I worry about backward compatibility. A lot. Are you saying >> that asycio-based code that doesn't use @coroutine will break in 3.5? > > I'll experiment with replacing (c) with a warning. > > We can disable __iter__ and __next__ for coroutines, but > allow to use 'yield from' on them. Would it be a better > approach? > >> >> >>> d) inspect.isgenerator() and inspect.isgeneratorfunction() >>> return `False` for coroutine objects & coroutine functions. >>> >>> Makes sense. > > (d) can also break something (hypothetically). 
I'm not > sure why would someone use isgenerator() and > isgeneratorfunction() on generator-based coroutines in > code based on asyncio, but there is a chance that > someone did (it should be trivial to fix the code). > > Same for iter() and next(). The chance is slim, but we > may break some obscure code. > > Are you OK with this? > > >> >>> e) Should we add a coroutine ABC (for cython etc)? >>> >>> I, personally, think this is highly necessary. First, >>> separation of coroutines from generators is extremely >>> important. One day there won't be generator-based >>> coroutines, and we want to avoid any kind of confusion. >>> Second, we only can do this in 3.5. This kind of >>> semantics change won't be ever possible. >>> >> Sounds like Stefan agrees. Are you aware of >> http://bugs.python.org/issue24018 (Generator ABC)? > > Yes, I saw the issue. I'll review it in more detail > before thinking about Coroutine ABC for the next PEP > update. > >> >> >>> asyncio recommends using @coroutine decorator, and most >>> projects that I've seen do use it. Also there is no >>> reason for people to use iter() and next() functions >>> on coroutines when writing asyncio code. I doubt that >>> this will cause serious backwards compatibility problems >>> (asyncio also has provisional status). >>> >>> I wouldn't count too much on asyncio's provisional status. What are >>> the >> consequences for code that is written to work with asyncio but >> doesn't use >> @coroutine? Such code will work with 3.4 and (despite the provisional >> status and the recommendation to use @coroutine) I don't want that >> code to >> break in 3.5 (though maybe a warning would be fine). >> >> I also hope that if someone has their own (renamed) copy of asyncio that >> works with 3.4, it will all still work with 3.5. Even if asyncio >> itself is >> provisional, none of the primitives (e.g. yield from) that it is >> built upon >> are provisional, so there should be no reason for it to break in 3.5. > > I agree. I'll try warnings for yield-fromming coroutines from > regular generators (so that we can disable it in 3.7/3.6). > > *If that doesn't work*, I think we need a compromise (not ideal, > but breaking things is worse): > > - yield from would always accept coroutine-objects > - iter(), next(), tuple(), etc won't work on coroutine-objects > - for..in won't work on coroutine-objects > > > >> >> >>> Thank you, >>> Yury >>> >>> Some more inline comments directly on the PEP below. >> >>> PEP: 492 >>> Title: Coroutines with async and await syntax >>> Version: $Revision$ >>> Last-Modified: $Date$ >>> Author: Yury Selivanov >>> Status: Draft >>> Type: Standards Track >>> Content-Type: text/x-rst >>> Created: 09-Apr-2015 >>> Python-Version: 3.5 >>> Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015 >>> >>> >>> Abstract >>> ======== >>> >>> This PEP introduces new syntax for coroutines, asynchronous ``with`` >>> statements and ``for`` loops. The main motivation behind this proposal >>> is to streamline writing and maintaining asynchronous code, as well as >>> to simplify previously hard to implement code patterns. >>> >>> >>> Rationale and Goals >>> =================== >>> >>> Current Python supports implementing coroutines via generators (PEP >>> 342), further enhanced by the ``yield from`` syntax introduced in PEP >>> 380. 
This approach has a number of shortcomings: >>> >>> * it is easy to confuse coroutines with regular generators, since they >>> share the same syntax; async libraries often attempt to alleviate >>> this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); >>> >>> * it is not possible to natively define a coroutine which has no >>> ``yield`` or ``yield from`` statements, again requiring the use of >>> decorators to fix potential refactoring issues; >>> >>> (I have to agree with Mark that this point is pretty weak. :-) >> >>> * support for asynchronous calls is limited to expressions where >>> ``yield`` is allowed syntactically, limiting the usefulness of >>> syntactic features, such as ``with`` and ``for`` statements. >>> >>> This proposal makes coroutines a native Python language feature, and >>> clearly separates them from generators. This removes >>> generator/coroutine ambiguity, and makes it possible to reliably define >>> coroutines without reliance on a specific library. This also enables >>> linters and IDEs to improve static code analysis and refactoring. >>> >>> Native coroutines and the associated new syntax features make it >>> possible to define context manager and iteration protocols in >>> asynchronous terms. As shown later in this proposal, the new ``async >>> with`` statement lets Python programs perform asynchronous calls when >>> entering and exiting a runtime context, and the new ``async for`` >>> statement makes it possible to perform asynchronous calls in iterators. >>> >> I wonder if you could add some adaptation of the explanation I have >> posted >> (a few times now, I feel) for the reason why I prefer to suspend only at >> syntactically recognizable points (yield [from] in the past, await and >> async for/with in this PEP). Unless you already have it in the rationale >> (though it seems Mark didn't think it was enough :-). > > I'll see what I can do. > >> >>> >>> Specification >>> ============= >>> >>> This proposal introduces new syntax and semantics to enhance coroutine >>> support in Python, it does not change the internal implementation of >>> coroutines, which are still based on generators. >>> >>> It's actually a separate issue whether any implementation changes. >> Implementation changes don't need to go through the PEP process, unless >> they're really also interface changes. >> >> >>> It is strongly suggested that the reader understands how coroutines are >>> implemented in Python (PEP 342 and PEP 380). It is also recommended to >>> read PEP 3156 (asyncio framework) and PEP 3152 (Cofunctions). >>> >>> From this point in this document we use the word *coroutine* to refer >>> to functions declared using the new syntax. *generator-based >>> coroutine* is used where necessary to refer to coroutines that are >>> based on generator syntax. >>> >> Despite reading this I still get confused when reading the PEP (probably >> because asyncio uses "coroutine" in the latter sense). Maybe it would >> make >> sense to write "native coroutine" for the new concept, to distinguish >> the >> two concepts more clearly? (You could even change "awaitable" to >> "coroutine". Though I like "awaitable" too.) > > "awaitable" is a more generic term... It can be a future, or > it can be a coroutine. Mixing them in one may create more > confusion. Also, "awaitable" is more of an interface, or a > trait, which means that the object won't be rejected by > the 'await' expression. > > I like your 'native coroutine' suggestion. I'll update the > PEP. 
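On the "trait" point: a duck-typed check seems like the natural shape for an isawaitable() helper. A rough sketch only -- the exact API is not settled in this draft::

    import inspect

    def isawaitable(obj):
        # native coroutine objects (and generator-based coroutines that
        # carry the CO_COROUTINE flag), or Future-like objects that
        # implement __await__
        return inspect.iscoroutine(obj) or hasattr(type(obj), '__await__')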
> >> >>> >>> New Coroutine Declaration Syntax >>> -------------------------------- >>> >>> The following new syntax is used to declare a coroutine:: >>> >>> async def read_data(db): >>> pass >>> >>> Key properties of coroutines: >>> >>> * ``async def`` functions are always coroutines, even if they do not >>> contain ``await`` expressions. >>> >>> * It is a ``SyntaxError`` to have ``yield`` or ``yield from`` >>> expressions in an ``async`` function. >>> >>> For Mark's benefit, might add that this is similar to how ``return`` >>> and >> ``yield`` are disallowed syntactically outside functions (as are the >> syntactic constraints on ``await` and ``async for|def``). > > OK > >> >> >>> * Internally, a new code object flag - ``CO_COROUTINE`` - is introduced >>> to enable runtime detection of coroutines (and migrating existing >>> code). All coroutines have both ``CO_COROUTINE`` and >>> ``CO_GENERATOR`` >>> flags set. >>> >>> * Regular generators, when called, return a *generator object*; >>> similarly, coroutines return a *coroutine object*. >>> >>> * ``StopIteration`` exceptions are not propagated out of coroutines, >>> and are replaced with a ``RuntimeError``. For regular generators >>> such behavior requires a future import (see PEP 479). >>> >>> >>> types.coroutine() >>> ----------------- >>> >>> A new function ``coroutine(gen)`` is added to the ``types`` module. It >>> applies ``CO_COROUTINE`` flag to the passed generator-function's code >>> object, making it to return a *coroutine object* when called. >>> >>> Clarify that this is a decorator that modifies the function object in >> place. > > Good catch. > >> >> >>> This feature enables an easy upgrade path for existing libraries. >>> >>> >>> Await Expression >>> ---------------- >>> >>> The following new ``await`` expression is used to obtain a result of >>> coroutine execution:: >>> >>> async def read_data(db): >>> data = await db.fetch('SELECT ...') >>> ... >>> >>> ``await``, similarly to ``yield from``, suspends execution of >>> ``read_data`` coroutine until ``db.fetch`` *awaitable* completes and >>> returns the result data. >>> >>> It uses the ``yield from`` implementation with an extra step of >>> validating its argument. ``await`` only accepts an *awaitable*, which >>> can be one of: >>> >>> * A *coroutine object* returned from a *coroutine* or a generator >>> decorated with ``types.coroutine()``. >>> >>> * An object with an ``__await__`` method returning an iterator. >>> >>> Any ``yield from`` chain of calls ends with a ``yield``. This is a >>> fundamental mechanism of how *Futures* are implemented. Since, >>> internally, coroutines are a special kind of generators, every >>> ``await`` is suspended by a ``yield`` somewhere down the chain of >>> ``await`` calls (please refer to PEP 3156 for a detailed >>> explanation.) >>> >>> To enable this behavior for coroutines, a new magic method called >>> ``__await__`` is added. In asyncio, for instance, to enable Future >>> objects in ``await`` statements, the only change is to add >>> ``__await__ = __iter__`` line to ``asyncio.Future`` class. >>> >>> Objects with ``__await__`` method are called *Future-like* >>> objects in >>> the rest of this PEP. >>> >>> Also, please note that ``__aiter__`` method (see its definition >>> below) cannot be used for this purpose. It is a different protocol, >>> and would be like using ``__iter__`` instead of ``__call__`` for >>> regular callables. >>> >>> It is a ``TypeError`` if ``__await__`` returns anything but an >>> iterator. 
>>> >>> * Objects defined with CPython C API with a ``tp_await`` function, >>> returning an iterator (similar to ``__await__`` method). >>> >>> It is a ``SyntaxError`` to use ``await`` outside of a coroutine. >>> >>> It is a ``TypeError`` to pass anything other than an *awaitable* object >>> to an ``await`` expression. >>> >>> >>> Syntax of "await" expression >>> '''''''''''''''''''''''''''' >>> >>> ``await`` keyword is defined differently from ``yield`` and ``yield >>> from``. The main difference is that *await expressions* do not require >>> parentheses around them most of the times. >>> >>> Examples:: >>> >>> ================================== ================================== >>> Expression Will be parsed as >>> ================================== ================================== >>> ``if await fut: pass`` ``if (await fut): pass`` >>> ``if await fut + 1: pass`` ``if (await fut) + 1: pass`` >>> ``pair = await fut, 'spam'`` ``pair = (await fut), 'spam'`` >>> ``with await fut, open(): pass`` ``with (await fut), open(): pass`` >>> ``await foo()['spam'].baz()()`` ``await ( foo()['spam'].baz()() )`` >>> ``return await coro()`` ``return ( await coro() )`` >>> ``res = await coro() ** 2`` ``res = (await coro()) ** 2`` >>> ``func(a1=await coro(), a2=0)`` ``func(a1=(await coro()), a2=0)`` >>> ================================== ================================== >>> >>> See `Grammar Updates`_ section for details. >>> >>> >>> Asynchronous Context Managers and "async with" >>> ---------------------------------------------- >>> >>> An *asynchronous context manager* is a context manager that is able to >>> suspend execution in its *enter* and *exit* methods. >>> >>> To make this possible, a new protocol for asynchronous context managers >>> is proposed. Two new magic methods are added: ``__aenter__`` and >>> ``__aexit__``. Both must return an *awaitable*. >>> >>> An example of an asynchronous context manager:: >>> >>> class AsyncContextManager: >>> async def __aenter__(self): >>> await log('entering context') >>> >>> async def __aexit__(self, exc_type, exc, tb): >>> await log('exiting context') >>> >>> >>> New Syntax >>> '''''''''' >>> >>> A new statement for asynchronous context managers is proposed:: >>> >>> async with EXPR as VAR: >>> BLOCK >>> >>> >>> which is semantically equivalent to:: >>> >>> mgr = (EXPR) >>> aexit = type(mgr).__aexit__ >>> aenter = type(mgr).__aenter__(mgr) >>> exc = True >>> >>> try: >>> try: >>> VAR = await aenter >>> BLOCK >>> except: >>> exc = False >>> exit_res = await aexit(mgr, *sys.exc_info()) >>> if not exit_res: >>> raise >>> >>> finally: >>> if exc: >>> await aexit(mgr, None, None, None) >>> >>> I realize you copied this from PEP 343, but I wonder, do we really need >> two nested try statements and the 'exc' flag? Why can't the finally >> step be >> placed in an 'else' clause on the inner try? (There may well be a reason >> but I can't figure what it is, and PEP 343 doesn't seem to explain it.) >> >> Also, it's a shame we're perpetuating the sys.exc_info() triple in >> the API >> here, but I agree making __exit__ and __aexit__ different also isn't a >> great idea. :-( >> >> PS. With the new tighter syntax for ``await`` you don't need the >> ``exit_res`` variable any more. > > Yes, this can be simplified. It was indeed copied from PEP 343. > >> >> >>> As with regular ``with`` statements, it is possible to specify multiple >>> context managers in a single ``async with`` statement. 
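As with PEP 343, multiple managers in one statement simply nest; a sketch of the intended equivalence::

    async with A() as a, B() as b:
        BLOCK

behaves like::

    async with A() as a:
        async with B() as b:
            BLOCK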
>>> >>> It is an error to pass a regular context manager without ``__aenter__`` >>> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` >>> to use ``async with`` outside of a coroutine. >>> >>> >>> Example >>> ''''''' >>> >>> With asynchronous context managers it is easy to implement proper >>> database transaction managers for coroutines:: >>> >>> async def commit(session, data): >>> ... >>> >>> async with session.transaction(): >>> ... >>> await session.update(data) >>> ... >>> >>> Code that needs locking also looks lighter:: >>> >>> async with lock: >>> ... >>> >>> instead of:: >>> >>> with (yield from lock): >>> ... >>> >> (Also, the implementation of the latter is problematic -- check >> asyncio/locks.py and notice that __enter__ is empty...) >> >>> >>> Asynchronous Iterators and "async for" >>> -------------------------------------- >>> >>> An *asynchronous iterable* is able to call asynchronous code in its >>> *iter* implementation, and *asynchronous iterator* can call >>> asynchronous code in its *next* method. To support asynchronous >>> iteration: >>> >>> 1. An object must implement an ``__aiter__`` method returning an >>> *awaitable* resulting in an *asynchronous iterator object*. >>> >> Have you considered making __aiter__ not an awaitable? It's not strictly >> necessary I think, one could do all the awaiting in __anext__. Though >> perhaps there are use cases that are more naturally expressed by >> awaiting >> in __aiter__? (Your examples all use ``async def __aiter__(self): return >> self`` suggesting this would be no great loss.) > > There is a section in Design Considerations about this. > I should add a reference to it. > >> >> >>> 2. An *asynchronous iterator object* must implement an ``__anext__`` >>> method returning an *awaitable*. >>> >>> 3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration`` >>> exception. >>> >>> An example of asynchronous iterable:: >>> >>> class AsyncIterable: >>> async def __aiter__(self): >>> return self >>> >>> async def __anext__(self): >>> data = await self.fetch_data() >>> if data: >>> return data >>> else: >>> raise StopAsyncIteration >>> >>> async def fetch_data(self): >>> ... >>> >>> >>> New Syntax >>> '''''''''' >>> >>> A new statement for iterating through asynchronous iterators is >>> proposed:: >>> >>> async for TARGET in ITER: >>> BLOCK >>> else: >>> BLOCK2 >>> >>> which is semantically equivalent to:: >>> >>> iter = (ITER) >>> iter = await type(iter).__aiter__(iter) >>> running = True >>> while running: >>> try: >>> TARGET = await type(iter).__anext__(iter) >>> except StopAsyncIteration: >>> running = False >>> else: >>> BLOCK >>> else: >>> BLOCK2 >>> >>> >>> It is a ``TypeError`` to pass a regular iterable without ``__aiter__`` >>> method to ``async for``. It is a ``SyntaxError`` to use ``async for`` >>> outside of a coroutine. >>> >>> As for with regular ``for`` statement, ``async for`` has an optional >>> ``else`` clause. >>> >>> (Not because we're particularly fond of it, but because its absence >>> would >> just introduce more special cases. :-) >> >> >>> Example 1 >>> ''''''''' >>> >>> With asynchronous iteration protocol it is possible to asynchronously >>> buffer data during iteration:: >>> >>> async for data in cursor: >>> ... >>> >>> Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows >>> of data from a database after every ``N`` iterations. 
>>> >>> The following code illustrates new asynchronous iteration protocol:: >>> >>> class Cursor: >>> def __init__(self): >>> self.buffer = collections.deque() >>> >>> def _prefetch(self): >>> ... >>> >>> async def __aiter__(self): >>> return self >>> >>> async def __anext__(self): >>> if not self.buffer: >>> self.buffer = await self._prefetch() >>> if not self.buffer: >>> raise StopAsyncIteration >>> return self.buffer.popleft() >>> >>> then the ``Cursor`` class can be used as follows:: >>> >>> async for row in Cursor(): >>> print(row) >>> >>> which would be equivalent to the following code:: >>> >>> i = await Cursor().__aiter__() >>> while True: >>> try: >>> row = await i.__anext__() >>> except StopAsyncIteration: >>> break >>> else: >>> print(row) >>> >>> >>> Example 2 >>> ''''''''' >>> >>> The following is a utility class that transforms a regular iterable to >>> an asynchronous one. While this is not a very useful thing to do, the >>> code illustrates the relationship between regular and asynchronous >>> iterators. >>> >>> :: >>> >>> class AsyncIteratorWrapper: >>> def __init__(self, obj): >>> self._it = iter(obj) >>> >>> async def __aiter__(self): >>> return self >>> >>> async def __anext__(self): >>> try: >>> value = next(self._it) >>> except StopIteration: >>> raise StopAsyncIteration >>> return value >>> >>> async for letter in AsyncIteratorWrapper("abc"): >>> print(letter) >>> >>> >>> Why StopAsyncIteration? >>> ''''''''''''''''''''''' >>> >>> I keep wanting to propose to rename this to AsyncStopIteration. I know >> it's about stopping an async iteration, but in my head I keep >> referring to >> it as AsyncStopIteration, probably because in other places we use >> async (or >> 'a') as a prefix. > > I'd be totally OK with that. Should I rename it? > >> >> >>> Coroutines are still based on generators internally. So, before PEP >>> 479, there was no fundamental difference between >>> >>> :: >>> >>> def g1(): >>> yield from fut >>> return 'spam' >>> >>> and >>> >>> :: >>> >>> def g2(): >>> yield from fut >>> raise StopIteration('spam') >>> >>> And since PEP 479 is accepted and enabled by default for coroutines, >>> the following example will have its ``StopIteration`` wrapped into a >>> ``RuntimeError`` >>> >>> :: >>> >>> async def a1(): >>> await fut >>> raise StopIteration('spam') >>> >>> The only way to tell the outside code that the iteration has ended is >>> to raise something other than ``StopIteration``. Therefore, a new >>> built-in exception class ``StopAsyncIteration`` was added. >>> >>> Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions >>> raised in coroutines are wrapped in ``RuntimeError``. >>> >>> >>> Debugging Features >>> ------------------ >>> >>> One of the most frequent mistakes that people make when using >>> generators as coroutines is forgetting to use ``yield from``:: >>> >>> @asyncio.coroutine >>> def useful(): >>> asyncio.sleep(1) # this will do noting without 'yield from' >>> >>> Might be useful to point out that this was the one major advantage >>> of PEP >> 3152 -- although it wasn't enough to save that PEP, and in your response >> you pointed out that this mistake is not all that common. Although >> you seem >> to disagree with that here ("One of the most frequent mistakes ..."). > > I think it's a mistake that a lot of beginners may make at some > point (and in this sense it's frequent). I really doubt that > once you were hit by it more than two times you would make it > again. 
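Note that the new syntax alone doesn't make this slip impossible -- the call still just creates a coroutine object that is silently dropped::

    async def useful():
        asyncio.sleep(1)        # bug: coroutine object created, never awaited

    async def useful_fixed():
        await asyncio.sleep(1)  # the sleep actually runs

which is exactly the case the debug facilities described below are meant to catch.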
> > This is a small wart, but we have to have a solution for it. > >> >> >>> For debugging this kind of mistakes there is a special debug mode in >>> asyncio, in which ``@coroutine`` decorator wraps all functions with a >>> special object with a destructor logging a warning. Whenever a wrapped >>> generator gets garbage collected, a detailed logging message is >>> generated with information about where exactly the decorator function >>> was defined, stack trace of where it was collected, etc. Wrapper >>> object also provides a convenient ``__repr__`` function with detailed >>> information about the generator. >>> >>> The only problem is how to enable these debug capabilities. Since >>> debug facilities should be a no-op in production mode, ``@coroutine`` >>> decorator makes the decision of whether to wrap or not to wrap based on >>> an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is >>> possible to run asyncio programs with asyncio's own functions >>> instrumented. ``EventLoop.set_debug``, a different debug facility, has >>> no impact on ``@coroutine`` decorator's behavior. >>> >>> With this proposal, coroutines is a native, distinct from generators, >>> concept. New methods ``set_coroutine_wrapper`` and >>> ``get_coroutine_wrapper`` are added to the ``sys`` module, with which >>> frameworks can provide advanced debugging facilities. >>> >>> These two appear to be unspecified except by example. > > Will add a subsection specifically for them. > >> >>> It is also important to make coroutines as fast and efficient as >>> possible, therefore there are no debug features enabled by default. >>> >>> Example:: >>> >>> async def debug_me(): >>> await asyncio.sleep(1) >>> >>> def async_debug_wrap(generator): >>> return asyncio.CoroWrapper(generator) >>> >>> sys.set_coroutine_wrapper(async_debug_wrap) >>> >>> debug_me() # <- this line will likely GC the coroutine object and >>> # trigger asyncio.CoroWrapper's code. >>> >>> assert isinstance(debug_me(), asyncio.CoroWrapper) >>> >>> sys.set_coroutine_wrapper(None) # <- this unsets any >>> # previously set wrapper >>> assert not isinstance(debug_me(), asyncio.CoroWrapper) >>> >>> If ``sys.set_coroutine_wrapper()`` is called twice, the new wrapper >>> replaces the previous wrapper. ``sys.set_coroutine_wrapper(None)`` >>> unsets the wrapper. >>> >>> >>> inspect.iscoroutine() and inspect.iscoroutineobject() >>> ----------------------------------------------------- >>> >>> Two new functions are added to the ``inspect`` module: >>> >>> * ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a >>> coroutine object. >>> >>> * ``inspect.iscoroutinefunction(obj)`` returns ``True`` is ``obj`` is a >>> coroutine function. >>> >>> Maybe isawaitable() and isawaitablefunction() are also useful? (Or only >> isawaitable()?) > > I think that isawaitable would be really useful. Especially, > to check if an object implemented with C API has a tp_await > function. > > isawaitablefunction() looks a bit confusing to me: > > def foo(): return fut > > is awaitable, but there is no way to detect that. > > def foo(arg): > if arg == 'spam': > return fut > > is awaitable sometimes. > >> >> >>> Differences between coroutines and generators >>> --------------------------------------------- >>> >>> A great effort has been made to make sure that coroutines and >>> generators are separate concepts: >>> >>> 1. Coroutine objects do not implement ``__iter__`` and ``__next__`` >>> methods. 
Therefore they cannot be iterated over or passed to >>> ``iter()``, ``list()``, ``tuple()`` and other built-ins. They >>> also cannot be used in a ``for..in`` loop. >>> >>> 2. ``yield from`` does not accept coroutine objects (unless it is used >>> in a generator-based coroutine decorated with ``types.coroutine``.) >>> >> How does ``yield from`` know that it is occurring in a generator-based >> coroutine? > > I check that in 'ceval.c' in the implementation of YIELD_FROM > opcode. > > If the current code object doesn't have a CO_COROUTINE flag > and the opcode arg is a generator-object with CO_COROUTINE -- > we raise an error. > >> >>> 3. ``yield from`` does not accept coroutine objects from plain Python >>> generators (*not* generator-based coroutines.) >>> >> I am worried about this. PEP 380 gives clear semantics to "yield from >> " and I don't think you can revert that here. Or maybe I am >> misunderstanding what you meant here? (What exactly are "coroutine >> objects >> from plain Python generators"?) > > # *Not* decorated with @coroutine > def some_algorithm_impl(): > yield 1 > yield from native_coroutine() # <- this is a bug > > > "some_algorithm_impl" is a regular generator. By mistake > someone could try to use "yield from" on a native coroutine > (which is 99.9% is a bug). > > So we can rephrase it to: > > ``yield from`` does not accept *native coroutine objects* > from regular Python generators > > > I also agree that raising an exception in this case in 3.5 > might break too much existing code. I'll try warnings, and > if it doesn't work we might want to just let this restriction > slip. > > >> >>> 4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()`` >>> return ``False`` for coroutine objects and coroutine functions. >>> >>> >>> Coroutine objects >>> ----------------- >>> >>> Coroutines are based on generators internally, thus they share the >>> implementation. Similarly to generator objects, coroutine objects have >>> ``throw``, ``send`` and ``close`` methods. ``StopIteration`` and >>> ``GeneratorExit`` play the same role for coroutine objects (although >>> PEP 479 is enabled by default for coroutines). >>> >> Does send() make sense for a native coroutine? Check PEP 380. I think >> the >> only way to access the send() argument is by using ``yield`` but that's >> disallowed. Or is this about send() being passed to the ``yield`` that >> ultimately suspends the chain of coroutines? (You may just have to >> rewrite >> the section about that -- it seems a bit hidden now.) > > Yes, 'send()' is needed to push values to the 'yield' statement > somewhere (future) down the chain of coroutines (suspension point). > > This has to be articulated in a clear way, I'll think how to > rewrite this section without replicating PEP 380 and python > documentation on generators. > >> >>> >>> Glossary >>> ======== >>> >>> :Coroutine: >>> A coroutine function, or just "coroutine", is declared with >>> ``async >>> def``. It uses ``await`` and ``return value``; see `New Coroutine >>> Declaration Syntax`_ for details. >>> >>> :Coroutine object: >>> Returned from a coroutine function. See `Await Expression`_ for >>> details. >>> >>> :Future-like object: >>> An object with an ``__await__`` method, or a C object with >>> ``tp_await`` function, returning an iterator. Can be consumed by >>> an ``await`` expression in a coroutine. A coroutine waiting for a >>> Future-like object is suspended until the Future-like object's >>> ``__await__`` completes, and returns the result. 
See `Await >>> Expression`_ for details. >>> >>> :Awaitable: >>> A *Future-like* object or a *coroutine object*. See `Await >>> Expression`_ for details. >>> >>> :Generator-based coroutine: >>> Coroutines based in generator syntax. Most common example is >>> ``@asyncio.coroutine``. >>> >>> :Asynchronous context manager: >>> An asynchronous context manager has ``__aenter__`` and >>> ``__aexit__`` >>> methods and can be used with ``async with``. See `Asynchronous >>> Context Managers and "async with"`_ for details. >>> >>> :Asynchronous iterable: >>> An object with an ``__aiter__`` method, which must return an >>> *asynchronous iterator* object. Can be used with ``async for``. >>> See `Asynchronous Iterators and "async for"`_ for details. >>> >>> :Asynchronous iterator: >>> An asynchronous iterator has an ``__anext__`` method. See >>> `Asynchronous Iterators and "async for"`_ for details. >>> >>> >>> List of functions and methods >>> ============================= >>> >> (I'm not sure of the utility of this section.) > > It's a little bit hard to understand that "awaitable" is > a general term that includes native coroutine objects, so > it's OK to write both: > > def __aenter__(): return fut > async def __aenter__(): ... > > We (Victor and I) decided that it might be useful to have an > additional section that explains it. > >> >>> ================= =================================== ================= >>> Method Can contain Can't contain >>> ================= =================================== ================= >>> async def func await, return value yield, yield from >>> async def __a*__ await, return value yield, yield from >>> >> (This line seems redundant.) >> >>> def __a*__ return awaitable await >>> >> def __await__ yield, yield from, return iterable await >>> generator yield, yield from, return value await >>> ================= =================================== ================= >>> >>> Where: >>> >>> * "async def func": coroutine; >>> >>> * "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, >>> ``__aexit__`` defined with the ``async`` keyword; >>> >>> * "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``, >>> ``__aexit__`` defined without the ``async`` keyword, must return an >>> *awaitable*; >>> >>> * "def __await__": ``__await__`` method to implement *Future-like* >>> objects; >>> >>> * generator: a "regular" generator, function defined with ``def`` and >>> which contains a least one ``yield`` or ``yield from`` expression. >>> >>> >>> Transition Plan >>> =============== >>> >>> This may need to be pulled forward or at least mentioned earlier (in >>> the >> Abstract or near the top of the Specification). >> >> >>> To avoid backwards compatibility issues with ``async`` and ``await`` >>> keywords, it was decided to modify ``tokenizer.c`` in such a way, that >>> it: >>> >>> * recognizes ``async def`` name tokens combination (start of a >>> coroutine); >>> >>> * keeps track of regular functions and coroutines; >>> >>> * replaces ``'async'`` token with ``ASYNC`` and ``'await'`` token with >>> ``AWAIT`` when in the process of yielding tokens for coroutines. >>> >>> This approach allows for seamless combination of new syntax features >>> (all of them available only in ``async`` functions) with any existing >>> code. 
>>> >>> An example of having "async def" and "async" attribute in one piece of >>> code:: >>> >>> class Spam: >>> async = 42 >>> >>> async def ham(): >>> print(getattr(Spam, 'async')) >>> >>> # The coroutine can be executed and will print '42' >>> >>> >>> Backwards Compatibility >>> ----------------------- >>> >>> This proposal preserves 100% backwards compatibility. >>> >> Is this still true with the proposed restrictions on what ``yield from`` >> accepts? (Hopefully I'm the one who is confused. :-) > > True for the code that uses @coroutine decorators properly. > I'll see what I can do with warnings, but I'll update the > section anyways. > >> >>> >>> Grammar Updates >>> --------------- >>> >>> Grammar changes are also fairly minimal:: >>> >>> decorated: decorators (classdef | funcdef | async_funcdef) >>> async_funcdef: ASYNC funcdef >>> >>> compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | >>> with_stmt >>> | funcdef | classdef | decorated | async_stmt) >>> >>> async_stmt: ASYNC (funcdef | with_stmt | for_stmt) >>> >>> power: atom_expr ['**' factor] >>> atom_expr: [AWAIT] atom trailer* >>> >>> >>> Transition Period Shortcomings >>> ------------------------------ >>> >>> There is just one. > > Are you OK with this thing? > >>> >>> Until ``async`` and ``await`` are not proper keywords, it is not >>> possible (or at least very hard) to fix ``tokenizer.c`` to recognize >>> them on the **same line** with ``def`` keyword:: >>> >>> # async and await will always be parsed as variables >>> >>> async def outer(): # 1 >>> def nested(a=(await fut)): >>> pass >>> >>> async def foo(): return (await fut) # 2 >>> >>> Since ``await`` and ``async`` in such cases are parsed as ``NAME`` >>> tokens, a ``SyntaxError`` will be raised. >>> >>> To workaround these issues, the above examples can be easily rewritten >>> to a more readable form:: >>> >>> async def outer(): # 1 >>> a_default = await fut >>> def nested(a=a_default): >>> pass >>> >>> async def foo(): # 2 >>> return (await fut) >>> >>> This limitation will go away as soon as ``async`` and ``await`` ate >>> proper keywords. Or if it's decided to use a future import for this >>> PEP. >>> >>> >>> Deprecation Plans >>> ----------------- >>> >>> ``async`` and ``await`` names will be softly deprecated in CPython 3.5 >>> and 3.6. In 3.7 we will transform them to proper keywords. Making >>> ``async`` and ``await`` proper keywords before 3.7 might make it harder >>> for people to port their code to Python 3. >>> >>> >>> asyncio >>> ------- >>> >>> ``asyncio`` module was adapted and tested to work with coroutines and >>> new statements. Backwards compatibility is 100% preserved. >>> >>> The required changes are mainly: >>> >>> 1. Modify ``@asyncio.coroutine`` decorator to use new >>> ``types.coroutine()`` function. >>> >>> 2. Add ``__await__ = __iter__`` line to ``asyncio.Future`` class. >>> >>> 3. Add ``ensure_task()`` as an alias for ``async()`` function. >>> Deprecate ``async()`` function. >>> >>> >>> Design Considerations >>> ===================== >>> >>> PEP 3152 >>> -------- >>> >>> PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines >>> (called "cofunctions"). Some key points: >>> >>> 1. A new keyword ``codef`` to declare a *cofunction*. *Cofunction* is >>> always a generator, even if there is no ``cocall`` expressions >>> inside it. Maps to ``async def`` in this proposal. >>> >>> 2. A new keyword ``cocall`` to call a *cofunction*. Can only be used >>> inside a *cofunction*. 
Maps to ``await`` in this proposal (with >>> some differences, see below.) >>> >>> 3. It is not possible to call a *cofunction* without a ``cocall`` >>> keyword. >>> >>> 4. ``cocall`` grammatically requires parentheses after it:: >>> >>> atom: cocall | >>> cocall: 'cocall' atom cotrailer* '(' [arglist] ')' >>> cotrailer: '[' subscriptlist ']' | '.' NAME >>> >>> 5. ``cocall f(*args, **kwds)`` is semantically equivalent to >>> ``yield from f.__cocall__(*args, **kwds)``. >>> >>> Differences from this proposal: >>> >>> 1. There is no equivalent of ``__cocall__`` in this PEP, which is >>> called and its result is passed to ``yield from`` in the ``cocall`` >>> expression. ``await`` keyword expects an *awaitable* object, >>> validates the type, and executes ``yield from`` on it. Although, >>> ``__await__`` method is similar to ``__cocall__``, but is only used >>> to define *Future-like* objects. >>> >>> 2. ``await`` is defined in almost the same way as ``yield from`` in the >>> grammar (it is later enforced that ``await`` can only be inside >>> ``async def``). It is possible to simply write ``await future``, >>> whereas ``cocall`` always requires parentheses. >>> >>> 3. To make asyncio work with PEP 3152 it would be required to modify >>> ``@asyncio.coroutine`` decorator to wrap all functions in an object >>> with a ``__cocall__`` method, or to implement ``__cocall__`` on >>> generators. To call *cofunctions* from existing generator-based >>> coroutines it would be required to use ``costart(cofunc, *args, >>> **kwargs)`` built-in. >>> >>> 4. Since it is impossible to call a *cofunction* without a ``cocall`` >>> keyword, it automatically prevents the common mistake of forgetting >>> to use ``yield from`` on generator-based coroutines. This proposal >>> addresses this problem with a different approach, see `Debugging >>> Features`_. >>> >>> 5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine >>> is that if is decided to implement coroutine-generators -- >>> coroutines with ``yield`` or ``async yield`` expressions -- we >>> wouldn't need a ``cocall`` keyword to call them. So we'll end up >>> having ``__cocall__`` and no ``__call__`` for regular coroutines, >>> and having ``__call__`` and no ``__cocall__`` for coroutine- >>> generators. >>> >>> 6. Requiring parentheses grammatically also introduces a whole lot >>> of new problems. >>> >>> The following code:: >>> >>> await fut >>> await function_returning_future() >>> await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2)) >>> >>> would look like:: >>> >>> cocall fut() # or cocall costart(fut) >>> cocall (function_returning_future())() >>> cocall asyncio.gather(costart(coro1, arg1, arg2), >>> costart(coro2, arg1, arg2)) >>> >>> 7. There are no equivalents of ``async for`` and ``async with`` in PEP >>> 3152. >>> >>> >>> Coroutine-generators >>> -------------------- >>> >>> With ``async for`` keyword it is desirable to have a concept of a >>> *coroutine-generator* -- a coroutine with ``yield`` and ``yield from`` >>> expressions. To avoid any ambiguity with regular generators, we would >>> likely require to have an ``async`` keyword before ``yield``, and >>> ``async yield from`` would raise a ``StopAsyncIteration`` exception. >>> >>> While it is possible to implement coroutine-generators, we believe that >>> they are out of scope of this proposal. It is an advanced concept that >>> should be carefully considered and balanced, with a non-trivial changes >>> in the implementation of current generator objects. 
This is a matter >>> for a separate PEP. >>> >>> >>> No implicit wrapping in Futures >>> ------------------------------- >>> >>> There is a proposal to add similar mechanism to ECMAScript 7 [2]_. A >>> key difference is that JavaScript "async functions" always return a >>> Promise. While this approach has some advantages, it also implies that >>> a new Promise object is created on each "async function" invocation. >>> >>> We could implement a similar functionality in Python, by wrapping all >>> coroutines in a Future object, but this has the following >>> disadvantages: >>> >>> 1. Performance. A new Future object would be instantiated on each >>> coroutine call. Moreover, this makes implementation of ``await`` >>> expressions slower (disabling optimizations of ``yield from``). >>> >>> 2. A new built-in ``Future`` object would need to be added. >>> >>> 3. Coming up with a generic ``Future`` interface that is usable for any >>> use case in any framework is a very hard to solve problem. >>> >>> 4. It is not a feature that is used frequently, when most of the code >>> is coroutines. >>> >>> >>> Why "async" and "await" keywords >>> -------------------------------- >>> >>> async/await is not a new concept in programming languages: >>> >>> * C# has it since long time ago [5]_; >>> >>> * proposal to add async/await in ECMAScript 7 [2]_; >>> see also Traceur project [9]_; >>> >>> * Facebook's Hack/HHVM [6]_; >>> >>> * Google's Dart language [7]_; >>> >>> * Scala [8]_; >>> >>> * proposal to add async/await to C++ [10]_; >>> >>> * and many other less popular languages. >>> >>> This is a huge benefit, as some users already have experience with >>> async/await, and because it makes working with many languages in one >>> project easier (Python with ECMAScript 7 for instance). >>> >>> >>> Why "__aiter__" is a coroutine >>> ------------------------------ >>> >>> In principle, ``__aiter__`` could be a regular function. There are >>> several good reasons to make it a coroutine: >>> >>> * as most of the ``__anext__``, ``__aenter__``, and ``__aexit__`` >>> methods are coroutines, users would often make a mistake defining it >>> as ``async`` anyways; >>> >>> * there might be a need to run some asynchronous operations in >>> ``__aiter__``, for instance to prepare DB queries or do some file >>> operation. >>> >>> >>> Importance of "async" keyword >>> ----------------------------- >>> >>> While it is possible to just implement ``await`` expression and treat >>> all functions with at least one ``await`` as coroutines, this approach >>> makes APIs design, code refactoring and its long time support harder. >>> >>> Let's pretend that Python only has ``await`` keyword:: >>> >>> def useful(): >>> ... >>> await log(...) >>> ... >>> >>> def important(): >>> await useful() >>> >>> If ``useful()`` function is refactored and someone removes all >>> ``await`` expressions from it, it would become a regular python >>> function, and all code that depends on it, including ``important()`` >>> would be broken. To mitigate this issue a decorator similar to >>> ``@asyncio.coroutine`` has to be introduced. >>> >>> >>> Why "async def" >>> --------------- >>> >>> For some people bare ``async name(): pass`` syntax might look more >>> appealing than ``async def name(): pass``. It is certainly easier to >>> type. But on the other hand, it breaks the symmetry between ``async >>> def``, ``async with`` and ``async for``, where ``async`` is a modifier, >>> stating that the statement is asynchronous. 
It is also more consistent >>> with the existing grammar. >>> >>> >>> Why "async for/with" instead of "await for/with" >>> ------------------------------------------------ >>> >>> ``async`` is an adjective, and hence it is a better choice for a >>> *statement qualifier* keyword. ``await for/with`` would imply that >>> something is awaiting for a completion of a ``for`` or ``with`` >>> statement. >>> >>> >>> Why "async def" and not "def async" >>> ----------------------------------- >>> >>> ``async`` keyword is a *statement qualifier*. A good analogy to it are >>> "static", "public", "unsafe" keywords from other languages. "async >>> for" is an asynchronous "for" statement, "async with" is an >>> asynchronous "with" statement, "async def" is an asynchronous function. >>> >>> Having "async" after the main statement keyword might introduce some >>> confusion, like "for async item in iterator" can be read as "for each >>> asynchronous item in iterator". >>> >>> Having ``async`` keyword before ``def``, ``with`` and ``for`` also >>> makes the language grammar simpler. And "async def" better separates >>> coroutines from regular functions visually. >>> >>> >>> Why not a __future__ import >>> --------------------------- >>> >>> ``__future__`` imports are inconvenient and easy to forget to add. >>> Also, they are enabled for the whole source file. Consider that there >>> is a big project with a popular module named "async.py". With future >>> imports it is required to either import it using ``__import__()`` or >>> ``importlib.import_module()`` calls, or to rename the module. The >>> proposed approach makes it possible to continue using old code and >>> modules without a hassle, while coming up with a migration plan for >>> future python versions. >>> >>> >>> Why magic methods start with "a" >>> -------------------------------- >>> >>> New asynchronous magic methods ``__aiter__``, ``__anext__``, >>> ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". >>> An alternative proposal is to use "async" prefix, so that ``__aiter__`` >>> becomes ``__async_iter__``. However, to align new magic methods with >>> the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided >>> to use a shorter version. >>> >>> >>> Why not reuse existing magic names >>> ---------------------------------- >>> >>> An alternative idea about new asynchronous iterators and context >>> managers was to reuse existing magic methods, by adding an ``async`` >>> keyword to their declarations:: >>> >>> class CM: >>> async def __enter__(self): # instead of __aenter__ >>> ... >>> >>> This approach has the following downsides: >>> >>> * it would not be possible to create an object that works in both >>> ``with`` and ``async with`` statements; >>> >>> * it would break backwards compatibility, as nothing prohibits from >>> returning a Future-like objects from ``__enter__`` and/or >>> ``__exit__`` in Python <= 3.4; >>> >>> * one of the main points of this proposal is to make coroutines as >>> simple and foolproof as possible, hence the clear separation of the >>> protocols. >>> >>> >>> Why not reuse existing "for" and "with" statements >>> -------------------------------------------------- >>> >>> The vision behind existing generator-based coroutines and this proposal >>> is to make it easy for users to see where the code might be suspended. 
>>> Making existing "for" and "with" statements to recognize asynchronous >>> iterators and context managers will inevitably create implicit suspend >>> points, making it harder to reason about the code. >>> >>> >>> Comprehensions >>> -------------- >>> >>> For the sake of restricting the broadness of this PEP there is no new >>> syntax for asynchronous comprehensions. This should be considered in a >>> separate PEP, if there is a strong demand for this feature. >>> >>> >>> Async lambdas >>> ------------- >>> >>> Lambda coroutines are not part of this proposal. In this proposal they >>> would look like ``async lambda(parameters): expression``. Unless there >>> is a strong demand to have them as part of this proposal, it is >>> recommended to consider them later in a separate PEP. >>> >>> >>> Performance >>> =========== >>> >>> Overall Impact >>> -------------- >>> >>> This proposal introduces no observable performance impact. Here is an >>> output of python's official set of benchmarks [4]_: >>> >>> :: >>> >>> python perf.py -r -b default ../cpython/python.exe >>> ../cpython-aw/python.exe >>> >>> [skipped] >>> >>> Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0: >>> Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 >>> x86_64 i386 >>> >>> Total CPU cores: 8 >>> >>> ### etree_iterparse ### >>> Min: 0.365359 -> 0.349168: 1.05x faster >>> Avg: 0.396924 -> 0.379735: 1.05x faster >>> Significant (t=9.71) >>> Stddev: 0.01225 -> 0.01277: 1.0423x larger >>> >>> The following not significant results are hidden, use -v to >>> show them: >>> django_v2, 2to3, etree_generate, etree_parse, etree_process, >>> fastpickle, >>> fastunpickle, json_dump_v2, json_load, nbody, regex_v8, >>> tornado_http. >>> >>> >>> Tokenizer modifications >>> ----------------------- >>> >>> There is no observable slowdown of parsing python files with the >>> modified tokenizer: parsing of one 12Mb file >>> (``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount >>> of time. >>> >>> >>> async/await >>> ----------- >>> >>> The following micro-benchmark was used to determine performance >>> difference between "async" functions and generators:: >>> >>> import sys >>> import time >>> >>> def binary(n): >>> if n <= 0: >>> return 1 >>> l = yield from binary(n - 1) >>> r = yield from binary(n - 1) >>> return l + 1 + r >>> >>> async def abinary(n): >>> if n <= 0: >>> return 1 >>> l = await abinary(n - 1) >>> r = await abinary(n - 1) >>> return l + 1 + r >>> >>> def timeit(gen, depth, repeat): >>> t0 = time.time() >>> for _ in range(repeat): >>> list(gen(depth)) >>> t1 = time.time() >>> print('{}({}) * {}: total {:.3f}s'.format( >>> gen.__name__, depth, repeat, t1-t0)) >>> >>> The result is that there is no observable performance difference. >>> Minimum timing of 3 runs >>> >>> :: >>> >>> abinary(19) * 30: total 12.985s >>> binary(19) * 30: total 12.953s >>> >>> Note that depth of 19 means 1,048,575 calls. >>> >>> >>> Reference Implementation >>> ======================== >>> >>> The reference implementation can be found here: [3]_. >>> >>> List of high-level changes and new protocols >>> -------------------------------------------- >>> >>> 1. New syntax for defining coroutines: ``async def`` and new ``await`` >>> keyword. >>> >>> 2. New ``__await__`` method for Future-like objects, and new >>> ``tp_await`` slot in ``PyTypeObject``. >>> >>> 3. New syntax for asynchronous context managers: ``async with``. And >>> associated protocol with ``__aenter__`` and ``__aexit__`` methods. >>> >>> 4. 
New syntax for asynchronous iteration: ``async for``. And >>> associated protocol with ``__aiter__``, ``__anext__`` and a new >>> built-in exception ``StopAsyncIteration``. >>> >>> 5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``, >>> ``Await``. >>> >>> 6. New functions: ``sys.set_coroutine_wrapper(callback)``, >>> ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``, >>> ``inspect.iscoroutinefunction()``, and ``inspect.iscoroutine()``. >>> >>> 7. New ``CO_COROUTINE`` bit flag for code objects. >>> >>> While the list of changes and new things is not short, it is important >>> to understand that most users will not use these features directly. >>> They are intended to be used in frameworks and libraries to provide >>> users with convenient and unambiguous APIs built on ``async def``, >>> ``await``, ``async for`` and ``async with`` syntax. >>> >>> >>> Working example >>> --------------- >>> >>> All concepts proposed in this PEP are implemented [3]_ and can be >>> tested. >>> >>> :: >>> >>> import asyncio >>> >>> async def echo_server(): >>> print('Serving on localhost:8000') >>> await asyncio.start_server(handle_connection, >>> 'localhost', 8000) >>> >>> async def handle_connection(reader, writer): >>> print('New connection...') >>> >>> while True: >>> data = await reader.read(8192) >>> >>> if not data: >>> break >>> >>> print('Sending {:.10}... back'.format(repr(data))) >>> writer.write(data) >>> >>> loop = asyncio.get_event_loop() >>> loop.run_until_complete(echo_server()) >>> try: >>> loop.run_forever() >>> finally: >>> loop.close() >>> >>> >>> References >>> ========== >>> >>> .. [1] >>> https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine >>> >>> .. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions >>> >>> .. [3] https://github.com/1st1/cpython/tree/await >>> >>> .. [4] https://hg.python.org/benchmarks >>> >>> .. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx >>> >>> .. [6] http://docs.hhvm.com/manual/en/hack.async.php >>> >>> .. [7] https://www.dartlang.org/articles/await-async/ >>> >>> .. [8] http://docs.scala-lang.org/sips/pending/async.html >>> >>> .. [9] >>> https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental >>> >>> >>> .. [10] >>> http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf >>> (PDF) >>> >>> >>> Acknowledgments >>> =============== >>> >>> I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew >>> Svetlov, and Łukasz Langa for their initial feedback. >>> >>> >>> Copyright >>> ========= >>> >>> This document has been placed in the public domain. >>> >>> .. 
>>> Local Variables: >>> mode: indented-text >>> indent-tabs-mode: nil >>> sentence-end-double-space: t >>> fill-column: 70 >>> coding: utf-8 >>> End: >>> >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >>> >> >> > From ethan at stoneleaf.us Wed Apr 29 07:18:58 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 28 Apr 2015 22:18:58 -0700 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <20150428235531.GP32422@stoneleaf.us> Message-ID: <20150429051858.GA32207@stoneleaf.us> On 04/28, Guido van Rossum wrote: > On Tue, Apr 28, 2015 at 4:55 PM, Ethan Furman wrote: >> On 04/28, Yury Selivanov wrote: >> >> >>> This limitation will go away as soon as ``async`` and ``await`` ate >> >>> proper keywords. Or if it's decided to use a future import for this >> >>> PEP. >> >> `async` and `await` need to be proper keywords, and __future__ imports >> is how we do that (see, e.g., PEP 355 and and PEP 343) >> > > You could at least provide an explanation about how the current proposal > falls short. What code will break? There's a cost to __future__ imports > too. The current proposal is a pretty clever hack -- and we've done similar > hacks in the past (last I remember when "import ... as ..." was introduced > but we didn't want to make 'as' a keyword right away). My apologies, I was unaware we had done psuedo-keywords before. -- ~Ethan~ From greg.ewing at canterbury.ac.nz Wed Apr 29 07:40:21 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Wed, 29 Apr 2015 17:40:21 +1200 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <553E3D20.2020008@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> Message-ID: <55406EC5.7000101@canterbury.ac.nz> Yury Selivanov wrote: > I don't want this: "await a() * b()" to be parsed, it's not meaningful. Why not? If a() is a coroutine that returns a number, why shouldn't I be able to multiply it by something? I don't think your currently proposed grammar prevents that anyway. We can have --> '*' --> '*' --> '*' --> 'await' * '*' * --> 'await' 'a' '(' ')' '*' 'b' '(' ')' It does, on the other hand, seem to prevent x = - await a() which looks perfectly sensible to me. I don't like the idea of introducing another level of precedence. Python already has too many of those to keep in my brain. Being able to tell people "it's just like unary minus" makes it easy to explain (and therefore possibly a good idea!). -- Greg From yselivanov.ml at gmail.com Wed Apr 29 07:47:44 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 01:47:44 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <55406EC5.7000101@canterbury.ac.nz> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55406EC5.7000101@canterbury.ac.nz> Message-ID: <55407080.7060808@gmail.com> Greg, On 2015-04-29 1:40 AM, Greg Ewing wrote: > Yury Selivanov wrote: > >> I don't want this: "await a() * b()" to be parsed, it's not meaningful. > > Why not? If a() is a coroutine that returns a number, > why shouldn't I be able to multiply it by something? Sorry, I thought you meant parsing "await a()*b()" as "await (a() * b())". 
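To spell out the grouping under the draft grammar (the ``** 2`` case is taken from the PEP's own precedence table; the others follow from the same rule, and the unary-minus case works in the reference implementation)::

    await a() * b()           # parsed as (await a()) * b()
    res = await coro() ** 2   # parsed as res = (await coro()) ** 2
    x = -await a()            # the minus applies to the awaited result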
From ncoghlan at gmail.com  Wed Apr 29 09:25:55 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 29 Apr 2015 17:25:55 +1000
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID:

On 29 April 2015 at 06:20, Adam Bartoš wrote:
> Hello,
>
> is it possible to somehow tell Python 2.7 to compile code entered in the
> interactive session with the flag PyCF_SOURCE_IS_UTF8 set? I'm considering
> adding support for Python 2 in my package
> (https://github.com/Drekin/win-unicode-console) and I have run into the
> fact that when u"α" is entered in the interactive session, it results in
> u"\xce\xb1" rather than u"\u03b1". As this seems to be a highly
> specialized question, I'm asking it here.
As far as I am aware, we don't have the equivalent of a "coding
cookie" for the interactive interpreter, so if anyone else knows how
to do it, I'll be learning something too :)

Cheers,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From drekin at gmail.com  Wed Apr 29 10:35:27 2015
From: drekin at gmail.com (Adam Bartoš)
Date: Wed, 29 Apr 2015 10:35:27 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID:

This situation is a bit different from coding cookies. They are used
when we have bytes from a source file but don't know their encoding.
During an interactive session the tokenizer always knows the encoding
of the bytes. I would think that in the case of an interactive session
PyCF_SOURCE_IS_UTF8 should always be set, so that bytes containing
encoded non-ASCII characters are interpreted correctly.

Why am I talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
u'\u03b1', but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'. I
understand that in the second case eval has no idea how the given
bytes are encoded. But the first case is actually implemented by
encoding to utf-8 and setting PyCF_SOURCE_IS_UTF8. That's why I'm
talking about the flag.

Regards,
Drekin

On Wed, Apr 29, 2015 at 9:25 AM, Nick Coghlan wrote:
> As far as I am aware, we don't have the equivalent of a "coding
> cookie" for the interactive interpreter, so if anyone else knows how
> to do it, I'll be learning something too :)

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:12:47 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:12:47 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <55401741.9040703@gmail.com>
References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com>
Message-ID: <5540A08F.50106@canterbury.ac.nz>

Yury Selivanov wrote:

> It's important to at least have 'iscoroutine' -- to check that
> the object is a coroutine function. A typical use-case would be
> a web framework that lets you bind coroutines to specific
> http methods/paths:
>
>     @http.get('/spam')
>     async def handle_spam(request):
>         ...
>
> The other thing is that it's easy to implement this function
> for CPython: just check for the CO_COROUTINE flag.

But isn't that too restrictive? Any function that returns
an awaitable object would work in the above case.

>>> One of the most frequent mistakes that people make when using
>>> generators as coroutines is forgetting to use ``yield from``::
>
> I think it's a mistake that a lot of beginners may make at some
> point (and in this sense it's frequent). I really doubt that once
> you have been hit by it more than a couple of times you would make
> it again.

What about when you change an existing non-suspendable
function to make it suspendable, and have to deal with
the ripple effects of that? Seems to me that affects
everyone, not just beginners.

>>> 3. ``yield from`` does not accept coroutine objects from plain Python
>>> generators (*not* generator-based coroutines.)
>>
>> What exactly are "coroutine objects from plain Python generators"?
>
>     # *Not* decorated with @coroutine
>     def some_algorithm_impl():
>         yield 1
>         yield from native_coroutine()  # <- this is a bug

So what you really mean is "yield-from, when used inside
a function that doesn't have @coroutine applied to it,
will not accept a coroutine object", is that right? If
so, I think this part needs re-wording, because it sounded
like you meant something quite different.

I'm not sure I like this -- it seems weird that applying
a decorator to a function should affect the semantics
of something *inside* the function -- especially a piece
of built-in syntax such as 'yield from'. It's similar
to the idea of replacing 'async def' with a decorator,
which you say you're against.

BTW, by "coroutine object", do you mean only objects
returned by an async def function, or any object having
an __await__ method? I think a lot of things would be
clearer if we could replace the term "coroutine object"
with "awaitable object" everywhere.

> ``yield from`` does not accept *native coroutine objects*
> from regular Python generators

It's the "from" there that's confusing -- it sounds
like you're talking about where the argument to
yield-from comes from, rather than where the yield-from
expression resides. In other words, we thought you were
proposing to disallow *this*:

    # *Not* decorated with @coroutine
    def some_algorithm_impl():
        yield 1
        yield from iterator_implemented_by_generator()

I hope you agree that this is a perfectly legitimate
thing to do, and that it should remain so?

--
Greg
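To make the rule under discussion concrete, a minimal sketch of what is and is not allowed (PEP 492 semantics as described in this thread; the function names are illustrative):

    import asyncio

    async def native_coroutine():
        return 1

    def plain_iterator_generator():
        yield 1

    def plain_generator():
        # Plain iteration keeps working: delegating to another
        # generator from an undecorated generator is fine.
        yield from plain_iterator_generator()
        # But delegating to a native coroutine object from here is
        # rejected under the PEP:
        # yield from native_coroutine()   # TypeError

    @asyncio.coroutine
    def generator_based_coroutine():
        # Marked with @coroutine, so delegating to a native coroutine
        # is allowed during the transition period.
        result = yield from native_coroutine()
        return result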
From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:12:49 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:12:49 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To:
References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <20150428235531.GP32422@stoneleaf.us>
Message-ID: <5540A091.9070408@canterbury.ac.nz>

Guido van Rossum wrote:
> There's a cost to __future__ imports
> too. The current proposal is a pretty clever hack -- and we've done
> similar hacks in the past

There's a benefit to having a __future__ import beyond avoiding
hackery: by turning on the __future__ you can find out what will
break when they become real keywords.

But I suppose that could be achieved by having both the hack *and*
the __future__ import available.

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:12:51 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:12:51 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55403937.7040409@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
Message-ID: <5540A093.2070808@canterbury.ac.nz>

Yury Selivanov wrote:
> Looking at the grammar -- the only downside of the current approach
> is that you can't do 'await await fut'. I still think that it reads
> better with parens. If we put 'await' to the 'factor' terminal we
> would allow
>
>     await -fut  # await (-fut)

Is there really a need to disallow that? It would take
a fairly bizarre API to make it meaningful in the first
place, but in any case, it's fairly clear what order
of operations is intended without the parens.

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:12:54 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:12:54 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To:
References: <553EF985.2070808@gmail.com>
Message-ID: <5540A096.70705@canterbury.ac.nz>

Guido van Rossum wrote:

> On Mon, Apr 27, 2015 at 8:07 PM, Yury Selivanov wrote:
>
>     Why StopAsyncIteration?
>     '''''''''''''''''''''''
>
> I keep wanting to propose to rename this to AsyncStopIteration.

+1, that seems more consistent to me too.

>     And since PEP 479 is accepted and enabled by default for coroutines,
>     the following example will have its ``StopIteration`` wrapped into a
>     ``RuntimeError``

I think that's a red herring in relation to the reason
for StopAsyncIteration/AsyncStopIteration being needed.
The real reason is that StopIteration is already being
used to signal returning a value from an async function,
so it can't also be used to signal the end of an async
iteration.

>     One of the most frequent mistakes that people make when using
>     generators as coroutines is forgetting to use ``yield from``::
>
>         @asyncio.coroutine
>         def useful():
>             asyncio.sleep(1)  # this will do nothing without 'yield from'
>
> Might be useful to point out that this was the one major advantage of
> PEP 3152 -- although it wasn't enough to save that PEP, and in your
> response you pointed out that this mistake is not all that common.
> Although you seem to disagree with that here ("One of the most frequent
> mistakes ...").

I think we need some actual evidence before we can claim
that one of these mistakes is more easily made than the
other. A priori, I would tend to assume that failing to
use 'await' when it's needed would be the more insidious
one.

If you mistakenly treat the return value of a function
as a future when it isn't one, you will probably find
out about it pretty quickly, even under the old regime,
since most functions don't return iterators.

On the other hand, consider refactoring a function that
was previously not a coroutine so that it now is. All
existing calls to that function now need to be located
and have either 'yield from' or 'await' put in front of
them. There are three possibilities:

1. The return value is not used. The destruction-before-
iterated-over heuristic will catch this (although since
it happens in a destructor, you won't get an exception
that propagates in the usual way).

2. Some operation is immediately performed on the return
value. Most likely this will fail, so you will find out
about the problem promptly and get a stack trace, although
the error message will be somewhat tangentially related
to the cause.

3. The return value is stored away for later use. Some
time later, an operation on it will fail, but it will no
longer be obvious where the mistake was made.

So it's all a bit of a mess, IMO. But maybe it's good
enough. We need data. How often have people been bitten
by this kind of problem, and how much trouble did it
cause them?
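For concreteness, a minimal sketch of the missing-await bug analyzed in the three cases above (PEP 492 syntax; names are illustrative):

    import asyncio

    async def fetch_value():
        await asyncio.sleep(0.1)
        return 42

    async def caller():
        v = fetch_value()        # bug: coroutine object, never awaited --
                                 # case 1: caught only by the destruction
                                 # heuristic, as a warning from a destructor
        # print(v + 1)           # case 2: an operation on it fails promptly
        w = await fetch_value()  # correct: suspends until the value is ready
        return w

    # asyncio.get_event_loop().run_until_complete(caller()) -> 42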
> Does send() make sense for a native coroutine? Check PEP 380. I think
> the only way to access the send() argument is by using ``yield`` but
> that's disallowed. Or is this about send() being passed to the ``yield``
> that ultimately suspends the chain of coroutines?

That's made me think of something else. Suppose you want
to suspend execution in an 'async def' function -- how do
you do that if 'yield' is not allowed? You may need
something like the suspend() primitive that I was thinking
of adding to PEP 3152.

>     No implicit wrapping in Futures
>     -------------------------------
>
>     There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A
>     key difference is that JavaScript "async functions" always return a
>     Promise. While this approach has some advantages, it also implies that
>     a new Promise object is created on each "async function" invocation.

I don't see how this is different from an 'async def'
function always returning an awaitable object, or a new
awaitable object being created on each 'async def'
function invocation. Sounds pretty much isomorphic to me.

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:12:58 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:12:58 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To:
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
Message-ID: <5540A09A.2000307@canterbury.ac.nz>

Guido van Rossum wrote:
> I don't care for await await x.

But do you dislike it enough to go out of your way to disallow it?

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:13:00 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:13:00 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <55404EC2.6040004@gmail.com>
References: <553EF985.2070808@gmail.com> <87oam7prf9.fsf@uwakimon.sk.tsukuba.ac.jp> <55404EC2.6040004@gmail.com>
Message-ID: <5540A09C.4080606@canterbury.ac.nz>

Yury Selivanov wrote:
> I also like Guido's suggestion to use the "native coroutine" term.
> I'll update the PEP (I have several branches of it in the repo
> that I need to merge before the rename).
I'd still prefer to avoid use of the word "coroutine" altogether, as
being far too overloaded. I think even the term "native coroutine"
leaves room for ambiguity. It's not clear to me whether you intend it
to refer only to functions declared with 'async def', or to any
function that returns an awaitable object.

The term "async function" seems like a clear and unambiguous way to
refer to the former. I'm not sure what to call the latter.

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:13:01 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:13:01 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <554059C5.9040303@gmail.com>
References: <553EF985.2070808@gmail.com> <55405713.3050001@canterbury.ac.nz> <554059C5.9040303@gmail.com>
Message-ID: <5540A09D.5010505@canterbury.ac.nz>

Yury Selivanov wrote:
>
> On 2015-04-28 11:59 PM, Greg wrote:
>
>> On 29/04/2015 9:49 a.m., Guido van Rossum wrote:
>>
>>> *But* every generator-based coroutine *must* be
>>> decorated with `asyncio.coroutine()`. This is
>>> potentially a backwards incompatible change.
>>>
>>> See below. I worry about backward compatibility. A lot. Are you saying
>>> that asyncio-based code that doesn't use @coroutine will break in 3.5?
>>
>> That seems unavoidable if the goal is for 'await' to only
>> work on generators that are intended to implement coroutines,
>
> Not sure what you mean by "unavoidable".

Guido is worried about existing asyncio-based code that
doesn't always decorate its generators with @coroutine.

If I understand correctly, if you have

    @coroutine
    def coro1():
        yield from coro2()

    def coro2():
        yield from ...

then coro1() would no longer work. In other words,
some currently legitimate asyncio-based code will break
under PEP 492 even if it doesn't use any PEP 492 features.

What you seem to be trying to do here is catch the
mistake of using a non-coroutine iterator as if it
were a coroutine. By "unavoidable" I mean I can't see
a way to achieve that in all possible permutations
without giving up some backward compatibility.

--
Greg

From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:13:03 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:13:03 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <5540656C.4060705@gmail.com>
References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <5540656C.4060705@gmail.com>
Message-ID: <5540A09F.8030709@canterbury.ac.nz>

Yury Selivanov wrote:

> 3. GenObject __iter__ and __next__ raise error
>    *only* if it has CO_NATIVE_COROUTINE flag. So
>    iter(), next(), for..in aren't supported only
>    for 'async def' functions (but will work ok
>    on asyncio generator-based coroutines)

What about new 'async def' code called by existing
code that expects to be able to use iter() or
next() on the future objects it receives?

> 4. 'yield from' *only* raises an error if it yields a
>    *coroutine with a CO_NATIVE_COROUTINE* flag
>    from a regular generator.

Won't that prevent some existing generator-based
coroutines (ones not decorated with @coroutine)
from calling ones implemented with 'async def'?

--
Greg
From greg.ewing at canterbury.ac.nz  Wed Apr 29 11:13:05 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 29 Apr 2015 21:13:05 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55407080.7060808@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55406EC5.7000101@canterbury.ac.nz> <55407080.7060808@gmail.com>
Message-ID: <5540A0A1.4010807@canterbury.ac.nz>

Yury Selivanov wrote:
> It's just like unary minus ;)

I'm confused. I thought you were arguing that your grammar is better
because it's *not* just like unary minus?

If being just like unary minus is okay, then why not just make it so?

--
Greg

From yselivanov.ml at gmail.com  Wed Apr 29 15:55:25 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 09:55:25 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5540A093.2070808@canterbury.ac.nz>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz>
Message-ID: <5540E2CD.6060101@gmail.com>

On 2015-04-29 5:12 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>> Looking at the grammar -- the only downside of the current approach
>> is that you can't do 'await await fut'. I still think that it reads
>> better with parens. If we put 'await' to the 'factor' terminal we
>> would allow
>>
>>     await -fut  # await (-fut)
>
> Is there really a need to disallow that? It would take
> a fairly bizarre API to make it meaningful in the first
> place, but in any case, it's fairly clear what order
> of operations is intended without the parens.

Greg, if the grammar can prevent this kind of mistake -- it should.

I like my current approach.

Yury

From yselivanov.ml at gmail.com  Wed Apr 29 16:01:10 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 10:01:10 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <5540A09F.8030709@canterbury.ac.nz>
References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <5540656C.4060705@gmail.com> <5540A09F.8030709@canterbury.ac.nz>
Message-ID: <5540E426.40305@gmail.com>

On 2015-04-29 5:13 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>
>> 3. GenObject __iter__ and __next__ raise error
>>    *only* if it has CO_NATIVE_COROUTINE flag. So
>>    iter(), next(), for..in aren't supported only
>>    for 'async def' functions (but will work ok
>>    on asyncio generator-based coroutines)
>
> What about new 'async def' code called by existing
> code that expects to be able to use iter() or
> next() on the future objects it receives?
>
>> 4. 'yield from' *only* raises an error if it yields a
>>    *coroutine with a CO_NATIVE_COROUTINE* flag
>>    from a regular generator.
>
> Won't that prevent some existing generator-based
> coroutines (ones not decorated with @coroutine)
> from calling ones implemented with 'async def'?

It would. But that's not a backwards compatibility issue. Everything
will work in 3.5 without a single line change. If you want to use the
new coroutines -- use them, and everything will work too.

If, however, during the refactoring you've missed several
generator-based coroutines *and* they are not decorated with
@coroutine -- then yes, you will get a runtime error. I see absolutely
no problem with that. It's a small price to pay for a better design.

Yury
From yselivanov.ml at gmail.com  Wed Apr 29 16:16:45 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 10:16:45 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <5540A08F.50106@canterbury.ac.nz>
References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <5540A08F.50106@canterbury.ac.nz>
Message-ID: <5540E7CD.1080203@gmail.com>

Greg,

On 2015-04-29 5:12 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>
>> It's important to at least have 'iscoroutine' -- to check that
>> the object is a coroutine function. A typical use-case would be
>> a web framework that lets you bind coroutines to specific
>> http methods/paths:
>>
>>     @http.get('/spam')
>>     async def handle_spam(request):
>>         ...
>>
>> The other thing is that it's easy to implement this function
>> for CPython: just check for the CO_COROUTINE flag.
>
> But isn't that too restrictive? Any function that returns
> an awaitable object would work in the above case.

It's just an example. All in all, I think that we should have full
coverage of Python objects in the inspect module. There are many
possible use cases besides the one that I used -- runtime
introspection, reflection, debugging, etc. -- where you might need
them.
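A sketch of the kind of runtime check Yury describes -- the routing decorator and registry are hypothetical, while ``inspect.iscoroutinefunction`` is among the functions the PEP lists:

    import inspect

    _routes = {}

    def get(path):
        # Hypothetical framework decorator: reject handlers that are
        # not declared with 'async def'.
        def decorator(handler):
            if not inspect.iscoroutinefunction(handler):
                raise TypeError('handler for %r must be a coroutine '
                                'function' % path)
            _routes[path] = handler
            return handler
        return decorator

    @get('/spam')
    async def handle_spam(request):
        ...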
>>>> One of the most frequent mistakes that people make when using
>>>> generators as coroutines is forgetting to use ``yield from``::
>>
>> I think it's a mistake that a lot of beginners may make at some
>> point (and in this sense it's frequent). I really doubt that once
>> you have been hit by it more than a couple of times you would make
>> it again.
>
> What about when you change an existing non-suspendable
> function to make it suspendable, and have to deal with
> the ripple effects of that? Seems to me that affects
> everyone, not just beginners.

I've been using coroutines on a daily basis for 6 or 7 years now; long
before asyncio, we had a coroutine-based framework at my firm (yield +
trampoline). Neither I nor my colleagues had any problems with
refactoring the code. I really try to speak from my experience when I
say that it's not that big of a problem.

Anyways, the PEP provides set_coroutine_wrapper, which should solve
the problem.

>>>> 3. ``yield from`` does not accept coroutine objects from plain Python
>>>> generators (*not* generator-based coroutines.)
>>>
>>> What exactly are "coroutine objects from plain Python generators"?
>>
>>     # *Not* decorated with @coroutine
>>     def some_algorithm_impl():
>>         yield 1
>>         yield from native_coroutine()  # <- this is a bug
>
> So what you really mean is "yield-from, when used inside
> a function that doesn't have @coroutine applied to it,
> will not accept a coroutine object", is that right?
>
> I'm not sure I like this -- it seems weird that applying
> a decorator to a function should affect the semantics
> of something *inside* the function -- especially a piece
> of built-in syntax such as 'yield from'. It's similar
> to the idea of replacing 'async def' with a decorator,
> which you say you're against.

This is for the transition period. We don't want to break existing
asyncio code. But we do want coroutines to be a separate concept from
generators. It doesn't make any sense to iterate through coroutines or
to yield-from them. We can deprecate the @coroutine decorator in 3.6
or 3.7 and at some point remove it.

> BTW, by "coroutine object", do you mean only objects
> returned by an async def function, or any object having
> an __await__ method? I think a lot of things would be
> clearer if we could replace the term "coroutine object"
> with "awaitable object" everywhere.

The PEP clearly separates awaitables from coroutine objects:

- a coroutine object is returned from a coroutine call;

- an awaitable is either a coroutine object or an object with
  __await__.

list(), tuple(), iter(), next(), for..in, etc. won't work on objects
with __await__ (unless they implement __iter__). The problem I was
discussing is specifically about 'yield from' and 'coroutine object'.

>> ``yield from`` does not accept *native coroutine objects*
>> from regular Python generators
>
> It's the "from" there that's confusing -- it sounds
> like you're talking about where the argument to
> yield-from comes from, rather than where the yield-from
> expression resides.
>
> I hope you agree that this is a perfectly legitimate
> thing to do, and that it should remain so?

Sure, it's perfectly normal ;) I apologize for the poor wording.

Yury

From yselivanov.ml at gmail.com  Wed Apr 29 16:18:41 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 10:18:41 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <5540A09D.5010505@canterbury.ac.nz>
References: <553EF985.2070808@gmail.com> <55405713.3050001@canterbury.ac.nz> <554059C5.9040303@gmail.com> <5540A09D.5010505@canterbury.ac.nz>
Message-ID: <5540E841.5030200@gmail.com>

Greg,

On 2015-04-29 5:13 AM, Greg Ewing wrote:
> What you seem to be trying to do here is catch the
> mistake of using a non-coroutine iterator as if it
> were a coroutine. By "unavoidable" I mean I can't see
> a way to achieve that in all possible permutations
> without giving up some backward compatibility.

Please see my reply to Guido in the other thread.

Yury
From stephen at xemacs.org  Wed Apr 29 16:24:53 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Wed, 29 Apr 2015 23:24:53 +0900
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <55404EC2.6040004@gmail.com>
References: <553EF985.2070808@gmail.com> <87oam7prf9.fsf@uwakimon.sk.tsukuba.ac.jp> <55404EC2.6040004@gmail.com>
Message-ID: <87iocfovuy.fsf@uwakimon.sk.tsukuba.ac.jp>

Yury Selivanov writes:

> Agree. Do you think it'd be better to combine comprehensions
> and async lambdas in one section?

No, I'd keep them separate. For one thing, given Guido's long-standing
lack of enthusiasm for lambda, they'll need to be separate PEPs.

From yselivanov.ml at gmail.com  Wed Apr 29 16:29:58 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 10:29:58 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; v3
In-Reply-To: <5540A096.70705@canterbury.ac.nz>
References: <553EF985.2070808@gmail.com> <5540A096.70705@canterbury.ac.nz>
Message-ID: <5540EAE6.3030402@gmail.com>

Greg,

On 2015-04-29 5:12 AM, Greg Ewing wrote:
> Guido van Rossum wrote:
>
>> Why StopAsyncIteration?
>> '''''''''''''''''''''''
>>
>> I keep wanting to propose to rename this to AsyncStopIteration.
>
> +1, that seems more consistent to me too.
>
> I think that's a red herring in relation to the reason
> for StopAsyncIteration/AsyncStopIteration being needed.
> The real reason is that StopIteration is already being
> used to signal returning a value from an async function,
> so it can't also be used to signal the end of an async
> iteration.

When we start thinking about generator-coroutines (the ones that
combine 'await' and 'async yield'-something), we'll have to somehow
multiplex them onto the existing generator object (at least that's one
way to do it). StopIteration is already extremely loaded with
different special meanings.

> That's made me think of something else. Suppose you want
> to suspend execution in an 'async def' function -- how do
> you do that if 'yield' is not allowed? You may need
> something like the suspend() primitive that I was thinking
> of adding to PEP 3152.

We do this in asyncio with Futures. We never combine 'yield' and
'yield from' in a @coroutine. We don't need 'suspend()'. If you need a
suspend()-like thing in your own framework, implement an object with
an __await__ method and await on it.
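A minimal sketch of such a Future-like object (it assumes an event loop that knows what to do with the value yielded from ``__await__``; asyncio's own Future works along these lines):

    class Suspend:
        # Future-like object: 'await Suspend()' suspends the coroutine
        # here; the framework's loop decides when to resume it.
        def __await__(self):
            yield self

    async def coro():
        await Suspend()   # a suspension point, with no 'yield' needed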
>> No implicit wrapping in Futures
>> -------------------------------
>>
>> There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A
>> key difference is that JavaScript "async functions" always return a
>> Promise. While this approach has some advantages, it also implies that
>> a new Promise object is created on each "async function" invocation.
>
> I don't see how this is different from an 'async def'
> function always returning an awaitable object, or a new
> awaitable object being created on each 'async def'
> function invocation. Sounds pretty much isomorphic to me.

Agree. I'll try to reword that section.

Thanks,
Yury

From victor.stinner at gmail.com  Wed Apr 29 16:59:01 2015
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 29 Apr 2015 16:59:01 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID:

On 29 Apr 2015 at 10:36, "Adam Bartoš" wrote:
> Why am I talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
> u'\u03b1', but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'.

There is a simple option to get this flag: call eval() with unicode,
not with encoded bytes.

Victor

From ethan at stoneleaf.us  Wed Apr 29 17:29:30 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Wed, 29 Apr 2015 08:29:30 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5540E2CD.6060101@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
Message-ID: <20150429152930.GA10248@stoneleaf.us>

On 04/29, Yury Selivanov wrote:
> On 2015-04-29 5:12 AM, Greg Ewing wrote:
>> Is there really a need to disallow that? It would take
>> a fairly bizarre API to make it meaningful in the first
>> place, but in any case, it's fairly clear what order
>> of operations is intended without the parens.
>
> Greg, if the grammar can prevent this kind of mistake -- it should.
> I like my current approach.

That's like saying we should always put parens around the number being
raised in n ** x because

    -2 ** 4 != (-2) ** 4

Please do not overuse parens. Python is not lisp, and await is not a
function, so parens should not be needed in the common case.

--
~Ethan~
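Ethan's precedence example, spelled out:

    assert -2 ** 4 == -(2 ** 4) == -16   # ** binds tighter than unary minus
    assert (-2) ** 4 == 16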
From yselivanov.ml at gmail.com  Wed Apr 29 17:33:55 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 11:33:55 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <20150429152930.GA10248@stoneleaf.us>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us>
Message-ID: <5540F9E3.9080805@gmail.com>

Ethan,

On 2015-04-29 11:29 AM, Ethan Furman wrote:
> Python is not lisp, and await is not a
> function, so parens should not be needed in the common case.

Which common case do you mean?

Please see this table:
https://www.python.org/dev/peps/pep-0492/#syntax-of-await-expression

Yury

From yselivanov.ml at gmail.com  Wed Apr 29 18:11:16 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 12:11:16 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5540F9E3.9080805@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
Message-ID: <554102A4.1040603@gmail.com>

Also please see this:
https://hg.python.org/peps/rev/d048458307b7

FWIW, 'await await fut' isn't something you're likely to see in real
life; 'await -fut' is 99.999% just a bug.

I'm not sure why Greg is pushing his grammar idea so aggressively.

Yury

From drekin at gmail.com  Wed Apr 29 18:19:49 2015
From: drekin at gmail.com (Adam Bartoš)
Date: Wed, 29 Apr 2015 18:19:49 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID:

Yes, that works for eval. But I want it for code entered during an
interactive session.

    >>> u'α'
    u'\xce\xb1'

The tokenizer gets b"u'\xce\xb1'" by calling PyOS_Readline, and it
knows the bytes are utf-8 encoded. But the result of evaluation is
u'\xce\xb1'. Because of how eval works, I believe that it would work
correctly if PyCF_SOURCE_IS_UTF8 were set, but it is not. That is why
I'm asking whether there is a way to set it. Also, my naive thought is
that it should always be set in the case of an interactive session.

On Wed, Apr 29, 2015 at 4:59 PM, Victor Stinner wrote:
> There is a simple option to get this flag: call eval() with unicode,
> not with encoded bytes.

From guido at python.org  Wed Apr 29 18:40:43 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 09:40:43 -0700
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID:

I suspect the interactive session is *not* always in UTF8. It probably
depends on the keyboard mapping of your terminal emulator. I imagine on
Windows it's the current code page.

--
--Guido van Rossum (python.org/~guido)
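A minimal sketch of Victor's suggestion for Python 2.7: decode the console bytes yourself and hand *unicode* to compile()/eval(). The helper name and the utf-8 default are assumptions; per Adam's own observation, unicode source is compiled with PyCF_SOURCE_IS_UTF8 set internally.

    # Python 2.7
    def eval_console_line(raw_line, console_encoding='utf-8'):
        # raw_line: bytes as read from the console (PyOS_Readline)
        source = raw_line.decode(console_encoding)  # unicode source
        return eval(compile(source, '<stdin>', 'eval'))

    # eval_console_line(b"u'\xce\xb1'") -> u'\u03b1', not u'\xce\xb1'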
From andrew.svetlov at gmail.com  Wed Apr 29 18:45:07 2015
From: andrew.svetlov at gmail.com (andrew.svetlov at gmail.com)
Date: Wed, 29 Apr 2015 09:45:07 -0700 (PDT)
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <554102A4.1040603@gmail.com>
References: <554102A4.1040603@gmail.com>
Message-ID: <1430325907287.1c1c1373@Nodemailer>

On Wednesday, Apr 29, 2015 at 7:14 PM, Yury Selivanov wrote:

> Also please see this:
> https://hg.python.org/peps/rev/d048458307b7
>
> FWIW, 'await await fut' isn't something you're likely to see in real
> life; 'await -fut' is 99.999% just a bug.

Agree.

--
Sent from Mailbox

From ethan at stoneleaf.us  Wed Apr 29 19:25:21 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Wed, 29 Apr 2015 10:25:21 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <5540F9E3.9080805@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
Message-ID: <20150429172521.GC10248@stoneleaf.us>

On 04/29, Yury Selivanov wrote:
> On 2015-04-29 11:29 AM, Ethan Furman wrote:
>> Python is not lisp, and await is not a
>> function, so parens should not be needed in the common case.
>
> Which common case do you mean?

The common case of not wanting to write parentheses. ;)

Seriously, why are the parens necessary? Given that

    await (-coro())

and the many examples above just work, it's a mystery why

    await -coro()

cannot also just work and be the same as the parenthesized version.

--
~Ethan~

From jimjjewett at gmail.com  Wed Apr 29 19:43:29 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Wed, 29 Apr 2015 10:43:29 -0700 (PDT)
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To:
Message-ID: <55411841.c5b3340a.1cf8.07e8@mx.google.com>

On Tue Apr 28 23:49:56 CEST 2015, Guido van Rossum quoted PEP 492:

> Rationale and Goals
> ===================
>
> Current Python supports implementing coroutines via generators (PEP
> 342), further enhanced by the ``yield from`` syntax introduced in PEP
> 380. This approach has a number of shortcomings:
>
> * it is easy to confuse coroutines with regular generators, since they
>   share the same syntax; async libraries often attempt to alleviate
>   this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);

So? PEP 492 never says what coroutines *are* in a way that explains
why it matters that they are different from generators.

Do you really mean "coroutines that can be suspended while they wait
for something slow"?

As best I can guess, the difference seems to be that a "normal"
generator is using yield primarily to say:

    "I'm not done; I have more values when you want them",

but an asynchronous (PEP 492) coroutine is primarily saying:

    "This might take a while, go ahead and do something else
    meanwhile."
> As shown later in this proposal, the new ``async
> with`` statement lets Python programs perform asynchronous calls when
> entering and exiting a runtime context, and the new ``async for``
> statement makes it possible to perform asynchronous calls in iterators.

Does it really permit *making* them, or does it just signal that you
will be waiting for them to finish processing anyhow, and that it
doesn't need to be a busy-wait?

As nearly as I can tell, "async with" doesn't start processing the
managed block until the "asynchronous" call finishes its work -- the
only point of the async is to signal a scheduler that the task is
blocked.

Similarly, "async for" is still linearized, with each step waiting
until the previous "asynchronous" step was not merely launched, but
fully processed. If anything, it *prevents* within-task parallelism.

> It uses the ``yield from`` implementation with an extra step of
> validating its argument. ``await`` only accepts an *awaitable*, which
> can be one of:

What justifies this limitation?

Is there anything wrong with awaiting something that eventually uses
"return" instead of "yield", if the "this might take a while" signal
is still true? Is the problem just that the current implementation
might not take proper advantage of task-switching?

> Objects with ``__await__`` method are called *Future-like* objects in
> the rest of this PEP.
>
> Also, please note that ``__aiter__`` method (see its definition
> below) cannot be used for this purpose. It is a different protocol,
> and would be like using ``__iter__`` instead of ``__call__`` for
> regular callables.
>
> It is a ``TypeError`` if ``__await__`` returns anything but an
> iterator.

What would be wrong if a class just did __await__ = __anext__?
If the problem is that the result of __await__ should be iterable,
then why isn't __await__ = __aiter__ OK?

> ``await`` keyword is defined differently from ``yield`` and ``yield
> from``. The main difference is that *await expressions* do not require
> parentheses around them most of the time.

Does that mean "The ``await`` keyword has slightly higher precedence
than ``yield``, so that fewer expressions require parentheses"?

>     class AsyncContextManager:
>         async def __aenter__(self):
>             await log('entering context')

Other than the arbitrary "keyword must be there" limitations imposed
by this PEP, how is that different from:

    class AsyncContextManager:
        async def __aenter__(self):
            log('entering context')

or even:

    class AsyncContextManager:
        def __aenter__(self):
            log('entering context')

Will anything different happen when calling __aenter__ or log?
Is it that log itself now has more freedom to let other tasks run
in the middle?

> It is an error to pass a regular context manager without ``__aenter__``
> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
> to use ``async with`` outside of a coroutine.

Why? Does that just mean they won't take advantage of the freedom
you offered them? Or are you concerned that they are more likely to
cooperate badly with the scheduler in practice?

> It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
> method to ``async for``. It is a ``SyntaxError`` to use ``async for``
> outside of a coroutine.

The same questions about why -- what is the harm?
> The following code illustrates the new asynchronous iteration protocol::
>
>     class Cursor:
>         def __init__(self):
>             self.buffer = collections.deque()
>
>         def _prefetch(self):
>             ...
>
>         async def __aiter__(self):
>             return self
>
>         async def __anext__(self):
>             if not self.buffer:
>                 self.buffer = await self._prefetch()
>                 if not self.buffer:
>                     raise StopAsyncIteration
>             return self.buffer.popleft()
>
> then the ``Cursor`` class can be used as follows::
>
>     async for row in Cursor():
>         print(row)

Again, I don't see what this buys you except that a scheduler has
been signaled that it is OK to pre-empt between rows. That is worth
signaling, but I don't see why a regular iterator should be
forbidden.

> For debugging this kind of mistake there is a special debug mode in
> asyncio, in which ``@coroutine`` decorator wraps all functions with a
> special object with a destructor logging a warning.
...
> The only problem is how to enable these debug capabilities. Since
> debug facilities should be a no-op in production mode, ``@coroutine``
> decorator makes the decision of whether to wrap or not to wrap based on
> an OS environment variable ``PYTHONASYNCIODEBUG``.

So the decision is made at compile-time, and can't be turned on later?
Then what is wrong with just offering an alternative @coroutine that
can be used to override the builtin?

Or why not just rely on set_coroutine_wrapper entirely, and simply
set it to None (so no wasted wrappings) by default?

-jJ

--
If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them. -jJ

From yselivanov.ml at gmail.com  Wed Apr 29 19:44:23 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 13:44:23 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <20150429172521.GC10248@stoneleaf.us>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us>
Message-ID: <55411877.7010608@gmail.com>

Ethan,

On 2015-04-29 1:25 PM, Ethan Furman wrote:
> cannot also just work and be the same as the parenthesized
> version.

Because it does not make any sense.

Yury

From phd at phdru.name  Wed Apr 29 19:40:12 2015
From: phd at phdru.name (Oleg Broytman)
Date: Wed, 29 Apr 2015 19:40:12 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID: <20150429174012.GA23918@phdru.name>

On Wed, Apr 29, 2015 at 09:40:43AM -0700, Guido van Rossum wrote:
> I suspect the interactive session is *not* always in UTF8. It probably
> depends on the keyboard mapping of your terminal emulator. I imagine on
> Windows it's the current code page.

Even worse: on w32 it can be an OEM codepage.

Oleg.
--
Oleg Broytman  http://phdru.name/  phd at phdru.name
Programmers don't die, they just GOSUB without RETURN.

From yselivanov.ml at gmail.com  Wed Apr 29 20:06:23 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 14:06:23 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
Message-ID: <55411D9F.2050400@gmail.com>

Hi Jim,

On 2015-04-29 1:43 PM, Jim J. Jewett wrote:
> On Tue Apr 28 23:49:56 CEST 2015, Guido van Rossum quoted PEP 492:
>
>> Rationale and Goals
>> ===================
>>
>> Current Python supports implementing coroutines via generators (PEP
>> 342), further enhanced by the ``yield from`` syntax introduced in PEP
>> 380. This approach has a number of shortcomings:
>>
>> * it is easy to confuse coroutines with regular generators, since they
>>   share the same syntax; async libraries often attempt to alleviate
>>   this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);
>
> So? PEP 492 never says what coroutines *are* in a way that explains
> why it matters that they are different from generators.
>
> Do you really mean "coroutines that can be suspended while they wait
> for something slow"?
>
> As best I can guess, the difference seems to be that a "normal"
> generator is using yield primarily to say:
>
>     "I'm not done; I have more values when you want them",
>
> but an asynchronous (PEP 492) coroutine is primarily saying:
>
>     "This might take a while, go ahead and do something else
>     meanwhile."

Correct.

>> As shown later in this proposal, the new ``async
>> with`` statement lets Python programs perform asynchronous calls when
>> entering and exiting a runtime context, and the new ``async for``
>> statement makes it possible to perform asynchronous calls in iterators.
>
> Does it really permit *making* them, or does it just signal that you
> will be waiting for them to finish processing anyhow, and it doesn't
> need to be a busy-wait?

It does.

> As nearly as I can tell, "async with" doesn't start processing the
> managed block until the "asynchronous" call finishes its work -- the
> only point of the async is to signal a scheduler that the task is
> blocked.

Right.

> Similarly, "async for" is still linearized, with each step waiting
> until the previous "asynchronous" step was not merely launched, but
> fully processed. If anything, it *prevents* within-task parallelism.

It enables cooperative parallelism.

>> It uses the ``yield from`` implementation with an extra step of
>> validating its argument. ``await`` only accepts an *awaitable*, which
>> can be one of:
>
> What justifies this limitation?

We want to avoid people passing regular generators and random objects
to 'await', because it is a bug.

> Is there anything wrong with awaiting something that eventually uses
> "return" instead of "yield", if the "this might take a while" signal
> is still true?

If it's an 'async def', then sure, you can use it in await.

> Is the problem just that the current implementation
> might not take proper advantage of task-switching?
>
>> Objects with ``__await__`` method are called *Future-like* objects in
>> the rest of this PEP.
>>
>> It is a ``TypeError`` if ``__await__`` returns anything but an
>> iterator.
>
> What would be wrong if a class just did __await__ = __anext__?
> If the problem is that the result of __await__ should be iterable,
> then why isn't __await__ = __aiter__ OK?

For coroutines in PEP 492:

    __await__ = __anext__ is the same as __call__ = __next__
    __await__ = __aiter__ is the same as __call__ = __iter__

>> ``await`` keyword is defined differently from ``yield`` and ``yield
>> from``. The main difference is that *await expressions* do not require
>> parentheses around them most of the time.
>
> Does that mean
>
> "The ``await`` keyword has slightly higher precedence than ``yield``,
> so that fewer expressions require parentheses"?
>>     class AsyncContextManager:
>>         async def __aenter__(self):
>>             await log('entering context')
>
> Other than the arbitrary "keyword must be there" limitations imposed
> by this PEP, how is that different from:
>
>     class AsyncContextManager:
>         async def __aenter__(self):
>             log('entering context')

This is OK. The point is that you can use 'await log' in __aenter__.
If you don't need awaits in __aenter__, you can use them in __aexit__.
If you don't need them there either, then just define a regular
context manager.

> or even:
>
>     class AsyncContextManager:
>         def __aenter__(self):
>             log('entering context')
>
> Will anything different happen when calling __aenter__ or log?
> Is it that log itself now has more freedom to let other tasks run
> in the middle?

__aenter__ must return an awaitable.

>> It is an error to pass a regular context manager without ``__aenter__``
>> and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError``
>> to use ``async with`` outside of a coroutine.
>
> Why? Does that just mean they won't take advantage of the freedom
> you offered them?

Not sure I understand the question. It doesn't make any sense to use
'async with' outside of a coroutine. The interpreter won't know what
to do with it: you need an event loop for that.

> Or are you concerned that they are more likely to
> cooperate badly with the scheduler in practice?
>
>> It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
>> method to ``async for``. It is a ``SyntaxError`` to use ``async for``
>> outside of a coroutine.
>
> The same questions about why -- what is the harm?
>
>> then the ``Cursor`` class can be used as follows::
>>
>>     async for row in Cursor():
>>         print(row)
>
> Again, I don't see what this buys you except that a scheduler has
> been signaled that it is OK to pre-empt between rows. That is worth
> signaling, but I don't see why a regular iterator should be
> forbidden.

It's not about signaling. It's about allowing cooperative scheduling
of long-running processes.

>> For debugging this kind of mistake there is a special debug mode in
>> asyncio, in which ``@coroutine`` decorator wraps all functions with a
>> special object with a destructor logging a warning.
>
> So the decision is made at compile-time, and can't be turned on later?
> Then what is wrong with just offering an alternative @coroutine that
> can be used to override the builtin?
>
> Or why not just rely on set_coroutine_wrapper entirely, and simply
> set it to None (so no wasted wrappings) by default?

It is set to None by default. Will clarify that in the PEP.

Thanks,
Yury
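Roughly what ``async for row in cursor`` expands to under the iteration protocol discussed above -- a sketch based on the PEP's ``__aiter__``/``__anext__``/``StopAsyncIteration`` definitions (at this stage of the proposal, ``__aiter__`` is itself awaited):

    async def print_rows(cursor):
        iterator = await cursor.__aiter__()
        while True:
            try:
                row = await iterator.__anext__()
            except StopAsyncIteration:
                break
            print(row)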
From p.f.moore at gmail.com  Wed Apr 29 20:26:05 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 29 Apr 2015 19:26:05 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
Message-ID:

On 29 April 2015 at 18:43, Jim J. Jewett wrote:
> So? PEP 492 never says what coroutines *are* in a way that explains
> why it matters that they are different from generators.

I agree. While I don't use coroutines/asyncio, and I may never do so,
I will say that I find Python's approach very difficult to understand.

I'd hope that the point of PEP 492, by making await/async first-class
language constructs, would be to make async programming more
accessible in Python. Whether that will actually be the case isn't
particularly clear to me. And whether "async programming" and
"coroutines" are the same thing, I'm even less sure of. I haven't
really followed the discussions here, because they seem to be about
details that are completely confusing to me.

In principle, I support the PEP, on the basis that working towards
better coroutine/async support in Python seems worthwhile to me. But
until the whole area is made more accessible to the average
programmer, I doubt any of this will be more than a niche area in
Python.

For example, the PEP says:

"""
New Coroutine Declaration Syntax

The following new syntax is used to declare a coroutine:

    async def read_data(db):
        pass
"""

Looking at the Wikipedia article on coroutines, I see an example of
how a producer/consumer process might be written with coroutines:

    var q := new queue

    coroutine produce
        loop
            while q is not full
                create some new items
                add the items to q
            yield to consume

    coroutine consume
        loop
            while q is not empty
                remove some items from q
                use the items
            yield to produce

(To start everything off, you'd just run "produce").

I can't even see how to relate that to PEP 492 syntax. I'm not allowed
to use "yield", so should I use "await consume" in produce (and vice
versa)? I'd actually expect to just write 2 generators in Python, and
use .send() somehow (it's clunky and I can never remember how to write
the calls, but that's OK, it just means that coroutines don't have
first-class syntax support in Python). This is totally unrelated to
asyncio, which is the core use case for all of Python's async support.
But it's what I think of when I see the word "coroutine" (and
Wikipedia agrees).

Searching for "async await" gets me to the Microsoft page
"Asynchronous Programming with Async and Await" describing the C#
keywords. That looks more like what PEP 492 is talking about, but it
uses the name "async method". Maybe that's what the PEP should do,
too, and leave the word "coroutine" for the yielding of control that I
quoted from Wikipedia above.

Confusedly,
Paul
PEP 3152, new round In-Reply-To: <55411877.7010608@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> Message-ID: <20150429183213.GD10248@stoneleaf.us> On 04/29, Yury Selivanov wrote: > On 2015-04-29 1:25 PM, Ethan Furman wrote: >> cannot also just work and be the same as the parenthesized >> version. > > Because it does not make any sense. I obviously don't understand your position that "it does not make any sense" -- perhaps you could explain a bit? What I see is a suspension point that is waiting for the results of coro(), which will be negated (and returned/assigned/whatever). What part of that doesn't make sense? -- ~Ethan~ From yselivanov.ml at gmail.com Wed Apr 29 20:42:34 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 14:42:34 -0400 Subject: [Python-Dev] PEP 492: What is the real goal? In-Reply-To: References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> Message-ID: <5541261A.9020909@gmail.com> Hi Paul, On 2015-04-29 2:26 PM, Paul Moore wrote: > On 29 April 2015 at 18:43, Jim J. Jewett wrote: >>> Rationale and Goals >>> =================== >>> >>> Current Python supports implementing coroutines via generators (PEP >>> 342), further enhanced by the ``yield from`` syntax introduced in PEP >>> 380. This approach has a number of shortcomings: >>> >>> * it is easy to confuse coroutines with regular generators, since they >>> share the same syntax; async libraries often attempt to alleviate >>> this by using decorators (e.g. ``@asyncio.coroutine`` [1]_); >> So? PEP 492 never says what coroutines *are* in a way that explains >> why it matters that they are different from generators. > I agree. While I don't use coroutines/asyncio, and I may never do so, > I will say that I find Python's approach very difficult to understand. > > I'd hope that the point of PEP 492, by making await/async first class > language constructs, would be to make async programming more > accessible in Python. Whether that will actually be the case isn't > particularly clear to me. And whether "async programming" and > "coroutines" are the same thing, I'm even less sure of. I haven't > really followed the discussions here, because they seem to be about > details that are completely confusing to me. It will make it more accessible in Python. asyncio is getting a lot of traction, and with this PEP accepted I can see it only becoming easier to work with it (or any other async frameworks that start using the new syntax/protocols). > > In principle, I support the PEP, on the basis that working towards > better coroutine/async support in Python seems worthwhile to me. But > until the whole area is made more accessible to the average > programmer, I doubt any of this will be more than a niche area in > Python. 
> > For example, the PEP says: > > """ > New Coroutine Declaration Syntax > > The following new syntax is used to declare a coroutine: > > async def read_data(db): > pass > """ > > Looking at the Wikipedia article on coroutines, I see an example of > how a producer/consumer process might be written with coroutines: > > var q := new queue > > coroutine produce > loop > while q is not full > create some new items > add the items to q > yield to consume > > coroutine consume > loop > while q is not empty > remove some items from q > use the items > yield to produce > > (To start everything off, you'd just run "produce"). > > I can't even see how to relate that to PEP 429 syntax. I'm not allowed > to use "yield", so should I use "await consume" in produce (and vice > versa)? I'd actually expect to just write 2 generators in Python, and > use .send() somehow (it's clunky and I can never remember how to write > the calls, but that's OK, it just means that coroutines don't have > first-class syntax support in Python). This is totally unrelated to > asyncio, which is the core use case for all of Python's async support. > But it's what I think of when I see the word "coroutine" (and > Wikipedia agrees). That Wikipedia page is very generic, and the pseudo-code that it uses does indeed look confusing. Here's how it might look like (this is the same pseudo-code but tailored for PEP 492, not a real something) q = asyncio.Queue(maxsize=100) async def produce(): # you might want to wrap it all in 'while True' while not q.full(): item = create_item() await q.put(item) async def consume(): while not q.empty(): item = await q.get() process_item(item) Thanks! Yury From p.f.moore at gmail.com Wed Apr 29 20:44:53 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 29 Apr 2015 19:44:53 +0100 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <20150429183213.GD10248@stoneleaf.us> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> Message-ID: On 29 April 2015 at 19:32, Ethan Furman wrote: > On 04/29, Yury Selivanov wrote: >> On 2015-04-29 1:25 PM, Ethan Furman wrote: >>> cannot also just work and be the same as the parenthesized >>> version. >> >> Because it does not make any sense. > > I obviously don't understand your position that "it does not make > any sense" -- perhaps you could explain a bit? > > What I see is a suspension point that is waiting for the results of > coro(), which will be negated (and returned/assigned/whatever). > What part of that doesn't make sense? Would that not be "-await coro()"? What "await -coro()" would mean is to call coro() (which would return something you can wait on), then apply - to that (then waiting on the result of that negation). But what does negating an awaitable object mean? Obviously you can *define* it to mean something, but you probably didn't. Paul From yselivanov.ml at gmail.com Wed Apr 29 20:46:14 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 14:46:14 -0400 Subject: [Python-Dev] PEP 492 vs. 
PEP 3152, new round In-Reply-To: <20150429183213.GD10248@stoneleaf.us> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> Message-ID: <554126F6.6050108@gmail.com> Hi Ethan, On 2015-04-29 2:32 PM, Ethan Furman wrote: > On 04/29, Yury Selivanov wrote: >> On 2015-04-29 1:25 PM, Ethan Furman wrote: >>> cannot also just work and be the same as the parenthesized >>> version. >> Because it does not make any sense. > I obviously don't understand your position that "it does not make > any sense" -- perhaps you could explain a bit? > > What I see is a suspension point that is waiting for the results of > coro(), which will be negated (and returned/assigned/whatever). > What part of that doesn't make sense? > Because you want operators to be resolved in the order you see them, generally. You want '(await -fut)' to: 1. Suspend on fut; 2. Get the result; 3. Negate it. This is a non-obvious thing. I would myself interpret it as: 1. Get fut.__neg__(); 2. await on it. So I want to make this syntactically incorrect: 'await -fut' would throw a SyntaxError. To do what you want, write a pythonic '- await fut'. Yury From njs at pobox.com Wed Apr 29 21:14:50 2015 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 29 Apr 2015 12:14:50 -0700 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <554126F6.6050108@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com> Message-ID: On Apr 29, 2015 11:49 AM, "Yury Selivanov" wrote: > > Hi Ethan, > > > On 2015-04-29 2:32 PM, Ethan Furman wrote: >> >> On 04/29, Yury Selivanov wrote: >>> >>> On 2015-04-29 1:25 PM, Ethan Furman wrote: >>>> >>>> cannot also just work and be the same as the parenthesized >>>> version. >>> >>> Because it does not make any sense. >> >> I obviously don't understand your position that "it does not make >> any sense" -- perhaps you could explain a bit? >> >> What I see is a suspension point that is waiting for the results of >> coro(), which will be negated (and returned/assigned/whatever). >> What part of that doesn't make sense? >> > > Because you want operators to be resolved in the > order you see them, generally. > > You want '(await -fut)' to: > > 1. Suspend on fut; > 2. Get the result; > 3. Negate it. > > This is a non-obvious thing. I would myself interpret it > as: > > 1. Get fut.__neg__(); > 2. await on it. > > So I want to make this syntactically incorrect: As a bystander, I don't really care either way about whether await -fut is syntactically valid (since like you say it's semantically nonsense regardless and no one will ever write it). But I would rather like to actually know what the syntax actually is, not just have a list of examples (which kinda gives me perl flashbacks). Is there any simple way to state what the rules for parsing await are? Or do I just have to read the parser code if I want to know that? 
(I suspect this may also be the impetus behind Greg's request that it
just be treated the same as unary minus. IMHO it matters much more
that the rules be predictable and teachable than that they allow or
disallow every weird edge case in exactly the right way.)

-n

From drekin at gmail.com Wed Apr 29 21:18:40 2015
From: drekin at gmail.com (Adam Bartoš)
Date: Wed, 29 Apr 2015 21:18:40 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: 
References: 
Message-ID: 

I am in Windows and my terminal isn't utf-8 at the beginning, but I
install custom sys.std* objects at runtime and I also install custom
readline hook, so the interactive loop gets the input from my stream
objects via PyOS_Readline. So when I enter u'α', the tokenizer gets
b"u'\xce\xb1'", which is the string encoded in utf-8, and
sys.stdin.encoding == 'utf-8'. However, the input is then interpreted
as u'\xce\xb1' instead of u'\u03b1'.

On Wed, Apr 29, 2015 at 6:40 PM, Guido van Rossum wrote:

> I suspect the interactive session is *not* always in UTF8. It probably
> depends on the keyboard mapping of your terminal emulator. I imagine in
> Windows it's the current code page.
>
> On Wed, Apr 29, 2015 at 9:19 AM, Adam Bartoš wrote:
>
>> Yes, that works for eval. But I want it for code entered during an
>> interactive session.
>>
>> >>> u'α'
>> u'\xce\xb1'
>>
>> The tokenizer gets b"u'\xce\xb1'" by calling PyOS_Readline and it knows
>> it's utf-8 encoded. But the result of evaluation is u'\xce\xb1'. Because of
>> how eval works, I believe that it would work correctly if the
>> PyCF_SOURCE_IS_UTF8 was set, but it is not. That is why I'm asking if there
>> is a way to set it. Also, my naive thought is that it should be always set
>> in the case of interactive session.
>>
>>
>> On Wed, Apr 29, 2015 at 4:59 PM, Victor Stinner
>> wrote:
>>
>>> Le 29 avr. 2015 10:36, "Adam Bartoš" a écrit :
>>> > Why I'm talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
>>> u'\u03b1' but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'.
>>>
>>> There is a simple option to get this flag: call eval() with unicode, not
>>> with encoded bytes.
>>>
>>> Victor
>>>
>>
>>
>
> --
> --Guido van Rossum (python.org/~guido)

From p.f.moore at gmail.com Wed Apr 29 21:19:40 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 29 Apr 2015 20:19:40 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541261A.9020909@gmail.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com>
Message-ID: 

On 29 April 2015 at 19:42, Yury Selivanov wrote:
> Here's how it might look like (this is the same pseudo-code
> but tailored for PEP 492, not a real something)
>
> q = asyncio.Queue(maxsize=100)
>
> async def produce():
>     # you might want to wrap it all in 'while True'

I think the "loop" in the Wikipedia pseudocode was intended to be the
"while True" here, not part of the "while" on the next line.

>
>     while not q.full():
>         item = create_item()
>         await q.put(item)
>
> async def consume():
>     while not q.empty():
>         item = await q.get()
>         process_item(item)

Thanks for that.
That does look pretty OK. One question, though - it uses an asyncio Queue. The original code would work just as well with a list, or more accurately, something that wasn't designed for async use. So the translation isn't completely equivalent. Also, can I run the produce/consume just by calling produce()? My impression is that with asyncio I need an event loop - which "traditional" coroutines don't need. Nevertheless, the details aren't so important, it was only a toy example anyway. However, just to make my point precise, here's a more or less direct translation of the Wikipedia code into Python. It doesn't actually work, because getting the right combinations of yield and send stuff is confusing to me. Specifically, I suspect that "yield produce.send(None)" isn't the right way to translate "yield to produce". But it gives the idea. data = [1,2,3,4,5,6,7,8,9,10] q = [] def produce(): while True: while len(q) < 10: if not data: return item = data.pop() print("In produce - got", item) q.append(item) yield consume.send(None) total = 0 def consume(): while True: while q: item = q.pop() print("In consume - handling", item) global total total += item yield produce.send(None) # Prime the coroutines produce = produce() consume = consume() next(produce) print(total) The *only* bits of this that are related to coroutines are: 1. yield consume.send(None) (and the same for produce) 2. produce = produce() (and the same for consume) priming the coroutines 3. next(produce) to start the coroutines I don't think this is at all related to PEP 492 (which is about async) but it's what is traditionally meant by coroutines. It would be nice to have a simpler syntax for these "traditional" coroutines, but it's a very niche requirement, and probably not worth it. But the use of "coroutine" in PEP 492 for the functions introduced by "async def" is confusing - at least to me - because I think of the above, and not of async. Why not just call them "async functions" and leave the term coroutine for the above flow control construct, which is where it originated? But maybe that ship has long sailed - the term "coroutine" is pretty entrenched in the asyncio documentation. If so, then I guess we have to live with the consequences. Paul From yselivanov.ml at gmail.com Wed Apr 29 21:27:24 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 15:27:24 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com> Message-ID: <5541309C.7090902@gmail.com> Hi Nathaniel, BTW, I'm going to reply to you in the other thread about context-local objects soon. I have some thoughts on the topic. On 2015-04-29 3:14 PM, Nathaniel Smith wrote: > On Apr 29, 2015 11:49 AM, "Yury Selivanov" wrote: >> Hi Ethan, >> >> >> On 2015-04-29 2:32 PM, Ethan Furman wrote: >>> On 04/29, Yury Selivanov wrote: >>>> On 2015-04-29 1:25 PM, Ethan Furman wrote: >>>>> cannot also just work and be the same as the parenthesized >>>>> version. >>>> Because it does not make any sense. >>> I obviously don't understand your position that "it does not make >>> any sense" -- perhaps you could explain a bit? 
>>> What I see is a suspension point that is waiting for the results of
>>> coro(), which will be negated (and returned/assigned/whatever).
>>> What part of that doesn't make sense?
>>>
>> Because you want operators to be resolved in the
>> order you see them, generally.
>>
>> You want '(await -fut)' to:
>>
>> 1. Suspend on fut;
>> 2. Get the result;
>> 3. Negate it.
>>
>> This is a non-obvious thing. I would myself interpret it
>> as:
>>
>> 1. Get fut.__neg__();
>> 2. await on it.
>>
>> So I want to make this syntactically incorrect:
> As a bystander, I don't really care either way about whether await -fut is
> syntactically valid (since like you say it's semantically nonsense
> regardless and no one will ever write it). But I would rather like to
> actually know what the syntax actually is, not just have a list of examples
> (which kinda gives me perl flashbacks). Is there any simple way to state
> what the rules for parsing await are? Or do I just have to read the parser
> code if I want to know that?

There is a summary of grammar changes in the PEP:
https://www.python.org/dev/peps/pep-0492/#grammar-updates

You can also see the actual grammar file from the reference
implementation:
https://github.com/1st1/cpython/blob/await/Grammar/Grammar

Thanks,
Yury

From yselivanov.ml at gmail.com Wed Apr 29 21:42:18 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 15:42:18 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com>
Message-ID: <5541341A.8090204@gmail.com>

Paul,

On 2015-04-29 3:19 PM, Paul Moore wrote:
> On 29 April 2015 at 19:42, Yury Selivanov wrote:
>> Here's how it might look like (this is the same pseudo-code
>> but tailored for PEP 492, not a real something)
>>
>> q = asyncio.Queue(maxsize=100)
>>
>> async def produce():
>>     # you might want to wrap it all in 'while True'
> I think the "loop" in the Wikipedia pseudocode was intended to be the
> "while True" here, not part of the "while" on the next line.
>
>>     while not q.full():
>>         item = create_item()
>>         await q.put(item)
>>
>> async def consume():
>>     while not q.empty():
>>         item = await q.get()
>>         process_item(item)
> Thanks for that. That does look pretty OK. One question, though - it
> uses an asyncio Queue. The original code would work just as well with
> a list, or more accurately, something that wasn't designed for async
> use. So the translation isn't completely equivalent. Also, can I run
> the produce/consume just by calling produce()? My impression is that
> with asyncio I need an event loop - which "traditional" coroutines
> don't need. Nevertheless, the details aren't so important, it was only
> a toy example anyway.

Well, yes. Coroutine is a generic term. And you can use PEP 492
coroutines without asyncio, in fact that's how most tests for the
reference implementation are written.

Coroutine objects have .send(), .throw() and .close() methods (same as
generator objects in Python). You can work with them without a loop,
but loop implementations contain a lot of logic to implement the
actual cooperative execution.

You can use generators as coroutines, and nothing would prevent you
from doing that after PEP 492, moreover, for some use-cases it might
be quite a good decision. But a lot of the code -- web frameworks,
network applications, etc. will hugely benefit from the proposal,
streamlined syntax and async for/with statements.

[..]
> > But the use of "coroutine" in PEP 492 for the functions introduced by
> > "async def" is confusing - at least to me - because I think of the
> > above, and not of async. Why not just call them "async functions" and
> > leave the term coroutine for the above flow control construct, which
> > is where it originated?
> >
> > But maybe that ship has long sailed - the term "coroutine" is pretty
> > entrenched in the asyncio documentation. If so, then I guess we have
> > to live with the consequences.

Everybody is pulling me in a different direction :)
Guido proposed to call them "native coroutines". Some people
think that "async functions" is a better name. Greg loves
his "cofunction" term.

I'm flexible about how we name 'async def' functions. I like
to call them "coroutines", because that's what they are, and
that's how asyncio calls them. It's also convenient to use
'coroutine-object' to explain what the result of calling a
coroutine is.

Anyways, I'd be OK to start using a new term, if "coroutine" is
confusing.

Thanks,
Yury

From skip.montanaro at gmail.com Wed Apr 29 22:14:22 2015
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Wed, 29 Apr 2015 15:14:22 -0500
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541341A.8090204@gmail.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: 

On Wed, Apr 29, 2015 at 2:42 PM, Yury Selivanov wrote:

> Anyways, I'd be OK to start using a new term, if "coroutine" is
> confusing.
>

According to Wikipedia, the term "coroutine" was first coined in 1958, so
several generations of computer science graduates will be familiar with
the textbook definition. If your use of "coroutine" matches the textbook
definition of the term, I think you should continue to use it instead of
inventing new names which will just confuse people new to Python.

Skip

From njs at pobox.com Wed Apr 29 22:19:14 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 29 Apr 2015 13:19:14 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: 

On Wed, Apr 29, 2015 at 1:14 PM, Skip Montanaro wrote:
>
> On Wed, Apr 29, 2015 at 2:42 PM, Yury Selivanov
> wrote:
>>
>> Anyways, I'd be OK to start using a new term, if "coroutine" is
>> confusing.
>
>
> According to Wikipedia, the term "coroutine" was first coined in 1958, so
> several generations of computer science graduates will be familiar with the
> textbook definition. If your use of "coroutine" matches the textbook
> definition of the term, I think you should continue to use it instead of
> inventing new names which will just confuse people new to Python.

IIUC the problem is that Python has or will have a number of different
things that count as coroutines by that classic CS definition,
including generators, "async def" functions, and in general any object
that implements the same set of methods as one or both of these
objects, or possibly inherits from a certain abstract base class. It
would be useful to have some terms to refer specifically to async def
functions and the await protocol as opposed to generators and the
iterator protocol, and "coroutine" does not make this distinction.

-n

--
Nathaniel J.
Smith -- http://vorpus.org

From guido at python.org Wed Apr 29 22:30:58 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 13:30:58 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: 

Maybe it would help to refer to PEP 342, which first formally introduced
the concept of coroutines (as a specific use case of generators) in
Python. Personally I don't care too much which term the PEP uses, as long
as it defines its terms. The motivation is already clear to me; it's the
details that I care about before approving this PEP.

On Wed, Apr 29, 2015 at 1:19 PM, Nathaniel Smith wrote:

> On Wed, Apr 29, 2015 at 1:14 PM, Skip Montanaro wrote:
> >
> > On Wed, Apr 29, 2015 at 2:42 PM, Yury Selivanov
> > wrote:
> >>
> >> Anyways, I'd be OK to start using a new term, if "coroutine" is
> >> confusing.
> >
> >
> > According to Wikipedia, the term "coroutine" was first coined in 1958, so
> > several generations of computer science graduates will be familiar with
> the
> > textbook definition. If your use of "coroutine" matches the textbook
> > definition of the term, I think you should continue to use it instead of
> > inventing new names which will just confuse people new to Python.
>
> IIUC the problem is that Python has or will have a number of different
> things that count as coroutines by that classic CS definition,
> including generators, "async def" functions, and in general any object
> that implements the same set of methods as one or both of these
> objects, or possibly inherits from a certain abstract base class. It
> would be useful to have some terms to refer specifically to async def
> functions and the await protocol as opposed to generators and the
> iterator protocol, and "coroutine" does not make this distinction.
>
> -n
>
> --
> Nathaniel J. Smith -- http://vorpus.org

--
--Guido van Rossum (python.org/~guido)

From p.f.moore at gmail.com Thu Apr 30 00:02:23 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 29 Apr 2015 23:02:23 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541341A.8090204@gmail.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
 <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: 

On 29 April 2015 at 20:42, Yury Selivanov wrote:
> Everybody is pulling me in a different direction :)

Sorry :-)

> Guido proposed to call them "native coroutines". Some people
> think that "async functions" is a better name. Greg loves
> his "cofunction" term.

If it helps, ignore my opinion - I'm not a heavy user of coroutines or
asyncio, so my view shouldn't have too much weight. Thanks for your
response - my question was a little off-topic, but your reply has made
things clearer for me.

Paul

From yselivanov.ml at gmail.com Thu Apr 30 00:11:51 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 18:11:51 -0400
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round In-Reply-To: <75c09142bded.5540e9b9@wiscmail.wisc.edu> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com> <76d08c5febac.55412df3@wiscmail.wisc.edu> <75c0c6e0c543.55412e2f@wiscmail.wisc.edu> <75c0d79a9efa.55412ea7@wiscmail.wisc.edu> <75c086a7c44b.55412ee4@wiscmail.wisc.edu> <75c0e3aec47e.55412f20@wiscmail.wisc.edu> <75c08a2181ed.55412f5c@wiscmail.wisc.edu> <7720ec97fe17.55412f99@wiscmail.wisc.edu> <7720e702fb35.55412fd5@wiscmail.wisc.edu> <75c09142bded.5540e9b9@wiscmail.wisc.edu> Message-ID: <55415727.6080402@gmail.com> On 2015-04-29 3:24 PM, Isaac Schwabacher wrote: > On 15-04-29, Yury Selivanov wrote: >> Hi Ethan, [..] >> So I want to make this syntactically incorrect: > Does this need to be a syntax error? -"hello" raises TypeError because str doesn't have a __neg__, but there's no reason a str subclass couldn't define one. "TypeError: bad operand type for unary -: 'asyncio.Future'" is enough to clear up any misunderstandings, and if someone approaching a new language construct doesn't test their code well enough to at least execute all the code paths, the difference between a compile-time SyntaxError and a run-time TypeError is not going to save them. > The grammar of the language should match the most common use case. FWIW, I've just updated the pep with a precedence table: https://hg.python.org/peps/rev/d355918bc0d7 Yury From greg.ewing at canterbury.ac.nz Thu Apr 30 00:46:16 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 30 Apr 2015 10:46:16 +1200 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <5540E426.40305@gmail.com> References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <5540656C.4060705@gmail.com> <5540A09F.8030709@canterbury.ac.nz> <5540E426.40305@gmail.com> Message-ID: <55415F38.7020003@canterbury.ac.nz> Yury Selivanov wrote: >> Won't that prevent some existing generator-based >> coroutines (ones not decorated with @coroutine) >> from calling ones implemented with 'async def'? >> > It would. But that's not a backwards compatibility > issue. It seems to go against Guido's desire for the new way to be a 100% drop-in replacement for the old way. There are various ways that old code can end up calling new code -- subclassing, callbacks, etc. It also means that if person A writes a library in the new style, then person B can't make use of it without upgrading all of their code to the new style as well. The new style will thus be "infectious" in a sense. I suppose it's up to Guido to decide whether it's a good or bad infection. But the same kind of reasoning seemed to be at least partly behind the rejection of PEP 3152. -- Greg From greg.ewing at canterbury.ac.nz Thu Apr 30 00:46:55 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 30 Apr 2015 10:46:55 +1200 Subject: [Python-Dev] PEP 492 vs. 
PEP 3152, new round In-Reply-To: <554102A4.1040603@gmail.com> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> Message-ID: <55415F5F.3090101@canterbury.ac.nz> Yury Selivanov wrote: > I'm not sure > why Greg is pushing his Grammar idea so aggressively. Because I believe that any extra complexity in the grammar needs a very strong justification. It's complexity in the core language, like a new keyword, so it puts a burden on everyone's brain. Saying "I don't think anyone would ever need to write this, therefore we should disallow it" is not enough, given that there is a substantial cost to disallowing it. If you don't think there's a cost, consider that we *both* seem to be having trouble predicting the consequences of your proposed syntax, and you're the one who invented it. That's not a good sign! -- Greg From yselivanov.ml at gmail.com Thu Apr 30 00:54:49 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 18:54:49 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: <55415F5F.3090101@canterbury.ac.nz> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz> Message-ID: <55416139.1000004@gmail.com> Greg, On 2015-04-29 6:46 PM, Greg Ewing wrote: > Yury Selivanov wrote: >> I'm not sure >> why Greg is pushing his Grammar idea so aggressively. > > Because I believe that any extra complexity in the grammar > needs a very strong justification. It's complexity in the > core language, like a new keyword, so it puts a burden on > everyone's brain. > > Saying "I don't think anyone would ever need to write this, > therefore we should disallow it" is not enough, given that > there is a substantial cost to disallowing it. > > If you don't think there's a cost, consider that we *both* > seem to be having trouble predicting the consequences of > your proposed syntax, and you're the one who invented it. > That's not a good sign! Sorry, but I'm not sure where & when I had any troubles predicting the consequences.. Please take a look at the updated PEP. There is a precedence table there + motivation. Yury From yselivanov.ml at gmail.com Thu Apr 30 00:58:43 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 18:58:43 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; v3 In-Reply-To: <55415F38.7020003@canterbury.ac.nz> References: <553EF985.2070808@gmail.com> <55401741.9040703@gmail.com> <5540656C.4060705@gmail.com> <5540A09F.8030709@canterbury.ac.nz> <5540E426.40305@gmail.com> <55415F38.7020003@canterbury.ac.nz> Message-ID: <55416223.30705@gmail.com> Greg, On 2015-04-29 6:46 PM, Greg Ewing wrote: > Yury Selivanov wrote: > >>> Won't that prevent some existing generator-based >>> coroutines (ones not decorated with @coroutine) >>> from calling ones implemented with 'async def'? >>> >> It would. But that's not a backwards compatibility >> issue. > > It seems to go against Guido's desire for the new > way to be a 100% drop-in replacement for the old > way. 
> There are various ways that old code can end
> up calling new code -- subclassing, callbacks,
> etc.
>
> It also means that if person A writes a library
> in the new style, then person B can't make use
> of it without upgrading all of their code to the
> new style as well. The new style will thus be
> "infectious" in a sense.
>
> I suppose it's up to Guido to decide whether it's
> a good or bad infection. But the same kind of
> reasoning seemed to be at least partly behind
> the rejection of PEP 3152.
>

It's a drop-in replacement ;) If you run your existing code - it will
100% work just fine.

There is a probability that *when* you start applying new syntax
something could go wrong -- you're right here. I'm updating the PEP to
explain this clearly, and let's see what Guido thinks about that.

My opinion is that this is a solvable problem with clear guidelines on
how to transition existing code to the new style.

Thanks,
Yury

From guido at python.org Thu Apr 30 00:58:51 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 15:58:51 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55415F5F.3090101@canterbury.ac.nz>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
Message-ID: 

On Wed, Apr 29, 2015 at 3:46 PM, Greg Ewing wrote:

> Yury Selivanov wrote:
>
>> I'm not sure
>> why Greg is pushing his Grammar idea so aggressively.
>>
>
> Because I believe that any extra complexity in the grammar
> needs a very strong justification. It's complexity in the
> core language, like a new keyword, so it puts a burden on
> everyone's brain.
>
> Saying "I don't think anyone would ever need to write this,
> therefore we should disallow it" is not enough, given that
> there is a substantial cost to disallowing it.
>
> If you don't think there's a cost, consider that we *both*
> seem to be having trouble predicting the consequences of
> your proposed syntax, and you're the one who invented it.
> That's not a good sign!
>

I have a slightly different view. A bunch of things *must* work, e.g.
f(await g(), await h()) or with await f(): (there's a longer list in the
PEP). Other things may be ambiguous to most readers, e.g. what does await
f() + g() mean, or can we say await await f(), and the solution is to
recommend adding parentheses that make things clear to the parser *and*
humans. Yury's proposal satisfies my requirements, and if we really find
some unpleasant edge case we can fix it during the 3.5 release (the PEP
will be provisional).

--
--Guido van Rossum (python.org/~guido)

From njs at pobox.com Thu Apr 30 01:35:47 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 29 Apr 2015 16:35:47 -0700
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round
In-Reply-To: <55415F5F.3090101@canterbury.ac.nz>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
Message-ID: 

On Wed, Apr 29, 2015 at 3:46 PM, Greg Ewing wrote:
> Yury Selivanov wrote:
>>
>> I'm not sure
>> why Greg is pushing his Grammar idea so aggressively.
>
>
> Because I believe that any extra complexity in the grammar
> needs a very strong justification. It's complexity in the
> core language, like a new keyword, so it puts a burden on
> everyone's brain.
>
> Saying "I don't think anyone would ever need to write this,
> therefore we should disallow it" is not enough, given that
> there is a substantial cost to disallowing it.
>
> If you don't think there's a cost, consider that we *both*
> seem to be having trouble predicting the consequences of
> your proposed syntax, and you're the one who invented it.
> That's not a good sign!

FWIW, now that I've seen the precedence table in the updated PEP, it
seems really natural to me:
https://www.python.org/dev/peps/pep-0492/#updated-operator-precedence-table

According to that, "await" is just a prefix operator that binds more
tightly than any arithmetic operation, but less tightly than
indexing/funcall/attribute lookup, which seems about right.

However, if what I just wrote were true, then that would mean that
"await -foo" and "await await foo" would be syntactically legal
(though useless). The fact that they apparently are *not* legal means
that in fact there is still some weird thing going on in the syntax
that I don't understand. And the PEP gives no further details, it just
suggests I go read the parser generator source.

My preference would be that the PEP be updated so that my one-sentence
summary above became correct. But like Guido, I don't necessarily care
about the exact details all that much. What I do feel strongly about
is that whatever syntax we end up with, there should be *some*
accurate human-readable description of *what it is*. AFAICT the PEP
currently doesn't have that.

-n

--
Nathaniel J. Smith -- http://vorpus.org

From yselivanov.ml at gmail.com Thu Apr 30 01:48:15 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 19:48:15 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: 
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
Message-ID: <55416DBF.3020303@gmail.com>

Nathaniel,

On 2015-04-29 7:35 PM, Nathaniel Smith wrote:
> What I do feel strongly about
> is that whatever syntax we end up with, there should be *some*
> accurate human-readable description of *what it is*. AFAICT the PEP
> currently doesn't have that.

How to define human-readable description of how unary
minus operator works?

Yury

From guido at python.org Thu Apr 30 01:52:58 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 16:52:58 -0700
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round
In-Reply-To: <55416DBF.3020303@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <55416DBF.3020303@gmail.com>
Message-ID: 

On Wed, Apr 29, 2015 at 4:48 PM, Yury Selivanov wrote:

> Nathaniel,
>
> On 2015-04-29 7:35 PM, Nathaniel Smith wrote:
>
>> What I do feel strongly about
>> is that whatever syntax we end up with, there should be *some*
>> accurate human-readable description of *what it is*. AFAICT the PEP
>> currently doesn't have that.
>>
>
> How to define human-readable description of how unary
> minus operator works?
>

In a PEP you should probably give a grammar that is not the actual
grammar from the implementation, but matches the grammar used in the
reference manual on docs.python.org.

--
--Guido van Rossum (python.org/~guido)

From njs at pobox.com Thu Apr 30 01:58:59 2015
From: njs at pobox.com (Nathaniel Smith)
Date: Wed, 29 Apr 2015 16:58:59 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55416DBF.3020303@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <55416DBF.3020303@gmail.com>
Message-ID: 

On Wed, Apr 29, 2015 at 4:48 PM, Yury Selivanov wrote:
> Nathaniel,
>
> On 2015-04-29 7:35 PM, Nathaniel Smith wrote:
>>
>> What I do feel strongly about
>> is that whatever syntax we end up with, there should be *some*
>> accurate human-readable description of *what it is*. AFAICT the PEP
>> currently doesn't have that.
>
> How to define human-readable description of how unary
> minus operator works?

Hah, good question :-). Of course we all learned how to parse
arithmetic in school, so perhaps it's a bit cheating to refer to that
knowledge. Except of course basically all our users *do* have that
knowledge (or else are forced to figure it out anyway). So I would be
happy with a description of "await" that just says "it's like unary
minus but higher precedence".

Even if we put aside our trained intuitions about arithmetic, I think
it's correct to say that the way unary minus is parsed is: everything
to the right of it that has a tighter precedence gets collected up and
parsed as an expression, and then it takes that expression as its
argument. Still pretty simple.

--
Nathaniel J. Smith -- http://vorpus.org

From yselivanov.ml at gmail.com Thu Apr 30 02:04:37 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 20:04:37 -0400
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round In-Reply-To: References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz> <55416DBF.3020303@gmail.com> Message-ID: <55417195.8090209@gmail.com> Guido, On 2015-04-29 7:52 PM, Guido van Rossum wrote: > On Wed, Apr 29, 2015 at 4:48 PM, Yury Selivanov > wrote: > >> Nathaniel, >> >> On 2015-04-29 7:35 PM, Nathaniel Smith wrote: >> >>> What I do feel strongly about >>> is that whatever syntax we end up with, there should be*some* >>> accurate human-readable description of*what it is*. AFAICT the PEP >>> currently doesn't have that. >>> >> How to define human-readable description of how unary >> minus operator works? >> > In a PEP you should probably give grammar that is not the actual grammar > from the implementation, but matches the grammar used in the reference > manual on docs.python.org. > Will do! Thanks, Yury From yselivanov.ml at gmail.com Thu Apr 30 02:05:51 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 20:05:51 -0400 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz> <55416DBF.3020303@gmail.com> Message-ID: <554171DF.2030906@gmail.com> Nathaniel, On 2015-04-29 7:58 PM, Nathaniel Smith wrote: > On Wed, Apr 29, 2015 at 4:48 PM, Yury Selivanov wrote: >> Nathaniel, >> >> On 2015-04-29 7:35 PM, Nathaniel Smith wrote: >>> What I do feel strongly about >>> is that whatever syntax we end up with, there should be*some* >>> accurate human-readable description of*what it is*. AFAICT the PEP >>> currently doesn't have that. >> How to define human-readable description of how unary >> minus operator works? > Hah, good question :-). Of course we all learned how to parse > arithmetic in school, so perhaps it's a bit cheating to refer to that > knowledge. Except of course basically all our users *do* have that > knowledge (or else are forced to figure it out anyway). So I would be > happy with a description of "await" that just says "it's like unary > minus but higher precedence". > > Even if we put aside our trained intuitions about arithmetic, I think > it's correct to say that the way unary minus is parsed is: everything > to the right of it that has a tighter precedence gets collected up and > parsed as an expression, and then it takes that expression as its > argument. Still pretty simple. > Well, await is defined exactly like that ;) Anyways, I'll follow Guido's suggestion to define await in the PEP the same way we define other syntax in python docs. Yury From njs at pobox.com Thu Apr 30 02:14:09 2015 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 29 Apr 2015 17:14:09 -0700 Subject: [Python-Dev] PEP 492 vs. 
PEP 3152, new round
In-Reply-To: <554171DF.2030906@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz>
 <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
 <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
 <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
 <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz>
 <55416DBF.3020303@gmail.com> <554171DF.2030906@gmail.com>
Message-ID: 

On Wed, Apr 29, 2015 at 5:05 PM, Yury Selivanov wrote:
> Nathaniel,
>
> On 2015-04-29 7:58 PM, Nathaniel Smith wrote:
>>
>> On Wed, Apr 29, 2015 at 4:48 PM, Yury Selivanov
>> wrote:
>>>
>>> Nathaniel,
>>>
>>> On 2015-04-29 7:35 PM, Nathaniel Smith wrote:
>>>>
>>>> What I do feel strongly about
>>>> is that whatever syntax we end up with, there should be *some*
>>>> accurate human-readable description of *what it is*. AFAICT the PEP
>>>> currently doesn't have that.
>>>
>>> How to define human-readable description of how unary
>>> minus operator works?
>>
>> Hah, good question :-). Of course we all learned how to parse
>> arithmetic in school, so perhaps it's a bit cheating to refer to that
>> knowledge. Except of course basically all our users *do* have that
>> knowledge (or else are forced to figure it out anyway). So I would be
>> happy with a description of "await" that just says "it's like unary
>> minus but higher precedence".
>>
>> Even if we put aside our trained intuitions about arithmetic, I think
>> it's correct to say that the way unary minus is parsed is: everything
>> to the right of it that has a tighter precedence gets collected up and
>> parsed as an expression, and then it takes that expression as its
>> argument. Still pretty simple.
>
> Well, await is defined exactly like that ;)

So you're saying that "await -fut" and "await await fut" are actually
legal syntax after all, contra what the PEP says? Because "- -fut" is
totally legal syntax, so if await and unary minus work the same...

(Again I don't care about those examples in their own right, I just
find it frustrating that I can't answer these questions without asking
you each time.)

-n

--
Nathaniel J. Smith -- http://vorpus.org

From ethan at stoneleaf.us Thu Apr 30 02:21:47 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Wed, 29 Apr 2015 17:21:47 -0700
Subject: [Python-Dev] PEP 492 quibble and request
Message-ID: <20150430002147.GE10248@stoneleaf.us>

From the PEP:

> Why not a __future__ import
>
> __future__ imports are inconvenient and easy to forget to add.

That is a horrible rationale for not using an import. By that logic we
should have everything in built-ins. ;)

> Working example
> ...

The working example only uses async def and await, not async with nor
async for nor __aenter__, etc., etc.

Could you put in a more complete example -- maybe a basic chat room
with both server and client -- that demonstrates more of the new
possibilities?

Having gone through the PEP again, I am still no closer to understanding
what happens here:

    data = await reader.read(8192)

What does the flow of control look like at the interpreter level?
-- ~Ethan~ From yselivanov.ml at gmail.com Thu Apr 30 02:33:09 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 20:33:09 -0400 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: <20150430002147.GE10248@stoneleaf.us> References: <20150430002147.GE10248@stoneleaf.us> Message-ID: <55417845.3060507@gmail.com> Hi Ethan, On 2015-04-29 8:21 PM, Ethan Furman wrote: > From the PEP: > >> Why not a __future__ import >> >> __future__ imports are inconvenient and easy to forget to add. > That is a horrible rationale for not using an import. By that logic we > should have everything in built-ins. ;) > > >> Working example >> ... > The working example only uses async def and await, not async with > nor async for nor __aenter__, etc., etc. > > Could you put in a more complete example -- maybe a basic chat room > with both server and client -- that demonstrated more of the new > possibilities? Andrew Svetlov has implemented some new features in his aiomysql driver: https://github.com/aio-libs/aiomysql/blob/await/tests/test_async_iter.py I don't want to cite it in the PEP because it's not complete yet, and some idioms (like 'async with') aren't used to their full potential. > > Having gone through the PEP again, I am still no closer to understanding > what happens here: > > data = await reader.read(8192) > > What does the flow of control look like at the interpreter level? 'await' is semantically equivalent to 'yield from' in this line. To really understand all implementation details of this line you need to read PEP 3156 and experiment with asyncio. There is no easier way, unfortunately. I can't add a super detailed explanation how event loops can be implemented in PEP 492, that's not in its scope. The good news is that to use asyncio on a daily basis you don't need to know all details, as you don't need to know how 'ceval.c' works and how 'setTimeout' is implemented in JavaScript. Yury From ncoghlan at gmail.com Thu Apr 30 02:33:17 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 30 Apr 2015 10:33:17 +1000 Subject: [Python-Dev] Clarification of PEP 476 "opting out" section Message-ID: Hi folks, This is just a note to highlight the fact that I tweaked the "Opting out" section in PEP 476 based on various discussions I've had over the past few months: https://hg.python.org/peps/rev/dfd96ee9d6a8 The notable changes: * the example monkeypatching code handles AttributeError when looking up "ssl._create_unverified_context", in order to accommodate older versions of Python that don't have PEP 476 implemented * new paragraph making it clearer that while the intended use case for the monkeypatching trick is as a workaround to handle environments where you *know* HTTPS certificate verification won't work properly (including explicit references to sitecustomize.py and Standard Operating Environments for Python), there's also a secondary use case in allowing applications to provide a system administrator controlled setting to globally disable certificate verification (hence the change to the example code) * new paragraph making it explicit that even though we've improved Python's default behaviour, particularly security sensitive applications should still provide their own context rather than relying on the defaults Regards, Nick. 
--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From ncoghlan at gmail.com Thu Apr 30 02:59:13 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 30 Apr 2015 10:59:13 +1000
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <20150430002147.GE10248@stoneleaf.us>
References: <20150430002147.GE10248@stoneleaf.us>
Message-ID: 

On 30 April 2015 at 10:21, Ethan Furman wrote:
> From the PEP:
>
>> Why not a __future__ import
>>
>> __future__ imports are inconvenient and easy to forget to add.
>
> That is a horrible rationale for not using an import. By that logic we
> should have everything in built-ins. ;)

It also makes things more painful than they need to be for syntax
highlighters. 'as' went through the "not really a keyword" path, and
it's a recipe for complexity in the code generation toolchain and
general quirkiness as things behave in unexpected ways.

We have a defined process for introducing new keywords (i.e.
__future__ imports) and the PEP doesn't adequately make the case for
why we shouldn't use it here.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From stephen at xemacs.org Thu Apr 30 03:03:30 2015
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Thu, 30 Apr 2015 10:03:30 +0900
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: 
References: 
Message-ID: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>

Adam Bartoš writes:

> I am in Windows and my terminal isn't utf-8 at the beginning, but I
> install custom sys.std* objects at runtime and I also install
> custom readline hook,

IIRC, on the Linux console and in an uxterm, PYTHONIOENCODING=utf-8
in the environment does what you want. (Can't test at the moment, I'm
on a Mac and Terminal.app somehow fails to pass the right thing to
Python from the input methods I have available -- I get an empty
string, while I don't seem to have an uxterm, only an xterm.) This has
to be set at interpreter startup; once the interpreter has decided its
IO encoding, you can't change it, you can only override it by
intercepting the console input and decoding it yourself.

Regarding your environment, the repeated use of "custom" is a red
flag. Unless you bundle your whole environment with the code you
distribute, Python can know nothing about that. In general, Python
doesn't know what encoding it is receiving text in.

If you *do* know, you can set PyCF_SOURCE_IS_UTF8. So if you know
that all of your users will have your custom stdio and readline hooks
installed (AFAICS, they can't use IDLE or IPython!), then you can
bundle Python built with the flag set, or perhaps you can do the
decoding in your custom stdio module.

Note that even if you have a UTF-8 input source, some users are likely
to be surprised because IIRC Python doesn't canonicalize in its
codecs; that is left for higher-level libraries. Linux UTF-8 is
usually NFC normalized, while Mac UTF-8 is NFD normalized.

> >> u'\xce\xb1'

Note that that is perfectly legal Unicode.

> >>> Le 29 avr. 2015 10:36, "Adam Bartoš" a écrit :
> >>> > Why I'm talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
> >>> u'\u03b1' but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'.

Just to be clear, you accept those results as correct, right?
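A minimal Python 2.7 sketch of the eval() behaviour discussed in this
thread (the variable name is illustrative only; the expected results
are the ones quoted above):

    # -*- coding: utf-8 -*-
    # Python 2.7: the same source text, passed as bytes vs. unicode.
    source = b"u'\xce\xb1'"   # UTF-8 encoded bytes of the source  u'α'

    # With bytes, PyCF_SOURCE_IS_UTF8 is not set, so the two bytes of
    # the UTF-8 sequence are read as two separate characters.
    print(repr(eval(source)))                   # -> u'\xce\xb1'

    # With unicode, the compiler sets PyCF_SOURCE_IS_UTF8 and the
    # literal round-trips correctly.
    print(repr(eval(source.decode('utf-8'))))   # -> u'\u03b1'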
From guido at python.org Thu Apr 30 03:12:37 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 18:12:37 -0700
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: 
References: <20150430002147.GE10248@stoneleaf.us>
Message-ID: 

On Wed, Apr 29, 2015 at 5:59 PM, Nick Coghlan wrote:

> On 30 April 2015 at 10:21, Ethan Furman wrote:
> > From the PEP:
> >
> >> Why not a __future__ import
> >>
> >> __future__ imports are inconvenient and easy to forget to add.
> >
> > That is a horrible rationale for not using an import. By that logic we
> > should have everything in built-ins. ;)
>

This response is silly. The point is not against import but against
__future__. A __future__ import definitely is inconvenient -- few
people I know could even recite the correct constraints on their
placement.

> It also makes things more painful than they need to be for syntax
> highlighters.

Does it? Do highlighters even understand __future__ imports? I
wouldn't mind if a highlighter always highlighted 'async' and 'await'
as keywords even where they aren't yet -- since they will be in 3.7.

> 'as' went through the "not really a keyword" path, and
> it's a recipe for complexity in the code generation toolchain and
> general quirkiness as things behave in unexpected ways.
>

I don't recall that -- but it was a really long time ago so I may
misremember (did we even have __future__ at the time?).

> We have a defined process for introducing new keywords (i.e.
> __future__ imports) and the PEP doesn't adequately make the case for
> why we shouldn't use it here.
>

That's fair. But because of the difficulty in introducing new keywords,
many proposals have been shot down or discouraged (or changed to use
punctuation characters or abuse existing keywords) -- we should give
Yury some credit for figuring out a way around this. Honestly I'm
personally on the fence.

--
--Guido van Rossum (python.org/~guido)

From yselivanov.ml at gmail.com Thu Apr 30 03:30:42 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Wed, 29 Apr 2015 21:30:42 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
Message-ID: <554185C2.5080003@gmail.com>

Hi python-dev,

New version of the PEP is attached.

Summary of updates:

1. Terminology:

- *native coroutine* term is used for "async def" functions.
- *generator-based coroutine* term is used for PEP 380 coroutines
  used in asyncio.
- *coroutine* is used when *native coroutine* or *generator-based
  coroutine* can be used in the same context.

I think that it's not really productive to discuss the terminology
that we use in the PEP. Its only purpose is to disambiguate concepts
used in the PEP. We should discuss how we will name new 'async def'
coroutines in the Python documentation if the PEP is accepted.

Although if you notice that somewhere in the PEP it is not crystal
clear what "coroutine" means please give me a heads up!

2. Syntax of await expressions is now thoroughly defined in the PEP.
See "Await Expression", "Updated operator precedence table", and
"Examples of "await" expressions" sections.

I like the current approach. I'm still not convinced that we should
make 'await' the same grammatically as unary minus. I don't understand
why we should allow parsing things like 'await -fut'; this should be a
SyntaxError. I'm fine to modify the grammar to allow 'await await fut'
syntax, though. And I'm open to discussion :)

3. CO_NATIVE_COROUTINE flag.
This enables us to disable __iter__ and __next__ on native coroutines
while maintaining full backwards compatibility.

4. asyncio / Migration strategy.

Existing code can be used with PEP 492 as is; everything will work as
expected. However, since *plain generators* (not decorated with
@asyncio.coroutine) cannot 'yield from' native coroutines (async def
functions), it might break some code *while adapting it to the new
syntax*. I'm open to just throwing a RuntimeWarning in this case in
3.5, and raising a TypeError in 3.6. Please see the "Differences from
generators" section of the PEP.

5. inspect.isawaitable() function.

All new functions are now listed in the "New Standard Library
Functions" section.

6. Lots of small updates and tweaks throughout the PEP.

Thanks,
Yury


PEP: 492
Title: Coroutines with async and await syntax
Version: $Revision$
Last-Modified: $Date$
Author: Yury Selivanov
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Apr-2015
Python-Version: 3.5
Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015


Abstract
========

This PEP introduces new syntax for coroutines, asynchronous ``with``
statements and ``for`` loops. The main motivation behind this proposal
is to streamline writing and maintaining asynchronous code, as well as
to simplify previously hard-to-implement code patterns.


Rationale and Goals
===================

Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the ``yield from`` syntax introduced in PEP
380. This approach has a number of shortcomings:

* it is easy to confuse coroutines with regular generators, since they
  share the same syntax; async libraries often attempt to alleviate
  this by using decorators (e.g. ``@asyncio.coroutine`` [1]_);

* it is not possible to natively define a coroutine which has no
  ``yield`` or ``yield from`` statements, again requiring the use of
  decorators to fix potential refactoring issues;

* support for asynchronous calls is limited to expressions where
  ``yield`` is allowed syntactically, limiting the usefulness of
  syntactic features, such as ``with`` and ``for`` statements.

This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new ``async
with`` statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new ``async for``
statement makes it possible to perform asynchronous calls in iterators.


Specification
=============

This proposal introduces new syntax and semantics to enhance coroutine
support in Python. This specification presumes knowledge of the
implementation of coroutines in Python (PEP 342 and PEP 380).
Motivation for the syntax changes proposed here comes from the asyncio
framework (PEP 3156) and the "Cofunctions" proposal (PEP 3152, now
rejected in favor of this specification).

From this point in this document we use the word *native coroutine* to
refer to functions declared using the new syntax. *Generator-based
coroutine* is used where necessary to refer to coroutines that are
based on generator syntax.
*Coroutine* is used in contexts where both definitions are applicable.


New Coroutine Declaration Syntax
--------------------------------

The following new syntax is used to declare a *native coroutine*::

    async def read_data(db):
        pass

Key properties of *native coroutines*:

* ``async def`` functions are always coroutines, even if they do not
  contain ``await`` expressions.

* It is a ``SyntaxError`` to have ``yield`` or ``yield from``
  expressions in an ``async`` function.

* Internally, two new code object flags were introduced:

  - ``CO_COROUTINE`` is used to enable runtime detection of
    *coroutines* (and migrating existing code).

  - ``CO_NATIVE_COROUTINE`` is used to mark *native coroutines*
    (defined with the new syntax).

  All coroutines have ``CO_COROUTINE``, ``CO_NATIVE_COROUTINE``, and
  ``CO_GENERATOR`` flags set.

* Regular generators, when called, return a *generator object*;
  similarly, coroutines return a *coroutine object*.

* ``StopIteration`` exceptions are not propagated out of coroutines,
  and are replaced with a ``RuntimeError``. For regular generators such
  behavior requires a future import (see PEP 479).

* See also the `Coroutine objects`_ section.


types.coroutine()
-----------------

A new function ``coroutine(gen)`` is added to the ``types`` module. It
allows interoperability between existing *generator-based coroutines*
in asyncio and *native coroutines* introduced by this PEP.

The function applies the ``CO_COROUTINE`` flag to the generator
function's code object, making it return a *coroutine object*.

The function can be used as a decorator, since it modifies generator
functions in-place and returns them.

Note that the ``CO_NATIVE_COROUTINE`` flag is not applied by
``types.coroutine()``, to make it possible to separate *native
coroutines*, defined with the new syntax, from *generator-based
coroutines*.


Await Expression
----------------

The following new ``await`` expression is used to obtain a result of
coroutine execution::

    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...

``await``, similarly to ``yield from``, suspends execution of the
``read_data`` coroutine until the ``db.fetch`` *awaitable* completes
and returns the result data.

It uses the ``yield from`` implementation with an extra step of
validating its argument. ``await`` only accepts an *awaitable*, which
can be one of:

* A *native coroutine object* returned from a *native coroutine*.

* A *generator-based coroutine object* returned from a generator
  decorated with ``types.coroutine()``.

* An object with an ``__await__`` method returning an iterator.

  Any ``yield from`` chain of calls ends with a ``yield``. This is a
  fundamental mechanism of how *Futures* are implemented. Since,
  internally, coroutines are a special kind of generator, every
  ``await`` is suspended by a ``yield`` somewhere down the chain of
  ``await`` calls (please refer to PEP 3156 for a detailed
  explanation).

  To enable this behavior for coroutines, a new magic method called
  ``__await__`` is added. In asyncio, for instance, to enable *Future*
  objects in ``await`` statements, the only change is to add an
  ``__await__ = __iter__`` line to the ``asyncio.Future`` class.
  Objects with an ``__await__`` method are called *Future-like* objects
  in the rest of this PEP.

  Also, please note that the ``__aiter__`` method (see its definition
  below) cannot be used for this purpose. It is a different protocol,
  and would be like using ``__iter__`` instead of ``__call__`` for
  regular callables.

  It is a ``TypeError`` if ``__await__`` returns anything but an
  iterator.
* Objects defined with the CPython C API with a ``tp_await`` function,
  returning an iterator (similar to the ``__await__`` method).

It is a ``SyntaxError`` to use ``await`` outside of an ``async def``
function (like it is a ``SyntaxError`` to use ``yield`` outside of a
``def`` function).

It is a ``TypeError`` to pass anything other than an *awaitable* object
to an ``await`` expression.


Updated operator precedence table
'''''''''''''''''''''''''''''''''

The ``await`` keyword is defined as follows::

    power ::= await ["**" u_expr]
    await ::= ["await"] primary

where "primary" represents the most tightly bound operations of the
language. Its syntax is::

    primary ::= atom | attributeref | subscription | slicing | call

See Python Documentation [12]_ and the `Grammar Updates`_ section of
this proposal for details.

The key difference of ``await`` from the ``yield`` and ``yield from``
operators is that *await expressions* do not require parentheses around
them most of the time.

Also, ``yield from`` allows any expression as its argument, including
expressions like ``yield from a() + b()``, that would be parsed as
``yield from (a() + b())``, which is almost always a bug. In general,
the result of any arithmetic operation is not an *awaitable* object.
To avoid this kind of mistake, it was decided to make ``await``
precedence lower than ``[]``, ``()``, and ``.``, but higher than ``**``
operators.

+------------------------------+-----------------------------------+
| Operator                     | Description                       |
+==============================+===================================+
| ``yield`` ``x``,             | Yield expression                  |
| ``yield from`` ``x``         |                                   |
+------------------------------+-----------------------------------+
| ``lambda``                   | Lambda expression                 |
+------------------------------+-----------------------------------+
| ``if`` -- ``else``           | Conditional expression            |
+------------------------------+-----------------------------------+
| ``or``                       | Boolean OR                        |
+------------------------------+-----------------------------------+
| ``and``                      | Boolean AND                       |
+------------------------------+-----------------------------------+
| ``not`` ``x``                | Boolean NOT                       |
+------------------------------+-----------------------------------+
| ``in``, ``not in``,          | Comparisons, including membership |
| ``is``, ``is not``, ``<``,   | tests and identity tests          |
| ``<=``, ``>``, ``>=``,       |                                   |
| ``!=``, ``==``               |                                   |
+------------------------------+-----------------------------------+
| ``|``                        | Bitwise OR                        |
+------------------------------+-----------------------------------+
| ``^``                        | Bitwise XOR                       |
+------------------------------+-----------------------------------+
| ``&``                        | Bitwise AND                       |
+------------------------------+-----------------------------------+
| ``<<``, ``>>``               | Shifts                            |
+------------------------------+-----------------------------------+
| ``+``, ``-``                 | Addition and subtraction          |
+------------------------------+-----------------------------------+
| ``*``, ``@``, ``/``, ``//``, | Multiplication, matrix            |
| ``%``                        | multiplication, division,         |
|                              | remainder                         |
+------------------------------+-----------------------------------+
| ``+x``, ``-x``, ``~x``       | Positive, negative, bitwise NOT   |
+------------------------------+-----------------------------------+
| ``**``                       | Exponentiation                    |
+------------------------------+-----------------------------------+
| ``await`` ``x``              | Await expression                  |
+------------------------------+-----------------------------------+
| ``x[index]``,                | Subscription, slicing,            |
| ``x[index:index]``,          | call, attribute reference         |
| ``x(arguments...)``,         |                                   |
| ``x.attribute``              |                                   |
+------------------------------+-----------------------------------+
| ``(expressions...)``,        | Binding or tuple display,         |
| ``[expressions...]``,        | list display,                     |
| ``{key: value...}``,         | dictionary display,               |
| ``{expressions...}``         | set display                       |
+------------------------------+-----------------------------------+


Examples of "await" expressions
'''''''''''''''''''''''''''''''

Valid syntax examples:

================================== ===================================
Expression                         Will be parsed as
================================== ===================================
``if await fut: pass``             ``if (await fut): pass``
``if await fut + 1: pass``         ``if (await fut) + 1: pass``
``pair = await fut, 'spam'``       ``pair = (await fut), 'spam'``
``with await fut, open(): pass``   ``with (await fut), open(): pass``
``await foo()['spam'].baz()()``    ``await ( foo()['spam'].baz()() )``
``return await coro()``            ``return ( await coro() )``
``res = await coro() ** 2``        ``res = (await coro()) ** 2``
``func(a1=await coro(), a2=0)``    ``func(a1=(await coro()), a2=0)``
``await foo() + await bar()``      ``(await foo()) + (await bar())``
``-await foo()``                   ``-(await foo())``
================================== ===================================

Invalid syntax examples:

================================== ===================================
Expression                         Should be written as
================================== ===================================
``await await coro()``             ``await (await coro())``
``await -coro()``                  ``await (-coro())``
================================== ===================================


Asynchronous Context Managers and "async with"
----------------------------------------------

An *asynchronous context manager* is a context manager that is able to
suspend execution in its *enter* and *exit* methods.

To make this possible, a new protocol for asynchronous context managers
is proposed. Two new magic methods are added: ``__aenter__`` and
``__aexit__``. Both must return an *awaitable*.

An example of an asynchronous context manager::

    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')


New Syntax
''''''''''

A new statement for asynchronous context managers is proposed::

    async with EXPR as VAR:
        BLOCK

which is semantically equivalent to::

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__(mgr)
    exc = True

    try:
        VAR = await aenter
        BLOCK
    except:
        if not await aexit(mgr, *sys.exc_info()):
            raise
    else:
        await aexit(mgr, None, None, None)

As with regular ``with`` statements, it is possible to specify multiple
context managers in a single ``async with`` statement.

It is an error to pass a regular context manager without ``__aenter__``
and ``__aexit__`` methods to ``async with``. It is a ``SyntaxError`` to
use ``async with`` outside of an ``async def`` function.


Example
'''''''

With *asynchronous context managers* it is easy to implement proper
database transaction managers for coroutines::

    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...

Code that needs locking also looks lighter::

    async with lock:
        ...

instead of::

    with (yield from lock):
        ...


Asynchronous Iterators and "async for"
--------------------------------------

An *asynchronous iterable* is able to call asynchronous code in its
*iter* implementation, and an *asynchronous iterator* can call
asynchronous code in its *next* method. To support asynchronous
iteration:
1. An object must implement an ``__aiter__`` method returning an
   *awaitable* resulting in an *asynchronous iterator object*.

2. An *asynchronous iterator object* must implement an ``__anext__``
   method returning an *awaitable*.

3. To stop iteration ``__anext__`` must raise a ``StopAsyncIteration``
   exception.

An example of an asynchronous iterable::

    class AsyncIterable:
        async def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...


New Syntax
''''''''''

A new statement for iterating through asynchronous iterators is
proposed::

    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2

which is semantically equivalent to::

    iter = (ITER)
    iter = await type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2

It is a ``TypeError`` to pass a regular iterable without an
``__aiter__`` method to ``async for``. It is a ``SyntaxError`` to use
``async for`` outside of an ``async def`` function.

As with the regular ``for`` statement, ``async for`` has an optional
``else`` clause.


Example 1
'''''''''

With the asynchronous iteration protocol it is possible to
asynchronously buffer data during iteration::

    async for data in cursor:
        ...

Where ``cursor`` is an asynchronous iterator that prefetches ``N`` rows
of data from a database after every ``N`` iterations.

The following code illustrates the new asynchronous iteration
protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        def _prefetch(self):
            ...

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()

then the ``Cursor`` class can be used as follows::

    async for row in Cursor():
        print(row)

which would be equivalent to the following code::

    i = await Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)


Example 2
'''''''''

The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.

::

    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        async def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for letter in AsyncIteratorWrapper("abc"):
        print(letter)


Why StopAsyncIteration?
'''''''''''''''''''''''

Coroutines are still based on generators internally. So, before PEP
479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'

and

::

    def g2():
        yield from fut
        raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines,
the following example will have its ``StopIteration`` wrapped into a
``RuntimeError``

::

    async def a1():
        await fut
        raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is
to raise something other than ``StopIteration``. Therefore, a new
built-in exception class ``StopAsyncIteration`` was added.

Moreover, with semantics from PEP 479, all ``StopIteration`` exceptions
raised in coroutines are wrapped in ``RuntimeError``.
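To make the contrast concrete, here is a minimal sketch (an
illustration only, not part of the reference examples; the class name
is invented) of an asynchronous iterator following the protocol above.
Raising ``StopAsyncIteration`` ends the ``async for`` loop cleanly,
whereas a stray ``StopIteration`` raised at the same point would
surface as a ``RuntimeError`` under the PEP 479 semantics just
described::

    class Countdown:
        def __init__(self, n):
            self.n = n

        async def __aiter__(self):
            return self

        async def __anext__(self):
            if self.n <= 0:
                # StopIteration here would be wrapped in RuntimeError;
                # StopAsyncIteration terminates the loop cleanly.
                raise StopAsyncIteration
            self.n -= 1
            return self.n

    async def consume():
        async for i in Countdown(3):
            print(i)    # prints 2, 1, 0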
Coroutine objects
-----------------

Differences from generators
'''''''''''''''''''''''''''

This section applies only to *native coroutines* with the
``CO_NATIVE_COROUTINE`` flag, i.e. defined with the new ``async def``
syntax. **The behavior of existing *generator-based coroutines* in
asyncio remains unchanged.**

Great effort has been made to make sure that coroutines and generators
are treated as distinct concepts:

1. *Native coroutine objects* do not implement ``__iter__`` and
   ``__next__`` methods. Therefore, they cannot be iterated over or
   passed to ``iter()``, ``list()``, ``tuple()`` and other built-ins.
   They also cannot be used in a ``for..in`` loop. An attempt to use
   ``__iter__`` or ``__next__`` on a *native coroutine object* will
   result in a ``TypeError``.

2. *Plain generators* cannot ``yield from`` *native coroutine objects*:
   doing so will result in a ``TypeError``.

3. *Generator-based coroutines* (for asyncio code they must be
   decorated with ``@asyncio.coroutine``) can ``yield from`` *native
   coroutine objects*.

4. ``inspect.isgenerator()`` and ``inspect.isgeneratorfunction()``
   return ``False`` for *native coroutine objects* and *native
   coroutine functions*.


Coroutine object methods
''''''''''''''''''''''''

Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutine objects have
``throw()``, ``send()`` and ``close()`` methods. ``StopIteration`` and
``GeneratorExit`` play the same role for coroutine objects (although
PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380,
and Python Documentation [11]_ for details.

The ``throw()`` and ``send()`` methods for coroutine objects are used
to push values and raise errors into *Future-like* objects.


Debugging Features
------------------

A common beginner mistake is forgetting to use ``yield from`` on
coroutines::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in
asyncio, in which the ``@coroutine`` decorator wraps all functions with
a special object with a destructor logging a warning. Whenever a
wrapped generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorator function
was defined, a stack trace of where it was collected, etc. The wrapper
object also provides a convenient ``__repr__`` function with detailed
information about the generator.

The only problem is how to enable these debug capabilities. Since debug
facilities should be a no-op in production mode, the ``@coroutine``
decorator makes the decision of whether to wrap or not to wrap based on
an OS environment variable ``PYTHONASYNCIODEBUG``. This way it is
possible to run asyncio programs with asyncio's own functions
instrumented. ``EventLoop.set_debug``, a different debug facility, has
no impact on the ``@coroutine`` decorator's behavior.

With this proposal, coroutines become a native concept, distinct from
generators. New methods ``set_coroutine_wrapper`` and
``get_coroutine_wrapper`` are added to the ``sys`` module, with which
frameworks can provide advanced debugging facilities.

It is also important to make coroutines as fast and efficient as
possible, therefore there are no debug features enabled by default.
Example::

    async def debug_me():
        await asyncio.sleep(1)

    def async_debug_wrap(generator):
        return asyncio.CoroWrapper(generator)

    sys.set_coroutine_wrapper(async_debug_wrap)

    debug_me()  # <- this line will likely GC the coroutine object and
                #    trigger asyncio.CoroWrapper's code.

    assert isinstance(debug_me(), asyncio.CoroWrapper)

    sys.set_coroutine_wrapper(None)  # <- this unsets any
                                     #    previously set wrapper
    assert not isinstance(debug_me(), asyncio.CoroWrapper)


New Standard Library Functions
------------------------------

* ``types.coroutine(gen)``. See the `types.coroutine()`_ section for
  details.

* ``inspect.iscoroutine(obj)`` returns ``True`` if ``obj`` is a
  *coroutine object*.

* ``inspect.iscoroutinefunction(obj)`` returns ``True`` if ``obj`` is a
  *coroutine function*.

* ``inspect.isawaitable(obj)`` returns ``True`` if ``obj`` can be used
  in an ``await`` expression. See `Await Expression`_ for details.

* ``sys.set_coroutine_wrapper(wrapper)`` allows intercepting the
  creation of *coroutine objects*. ``wrapper`` must be either a
  callable that accepts one argument (a *coroutine object*), or
  ``None``; ``None`` resets the wrapper. If called twice, the new
  wrapper replaces the previous one. See `Debugging Features`_ for more
  details.

* ``sys.get_coroutine_wrapper()`` returns the current wrapper object.
  Returns ``None`` if no wrapper was set. See `Debugging Features`_ for
  more details.


Glossary
========

:Native coroutine:
    A coroutine function declared with ``async def``. It uses ``await``
    and ``return value``; see `New Coroutine Declaration Syntax`_ for
    details.

:Native coroutine object:
    Returned from a native coroutine function. See `Await Expression`_
    for details.

:Generator-based coroutine:
    A coroutine based on generator syntax. The most common examples are
    functions decorated with ``@asyncio.coroutine``.

:Generator-based coroutine object:
    Returned from a generator-based coroutine function.

:Coroutine:
    Either a *native coroutine* or a *generator-based coroutine*.

:Coroutine object:
    Either a *native coroutine object* or a *generator-based coroutine
    object*.

:Future-like object:
    An object with an ``__await__`` method, or a C object with a
    ``tp_await`` function, returning an iterator. Can be consumed by an
    ``await`` expression in a coroutine. A coroutine waiting for a
    Future-like object is suspended until the Future-like object's
    ``__await__`` completes, and returns the result. See `Await
    Expression`_ for details.

:Awaitable:
    A *Future-like* object or a *coroutine object*. See `Await
    Expression`_ for details.

:Asynchronous context manager:
    An asynchronous context manager has ``__aenter__`` and
    ``__aexit__`` methods and can be used with ``async with``. See
    `Asynchronous Context Managers and "async with"`_ for details.

:Asynchronous iterable:
    An object with an ``__aiter__`` method, which must return an
    *asynchronous iterator* object. Can be used with ``async for``. See
    `Asynchronous Iterators and "async for"`_ for details.

:Asynchronous iterator:
    An asynchronous iterator has an ``__anext__`` method. See
    `Asynchronous Iterators and "async for"`_ for details.
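To illustrate how these terms compose in practice, here is a minimal
sketch (the ``Ticket`` class and all function names are invented for
illustration, and are not part of the reference implementation) of a
*Future-like* object and a *generator-based coroutine* being consumed
by a *native coroutine*, driven by hand the way an event loop would
drive it::

    import types

    class Ticket:
        # A minimal Future-like object: __await__ returns an iterator
        # (here, a generator) that suspends once, then produces a result.
        def __init__(self, result=None):
            self.result = result

        def __await__(self):
            yield self          # an event loop would park the coroutine here
            return self.result

    @types.coroutine
    def legacy_op():
        yield                   # generator-based coroutine; also awaitable

    async def work():
        await legacy_op()
        return await Ticket('done')

    # Driving the coroutine manually, as an event loop would:
    c = work()
    c.send(None)                # suspended inside legacy_op()
    c.send(None)                # suspended on the Ticket
    try:
        c.send(None)            # resume; the coroutine returns
    except StopIteration as exc:
        assert exc.value == 'done'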
List of functions and methods
=============================

================= =================================== =================
Method            Can contain                         Can't contain
================= =================================== =================
async def func    await, return value                 yield, yield from
async def __a*__  await, return value                 yield, yield from
def __a*__        return awaitable                    await
def __await__     yield, yield from, return iterable  await
generator         yield, yield from, return value     await
================= =================================== =================

Where:

* "async def func": native coroutine;

* "async def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined with the ``async`` keyword;

* "def __a*__": ``__aiter__``, ``__anext__``, ``__aenter__``,
  ``__aexit__`` defined without the ``async`` keyword, must return an
  *awaitable*;

* "def __await__": ``__await__`` method to implement *Future-like*
  objects;

* generator: a "regular" generator, function defined with ``def`` and
  which contains at least one ``yield`` or ``yield from`` expression.


Transition Plan
===============

To avoid backwards compatibility issues with the ``async`` and
``await`` keywords, it was decided to modify ``tokenizer.c`` in such a
way, that it:

* recognizes the ``async def`` combination of name tokens (start of a
  native coroutine);

* keeps track of regular functions and native coroutines;

* replaces the ``'async'`` token with ``ASYNC`` and the ``'await'``
  token with ``AWAIT`` when in the process of yielding tokens for
  native coroutines.

This approach allows for seamless combination of new syntax features
(all of them available only in ``async`` functions) with any existing
code.

An example of having "async def" and an "async" attribute in one piece
of code::

    class Spam:
        async = 42

    async def ham():
        print(getattr(Spam, 'async'))

    # The coroutine can be executed and will print '42'


Backwards Compatibility
-----------------------

This proposal preserves 100% backwards compatibility.


asyncio
-------

The ``asyncio`` module was adapted and tested to work with coroutines
and new statements. Backwards compatibility is 100% preserved, i.e. all
existing code will work as-is.

The required changes are mainly:

1. Modify the ``@asyncio.coroutine`` decorator to use the new
   ``types.coroutine()`` function.

2. Add an ``__await__ = __iter__`` line to the ``asyncio.Future``
   class.

3. Add ``ensure_task()`` as an alias for the ``async()`` function.
   Deprecate the ``async()`` function.


Migration strategy
''''''''''''''''''

Because *plain generators* cannot ``yield from`` *native coroutine
objects* (see the `Differences from generators`_ section for more
details), it is advised to make sure that all generator-based
coroutines are decorated with ``@asyncio.coroutine`` *before* starting
to use the new syntax.


Grammar Updates
---------------

Grammar changes are also fairly minimal::

    decorated: decorators (classdef | funcdef | async_funcdef)
    async_funcdef: ASYNC funcdef

    compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt
                    | with_stmt | funcdef | classdef | decorated
                    | async_stmt)
    async_stmt: ASYNC (funcdef | with_stmt | for_stmt)

    power: atom_expr ['**' factor]
    atom_expr: [AWAIT] atom trailer*


Transition Period Shortcomings
------------------------------

There is just one.
Until ``async`` and ``await`` become proper keywords, it is not
possible (or at least very hard) to fix ``tokenizer.c`` to recognize
them on the **same line** with the ``def`` keyword::

    # async and await will always be parsed as variables

    async def outer():                           # 1
        def nested(a=(await fut)):
            pass

    async def foo(): return (await fut)          # 2

Since ``await`` and ``async`` in such cases are parsed as ``NAME``
tokens, a ``SyntaxError`` will be raised.

To work around these issues, the above examples can be easily rewritten
to a more readable form::

    async def outer():                           # 1
        a_default = await fut
        def nested(a=a_default):
            pass

    async def foo():                             # 2
        return (await fut)

This limitation will go away as soon as ``async`` and ``await`` are
proper keywords. Or if it's decided to use a future import for this
PEP.


Deprecation Plans
-----------------

``async`` and ``await`` names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
``async`` and ``await`` proper keywords before 3.7 might make it harder
for people to port their code to Python 3.


Design Considerations
=====================

PEP 3152
--------

PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines
(called "cofunctions"). Some key points:

1. A new keyword ``codef`` to declare a *cofunction*. A *cofunction* is
   always a generator, even if there is no ``cocall`` expression inside
   it. Maps to ``async def`` in this proposal.

2. A new keyword ``cocall`` to call a *cofunction*. Can only be used
   inside a *cofunction*. Maps to ``await`` in this proposal (with some
   differences, see below).

3. It is not possible to call a *cofunction* without a ``cocall``
   keyword.

4. ``cocall`` grammatically requires parentheses after it::

    atom: cocall |
    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
    cotrailer: '[' subscriptlist ']' | '.' NAME

5. ``cocall f(*args, **kwds)`` is semantically equivalent to ``yield
   from f.__cocall__(*args, **kwds)``.

Differences from this proposal:

1. There is no equivalent of ``__cocall__`` in this PEP, which is
   called and its result is passed to ``yield from`` in the ``cocall``
   expression. The ``await`` keyword expects an *awaitable* object,
   validates the type, and executes ``yield from`` on it. The
   ``__await__`` method is similar to ``__cocall__``, but it is only
   used to define *Future-like* objects.

2. ``await`` is defined in almost the same way as ``yield from`` in the
   grammar (it is later enforced that ``await`` can only be inside
   ``async def``). It is possible to simply write ``await future``,
   whereas ``cocall`` always requires parentheses.

3. To make asyncio work with PEP 3152 it would be required to modify
   the ``@asyncio.coroutine`` decorator to wrap all functions in an
   object with a ``__cocall__`` method, or to implement ``__cocall__``
   on generators. To call *cofunctions* from existing generator-based
   coroutines it would be required to use a ``costart(cofunc, *args,
   **kwargs)`` built-in.

4. Since it is impossible to call a *cofunction* without a ``cocall``
   keyword, it automatically prevents the common mistake of forgetting
   to use ``yield from`` on generator-based coroutines. This proposal
   addresses this problem with a different approach, see `Debugging
   Features`_.

5. A shortcoming of requiring a ``cocall`` keyword to call a coroutine
   is that if it is decided to implement coroutine-generators --
   coroutines with ``yield`` or ``async yield`` expressions -- we
   wouldn't need a ``cocall`` keyword to call them.
   So we'll end up having ``__cocall__`` and no ``__call__`` for
   regular coroutines, and having ``__call__`` and no ``__cocall__``
   for coroutine-generators.

6. Requiring parentheses grammatically also introduces a whole lot of
   new problems.

   The following code::

       await fut
       await function_returning_future()
       await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))

   would look like::

       cocall fut()  # or cocall costart(fut)
       cocall (function_returning_future())()
       cocall asyncio.gather(costart(coro1, arg1, arg2),
                             costart(coro2, arg1, arg2))

7. There are no equivalents of ``async for`` and ``async with`` in PEP
   3152.


Coroutine-generators
--------------------

With the ``async for`` keyword it is desirable to have a concept of a
*coroutine-generator* -- a coroutine with ``yield`` and ``yield from``
expressions. To avoid any ambiguity with regular generators, we would
likely require an ``async`` keyword before ``yield``, and ``async yield
from`` would raise a ``StopAsyncIteration`` exception.

While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal. It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects. This is a matter
for a separate PEP.


Why "async" and "await" keywords
--------------------------------

async/await is not a new concept in programming languages:

* C# has had it for a long time [5]_;

* proposal to add async/await in ECMAScript 7 [2]_; see also the
  Traceur project [9]_;

* Facebook's Hack/HHVM [6]_;

* Google's Dart language [7]_;

* Scala [8]_;

* proposal to add async/await to C++ [10]_;

* and many other less popular languages.

This is a huge benefit, as some users already have experience with
async/await, and because it makes working with many languages in one
project easier (Python with ECMAScript 7 for instance).


Why "__aiter__" returns awaitable
---------------------------------

In principle, ``__aiter__`` could be a regular function. There are
several good reasons to make it a coroutine:

* as most of the ``__anext__``, ``__aenter__``, and ``__aexit__``
  methods are coroutines, users would often mistakenly define it as
  ``async`` anyway;

* there might be a need to run some asynchronous operations in
  ``__aiter__``, for instance to prepare DB queries or do some file
  operations.


Importance of "async" keyword
-----------------------------

While it is possible to just implement the ``await`` expression and
treat all functions with at least one ``await`` as coroutines, this
approach makes API design, code refactoring, and long-term support
harder.

Let's pretend that Python only has the ``await`` keyword::

    def useful():
        ...
        await log(...)
        ...

    def important():
        await useful()

If the ``useful()`` function is refactored and someone removes all
``await`` expressions from it, it would become a regular Python
function, and all code that depends on it, including ``important()``,
would be broken. To mitigate this issue a decorator similar to
``@asyncio.coroutine`` has to be introduced. (A short sketch after the
next section shows how ``async def`` avoids this hazard.)


Why "async def"
---------------

For some people the bare ``async name(): pass`` syntax might look more
appealing than ``async def name(): pass``. It is certainly easier to
type. But on the other hand, it breaks the symmetry between ``async
def``, ``async with`` and ``async for``, where ``async`` is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.
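To make the point of the previous two sections concrete, here is a
minimal sketch (the function names simply mirror the hypothetical
example above) of how ``async def`` removes the refactoring hazard:
coroutine-ness is declared in the function header, not inferred from
the body::

    async def useful():
        ...                 # every 'await' expression was removed

    async def important():
        # Still valid: useful() is a coroutine because of 'async def',
        # not because its body happens to contain an 'await'.
        await useful()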
Why not "await for" and "await with" ------------------------------------ ``async`` is an adjective, and hence it is a better choice for a *statement qualifier* keyword. ``await for/with`` would imply that something is awaiting for a completion of a ``for`` or ``with`` statement. Why "async def" and not "def async" ----------------------------------- ``async`` keyword is a *statement qualifier*. A good analogy to it are "static", "public", "unsafe" keywords from other languages. "async for" is an asynchronous "for" statement, "async with" is an asynchronous "with" statement, "async def" is an asynchronous function. Having "async" after the main statement keyword might introduce some confusion, like "for async item in iterator" can be read as "for each asynchronous item in iterator". Having ``async`` keyword before ``def``, ``with`` and ``for`` also makes the language grammar simpler. And "async def" better separates coroutines from regular functions visually. Why not a __future__ import --------------------------- ``__future__`` imports are inconvenient and easy to forget to add. Also, they are enabled for the whole source file. Consider that there is a big project with a popular module named "async.py". With future imports it is required to either import it using ``__import__()`` or ``importlib.import_module()`` calls, or to rename the module. The proposed approach makes it possible to continue using old code and modules without a hassle, while coming up with a migration plan for future python versions. Why magic methods start with "a" -------------------------------- New asynchronous magic methods ``__aiter__``, ``__anext__``, ``__aenter__``, and ``__aexit__`` all start with the same prefix "a". An alternative proposal is to use "async" prefix, so that ``__aiter__`` becomes ``__async_iter__``. However, to align new magic methods with the existing ones, such as ``__radd__`` and ``__iadd__`` it was decided to use a shorter version. Why not reuse existing magic names ---------------------------------- An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an ``async`` keyword to their declarations:: class CM: async def __enter__(self): # instead of __aenter__ ... This approach has the following downsides: * it would not be possible to create an object that works in both ``with`` and ``async with`` statements; * it would break backwards compatibility, as nothing prohibits from returning a Future-like objects from ``__enter__`` and/or ``__exit__`` in Python <= 3.4; * one of the main points of this proposal is to make native coroutines as simple and foolproof as possible, hence the clear separation of the protocols. Why not reuse existing "for" and "with" statements -------------------------------------------------- The vision behind existing generator-based coroutines and this proposal is to make it easy for users to see where the code might be suspended. Making existing "for" and "with" statements to recognize asynchronous iterators and context managers will inevitably create implicit suspend points, making it harder to reason about the code. Comprehensions -------------- Syntax for asynchronous comprehensions could be provided, but this construct is outside of the scope of this PEP. Async lambda functions ---------------------- Syntax for asynchronous lambda functions could be provided, but this construct is outside of the scope of this PEP. 
Performance
===========

Overall Impact
--------------

This proposal introduces no observable performance impact. Here is an
output of Python's official set of benchmarks [4]_:

::

    python perf.py -r -b default ../cpython/python.exe
        ../cpython-aw/python.exe

    [skipped]

    Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
    Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
    x86_64 i386

    Total CPU cores: 8

    ### etree_iterparse ###
    Min: 0.365359 -> 0.349168: 1.05x faster
    Avg: 0.396924 -> 0.379735: 1.05x faster
    Significant (t=9.71)
    Stddev: 0.01225 -> 0.01277: 1.0423x larger

    The following not significant results are hidden, use -v to show
    them: django_v2, 2to3, etree_generate, etree_parse, etree_process,
    fastpickle, fastunpickle, json_dump_v2, json_load, nbody, regex_v8,
    tornado_http.


Tokenizer modifications
-----------------------

There is no observable slowdown of parsing Python files with the
modified tokenizer: parsing of one 12Mb file
(``Lib/test/test_binop.py`` repeated 1000 times) takes the same amount
of time.


async/await
-----------

The following micro-benchmark was used to determine the performance
difference between "async" functions and generators::

    import sys
    import time

    def binary(n):
        if n <= 0:
            return 1
        l = yield from binary(n - 1)
        r = yield from binary(n - 1)
        return l + 1 + r

    async def abinary(n):
        if n <= 0:
            return 1
        l = await abinary(n - 1)
        r = await abinary(n - 1)
        return l + 1 + r

    def timeit(gen, depth, repeat):
        t0 = time.time()
        for _ in range(repeat):
            list(gen(depth))
        t1 = time.time()
        print('{}({}) * {}: total {:.3f}s'.format(
            gen.__name__, depth, repeat, t1-t0))

The result is that there is no observable performance difference.
Minimum timing of 3 runs::

    abinary(19) * 30: total 12.985s
    binary(19) * 30: total 12.953s

Note that a depth of 19 means 1,048,575 calls.


Reference Implementation
========================

The reference implementation can be found here: [3]_.

List of high-level changes and new protocols
--------------------------------------------

1. New syntax for defining coroutines: ``async def`` and new ``await``
   keyword.

2. New ``__await__`` method for Future-like objects, and new
   ``tp_await`` slot in ``PyTypeObject``.

3. New syntax for asynchronous context managers: ``async with``. And
   associated protocol with ``__aenter__`` and ``__aexit__`` methods.

4. New syntax for asynchronous iteration: ``async for``. And associated
   protocol with ``__aiter__``, ``__anext__`` and new built-in
   exception ``StopAsyncIteration``.

5. New AST nodes: ``AsyncFunctionDef``, ``AsyncFor``, ``AsyncWith``,
   ``Await``.

6. New functions: ``sys.set_coroutine_wrapper(callback)``,
   ``sys.get_coroutine_wrapper()``, ``types.coroutine(gen)``,
   ``inspect.iscoroutinefunction()``, ``inspect.iscoroutine()``, and
   ``inspect.isawaitable()``.

7. New ``CO_COROUTINE`` and ``CO_NATIVE_COROUTINE`` bit flags for code
   objects.

While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly. It
is intended to be used in frameworks and libraries to provide users
with convenient and unambiguous APIs with ``async def``, ``await``,
``async for`` and ``async with`` syntax.


Working example
---------------

All concepts proposed in this PEP are implemented [3]_ and can be
tested.
::

    import asyncio

    async def echo_server():
        print('Serving on localhost:8000')
        await asyncio.start_server(handle_connection,
                                   'localhost', 8000)

    async def handle_connection(reader, writer):
        print('New connection...')

        while True:
            data = await reader.read(8192)

            if not data:
                break

            print('Sending {:.10}... back'.format(repr(data)))
            writer.write(data)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(echo_server())
    try:
        loop.run_forever()
    finally:
        loop.close()


References
==========

.. [1] https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine

.. [2] http://wiki.ecmascript.org/doku.php?id=strawman:async_functions

.. [3] https://github.com/1st1/cpython/tree/await

.. [4] https://hg.python.org/benchmarks

.. [5] https://msdn.microsoft.com/en-us/library/hh191443.aspx

.. [6] http://docs.hhvm.com/manual/en/hack.async.php

.. [7] https://www.dartlang.org/articles/await-async/

.. [8] http://docs.scala-lang.org/sips/pending/async.html

.. [9] https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental

.. [10] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)

.. [11] https://docs.python.org/3/reference/expressions.html#generator-iterator-methods

.. [12] https://docs.python.org/3/reference/expressions.html#primaries


Acknowledgments
===============

I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew
Svetlov, and Łukasz Langa for their initial feedback.


Copyright
=========

This document has been placed in the public domain.

..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End:

From rosuav at gmail.com  Thu Apr 30 03:43:46 2015
From: rosuav at gmail.com (Chris Angelico)
Date: Thu, 30 Apr 2015 11:43:46 +1000
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: 

On Thu, Apr 30, 2015 at 11:03 AM, Stephen J. Turnbull wrote:
> Note that even if you have a UTF-8 input source, some users are likely
> to be surprised because IIRC Python doesn't canonicalize in its
> codecs; that is left for higher-level libraries.  Linux UTF-8 is
> usually NFC normalized, while Mac UTF-8 is NFD normalized.
>
> > >> u'\xce\xb1'
>
> Note that that is perfectly legal Unicode.

It's legal Unicode, but it doesn't mean what he typed in. This means:

'\xce' LATIN CAPITAL LETTER I WITH CIRCUMFLEX
'\xb1' PLUS-MINUS SIGN

but the original input was:

'\u03b1' GREEK SMALL LETTER ALPHA

ChrisA

From ncoghlan at gmail.com  Thu Apr 30 03:52:11 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 30 Apr 2015 11:52:11 +1000
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <55417845.3060507@gmail.com>
References: <20150430002147.GE10248@stoneleaf.us> <55417845.3060507@gmail.com>
Message-ID: 

On 30 April 2015 at 10:33, Yury Selivanov wrote:
> To really understand all implementation details of this line
> you need to read PEP 3156 and experiment with asyncio. There
> is no easier way, unfortunately. I can't add a super detailed
> explanation how event loops can be implemented in PEP 492,
> that's not in its scope.

This request isn't about understanding the implementation details,
it's about understanding what Python *users* will gain from the PEP
without *needing* to understand the implementation details.
For that, I'd like to see some not-completely-trivial example code in (or at least linked from) the PEP written using: * trollius (no "yield from", Python 2 compatible, akin to Twisted's inlineDeferred's) * asyncio/tulip ("yield from", Python 3.3+ compatible) * PEP 492 (async/await, Python 3.5+ only) The *intent* of PEP 492, like PEP 380 and PEP 342 before it, is "make asynchronous programming in Python easier". I think it will actually succeed in that goal, but I don't think it currently does an especially job of explaining that to folks that aren't both already deeply invested in the explicitly asynchronous programming model *and* thoroughly aware of the fact that most of us need asynchronous programming to look as much like synchronous programming as possible in order for it to fit our brains. Some folks can fit ravioli code with callbacks going everywhere in their brains, but I can't, and it's my experience that most other folks can't either. This lack means the PEP that gets confused objections from folks that wish explicitly asynchronous programming models would just go away entirely (they won't), as well as from folks that already understand it and don't see why we can't continue treating it as a special case of other features that folks have to learn how to use from first principles, rather than saving them that up front learning cost by doing the pattern extraction to make event driven explicitly asynchronous programming its own first class concept with dedicated syntax (a hint on that front: with statements are just particular patterns for using try/except/finally, decorators are just particular patterns in using higher order functions, for statements are just particular patterns in using while statements and builtins, and even imports and classes just represent particular patterns in combining dictionaries, code execution and the metaclass machinery - the pattern extraction and dedicated syntax associated with all of them makes it possible to learn to *use* these concepts without first having to learn how to *implement* them) >From my own perspective, I've spent a reasonable amount of time attempting to explain to folks the "modal" nature of generators, in that you can use them both as pure iterable data sources *and* as coroutines (as per PEP 342). The problem I've found is that our current approach ends up failing the "conceptually different things should also look superficially different" test: outside certain data pipeline processing problems, generators-as-iterators and generators-as-coroutines mostly end up being fundamentally *different* ways of approaching a programming problem, but the current shared syntax encourages users to attempt to place them in the same mental bucket. Those users will remain eternally confused until they learn to place them in two different buckets despite the shared syntax (the 5 or so different meanings of "static" in C++ come to mind at this point...). With explicitly asynchronous development*, this problem of needing to learn to segment the world into two pieces is particularly important, and this wonderful rant on red & blue functions published a couple of months ago helps explain why: http://journal.stuffwithstuff.com/2015/02/01/what-color-is-your-function/ The dedicated async/await syntax proposed in PEP 492 lets the end user mostly *not care* about all the messiness that needs to happen under the covers to make explicitly asynchronous programming as close to normal synchronous programming as possible. 
The fact that under the hood in CPython normal functions, coroutines, and generators are implemented largely as variant behaviours of the same type (producing respectively the requested result, an object expecting to be driven by an event loop to produce the requested result, and an object expecting to be driven be an iterative loop to produce a succession of requested values rather than a single result) can (and at least arguably should) be irrelevant to the mental model formed by future Python programmers (see http://uxoslo.com/2014/01/14/ux-hints-why-mental-models-matter/ for more on the difference between the representational models we present directly to end users and the underlying implementation models we use as programmers to actually make things work) Regards, Nick. P.S. *While it's not a substitute for explicitly asynchronous development, implicitly asynchronous code still has an important role to play as one of the 3 models (together with thread pools and blocking on the event loop while running a coroutine to completion) that lets synchronous code play nice with asynchronous code: http://python-notes.curiousefficiency.org/en/latest/pep_ideas/async_programming.html#gevent-and-pep-3156 -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Thu Apr 30 04:07:24 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 30 Apr 2015 12:07:24 +1000 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: References: <20150430002147.GE10248@stoneleaf.us> Message-ID: On 30 April 2015 at 11:12, Guido van Rossum wrote: > On Wed, Apr 29, 2015 at 5:59 PM, Nick Coghlan wrote: >> It is also makes things more painful than they need to be for syntax >> highlighters. > > Does it? Do highlighters even understand __future__ imports? I wouldn't mind > if a highlighter always highlighted 'async' and 'await' as keywords even > where they aren't yet -- since they will be in 3.7. Yeah, that's a good point. >> 'as' went through the "not really a keyword" path, and >> it's a recipe for complexity in the code generation toolchain and >> general quirkiness as things behave in unexpected ways. > > I don't recall that -- but it was a really long time ago so I may > misremember (did we even have __future__ at the time?). I don't actually know, I only know about its former pseudo-keyword status because we made it a real keyword as part of "from __future__ import with_statement". I think I was conflating it with the hassles we encountered at various points due to None, True, and False not being real keywords in Python 2, but I don't believe the problems we had with those apply here (given that we won't be using 'await' and 'async' as values in any context that the bytecode generation chain cares about). >> We have a defined process for introducing new keywords (i.e. >> __future__ imports) and the PEP doesn't adequately make the case for >> why we shouldn't use it here. > > That's fair. But because of the difficulty in introducing new keywords, many > proposals have been shot down or discouraged (or changed to use punctuation > characters or abuse existing keywords) -- we should give Yury some credit > for figuring out a way around this. Honestly I'm personally on the fence. Yeah, I'm coming around to the idea. 
For the async pseudo-keyword, I can see that the proposal only allows its use in cases that were previously entirely illegal, but I'm not yet clear on how the PEP proposes to avoid changing the meaning of the following code: x = await(this_is_a_function_call) Unless I'm misreading the proposed grammar in the PEP (which is entirely possible), I believe PEP 492 would reinterpret that as: x = await this_is_not_a_function_call_any_more Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From yselivanov.ml at gmail.com Thu Apr 30 04:03:26 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 22:03:26 -0400 Subject: [Python-Dev] PEP 492: async/await in Python; version 4 In-Reply-To: <554185C2.5080003@gmail.com> References: <554185C2.5080003@gmail.com> Message-ID: <55418D6E.7020105@gmail.com> One more thing to discuss: 7. StopAsyncIteration vs AsyncStopIteration. I don't have a strong opinion on this, I prefer the former because it reads better. There was no consensus on which one we should use. Thanks, Yury From yselivanov.ml at gmail.com Thu Apr 30 04:25:09 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 22:25:09 -0400 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: References: <20150430002147.GE10248@stoneleaf.us> <55417845.3060507@gmail.com> Message-ID: <55419285.1040505@gmail.com> Nick, On 2015-04-29 9:52 PM, Nick Coghlan wrote: > On 30 April 2015 at 10:33, Yury Selivanov wrote: >> To really understand all implementation details of this line >> you need to read PEP 3156 and experiment with asyncio. There >> is no easier way, unfortunately. I can't add a super detailed >> explanation how event loops can be implemented in PEP 492, >> that's not in its scope. > This request isn't about understanding the implementation details, > it's about understanding what Python *users* will gain from the PEP > without *needing* to understand the implementation details. > > For that, I'd like to see some not-completely-trivial example code in > (or at least linked from) the PEP written using: > > * trollius (no "yield from", Python 2 compatible, akin to Twisted's > inlineDeferred's) > * asyncio/tulip ("yield from", Python 3.3+ compatible) > * PEP 492 (async/await, Python 3.5+ only) I'll see what I can do with aiomysql library to showcase async for and async with. I have a few outstanding tasks with reference implementation to test/add though. I'm not sure that trollius will add anything to it. It's just like asyncio, but uses 'yield' instead of 'yield from' and was envisioned by Victor Stinner as a transitional-framework to ease the porting of openstack to python 3. > > The *intent* of PEP 492, like PEP 380 and PEP 342 before it, is "make > asynchronous programming in Python easier". I think it will actually > succeed in that goal, but I don't think it currently does an > especially job of explaining that to folks that aren't both already > deeply invested in the explicitly asynchronous programming model *and* > thoroughly aware of the fact that most of us need asynchronous > programming to look as much like synchronous programming as possible > in order for it to fit our brains. Some folks can fit ravioli code > with callbacks going everywhere in their brains, but I can't, and it's > my experience that most other folks can't either. 
This lack means the > PEP that gets confused objections from folks that wish explicitly > asynchronous programming models would just go away entirely (they > won't), as well as from folks that already understand it and don't see > why we can't continue treating it as a special case of other features > that folks have to learn how to use from first principles, rather than > saving them that up front learning cost by doing the pattern > extraction to make event driven explicitly asynchronous programming > its own first class concept with dedicated syntax (a hint on that > front: with statements are just particular patterns for using > try/except/finally, decorators are just particular patterns in using > higher order functions, for statements are just particular patterns in > using while statements and builtins, and even imports and classes just > represent particular patterns in combining dictionaries, code > execution and the metaclass machinery - the pattern extraction and > dedicated syntax associated with all of them makes it possible to > learn to *use* these concepts without first having to learn how to > *implement* them) > > From my own perspective, I've spent a reasonable amount of time > attempting to explain to folks the "modal" nature of generators, in > that you can use them both as pure iterable data sources *and* as > coroutines (as per PEP 342). > > The problem I've found is that our current approach ends up failing > the "conceptually different things should also look superficially > different" test: outside certain data pipeline processing problems, > generators-as-iterators and generators-as-coroutines mostly end up > being fundamentally *different* ways of approaching a programming > problem, but the current shared syntax encourages users to attempt to > place them in the same mental bucket. Those users will remain > eternally confused until they learn to place them in two different > buckets despite the shared syntax (the 5 or so different meanings of > "static" in C++ come to mind at this point...). Agree. This confusion of trying to fit two fundamentally different programming models in one syntax is what led me to start thinking about PEP 492 ideas several years ago. And the absence of 'async with' and 'async for' statements forced me to use greenlets, which is another reason for the PEP. > > With explicitly asynchronous development*, this problem of needing to > learn to segment the world into two pieces is particularly important, > and this wonderful rant on red & blue functions published a couple of > months ago helps explain why: > http://journal.stuffwithstuff.com/2015/02/01/what-color-is-your-function/ > > The dedicated async/await syntax proposed in PEP 492 lets the end user > mostly *not care* about all the messiness that needs to happen under > the covers to make explicitly asynchronous programming as close to > normal synchronous programming as possible. 
The fact that under the
> hood in CPython normal functions, coroutines, and generators are
> implemented largely as variant behaviours of the same type (producing
> respectively the requested result, an object expecting to be driven by
> an event loop to produce the requested result, and an object expecting
> to be driven by an iterative loop to produce a succession of requested
> values rather than a single result) can (and at least arguably should)
> be irrelevant to the mental model formed by future Python programmers
> (see http://uxoslo.com/2014/01/14/ux-hints-why-mental-models-matter/
> for more on the difference between the representational models we
> present directly to end users and the underlying implementation models
> we use as programmers to actually make things work)

I'll see how I can incorporate your thoughts into the Abstract and Rationale sections.

Thanks,
Yury

From guido at python.org Thu Apr 30 04:31:22 2015
From: guido at python.org (Guido van Rossum)
Date: Wed, 29 Apr 2015 19:31:22 -0700
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To:
References: <20150430002147.GE10248@stoneleaf.us>
Message-ID:

On Wed, Apr 29, 2015 at 7:07 PM, Nick Coghlan wrote:
> [...]
> Yeah, I'm coming around to the idea. For the async pseudo-keyword, I
> can see that the proposal only allows its use in cases that were
> previously entirely illegal, but I'm not yet clear on how the PEP
> proposes to avoid changing the meaning of the following code:
>
> x = await(this_is_a_function_call)
>
> Unless I'm misreading the proposed grammar in the PEP (which is
> entirely possible), I believe PEP 492 would reinterpret that as:
>
> x = await this_is_not_a_function_call_any_more
>

Ah, but here's the other clever bit: it's only interpreted this way *inside* a function declared with 'async def'. Outside such functions, 'await' is not a keyword, so that grammar rule doesn't trigger.
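Concretely, under the transition plan both of these can coexist in the same module (a sketch of the rule, not an example taken from the PEP itself):

    def f(await):            # legal: outside 'async def', 'await' is just
        return await(10)     # a name, so this is an ordinary call

    async def g(fut):
        return await fut     # inside 'async def', 'await' is the new
                             # suspension-point expression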
(Kind of similar to the way > that the print_function __future__ disables the keyword-ness of 'print', > except here it's toggled on or off depending on whether the nearest > surrounding scope is 'async def' or not. The PEP could probably be clearer > about this; it's all hidden in the Transition Plan section.) Ah, nice, although even reading the Transition Plan section didn't clue me in to that particular aspect of the idea :) Given that clarification, I think the rationale for "no __future__ statement needed" can be strengthened by focusing on the fact that such a statement would largely be *redundant*, given that: * "async def", "async with", and "async for" are all currently syntax errors, and hence adding them is backwards compatible if "async" is otherwise treated as a normal variable name * "await " only gains its new interpretation when used inside an "async def" statement, so "async def" fills the role that a module level compiler declaration like "from __future__ import async_functions" would otherwise fill That said, it may be worth having the future statement *anyway* as: 1. It gives us the runtime accessible record of the feature transition in the __future__ module 2. It lets folks opt-in to the full keyword implementation from day 1 if they prefer, addressing the tokenisation limitation noted in the PEP: https://www.python.org/dev/peps/pep-0492/#transition-period-shortcomings Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From yselivanov.ml at gmail.com Thu Apr 30 05:19:56 2015 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 29 Apr 2015 23:19:56 -0400 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: References: <20150430002147.GE10248@stoneleaf.us> Message-ID: <55419F5C.20604@gmail.com> On 2015-04-29 11:01 PM, Nick Coghlan wrote: > On 30 April 2015 at 12:31, Guido van Rossum wrote: >> >On Wed, Apr 29, 2015 at 7:07 PM, Nick Coghlan wrote: >>> >> >>> >>[...] >>> >>Yeah, I'm coming around to the idea. For the async pseudo-keyword, I >>> >>can see that the proposal only allows its use in cases that were >>> >>previously entirely illegal, but I'm not yet clear on how the PEP >>> >>proposes to avoid changing the meaning of the following code: >>> >> >>> >> x = await(this_is_a_function_call) >>> >> >>> >>Unless I'm misreading the proposed grammar in the PEP (which is >>> >>entirely possible), I believe PEP 492 would reinterpret that as: >>> >> >>> >> x = await this_is_not_a_function_call_any_more >> > >> > >> >Ah, but here's the other clever bit: it's only interpreted this way*inside* >> >a function declared with 'async def'. Outside such functions, 'await' is not >> >a keyword, so that grammar rule doesn't trigger. (Kind of similar to the way >> >that the print_function __future__ disables the keyword-ness of 'print', >> >except here it's toggled on or off depending on whether the nearest >> >surrounding scope is 'async def' or not. The PEP could probably be clearer >> >about this; it's all hidden in the Transition Plan section.) 
> Ah, nice, although even reading the Transition Plan section didn't > clue me in to that particular aspect of the idea :) > > Given that clarification, I think the rationale for "no __future__ > statement needed" can be strengthened by focusing on the fact that > such a statement would largely be*redundant*, given that: > > * "async def", "async with", and "async for" are all currently syntax > errors, and hence adding them is backwards compatible if "async" is > otherwise treated as a normal variable name > * "await " only gains its new interpretation when used inside an > "async def" statement, so "async def" fills the role that a module > level compiler declaration like "from __future__ import > async_functions" would otherwise fill Thanks, Nick. I've fixed the Transition Plan section, and rewrote the "why not __future__" one too. https://hg.python.org/peps/rev/552773d7e085 https://hg.python.org/peps/rev/5db3ad3d540b Yury From guido at python.org Thu Apr 30 05:21:59 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 29 Apr 2015 20:21:59 -0700 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: References: <20150430002147.GE10248@stoneleaf.us> Message-ID: Belt and suspenders. :-) We may need help with the implementation though -- PEP 479 is still waiting on implementation help with __future__ too AFAIK. On Wed, Apr 29, 2015 at 8:01 PM, Nick Coghlan wrote: > On 30 April 2015 at 12:31, Guido van Rossum wrote: > > On Wed, Apr 29, 2015 at 7:07 PM, Nick Coghlan > wrote: > >> > >> [...] > >> Yeah, I'm coming around to the idea. For the async pseudo-keyword, I > >> can see that the proposal only allows its use in cases that were > >> previously entirely illegal, but I'm not yet clear on how the PEP > >> proposes to avoid changing the meaning of the following code: > >> > >> x = await(this_is_a_function_call) > >> > >> Unless I'm misreading the proposed grammar in the PEP (which is > >> entirely possible), I believe PEP 492 would reinterpret that as: > >> > >> x = await this_is_not_a_function_call_any_more > > > > > > Ah, but here's the other clever bit: it's only interpreted this way > *inside* > > a function declared with 'async def'. Outside such functions, 'await' is > not > > a keyword, so that grammar rule doesn't trigger. (Kind of similar to the > way > > that the print_function __future__ disables the keyword-ness of 'print', > > except here it's toggled on or off depending on whether the nearest > > surrounding scope is 'async def' or not. The PEP could probably be > clearer > > about this; it's all hidden in the Transition Plan section.) > > Ah, nice, although even reading the Transition Plan section didn't > clue me in to that particular aspect of the idea :) > > Given that clarification, I think the rationale for "no __future__ > statement needed" can be strengthened by focusing on the fact that > such a statement would largely be *redundant*, given that: > > * "async def", "async with", and "async for" are all currently syntax > errors, and hence adding them is backwards compatible if "async" is > otherwise treated as a normal variable name > * "await " only gains its new interpretation when used inside an > "async def" statement, so "async def" fills the role that a module > level compiler declaration like "from __future__ import > async_functions" would otherwise fill > > That said, it may be worth having the future statement *anyway* as: > > 1. It gives us the runtime accessible record of the feature transition > in the __future__ module > 2. 
It lets folks opt-in to the full keyword implementation from day 1
> if they prefer, addressing the tokenisation limitation noted in the
> PEP:
> https://www.python.org/dev/peps/pep-0492/#transition-period-shortcomings
>
> Regards,
> Nick.
>
> --
> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
>

--
--Guido van Rossum (python.org/~guido)

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From greg.ewing at canterbury.ac.nz Thu Apr 30 07:39:58 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 17:39:58 +1200
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To:
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
Message-ID: <5541C02E.50505@canterbury.ac.nz>

Paul Moore wrote:
> I agree. While I don't use coroutines/asyncio, and I may never do so,
> I will say that I find Python's approach very difficult to understand.

Well, I tried to offer something easier to understand. The idea behind PEP 3152 is that writing async code should be just like writing threaded code, except that the suspension points are explicit. But apparently that was too simple, or something.

> Looking at the Wikipedia article on coroutines, I see an example of
> how a producer/consumer process might be written with coroutines:
>
> var q := new queue
>
> coroutine produce
>     loop
>         while q is not full
>             create some new items
>             add the items to q
>         yield to consume
>
> coroutine consume
>     loop
>         while q is not empty
>             remove some items from q
>             use the items
>         yield to produce

Aaargh, this is what we get for overloading the word "coroutine". The Wikipedia article is talking about a technique where coroutines yield control to other explicitly identified coroutines.

Coroutines in asyncio don't work that way; instead they just suspend themselves, and the event loop takes care of deciding which one to run next.

> I can't even see how to relate that to PEP 492 syntax. I'm not allowed
> to use "yield",

You probably wouldn't need to explicitly yield, since you'd use an asyncio.Queue for passing data between the tasks, which takes care of suspending until data becomes available. You would only need to yield if you were implementing some new synchronisation primitive.

Yury's answer to that appears to be that you don't do it with an async def function, you create an object that implements the awaitable-object protocol directly.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 30 07:55:24 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 17:55:24 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To:
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
Message-ID: <5541C3CC.4030602@canterbury.ac.nz>

Nathaniel Smith wrote:
> (I suspect this may also be the impetus behind Greg's request that it
> just be treated the same as unary minus. IMHO it matters much more that
> the rules be predictable and teachable than that they allow or disallow
> every weird edge case in exactly the right way.)

Exactly.
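Spelled out against the precedence being discussed ('fut' standing in for any awaitable):

    -await fut       # OK: parsed as -(await fut)
    await (-fut)     # OK syntactically; fails at runtime unless the
                     # result of (-fut) is itself awaitable
    await -fut       # SyntaxError under the proposed grammar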
--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 30 08:35:50 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 18:35:50 +1200
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541341A.8090204@gmail.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: <5541CD46.7080803@canterbury.ac.nz>

Yury Selivanov wrote:
> Everybody is pulling me in a different direction :)
> Guido proposed to call them "native coroutines". Some people
> think that "async functions" is a better name. Greg loves
> his "cofunction" term.

Not the term, the *idea*. But PEP 492 is not based on that idea, so I don't advocate calling anything in PEP 492 a cofunction.

My vote goes to "async function".

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 30 08:53:00 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 18:53:00 +1200
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To:
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com> <5541341A.8090204@gmail.com>
Message-ID: <5541D14C.5090404@canterbury.ac.nz>

Skip Montanaro wrote:
> According to Wikipedia, the term "coroutine" was first coined in 1958,
> so several generations of computer science graduates will be familiar
> with the textbook definition. If your use of "coroutine" matches the
> textbook definition of the term, I think you should continue to use it
> instead of inventing new names which will just confuse people new to
> Python.

I don't think anything in asyncio or PEP 492 fits that definition directly.

Generators and async def functions seem to be what that page calls a "generator" or "semicoroutine": they differ in that coroutines can control where execution continues after they yield, while generators cannot, instead transferring control back to the generator's caller.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 30 09:16:58 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 19:16:58 +1200
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55416139.1000004@gmail.com>
References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz> <55416139.1000004@gmail.com>
Message-ID: <5541D6EA.4070507@canterbury.ac.nz>

Yury Selivanov wrote:
> Sorry, but I'm not sure where & when I had any troubles
> predicting the consequences...

You said you wanted 'await a() * b()' to be a syntax error, but your grammar allows it.

--
Greg

From encukou at gmail.com Tue Apr 28 12:05:59 2015
From: encukou at gmail.com (Petr Viktorin)
Date: Tue, 28 Apr 2015 12:05:59 +0200
Subject: [Python-Dev] A macro for easier rich comparisons
In-Reply-To:
References:
Message-ID:

On Tue, Apr 28, 2015 at 11:13 AM, Victor Stinner wrote:
> Hi,
>
> 2015-04-27 16:02 GMT+02:00 Petr Viktorin :
>> A macro like this would reduce boilerplate in stdlib and third-party C
>> extensions. It would ease porting C extensions to Python 3, where rich
>> comparison is mandatory.
>
> It would be nice to have a six module for C extensions. I'm quite sure
> that many projects are already full of #ifdef PYTHON3 ... #else ...
> #endif macros.
The idea actually came from my work on such a library: http://py3c.readthedocs.org/en/latest/

>> #define Py_RETURN_RICHCOMPARE(val1, val2, op) \
>>     do { \
>>         switch (op) { \
>>         case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE; \
>>         } \
>>         Py_RETURN_NOTIMPLEMENTED; \
>>     } while (0)

> I would prefer a function for that:
>
> PyObject *Py_RichCompare(long val1, long val2, int op);

The original version of the macro used ternary statements. This was shot down because a chain of comparisons would be slower than a case statement. (See the discussion on the issue.) Wouldn't a function call also be slower? Also, a function with long arguments won't work on unsigned long or long long.

> You should also handle an invalid operator. PyUnicode_RichCompare() calls
> PyErr_BadArgument() in this case.

There are many different precedents, from ignoring this case to doing an assert. Is PyErr_BadArgument() better than returning NotImplemented?

> Anyway, please open an issue for this idea.

http://bugs.python.org/issue23699

From encukou at gmail.com Tue Apr 28 17:09:03 2015
From: encukou at gmail.com (Petr Viktorin)
Date: Tue, 28 Apr 2015 17:09:03 +0200
Subject: [Python-Dev] A macro for easier rich comparisons
In-Reply-To: <20150428105917.77c6f3df@anarchist.wooz.org>
References: <20150428105917.77c6f3df@anarchist.wooz.org>
Message-ID:

On Tue, Apr 28, 2015 at 4:59 PM, Barry Warsaw wrote:
> On Apr 28, 2015, at 11:13 AM, Victor Stinner wrote:
>
>> It would be nice to have a six module for C extensions. I'm quite sure
>> that many projects are already full of #ifdef PYTHON3 ... #else ...
>> #endif macros.
>
> Maybe encapsulating some of the recommendations here:
>
> https://wiki.python.org/moin/PortingToPy3k/BilingualQuickRef#Python_extension_modules

py3c (or its documentation) now has all that except REPRV (with an alias for the native string type, e.g. "PyStr", you can use that in all reprs, so REPRV strikes me as somewhat redundant).

> (We really need to collect all this information in one place.)
>
>>> #define Py_RETURN_RICHCOMPARE(val1, val2, op)
>
> I think this macro would make a nice addition to the C API. It might read
> better as `Py_RETURN_RICHCOMPARE(val1, op, val2)`.

(val1, val2, op) mirrors richcmp and PyObject_RichCompareBool; I think a different order of arguments would just be confusing.

From pmiscml at gmail.com Tue Apr 28 21:24:22 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Tue, 28 Apr 2015 22:24:22 +0300
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To: <553DE9E1.4040403@hotpy.org>
References: <553D48BF.1070600@hotpy.org> <553DE9E1.4040403@hotpy.org>
Message-ID: <20150428222422.54d1884d@x230>

Hello,

On Mon, 27 Apr 2015 08:48:49 +0100
Mark Shannon wrote:

>
>
> On 27/04/15 00:13, Guido van Rossum wrote:
> > But new syntax is the whole point of the PEP. I want to be able to
> > *syntactically* tell where the suspension points are in coroutines.
> Doesn't "yield from" already do that?
>
> > Currently this means looking for yield [from]; PEP 492 just adds
> > looking for await and async [for|with].
> > Making await() a function defeats the purpose because now aliasing
> > can hide its presence, and we're back in the land of gevent or
> > stackless (where *anything* can potentially suspend the current
> > task). I don't want to live in that land.
>
> I don't think I was clear enough. I said that "await" *is* a
> function, not that it should be disguised as one.

Yes, you said, but it is not. I guess other folks left figuring that out to you as an exercise, and it's a worthy one. Hint: await appears to translate to GET_AWAITABLE and YIELD_FROM opcodes.

If your next reply is "I told you so", then you again miss that "await" is a special Python language construct (effectively, an operator), while the fact that it's implemented as GET_AWAITABLE and YIELD_FROM opcodes in CPython is only CPython's implementation detail, CPython being just one (random) Python language implementation.

> Reading the code, "GetAwaitableIter" would be a better name for that
> element of the implementation. It is a straightforward non-blocking
> function.

Based on all this passage, my guess is that you miss the difference between C and Python functions. At the C level, there are only functions, used to implement everything (C doesn't offer anything else). But at the Python level, there is a larger variety: functions, methods, special forms (a term with a bow to Scheme - a function which you can't implement in terms of other functions and which may have behavior they can't have). "await" is a special form. The fact that it's implemented by a C function (or not exactly, as pointed out above) is just CPython's implementation detail. Arguing that "await" should be something based on what you saw in C code is putting it all backwards.

--
Best regards,
Paul mailto:pmiscml at gmail.com

From pmiscml at gmail.com Tue Apr 28 21:39:41 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Tue, 28 Apr 2015 22:39:41 +0300
Subject: [Python-Dev] Issues with PEP 482 (1)
In-Reply-To: <553FD525.1030202@hotpy.org>
References: <553FD525.1030202@hotpy.org>
Message-ID: <20150428223941.60c9ba55@x230>

Hello,

On Tue, 28 Apr 2015 19:44:53 +0100
Mark Shannon wrote:

[]

> A coroutine without a yield statement can be defined simply and
> concisely, thus:
>
> @coroutine
> def f():
>     return 1

[]

> A pure-python definition of the "coroutine" decorator is
> given below.

[]

> from types import FunctionType, CodeType
>
> CO_COROUTINE = 0x0080
> CO_GENERATOR = 0x0020
>
> def coroutine(f):
>     'Converts a function to a generator function'
>     old_code = f.__code__
>     new_code = CodeType(
>         old_code.co_argcount,
>         old_code.co_kwonlyargcount,

This is a joke, right? This code has nothing to do with *Python*. This code deals with internal implementation details of *CPython*. No other Python implementation would have anything like that (because then it would be just another CPython, and there's clearly no need to have two or more CPythons). The code above is as helpful as saying "you can write some magic values at some magic memory addresses to solve any problem you ever have".

All that is rather far away from making coroutine writing in Python easier and less error-prone, which is the topic of PEP 492.

--
Best regards,
Paul mailto:pmiscml at gmail.com

From tritium-list at sdamon.com Tue Apr 28 22:22:50 2015
From: tritium-list at sdamon.com (Alexander Walters)
Date: Tue, 28 Apr 2015 16:22:50 -0400
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To:
References:
Message-ID: <553FEC1A.5000907@sdamon.com>

does this not work for you?
from __future__ import unicode_literals

On 4/28/2015 16:20, Adam Bartoš wrote:
> Hello,
>
> is it possible to somehow tell Python 2.7 to compile a code entered in
> the interactive session with the flag PyCF_SOURCE_IS_UTF8 set? I'm
> considering adding support for Python 2 in my package
> (https://github.com/Drekin/win-unicode-console) and I have run into
> the fact that when u"α" is entered in the interactive session, it
> results in u"\xce\xb1" rather than u"\u03b1". As this seems to be a
> highly specialized question, I'm asking it here.
>
> Regards, Drekin
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From elprans at gmail.com Tue Apr 28 22:22:54 2015
From: elprans at gmail.com (Elvis Pranskevichus)
Date: Tue, 28 Apr 2015 16:22:54 -0400
Subject: [Python-Dev] Issues with PEP 482
References: <553FD525.1030202@hotpy.org>
Message-ID:

Hi Mark,

Mark Shannon wrote:
> Hi,
>
> I still think that there are several issues that need addressing with
> PEP 492. This time, one issue at a time :)
>
> "async"
>
> The "Rationale and Goals" of PEP 492 states that PEP 380 has 3
> shortcomings. The second of which is:
> """It is not possible to natively define a coroutine which has no
> yield or yield from statements."""
> This is incorrect, although what is meant by 'natively' is unclear.
>
> A coroutine without a yield statement can be defined simply and
> concisely, thus:
>
> @coroutine
> def f():
>     return 1
>
> This is only a few characters longer than the proposed new syntax,
> perfectly explicit and requires no modification to the language
> whatsoever. A pure-python definition of the "coroutine" decorator is
> given below.
>
> So could the "Rationale and Goals" be corrected accordingly, please.
> Also, either the "async def" syntax should be dropped, or a new
> justification is required.
>
> Cheers,
> Mark.
>

As was previously mentioned, new async syntax is a major point of PEP 492. Coroutine-based concurrent programming is something that a lot of languages and platforms are adopting as a first-class feature. Just look at the list of references in the PEP.

While the specific coroutine-definition point you are arguing about can certainly be debated, you seem to disregard the other points PEP 492 is raising, which boil down to making coroutines first-class objects in Python, with all the robustness and support that implies. Decorators in Python are an auxiliary feature that has nothing to do with core language semantics.

Also, "async for" and "async with" are just as important in concurrent programs as regular "for" and "with" are in sequential programs.

Saying "no we don't need new syntax", when lots of folks think we do, is just a contradiction without real argument.
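To illustrate the kind of code those statements enable, a sketch loosely modelled on async database drivers such as aiomysql ('pool', 'conn', and 'process' are assumed names here, not a real API):

    async def handler(pool, query):
        async with pool.acquire() as conn:         # asynchronous context manager
            async for row in conn.execute(query):  # asynchronous iteration
                process(row)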
Elvis

From pmiscml at gmail.com Tue Apr 28 23:14:15 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Wed, 29 Apr 2015 00:14:15 +0300
Subject: [Python-Dev] PEP 492: No new syntax is required
In-Reply-To: <553FE696.8060806@hotpy.org>
References: <553D48BF.1070600@hotpy.org> <553DE9E1.4040403@hotpy.org> <20150428222422.54d1884d@x230> <553FE696.8060806@hotpy.org>
Message-ID: <20150429001415.79b56515@x230>

Hello,

On Tue, 28 Apr 2015 20:59:18 +0100
Mark Shannon wrote:

>
>
> On 28/04/15 20:24, Paul Sokolovsky wrote:
> > Hello,
> >
> [snip]
>
> > Based on all this passage, my guess is that you miss the difference
> > between C and Python functions.
> This is rather patronising, almost to the point of being insulting.
> Please keep the debate civil.

And yet people do make mistakes and misunderstand, and someone should bother with the social psychology of programming - why this happens, what the typical patterns and root causes are, etc. I don't think you should be insulted, especially if you think you're right about your points - then all counter-arguments will either help you understand the other side, or will just look funny.

> [snip]

> Cheers,
> Mark.

--
Best regards,
Paul mailto:pmiscml at gmail.com

From pmiscml at gmail.com Tue Apr 28 23:37:00 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Wed, 29 Apr 2015 00:37:00 +0300
Subject: [Python-Dev] Issues with PEP 482 (1)
In-Reply-To: <553FE6D1.4030801@hotpy.org>
References: <553FD525.1030202@hotpy.org> <20150428223941.60c9ba55@x230> <553FE6D1.4030801@hotpy.org>
Message-ID: <20150429003700.563582ec@x230>

Hello,

On Tue, 28 Apr 2015 21:00:17 +0100
Mark Shannon wrote:

[]

> >> CO_COROUTINE = 0x0080
> >> CO_GENERATOR = 0x0020
> >>
> >> def coroutine(f):
> >>     'Converts a function to a generator function'
> >>     old_code = f.__code__
> >>     new_code = CodeType(
> >>         old_code.co_argcount,
> >>         old_code.co_kwonlyargcount,
> >
> >
> > This is a joke, right?
> Well it was partly for entertainment value, although it works on PyPy.
>
> The point is that something that can be done with a decorator,
> whether in pure Python or as a builtin, does not require new syntax.

And that's exactly not what Python is and not how it evolves. Unlike Scheme, it doesn't offer some minimal orthogonal basis out of which everything can be derived by functional application. Instead, it's more pragmatic and offers a plethora of (well defined, unlike in many other languages) concepts and implementations to choose from. And if it so happens that practice shows that some concept needs a "slight" redefinition, such a concept is defined as first-class, despite the fact that it matches 90% of the semantics of another concept. Fortunately, at the implementation level, those 90% of semantics are shared, so it is not outright "bloat".

The current wishful thinking of this PEP is that more people will know and use "await", while "yield from" will keep being understood and used by far from every Python programmer.

(Just to state the obvious, all the above is actually my own attempt to grasp it, reasoning backwards - trying to explain Python's progress in terms of how this particular PEP 492 and its older friends progress; it may be quite different for other aspects of the language. I, for one, am rather surprised that the BDFL is so positive about this PEP.)

> Cheers,
> Mark.

--
Best regards,
Paul mailto:pmiscml at gmail.com

From ischwabacher at wisc.edu Wed Apr 29 21:24:57 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 29 Apr 2015 14:24:57 -0500
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round In-Reply-To: <7720e702fb35.55412fd5@wiscmail.wisc.edu> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com> <76d08c5febac.55412df3@wiscmail.wisc.edu> <75c0c6e0c543.55412e2f@wiscmail.wisc.edu> <75c0d79a9efa.55412ea7@wiscmail.wisc.edu> <75c086a7c44b.55412ee4@wiscmail.wisc.edu> <75c0e3aec47e.55412f20@wiscmail.wisc.edu> <75c08a2181ed.55412f5c@wiscmail.wisc.edu> <7720ec97fe17.55412f99@wiscmail.wisc.edu> <7720e702fb35.55412fd5@wiscmail.wisc.edu> Message-ID: <75c09142bded.5540e9b9@wiscmail.wisc.edu> On 15-04-29, Yury Selivanov wrote: > Hi Ethan, > > On 2015-04-29 2:32 PM, Ethan Furman wrote: > >On 04/29, Yury Selivanov wrote: > >>On 2015-04-29 1:25 PM, Ethan Furman wrote: > >>>cannot also just work and be the same as the parenthesized > >>>version. > >>Because it does not make any sense. > >I obviously don't understand your position that "it does not make > >any sense" -- perhaps you could explain a bit? > > > >What I see is a suspension point that is waiting for the results of > >coro(), which will be negated (and returned/assigned/whatever). > >What part of that doesn't make sense? > > > > Because you want operators to be resolved in the > order you see them, generally. > > You want '(await -fut)' to: > > 1. Suspend on fut; > 2. Get the result; > 3. Negate it. > > This is a non-obvious thing. I would myself interpret it > as: > > 1. Get fut.__neg__(); > 2. await on it. > > So I want to make this syntactically incorrect: Does this need to be a syntax error? -"hello" raises TypeError because str doesn't have a __neg__, but there's no reason a str subclass couldn't define one. "TypeError: bad operand type for unary -: 'asyncio.Future'" is enough to clear up any misunderstandings, and if someone approaching a new language construct doesn't test their code well enough to at least execute all the code paths, the difference between a compile-time SyntaxError and a run-time TypeError is not going to save them. ijs > 'await -fut' would throw a SyntaxError. To do what you > want, write a pythonic '- await fut'. > > > Yury > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ischwabacher%40wisc.edu From pmiscml at gmail.com Wed Apr 29 23:06:06 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 30 Apr 2015 00:06:06 +0300 Subject: [Python-Dev] PEP 492: What is the real goal? In-Reply-To: References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com> Message-ID: <20150430000606.48bc6b7c@x230> Hello, On Wed, 29 Apr 2015 20:19:40 +0100 Paul Moore wrote: [] > Thanks for that. That does look pretty OK. One question, though - it > uses an asyncio Queue. The original code would work just as well with > a list, or more accurately, something that wasn't designed for async > use. So the translation isn't completely equivalent. Also, can I run > the produce/consume just by calling produce()? My impression is that > with asyncio I need an event loop - which "traditional" coroutines > don't need. 
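For reference, a runnable version of that producer/consumer pair on asyncio (Python 3.4 syntax; the event loop is what ties the two tasks together):

    import asyncio

    q = asyncio.Queue(maxsize=5)

    @asyncio.coroutine
    def produce():
        for i in range(20):
            yield from q.put(i)        # suspends while the queue is full

    @asyncio.coroutine
    def consume():
        for _ in range(20):
            item = yield from q.get()  # suspends while the queue is empty
            print('consumed', item)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(produce(), consume()))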
Nevertheless, the details aren't so important, it was only
> a toy example anyway.

All this confusion stems from the fact that the Wikipedia article fails to clearly provide classification dichotomies for coroutines. I suggest reading the Lua coroutine description as a much better attempt at classification: http://www.lua.org/pil/9.1.html . It is, for example, explicit in mentioning a common pitfall: "Some people call asymmetric coroutine semi-coroutines (because they are not symmetrical, they are not really co). However, other people use the same term semi-coroutine to denote a restricted implementation of coroutines". Comparing that to the Wikipedia article, you'll notice that it uses "semicoroutine" in just one of those senses, and, well, different people use the "semi" part along a different classification axis.

So, trying to draw a table from Lua's text, there are the following two axes:

Axis 1: Symmetric vs Asymmetric

Asymmetric coroutines use two control flow constructs, akin to subroutine call and return. (Names vary; return is usually called yield.) Symmetric ones use only one. You can think of symmetric coroutines as only calling or only returning, though the less confusing term is "switch to".

Axis 2: "Lexical" vs "Dynamic"

Naming here is less standardized. Lua calls its coroutines "true" coroutines, while the other kind are "generators". Others call them "coroutines" vs "generators". But the real difference is intuitively akin to lexical vs dynamic scoping. "Lexical" coroutines require explicit marking of each (including recursive) call to a coroutine. "Dynamic" ones do not - you can call a normal-looking function, and it suddenly passes control somewhere else (another coroutine), without you having a clue about it.

All *four* recombined types above are coroutines, albeit all with slightly different properties.

Symmetric dynamic coroutines are the most powerful type - as powerful as an abyss. They are what is usually used to frighten the innocent. Wikipedia shows you an example of them. No sane real-world language uses symmetric coroutines - they're not useful without continuations, and sane real-world people don't want to manage continuations manually.

Python, Lua, and C# use asymmetric coroutines. Python and C# use asymmetric "lexical" coroutines - the simplest, and thus safest, type, but one which has limitations wrt doing mind-boggling things. Lua has "dynamic" asymmetric coroutines - a more powerful, and thus more dangerous, type (you want to look with a jaundiced eye at that guy's framework based on "dynamic" coroutines - you'd better rewrite it from scratch before you trust it).

--
Best regards,
Paul mailto:pmiscml at gmail.com

From ischwabacher at wisc.edu Thu Apr 30 00:35:34 2015
From: ischwabacher at wisc.edu (Isaac Schwabacher)
Date: Wed, 29 Apr 2015 17:35:34 -0500
Subject: [Python-Dev] PEP 492 vs.
PEP 3152, new round In-Reply-To: <77109d2fd623.55415c7b@wiscmail.wisc.edu> References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com> <76d08c5febac.55412df3@wiscmail.wisc.edu> <75c0c6e0c543.55412e2f@wiscmail.wisc.edu> <75c0d79a9efa.55412ea7@wiscmail.wisc.edu> <75c086a7c44b.55412ee4@wiscmail.wisc.edu> <75c0e3aec47e.55412f20@wiscmail.wisc.edu> <75c08a2181ed.55412f5c@wiscmail.wisc.edu> <7720ec97fe17.55412f99@wiscmail.wisc.edu> <7720e702fb35.55412fd5@wiscmail.wisc.edu> <75c09142bded.5540e9b9@wiscmail.wisc.edu> <55415727.6080402@gmail.com> <7750caa0bab9.55415b88@wiscmail.wisc.edu> <7790cf77ecc1.55415c3e@wiscmail.wisc.edu> <77109d2fd623.55415c7b@wiscmail.wisc.edu> Message-ID: <76d0addfbecc.55411666@wiscmail.wisc.edu> On 15-04-29, Yury Selivanov wrote: > > > On 2015-04-29 3:24 PM, Isaac Schwabacher wrote: > >On 15-04-29, Yury Selivanov wrote: > >>Hi Ethan, > [..] > >>So I want to make this syntactically incorrect: > >Does this need to be a syntax error? -"hello" raises TypeError because str doesn't have a __neg__, but there's no reason a str subclass couldn't define one. "TypeError: bad operand type for unary -: 'asyncio.Future'" is enough to clear up any misunderstandings, and if someone approaching a new language construct doesn't test their code well enough to at least execute all the code paths, the difference between a compile-time SyntaxError and a run-time TypeError is not going to save them. > > The grammar of the language should match the most common > use case. > > FWIW, I've just updated the pep with a precedence table: > https://hg.python.org/peps/rev/d355918bc0d7 I'd say the grammar of the language should be the least surprising overall, which definitely means it should be clean in the most common case but doesn't mean it has to go out of its way to make other cases difficult. But reading that precedence table makes everything clear-- the proper comparison isn't (-"hello"), it's (-not False), which *is* a syntax error. Silly prefix operators. ijs From pmiscml at gmail.com Thu Apr 30 02:56:03 2015 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 30 Apr 2015 03:56:03 +0300 Subject: [Python-Dev] PEP 492 quibble and request In-Reply-To: <55417845.3060507@gmail.com> References: <20150430002147.GE10248@stoneleaf.us> <55417845.3060507@gmail.com> Message-ID: <20150430035603.419639a6@x230> Hello, On Wed, 29 Apr 2015 20:33:09 -0400 Yury Selivanov wrote: > Hi Ethan, > > On 2015-04-29 8:21 PM, Ethan Furman wrote: > > From the PEP: > > > >> Why not a __future__ import > >> > >> __future__ imports are inconvenient and easy to forget to add. > > That is a horrible rationale for not using an import. By that > > logic we should have everything in built-ins. ;) > > > > > >> Working example > >> ... > > The working example only uses async def and await, not async with > > nor async for nor __aenter__, etc., etc. > > > > Could you put in a more complete example -- maybe a basic chat room > > with both server and client -- that demonstrated more of the new > > possibilities? 
>
> Andrew Svetlov has implemented some new features in his
> aiomysql driver:
>
> https://github.com/aio-libs/aiomysql/blob/await/tests/test_async_iter.py
>
> I don't want to cite it in the PEP because it's not complete
> yet, and some idioms (like 'async with') aren't used to their
> full potential.
>
> > Having gone through the PEP again, I am still no closer to
> > understanding what happens here:
> >
> > data = await reader.read(8192)
> >
> > What does the flow of control look like at the interpreter level?
>
> 'await' is semantically equivalent to 'yield from' in this line.
>
> To really understand all implementation details of this line
> you need to read PEP 3156 and experiment with asyncio. There
> is no easier way, unfortunately. I can't add a super detailed
> explanation of how event loops can be implemented in PEP 492,
> that's not in its scope.
>
> The good news is that to use asyncio on a daily basis you
> don't need to know all the details, just as you don't need to know
> how 'ceval.c' works and how 'setTimeout' is implemented in
> JavaScript.

+1

But if you really want, you can. The likely reason for that, though, would be the desire to develop "yield from" for an alternative Python implementation. You can take inspiration from a diagram I drew while I implemented "yield from" for MicroPython:

https://dl.dropboxusercontent.com/u/44884329/yield-from.pdf

--
Best regards,
Paul mailto:pmiscml at gmail.com

From mal at egenix.com Thu Apr 30 09:59:34 2015
From: mal at egenix.com (M.-A. Lemburg)
Date: Thu, 30 Apr 2015 09:59:34 +0200
Subject: [Python-Dev] Clarification of PEP 476 "opting out" section
In-Reply-To:
References:
Message-ID: <5541E0E6.6060701@egenix.com>

On 30.04.2015 02:33, Nick Coghlan wrote:
> Hi folks,
>
> This is just a note to highlight the fact that I tweaked the "Opting
> out" section in PEP 476 based on various discussions I've had over the
> past few months: https://hg.python.org/peps/rev/dfd96ee9d6a8
>
> The notable changes:
>
> * the example monkeypatching code handles AttributeError when looking
> up "ssl._create_unverified_context", in order to accommodate older
> versions of Python that don't have PEP 476 implemented
> * new paragraph making it clearer that while the intended use case for
> the monkeypatching trick is as a workaround to handle environments
> where you *know* HTTPS certificate verification won't work properly
> (including explicit references to sitecustomize.py and Standard
> Operating Environments for Python), there's also a secondary use case
> in allowing applications to provide a system administrator controlled
> setting to globally disable certificate verification (hence the change
> to the example code)
> * new paragraph making it explicit that even though we've improved
> Python's default behaviour, particularly security sensitive
> applications should still provide their own context rather than
> relying on the defaults

Can we please make the monkeypatch a regular part of Python's site.py which can be enabled via an environment variable, say export PYTHONHTTPSVERIFY=0.

See http://bugs.python.org/issue23857 for the discussion.

Especially for Python 2.7.9, the default verification from PEP 476 is causing problems for admins who want to upgrade their Python installation without breaking applications that use Python. They need an easy and official non-hackish way to opt out of the PEP 476 default on a per-application basis.
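Roughly, the site.py hook being proposed could look like this (a sketch of the idea, not settled code; PYTHONHTTPSVERIFY is the name suggested above):

    import os
    import ssl

    if os.environ.get('PYTHONHTTPSVERIFY', '1') == '0':
        try:
            ssl._create_default_https_context = ssl._create_unverified_context
        except AttributeError:
            pass  # pre-PEP-476 Python: verification is already off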
Thanks, -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Source (#1, Apr 30 2015) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> mxODBC Plone/Zope Database Adapter ... http://zope.egenix.com/ >>> mxODBC, mxDateTime, mxTextTools ... http://python.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! :::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From p.f.moore at gmail.com Thu Apr 30 10:17:08 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 30 Apr 2015 09:17:08 +0100 Subject: [Python-Dev] PEP 492: What is the real goal? In-Reply-To: <5541C02E.50505@canterbury.ac.nz> References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541C02E.50505@canterbury.ac.nz> Message-ID: On 30 April 2015 at 06:39, Greg Ewing wrote: > Aaargh, this is what we get for overloading the word > "coroutine". The Wikipedia article is talking about a > technique where coroutines yield control to other > explicitly identified coroutines. Yep, I understand that. It's just that that's what I understand by coroutines. > Coroutines in asyncio don't work that way; instead > they just suspend themselves, and the event loop > takes care of deciding which one to run next. Precisely. As I say, the terminology is probably not going to change now - no big deal in practice. Paul From stephen at xemacs.org Thu Apr 30 10:18:13 2015 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Thu, 30 Apr 2015 17:18:13 +0900 Subject: [Python-Dev] Unicode literals in Python 2.7 In-Reply-To: References: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp> Message-ID: <878udaowqi.fsf@uwakimon.sk.tsukuba.ac.jp> Chris Angelico writes: > It's legal Unicode, but it doesn't mean what he typed in. Of course, that's obvious. My point is "Welcome to the wild wacky world of soi-disant 'internationalized' software, where what you see is what you get regardless of what you type." From greg.ewing at canterbury.ac.nz Thu Apr 30 10:56:10 2015 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 30 Apr 2015 20:56:10 +1200 Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round In-Reply-To: References: <553BF680.10304@gmail.com> <553E3109.2050106@canterbury.ac.nz> <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com> <554102A4.1040603@gmail.com> <55415F5F.3090101@canterbury.ac.nz> <55416DBF.3020303@gmail.com> Message-ID: <5541EE2A.1070109@canterbury.ac.nz> Nathaniel Smith wrote: > Even if we put aside our trained intuitions about arithmetic, I think > it's correct to say that the way unary minus is parsed is: everything > to the right of it that has a tighter precedence gets collected up and > parsed as an expression, and then it takes that expression as its > argument. Tighter or equal, actually: '--a' is allowed. This explains why Yury's syntax disallows 'await -f'. The 'await' operator requires something after it, but there's *nothing* between it and the following '-', which binds less tightly. So it's understandable, but you have to think a bit harder. Why do we have to think harder? I suspect it's because the notion of precedence is normally introduced to resolve ambiguities. 
Knowing that infix '*' has higher precedence than infix '+' tells us that 'a + b * c' is parsed as 'a + (b * c)' and not '(a + b) * c'.

Similarly, knowing that infix '.' has higher precedence than prefix '-' tells us that '-a.b' is parsed as '-(a.b)' rather than '(-a).b'.

However, giving prefix 'await' higher precedence than prefix '-' doesn't serve to resolve any ambiguity. '- await f' is parsed as '-(await f)' either way, and 'await f + g' is parsed as '(await f) + g' either way. So when we see 'await -f', we think we already know what it means. There is only one possible order for the operations, so it doesn't look as though precedence comes into it at all, and we don't consider it when judging whether it's a valid expression.

What's the conclusion from all this? I think it's that using precedence purely to disallow certain constructs, rather than to resolve ambiguities, leads to a grammar with less-than-intuitive characteristics.

--
Greg

From greg.ewing at canterbury.ac.nz Thu Apr 30 10:58:53 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 20:58:53 +1200
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <20150430002147.GE10248@stoneleaf.us>
References: <20150430002147.GE10248@stoneleaf.us>
Message-ID: <5541EECD.2020903@canterbury.ac.nz>

Ethan Furman wrote:
> Having gone through the PEP again, I am still no closer to understanding
> what happens here:
>
> data = await reader.read(8192)
>
> What does the flow of control look like at the interpreter level?

Are you sure you *really* want to know? For the sake of sanity, I recommend ignoring the actual control flow and pretending that it's just like

    data = reader.read(8192)

with the reader.read() method somehow able to be magically suspended.

--
Greg

From p.f.moore at gmail.com Thu Apr 30 11:02:17 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 30 Apr 2015 10:02:17 +0100
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To:
References: <20150430002147.GE10248@stoneleaf.us> <55417845.3060507@gmail.com>
Message-ID:

On 30 April 2015 at 02:52, Nick Coghlan wrote:
> This request isn't about understanding the implementation details,
> it's about understanding what Python *users* will gain from the PEP
> without *needing* to understand the implementation details.
>
> For that, I'd like to see some not-completely-trivial example code in
> (or at least linked from) the PEP written using:
>
> * trollius (no "yield from", Python 2 compatible, akin to Twisted's
>   inlineDeferreds)
> * asyncio/tulip ("yield from", Python 3.3+ compatible)
> * PEP 492 (async/await, Python 3.5+ only)
>
> The *intent* of PEP 492, like PEP 380 and PEP 342 before it, is "make
> asynchronous programming in Python easier". I think it will actually
> succeed in that goal, but I don't think it currently does an
> especially good job of explaining that to folks that aren't both already
> deeply invested in the explicitly asynchronous programming model *and*
> thoroughly aware of the fact that most of us need asynchronous
> programming to look as much like synchronous programming as possible
> in order for it to fit our brains.

I agree 100% on this. As things stand, asyncio feels frighteningly complex to anyone who isn't deeply involved with it (it certainly does to me). The current PEP feels to an outsider as if it is solving specialist problems and, to the average (non-asyncio) programmer, adding language complexity with no real benefit.
It's not specific to PEP 492, but what I would like to see is:

1. More tutorial-level examples of how to use asyncio. Specifically *not* examples of how to write web services, or how to do async web requests in your existing async program. Instead, how to integrate asyncio into generally non-async code. For example, looking at pip, I see a few places where I can anticipate asyncio might be useful - the link-chasing code, the package download code, and the code to run setup.py in a subprocess seem like places where we could do stuff in an async manner (it's not a big enough deal that we've ever used threads, so I doubt we'd want to use asyncio either in practice, but they are certainly the *types* of code I see as benefitting from async).

2. Following on from this, how do I isolate async code from the rest of my program (i.e. I don't want to have to rewrite my whole program around an event loop just to run a bunch of background tasks in parallel)?

3. Clarification on the roles of async/await vs yield from/generator.send. Are they both useful, and if so in what contexts (ignoring "if you want to support Python 3.4" compatibility cases)? How should a programmer choose which is appropriate?

4. A much better explanation of *when* any of the async constructs are appropriate at all. The name "asyncio" implies IO, and all of the examples further tend to imply "sockets". So the immediate impression is that only socket programmers and people writing network protocols should care.

Of course, if asyncio and the PEP *are* only really relevant to network protocols, then my impressions are actually correct and I should drop out of the discussion. But if that's the case, it seems like a lot of language change for a relatively specialist use case.

From p.f.moore at gmail.com Thu Apr 30 11:07:15 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 30 Apr 2015 10:07:15 +0100
Subject: [Python-Dev] PEP 492 quibble and request
In-Reply-To: <5541EECD.2020903@canterbury.ac.nz>
References: <20150430002147.GE10248@stoneleaf.us> <5541EECD.2020903@canterbury.ac.nz>
Message-ID:

On 30 April 2015 at 09:58, Greg Ewing wrote:
> Ethan Furman wrote:
>> Having gone through the PEP again, I am still no closer to understanding
>> what happens here:
>>
>> data = await reader.read(8192)
>>
>> What does the flow of control look like at the interpreter level?
>
> Are you sure you *really* want to know? For the sake
> of sanity, I recommend ignoring the actual control
> flow and pretending that it's just like
>
> data = reader.read(8192)
>
> with the reader.read() method somehow able to be
> magically suspended.

Well, if I don't know, I get confused as to where I invoke the event loop, how my non-async code runs alongside it, etc.

Paul

From greg.ewing at canterbury.ac.nz Thu Apr 30 11:16:11 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 30 Apr 2015 21:16:11 +1200
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <554185C2.5080003@gmail.com>
References: <554185C2.5080003@gmail.com>
Message-ID: <5541F2DB.5000201@canterbury.ac.nz>

Yury Selivanov wrote:
> 3. CO_NATIVE_COROUTINE flag. This enables us to disable
> __iter__ and __next__ on native coroutines while maintaining
> full backwards compatibility.

I don't think you can honestly claim "full backwards compatibility" as long as there are some combinations of old-style and new-style code that won't work together. You seem to be using your own personal definition of "full" here.
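The incompatibility being pointed at, in concrete terms (assuming the version 4 semantics quoted above):

    async def coro():
        return 1

    c = coro()
    next(c)    # TypeError: with __iter__/__next__ disabled, old-style
               # code that drives a coroutine as a generator breaks
               # (c.send(None) remains the supported way to start it)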
--
Greg

From solipsis at pitrou.net Thu Apr 30 11:17:24 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 30 Apr 2015 11:17:24 +0200
Subject: [Python-Dev] PEP 492 quibble and request
References: <20150430002147.GE10248@stoneleaf.us> <55417845.3060507@gmail.com>
Message-ID: <20150430111724.2d92fbd6@fsol>

On Thu, 30 Apr 2015 10:02:17 +0100
Paul Moore wrote:
>
> Of course, if asyncio and the PEP *are* only really relevant to
> network protocols, then my impressions are actually correct and I
> should drop out of the discussion. But if that's the case, it seems
> like a lot of language change for a relatively specialist use case.

That's my impression too. There's nothing remotely horrible about "yield from". Although we should have done the right thing from the start, this is a lot of language churn to introduce *now*, not to mention annoyingly similar but incompatible constructs that make the language more difficult to understand.

So I'm rather -0.5 on the idea.

Regards

Antoine.

From solipsis at pitrou.net Thu Apr 30 11:20:23 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 30 Apr 2015 11:20:23 +0200
Subject: [Python-Dev] Clarification of PEP 476 "opting out" section
References: <5541E0E6.6060701@egenix.com>
Message-ID: <20150430112023.08f7de51@fsol>

On Thu, 30 Apr 2015 09:59:34 +0200
"M.-A. Lemburg" wrote:
>
> Can we please make the monkeypatch a regular part of Python's
> site.py which can be enabled via an environment variable, say
> export PYTHONHTTPSVERIFY=0.

-1 (already explained in the bug below).

> See http://bugs.python.org/issue23857 for the discussion.

Regards

Antoine.

From dimaqq at gmail.com Thu Apr 30 13:41:53 2015
From: dimaqq at gmail.com (Dima Tisnek)
Date: Thu, 30 Apr 2015 13:41:53 +0200
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
Message-ID:

# Syntactic sugar

"Beautiful is better than ugly," thus nice syntax is needed. Current syntax is very mechanical. Syntactic sugar is needed on top of the current PEP.

# internal vs external

@intify
def foo() -> int:
    b = "42"
    return b  # check 1
x = foo() // 2  # check 2

Does the return type apply to the implementation (str) or the decorated callable (int)? How can the same annotation or a pair of annotations be used to:
* validate the return statement type
* validate subsequent use
* look reasonable in the source code

# lambda

Not mentioned in the PEP; omitted for convenience, or is there a rationale?

f = lambda x: None if x is None else str(x ** 2)

Current syntax seems to preclude annotation of `x` due to the colon. Current syntax sort of allows lambda return type annotation, but it's easy to confuse with `f`.

# local variables

Not mentioned in the PEP. Non-trivial code could really use these.

# global variables

Not mentioned in the PEP. Module-level globals are part of the API; annotation is welcome. What is the syntax?

# comprehensions

[3 * x.data for x in foo if "bar" in x.type]

Arguably, perhaps annotation is only needed on `foo` here, but then in more complex comprehensions, e.g. below, the intermediate comprehension could use an annotation:

[xx for y in [...] if ...]

# class attributes

s = socket.socket(...)
s.type, s.family, s.proto  # int
s.fileno  # callable

If annotations are only available for methods, it will lead to Java-style explicit getters and setters. The Python language and data model prefer properties instead, thus annotations are needed on attributes.

# plain data

user1 = dict(id=123,     # always int
             name="uuu", # always str
             ...)        # other fields possible
smth = [42, "xx", ...]
# personal note
I think it's amazing how much thought has already been put into this
proposal. The foundation is pretty solid (per Guido's talk).

I am not at all opposed to software that infers types (like jedi), or
reads user-specified types (like phpstorm and pep 484) and does
something good with that. In fact I'm ambivalent about the current
proposal: a standard and the promise of better tools on one hand;
narrow scope and debatable looks on the other.

-- dima

From greg.ewing at canterbury.ac.nz  Thu Apr 30 14:08:36 2015
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 01 May 2015 00:08:36 +1200
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com>
Message-ID: <55421B44.50708@canterbury.ac.nz>

Paul Moore wrote:
> Also, can I run
> the produce/consume just by calling produce()? My impression is that
> with asyncio I need an event loop - which "traditional" coroutines
> don't need.

The Pythonic way to do things like that is to write the
producer as a generator, and the consumer as a loop that
iterates over it.

Or the consumer as a generator, and the producer as a loop
that send()s things into it.

To do it symmetrically, you would need to write them both
as generators (or async def functions or whatever) plus a
mini event loop to tie the two together.

-- Greg

From steve at pearwood.info  Thu Apr 30 14:33:46 2015
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 30 Apr 2015 22:33:46 +1000
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: 
References: 
Message-ID: <20150430123345.GD5663@ando.pearwood.info>

On Thu, Apr 30, 2015 at 01:41:53PM +0200, Dima Tisnek wrote:

> # Syntactic sugar
> "Beautiful is better than ugly," thus nice syntax is needed.
> Current syntax is very mechanical.
> Syntactic sugar is needed on top of the current PEP.

I think the annotation syntax is beautiful. It reminds me of Pascal.

> # internal vs external
> @intify
> def foo() -> int:
>     b = "42"
>     return b  # check 1
> x = foo() // 2  # check 2
>
> Does the return type apply to the implementation (str) or the
> decorated callable (int)?

I would expect that a static type checker would look at foo, and flag
this as an error. The annotation says that foo returns an int, but it
clearly returns a string. That's an obvious error.

Here is how I would write that:

# Perhaps typing should have a Function type?
def intify(func: Callable[[], str]) -> Callable[[], int]:
    @functools.wraps(func)
    def inner() -> int:
        return int(func())
    return inner

@intify
def foo() -> str:
    b = "42"
    return b

That should, I hope, pass the type check, and without lying about the
signature of *undecorated* foo. The one problem with this is that naive
readers will assume that *decorated* foo also has a return type of str,
and be confused. That's a problem. One solution might be, "don't write
decorators that change the return type", but that seems horribly
restrictive.

Another solution might be to write a comment:

@intify  # changes return type to int
def foo() -> str: ...

but that's duplicating information already in the intify decorator, and
it relies on the programmer writing a comment, which people don't do
unless they really need to.

I think that the only solution is education: given a decorator, you
cannot assume that the annotations still apply unless you know what the
decorator does.
> How can same annotation or a pair of annotations be used to: > * validate return statement type > * validate subsequent use > * look reasonable in the source code > > > # lambda > Not mentioned in the PEP, omitted for convenience or is there a rationale? > f = lambda x: None if x is None else str(x ** 2) > Current syntax seems to preclude annotation of `x` due to colon. > Current syntax sort of allows lamba return type annotation, but it's > easy to confuse with `f`. I don't believe that you can annotate lambda functions with current syntax. For many purposes, I do not think that is important: a good type checker will often be able to infer the return type of the lambda, and from that infer what argument types are permitted: lambda arg: arg + 1 Obviously arg must be a Number, since it has to support addition with ints. > # local variables > Not mentioned in the PEP > Non-trivial code could really use these. Normally local variables will have their type inferred from the operations done to them: s = arg[1:] # s has the same type as arg When that is not satisfactory, you can annotate variables with a comment: s = arg[1:] #type: List[int] https://www.python.org/dev/peps/pep-0484/#id24 > # global variables > Not mentioned in the PEP > Module-level globals are part of API, annotation is welcome. > What is the syntax? As above. > # comprehensions > [3 * x.data for x in foo if "bar" in x.type] > Arguable, perhaps annotation is only needed on `foo` here, but then > how complex comprehensions, e.g. below, the intermediate comprehension > could use an annotation > [xx for y in [...] if ...] A list comprehension is obviously of type List. If you need to give a more specific hint: result = [expr for x in things if cond(x)] #type: List[Whatever] See also the discussion of "cast" in the PEP. https://www.python.org/dev/peps/pep-0484/#id25 > # class attributes > s = socket.socket(...) > s.type, s.family, s.proto # int > s.fileno # callable > If annotations are only available for methods, it will lead to > Java-style explicit getters and setters. > Python language and data model prefers properties instead, thus > annotations are needed on attributes. class Thing: a = 42 # can be inferred b = [] # inferred as List[Any] c = [] #type: List[float] -- Steve From bcannon at gmail.com Thu Apr 30 16:23:11 2015 From: bcannon at gmail.com (Brett Cannon) Date: Thu, 30 Apr 2015 14:23:11 +0000 Subject: [Python-Dev] Postponing making a decision on the future of the development workflow Message-ID: Real world stuff is devouring my free time since immediately after PyCon and will continue to do so for probably the next few months. I'm hoping to find the energy to engage Donald and Nick about their proposals while I'm time-constrained so that when I do have free time again I will be able to make a decision quickly. This also means that I'm allowing Donald and Nick to update their PEPs while they wait for me, although I'm not taking on any more proposals as the two current proposals cover the two ranges of suggestions people have talked to me about on this topic. My apologies to Nick and Donald for slipping on my own deadline. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Thu Apr 30 17:07:32 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Thu, 30 Apr 2015 08:07:32 -0700 Subject: [Python-Dev] PEP 492 vs. 
PEP 3152, new round
In-Reply-To: 
References: <55403937.7040409@gmail.com> <5540A093.2070808@canterbury.ac.nz>
	<5540E2CD.6060101@gmail.com> <20150429152930.GA10248@stoneleaf.us>
	<5540F9E3.9080805@gmail.com> <20150429172521.GC10248@stoneleaf.us>
	<55411877.7010608@gmail.com> <20150429183213.GD10248@stoneleaf.us>
	<554126F6.6050108@gmail.com>
Message-ID: <20150430150732.GF10248@stoneleaf.us>

On 04/29, Nathaniel Smith wrote:

> (I suspect this may also be the impetus behind Greg's request that it just
> be treated the same as unary minus. IMHO it matters much more that the
> rules be predictable and teachable than that they allow or disallow every
> weird edge case in exactly the right way.)

+1

--
~Ethan~

From drekin at gmail.com  Thu Apr 30 17:44:00 2015
From: drekin at gmail.com (=?UTF-8?B?QWRhbSBCYXJ0b8Wh?=)
Date: Thu, 30 Apr 2015 17:44:00 +0200
Subject: [Python-Dev] Unicode literals in Python 2.7
In-Reply-To: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <87egn2pgv1.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: 

> does this not work for you?
>
> from __future__ import unicode_literals

No, with unicode_literals I just don't have to use the u'' prefix, but
the wrong interpretation persists.

On Thu, Apr 30, 2015 at 3:03 AM, Stephen J. Turnbull wrote:

> IIRC, on the Linux console and in an uxterm, PYTHONIOENCODING=utf-8 in
> the environment does what you want.

Unfortunately, it doesn't work. With PYTHONIOENCODING=utf-8, the
sys.std* streams are created with utf-8 encoding (which doesn't help on
Windows since they still don't use ReadConsoleW and WriteConsoleW to
communicate with the terminal), and after changing the sys.std* streams
to the fixed ones and setting the readline hook, it still doesn't work,
so presumably PyCF_SOURCE_IS_UTF8 is still not set.

> Regarding your environment, the repeated use of "custom" is a red
> flag. Unless you bundle your whole environment with the code you
> distribute, Python can know nothing about that. In general, Python
> doesn't know what encoding it is receiving text in.

Well, the received text comes from sys.stdin and its encoding is known.
Ideally, Python would receive the text as a Unicode string object so
there would be no problem with encoding (see
http://bugs.python.org/issue17620#msg234439 ).

> If you *do* know, you can set PyCF_SOURCE_IS_UTF8. So if you know
> that all of your users will have your custom stdio and readline hooks
> installed (AFAICS, they can't use IDLE or IPython!), then you can
> bundle Python built with the flag set, or perhaps you can do the
> decoding in your custom stdio module.

The custom stdio streams and readline hooks are set at runtime by code
in sitecustomize. It does not affect IDLE and it is compatible with
IPython. I would like to also set PyCF_SOURCE_IS_UTF8 at runtime from
Python, e.g. via ctypes. But this may be impossible.

> Note that even if you have a UTF-8 input source, some users are likely
> to be surprised because IIRC Python doesn't canonicalize in its
> codecs; that is left for higher-level libraries. Linux UTF-8 is
> usually NFC normalized, while Mac UTF-8 is NFD normalized.

Actually, I have a UTF-16-LE source, but that is not important since
it's decoded to a Python Unicode string object. I have this Unicode
string and I'm supposed to return it from the readline hook, but I
don't know how to communicate it to the caller (the tokenizer) so it is
interpreted correctly. Note that the following works:

>>> eval(raw_input('~~> '))
~~> u'α'
u'\u03b1'

Unfortunately, the REPL works differently than eval/exec on raw_input.
It seems that the only option is to bypass the REPL with a custom REPL
(e.g. based on code.InteractiveConsole). However, wrapping up the
execution of a script, so that the custom REPL is invoked at the right
place, is complicated.

> >>> Le 29 avr. 2015 10:36, "Adam Bartoš" a écrit :
> >>> > Why I'm talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
> >>> u'\u03b1' but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'.
>
> Just to be clear, you accept those results as correct, right?

Yes. In the latter case, eval has no idea how the bytes given are
encoded.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From yselivanov.ml at gmail.com  Thu Apr 30 18:03:03 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 30 Apr 2015 12:03:03 -0400
Subject: [Python-Dev] PEP 492: async/await in Python; version 4
In-Reply-To: <5541F2DB.5000201@canterbury.ac.nz>
References: <554185C2.5080003@gmail.com> <5541F2DB.5000201@canterbury.ac.nz>
Message-ID: <55425237.1080700@gmail.com>

On 2015-04-30 5:16 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>
>> 3. CO_NATIVE_COROUTINE flag. This enables us to disable
>> __iter__ and __next__ on native coroutines while maintaining
>> full backwards compatibility.
>
> I don't think you can honestly claim "full backwards
> compatibility" as long as there are some combinations
> of old-style and new-style code that won't work
> together. You seem to be using your own personal
> definition of "full" here.

Well, using next() and iter() on coroutines in asyncio
code is esoteric. I can't even imagine why you would
want to do that.

Yury

From ethan at stoneleaf.us  Thu Apr 30 18:15:41 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 30 Apr 2015 09:15:41 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <554126F6.6050108@gmail.com>
References: <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
	<5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
	<20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
	<20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
	<20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
Message-ID: <20150430161541.GG10248@stoneleaf.us>

On 04/29, Yury Selivanov wrote:

> Because you want operators to be resolved in the
> order you see them, generally.
>
> You want '(await -fut)' to:
>
> 1. Suspend on fut;
> 2. Get the result;
> 3. Negate it.
>
> This is a non-obvious thing. I would myself interpret it
> as:
>
> 1. Get fut.__neg__();
> 2. await on it.

Both you and Paul are correct on this, thank you. The proper resolution of

    await -coro()

is indeed to get the result of coro(), call its __neg__ method, and then
await on that.

And that is perfectly reasonable, and should not be a SyntaxError; what it
might be is an AttributeError (no __neg__ method) or an AsyncError (__neg__
returned a non-awaitable object), or might even just work [1]... but it
definitely should /not/ be a SyntaxError.

--
~Ethan~

[1] http://stackoverflow.com/q/7719018/208880

From jimjjewett at gmail.com  Thu Apr 30 19:24:27 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Thu, 30 Apr 2015 13:24:27 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
Message-ID: 

On Wed, Apr 29, 2015 at 2:26 PM, Paul Moore wrote:
> On 29 April 2015 at 18:43, Jim J. Jewett wrote:
>> So? PEP 492 never says what coroutines *are* in a way that explains
>> why it matters that they are different from generators.

...

> Looking at the Wikipedia article on coroutines, I see an example of
> how a producer/consumer process might be written with coroutines:
>
> var q := new queue
>
> coroutine produce
>     loop
>         while q is not full
>             create some new items
>             add the items to q
>         yield to consume
>
> coroutine consume
>     loop
>         while q is not empty
>             remove some items from q
>             use the items
>         yield to produce
>
> (To start everything off, you'd just run "produce").
>
> I can't even see how to relate that to PEP 492 syntax. I'm not allowed
> to use "yield", so should I use "await consume" in produce (and vice
> versa)?

I think so ... but the fact that nothing is actually coming via the
await channel makes it awkward.

I also worry that it would end up with an infinite stack depth, unless
the await were actually replaced with some sort of framework-specific
scheduling primitive, or one of them were rewritten differently to
ensure it returned to the other instead of calling it anew.

I suspect the real problem is that the PEP is really only concerned
with a very specific subtype of coroutine, and these don't quite fit.

(Though it could be done by somehow making them both await on the queue
status, instead of on each other.)

-jJ

From guido at python.org  Thu Apr 30 19:38:16 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 10:38:16 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <20150430161541.GG10248@stoneleaf.us>
References: <553E3D20.2020008@gmail.com> <55403937.7040409@gmail.com>
	<5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
	<20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
	<20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
	<20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
	<20150430161541.GG10248@stoneleaf.us>
Message-ID: 

On Thu, Apr 30, 2015 at 9:15 AM, Ethan Furman wrote:

> [...]
> Both you and Paul are correct on this, thank you. The proper resolution
> of
>
> await -coro()
>
> is indeed to get the result of coro(), call its __neg__ method, and then
> await on that.
>
> And that is perfectly reasonable, and should not be a SyntaxError; what it
> might be is an AttributeError (no __neg__ method) or an AsyncError (__neg__
> returned a non-awaitable object), or might even just work [1]... but it
> definitely should /not/ be a SyntaxError.

Why not? Unlike some other languages, Python does not have uniform
priorities for unary operators, so it's reasonable for some unary
operations to have a different priority than others, and certain things
will be SyntaxErrors because of that. E.g. you can write "not -x" but you
can't write "- not x". This is because they have different priorities:
'not' has a very low priority so it binds less tightly than comparisons
(which bind less tightly than arithmetic), but '-' has a high priority.

(There are other quirks, e.g. -2**2 means -(2**2) and you can write 2**-2;
but you can't write 2**not x.)

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From guido at python.org  Thu Apr 30 19:55:08 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 10:55:08 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com>
Message-ID: 

On Thu, Apr 30, 2015 at 10:24 AM, Jim J. Jewett wrote:
> I suspect the real problem is that the PEP is really only concerned
> with a very specific subtype of coroutine, and these don't quite fit.

That's correct. The PEP is concerned with the existing notion of
coroutines in Python, which was first introduced by PEP 342: Coroutines
via Enhanced Generators. The Wikipedia definition of coroutine (which
IIRC is due to Knuth) is quite different, and nobody who actually uses
the coding style introduced by PEP 342 should mistake one for the other.

This same notion of "Pythonic" (so to speak) coroutines was refined by
PEP 380, which introduced yield from. It was then *used* in PEP 3156
(the asyncio package) for the specific purpose of standardizing a way
to do I/O multiplexing using an event loop.

The basic premise of using coroutines with the asyncio package is that
most of the time you can write *almost* sequential code as long as you
insert "yield from" in front of all blocking operations (and as long as
you use blocking operations that are implemented by or on top of the
asyncio package). This makes the code easier to follow than code
written with "traditional" event-loop-based I/O multiplexing (which is
heavy on callbacks, or callback-like abstractions like Twisted's
Deferred).

However, heavy users of the asyncio package (like Yury) discovered some
common patterns when using coroutines that were awkward. In particular,
"yield from" is quite a mouthful, the coroutine version of a for-loop
is awkward, and a with-statement can't have a blocking operation in
__exit__ (because there's no explicit yield opcode).

PEP 492 proposes a quite simple and elegant solution for these issues.
Most of the technical discussion about the PEP is on getting the
details right so that users won't have to worry about them, and can
instead just continue to write *almost* sequential code when using the
asyncio package (or some other framework that offers an event loop
integrated with coroutines).
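To make that concrete, here's the same toy coroutine written both ways
(illustrative only; assume reader is an asyncio.StreamReader, whose
read() method is a coroutine):

    import asyncio

    # Today's spelling (PEP 342/380): a generator-based coroutine.
    @asyncio.coroutine
    def read_chunk(reader):
        data = yield from reader.read(8192)  # suspends until data arrives
        return data

    # PEP 492's proposed spelling: same control flow, dedicated syntax.
    async def read_chunk_492(reader):
        data = await reader.read(8192)
        return data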
--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From ethan at stoneleaf.us  Thu Apr 30 19:56:34 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 30 Apr 2015 10:56:34 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: 
References: <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
	<20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
	<20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
	<20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
	<20150430161541.GG10248@stoneleaf.us>
Message-ID: <20150430175634.GH10248@stoneleaf.us>

On 04/30, Guido van Rossum wrote:
> On Thu, Apr 30, 2015 at 9:15 AM, Ethan Furman wrote:
>
>> [...]
>> Both you and Paul are correct on this, thank you. The proper resolution
>> of
>>
>> await -coro()
>>
>> is indeed to get the result of coro(), call its __neg__ method, and then
>> await on that.
>
> Why not? Unlike some other languages, Python does not have uniform
> priorities for unary operators, so it's reasonable for some unary
> operations to have a different priority than others, and certain things
> will be SyntaxErrors because of that. E.g. you can write "not -x" but you
> can't write "- not x".

For one, Yury's answer is "- await x", which looks just as nonsensical as
"- not x".

For another, an error of some type will be raised if either __neg__ doesn't
exist or it doesn't return an awaitable, so a SyntaxError is unnecessary.

For a third, by making it a SyntaxError you are forcing the use of parens to
get what should be the behavior anyway.

In other words, a SyntaxError is not any clearer than "AttributeError: obj
has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
returned not-awaitable". Those last two errors tell you exactly what you
did wrong.

--
~Ethan~

From yselivanov.ml at gmail.com  Thu Apr 30 20:04:58 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 30 Apr 2015 14:04:58 -0400
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <20150430175634.GH10248@stoneleaf.us>
References: <5540A093.2070808@canterbury.ac.nz> <5540E2CD.6060101@gmail.com>
	<20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
	<20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
	<20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
	<20150430161541.GG10248@stoneleaf.us> <20150430175634.GH10248@stoneleaf.us>
Message-ID: <55426ECA.6040403@gmail.com>

On 2015-04-30 1:56 PM, Ethan Furman wrote:
>> > Why not? Unlike some other languages, Python does not have uniform
>> > priorities for unary operators, so it's reasonable for some unary
>> > operations to have a different priority than others, and certain things
>> > will be SyntaxErrors because of that. E.g. you can write "not -x" but you
>> > can't write "- not x".
> For one, Yury's answer is "- await x", which looks just as nonsensical as
> "- not x".

"- await x" is perfectly valid code:

    result = - await compute_in_db()

(same as "result = - (yield from do_something())")

>
> For another, an error of some type will be raised if either __neg__ doesn't
> exist or it doesn't return an awaitable, so a SyntaxError is unnecessary.
>
> For a third, by making it a SyntaxError you are forcing the use of parens to
> get what should be the behavior anyway.

I still want to see where my current grammar forces the use of
parens. See [1], there are no useless parens anywhere.

FWIW, I'll fix the 'await (await x)' expression to be parsed
without parens.

> In other words, a SyntaxError is not any clearer than "AttributeError: obj
> has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
> returned not-awaitable". Those last two errors tell you exactly what you
> did wrong.

This is debatable. "obj has no __neg__ method" isn't obvious
to everyone (especially to those people who aren't using
operator overloading).

[1] https://www.python.org/dev/peps/pep-0492/#examples-of-await-expressions

Yury

From ethan at stoneleaf.us  Thu Apr 30 20:22:55 2015
From: ethan at stoneleaf.us (Ethan Furman)
Date: Thu, 30 Apr 2015 11:22:55 -0700
Subject: [Python-Dev] PEP 492 vs. PEP 3152, new round
In-Reply-To: <55426ECA.6040403@gmail.com>
References: <20150429152930.GA10248@stoneleaf.us> <5540F9E3.9080805@gmail.com>
	<20150429172521.GC10248@stoneleaf.us> <55411877.7010608@gmail.com>
	<20150429183213.GD10248@stoneleaf.us> <554126F6.6050108@gmail.com>
	<20150430161541.GG10248@stoneleaf.us> <20150430175634.GH10248@stoneleaf.us>
	<55426ECA.6040403@gmail.com>
Message-ID: <20150430182254.GI10248@stoneleaf.us>

On 04/30, Yury Selivanov wrote:
> On 2015-04-30 1:56 PM, Ethan Furman wrote:

> I still want to see where my current grammar forces the use of
> parens. See [1], there are no useless parens anywhere.
--> await -coro()
SyntaxError

--> await (-coro())  # not a SyntaxError, therefore parens are
                     # forced

>> In other words, a SyntaxError is not any clearer than "AttributeError: obj
>> has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
>> returned not-awaitable". Those last two errors tell you exactly what you
>> did wrong.
>
> This is debatable. "obj has no __neg__ method" isn't obvious
> to everyone (especially to those people who aren't using
> operator overloading).

Good news! The error there is actually

--> -object()
TypeError: bad operand type for unary -: 'object'

Which is definitely clear, even for those who don't do operator
overloading.

--
~Ethan~

From jimjjewett at gmail.com  Thu Apr 30 20:41:55 2015
From: jimjjewett at gmail.com (Jim J. Jewett)
Date: Thu, 30 Apr 2015 11:41:55 -0700 (PDT)
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <55411D9F.2050400@gmail.com>
Message-ID: <55427773.ed89340a.5bd4.0711@mx.google.com>

On Wed Apr 29 20:06:23 CEST 2015, Yury Selivanov replied:

>> As best I can guess, the difference seems to be that a "normal"
>> generator is using yield primarily to say:
>> "I'm not done; I have more values when you want them",
>> but an asynchronous (PEP492) coroutine is primarily saying:
>> "This might take a while, go ahead and do something else meanwhile."

> Correct.

Then I strongly request a more specific name than coroutine.

I would prefer something that refers to cooperative pre-emption, but I
haven't thought of anything that is short without leading to other
types of confusion. My least bad idea at the moment would be
"self-suspending coroutine" to emphasize that suspending themselves is
a crucial feature. Even "PEP492-coroutine" would be an improvement.

>> Does it really permit *making* them [asynchronous calls], or does
>> it just signal that you will be waiting for them to finish processing
>> anyhow, and it doesn't need to be a busy-wait?

> It does.

Bad phrasing on my part. Is there anything that prevents an
asynchronous call (or waiting for one) without the "async with"?

If so, I'm missing something important. Either way, I would
prefer different wording in the PEP.

>>> It uses the ``yield from`` implementation with an extra step of
>>> validating its argument. ``await`` only accepts an *awaitable*,
>>> which can be one of:

>> What justifies this limitation?

> We want to avoid people passing regular generators and random
> objects to 'await', because it is a bug.

Why?

Is it a bug just because you defined it that way?

Is it a bug because the "await" makes timing claims that an
object not making such a promise probably won't meet? (In
other words, a marker interface.)

Is it likely to be a symptom of something that wasn't converted
correctly, *and* there are likely to be other bugs caused by
that same lack of conversion?

> For coroutines in PEP 492:
> __await__ = __anext__ is the same as __call__ = __next__
> __await__ = __aiter__ is the same as __call__ = __iter__

That tells me that it will be OK sometimes, but will usually
be either a mistake or an API problem -- and it explains why.

Please put those 3 lines in the PEP.

> This is OK. The point is that you can use 'await log' in
> __aenter__. If you don't need awaits in __aenter__ you can
> use them in __aexit__. If you don't need them there too,
> then just define a regular context manager.

Is it an error to use "async with" on a regular context manager?
If so, why?
If it is just that doing so could be misleading, then what about
"async with mgr1, mgr2, mgr3" -- is it enough that one of the three
might suspend itself?

>>> class AsyncContextManager:
>>>     def __aenter__(self):
>>>         log('entering context')

>> __aenter__ must return an awaitable

Why? Is there a fundamental reason, or is it just to avoid the
hassle of figuring out whether or not the returned object is a
future that might still need awaiting?

Is there an assumption that the scheduler will let the thing being
awaited run immediately, but look for other tasks when it returns,
and a further assumption that something which finishes the whole
task would be too slow to run right away?

> It doesn't make any sense to use 'async with' outside of a
> coroutine. The interpreter won't know what to do with them:
> you need an event loop for that.

So does the PEP also provide some way of ensuring that there is
an event loop? Does it assume that self-suspending coroutines
will only ever be called by an already-running event loop
compatible with asyncio.get_event_loop()? If so, please make
these contextual assumptions explicit near the beginning of the PEP.

>>> It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
>>> method to ``async for``. It is a ``SyntaxError`` to use ``async for``
>>> outside of a coroutine.

>> The same questions about why -- what is the harm?

I can imagine that as an implementation detail, the async for wouldn't
be taken advantage of unless it was running under an event loop that
knew to look for "async for" as suspension points.

I'm not seeing what the actual harm is in either not happening to
suspend (less efficient, but still correct), or in suspending between
every step of a regular iterator (because, why not?)

>>>> For debugging this kind of mistake there is a special debug mode in
>>>> asyncio, in which ``@coroutine``
...
>>>> decorator makes the decision of whether to wrap or not to wrap based on
>>>> an OS environment variable ``PYTHONASYNCIODEBUG``.

(1) How does this differ from the existing asyncio.coroutine?

(2) Why does it need to have an environment variable? (Sadly,
the answer may be "backwards compatibility", if you're really
just specifying the existing asyncio interface better.)

(3) Why does it need [set]get_coroutine_wrapper, instead of just
setting the asyncio.coroutines.coroutine attribute?

(4) Why do the get/set need to be in sys? Is the intent to do
anything more than preface execution with:

    import asyncio.coroutines
    asyncio.coroutines._DEBUG = True

-jJ

--
If there are still threading problems with my replies, please
email me with details, so that I can try to resolve them. -jJ

From p.f.moore at gmail.com  Thu Apr 30 20:52:46 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 30 Apr 2015 19:52:46 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com>
Message-ID: 

On 29 April 2015 at 20:19, Paul Moore wrote:
> However, just to make my point precise, here's a more or less direct
> translation of the Wikipedia code into Python. It doesn't actually
> work, because getting the right combinations of yield and send stuff
> is confusing to me. Specifically, I suspect that "yield
> produce.send(None)" isn't the right way to translate "yield to
> produce". But it gives the idea.

Hmm, when I try to fix this "minor" (as I thought!) issue with my code,
I discover it's more fundamental.
The error I get is

Traceback (most recent call last):
  File ".\coro.py", line 28, in <module>
    next(produce)
  File ".\coro.py", line 13, in produce
    yield consume.send(None)
  File ".\coro.py", line 23, in consume
    yield produce.send(None)
ValueError: generator already executing

What I now realise this means is that you cannot have the producer send
to the consumer, which then sends back to the producer. That's what the
"generator already executing" message means.

This is fundamentally different from the "traditional" use of
coroutines as described in the Wikipedia article, and as I thought was
implemented in Python. The Wikipedia example allows two coroutines to
freely yield between each other. Python, on the other hand, does not
support this - it requires the mediation of some form of "trampoline"
controller (or event loop, in asyncio terms) to redirect control. [1]
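For reference, here's the sort of minimal round-robin trampoline I
mean - toy code only, with the produce/consume pair rewritten to use
plain yields (a real event loop such as asyncio's obviously does far
more than this):

    import collections

    q = collections.deque()

    def produce():
        n = 0
        while True:
            while len(q) < 3:          # "while q is not full"
                q.append(n)
                n += 1
            yield                      # give control back to the trampoline

    def consume():
        while True:
            while q:                   # "while q is not empty"
                print('consumed', q.popleft())
            yield                      # give control back to the trampoline

    tasks = collections.deque([produce(), consume()])
    for _ in range(6):                 # run a few scheduling rounds
        task = tasks.popleft()
        next(task)                     # resume the task up to its next yield
        tasks.append(task)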
This limitation of Python's coroutines is not mentioned anywhere in
PEP 342, and that's probably why I never really understood Python
coroutines properly, as my mental model didn't match the
implementation.

Given that any non-trivial use of coroutines in Python requires an
event loop / trampoline, I begin to understand the logic behind asyncio
and this PEP a little better. I'm a long way behind in understanding
the details, but at least I'm no longer completely baffled.

Somewhere, there should be an explanation of the difference between
Python's coroutines and Wikipedia's - I can't be the only person to be
confused like this. But I don't think there are any docs covering
"coroutines in Python" outside of PEP 342 - the docs just cover the
components (the send and throw methods, the yield expression, etc).
Maybe it could be covered in the send documentation (as that's what
gives the "generator already executing" error). I'll try to work up a
doc patch.

Actually, looking at the docs, I can't even *find* where the behaviour
of the send method is defined - can someone point me in the right
direction?

Paul

[1] It's sort of similar to how Python doesn't do tail call
elimination. Symmetric yields rely on stack frames that are no longer
needed being discarded if they are to avoid unlimited recursion, so to
have symmetric yields, Python would need a form of tail call ("tail
yield", I guess :-)) elimination.

From yselivanov.ml at gmail.com  Thu Apr 30 21:27:09 2015
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Thu, 30 Apr 2015 15:27:09 -0400
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <55427773.ed89340a.5bd4.0711@mx.google.com>
References: <55427773.ed89340a.5bd4.0711@mx.google.com>
Message-ID: <5542820D.1030504@gmail.com>

Jim,

On 2015-04-30 2:41 PM, Jim J. Jewett wrote:
[...]
>>> Does it really permit *making* them [asynchronous calls], or does
>>> it just signal that you will be waiting for them to finish processing
>>> anyhow, and it doesn't need to be a busy-wait?
>> It does.
> Bad phrasing on my part. Is there anything that prevents an
> asynchronous call (or waiting for one) without the "async with"?
>
> If so, I'm missing something important. Either way, I would
> prefer different wording in the PEP.

Yes, you can't use 'yield from' in __exit__/__enter__
in current Python.

>>>> It uses the ``yield from`` implementation with an extra step of
>>>> validating its argument. ``await`` only accepts an *awaitable*,
>>>> which can be one of:
>>> What justifies this limitation?
>> We want to avoid people passing regular generators and random
>> objects to 'await', because it is a bug.
> Why?
>
> Is it a bug just because you defined it that way?
>
> Is it a bug because the "await" makes timing claims that an
> object not making such a promise probably won't meet? (In
> other words, a marker interface.)
>
> Is it likely to be a symptom of something that wasn't converted
> correctly, *and* there are likely to be other bugs caused by
> that same lack of conversion?

Just as 'yield from' expects an iterable, 'await' expects an
awaitable. That's the protocol. You can't pass random objects to
'with' statements, 'yield from', 'for..in', etc.

If you write

    def gen(): yield 1
    await gen()

then it's a bug.

>
>> For coroutines in PEP 492:
>> __await__ = __anext__ is the same as __call__ = __next__
>> __await__ = __aiter__ is the same as __call__ = __iter__
> That tells me that it will be OK sometimes, but will usually
> be either a mistake or an API problem -- and it explains why.
>
> Please put those 3 lines in the PEP.

There is a line like that:
https://www.python.org/dev/peps/pep-0492/#await-expression

Look for "Also, please note..." line.

>> This is OK. The point is that you can use 'await log' in
>> __aenter__. If you don't need awaits in __aenter__ you can
>> use them in __aexit__. If you don't need them there too,
>> then just define a regular context manager.
> Is it an error to use "async with" on a regular context manager?
> If so, why? If it is just that doing so could be misleading,
> then what about "async with mgr1, mgr2, mgr3" -- is it enough
> that one of the three might suspend itself?

'with' requires an object with __enter__ and __exit__;
'async with' requires an object with __aenter__ and __aexit__.

You can have an object that implements both interfaces.
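For instance (just a sketch, in the PEP's proposed syntax; the
sleep(0) calls stand in for real awaitables):

    import asyncio

    class Resource:
        # synchronous protocol -- usable in a plain 'with' block
        def __enter__(self):
            return self

        def __exit__(self, *exc_info):
            pass

        # asynchronous protocol -- usable in 'async with'
        async def __aenter__(self):
            await asyncio.sleep(0)   # acquiring may suspend here
            return self

        async def __aexit__(self, *exc_info):
            await asyncio.sleep(0)   # releasing may suspend too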
>
>>> class AsyncContextManager:
>>>     def __aenter__(self):
>>>         log('entering context')
>
>> __aenter__ must return an awaitable
> Why? Is there a fundamental reason, or is it just to avoid the
> hassle of figuring out whether or not the returned object is a
> future that might still need awaiting?

The fundamental reason why 'async with' is proposed is because you
can't suspend execution in __enter__ and __exit__. If you need to
suspend it there, use 'async with' and its __a*__ methods, but they
have to return an awaitable (see
https://www.python.org/dev/peps/pep-0492/#new-syntax and look at what
'async with' is semantically equivalent to).

>
> Is there an assumption that the scheduler will let the thing being
> awaited run immediately, but look for other tasks when it returns,
> and a further assumption that something which finishes the whole
> task would be too slow to run right away?
>
>> It doesn't make any sense to use 'async with' outside of a
>> coroutine. The interpreter won't know what to do with them:
>> you need an event loop for that.
> So does the PEP also provide some way of ensuring that there is
> an event loop? Does it assume that self-suspending coroutines
> will only ever be called by an already-running event loop
> compatible with asyncio.get_event_loop()? If so, please make
> these contextual assumptions explicit near the beginning of the PEP.

You need some kind of loop, but it doesn't have to be the one from
asyncio. There is at least one place in the PEP where it's mentioned
that the PEP introduces a generic concept that can be used by asyncio
*and* other frameworks.

>
>>>> It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
>>>> method to ``async for``. It is a ``SyntaxError`` to use ``async for``
>>>> outside of a coroutine.
>>> The same questions about why -- what is the harm?
> I can imagine that as an implementation detail, the async for wouldn't
> be taken advantage of unless it was running under an event loop that
> knew to look for "async for" as suspension points.

The event loop doesn't need to know anything about 'async with' and
'async for'. For the loop it's always one thing -- something is
awaiting somewhere for some result.

>
> I'm not seeing what the actual harm is in either not happening to
> suspend (less efficient, but still correct), or in suspending between
> every step of a regular iterator (because, why not?)
>
>>>> For debugging this kind of mistake there is a special debug mode in
>>>> asyncio, in which ``@coroutine``
> ...
>>>> decorator makes the decision of whether to wrap or not to wrap based on
>>>> an OS environment variable ``PYTHONASYNCIODEBUG``.
> (1) How does this differ from the existing asyncio.coroutine?
> (2) Why does it need to have an environment variable? (Sadly,
> the answer may be "backwards compatibility", if you're really
> just specifying the existing asyncio interface better.)
> (3) Why does it need [set]get_coroutine_wrapper, instead of just
> setting the asyncio.coroutines.coroutine attribute?
> (4) Why do the get/set need to be in sys?

That section describes some hassles we had in asyncio to enable
better debugging.

(3) because it allows us to enable debugging selectively when we
need it

(4) because it's where functions like 'set_trace' live.
set_coroutine_wrapper() also requires some modifications in the eval
loop, so sys looks like the right place.

>
> Is the intent to do anything more than preface execution with:
>
> import asyncio.coroutines
> asyncio.coroutines._DEBUG = True

This won't work, unfortunately. You need to set the debug flag
*before* you import the asyncio package (otherwise we would have an
unavoidable performance cost for debug features). If you enable it
after you import asyncio, then asyncio itself won't be instrumented.
Please see the implementation of asyncio.coroutine for details.

set_coroutine_wrapper solves these problems.

Yury

From guido at python.org  Thu Apr 30 21:32:02 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 12:32:02 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <55427773.ed89340a.5bd4.0711@mx.google.com>
References: <55411D9F.2050400@gmail.com> <55427773.ed89340a.5bd4.0711@mx.google.com>
Message-ID: 

On Thu, Apr 30, 2015 at 11:41 AM, Jim J. Jewett wrote:
>
> On Wed Apr 29 20:06:23 CEST 2015, Yury Selivanov replied:
>
> >> As best I can guess, the difference seems to be that a "normal"
> >> generator is using yield primarily to say:
> >> "I'm not done; I have more values when you want them",

This seems so vague as to be useless to me. When using generators to
implement iterators, "yield" very specifically means "here is the next
value in the sequence I'm generating". (And to indicate there are no
more values you have to use "return".)

> >> but an asynchronous (PEP492) coroutine is primarily saying:
> >> "This might take a while, go ahead and do something else meanwhile."
>
> > Correct.

Actually that's not even wrong. When using generators as coroutines,
PEP 342 style, "yield" means "I am blocked waiting for a result that
the I/O multiplexer is eventually going to produce". The argument to
yield tells the multiplexer what the coroutine is waiting for, and it
puts the generator stack frame on an appropriate queue.
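Schematically (a toy only - the multiplexer's actual select()/poll()
machinery is elided):

    def coro():
        result = yield ('read', 'sock-1')  # "blocked until sock-1 is readable"
        print('got', result)

    c = coro()
    request = next(c)        # multiplexer starts the coroutine, gets the request
    # ... multiplexer waits until the request can be satisfied ...
    try:
        c.send('some data')  # resume the frame; this becomes yield's value
    except StopIteration:
        pass                 # the coroutine finished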
When the multiplexer has obtained the requested result it resumes the
coroutine by using send() with that value, which resumes the
coroutine/generator frame, making that value the return value from
yield. Read Greg Ewing's tutorial for more color:
http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yield_from.html

> Then I strongly request a more specific name than coroutine.

No, this is the name we've been using since PEP 342 and it's still the
same concept.

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From solipsis at pitrou.net  Thu Apr 30 21:40:36 2015
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 30 Apr 2015 21:40:36 +0200
Subject: [Python-Dev] PEP 492: What is the real goal?
References: <55411D9F.2050400@gmail.com> <55427773.ed89340a.5bd4.0711@mx.google.com>
Message-ID: <20150430214036.33675e57@fsol>

On Thu, 30 Apr 2015 12:32:02 -0700
Guido van Rossum wrote:
>
> No, this is the name we've been using since PEP 342 and it's still the same
> concept.

The fact that all syntax uses the word "async" and not "coro" or
"coroutine" hints that it should really *not* be called a coroutine
(much less a "native coroutine", which is both silly and a lie).

Why not "async function"?

Regards

Antoine.

From p.f.moore at gmail.com  Thu Apr 30 21:46:46 2015
From: p.f.moore at gmail.com (Paul Moore)
Date: Thu, 30 Apr 2015 20:46:46 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: 
References: <55411D9F.2050400@gmail.com> <55427773.ed89340a.5bd4.0711@mx.google.com>
Message-ID: 

On 30 April 2015 at 20:32, Guido van Rossum wrote:
>> Then I strongly request a more specific name than coroutine.
>
> No, this is the name we've been using since PEP 342 and it's still the same
> concept.

However, it is (as I noted in my other email) not very well documented.
There isn't a glossary entry in the docs for "coroutine", and there's
nothing pointing out that coroutines need (for anything other than toy
cases) an event loop, trampoline, or IO multiplexer (call it what you
want, although I prefer terms that don't make it sound like it's
exclusively about IO).

I'll raise an issue on the tracker for this, and I'll see if I can
write up something. Once there's a non-expert's view in the docs, the
experts can clarify the technicalities if I get them wrong :-)

I propose a section under
https://docs.python.org/3/reference/expressions.html#yield-expressions
describing coroutines and their usage.

Paul

From guido at python.org  Thu Apr 30 22:09:45 2015
From: guido at python.org (Guido van Rossum)
Date: Thu, 30 Apr 2015 13:09:45 -0700
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <20150430214036.33675e57@fsol>
References: <55411D9F.2050400@gmail.com> <55427773.ed89340a.5bd4.0711@mx.google.com>
	<20150430214036.33675e57@fsol>
Message-ID: 

It is spelled "Raymond Luxury-Yacht", but it's pronounced "Throatwobbler
Mangrove". :-)

I am actually fine with calling a function defined with "async def ..."
an async function, just as we call a function containing "yield" a
generator function. However I prefer to still use "coroutine" to
describe the concept implemented by async functions. *Some* generator
functions also implement coroutines; however I would like to start a
movement where eventually we'll always be using async functions when
coroutines are called for, dedicating generators once again to their
pre-PEP-342 role of a particularly efficient way to implement
iterators.
Note that I'm glossing over the distinction between yield and
yield-from here; both can be used to implement the coroutine pattern,
but the latter has some advantages when the pattern is used to support
an event loop: most importantly, when using yield-from-style
coroutines, a coroutine can use return to pass a value directly to the
stack frame that is waiting for its result. Prior to PEP 380 (yield
from), the trampoline would have to be involved in this step, and there
was no standard convention for how to communicate the final result to
the trampoline; I've seen "returnValue(x)" (Twisted inlineCallbacks),
"raise ReturnValue(x)" (Google App Engine NDB), "yield Return(x)"
(Monocle) and I believe I've seen plain "yield x" too (the latter two
being abominations in my mind, since it's unclear whether the generator
is resumed after a value-returning yield).
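A tiny illustration of the PEP 380 behavior (driving the generators by
hand, standing in for a real trampoline):

    def sub():
        x = yield 'token'            # the multiplexer would send() this in
        return x * 2                 # PEP 380: delivered straight to the caller

    def caller():
        doubled = yield from sub()   # no trampoline involvement needed
        yield ('done', doubled)

    c = caller()
    print(next(c))      # -> 'token' (sub's request, passed through)
    print(c.send(21))   # -> ('done', 42)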
While yield-from was an improvement over plain yield, await is an
improvement over yield-from. As with most changes to Python (as well as
natural evolution), an improvement often leads the way to another
improvement -- one that wasn't obvious before. And that's fine. If I
had lain awake worrying about the best way to spell async functions
while designing asyncio, PEP 3156 probably still wouldn't have been
finished today.

On Thu, Apr 30, 2015 at 12:40 PM, Antoine Pitrou wrote:

> On Thu, 30 Apr 2015 12:32:02 -0700
> Guido van Rossum wrote:
> >
> > No, this is the name we've been using since PEP 342 and it's still the
> same
> > concept.
>
> The fact that all syntax uses the word "async" and not "coro" or
> "coroutine" hints that it should really *not* be called a coroutine
> (much less a "native coroutine", which is both silly and a lie).
>
> Why not "async function"?
>
> Regards
>
> Antoine.
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/guido%40python.org

--
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From arnodel at gmail.com  Thu Apr 30 10:50:11 2015
From: arnodel at gmail.com (Arnaud Delobelle)
Date: Thu, 30 Apr 2015 09:50:11 +0100
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541341A.8090204@gmail.com>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com>
	<5541341A.8090204@gmail.com>
Message-ID: 

On 29 April 2015 at 20:42, Yury Selivanov wrote:
> Everybody is pulling me in a different direction :)
> Guido proposed to call them "native coroutines". Some people
> think that "async functions" is a better name. Greg loves
> his "cofunction" term.
>
> I'm flexible about how we name 'async def' functions. I like
> to call them "coroutines", because that's what they are, and
> that's how asyncio calls them. It's also convenient to use
> 'coroutine-object' to explain what is the result of calling
> a coroutine.

I'd like the object created by an 'async def' statement to be called a
'coroutine function' and the result of calling it to be called a
'coroutine'. This is consistent with the usage of 'generator function'
and 'generator' and has two advantages IMO:

- they both would follow the pattern 'X function' is a function
  statement that when called returns an 'X'.

- When the day comes to define generator coroutines, then it will be
  clear what to call them: 'generator coroutine function' will be the
  function definition and 'generator coroutine' will be the object it
  creates.

Cheers,

-- Arnaud

From me at the-compiler.org  Thu Apr 30 13:54:11 2015
From: me at the-compiler.org (Florian Bruhin)
Date: Thu, 30 Apr 2015 13:54:11 +0200
Subject: [Python-Dev] What's missing in PEP-484 (Type hints)
In-Reply-To: 
References: 
Message-ID: <20150430115411.GN429@tonks>

* Dima Tisnek [2015-04-30 13:41:53 +0200]:
> # lambda
> Not mentioned in the PEP; omitted for convenience or is there a rationale?
> f = lambda x: None if x is None else str(x ** 2)
> Current syntax seems to preclude annotation of `x` due to the colon.
> Current syntax sort of allows lambda return type annotation, but it's
> easy to confuse with `f`.

Not sure if you'd really want to stuff type annotations into a
lambda... at that point you'd IMHO be better off by using a real
function.

> # local variables
> Not mentioned in the PEP
> Non-trivial code could really use these.
>
> # global variables
> Not mentioned in the PEP
> Module-level globals are part of the API; annotation is welcome.
> What is the syntax?
>
> # comprehensions
> [3 * x.data for x in foo if "bar" in x.type]
> Arguably annotation is only needed on `foo` here, but then for
> complex comprehensions, e.g. below, the intermediate comprehension
> could use an annotation
> [xx for y in [...] if ...]
>
> # class attributes
> s = socket.socket(...)
> s.type, s.family, s.proto  # int
> s.fileno  # callable
> If annotations are only available for methods, it will lead to
> Java-style explicit getters and setters.
> The Python language and data model prefer properties instead, thus
> annotations are needed on attributes.
>
> # plain data
> user1 = dict(id=123,  # always int
>              name="uuu",  # always str
>              ...)  # other fields possible
> smth = [42, "xx", ...]
> (why not namedtuple? b/c extensible, mutable)
> At least one PHP IDE allows to annotate PDO.
> Perhaps it's just bad taste in Python? Or is there a valid use-case?

Most (all?) of this is actually mentioned in the PEP:
https://www.python.org/dev/peps/pep-0484/#type-comments

Florian

--
http://www.the-compiler.org | me at the-compiler.org (Mail/XMPP)
   GPG: 916E B0C8 FD55 A072 | http://the-compiler.org/pubkey.asc
         I love long mails! | http://email.is-not-s.ms/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 819 bytes
Desc: not available
URL: 

From pmiscml at gmail.com  Thu Apr 30 09:58:14 2015
From: pmiscml at gmail.com (Paul Sokolovsky)
Date: Thu, 30 Apr 2015 10:58:14 +0300
Subject: [Python-Dev] PEP 492: What is the real goal?
In-Reply-To: <5541D14C.5090404@canterbury.ac.nz>
References: <55411841.c5b3340a.1cf8.07e8@mx.google.com> <5541261A.9020909@gmail.com>
	<5541341A.8090204@gmail.com> <5541D14C.5090404@canterbury.ac.nz>
Message-ID: <20150430105814.06cb6583@x230>

Hello,

On Thu, 30 Apr 2015 18:53:00 +1200
Greg Ewing wrote:

> Skip Montanaro wrote:
> > According to Wikipedia , the
> > term "coroutine" was first coined in 1958, so several generations
> > of computer science graduates will be familiar with the textbook
> > definition. If your use of "coroutine" matches the textbook
> > definition of the term, I think you should continue to use it
> > instead of inventing new names which will just confuse people new
> > to Python.
>
> I don't think anything in asyncio or PEP 492 fits that
> definition directly.
> Generators and async def functions
> seem to be what that page calls a "generator" or "semicoroutine":
>
>     they differ in that coroutines can control where execution
>     continues after they yield, while generators cannot, instead
>     transferring control back to the generator's caller.

But of course it's only a Wikipedia page, which doesn't mean it has to
provide a complete and well-defined picture, and the quality of some
(important) Wikipedia pages is indeed pretty poor and isn't improving.

--
Best regards,
 Paul                          mailto:pmiscml at gmail.com