From alexander.belopolsky at gmail.com Thu Oct 1 02:19:21 2015 From: alexander.belopolsky at gmail.com (Alexander Belopolsky) Date: Wed, 30 Sep 2015 20:19:21 -0400 Subject: [Python-Dev] PEP acceptance and SIGs Message-ID: It has been my understanding that some PEPs may be discussed on specialized mailings lists, but a notice would be given on python-dev prior to any acceptance. I have recently received a notification that since PEP 470 has been accepted, I can no longer use external hosting for one of the packages that I published on PyPI. The PEP refers to a notice [1] posted on distutils-sig, but I cannot find any discussion of PEP 470 on either python-dev or python-ideas. [1]: https://mail.python.org/pipermail/distutils-sig/2015-September/026789.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Thu Oct 1 02:32:28 2015 From: donald at stufft.io (Donald Stufft) Date: Wed, 30 Sep 2015 20:32:28 -0400 Subject: [Python-Dev] PEP acceptance and SIGs In-Reply-To: References: Message-ID: On September 30, 2015 at 8:20:53 PM, Alexander Belopolsky (alexander.belopolsky at gmail.com) wrote: > It has been my understanding that some PEPs may be discussed on specialized > mailings lists, but a notice would be given on python-dev prior to any > acceptance. > I don?t see any requirement to post PEPs to python-dev if they have a Discussions-To header in PEP 1. I don?t really think it makes sense in this case either tbh, PyPI, pip, and setuptools are not under python-dev?s banner. We use PEPs because they are a convenient way to manage change (and we?ve even discussed recently not using PEPs for packaging things that don?t have anything to do with the standard library and moving to a more lightweight process more akin to the Rust RFC process for various reasons). ----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From alexander.belopolsky at gmail.com Thu Oct 1 03:14:39 2015 From: alexander.belopolsky at gmail.com (Alexander Belopolsky) Date: Wed, 30 Sep 2015 21:14:39 -0400 Subject: [Python-Dev] PEP acceptance and SIGs In-Reply-To: References: Message-ID: On Wed, Sep 30, 2015 at 8:32 PM, Donald Stufft wrote: > > I don?t see any requirement to post PEPs to python-dev if they have a Discussions-To header in PEP 1. When I faced a similar situation with PEP 495, Guido's advise was "I think that a courtesy message to python-dev is appropriate, with a link to the PEP and an invitation to discuss its merits on datetime-sig." [1] Maybe it is time to clarify that in PEP 1. > I don?t really think it makes sense in this case either tbh, PyPI, pip, and setuptools are not under python-dev?s banner. Given that ensurepip is part of stdlib, I am not sure this is an accurate statement. Even if it was, did you make any effort to discuss the proposal outside of a small group subscribed to distutils ML? My main issue with PEP 470 is that it came shortly after PEP 438 and replaced it. PEP 438 created a solution that was not very convenient, but possible to implement. With PEP 470, you are punishing the developers who took your advise and created verified external distribution assuming that it would remain available for a foreseeable future. By your own count, [2] 59 projects implemented PEP 438 verification in two years since the PEP was published. You compare that to 931 that remain vulnerable and conclude that the solution did not work. 
Given that information about PEP 438 features was very thinly disseminated, I think 59 is a large number and it would be appropriate to involve the developers of those packages in the discussion that led to PEP 470. [1]: https://mail.python.org/pipermail/datetime-sig/2015-August/000262.html [2]: https://www.python.org/dev/peps/pep-0470/#impact -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Thu Oct 1 03:55:11 2015 From: barry at python.org (Barry Warsaw) Date: Wed, 30 Sep 2015 21:55:11 -0400 Subject: [Python-Dev] PEP acceptance and SIGs In-Reply-To: References: Message-ID: <20150930215511.5eb5e95e@limelight.wooz.org> On Sep 30, 2015, at 09:14 PM, Alexander Belopolsky wrote: >When I faced a similar situation with PEP 495, Guido's advise was "I think >that a courtesy message to python-dev is appropriate, with a link to the >PEP and an invitation to discuss its merits on datetime-sig." [1] Certainly Discussions-To PEPs can be discussed and resolved on mailing lists other than python-dev, and the Resolution header must be set to the URL of the message in that other list that resolves the PEP. A courtesy email to python-dev should go out for all such PEPs, both before resolution (e.g. "hey we're discussing PEP 4000 over on py-in-the-sky-thon at python.org") and after. I do think Standards-Track PEPs (i.e. those that change the language or stdlib) must be discussed and/or posted on python-dev prior to resolution. Cheers, -Barry From donald at stufft.io Thu Oct 1 03:58:40 2015 From: donald at stufft.io (Donald Stufft) Date: Wed, 30 Sep 2015 21:58:40 -0400 Subject: [Python-Dev] PEP acceptance and SIGs In-Reply-To: References: Message-ID: On September 30, 2015 at 9:14:42 PM, Alexander Belopolsky (alexander.belopolsky at gmail.com) wrote: > On Wed, Sep 30, 2015 at 8:32 PM, Donald Stufft wrote: > > > > I don?t see any requirement to post PEPs to python-dev if they have a > Discussions-To header in PEP 1. > > > When I faced a similar situation with PEP 495, Guido's advise was "I think > that a courtesy message to python-dev is appropriate, with a link to the > PEP and an invitation to discuss its merits on datetime-sig." [1] > > Maybe it is time to clarify that in PEP 1. An obvious difference is that PEP 495 modifies the standard library and PEP 470 does not (nor do most of the packaging related PEPs, the one that did was discussed on python-dev). > > > I don?t really think it makes sense in this case either tbh, PyPI, pip, > and setuptools are not under python-dev?s banner. > > Given that ensurepip is part of stdlib, I am not sure this is an accurate > statement. PEP 453 states: ? ? The bootstrapped software will still remain external to CPython and this ? ? PEP does not include CPython subsuming the development responsibilities or ? ? design decisions of the bootstrapped software. It was an explicit decision in PEP 453 that these projects still remain independent precisely for this reason. > Even if it was, did you make any effort to discuss the proposal > outside of a small group subscribed to distutils ML? Did I personally? No. The discussion that started PEP 470 actually started on python-dev over a year ago and moved to distutils-sig because people told us to take the packaging stuff off python-dev. However, since it was a PEP, all CPython core developers should have gotten notifications for every modification to the text via the python-checkins mailing list, which all Cpython core developers are expected to be subscribed to. 
In addition to that, it was also posted as an article to LWN towards the begining of the discussion around it [1]. I don't think it's unreasonable to say that if you want a say in how things are changed, then you should be subscribed to the location where changes get discussed, in this case, distutils-sig. > > My main issue with PEP 470 is that it came shortly after PEP 438 and > replaced it. PEP 438 created a solution that was not very convenient, but > possible to implement. With PEP 470, you are punishing the developers who > took your advise and created verified external distribution assuming that > it would remain available for a foreseeable future. By your own count, > [2] 59 projects implemented PEP 438 verification in two years since the PEP > was published. You compare that to 931 that remain vulnerable and conclude > that the solution did not work. Given that information about PEP 438 > features was very thinly disseminated, I think 59 is a large number and it > would be appropriate to involve the developers of those packages in the > discussion that led to PEP 470. Two years is not a short amount of time in the current pace of Python's packaging evolution. Honestly though, we tried PEP 438 and the end result was pip's users were regularly confused. In hindsight, PEP 438 was not a very good PEP. It was impossible to implement in a way that wasn't massively confusing to end users and indeed, our issue tracker, and my personal email, and IRC got fairly regular complaints from end users due to the confusing state that PEP 438 left us in. With my pip core developer hat on, I decided that it was no longer tennable and I fully planned to implement something to replace PEP 438 regardless of if there was a PEP to back it or not (and as an external project, pip is not bound to follow any PEP, our unofficial policy is to attempt to follow all PEPs unless they are harmful or useless). However I decided that it would be a better overall outcome if PEP 438 was replaced with something that didn't have the problems of PEP 438 and to give folks who cared enough to "show up" to discuss it to have a chance to weigh in on it. I should also mention that it wasn't 59 projects implemented PEP 438 in two years, PEP 438 was explicitly implemented in a way that matched the way that a few projects were already using to get verified downloads. If my memory is correct, there were ~30 some projects that already had PEP 438 compliant URLs so realistically, only ~20 some projects willfully and knowingly took advantage of PEP 438, the rest were just left overs from before. Of those ~20, a handful of them were small projects that were utilities to make PEP 438 easier which aren't really relevant when discussing if PEP 438 was successful or not. > > [1]: https://mail.python.org/pipermail/datetime-sig/2015-August/000262.html > [2]: https://www.python.org/dev/peps/pep-0470/#impact >? [1]: https://lwn.net/Articles/599793/ ----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From ncoghlan at gmail.com Thu Oct 1 08:29:24 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 1 Oct 2015 16:29:24 +1000 Subject: [Python-Dev] PEP acceptance and SIGs In-Reply-To: References: Message-ID: On 1 October 2015 at 10:19, Alexander Belopolsky wrote: > It has been my understanding that some PEPs may be discussed on specialized > mailings lists, but a notice would be given on python-dev prior to any > acceptance. 
No, that only applies for changes that actually impact the reference interpreter or the standard library - those still need to be discussed and approved on python-dev. For changes that are wholly decoupled from the reference interpreter (like PyPI behavioural changes), the delegation of authority to the relevant SIG is more complete: ================= PEP review and resolution may also occur on a list other than python-dev (for example, distutils-sig for packaging related PEPs that don't immediately affect the standard library). In this case, the "Discussions-To" heading in the PEP will identify the appropriate alternative list where discussion, review and pronouncement on the PEP will occur. ================= (From https://www.python.org/dev/peps/pep-0001/#pep-review-resolution ) Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From p.f.moore at gmail.com Thu Oct 1 10:32:00 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 1 Oct 2015 09:32:00 +0100 Subject: [Python-Dev] VS 2010 compiler In-Reply-To: References: <-2386876371239912889@unknownmsgid> <853517484730244188@unknownmsgid> <3651291780988629080@unknownmsgid> Message-ID: On 30 September 2015 at 20:15, Paul Moore wrote: > I'll push an addition to packaging.python.org, probably tomorrow. https://github.com/pypa/python-packaging-user-guide/pull/175 Unless there's a discussion on the PR, I'll probably commit it in a day or so. Paul From chris.barker at noaa.gov Thu Oct 1 19:18:04 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 1 Oct 2015 10:18:04 -0700 Subject: [Python-Dev] VS 2010 compiler In-Reply-To: References: <-2386876371239912889@unknownmsgid> <853517484730244188@unknownmsgid> <3651291780988629080@unknownmsgid> Message-ID: On Wed, Sep 30, 2015 at 12:15 PM, Paul Moore wrote: > > This, unfortunately is non-trivial, and really a pain if you want to > > automate builds. > > Please clarify. First point -- that was intended to be a lament, not a criticism. And certainly not a criticism of anything Python devs are doing / have done. It's more difficult than it "should" be because of how MS distributes and configures the compiler and SDK, and that they are no longer distributing the same thing they used to distribute (i.e. VS2010 Express). I'm not at all sure that there is anything we can do to help.... But once upon a time, it really was as simple as: 1) Install this freely available compiler from MS 2) $ python setup.py build and that is the case now with py2.7 and (I assume) py3.5 Compared to that, setting environment variables, having to use a particular command prompt, etc. is non trivial. Part of the problem is that there are multiple audiences here: 1) experienced Windows devs that are writing and building their own packages: - not a big deal there, though it's still hard to figure out what is the "right" way to do it -- adding this to the packaging docs will help, I hope. 2) developers on other platforms that want to be able to build and distribute their package for Windows users. - this is a semi-big deal for them -- they need to learn a bit about how Windows works, and will find that annoying and frustrating because "why can't this be as easy as Linux?" (or really "the same as Linux") ;-) 3) users that want to be able to "pip install", and find that there is no binary wheel available, and they get the dreaded "can't find vcvarsall.bat" message. - these are the hardest people to support -- they may know very, very little about Windows command lines and environment variables etc. 
4) users that are not particularly familiar with command lines or development tools, but need a little C extension -- Cython, in particular, makes it really easy to do that -- but then you need to buld the darn thing. distutils (setuptools) makes it possible to do that with very, very little understanding of build tools, but you have to get set up first... The whole process (including finding these instructions) is non-trivial for everyone of these use cases other than (1)[1]. What is non-trivial? Installing the SDK? I know, but > we said that's out of scope. Using an SDK command prompt? It is, sort > of, particularly if (like me) you use powershell. or gitBash :-) > But again, not our > issue. I assume setting the environment variable isn't an issue - you > can do it for the session rather than globally, so even restrictive > permissions aren't a problem. > but both of these add steps to the "$python setup.py install" -- that really des make a difference to many users (see above) I but it > is an acknowledgement that often the audience for this sort of > instruction are stumped by Microsoft's less than intuitive install > processes... > yup ! > For context, installing mingw is just as messy, complicated and error > prone (I speak from experience :-)) God yes! > so it's unfair to complain that > the above is a non-trivial pain. I know of no install option that's > *less* straightforward than this (except of course for "install any > version of Visual Studio 2010, even the free ones" - if you have > access to those, use them!) > exactly! but that is the standard I'm comparing against. When it comes to usability, there really is no argument about whether something is "easy to use" -- As we've discussed, this comes up over an over again on StackOverflow, various mailing lists, assorted Blogs, -- there really is no debate about this being "easy". What can be done about it is a different question. My argument is that docs can help, and maybe once this is in the packaging docs, all those mailing list questions can be answered with a link to that. > For automation, why not use Appveyor? See > https://packaging.python.org/en/latest/appveyor/ yup -- very cool, I've been meaning to get that set up for distributing some stuff. > Unless you meant > setting up a local build machine. Exactly -- you need that while developing, anyway. And at the moment, I have a non-sophisticated user-base that needs to be working with a frequently updated gitHub repo -- so needs to build themselves. (OK, if I had Appveyor doing updated binary packages every push, they wouldn't need to build themselves -- maybe someday) > If you want a simple "install a > Python build environment" process, you could look at > https://github.com/pfmoore/pybuild - nice! I'll checdk that out. But I"m confused -- right in there, you write: "setting up a Windows build environment can be a significant challenge" then you ask me what is "non trivial" :-) I haven't used it in a while (as > it's of no relevance to me, because I have VS2010) but it does work. I > never publicised or distributed it, because I got too much pushback in > terms of "but it doesn't work right on my system" (typically because > the system in question usually *wasn't* a clean build of Windows) that > I didn't have time or energy to address. But if it works for you, go > for it. > I'll give it a shot, thanks! I'll push an addition to packaging.python.org, probably tomorrow. I'll look at that, too -- thanks much for your attention to this. 
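For reference, the "extra steps" lamented earlier in this message usually amount to something like the following, run from an SDK command prompt. This is an illustrative sketch, not a prescription: the SetEnv.cmd path depends on which SDK is installed (7.0 for the VS2008-based 2.7, 7.1 for the VS2010-based 3.3/3.4), while DISTUTILS_USE_SDK and MSSdk are the environment variables distutils checks before trusting an already-configured SDK environment:

    rem illustrative path -- adjust to the SDK version actually installed
    call "C:\Program Files\Microsoft SDKs\Windows\v7.0\Bin\SetEnv.cmd" /release /x64
    set DISTUTILS_USE_SDK=1
    set MSSdk=1
    python setup.py build

Compared with the old "install the Express compiler and run setup.py" flow, it is exactly these extra, easy-to-get-wrong steps that trip up user groups 2-4 above.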
-CHB [1] Actually, I've had some of the larger struggles with folks that ARE experienced Windows devs -- they tend to think "I know how to use Visual Studio; I know how to build libs", and then they try to go off and do it by hand, and/or with their favorite verson of the compiler... I've had to pound it home again and again that you REALLY DO want to use distutils (or setuptools) and the version of VS that Python was compiled with. :-) -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu Oct 1 21:59:05 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 1 Oct 2015 20:59:05 +0100 Subject: [Python-Dev] VS 2010 compiler In-Reply-To: References: <-2386876371239912889@unknownmsgid> <853517484730244188@unknownmsgid> <3651291780988629080@unknownmsgid> Message-ID: On 1 October 2015 at 18:18, Chris Barker wrote: >> If you want a simple "install a >> Python build environment" process, you could look at >> https://github.com/pfmoore/pybuild - > > nice! I'll checdk that out. But I"m confused -- right in there, you write: > > "setting up a Windows build environment can be a significant challenge" > > then you ask me what is "non trivial" :-) You caught me out :-) Paul From chris.barker at noaa.gov Fri Oct 2 01:19:57 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 1 Oct 2015 16:19:57 -0700 Subject: [Python-Dev] VS 2010 compiler In-Reply-To: References: <-2386876371239912889@unknownmsgid> <853517484730244188@unknownmsgid> <3651291780988629080@unknownmsgid> Message-ID: I don't know that anyone disagrees with my point, but, less than an hour ago, this on the wxPython list: """ Microsoft no longer sells the compiler that's needed to build Python 3.4, and the needed compiler for Python 3.5 is free. """ To be fair, if you are trying to build wxWidgets, you may well want the whole IDE, but still... -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Fri Oct 2 09:18:43 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 2 Oct 2015 09:18:43 +0200 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? Message-ID: Hi, I created the issue "Add sys.debug_build public variable to check if Python was compiled in debug mode": http://bugs.python.org/issue25256 I would like to add an obvious way to check if Python was compiled in debug mode, instead of having hacks/tips to check it. On the Internet, I found various recipes to check if Python is compiled is debug mode. Sadly, some of them are not portable. For example, 3 different checks are proposed on StackOverflow but 2 of them are specific to Windows: http://stackoverflow.com/questions/646518/python-how-to-detect-debug-interpreter Even if the exact impact of a debug build depends on the Python implementation and the Python version, we can use it to have the same behaviour on all Python implementations. 
For example, the warnings module shows warnings by default if Python is compiled in debug mode: Extract of my patch: - if hasattr(sys, 'gettotalrefcount'): + if sys.debug_build: resource_action = "always" else: resource_action = "ignore" Alternative: Add a new sys.implementation.debug_build flag. Problem: extending sys.implementation requires a new PEP, and I don't think that debug_build fits into this object. Berker Peksag likes the idea. Serhiy Storchaka dislike the new flag: "I don't like this. The sys module is one of most used module, but it has too many members, and adding yet one makes the situation worse." (sys has 81 symbols) "Checking for debug mode is not often needed, and mainly in tests. Current way ``hasattr(sys, 'gettotalrefcount')`` works good. You also can check ``'d' in sys.abiflags`` if it looks cleaner to you. Or add a member to test.support." The name "debug_build" comes from the existing sysconfig.is_python_build() function. There is a sys.flags.debug flag, so "sys.debug" can be confusing. I prefer to attach the "build" suffix. First I proposed a function sys.is_debug_build(), but a flag is simpler than a function. There is not need to compute a version it's known at build time. What do you think? Should we add sys.debug_build? Victor From ncoghlan at gmail.com Fri Oct 2 09:37:07 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 2 Oct 2015 17:37:07 +1000 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: On 2 October 2015 at 17:18, Victor Stinner wrote: > What do you think? Should we add sys.debug_build? Spell it as "sys.implementation.debug_build" and I'm in favour. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From victor.stinner at gmail.com Fri Oct 2 11:46:05 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 2 Oct 2015 11:46:05 +0200 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: 2015-10-02 9:37 GMT+02:00 Nick Coghlan : > Spell it as "sys.implementation.debug_build" and I'm in favour. Oh, in fact, I don't have no preference between sys.debug_flag and sys.implementation.debug_flag. If I understood correctly, Serhiy would prefer sys.implementation.debug_flag because he doesn't want to add yet another symbol to the sys namespace. But Berker Peksag wrote: "According to the sys.implementation documentation and PEP 421, we can only add a private attribute without writing a PEP. But I find sys.implementation._debug_build too long and ``from sys import implementation; implementation._debug_build``(or ``from sys import implementation as i; i._debug_build``) is also not easy to write. So I'm +1 to sys.debug_build." Should I write a PEP for a new field in sys.implementation? Victor From nirsof at gmail.com Fri Oct 2 13:16:48 2015 From: nirsof at gmail.com (Nir Soffer) Date: Fri, 2 Oct 2015 14:16:48 +0300 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: Whats wrong with: >>> sysconfig.get_config_var('Py_DEBUG') 0 Nir On Fri, Oct 2, 2015 at 10:18 AM, Victor Stinner wrote: > Hi, > > I created the issue "Add sys.debug_build public variable to check if > Python was compiled in debug mode": http://bugs.python.org/issue25256 > > I would like to add an obvious way to check if Python was compiled in > debug mode, instead of having hacks/tips to check it. On the Internet, > I found various recipes to check if Python is compiled is debug mode. > Sadly, some of them are not portable. 
For example, 3 different checks > are proposed on StackOverflow but 2 of them are specific to Windows: > > http://stackoverflow.com/questions/646518/python-how-to-detect-debug-interpreter > > Even if the exact impact of a debug build depends on the Python > implementation and the Python version, we can use it to have the same > behaviour on all Python implementations. For example, the warnings > module shows warnings by default if Python is compiled in debug mode: > Extract of my patch: > > - if hasattr(sys, 'gettotalrefcount'): > + if sys.debug_build: > resource_action = "always" > else: > resource_action = "ignore" > > Alternative: Add a new sys.implementation.debug_build flag. Problem: > extending sys.implementation requires a new PEP, and I don't think > that debug_build fits into this object. > > Berker Peksag likes the idea. > > Serhiy Storchaka dislike the new flag: "I don't like this. The sys > module is one of most used module, but it has too many members, and > adding yet one makes the situation worse." (sys has 81 symbols) > "Checking for debug mode is not often needed, and mainly in tests. > Current way ``hasattr(sys, 'gettotalrefcount')`` works good. You also > can check ``'d' in sys.abiflags`` if it looks cleaner to you. Or add a > member to test.support." > > The name "debug_build" comes from the existing > sysconfig.is_python_build() function. There is a sys.flags.debug flag, > so "sys.debug" can be confusing. I prefer to attach the "build" > suffix. > > First I proposed a function sys.is_debug_build(), but a flag is > simpler than a function. There is not need to compute a version it's > known at build time. > > What do you think? Should we add sys.debug_build? > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/nirsof%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Fri Oct 2 14:19:32 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 2 Oct 2015 14:19:32 +0200 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: 2015-10-02 13:16 GMT+02:00 Nir Soffer : > Whats wrong with: > >>>> sysconfig.get_config_var('Py_DEBUG') > 0 Again, refer to my first message "On the Internet, I found various recipes to check if Python is compiled is debug mode. Sadly, some of them are not portable." I don't think that sysconfig.get_config_var('Py_DEBUG') will work on other Python implementations. On Windows, there is no such file like "Makefile" used to fill syscofig.get_config_vars() :-( sysconfig._init_non_posix() only fills a few variables like BINDIR or INCLUDEPY, but not Py_DEBUG. Victor From fijall at gmail.com Fri Oct 2 15:17:26 2015 From: fijall at gmail.com (Maciej Fijalkowski) Date: Fri, 2 Oct 2015 15:17:26 +0200 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: Speaking of other python implementations - why would you even care? (the pypy debug build has very different properties and does very different stuff for example). 
I would be very happy to have this clearly marked as implementation-dependent and that's why it would be cool to not be in sys (there are already 5 symbols there for this reason, so hasattr totalrefcount is cool enough) On Fri, Oct 2, 2015 at 2:19 PM, Victor Stinner wrote: > 2015-10-02 13:16 GMT+02:00 Nir Soffer : >> Whats wrong with: >> >>>>> sysconfig.get_config_var('Py_DEBUG') >> 0 > > Again, refer to my first message "On the Internet, I found various > recipes to check if Python is compiled is debug mode. Sadly, some of > them are not portable." > > I don't think that sysconfig.get_config_var('Py_DEBUG') will work on > other Python implementations. > > On Windows, there is no such file like "Makefile" used to fill > syscofig.get_config_vars() :-( sysconfig._init_non_posix() only fills > a few variables like BINDIR or INCLUDEPY, but not Py_DEBUG. > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com From barry at python.org Fri Oct 2 15:57:15 2015 From: barry at python.org (Barry Warsaw) Date: Fri, 2 Oct 2015 09:57:15 -0400 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: <20151002095715.1363f903@limelight.wooz.org> On Oct 02, 2015, at 11:46 AM, Victor Stinner wrote: >Should I write a PEP for a new field in sys.implementation? Specifically PEP 421 says that a PEP is needed if the new sys.implementation attribute is required to be defined in all implementations, i.e. it's a new required attribute. Will debug_build be required of all implementations? The PEP can be short. https://www.python.org/dev/peps/pep-0421/#id30 If it's only relevant for CPython then an underscore-prefix symbol in sys.implementation is the right place for it, and no PEP is needed. Just open an issue on the tracker. Cheers, -Barry From storchaka at gmail.com Fri Oct 2 16:43:56 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 2 Oct 2015 17:43:56 +0300 Subject: [Python-Dev] Issue #25256: Add sys.debug_build? In-Reply-To: References: Message-ID: On 02.10.15 10:18, Victor Stinner wrote: > I would like to add an obvious way to check if Python was compiled in > debug mode, instead of having hacks/tips to check it. On the Internet, > I found various recipes to check if Python is compiled is debug mode. > Sadly, some of them are not portable. I agree with Maciej. Why do you need to check if Python is compiled in debug mode? Because either you need to use some feature that is available only in debug mode (such as sys.gettotalrefcount), or to distinguish debug and non-debug binaries (in sysconfig and distutils), or make the behaviour different in debug mode (in warnings), or handle other behaviour differences (such as additional asserts or different stack consumption). In the first case you should just explicitly check the existence of related function. In the second case I suggest to use sys.abiflags as a suffix (likely you want to distinguish pymalloc from non-pymalloc build), or at least make a suffix depended on sys.abiflags. In the third case perhaps we have to set default warning level depending on sys.flags.debug, or sys.flags.verbose, or other dynamic flags. In the fourth case there is no good solution, but in any case this behaviour is implementation specific, and other implementation can have different consistency checks and different limits. 
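To make the options discussed in this thread concrete, here is a minimal, illustrative sketch (not taken from any of the messages above) combining the existing checks. Each branch is one of the hacks the proposed sys.debug_build flag would replace, and, as Maciej and Serhiy point out, none of them is guaranteed to mean anything on alternative implementations such as PyPy:

    import sys
    import sysconfig

    def is_debug_build():
        # CPython only: sys.gettotalrefcount() exists solely in Py_DEBUG builds.
        if hasattr(sys, 'gettotalrefcount'):
            return True
        # POSIX builds of Python 3 expose ABI flags; 'd' marks a debug build.
        # sys.abiflags does not exist on Windows, hence the getattr default.
        if 'd' in getattr(sys, 'abiflags', ''):
            return True
        # sysconfig knows Py_DEBUG where configure/Makefile data is available;
        # on Windows the variable is typically absent (None), i.e. falsy.
        return bool(sysconfig.get_config_var('Py_DEBUG'))

With something like sys.debug_build (or sys.implementation.debug_build), the whole function would collapse to a single attribute lookup.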
From status at bugs.python.org Fri Oct 2 18:08:30 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 2 Oct 2015 18:08:30 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20151002160830.7C4A656894@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-09-25 - 2015-10-02) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 5155 (+13) closed 31906 (+57) total 37061 (+70) Open issues with patches: 2286 Issues opened (48) ================== #25235: EmailMessage.add_attachment() creates parts with spurious MIME http://bugs.python.org/issue25235 opened by groner #25237: Add doc for tkinter commondialog.Dialog and subclasses http://bugs.python.org/issue25237 opened by terry.reedy #25238: Version added of context parameter for xmlrpc.client.ServerPro http://bugs.python.org/issue25238 opened by desbma #25239: HTMLParser handle_starttag replaces entity references in attri http://bugs.python.org/issue25239 opened by frogcoder #25240: Stack overflow in reprlib causes a core dump http://bugs.python.org/issue25240 opened by ceridwen #25242: Failed tests for Python 3.5.0 on shared virtual host http://bugs.python.org/issue25242 opened by Open Genomes #25243: decouple string-to-boolean logic from ConfigParser.getboolean http://bugs.python.org/issue25243 opened by jab #25244: Idle: refine right-click behavior http://bugs.python.org/issue25244 opened by terry.reedy #25246: Alternative algorithm for deque_remove() http://bugs.python.org/issue25246 opened by rhettinger #25247: Tkinter modules built successfully but removed because they co http://bugs.python.org/issue25247 opened by abrantesasf #25251: Unknown MS Compiler version 1900 http://bugs.python.org/issue25251 opened by Matt.Hickford #25252: Hard-coded line ending in asyncio.streams.StreamReader.readlin http://bugs.python.org/issue25252 opened by eric.smith #25254: Idle: debugger source line highlighting fails again http://bugs.python.org/issue25254 opened by terry.reedy #25256: Add sys.debug_build public variable to check if Python was com http://bugs.python.org/issue25256 opened by haypo #25257: In subject line email library inserts unwanted space after a t http://bugs.python.org/issue25257 opened by SegundoBob #25258: HtmlParser doesn't handle void element tags correctly http://bugs.python.org/issue25258 opened by Chenyun Yang #25259: readline macros can segfault Python http://bugs.python.org/issue25259 opened by gumnos #25263: test_tkinter fails randomly on the buildbots "AMD64 Windows10" http://bugs.python.org/issue25263 opened by haypo #25264: test_marshal always crashs on "AMD64 Windows10 2.7" buildbot http://bugs.python.org/issue25264 opened by haypo #25266: mako benchmark not working in Python 3.6 http://bugs.python.org/issue25266 opened by florin.papa #25268: Support pointing frozen modules to the corresponding source fi http://bugs.python.org/issue25268 opened by eric.snow #25269: Add method to detect if a string contains surrogates http://bugs.python.org/issue25269 opened by r.david.murray #25270: codecs.escape_encode systemerror on empty byte string http://bugs.python.org/issue25270 opened by reaperhulk #25272: asyncio tests are getting noisy http://bugs.python.org/issue25272 opened by gvanrossum #25274: sys.setrecursionlimit() must fail if the current recursion dep http://bugs.python.org/issue25274 opened by haypo #25275: Documentation v/s behaviour mismatch wrt integer 
literals cont http://bugs.python.org/issue25275 opened by shreevatsa #25276: Intermittent segfaults on PPC64 AIX 3.x http://bugs.python.org/issue25276 opened by haypo #25277: test_eintr hangs on randomly on "AMD64 FreeBSD 9.x 3.x" http://bugs.python.org/issue25277 opened by haypo #25278: Unexpected socket exception on SFTP 'STOR' command http://bugs.python.org/issue25278 opened by blanquier #25282: regex: Support for recursive patterns http://bugs.python.org/issue25282 opened by Sworddragon #25283: Make tm_gmtoff and tm_zone available on all platforms http://bugs.python.org/issue25283 opened by belopolsky #25285: regrtest: run tests in subprocesses with -j1 on buildbots http://bugs.python.org/issue25285 opened by haypo #25286: views are not sequences http://bugs.python.org/issue25286 opened by akira #25287: test_crypt fails on OpenBSD http://bugs.python.org/issue25287 opened by haypo #25289: test_strptime hangs sometimes on AMD64 Windows7 SP1 3.x buildb http://bugs.python.org/issue25289 opened by haypo #25290: csv.reader: minor docstring typo http://bugs.python.org/issue25290 opened by wasserverein #25291: better Exception message for certain task termination scenario http://bugs.python.org/issue25291 opened by ovex #25292: ssl socket gets into broken state when client exits during han http://bugs.python.org/issue25292 opened by ovex #25293: Hooking Thread/Process instantiation in concurrent.futures. http://bugs.python.org/issue25293 opened by Antony.Lee #25294: Absolute imports fail in some cases where relative imports wou http://bugs.python.org/issue25294 opened by Patrick Maupin #25295: functools.lru_cache raises KeyError http://bugs.python.org/issue25295 opened by Peter Brady #25296: Simple End-of-life guide covering all unsupported versions http://bugs.python.org/issue25296 opened by ncoghlan #25297: max_help_position is not works in argparse library http://bugs.python.org/issue25297 opened by morden2k #25298: Add lock and rlock weakref tests http://bugs.python.org/issue25298 opened by nirs #25299: TypeError: __init__() takes at least 4 arguments (4 given) http://bugs.python.org/issue25299 opened by A. Skrobov #25300: Enable Intel MPX (Memory protection Extensions) feature http://bugs.python.org/issue25300 opened by florin.papa #25301: Optimize UTF-8 decoder with error handlers http://bugs.python.org/issue25301 opened by haypo #25302: Memory Leaks with Address Sanitizer http://bugs.python.org/issue25302 opened by Matt Clarkson Most recent 15 issues with no replies (15) ========================================== #25302: Memory Leaks with Address Sanitizer http://bugs.python.org/issue25302 #25301: Optimize UTF-8 decoder with error handlers http://bugs.python.org/issue25301 #25298: Add lock and rlock weakref tests http://bugs.python.org/issue25298 #25294: Absolute imports fail in some cases where relative imports wou http://bugs.python.org/issue25294 #25293: Hooking Thread/Process instantiation in concurrent.futures. 
http://bugs.python.org/issue25293 #25292: ssl socket gets into broken state when client exits during han http://bugs.python.org/issue25292 #25290: csv.reader: minor docstring typo http://bugs.python.org/issue25290 #25287: test_crypt fails on OpenBSD http://bugs.python.org/issue25287 #25285: regrtest: run tests in subprocesses with -j1 on buildbots http://bugs.python.org/issue25285 #25283: Make tm_gmtoff and tm_zone available on all platforms http://bugs.python.org/issue25283 #25269: Add method to detect if a string contains surrogates http://bugs.python.org/issue25269 #25264: test_marshal always crashs on "AMD64 Windows10 2.7" buildbot http://bugs.python.org/issue25264 #25263: test_tkinter fails randomly on the buildbots "AMD64 Windows10" http://bugs.python.org/issue25263 #25246: Alternative algorithm for deque_remove() http://bugs.python.org/issue25246 #25244: Idle: refine right-click behavior http://bugs.python.org/issue25244 Most recent 15 issues waiting for review (15) ============================================= #25300: Enable Intel MPX (Memory protection Extensions) feature http://bugs.python.org/issue25300 #25298: Add lock and rlock weakref tests http://bugs.python.org/issue25298 #25287: test_crypt fails on OpenBSD http://bugs.python.org/issue25287 #25286: views are not sequences http://bugs.python.org/issue25286 #25285: regrtest: run tests in subprocesses with -j1 on buildbots http://bugs.python.org/issue25285 #25274: sys.setrecursionlimit() must fail if the current recursion dep http://bugs.python.org/issue25274 #25270: codecs.escape_encode systemerror on empty byte string http://bugs.python.org/issue25270 #25266: mako benchmark not working in Python 3.6 http://bugs.python.org/issue25266 #25256: Add sys.debug_build public variable to check if Python was com http://bugs.python.org/issue25256 #25251: Unknown MS Compiler version 1900 http://bugs.python.org/issue25251 #25246: Alternative algorithm for deque_remove() http://bugs.python.org/issue25246 #25235: EmailMessage.add_attachment() creates parts with spurious MIME http://bugs.python.org/issue25235 #25232: CGIRequestHandler behave incorrectly with query component cons http://bugs.python.org/issue25232 #25229: distutils doesn't add "-Wl," prefix to "-R" on Linux if the C http://bugs.python.org/issue25229 #25228: Regression in cookie parsing with brackets and quotes http://bugs.python.org/issue25228 Top 10 most discussed issues (10) ================================= #25220: Enhance and refactor test.regrtest (convert regrtest.py to a p http://bugs.python.org/issue25220 21 msgs #25274: sys.setrecursionlimit() must fail if the current recursion dep http://bugs.python.org/issue25274 15 msgs #25125: "Edit with IDLE" does not work for shortcuts http://bugs.python.org/issue25125 12 msgs #24820: IDLE themes for light on dark http://bugs.python.org/issue24820 11 msgs #25296: Simple End-of-life guide covering all unsupported versions http://bugs.python.org/issue25296 11 msgs #25256: Add sys.debug_build public variable to check if Python was com http://bugs.python.org/issue25256 9 msgs #18814: Add utilities to "clean" surrogate code points from strings http://bugs.python.org/issue18814 8 msgs #25001: Make --nowindows argument to regrtest propagate when running w http://bugs.python.org/issue25001 7 msgs #25275: Documentation v/s behaviour mismatch wrt integer literals cont http://bugs.python.org/issue25275 7 msgs #25276: Intermittent segfaults on PPC64 AIX 3.x http://bugs.python.org/issue25276 7 msgs Issues closed (56) ================== 
#10485: http.server fails when query string contains addition '?' char http://bugs.python.org/issue10485 closed by martin.panter #11215: test_fileio error on AIX http://bugs.python.org/issue11215 closed by haypo #12219: tkinter.filedialog.askopenfilename XT dialog on Windows 7 http://bugs.python.org/issue12219 closed by terry.reedy #14566: run_cgi reverts to using unnormalized path http://bugs.python.org/issue14566 closed by martin.panter #22413: Bizarre StringIO(newline="\r\n") translation http://bugs.python.org/issue22413 closed by pitrou #22609: Constructors of some mapping classes don't accept `self` keywo http://bugs.python.org/issue22609 closed by serhiy.storchaka #22958: Constructors of weakref mapping classes don't accept "self" an http://bugs.python.org/issue22958 closed by serhiy.storchaka #23546: Windows, 'Edit with IDLE', and multiple installed versions http://bugs.python.org/issue23546 closed by steve.dower #23600: tizinfo.fromutc changed for tzinfo wih StdOffset=0, DstOffset= http://bugs.python.org/issue23600 closed by belopolsky #24028: Idle: add doc subsection on calltips http://bugs.python.org/issue24028 closed by terry.reedy #24483: Avoid repeated hash calculation in C implementation of functoo http://bugs.python.org/issue24483 closed by serhiy.storchaka #24570: IDLE Autocomplete and Call Tips Do Not Pop Up on OS X with Act http://bugs.python.org/issue24570 closed by terry.reedy #24884: Add method reopenFile() in WatchedFileHandler class http://bugs.python.org/issue24884 closed by python-dev #24972: IDLE: revisit text highlighting for inactive windows on win32 http://bugs.python.org/issue24972 closed by terry.reedy #24988: IDLE: debugger context menus not working on Mac http://bugs.python.org/issue24988 closed by terry.reedy #25003: os.urandom() should call getrandom(2) not getentropy(2) on Sol http://bugs.python.org/issue25003 closed by haypo #25011: Smarter rl complete: hide private and special names http://bugs.python.org/issue25011 closed by serhiy.storchaka #25034: string.Formatter accepts empty fields but displays wrong when http://bugs.python.org/issue25034 closed by eric.smith #25091: Windows Installer uses small font http://bugs.python.org/issue25091 closed by steve.dower #25097: test_logging may fail with 'Access is denied' when pywin32 is http://bugs.python.org/issue25097 closed by vinay.sajip #25111: Broken compatibility in FrameSummary equality http://bugs.python.org/issue25111 closed by serhiy.storchaka #25123: Logging Documentation - dictConfig disable_existing_loggers http://bugs.python.org/issue25123 closed by vinay.sajip #25131: The AST for dict and set displays has the lineno of the first http://bugs.python.org/issue25131 closed by python-dev #25135: Deques to adopt the standard clearing procedure for mutable ob http://bugs.python.org/issue25135 closed by rhettinger #25165: Windows uninstallation should not remove launcher if other ver http://bugs.python.org/issue25165 closed by steve.dower #25171: does not build on OpenBSD with no value defined for PY_GETENTR http://bugs.python.org/issue25171 closed by haypo #25173: IDLE - several common dialogs don't have correct parent set http://bugs.python.org/issue25173 closed by terry.reedy #25182: python -v crashes in nonencodable directory http://bugs.python.org/issue25182 closed by serhiy.storchaka #25185: Inconsistency between venv and site http://bugs.python.org/issue25185 closed by python-dev #25186: Don't duplicate _verbose_message in importlib._bootstrap and _ http://bugs.python.org/issue25186 closed by 
brett.cannon #25203: Incorrect handling MemoryError in readline.set_completer_delim http://bugs.python.org/issue25203 closed by serhiy.storchaka #25209: Append space after completed keywords http://bugs.python.org/issue25209 closed by serhiy.storchaka #25211: Error message formatting errors in int object unit-test script http://bugs.python.org/issue25211 closed by martin.panter #25227: Optimize ASCII/latin1 encoder with surrogateescape error handl http://bugs.python.org/issue25227 closed by haypo #25233: AssertionError from asyncio Queue get http://bugs.python.org/issue25233 closed by gvanrossum #25234: test_eintr.test_os_open hangs under Xcode 7 http://bugs.python.org/issue25234 closed by brett.cannon #25236: str.maketrans wrong description for optional 3rd parameter http://bugs.python.org/issue25236 closed by BrianO #25241: ctypes: access violation reading http://bugs.python.org/issue25241 closed by eryksun #25245: Compile warnings in _pickle.c http://bugs.python.org/issue25245 closed by python-dev #25248: Discrepancy in unpickling integers with protocol 0 http://bugs.python.org/issue25248 closed by serhiy.storchaka #25249: Unneeded and unsafe mkstemp replacement in test_subprocess.py http://bugs.python.org/issue25249 closed by berker.peksag #25250: AttributeError: 'MSVCCompiler' object has no attribute '_MSVCC http://bugs.python.org/issue25250 closed by steve.dower #25253: AttributeError: 'Weather' object has no attribute 'dom' http://bugs.python.org/issue25253 closed by berker.peksag #25255: Security of CPython Builds http://bugs.python.org/issue25255 closed by r.david.murray #25260: python -m test --coverage doesn't work on Windows http://bugs.python.org/issue25260 closed by haypo #25261: Incorrect Return Values for any() and all() Built-in Functions http://bugs.python.org/issue25261 closed by zach.ware #25262: Issues with BINUNICODE8 and BINBYTES8 opcodes in pickle http://bugs.python.org/issue25262 closed by serhiy.storchaka #25265: Python install failed windows 8.1- Error 0x80240017: Failed to http://bugs.python.org/issue25265 closed by haypo #25267: Optimize UTF-8 encoder with error handlers http://bugs.python.org/issue25267 closed by haypo #25271: SystemError when doing codecs.escape_encode(b'') http://bugs.python.org/issue25271 closed by zach.ware #25273: Console interpreter holds a hard link on last displayed object http://bugs.python.org/issue25273 closed by eryksun #25279: Unexpected ftplib.error_xxx exception on SFTP 'STOR' command http://bugs.python.org/issue25279 closed by r.david.murray #25280: Message can be formatted twice in importlib http://bugs.python.org/issue25280 closed by serhiy.storchaka #25281: Incorrect enum behavior during json.dumps serialization http://bugs.python.org/issue25281 closed by r.david.murray #25284: Spec for BaseEventLoop.run_in_executor(executor, callback, *ar http://bugs.python.org/issue25284 closed by asvetlov #25288: readline.py file in current directory caused unexpected code e http://bugs.python.org/issue25288 closed by r.david.murray From tjreedy at udel.edu Sat Oct 3 00:16:28 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 2 Oct 2015 18:16:28 -0400 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue Message-ID: On python-list, Chris Warrick reported (thread title): "The Nikola project is deprecating Python 2.7 (+2.x/3.x user survey results)" This is for the November release, with 2.7 dropped in the next version next year. 
(Nikola is a cross-platform unicode-based app for building static websites and blogs from user-written templates and (marked-up) text files. https://getnikola.com/ ) Since users do not write code to use Nikola, the survey was about installation of Python 3. At present, 1/2 have 3.x only, 1/3 2.x only, and 1/6 both. (So much for 'nobody uses 3.x for real work'.) Most of the 2.x only people are able and willing to install 3.x. https://getnikola.com/blog/env-survey-results-and-the-future-of-python-27.html When Stefan Behnel asked why they did not drop the hard-to-maintain 2.7 version once they ported to 3.3, Chris answered > We did it now because it all started with frustration with 2.7 [0]. > Also, doing it back in 2012/2013 would be problematic, because back > then not all Linux distros had an easily installable Python 3 stack > (and RHEL 7 still doesn?t have one in the default repos) > > [0]: http://ralsina.me/weblog/posts/floss-decision-making-in-action.html -- Terry Jan Reedy From brett at python.org Sat Oct 3 00:55:17 2015 From: brett at python.org (Brett Cannon) Date: Fri, 02 Oct 2015 22:55:17 +0000 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: References: Message-ID: Thanks for the info, Terry! Glad people are realizing that Python 3 is now available widely enough that applications can seriously consider dropping Python 2 support now. I still think 2016 is going to see this happen more and more once the Linux distros make their switches to Python 3. On Fri, 2 Oct 2015 at 15:16 Terry Reedy wrote: > On python-list, Chris Warrick reported (thread title): > "The Nikola project is deprecating Python 2.7 (+2.x/3.x user survey > results)" This is for the November release, with 2.7 dropped in the > next version next year. (Nikola is a cross-platform unicode-based app > for building static websites and blogs from user-written templates and > (marked-up) text files. https://getnikola.com/ ) > > Since users do not write code to use Nikola, the survey was about > installation of Python 3. At present, 1/2 have 3.x only, 1/3 2.x only, > and 1/6 both. (So much for 'nobody uses 3.x for real work'.) Most of > the 2.x only people are able and willing to install 3.x. > > https://getnikola.com/blog/env-survey-results-and-the-future-of-python-27.html > > When Stefan Behnel asked why they did not drop the hard-to-maintain 2.7 > version once they ported to 3.3, Chris answered > > > We did it now because it all started with frustration with 2.7 [0]. > > Also, doing it back in 2012/2013 would be problematic, because back > > then not all Linux distros had an easily installable Python 3 stack > > (and RHEL 7 still doesn?t have one in the default repos) > > > > [0]: > http://ralsina.me/weblog/posts/floss-decision-making-in-action.html > > -- > Terry Jan Reedy > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Sat Oct 3 01:03:06 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Sat, 3 Oct 2015 01:03:06 +0200 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: References: Message-ID: Fedora 23 (scheduled for the end of this month) will only come with python3 (/usr/bin/python3), no python2 (nor python), *in the base system*. 
Obviously, it will be possible to install Python 2 to install applications not compatible with Python 3 yet. Note: the current development version is Fedora 23, Fedora 24 is the next one. Ubuntu is still working on a similar change. Victor 2015-10-03 0:55 GMT+02:00 Brett Cannon : > Thanks for the info, Terry! Glad people are realizing that Python 3 is now > available widely enough that applications can seriously consider dropping > Python 2 support now. I still think 2016 is going to see this happen more > and more once the Linux distros make their switches to Python 3. > > On Fri, 2 Oct 2015 at 15:16 Terry Reedy wrote: >> >> On python-list, Chris Warrick reported (thread title): >> "The Nikola project is deprecating Python 2.7 (+2.x/3.x user survey >> results)" This is for the November release, with 2.7 dropped in the >> next version next year. (Nikola is a cross-platform unicode-based app >> for building static websites and blogs from user-written templates and >> (marked-up) text files. https://getnikola.com/ ) >> >> Since users do not write code to use Nikola, the survey was about >> installation of Python 3. At present, 1/2 have 3.x only, 1/3 2.x only, >> and 1/6 both. (So much for 'nobody uses 3.x for real work'.) Most of >> the 2.x only people are able and willing to install 3.x. >> >> https://getnikola.com/blog/env-survey-results-and-the-future-of-python-27.html >> >> When Stefan Behnel asked why they did not drop the hard-to-maintain 2.7 >> version once they ported to 3.3, Chris answered >> >> > We did it now because it all started with frustration with 2.7 [0]. >> > Also, doing it back in 2012/2013 would be problematic, because back >> > then not all Linux distros had an easily installable Python 3 stack >> > (and RHEL 7 still doesn?t have one in the default repos) >> > >> > [0]: >> http://ralsina.me/weblog/posts/floss-decision-making-in-action.html >> >> -- >> Terry Jan Reedy >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com > From victor.stinner at gmail.com Sat Oct 3 01:05:53 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Sat, 3 Oct 2015 01:05:53 +0200 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: References: Message-ID: (grr, again i sent a draft by mistake, sorry about that) Fedora 23 (scheduled for the end of this month) will only come with python3 (/usr/bin/python3), no python2 (nor python), *in the base system*. Obviously, it will be possible to install Python 2 to install applications not compatible with Python 3 yet. https://fedoraproject.org/wiki/Releases/23/ChangeSet#Python_3_as_the_Default_Implementation https://twitter.com/_solotraveller/status/645559393627435008 Ubuntu is also working on a similar change. I don't know when it will happen. Victor 2015-10-03 0:55 GMT+02:00 Brett Cannon : > Thanks for the info, Terry! Glad people are realizing that Python 3 is now > available widely enough that applications can seriously consider dropping > Python 2 support now. 
I still think 2016 is going to see this happen more > and more once the Linux distros make their switches to Python 3. > > On Fri, 2 Oct 2015 at 15:16 Terry Reedy wrote: >> >> On python-list, Chris Warrick reported (thread title): >> "The Nikola project is deprecating Python 2.7 (+2.x/3.x user survey >> results)" This is for the November release, with 2.7 dropped in the >> next version next year. (Nikola is a cross-platform unicode-based app >> for building static websites and blogs from user-written templates and >> (marked-up) text files. https://getnikola.com/ ) >> >> Since users do not write code to use Nikola, the survey was about >> installation of Python 3. At present, 1/2 have 3.x only, 1/3 2.x only, >> and 1/6 both. (So much for 'nobody uses 3.x for real work'.) Most of >> the 2.x only people are able and willing to install 3.x. >> >> https://getnikola.com/blog/env-survey-results-and-the-future-of-python-27.html >> >> When Stefan Behnel asked why they did not drop the hard-to-maintain 2.7 >> version once they ported to 3.3, Chris answered >> >> > We did it now because it all started with frustration with 2.7 [0]. >> > Also, doing it back in 2012/2013 would be problematic, because back >> > then not all Linux distros had an easily installable Python 3 stack >> > (and RHEL 7 still doesn?t have one in the default repos) >> > >> > [0]: >> http://ralsina.me/weblog/posts/floss-decision-making-in-action.html >> >> -- >> Terry Jan Reedy >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com > From moiein2000 at gmail.com Sat Oct 3 00:34:18 2015 From: moiein2000 at gmail.com (Matthew Einhorn) Date: Fri, 2 Oct 2015 18:34:18 -0400 Subject: [Python-Dev] VS 2010 compiler In-Reply-To: References: <-2386876371239912889@unknownmsgid> <853517484730244188@unknownmsgid> <3651291780988629080@unknownmsgid> Message-ID: On Wed, Sep 30, 2015 at 3:57 PM, Carl Kleffner wrote: > Concerning the claims that mingw is difficult: > > The mingwpy package is a sligthly modified mingw-w64 based gcc toolchain, > that is in development. It is designed for simple use and for much better > compatibility to the standard MSVC python builds. It should work out of the > box, as long as the \Scripts folder is in the PATH. > Indeed, I tested it with 2.7.9 x86/x64, and wheres before I had to apply at least http://bugs.python.org/issue4709, and http://bugs.python.org/issue16472 and then generate the .a files to get it to work, this worked out of the box! (except of course still needing to change distutils.cfg). I do wonder though, which may be slightly off topic here, whether putting the mingw binary stub files (?) in python/Scripts, rather than directly having python/shared/mingwpy/bin be on the path, is the best approach? I suppose it's no worse than having the other stuff in python/scripts on the path (although with e.g. pip you could do the python -m pip trick). Are there guidelines about this (using python/scripts) for Windows? I didn't see anything mentioned when python -m pip install ... vs. pip install is discussed. 
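The distutils.cfg change mentioned above is normally just a compiler selection. A minimal, illustrative version -- the global file lives at Lib\distutils\distutils.cfg inside the Python installation, or per-user as pydistutils.cfg in the home directory -- looks like:

    [build]
    compiler = mingw32

after which a plain "python setup.py build" (and pip-driven builds, which go through the same distutils config) picks up the mingw toolchain instead of looking for MSVC.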
Although I hope a 7z version of that mingw will be separately available for those that want to share a copy between versions of python, since the files in share/mingwpy is presumably the same between python versions for a specific bitness? Matt -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Sat Oct 3 01:57:24 2015 From: barry at python.org (Barry Warsaw) Date: Fri, 2 Oct 2015 19:57:24 -0400 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: References: Message-ID: <20151002195724.7d6d9162@limelight.wooz.org> On Oct 03, 2015, at 01:05 AM, Victor Stinner wrote: >Ubuntu is also working on a similar change. I don't know when it will happen. For the desktop, we're aiming for 16.04 LTS. Cheers, -Barry From ncoghlan at gmail.com Sat Oct 3 15:59:12 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 3 Oct 2015 23:59:12 +1000 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: <20151002195724.7d6d9162@limelight.wooz.org> References: <20151002195724.7d6d9162@limelight.wooz.org> Message-ID: On 3 October 2015 at 09:57, Barry Warsaw wrote: > On Oct 03, 2015, at 01:05 AM, Victor Stinner wrote: > >>Ubuntu is also working on a similar change. I don't know when it will happen. > > For the desktop, we're aiming for 16.04 LTS. So close! Out of curiousity, I dug up the original Arch announcement linked from PEP 394: https://www.archlinux.org/news/python-is-now-python-3/ 'twas a mere 5 years ago :) Cheers, Nick. [1] https://fedoraproject.org/wiki/Releases/24/Schedule -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From tismer at stackless.com Sat Oct 3 19:49:54 2015 From: tismer at stackless.com (Christian Tismer) Date: Sat, 3 Oct 2015 19:49:54 +0200 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: References: Message-ID: <56101542.7010303@stackless.com> Great, that this finally happens. I think this was a silent revolution, initiated by nagging people, distros and larger companies about how mega-out Python2 is, until they finally started to believe it ;-) cheers -- Chris [since 2012 on Py3, charging an extra for back-porting] On 03/10/15 01:05, Victor Stinner wrote: > (grr, again i sent a draft by mistake, sorry about that) > > Fedora 23 (scheduled for the end of this month) will only come with > python3 (/usr/bin/python3), no python2 (nor python), *in the base > system*. Obviously, it will be possible to install Python 2 to install > applications not compatible with Python 3 yet. > > https://fedoraproject.org/wiki/Releases/23/ChangeSet#Python_3_as_the_Default_Implementation > https://twitter.com/_solotraveller/status/645559393627435008 > > Ubuntu is also working on a similar change. I don't know when it will happen. > > Victor > > 2015-10-03 0:55 GMT+02:00 Brett Cannon : >> Thanks for the info, Terry! Glad people are realizing that Python 3 is now >> available widely enough that applications can seriously consider dropping >> Python 2 support now. I still think 2016 is going to see this happen more >> and more once the Linux distros make their switches to Python 3. >> >> On Fri, 2 Oct 2015 at 15:16 Terry Reedy wrote: >>> >>> On python-list, Chris Warrick reported (thread title): >>> "The Nikola project is deprecating Python 2.7 (+2.x/3.x user survey >>> results)" This is for the November release, with 2.7 dropped in the >>> next version next year. 
(Nikola is a cross-platform unicode-based app >>> for building static websites and blogs from user-written templates and >>> (marked-up) text files. https://getnikola.com/ ) >>> >>> Since users do not write code to use Nikola, the survey was about >>> installation of Python 3. At present, 1/2 have 3.x only, 1/3 2.x only, >>> and 1/6 both. (So much for 'nobody uses 3.x for real work'.) Most of >>> the 2.x only people are able and willing to install 3.x. >>> >>> https://getnikola.com/blog/env-survey-results-and-the-future-of-python-27.html >>> >>> When Stefan Behnel asked why they did not drop the hard-to-maintain 2.7 >>> version once they ported to 3.3, Chris answered >>> >>> > We did it now because it all started with frustration with 2.7 [0]. >>> > Also, doing it back in 2012/2013 would be problematic, because back >>> > then not all Linux distros had an easily installable Python 3 stack >>> > (and RHEL 7 still doesn?t have one in the default repos) >>> > >>> > [0]: >>> http://ralsina.me/weblog/posts/floss-decision-making-in-action.html >>> >>> -- >>> Terry Jan Reedy >>> >>> >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tismer%40stackless.com > -- Christian Tismer :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : https://github.com/PySide 14482 Potsdam : GPG key -> 0xFB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 522 bytes Desc: OpenPGP digital signature URL: From tjreedy at udel.edu Sun Oct 4 03:35:40 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 3 Oct 2015 21:35:40 -0400 Subject: [Python-Dev] 3.4.4rc1, when? Message-ID: When, exactly, is 3.4.4c1 being branched off (which is when we should stop pushing non-critical patches)? The 3.4 PEP has no mention of 3.4.4 https://www.python.org/dev/peps/pep-0429/ -- Terry Jan Reedy From larry at hastings.org Mon Oct 5 02:37:33 2015 From: larry at hastings.org (Larry Hastings) Date: Sun, 4 Oct 2015 17:37:33 -0700 Subject: [Python-Dev] 3.4.4rc1, when? In-Reply-To: References: Message-ID: <5611C64D.7060702@hastings.org> On 10/03/2015 06:35 PM, Terry Reedy wrote: > When, exactly, is 3.4.4c1 being branched off (which is when we should > stop pushing non-critical patches)? > > The 3.4 PEP has no mention of 3.4.4 > https://www.python.org/dev/peps/pep-0429/ > I'm holding off on 3.4.4 for now. My idea is, we should ship 3.5.1 first; once that's settled, we'll work on 3.4.4, which will be the final release of 3.4 with binary installers. Cheers, //arry/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From tjreedy at udel.edu Mon Oct 5 03:26:35 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 4 Oct 2015 21:26:35 -0400 Subject: [Python-Dev] 3.4.4rc1, when? In-Reply-To: <5611C64D.7060702@hastings.org> References: <5611C64D.7060702@hastings.org> Message-ID: On 10/4/2015 8:37 PM, Larry Hastings wrote: > On 10/03/2015 06:35 PM, Terry Reedy wrote: >> When, exactly, is 3.4.4c1 being branched off (which is when we should >> stop pushing non-critical patches)? >> >> The 3.4 PEP has no mention of 3.4.4 >> https://www.python.org/dev/peps/pep-0429/ >> > > I'm holding off on 3.4.4 for now. My idea is, we should ship 3.5.1 > first; once that's settled, we'll work on 3.4.4, which will be the final > release of 3.4 with binary installers. When might 3.5.1 candidate be? -- Terry Jan Reedy From larry at hastings.org Mon Oct 5 05:44:41 2015 From: larry at hastings.org (Larry Hastings) Date: Sun, 4 Oct 2015 20:44:41 -0700 Subject: [Python-Dev] 3.4.4rc1, when? In-Reply-To: References: <5611C64D.7060702@hastings.org> Message-ID: <5611F229.8030408@hastings.org> On 10/04/2015 06:26 PM, Terry Reedy wrote: > > When might 3.5.1 candidate be? > No announcements yet! //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Mon Oct 5 20:42:16 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 5 Oct 2015 14:42:16 -0400 Subject: [Python-Dev] 3.4.4rc1, when? In-Reply-To: <5611F229.8030408@hastings.org> References: <5611C64D.7060702@hastings.org> <5611F229.8030408@hastings.org> Message-ID: On 10/4/2015 11:44 PM, Larry Hastings wrote: > On 10/04/2015 06:26 PM, Terry Reedy wrote: >> When might 3.5.1 candidate be? > No announcements yet! May I assume that you will give at least two weeks notice and that 3.5.1c1 will be at least 3 or 4 weeks off? -- Terry Jan Reedy From stevewedig at gmail.com Mon Oct 5 20:55:37 2015 From: stevewedig at gmail.com (Steve Wedig) Date: Mon, 5 Oct 2015 11:55:37 -0700 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations Message-ID: Congratulations on the release of 3.5 and Pep 484. I've used Python professionally for 10 years and I believe type hints will make it easier to work with large codebases evolving over time. My only concern about Pep 484 is the discussion of whether or not to deprecate arbitrary function annotations. https://www.python.org/dev/peps/pep-0484/ I would like to request that arbitrary function annotations are not deprecated for three reasons: 1. Backwards Compatibility 2. Type Experimentation 3. Embedded Languages 1. Backwards Compatibility After reading Pep 3107 my team has made significant use of non-standard annotations. It would be a serious burden to be forced to port our annotations back to decorators. This would also make our codebase considerably less readable because function annotations are much more readable than input/output annotations relegated to decorators. https://www.python.org/dev/peps/pep-3107/ 2. Type Experimentation Arbitrary function annotations allow developers to experiment with potential type system improvements in real projects. Ideas can be validated before officially adding them to the language. This seems like an advantage that should be preserved. After all, Pep 484 says it was strongly inspired by MyPy, an existing project. http://mypy-lang.org/ 3. Embedded Languages Python's flexibility makes it an amazing language to embed other languages in. 
In this regard, Python 3's addition of arbitrary function annotations and class decorators complements Python 2's dynamic typing, function decorators, reflection, metaclasses, properties, magic methods, generators, and keyword arguments. Arbitrary function annotations are a crucial part of this toolkit, and this feature is not available in most other languages. For anyone interested in the utility and mechanics of embedded languages, I'd recommend Martin Fowler's book: Domain Specific Languages. http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 So I agree with the course of action mentioned in Pep 484 that avoids runtime deprecation of arbitrary function annotation: "Another possible outcome would be that type hints will eventually become the default meaning for annotations, but that there will always remain an option to disable them." I would only add that there should be a way to disable type checking for an entire directory (recursively). This would be useful for codebases that have not been ported to standard annotations yet, and for codebases that will not be ported for the reasons listed above. Thanks for your consideration. Best, Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From larry at hastings.org Mon Oct 5 21:39:12 2015 From: larry at hastings.org (Larry Hastings) Date: Mon, 5 Oct 2015 12:39:12 -0700 Subject: [Python-Dev] 3.4.4rc1, when? In-Reply-To: References: <5611C64D.7060702@hastings.org> <5611F229.8030408@hastings.org> Message-ID: <5612D1E0.5070702@hastings.org> On 10/05/2015 11:42 AM, Terry Reedy wrote: > On 10/4/2015 11:44 PM, Larry Hastings wrote: >> On 10/04/2015 06:26 PM, Terry Reedy wrote: >>> When might 3.5.1 candidate be? >> No announcements yet! > > May I assume that you will give at least two weeks notice and that > 3.5.1c1 will be at least 3 or 4 weeks off? > Yes. To be specific: I'd want to give at least two weeks notice before 3.5.1rc1, and 3.5.1 final would be at least a week after that (though perhaps no more than that). Cheers, //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon Oct 5 21:57:07 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 05 Oct 2015 14:57:07 -0500 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: Message-ID: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> There is one reason I would be really freaking mad if they deprecated other uses of annotations: https://pypi.python.org/pypi/plac On October 5, 2015 1:55:37 PM CDT, Steve Wedig wrote: >Congratulations on the release of 3.5 and Pep 484. I've used Python >professionally for 10 years and I believe type hints will make it >easier to >work with large codebases evolving over time. My only concern about Pep >484 >is the discussion of whether or not to deprecate arbitrary function >annotations. >https://www.python.org/dev/peps/pep-0484/ > >I would like to request that arbitrary function annotations are not >deprecated for three reasons: >1. Backwards Compatibility >2. Type Experimentation >3. Embedded Languages > >1. Backwards Compatibility >After reading Pep 3107 my team has made significant use of non-standard >annotations. It would be a serious burden to be forced to port our >annotations back to decorators. This would also make our codebase >considerably less readable because function annotations are much more >readable than input/output annotations relegated to decorators. 
>https://www.python.org/dev/peps/pep-3107/ > >2. Type Experimentation >Arbitrary function annotations allow developers to experiment with >potential type system improvements in real projects. Ideas can be >validated >before officially adding them to the language. This seems like an >advantage >that should be preserved. After all, Pep 484 says it was strongly >inspired >by MyPy, an existing project. >http://mypy-lang.org/ > >3. Embedded Languages >Python's flexibility makes it an amazing language to embed other >languages >in. In this regard, Python 3's addition of arbitrary function >annotations >and class decorators complements Python 2's dynamic typing, function >decorators, reflection, metaclasses, properties, magic methods, >generators, >and keyword arguments. Arbitrary function annotations are a crucial >part of >this toolkit, and this feature is not available in most other >languages. >For anyone interested in the utility and mechanics of embedded >languages, >I'd recommend Martin Fowler's book: Domain Specific Languages. >http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 > >So I agree with the course of action mentioned in Pep 484 that avoids >runtime deprecation of arbitrary function annotation: "Another possible >outcome would be that type hints will eventually become the default >meaning >for annotations, but that there will always remain an option to disable >them." I would only add that there should be a way to disable type >checking >for an entire directory (recursively). This would be useful for >codebases >that have not been ported to standard annotations yet, and for >codebases >that will not be ported for the reasons listed above. > >Thanks for your consideration. > >Best, >Steve > > >------------------------------------------------------------------------ > >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. From guido at python.org Mon Oct 5 22:01:11 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 5 Oct 2015 13:01:11 -0700 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: "They"? On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez wrote: > There is one reason I would be really freaking mad if they deprecated > other uses of annotations: > > https://pypi.python.org/pypi/plac > > On October 5, 2015 1:55:37 PM CDT, Steve Wedig > wrote: > >Congratulations on the release of 3.5 and Pep 484. I've used Python > >professionally for 10 years and I believe type hints will make it > >easier to > >work with large codebases evolving over time. My only concern about Pep > >484 > >is the discussion of whether or not to deprecate arbitrary function > >annotations. > >https://www.python.org/dev/peps/pep-0484/ > > > >I would like to request that arbitrary function annotations are not > >deprecated for three reasons: > >1. Backwards Compatibility > >2. Type Experimentation > >3. Embedded Languages > > > >1. Backwards Compatibility > >After reading Pep 3107 my team has made significant use of non-standard > >annotations. It would be a serious burden to be forced to port our > >annotations back to decorators. 
This would also make our codebase > >considerably less readable because function annotations are much more > >readable than input/output annotations relegated to decorators. > >https://www.python.org/dev/peps/pep-3107/ > > > >2. Type Experimentation > >Arbitrary function annotations allow developers to experiment with > >potential type system improvements in real projects. Ideas can be > >validated > >before officially adding them to the language. This seems like an > >advantage > >that should be preserved. After all, Pep 484 says it was strongly > >inspired > >by MyPy, an existing project. > >http://mypy-lang.org/ > > > >3. Embedded Languages > >Python's flexibility makes it an amazing language to embed other > >languages > >in. In this regard, Python 3's addition of arbitrary function > >annotations > >and class decorators complements Python 2's dynamic typing, function > >decorators, reflection, metaclasses, properties, magic methods, > >generators, > >and keyword arguments. Arbitrary function annotations are a crucial > >part of > >this toolkit, and this feature is not available in most other > >languages. > >For anyone interested in the utility and mechanics of embedded > >languages, > >I'd recommend Martin Fowler's book: Domain Specific Languages. > > > http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 > > > >So I agree with the course of action mentioned in Pep 484 that avoids > >runtime deprecation of arbitrary function annotation: "Another possible > >outcome would be that type hints will eventually become the default > >meaning > >for annotations, but that there will always remain an option to disable > >them." I would only add that there should be a way to disable type > >checking > >for an entire directory (recursively). This would be useful for > >codebases > >that have not been ported to standard annotations yet, and for > >codebases > >that will not be ported for the reasons listed above. > > > >Thanks for your consideration. > > > >Best, > >Steve > > > > > >------------------------------------------------------------------------ > > > >_______________________________________________ > >Python-Dev mailing list > >Python-Dev at python.org > >https://mail.python.org/mailman/listinfo/python-dev > >Unsubscribe: > >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > > -- > Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Mon Oct 5 22:02:45 2015 From: brett at python.org (Brett Cannon) Date: Mon, 05 Oct 2015 20:02:45 +0000 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: Function annotations for uses other than types are not deprecated, just discouraged if they don't have an appropriate decorator: https://docs.python.org/3/library/typing.html#typing.no_type_check . There is even a decorator for decorators since most uses previous to type hints utilized some form of a decorator: https://docs.python.org/3/library/typing.html#typing.no_type_check_decorator . 
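To make that concrete, here is a minimal sketch of how the opt-out is meant to look (the function, its name, and the string annotations are invented purely for illustration; only the decorator imported from `typing` is real):

    from typing import no_type_check

    @no_type_check
    def transfer(amount: "a positive number", currency: "ISO 4217 code"):
        # The strings above are plain PEP 3107 annotations, not type hints;
        # the decorator marks the function so a PEP 484 checker ignores its
        # annotations entirely (it can also be applied to a whole class).
        return amount, currency

Frameworks that consume annotations through their own decorator can instead wrap that decorator with @no_type_check_decorator once and keep working unchanged.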
And as a last resort you simply don't use your Python code with anything that assumes type hints. On Mon, 5 Oct 2015 at 12:57 Ryan Gonzalez wrote: > There is one reason I would be really freaking mad if they deprecated > other uses of annotations: > > https://pypi.python.org/pypi/plac > > On October 5, 2015 1:55:37 PM CDT, Steve Wedig > wrote: > >Congratulations on the release of 3.5 and Pep 484. I've used Python > >professionally for 10 years and I believe type hints will make it > >easier to > >work with large codebases evolving over time. My only concern about Pep > >484 > >is the discussion of whether or not to deprecate arbitrary function > >annotations. > >https://www.python.org/dev/peps/pep-0484/ > > > >I would like to request that arbitrary function annotations are not > >deprecated for three reasons: > >1. Backwards Compatibility > >2. Type Experimentation > >3. Embedded Languages > > > >1. Backwards Compatibility > >After reading Pep 3107 my team has made significant use of non-standard > >annotations. It would be a serious burden to be forced to port our > >annotations back to decorators. This would also make our codebase > >considerably less readable because function annotations are much more > >readable than input/output annotations relegated to decorators. > >https://www.python.org/dev/peps/pep-3107/ > > > >2. Type Experimentation > >Arbitrary function annotations allow developers to experiment with > >potential type system improvements in real projects. Ideas can be > >validated > >before officially adding them to the language. This seems like an > >advantage > >that should be preserved. After all, Pep 484 says it was strongly > >inspired > >by MyPy, an existing project. > >http://mypy-lang.org/ > > > >3. Embedded Languages > >Python's flexibility makes it an amazing language to embed other > >languages > >in. In this regard, Python 3's addition of arbitrary function > >annotations > >and class decorators complements Python 2's dynamic typing, function > >decorators, reflection, metaclasses, properties, magic methods, > >generators, > >and keyword arguments. Arbitrary function annotations are a crucial > >part of > >this toolkit, and this feature is not available in most other > >languages. > >For anyone interested in the utility and mechanics of embedded > >languages, > >I'd recommend Martin Fowler's book: Domain Specific Languages. > > > http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 > > > >So I agree with the course of action mentioned in Pep 484 that avoids > >runtime deprecation of arbitrary function annotation: "Another possible > >outcome would be that type hints will eventually become the default > >meaning > >for annotations, but that there will always remain an option to disable > >them." I would only add that there should be a way to disable type > >checking > >for an entire directory (recursively). This would be useful for > >codebases > >that have not been ported to standard annotations yet, and for > >codebases > >that will not be ported for the reasons listed above. > > > >Thanks for your consideration. > > > >Best, > >Steve > > > > > >------------------------------------------------------------------------ > > > >_______________________________________________ > >Python-Dev mailing list > >Python-Dev at python.org > >https://mail.python.org/mailman/listinfo/python-dev > >Unsubscribe: > >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > > -- > Sent from my Nexus 5 with K-9 Mail. 
Please excuse my brevity. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon Oct 5 22:18:36 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 05 Oct 2015 15:18:36 -0500 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: PSF. Nothing personal, of course... On October 5, 2015 3:01:11 PM CDT, Guido van Rossum wrote: >"They"? > >On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez >wrote: > >> There is one reason I would be really freaking mad if they deprecated >> other uses of annotations: >> >> https://pypi.python.org/pypi/plac >> >> On October 5, 2015 1:55:37 PM CDT, Steve Wedig >> wrote: >> >Congratulations on the release of 3.5 and Pep 484. I've used Python >> >professionally for 10 years and I believe type hints will make it >> >easier to >> >work with large codebases evolving over time. My only concern about >Pep >> >484 >> >is the discussion of whether or not to deprecate arbitrary function >> >annotations. >> >https://www.python.org/dev/peps/pep-0484/ >> > >> >I would like to request that arbitrary function annotations are not >> >deprecated for three reasons: >> >1. Backwards Compatibility >> >2. Type Experimentation >> >3. Embedded Languages >> > >> >1. Backwards Compatibility >> >After reading Pep 3107 my team has made significant use of >non-standard >> >annotations. It would be a serious burden to be forced to port our >> >annotations back to decorators. This would also make our codebase >> >considerably less readable because function annotations are much >more >> >readable than input/output annotations relegated to decorators. >> >https://www.python.org/dev/peps/pep-3107/ >> > >> >2. Type Experimentation >> >Arbitrary function annotations allow developers to experiment with >> >potential type system improvements in real projects. Ideas can be >> >validated >> >before officially adding them to the language. This seems like an >> >advantage >> >that should be preserved. After all, Pep 484 says it was strongly >> >inspired >> >by MyPy, an existing project. >> >http://mypy-lang.org/ >> > >> >3. Embedded Languages >> >Python's flexibility makes it an amazing language to embed other >> >languages >> >in. In this regard, Python 3's addition of arbitrary function >> >annotations >> >and class decorators complements Python 2's dynamic typing, function >> >decorators, reflection, metaclasses, properties, magic methods, >> >generators, >> >and keyword arguments. Arbitrary function annotations are a crucial >> >part of >> >this toolkit, and this feature is not available in most other >> >languages. >> >For anyone interested in the utility and mechanics of embedded >> >languages, >> >I'd recommend Martin Fowler's book: Domain Specific Languages. >> > >> >http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 >> > >> >So I agree with the course of action mentioned in Pep 484 that >avoids >> >runtime deprecation of arbitrary function annotation: "Another >possible >> >outcome would be that type hints will eventually become the default >> >meaning >> >for annotations, but that there will always remain an option to >disable >> >them." 
I would only add that there should be a way to disable type >> >checking >> >for an entire directory (recursively). This would be useful for >> >codebases >> >that have not been ported to standard annotations yet, and for >> >codebases >> >that will not be ported for the reasons listed above. >> > >> >Thanks for your consideration. >> > >> >Best, >> >Steve >> > >> > >> >>------------------------------------------------------------------------ >> > >> >_______________________________________________ >> >Python-Dev mailing list >> >Python-Dev at python.org >> >https://mail.python.org/mailman/listinfo/python-dev >> >Unsubscribe: >> >>https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >> >> -- >> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > > >-- >--Guido van Rossum (python.org/~guido) -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Mon Oct 5 22:23:29 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 5 Oct 2015 13:23:29 -0700 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: Maybe I should clarify how the process of changing the language works. The PSF doesn't enter into it -- they manage the infrastructure (e.g. mailing lists, Hg repo, tracker, python.org) but they don't have anything to do with deciding how or when the language changes. Language changes are done *here* by *us* all. Anyone can write a PEP and it will be discussed here (but first in python-ideas of course). I'm sorry you don't feel more included, but I really don't like the idea of "us vs. them" in this list. We're all working together to make Python the best language it can be. --Guido On Mon, Oct 5, 2015 at 1:18 PM, Ryan Gonzalez wrote: > PSF. Nothing personal, of course... > > > On October 5, 2015 3:01:11 PM CDT, Guido van Rossum > wrote: >> >> "They"? >> >> On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez wrote: >> >>> There is one reason I would be really freaking mad if they deprecated >>> other uses of annotations: >>> >>> https://pypi.python.org/pypi/plac >>> >>> On October 5, 2015 1:55:37 PM CDT, Steve Wedig >>> wrote: >>> >Congratulations on the release of 3.5 and Pep 484. I've used Python >>> >professionally for 10 years and I believe type hints will make it >>> >easier to >>> >work with large codebases evolving over time. My only concern about Pep >>> >484 >>> >is the discussion of whether or not to deprecate arbitrary function >>> >annotations. >>> >https://www.python.org/dev/peps/pep-0484/ >>> > >>> >I would like to request that arbitrary function annotations are not >>> >deprecated for three reasons: >>> >1. Backwards Compatibility >>> >2. Type Experimentation >>> >3. Embedded Languages >>> > >>> >1. Backwards Compatibility >>> >After reading Pep 3107 my team has made significant use of non-standard >>> >annotations. It would be a serious burden to be forced to port our >>> >annotations back to decorators. This would also make our codebase >>> >considerably less readable because function annotations are much more >>> >readable than input/output annotations relegated to decorators. 
>>> >https://www.python.org/dev/peps/pep-3107/ >>> > >>> >2. Type Experimentation >>> >Arbitrary function annotations allow developers to experiment with >>> >potential type system improvements in real projects. Ideas can be >>> >validated >>> >before officially adding them to the language. This seems like an >>> >advantage >>> >that should be preserved. After all, Pep 484 says it was strongly >>> >inspired >>> >by MyPy, an existing project. >>> >http://mypy-lang.org/ >>> > >>> >3. Embedded Languages >>> >Python's flexibility makes it an amazing language to embed other >>> >languages >>> >in. In this regard, Python 3's addition of arbitrary function >>> >annotations >>> >and class decorators complements Python 2's dynamic typing, function >>> >decorators, reflection, metaclasses, properties, magic methods, >>> >generators, >>> >and keyword arguments. Arbitrary function annotations are a crucial >>> >part of >>> >this toolkit, and this feature is not available in most other >>> >languages. >>> >For anyone interested in the utility and mechanics of embedded >>> >languages, >>> >I'd recommend Martin Fowler's book: Domain Specific Languages. >>> > >>> http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 >>> > >>> >So I agree with the course of action mentioned in Pep 484 that avoids >>> >runtime deprecation of arbitrary function annotation: "Another possible >>> >outcome would be that type hints will eventually become the default >>> >meaning >>> >for annotations, but that there will always remain an option to disable >>> >them." I would only add that there should be a way to disable type >>> >checking >>> >for an entire directory (recursively). This would be useful for >>> >codebases >>> >that have not been ported to standard annotations yet, and for >>> >codebases >>> >that will not be ported for the reasons listed above. >>> > >>> >Thanks for your consideration. >>> > >>> >Best, >>> >Steve >>> > >>> > >>> >------------------------------------------------------------------------ >>> > >>> >_______________________________________________ >>> >Python-Dev mailing list >>> >Python-Dev at python.org >>> >https://mail.python.org/mailman/listinfo/python-dev >>> >Unsubscribe: >>> >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >>> >>> -- >>> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >>> >> >> >> > -- > Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From stevewedig at gmail.com Mon Oct 5 22:46:27 2015 From: stevewedig at gmail.com (Steve Wedig) Date: Mon, 5 Oct 2015 13:46:27 -0700 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: Brett and Alexander, I am concerned about deprecation of arbitrary function annotations because Pep 484 suggests that two paths are under consideration. Here is the relevant section: " We do hope that type hints will eventually become the sole use for annotations, but this will require additional discussion and a deprecation period after the initial roll-out of the typing module with Python 3.5. 
The current PEP will have provisional status (see PEP 411 ) until Python 3.6 is released. The fastest conceivable scheme would introduce silent deprecation of non-type-hint annotations in 3.6, full deprecation in 3.7, and declare type hints as the only allowed use of annotations in Python 3.8. This should give authors of packages that use annotations plenty of time to devise another approach, even if type hints become an overnight success. Another possible outcome would be that type hints will eventually become the default meaning for annotations, but that there will always remain an option to disable them. For this purpose the current proposal defines a decorator @no_type_check which disables the default interpretation of annotations as type hints in a given class or function. It also defines a meta-decorator @no_type_check_decorator which can be used to decorate a decorator (!), causing annotations in any function or class decorated with the latter to be ignored by the type checker. " I am advocating against paragraph 1 (a deprecation path) and for the course of action stated in paragraph 2 :) On Mon, Oct 5, 2015 at 1:23 PM, Guido van Rossum wrote: > Maybe I should clarify how the process of changing the language works. > > The PSF doesn't enter into it -- they manage the infrastructure (e.g. > mailing lists, Hg repo, tracker, python.org) but they don't have anything > to do with deciding how or when the language changes. > > Language changes are done *here* by *us* all. Anyone can write a PEP and > it will be discussed here (but first in python-ideas of course). > > I'm sorry you don't feel more included, but I really don't like the idea > of "us vs. them" in this list. We're all working together to make Python > the best language it can be. > > --Guido > > On Mon, Oct 5, 2015 at 1:18 PM, Ryan Gonzalez wrote: > >> PSF. Nothing personal, of course... >> >> >> On October 5, 2015 3:01:11 PM CDT, Guido van Rossum >> wrote: >>> >>> "They"? >>> >>> On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez wrote: >>> >>>> There is one reason I would be really freaking mad if they deprecated >>>> other uses of annotations: >>>> >>>> https://pypi.python.org/pypi/plac >>>> >>>> On October 5, 2015 1:55:37 PM CDT, Steve Wedig >>>> wrote: >>>> >Congratulations on the release of 3.5 and Pep 484. I've used Python >>>> >professionally for 10 years and I believe type hints will make it >>>> >easier to >>>> >work with large codebases evolving over time. My only concern about Pep >>>> >484 >>>> >is the discussion of whether or not to deprecate arbitrary function >>>> >annotations. >>>> >https://www.python.org/dev/peps/pep-0484/ >>>> > >>>> >I would like to request that arbitrary function annotations are not >>>> >deprecated for three reasons: >>>> >1. Backwards Compatibility >>>> >2. Type Experimentation >>>> >3. Embedded Languages >>>> > >>>> >1. Backwards Compatibility >>>> >After reading Pep 3107 my team has made significant use of non-standard >>>> >annotations. It would be a serious burden to be forced to port our >>>> >annotations back to decorators. This would also make our codebase >>>> >considerably less readable because function annotations are much more >>>> >readable than input/output annotations relegated to decorators. >>>> >https://www.python.org/dev/peps/pep-3107/ >>>> > >>>> >2. Type Experimentation >>>> >Arbitrary function annotations allow developers to experiment with >>>> >potential type system improvements in real projects. 
Ideas can be >>>> >validated >>>> >before officially adding them to the language. This seems like an >>>> >advantage >>>> >that should be preserved. After all, Pep 484 says it was strongly >>>> >inspired >>>> >by MyPy, an existing project. >>>> >http://mypy-lang.org/ >>>> > >>>> >3. Embedded Languages >>>> >Python's flexibility makes it an amazing language to embed other >>>> >languages >>>> >in. In this regard, Python 3's addition of arbitrary function >>>> >annotations >>>> >and class decorators complements Python 2's dynamic typing, function >>>> >decorators, reflection, metaclasses, properties, magic methods, >>>> >generators, >>>> >and keyword arguments. Arbitrary function annotations are a crucial >>>> >part of >>>> >this toolkit, and this feature is not available in most other >>>> >languages. >>>> >For anyone interested in the utility and mechanics of embedded >>>> >languages, >>>> >I'd recommend Martin Fowler's book: Domain Specific Languages. >>>> > >>>> http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 >>>> > >>>> >So I agree with the course of action mentioned in Pep 484 that avoids >>>> >runtime deprecation of arbitrary function annotation: "Another possible >>>> >outcome would be that type hints will eventually become the default >>>> >meaning >>>> >for annotations, but that there will always remain an option to disable >>>> >them." I would only add that there should be a way to disable type >>>> >checking >>>> >for an entire directory (recursively). This would be useful for >>>> >codebases >>>> >that have not been ported to standard annotations yet, and for >>>> >codebases >>>> >that will not be ported for the reasons listed above. >>>> > >>>> >Thanks for your consideration. >>>> > >>>> >Best, >>>> >Steve >>>> > >>>> > >>>> >>>> >------------------------------------------------------------------------ >>>> > >>>> >_______________________________________________ >>>> >Python-Dev mailing list >>>> >Python-Dev at python.org >>>> >https://mail.python.org/mailman/listinfo/python-dev >>>> >Unsubscribe: >>>> >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >>>> >>>> -- >>>> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >>>> _______________________________________________ >>>> Python-Dev mailing list >>>> Python-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/python-dev >>>> Unsubscribe: >>>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >>>> >>> >>> >>> >> -- >> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >> > > > > -- > --Guido van Rossum (python.org/~guido) > -- Steve Wedig stevewedig.com linkedin.com/in/wedig -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon Oct 5 22:58:36 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 05 Oct 2015 15:58:36 -0500 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: <46F874A2-461A-4260-BAE6-385E35A8B787@gmail.com> Uh, I kind of knew that. Then I rushed the email and my brain momentarily left me. Sorry... On October 5, 2015 3:23:29 PM CDT, Guido van Rossum wrote: >Maybe I should clarify how the process of changing the language works. > >The PSF doesn't enter into it -- they manage the infrastructure (e.g. >mailing lists, Hg repo, tracker, python.org) but they don't have >anything >to do with deciding how or when the language changes. 
> >Language changes are done *here* by *us* all. Anyone can write a PEP >and it >will be discussed here (but first in python-ideas of course). > >I'm sorry you don't feel more included, but I really don't like the >idea of >"us vs. them" in this list. We're all working together to make Python >the >best language it can be. > >--Guido > >On Mon, Oct 5, 2015 at 1:18 PM, Ryan Gonzalez wrote: > >> PSF. Nothing personal, of course... >> >> >> On October 5, 2015 3:01:11 PM CDT, Guido van Rossum > >> wrote: >>> >>> "They"? >>> >>> On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez >wrote: >>> >>>> There is one reason I would be really freaking mad if they >deprecated >>>> other uses of annotations: >>>> >>>> https://pypi.python.org/pypi/plac >>>> >>>> On October 5, 2015 1:55:37 PM CDT, Steve Wedig > >>>> wrote: >>>> >Congratulations on the release of 3.5 and Pep 484. I've used >Python >>>> >professionally for 10 years and I believe type hints will make it >>>> >easier to >>>> >work with large codebases evolving over time. My only concern >about Pep >>>> >484 >>>> >is the discussion of whether or not to deprecate arbitrary >function >>>> >annotations. >>>> >https://www.python.org/dev/peps/pep-0484/ >>>> > >>>> >I would like to request that arbitrary function annotations are >not >>>> >deprecated for three reasons: >>>> >1. Backwards Compatibility >>>> >2. Type Experimentation >>>> >3. Embedded Languages >>>> > >>>> >1. Backwards Compatibility >>>> >After reading Pep 3107 my team has made significant use of >non-standard >>>> >annotations. It would be a serious burden to be forced to port our >>>> >annotations back to decorators. This would also make our codebase >>>> >considerably less readable because function annotations are much >more >>>> >readable than input/output annotations relegated to decorators. >>>> >https://www.python.org/dev/peps/pep-3107/ >>>> > >>>> >2. Type Experimentation >>>> >Arbitrary function annotations allow developers to experiment with >>>> >potential type system improvements in real projects. Ideas can be >>>> >validated >>>> >before officially adding them to the language. This seems like an >>>> >advantage >>>> >that should be preserved. After all, Pep 484 says it was strongly >>>> >inspired >>>> >by MyPy, an existing project. >>>> >http://mypy-lang.org/ >>>> > >>>> >3. Embedded Languages >>>> >Python's flexibility makes it an amazing language to embed other >>>> >languages >>>> >in. In this regard, Python 3's addition of arbitrary function >>>> >annotations >>>> >and class decorators complements Python 2's dynamic typing, >function >>>> >decorators, reflection, metaclasses, properties, magic methods, >>>> >generators, >>>> >and keyword arguments. Arbitrary function annotations are a >crucial >>>> >part of >>>> >this toolkit, and this feature is not available in most other >>>> >languages. >>>> >For anyone interested in the utility and mechanics of embedded >>>> >languages, >>>> >I'd recommend Martin Fowler's book: Domain Specific Languages. >>>> > >>>> >http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 >>>> > >>>> >So I agree with the course of action mentioned in Pep 484 that >avoids >>>> >runtime deprecation of arbitrary function annotation: "Another >possible >>>> >outcome would be that type hints will eventually become the >default >>>> >meaning >>>> >for annotations, but that there will always remain an option to >disable >>>> >them." 
I would only add that there should be a way to disable type >>>> >checking >>>> >for an entire directory (recursively). This would be useful for >>>> >codebases >>>> >that have not been ported to standard annotations yet, and for >>>> >codebases >>>> >that will not be ported for the reasons listed above. >>>> > >>>> >Thanks for your consideration. >>>> > >>>> >Best, >>>> >Steve >>>> > >>>> > >>>> >>------------------------------------------------------------------------ >>>> > >>>> >_______________________________________________ >>>> >Python-Dev mailing list >>>> >Python-Dev at python.org >>>> >https://mail.python.org/mailman/listinfo/python-dev >>>> >Unsubscribe: >>>> >>https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >>>> >>>> -- >>>> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >>>> _______________________________________________ >>>> Python-Dev mailing list >>>> Python-Dev at python.org >>>> https://mail.python.org/mailman/listinfo/python-dev >>>> Unsubscribe: >>>> >https://mail.python.org/mailman/options/python-dev/guido%40python.org >>>> >>> >>> >>> >> -- >> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >> > > > >-- >--Guido van Rossum (python.org/~guido) -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From srkunze at mail.de Mon Oct 5 23:06:21 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Mon, 05 Oct 2015 23:06:21 +0200 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: <5612E64D.9050201@mail.de> Not really being affected by the "python annotation movement", I supply some non-constructive comment: I would not prefer any of these outcomes but would always allow all possible meanings that people wish to encode in the annotations. My $0.02 and I am out. On 05.10.2015 22:46, Steve Wedig wrote: > Brett and Alexander, > > I am concerned about deprecation of arbitrary function annotations > because Pep 484 suggests that two paths are under consideration. Here > is the relevant section: > > " > We do hope that type hints will eventually become the sole use for > annotations, but this will require additional discussion and a > deprecation period after the initial roll-out of the typing module > with Python 3.5. The current PEP will have provisional status (see PEP > 411 ) until Python 3.6 is released. The fastest conceivable scheme > would introduce silent deprecation of non-type-hint annotations in > 3.6, full deprecation in 3.7, and declare type hints as the only > allowed use of annotations in Python 3.8. This should give authors of > packages that use annotations plenty of time to devise another > approach, even if type hints become an overnight success. > > Another possible outcome would be that type hints will eventually > become the default meaning for annotations, but that there will always > remain an option to disable them. For this purpose the current > proposal defines a decorator @no_type_check which disables the default > interpretation of annotations as type hints in a given class or > function. It also defines a meta-decorator @no_type_check_decorator > which can be used to decorate a decorator (!), causing annotations in > any function or class decorated with the latter to be ignored by the > type checker. 
> " > > I am advocating against paragraph 1 (a deprecation path) and for the > course of action stated in paragraph 2 :) > From brett at python.org Mon Oct 5 23:17:33 2015 From: brett at python.org (Brett Cannon) Date: Mon, 05 Oct 2015 21:17:33 +0000 Subject: [Python-Dev] Not Deprecating Arbitrary Function Annotations In-Reply-To: References: <6CBC061D-2B15-4BFF-AE85-18EE9DBA55FA@gmail.com> Message-ID: On Mon, 5 Oct 2015 at 13:55 Steve Wedig wrote: > Brett and Alexander, > > I am concerned about deprecation of arbitrary function annotations because > Pep 484 suggests that two paths are under consideration. Here is the > relevant section: > > " > We do hope that type hints will eventually become the sole use for > annotations, but this will require additional discussion and a deprecation > period after the initial roll-out of the typing module with Python 3.5. The > current PEP will have provisional status (see PEP 411 ) until Python 3.6 is > released. The fastest conceivable scheme would introduce silent deprecation > of non-type-hint annotations in 3.6, full deprecation in 3.7, and declare > type hints as the only allowed use of annotations in Python 3.8. This > should give authors of packages that use annotations plenty of time to > devise another approach, even if type hints become an overnight success. > > Another possible outcome would be that type hints will eventually become > the default meaning for annotations, but that there will always remain an > option to disable them. For this purpose the current proposal defines a > decorator @no_type_check which disables the default interpretation of > annotations as type hints in a given class or function. It also defines a > meta-decorator @no_type_check_decorator which can be used to decorate a > decorator (!), causing annotations in any function or class decorated with > the latter to be ignored by the type checker. > " > > I am advocating against paragraph 1 (a deprecation path) and for the > course of action stated in paragraph 2 :) > Fair enough, but since Python 3.5 is so new we have yet to gather any feedback on the entire concept of type hints, let alone whether their use is so broad and liked that we will consider dropping the decorators in `typing` which mark alternative uses and define "function annotations" as "type annotations" for everything. So consider your view noted, but realize that the discussion of the uptake of type hints has not started yet as it's premature to do so. If you want to make sure to participate if/when the discussion of dropping support for alternative uses of function annocations then consider staying subscribed to python-dev to notice when that happens (but I suspect it will be a while). -Brett > > > > > On Mon, Oct 5, 2015 at 1:23 PM, Guido van Rossum wrote: > >> Maybe I should clarify how the process of changing the language works. >> >> The PSF doesn't enter into it -- they manage the infrastructure (e.g. >> mailing lists, Hg repo, tracker, python.org) but they don't have >> anything to do with deciding how or when the language changes. >> >> Language changes are done *here* by *us* all. Anyone can write a PEP and >> it will be discussed here (but first in python-ideas of course). >> >> I'm sorry you don't feel more included, but I really don't like the idea >> of "us vs. them" in this list. We're all working together to make Python >> the best language it can be. >> >> --Guido >> >> On Mon, Oct 5, 2015 at 1:18 PM, Ryan Gonzalez wrote: >> >>> PSF. Nothing personal, of course... 
>>> >>> >>> On October 5, 2015 3:01:11 PM CDT, Guido van Rossum >>> wrote: >>>> >>>> "They"? >>>> >>>> On Mon, Oct 5, 2015 at 12:57 PM, Ryan Gonzalez >>>> wrote: >>>> >>>>> There is one reason I would be really freaking mad if they deprecated >>>>> other uses of annotations: >>>>> >>>>> https://pypi.python.org/pypi/plac >>>>> >>>>> On October 5, 2015 1:55:37 PM CDT, Steve Wedig >>>>> wrote: >>>>> >Congratulations on the release of 3.5 and Pep 484. I've used Python >>>>> >professionally for 10 years and I believe type hints will make it >>>>> >easier to >>>>> >work with large codebases evolving over time. My only concern about >>>>> Pep >>>>> >484 >>>>> >is the discussion of whether or not to deprecate arbitrary function >>>>> >annotations. >>>>> >https://www.python.org/dev/peps/pep-0484/ >>>>> > >>>>> >I would like to request that arbitrary function annotations are not >>>>> >deprecated for three reasons: >>>>> >1. Backwards Compatibility >>>>> >2. Type Experimentation >>>>> >3. Embedded Languages >>>>> > >>>>> >1. Backwards Compatibility >>>>> >After reading Pep 3107 my team has made significant use of >>>>> non-standard >>>>> >annotations. It would be a serious burden to be forced to port our >>>>> >annotations back to decorators. This would also make our codebase >>>>> >considerably less readable because function annotations are much more >>>>> >readable than input/output annotations relegated to decorators. >>>>> >https://www.python.org/dev/peps/pep-3107/ >>>>> > >>>>> >2. Type Experimentation >>>>> >Arbitrary function annotations allow developers to experiment with >>>>> >potential type system improvements in real projects. Ideas can be >>>>> >validated >>>>> >before officially adding them to the language. This seems like an >>>>> >advantage >>>>> >that should be preserved. After all, Pep 484 says it was strongly >>>>> >inspired >>>>> >by MyPy, an existing project. >>>>> >http://mypy-lang.org/ >>>>> > >>>>> >3. Embedded Languages >>>>> >Python's flexibility makes it an amazing language to embed other >>>>> >languages >>>>> >in. In this regard, Python 3's addition of arbitrary function >>>>> >annotations >>>>> >and class decorators complements Python 2's dynamic typing, function >>>>> >decorators, reflection, metaclasses, properties, magic methods, >>>>> >generators, >>>>> >and keyword arguments. Arbitrary function annotations are a crucial >>>>> >part of >>>>> >this toolkit, and this feature is not available in most other >>>>> >languages. >>>>> >For anyone interested in the utility and mechanics of embedded >>>>> >languages, >>>>> >I'd recommend Martin Fowler's book: Domain Specific Languages. >>>>> > >>>>> http://www.amazon.com/Domain-Specific-Languages-Addison-Wesley-Signature-Series/dp/0321712943 >>>>> > >>>>> >So I agree with the course of action mentioned in Pep 484 that avoids >>>>> >runtime deprecation of arbitrary function annotation: "Another >>>>> possible >>>>> >outcome would be that type hints will eventually become the default >>>>> >meaning >>>>> >for annotations, but that there will always remain an option to >>>>> disable >>>>> >them." I would only add that there should be a way to disable type >>>>> >checking >>>>> >for an entire directory (recursively). This would be useful for >>>>> >codebases >>>>> >that have not been ported to standard annotations yet, and for >>>>> >codebases >>>>> >that will not be ported for the reasons listed above. >>>>> > >>>>> >Thanks for your consideration. 
>>>>> > >>>>> >Best, >>>>> >Steve >>>>> > >>>>> > >>>>> >>>>> >------------------------------------------------------------------------ >>>>> > >>>>> >_______________________________________________ >>>>> >Python-Dev mailing list >>>>> >Python-Dev at python.org >>>>> >https://mail.python.org/mailman/listinfo/python-dev >>>>> >Unsubscribe: >>>>> >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >>>>> >>>>> -- >>>>> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >>>>> _______________________________________________ >>>>> Python-Dev mailing list >>>>> Python-Dev at python.org >>>>> https://mail.python.org/mailman/listinfo/python-dev >>>>> Unsubscribe: >>>>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >>>>> >>>> >>>> >>>> >>> -- >>> Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. >>> >> >> >> >> -- >> --Guido van Rossum (python.org/~guido) >> > > > > -- > Steve Wedig > stevewedig.com > linkedin.com/in/wedig > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fijall at gmail.com Tue Oct 6 13:29:50 2015 From: fijall at gmail.com (Maciej Fijalkowski) Date: Tue, 6 Oct 2015 13:29:50 +0200 Subject: [Python-Dev] An example of Python 3 promotion attitude Message-ID: There was a discussion a while ago about python 3 and the attitude on social media and there was a lack of examples. Here is one example: https://www.reddit.com/r/Python/comments/3nl5ut/ninite_the_popular_website_to_install_essential/ According to some people, it is everybodys job to promote python 3 and force people to upgrade. This is really not something I enjoy (people telling me pypy should promote python 3 - it's not really our job). Now I sometimes feel that there is not enough sentiment in python-dev to distance from such ideas. It *is* python-dev job to promote python3, but it's also python-dev job sometimes to point out that whatever helps in promoting the python ecosystem (e.g. in case of pypy is speed) is a good enough reason to do those things. I wonder what are other people ideas about that. Cheers, fijal From njs at pobox.com Tue Oct 6 16:35:55 2015 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 6 Oct 2015 07:35:55 -0700 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: References: Message-ID: On Oct 6, 2015 4:31 AM, "Maciej Fijalkowski" wrote: > > There was a discussion a while ago about python 3 and the attitude on > social media and there was a lack of examples. Here is one example: > > https://www.reddit.com/r/Python/comments/3nl5ut/ninite_the_popular_website_to_install_essential/ > > According to some people, it is everybodys job to promote python 3 and > force people to upgrade. This is really not something I enjoy (people > telling me pypy should promote python 3 - it's not really our job). I'm not a core dev so I don't really have a dog in this fight (except that I do like python 3 the language), but: in the interests of having a more productive discussion, can you elaborate on what specifically you found frustrating about that link? It seems to be a page of people talking in a measured way about the trade offs between python 2 and python 3. 
It looked to me like probably the majority opinion expressed was that for the poster's personal uses python 3 was superior for specific reasons that they described, but people generally seemed very respectful and open to the possibility that their experience wasn't universal. Your email had me expecting something very different, so I'm wondering what I'm missing. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Tue Oct 6 17:53:30 2015 From: brett at python.org (Brett Cannon) Date: Tue, 06 Oct 2015 15:53:30 +0000 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: References: Message-ID: On Tue, 6 Oct 2015 at 07:36 Nathaniel Smith wrote: > On Oct 6, 2015 4:31 AM, "Maciej Fijalkowski" wrote: > > > > There was a discussion a while ago about python 3 and the attitude on > > social media and there was a lack of examples. Here is one example: > > > > > https://www.reddit.com/r/Python/comments/3nl5ut/ninite_the_popular_website_to_install_essential/ > > > > According to some people, it is everybodys job to promote python 3 and > > force people to upgrade. This is really not something I enjoy (people > > telling me pypy should promote python 3 - it's not really our job). > > I'm not a core dev so I don't really have a dog in this fight (except that > I do like python 3 the language), but: in the interests of having a more > productive discussion, can you elaborate on what specifically you found > frustrating about that link? It seems to be a page of people talking in a > measured way about the trade offs between python 2 and python 3. It looked > to me like probably the majority opinion expressed was that for the > poster's personal uses python 3 was superior for specific reasons that they > described, but people generally seemed very respectful and open to the > possibility that their experience wasn't universal. Your email had me > expecting something very different, so I'm wondering what I'm missing. > I'm in the same position as Nathaniel. I was expecting a flood of comments yelling that not supporting Python 3 was horrible and they should be burned at the stake for heresy or something. Instead I found very reasonable responses to questions and only 2 people who went overboard, both of whom admitted they were wrong when their arguments were shown to be extreme or invalid. While I can imagine the kind of responses that Glyph was talking about at the language summit I don't quite see how this is an example of that. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Wed Oct 7 00:20:06 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 6 Oct 2015 18:20:06 -0400 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: References: Message-ID: On 10/6/2015 7:29 AM, Maciej Fijalkowski wrote: > There was a discussion a while ago about python 3 and the attitude on > social media and there was a lack of examples. Here is one example: > > https://www.reddit.com/r/Python/comments/3nl5ut/ninite_the_popular_website_to_install_essential/ I read this. The proposition on the table for debate is "Ninite -- the popular website to install essential programs at once -- should start offering Python 3 instead of Python 2" The current situation (as of today) is that Ninite offers to install and update about 85 programs. Among these is 'Python', which they translate as 2.7.10. 
My first answer is that this makes their claim to keep people updated, "Always Up-to-Date", a lie because the most recent update to 'Python' is 3.5.0. I have no idea if they are editorially holding back updates to other programs or not. In other words, the proposition was whether Ninite should do what they promise to do. My second answer is that for Python, they should offer 'Python2' and 'Python3'. Many people said this also. > According to some people, it is everybodys job to promote python 3 and > force people to upgrade. The discussion is about Ninite. They claim that they install the most up to date version of each program users select and (forcibly, and silently) update everything when they *choose* to re-run it. They are not doing that with Python. Someone who emailed them reported back "they're considering it but holding off for now due to the fact that most people still use Py2." To the extent that this is true, and it is not in all contexts, it is partly because they are helping to keep it true by implicitly claiming that Python2 is Python and Python3 is not. There was peripheral mention of a 4-year-old document called LPTHW that recommends 2. I have no idea what they are referring to. There was also inconsequential mention of RHEL. > This is really not something I enjoy (people > telling me pypy should promote python 3 - it's not really our job). Pypy is not mentioned in the discussion you linked. Your job is what you conceive it to be. If you don't claim to support or promote the latest Python version, you have no obligation to do so. > Now I sometimes feel that there is not enough sentiment in python-dev > to distance from such ideas. It *is* python-dev job to promote > python3, but it's also python-dev job sometimes to point out that > whatever helps in promoting the python ecosystem (e.g. in case of pypy > is speed) is a good enough reason to do those things. This is *your* idea of what *our* job is ;-). I think our job *as python core developers* is to collectively produce the best new releases we can within the constraints of policies and resources. That currently includes further releases of 2.7. Each core dev interprets and augments the above for themselves. -- Terry Jan Reedy From ben+python at benfinney.id.au Wed Oct 7 00:43:59 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 07 Oct 2015 09:43:59 +1100 Subject: [Python-Dev] An example of Python 3 promotion attitude References: Message-ID: <85d1wrzkgg.fsf@benfinney.id.au> Terry Reedy writes: > There was peripheral mention of a 4-year-old document called LPTHW that > recommends 2. I have no idea what they are referring to. It is expanded in passing, but for reference they are talking about "Learn Python the Hard Way", a book which (reportedly) has not been updated since 2010. -- \ "Liberal capitalism is not at all the Good of humanity. Quite | `\ the contrary; it is the vehicle of savage, destructive | _o__) nihilism." --Alain Badiou | Ben Finney From chris.barker at noaa.gov Wed Oct 7 08:20:03 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 6 Oct 2015 23:20:03 -0700 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: <85d1wrzkgg.fsf@benfinney.id.au> References: <85d1wrzkgg.fsf@benfinney.id.au> Message-ID: On Tue, Oct 6, 2015 at 3:43 PM, Ben Finney wrote: > > > There was peripheral mention of a 4-year-old document called LPTHW that > > recommends 2.
> > It is expanded in passing, but for reference they are talking about > "Learn Python the Hard Way", a > book which (reportedly) has not been updated since 2010. > which is too bad -- I've recommended LPTHW for years for newbies -- and just started teaching a new "intro to python" class -- now it's py3! I do still recommend LPTHW, but now they'll have to add parentheses to all those print statements(functions).... I suppose we should all bug Zed Shaw to write a LPTHW3 .... -Chris > -- > \ "Liberal capitalism is not at all the Good of humanity. Quite | > `\ the contrary; it is the vehicle of savage, destructive | > _o__) nihilism." --Alain Badiou | > Ben Finney > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Wed Oct 7 12:33:46 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 7 Oct 2015 20:33:46 +1000 Subject: [Python-Dev] Migrating to Python 3: the python 3 install issue In-Reply-To: <56101542.7010303@stackless.com> References: <56101542.7010303@stackless.com> Message-ID: On 4 October 2015 at 03:49, Christian Tismer wrote: > Great, that this finally happens. > > I think this was a silent revolution, initiated by nagging > people, distros and larger companies about how mega-out Python2 is, > until they finally started to believe it ;-) While that was part of it (at least initially), the main impediment on the Linux front turned out to be the sheer amount of work involved, and the number of different projects impacted (without even counting the upstream projects that had already added Python 3 support of their own accord). This meant the employee time investment from Canonical, Red Hat and anyone else that contributed to distro package porting wasn't just in development effort - a fair bit of it was in the politics of getting primarily C/C++ projects that happened to have some Python components to accept the migration patches (even while the developers and other users of those projects were still running Python 2 based distributions themselves), as well as in revising distro packaging policies to mandate Python 3 support for new projects. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Wed Oct 7 13:12:30 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 7 Oct 2015 21:12:30 +1000 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: References: Message-ID: On 6 October 2015 at 21:29, Maciej Fijalkowski wrote: > Now I sometimes feel that there is not enough sentiment in python-dev > to distance from such ideas. It *is* python-dev job to promote > python3, but it's also python-dev job sometimes to point out that > whatever helps in promoting the python ecosystem (e.g. in case of pypy > is speed) is a good enough reason to do those things. > > I wonder what are other people ideas about that.
It's not generally python-dev's job to promote Python 3 either - folks are here for their own reasons, and that's largely a shared aim of making a better programming language and other tools for our own future use (whatever those use cases may be). The fact that there are lots of *other* people that find those tools useful and helpful (to the point of elevating Python to being one of the most popular programming languages in the world) is a beneficial side effect of doing that work in the open, rather than necessarily being the reason people decide to participate. This is the key difference between community open source projects and commercial products that also happen to be open source - in the latter case, good luck getting anything added that doesn't align with the sponsoring company's plans, while in the community driven case, we don't *have* a pre-defined road map, we have a lot of individual contributors with possible ideas for improvement (occasionally company sponsored, usually not), and a range of processes for reviewing, refining and deciding on whether or not to accept those ideas. That said, those of us that get paid to be here (even part time), typically *do* have a significant obligation not to leave current Python 2 users behind, hence the extended lifecycle for the 2.7 series, and the ongoing work in lowering barriers to migration from Python 2 to Python 3. Those of us working for commercial redistributors (depending on our specific role) are also likely to have at least some obligation to our customers to help them understand the implications of the migration, and assure them that we'll help them manage the shift in a minimally disruptive way. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From raymond.hettinger at gmail.com Thu Oct 8 02:43:29 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Wed, 7 Oct 2015 20:43:29 -0400 Subject: [Python-Dev] An example of Python 3 promotion attitude In-Reply-To: References: Message-ID: <02ED7A0A-FEE1-4E75-9D8F-CE0313FF8A7A@gmail.com> > On Oct 7, 2015, at 7:12 AM, Nick Coghlan wrote: > > On 6 October 2015 at 21:29, Maciej Fijalkowski wrote: >> Now I sometimes feel that there is not enough sentiment in python-dev >> to distance from such ideas. It *is* python-dev job to promote >> python3, but it's also python-dev job sometimes to point out that >> whatever helps in promoting the python ecosystem (e.g. in case of pypy >> is speed) is a good enough reason to do those things. >> >> I wonder what are other people ideas about that. > > It's not generally python-dev's job to promote Python 3 either - folks > are here for their own reasons, and that's largely a shared aim of > making a better programming language and other tools for our own > future use (whatever those use cases may be). I concur. Our responsibilities are to make Python 3 into an effective tool that makes people *want* to adopt it and to be honest with anyone who asks us about the pros and cons of switching over. Raymond From status at bugs.python.org Fri Oct 9 18:08:32 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 9 Oct 2015 18:08:32 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20151009160832.1E82056685@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-10-02 - 2015-10-09) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 5144 (-11) closed 31971 (+65) total 37115 (+54) Open issues with patches: 2271 Issues opened (35) ================== #12939: Add new io.FileIO using the native Windows API http://bugs.python.org/issue12939 reopened by haypo #22413: Bizarre StringIO(newline="\r\n") translation http://bugs.python.org/issue22413 reopened by martin.panter #25303: py_compile disregards PYTHONDONTWRITEBYTECODE and -B http://bugs.python.org/issue25303 opened by proski #25307: Enhancing the argparse help output http://bugs.python.org/issue25307 opened by Sworddragon #25311: Add f-string support to tokenize.py http://bugs.python.org/issue25311 opened by skrah #25312: Cryptic error message if incorrect spec is set on a callable m http://bugs.python.org/issue25312 opened by uranusjr #25313: IDLE: gracefully handle themes (or keysets, or ...) not presen http://bugs.python.org/issue25313 opened by markroseman #25314: Documentation: argparse's actions store_{true,false} default t http://bugs.python.org/issue25314 opened by Julien Baley #25320: unittest loader.py TypeError when code directory contains a so http://bugs.python.org/issue25320 opened by Victor van den Elzen #25322: contextlib.suppress not tested for nested usage http://bugs.python.org/issue25322 opened by RazerM #25324: Importing tokenize modifies token http://bugs.python.org/issue25324 opened by serhiy.storchaka #25327: Python 3.5 Windows 10 Installation Fails With Corrupt Director http://bugs.python.org/issue25327 opened by Max Farrell #25329: test_json crashes with stack overflow on Windows http://bugs.python.org/issue25329 opened by zach.ware #25330: Docs for pkgutil.get_data inconsistent with semantics http://bugs.python.org/issue25330 opened by Antony.Lee #25331: https://docs.python.org/3.5/using/windows.html should list whc http://bugs.python.org/issue25331 opened by lac #25334: telnetlib: process_rawq() and binary data http://bugs.python.org/issue25334 opened by mwalle #25335: ast.literal_eval fails to parse numbers with leading "+" http://bugs.python.org/issue25335 opened by Scott Turner #25337: weakref.finalize documentation refers to old interpreter shutd http://bugs.python.org/issue25337 opened by josh.r #25338: urllib bypasses all hosts if proxyoverride includes an empty e http://bugs.python.org/issue25338 opened by Jung-chih Wei #25339: sys.stdout.errors is set to "surrogateescape" http://bugs.python.org/issue25339 opened by serhiy.storchaka #25340: libraries variable in setup.py ignore for multiprocessing modu http://bugs.python.org/issue25340 opened by davyg #25341: File mode wb+ appears as rb+ http://bugs.python.org/issue25341 opened by Mark.Williams #25342: test_json segfault on OpenBSD http://bugs.python.org/issue25342 opened by rpointel #25343: Document atomic operations on builtin types http://bugs.python.org/issue25343 opened by Dima.Tisnek #25344: Enhancement to Logging - Logging Stack http://bugs.python.org/issue25344 opened by dasilver at cisco.com #25345: Unable to install Python 3.5 on Windows 10 http://bugs.python.org/issue25345 opened by Gowtham NM #25347: assert_has_calls output is formatted inconsistently http://bugs.python.org/issue25347 opened by rzimmerman #25348: Update pgo_build.bat to use --pgo flag for regrtest http://bugs.python.org/issue25348 opened by brett.cannon #25349: Use _PyBytesWriter for bytes%args http://bugs.python.org/issue25349 opened by haypo #25351: pyvenv activate script failure with specific bash option http://bugs.python.org/issue25351 opened by s-wakaba #25352: Add 'make 
this my default python' to windows installs for Pyth http://bugs.python.org/issue25352 opened by lac #25353: Use _PyBytesWriter for unicode escape and raw unicode escape e http://bugs.python.org/issue25353 opened by haypo #25354: test_datetime failing http://bugs.python.org/issue25354 opened by shanmbic #25355: Windows 3.5 installer does not add python to "App Paths" key http://bugs.python.org/issue25355 opened by oscarbenjamin #25356: Idle (Python 3.4 on Ubuntu) does not allow typing accents http://bugs.python.org/issue25356 opened by Gian Carlo Martinelli Most recent 15 issues with no replies (15) ========================================== #25356: Idle (Python 3.4 on Ubuntu) does not allow typing accents http://bugs.python.org/issue25356 #25355: Windows 3.5 installer does not add python to "App Paths" key http://bugs.python.org/issue25355 #25351: pyvenv activate script failure with specific bash option http://bugs.python.org/issue25351 #25348: Update pgo_build.bat to use --pgo flag for regrtest http://bugs.python.org/issue25348 #25347: assert_has_calls output is formatted inconsistently http://bugs.python.org/issue25347 #25341: File mode wb+ appears as rb+ http://bugs.python.org/issue25341 #25340: libraries variable in setup.py ignore for multiprocessing modu http://bugs.python.org/issue25340 #25339: sys.stdout.errors is set to "surrogateescape" http://bugs.python.org/issue25339 #25337: weakref.finalize documentation refers to old interpreter shutd http://bugs.python.org/issue25337 #25334: telnetlib: process_rawq() and binary data http://bugs.python.org/issue25334 #25331: https://docs.python.org/3.5/using/windows.html should list whc http://bugs.python.org/issue25331 #25320: unittest loader.py TypeError when code directory contains a so http://bugs.python.org/issue25320 #25312: Cryptic error message if incorrect spec is set on a callable m http://bugs.python.org/issue25312 #25293: Hooking Thread/Process instantiation in concurrent.futures. http://bugs.python.org/issue25293 #25292: ssl socket gets into broken state when client exits during han http://bugs.python.org/issue25292 Most recent 15 issues waiting for review (15) ============================================= #25353: Use _PyBytesWriter for unicode escape and raw unicode escape e http://bugs.python.org/issue25353 #25349: Use _PyBytesWriter for bytes%args http://bugs.python.org/issue25349 #25347: assert_has_calls output is formatted inconsistently http://bugs.python.org/issue25347 #25338: urllib bypasses all hosts if proxyoverride includes an empty e http://bugs.python.org/issue25338 #25334: telnetlib: process_rawq() and binary data http://bugs.python.org/issue25334 #25322: contextlib.suppress not tested for nested usage http://bugs.python.org/issue25322 #25320: unittest loader.py TypeError when code directory contains a so http://bugs.python.org/issue25320 #25314: Documentation: argparse's actions store_{true,false} default t http://bugs.python.org/issue25314 #25313: IDLE: gracefully handle themes (or keysets, or ...) 
not presen http://bugs.python.org/issue25313 #25311: Add f-string support to tokenize.py http://bugs.python.org/issue25311 #25300: Enable Intel MPX (Memory protection Extensions) feature http://bugs.python.org/issue25300 #25287: test_crypt fails on OpenBSD http://bugs.python.org/issue25287 #25285: regrtest: run tests in subprocesses with -j1 on buildbots http://bugs.python.org/issue25285 #25274: sys.setrecursionlimit() must fail if the current recursion dep http://bugs.python.org/issue25274 #25270: codecs.escape_encode systemerror on empty byte string http://bugs.python.org/issue25270 Top 10 most discussed issues (10) ================================= #25300: Enable Intel MPX (Memory protection Extensions) feature http://bugs.python.org/issue25300 23 msgs #25228: Regression in cookie parsing with brackets and quotes http://bugs.python.org/issue25228 19 msgs #25294: Absolute imports fail in some cases where relative imports wou http://bugs.python.org/issue25294 11 msgs #25342: test_json segfault on OpenBSD http://bugs.python.org/issue25342 11 msgs #25157: Installing Python 3.5.0 32bit on Windows 8.1 64bit system give http://bugs.python.org/issue25157 9 msgs #25311: Add f-string support to tokenize.py http://bugs.python.org/issue25311 8 msgs #25258: HtmlParser doesn't handle void element tags correctly http://bugs.python.org/issue25258 7 msgs #25313: IDLE: gracefully handle themes (or keysets, or ...) not presen http://bugs.python.org/issue25313 7 msgs #25322: contextlib.suppress not tested for nested usage http://bugs.python.org/issue25322 7 msgs #5380: pty.read raises IOError when slave pty device is closed http://bugs.python.org/issue5380 6 msgs Issues closed (63) ================== #10967: move regrtest over to using more unittest infrastructure http://bugs.python.org/issue10967 closed by haypo #11365: Integrate Buildroot patches (cross-compilation) http://bugs.python.org/issue11365 closed by haypo #12006: strptime should implement %G, %V and %u directives http://bugs.python.org/issue12006 closed by belopolsky #12314: regrtest checks (os.environ, sys.path, etc.) are hard to use http://bugs.python.org/issue12314 closed by haypo #12346: Python source code build fails with old mercurial http://bugs.python.org/issue12346 closed by serhiy.storchaka #13466: new timezones http://bugs.python.org/issue13466 closed by belopolsky #13954: Add regrtest option to record test results to a file http://bugs.python.org/issue13954 closed by haypo #14423: Getting the starting date of iso week from a week number and a http://bugs.python.org/issue14423 closed by belopolsky #16099: robotparser doesn't support request rate and crawl delay param http://bugs.python.org/issue16099 closed by berker.peksag #16701: Docs missing the behavior of += (in-place add) for lists. 
http://bugs.python.org/issue16701 closed by martin.panter #16802: fileno argument to socket.socket() undocumented http://bugs.python.org/issue16802 closed by berker.peksag #17548: unittest.mock: test_create_autospec_unbound_methods is skipped http://bugs.python.org/issue17548 closed by haypo #19518: Add new PyRun_xxx() functions to not encode the filename http://bugs.python.org/issue19518 closed by haypo #19519: Parser: don't transcode input string to UTF-8 if it is already http://bugs.python.org/issue19519 closed by haypo #19817: tracemalloc add a memory limit feature http://bugs.python.org/issue19817 closed by haypo #19835: Add a MemoryError singleton to fix an unlimited loop when the http://bugs.python.org/issue19835 closed by haypo #19917: [httplib] logging information for request is not pretty printe http://bugs.python.org/issue19917 closed by berker.peksag #20910: Make sleep configurable in tests http://bugs.python.org/issue20910 closed by haypo #20964: Add support.check_time_delta() http://bugs.python.org/issue20964 closed by haypo #21373: robotparser: Automatically call modified function in read() http://bugs.python.org/issue21373 closed by berker.peksag #22323: Rewrite PyUnicode_AsWideChar() and PyUnicode_AsWideCharString( http://bugs.python.org/issue22323 closed by haypo #22324: Use PyUnicode_AsWideCharString() instead of PyUnicode_AsUnicod http://bugs.python.org/issue22324 closed by haypo #22444: Floor divide should return int http://bugs.python.org/issue22444 closed by belopolsky #22806: regrtest: add switch -c to run only modified tests http://bugs.python.org/issue22806 closed by haypo #23543: encoding error trying to save string to file http://bugs.python.org/issue23543 closed by serhiy.storchaka #23919: [Windows] test_os fails several C-level assertions http://bugs.python.org/issue23919 closed by steve.dower #23972: Asyncio reuseport http://bugs.python.org/issue23972 closed by gvanrossum #24657: CGIHTTPServer module discard continuous '/' letters from param http://bugs.python.org/issue24657 closed by martin.panter #24806: Inheriting from NoneType does not fail consistently http://bugs.python.org/issue24806 closed by python-dev #24820: IDLE themes for light on dark http://bugs.python.org/issue24820 closed by terry.reedy #25045: smtplib throws exception TypeError: readline() http://bugs.python.org/issue25045 closed by r.david.murray #25175: Documentation-Tkinter Clarify module spelling change in Python http://bugs.python.org/issue25175 closed by terry.reedy #25188: regrtest.py improvement for Profile Guided Optimization builds http://bugs.python.org/issue25188 closed by brett.cannon #25220: Enhance and refactor test.regrtest (convert regrtest.py to a p http://bugs.python.org/issue25220 closed by haypo #25232: CGIRequestHandler behave incorrectly with query component cons http://bugs.python.org/issue25232 closed by martin.panter #25266: mako benchmark not working in Python 3.6 http://bugs.python.org/issue25266 closed by brett.cannon #25286: views are not sequences http://bugs.python.org/issue25286 closed by martin.panter #25290: csv.reader: minor docstring typo http://bugs.python.org/issue25290 closed by berker.peksag #25298: Add lock and rlock weakref tests http://bugs.python.org/issue25298 closed by rhettinger #25301: Optimize UTF-8 decoder with error handlers http://bugs.python.org/issue25301 closed by haypo #25302: Memory Leaks with Address Sanitizer http://bugs.python.org/issue25302 closed by skrah #25304: Add run_coroutine_threadsafe() to asyncio 
http://bugs.python.org/issue25304 closed by gvanrossum #25305: Windows: python opens a popup and flood stderr with assertion http://bugs.python.org/issue25305 closed by steve.dower #25306: test_huntrleaks_fd_leak() of test_regrtest hangs on Windows http://bugs.python.org/issue25306 closed by haypo #25308: Multiple names can target the same namespace http://bugs.python.org/issue25308 closed by benjamin.peterson #25309: askopenfilename crashes on XP with "Show pop-up description fo http://bugs.python.org/issue25309 closed by terry.reedy #25310: End mark argument for StreamReader.readline() method http://bugs.python.org/issue25310 closed by gvanrossum #25315: Make it clear in the collections Python source code that Order http://bugs.python.org/issue25315 closed by zach.ware #25316: distutils: broken error reporting about vcvarsall.bat http://bugs.python.org/issue25316 closed by steve.dower #25317: Convert test_tokenize to unittests http://bugs.python.org/issue25317 closed by serhiy.storchaka #25318: Add _PyBytesWriter API to optimize Unicode encoders http://bugs.python.org/issue25318 closed by haypo #25319: Keep lock type when reseting internal locks http://bugs.python.org/issue25319 closed by python-dev #25321: Null Dereference "subname" in zipimport.c http://bugs.python.org/issue25321 closed by benjamin.peterson #25323: Bus error: 10 when executing recursive program http://bugs.python.org/issue25323 closed by benjamin.peterson #25325: UTF-16LE, UTF-16BE, UTF-32LE, and UTF-32BE encodings don't add http://bugs.python.org/issue25325 closed by eryksun #25326: Improve error message for "character buffer objects" http://bugs.python.org/issue25326 closed by rhettinger #25328: ValueError in smtpd.py __init__() is not raised http://bugs.python.org/issue25328 closed by r.david.murray #25332: [Errno 10035] A non-blocking socket operation could not be com http://bugs.python.org/issue25332 closed by zach.ware #25333: .1 + .2 == .3 should be True http://bugs.python.org/issue25333 closed by eric.smith #25336: Segmentation fault on Mavericks consistent crashing of softwar http://bugs.python.org/issue25336 closed by ned.deily #25346: test_regrtest fails because 'sys' is not imported. http://bugs.python.org/issue25346 closed by steve.dower #25350: Stronger type enforcement (feature request) http://bugs.python.org/issue25350 closed by r.david.murray #762963: timemodule.c: Python loses current timezone http://bugs.python.org/issue762963 closed by belopolsky From lac at openend.se Tue Oct 13 13:21:47 2015 From: lac at openend.se (Laura Creighton) Date: Tue, 13 Oct 2015 13:21:47 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower Message-ID: <201510131121.t9DBLlFa018601@fido.openend.se> Any chance of adding Decimal to the list of things that are also acceptable for things annotated float? Laura From stefanmihaila91 at gmail.com Tue Oct 13 13:59:56 2015 From: stefanmihaila91 at gmail.com (Stefan Mihaila) Date: Tue, 13 Oct 2015 14:59:56 +0300 Subject: [Python-Dev] Rationale behind lazy map/filter Message-ID: <561CF23C.2070107@gmail.com> Hey guys, Could someone clarify for me why it is a good idea to have map return an iterator that is iterable multiple times and acts as an empty iterator in subsequent iterations? > r = range(10) > list(r) == list(r) True > a=map(lambda x:x+1, [1,2,3]) > list(a) == list(a) False Wouldn't it be safer for everyone if attempting to traverse some map iterator a second time would just throw a runtime error rather than act as an empty iterator? Or am I missing something obvious? 
I understand that chaining functional operators like map/reduce/filter is fairly common and not materializing intermediate computation can improve performance in some cases (or even turn infinite into finite computation), but I also find myself running into non-obvious bugs because of this. Maybe it's just python2 habits, but I assume I'm not the only one carelessly thinking that "iterating over an input a second time will result in the same thing as the first time (or raise an error)". What would you say is a good practice to avoid accidentally passing the result of a map to a function traversing its inputs multiple times? I assume there needs to be an agreement on these things for larger codebases. Should the caller always materialize the result of a map before passing it elsewhere? Or should the callee always materialize its inputs before using them? Or should we just document whether the function traverses its input only once, such as through some type annotation ("def f(x: TraversableOnce[T]")? If we refactor `f' such that it used to traverse `x' only once, but now traverses it twice, should we go and update all callers? Would type hints solve this? More obvious assumption errors, such as "has a method called __len__" throw a runtime error thanks to duck typing, but more subtle ones, such as this one, are harder to express. Thanks, Stefan From rdmurray at bitdance.com Tue Oct 13 16:51:41 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 13 Oct 2015 10:51:41 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <561CF23C.2070107@gmail.com> References: <561CF23C.2070107@gmail.com> Message-ID: <20151013145141.EAE33B2009A@webabinitio.net> On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila wrote: > Maybe it's just python2 habits, but I assume I'm not the only one > carelessly thinking that "iterating over an input a second time will > result in the same thing as the first time (or raise an error)". This is the way iterators have always worked. The only new thing is that in python3 some things that used to be iter*ables* (lists, usually) are now iter*ators*. Yes it is a change in mindset *with regards to those functions* (and yes I sometimes find it annoying), but it is actually more consistent than it was in python2, and thus easier to generalize your knowledge about how python works instead of having to remember which functions work which way. That is, if you need to iterate it twice, turn it into a list first. --David From random832 at fastmail.com Tue Oct 13 17:26:09 2015 From: random832 at fastmail.com (Random832) Date: Tue, 13 Oct 2015 11:26:09 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> Message-ID: <877fmqvlgu.fsf@fastmail.com> "R. David Murray" writes: > On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila > wrote: >> Maybe it's just python2 habits, but I assume I'm not the only one >> carelessly thinking that "iterating over an input a second time will >> result in the same thing as the first time (or raise an error)". > > This is the way iterators have always worked. It does raise the question though of what working code it would actually break to have "exhausted" iterators raise an error if you try to iterate them again rather than silently yield no items. 
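A minimal sketch of the behaviour under discussion, and of the "turn it into a list first" advice from above (the variable names are purely illustrative): in Python 3, map() returns a one-shot iterator, so a second traversal silently produces nothing rather than raising.

    squares = map(lambda x: x * x, [1, 2, 3])
    print(list(squares))    # [1, 4, 9]
    print(list(squares))    # [] -- already exhausted, no error is raised

    # Materialising the result first restores "iterate as many times as you like":
    squares = list(map(lambda x: x * x, [1, 2, 3]))
    print(list(squares) == list(squares))    # True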
From raymond.hettinger at gmail.com Tue Oct 13 17:38:07 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Tue, 13 Oct 2015 08:38:07 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510131121.t9DBLlFa018601@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> Message-ID: <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> > On Oct 13, 2015, at 4:21 AM, Laura Creighton wrote: > > Any chance of adding Decimal to the list of things that are also > acceptable for things annotated float? From Lib/numbers.py: ## Notes on Decimal ## ---------------- ## Decimal has all of the methods specified by the Real abc, but it should ## not be registered as a Real because decimals do not interoperate with ## binary floats (i.e. Decimal('3.14') + 2.71828 is undefined). But, ## abstract reals are expected to interoperate (i.e. R1 + R2 should be ## expected to work if R1 and R2 are both Reals). That is still true: Python 3.5.0 (v3.5.0:374f501f4567, Sep 12 2015, 11:00:19) [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin Type "copyright", "credits" or "license()" for more information. >>> from decimal import Decimal >>> Decimal('3.14') + 2.71828 Traceback (most recent call last): File "", line 1, in Decimal('3.14') + 2.71828 TypeError: unsupported operand type(s) for +: 'decimal.Decimal' and 'float' Raymond Hettinger From zachary.ware+pydev at gmail.com Tue Oct 13 17:40:04 2015 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Tue, 13 Oct 2015 10:40:04 -0500 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <877fmqvlgu.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> Message-ID: On Tue, Oct 13, 2015 at 10:26 AM, Random832 wrote: > "R. David Murray" writes: > >> On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila >> wrote: >>> Maybe it's just python2 habits, but I assume I'm not the only one >>> carelessly thinking that "iterating over an input a second time will >>> result in the same thing as the first time (or raise an error)". >> >> This is the way iterators have always worked. > > It does raise the question though of what working code it would actually > break to have "exhausted" iterators raise an error if you try to iterate > them again rather than silently yield no items. You mean like this? >>> m = map(int, '1234') >>> list(m) [1, 2, 3, 4] >>> next(m) Traceback (most recent call last): File "", line 1, in StopIteration It just happens that 'list()' and 'for ...' handle StopIteration for you. -- Zach From rdmurray at bitdance.com Tue Oct 13 17:41:37 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 13 Oct 2015 11:41:37 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <877fmqvlgu.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> Message-ID: <20151013154138.1FB63B2009A@webabinitio.net> On Tue, 13 Oct 2015 11:26:09 -0400, Random832 wrote: > "R. David Murray" writes: > > > On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila > > wrote: > >> Maybe it's just python2 habits, but I assume I'm not the only one > >> carelessly thinking that "iterating over an input a second time will > >> result in the same thing as the first time (or raise an error)". > > > > This is the way iterators have always worked. 
> > It does raise the question though of what working code it would actually > break to have "exhausted" iterators raise an error if you try to iterate > them again rather than silently yield no items. They do raise an error: StopIteration. It's just that the iteration machinery uses that to stop iteration :). And the answer to the question is: lots of code. I've written some: code that iterates an iterator, breaks that loop on a condition, then resumes iterating, breaking that loop on a different condition, and so on, until the iterator is exhausted. If the iterator restarted at the top once it was exhausted, that code would break. --David From random832 at fastmail.com Tue Oct 13 18:08:12 2015 From: random832 at fastmail.com (Random832) Date: Tue, 13 Oct 2015 12:08:12 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013154138.1FB63B2009A@webabinitio.net> Message-ID: <871tcyvjir.fsf@fastmail.com> "R. David Murray" writes: > On Tue, 13 Oct 2015 11:26:09 -0400, Random832 wrote: >> It does raise the question though of what working code it would actually >> break to have "exhausted" iterators raise an error if you try to iterate >> them again rather than silently yield no items. > > They do raise an error: StopIteration. It's just that the iteration > machinery uses that to stop iteration :). I meant a real error and you know it, both of you. StopIteration is an exception in the technical sense that it can be raised and caught, but it's not an error because it is used for normal control flow. In the plain english meaning of the word, it isn't even an exception. > And the answer to the question is: lots of code. I've written some: > code that iterates an iterator, breaks that loop on a condition, then > resumes iterating, breaking that loop on a different condition, and so > on, until the iterator is exhausted. If the iterator restarted at the > top once it was exhausted, that code would break I'm not suggesting restarting at the top (I've elsewhere suggested that many such methods would be better as an *iterable* that can be restarted at the top by calling iter() multiple times, but that's not the same thing). I'm suggesting raising an exception other than StopIteration, so that this situation can be detected. If you are writing code that tries to resume iterating after the iterator has been exhausted, I have to ask: why? I suppose the answer is the same reason people would deliberately raise StopIteration in the ways that PEP479 breaks - because it works and is easy. But that wasn't a reason not to deprecate that. From random832 at fastmail.com Tue Oct 13 18:16:16 2015 From: random832 at fastmail.com (Random832) Date: Tue, 13 Oct 2015 12:16:16 -0400 Subject: [Python-Dev] PEP 0484 - the Numeric Tower References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> Message-ID: <87vbaau4kv.fsf@fastmail.com> > From Lib/numbers.py: > > ## Notes on Decimal > ## ---------------- > ## Decimal has all of the methods specified by the Real abc, but it should > ## not be registered as a Real because decimals do not interoperate with > ## binary floats (i.e. Decimal('3.14') + 2.71828 is undefined). But, > ## abstract reals are expected to interoperate (i.e. R1 + R2 should be > ## expected to work if R1 and R2 are both Reals). Why? 
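To make the registration point from those numbers.py notes concrete, here is a short sketch (expected output shown in comments): Fraction is registered in the numeric tower and interoperates with floats, while Decimal is deliberately registered only as a Number.

    from decimal import Decimal
    from fractions import Fraction
    import numbers

    print(isinstance(1.5, numbers.Real))              # True
    print(isinstance(Fraction(1, 3), numbers.Real))   # True  -- Fraction is a Rational, hence a Real
    print(isinstance(Decimal("1.5"), numbers.Real))   # False -- not registered as Real
    print(isinstance(Decimal("1.5"), numbers.Number)) # True

    print(Fraction(1, 4) + 0.25)   # 0.5 -- registered Reals interoperate (the result is a float)
    # Decimal("1.5") + 0.25 raises TypeError, as in the traceback Raymond posted above.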
From rosuav at gmail.com Tue Oct 13 18:36:20 2015 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 14 Oct 2015 03:36:20 +1100 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <871tcyvjir.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013154138.1FB63B2009A@webabinitio.net> <871tcyvjir.fsf@fastmail.com> Message-ID: On Wed, Oct 14, 2015 at 3:08 AM, Random832 wrote: > If you are writing code that tries > to resume iterating after the iterator has been exhausted, I have to > ask: why? A well-behaved iterator is supposed to continue raising StopIteration forever once it's been exhausted. I don't know how much code actually depends on this, but it wouldn't be hard to make a wrapper that raises a different exception instead: class iter: _orig_iter = iter def __init__(self, thing): self.iter = self._orig_iter(thing) self.exhausted = False def __iter__(self): return self def __next__(self): if self.exhausted: raise RuntimeError("Already exhausted") try: return next(self.iter) except StopIteration: self.exhausted = True raise Play with that, and see where RuntimeErrors start coming up. I suspect they'll be rare, but they will happen. ChrisA From lac at openend.se Tue Oct 13 18:46:55 2015 From: lac at openend.se (Laura Creighton) Date: Tue, 13 Oct 2015 18:46:55 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> Message-ID: <201510131646.t9DGktnY009595@fido.openend.se> In a message of Tue, 13 Oct 2015 08:38:07 -0700, Raymond Hettinger writes: > > >> On Oct 13, 2015, at 4:21 AM, Laura Creighton wrote: >> >> Any chance of adding Decimal to the list of things that are also >> acceptable for things annotated float? > >>From Lib/numbers.py: > >## Notes on Decimal >## ---------------- >## Decimal has all of the methods specified by the Real abc, but it should >## not be registered as a Real because decimals do not interoperate with >## binary floats (i.e. Decimal('3.14') + 2.71828 is undefined). But, >## abstract reals are expected to interoperate (i.e. R1 + R2 should be >## expected to work if R1 and R2 are both Reals). > >That is still true: > >Python 3.5.0 (v3.5.0:374f501f4567, Sep 12 2015, 11:00:19) >[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin >Type "copyright", "credits" or "license()" for more information. >>>> from decimal import Decimal >>>> Decimal('3.14') + 2.71828 >Traceback (most recent call last): > File "", line 1, in > Decimal('3.14') + 2.71828 >TypeError: unsupported operand type(s) for +: 'decimal.Decimal' and 'float' > > >Raymond Hettinger I take it that is a 'no'. I merely worry about what hapens if people start relying upon the fact that a float annotation 'will handle all the numbers I care about' to the forgotten Decimal users such as myself. 
Laura From random832 at fastmail.com Tue Oct 13 18:49:32 2015 From: random832 at fastmail.com (Random832) Date: Tue, 13 Oct 2015 12:49:32 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013154138.1FB63B2009A@webabinitio.net> <871tcyvjir.fsf@fastmail.com> Message-ID: <87mvvmu31f.fsf@fastmail.com> Chris Angelico writes: > A well-behaved iterator is supposed to continue raising StopIteration > forever once it's been exhausted. Yes, and that is *precisely* the behavior that causes the problem under discussion. My question was what code depends on this. > Play with that, and see where RuntimeErrors start coming up. I suspect > they'll be rare, but they will happen. My theory is that most circumstances under which this would cause a RuntimeError are indicative of a bug in the algorithm consuming the iterator (for example, an algorithm that hasn't considered iterators and expects to be passed an iterable it can iterate from the top more than once), rather than the current behavior being relied on to produce the intended end result. This is essentially the same argument as PEP 479 - except there it was at least *easy* to come up with code which would rely on the old behavior to produce the intended end result. About the only example I can think of is that the implementation of itertools.zip_longest would have to change. From rosuav at gmail.com Tue Oct 13 19:03:47 2015 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 14 Oct 2015 04:03:47 +1100 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <87mvvmu31f.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013154138.1FB63B2009A@webabinitio.net> <871tcyvjir.fsf@fastmail.com> <87mvvmu31f.fsf@fastmail.com> Message-ID: On Wed, Oct 14, 2015 at 3:49 AM, Random832 wrote: > My theory is that most circumstances under which this would cause a > RuntimeError are indicative of a bug in the algorithm consuming the > iterator (for example, an algorithm that hasn't considered iterators and > expects to be passed an iterable it can iterate from the top more than > once), rather than the current behavior being relied on to produce > the intended end result. > > This is essentially the same argument as PEP 479 - except there it was > at least *easy* to come up with code which would rely on the old > behavior to produce the intended end result. Yeah. Hence my suggestion of a quick little replacement for the iter() function (though, on second reading of the code, I realise that I forgot about the two-arg form; changing 'thing' to '*args' should fix that though) as a means of locating the actual cases where that happens. Hmm. Actually, this kinda breaks if you call it multiple times. Calling iter() on an iterator should return itself, not a wrapper around self. So, new version: class iter: _orig_iter = iter def __new__(cls, *args): if len(args)==1 and isinstance(args[0], cls): # It's already a wrapped iterator. Return it as-is. 
return args[0] return super().__new__(cls) def __init__(self, *args): if hasattr(self, "iter"): return # Don't rewrap self.iter = self._orig_iter(*args) self.exhausted = False def __iter__(self): return self def __next__(self): if self.exhausted: raise RuntimeError("Already exhausted") try: return next(self.iter) except StopIteration: self.exhausted = True raise I don't have any code of mine that would be broken by this implementation of iter(). Doesn't mean it isn't buggy in ways I haven't spotted, though. :) ChrisA From chris.jerdonek at gmail.com Tue Oct 13 19:44:02 2015 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Tue, 13 Oct 2015 10:44:02 -0700 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <877fmqvlgu.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> Message-ID: On Tue, Oct 13, 2015 at 8:26 AM, Random832 wrote: > "R. David Murray" writes: > >> On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila >> wrote: >>> Maybe it's just python2 habits, but I assume I'm not the only one >>> carelessly thinking that "iterating over an input a second time will >>> result in the same thing as the first time (or raise an error)". >> >> This is the way iterators have always worked. > > It does raise the question though of what working code it would actually > break to have "exhausted" iterators raise an error if you try to iterate > them again rather than silently yield no items. What about cases where not all of the elements of the iterator are known at the outset? For example, you might have a collection of pending tasks that you periodically loop through and process. Changing the behavior would result in an error when checking for more tasks instead of no tasks. --Chris From srkunze at mail.de Tue Oct 13 20:11:25 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Tue, 13 Oct 2015 20:11:25 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> Message-ID: <561D494D.4030304@mail.de> On 13.10.2015 17:38, Raymond Hettinger wrote: > Traceback (most recent call last): > File "", line 1, in > Decimal('3.14') + 2.71828 > TypeError: unsupported operand type(s) for +: 'decimal.Decimal' and 'float' Reminds me of 'int' and 'long'. Different but almost the same. Best, Sven From rdmurray at bitdance.com Tue Oct 13 20:32:10 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Tue, 13 Oct 2015 14:32:10 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <871tcyvjir.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013154138.1FB63B2009A@webabinitio.net> <871tcyvjir.fsf@fastmail.com> Message-ID: <20151013183211.2B0E7B2009A@webabinitio.net> On Tue, 13 Oct 2015 12:08:12 -0400, Random832 wrote: > "R. David Murray" writes: > > On Tue, 13 Oct 2015 11:26:09 -0400, Random832 wrote: > > > > And the answer to the question is: lots of code. I've written some: > > code that iterates an iterator, breaks that loop on a condition, then > > resumes iterating, breaking that loop on a different condition, and so > > on, until the iterator is exhausted. 
If the iterator restarted at the > > top once it was exhausted, that code would break > > I'm not suggesting restarting at the top (I've elsewhere suggested that > many such methods would be better as an *iterable* that can be restarted > at the top by calling iter() multiple times, but that's not the same > thing). I'm suggesting raising an exception other than StopIteration, so > that this situation can be detected. If you are writing code that tries > to resume iterating after the iterator has been exhausted, I have to > ask: why? Because the those second &c loops don't run if the iterator is already exhausted, the else clause is executed instead (or nothing happens, depending on the code). Now, likely such code isn't common (so I shouldn't have said "lots"), but the fact that I've done it at least once, maybe twice (but I can't remember what context, it was a while ago), argues it isn't vanishingly uncommon. --David From tjreedy at udel.edu Wed Oct 14 01:14:48 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 13 Oct 2015 19:14:48 -0400 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <561CF23C.2070107@gmail.com> References: <561CF23C.2070107@gmail.com> Message-ID: On 10/13/2015 7:59 AM, Stefan Mihaila wrote: > Could someone clarify for me ... This list, pydev, short for 'python development', is for discussing development of future releases of CPython. Your question should have been directed to python-list, where it would be entirely on topic. -- Terry Jan Reedy From steve at pearwood.info Wed Oct 14 01:29:35 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 14 Oct 2015 10:29:35 +1100 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <877fmqvlgu.fsf@fastmail.com> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> Message-ID: <20151013232935.GE13813@ando.pearwood.info> On Tue, Oct 13, 2015 at 11:26:09AM -0400, Random832 wrote: > "R. David Murray" writes: > > > On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila > > wrote: > >> Maybe it's just python2 habits, but I assume I'm not the only one > >> carelessly thinking that "iterating over an input a second time will > >> result in the same thing as the first time (or raise an error)". > > > > This is the way iterators have always worked. > > It does raise the question though of what working code it would actually > break to have "exhausted" iterators raise an error if you try to iterate > them again rather than silently yield no items. Anything which looks like this: for item in iterator: if condition: break do_this() ... for item in iterator: do_that() If the condition is never true, the iterator is completely processed by the first loop, and the second loop is a no-op by design. I don't know how common it is, but I've written code like that. Had we been designing the iterator protocol from scratch, perhaps we might have had two exceptions: class EmptyIterator(Exception): ... class StopIteration(EmptyIterator): ... and have StopIteration only raised the first time you call next() on an empty iterator. But would it have been better? I don't know. I suspect not. I think that although it might avoid a certain class of errors, it would add complexity to other situations which are currently simple. 
-- Steve From raymond.hettinger at gmail.com Wed Oct 14 01:37:43 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Tue, 13 Oct 2015 16:37:43 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <87vbaau4kv.fsf@fastmail.com> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> Message-ID: > On Oct 13, 2015, at 9:16 AM, Random832 wrote: > >> ## ---------------- >> ## Decimal has all of the methods specified by the Real abc, but it should >> ## not be registered as a Real because decimals do not interoperate with >> ## binary floats (i.e. Decimal('3.14') + 2.71828 is undefined). But, >> ## abstract reals are expected to interoperate (i.e. R1 + R2 should be >> ## expected to work if R1 and R2 are both Reals). > > Why? Q. Why is Python the way it is? A. Because Guido said so ;-) IIRC, the answer is that we were being conservative with possibly unintended operations between types with differing precision and with differing notions of what numbers could be exactly representable. We could have (and still could) make the choice to always coerce to decimal (every float is exactly representable in decimal). Further, any decimal float or binary float could be losslessly coerced to a Fraction, but that probably isn't what you really want most of the time. I think people who work in decimal usually want to stay there and people who work with binary floating point want to stay there as well (invisible coercions being more likely to cause pain than relieve pain). Raymond From graham.gower at gmail.com Wed Oct 14 01:42:09 2015 From: graham.gower at gmail.com (Graham Gower) Date: Wed, 14 Oct 2015 10:12:09 +1030 Subject: [Python-Dev] Rationale behind lazy map/filter In-Reply-To: <20151013232935.GE13813@ando.pearwood.info> References: <561CF23C.2070107@gmail.com> <20151013145141.EAE33B2009A@webabinitio.net> <877fmqvlgu.fsf@fastmail.com> <20151013232935.GE13813@ando.pearwood.info> Message-ID: On 14 October 2015 at 09:59, Steven D'Aprano wrote: > On Tue, Oct 13, 2015 at 11:26:09AM -0400, Random832 wrote: >> "R. David Murray" writes: >> >> > On Tue, 13 Oct 2015 14:59:56 +0300, Stefan Mihaila >> > wrote: >> >> Maybe it's just python2 habits, but I assume I'm not the only one >> >> carelessly thinking that "iterating over an input a second time will >> >> result in the same thing as the first time (or raise an error)". >> > >> > This is the way iterators have always worked. >> >> It does raise the question though of what working code it would actually >> break to have "exhausted" iterators raise an error if you try to iterate >> them again rather than silently yield no items. > > Anything which looks like this: > > > for item in iterator: > if condition: > break > do_this() > ... > for item in iterator: > do_that() > > > If the condition is never true, the iterator is completely processed by > the first loop, and the second loop is a no-op by design. > > I don't know how common it is, but I've written code like that. > I wrote code like this yesterday, to parse a file where there were multiple lines of one type of data, followed by multiple lines of another type of data. I can think of more complex examples including two (or more) iterators where one might reasonably do similar things. E.g. file 1 contains data, some of which is a subset of data in file 2, both of which are sorted. And during parsing, one wishes to match up the common elements. 
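A minimal sketch of that kind of two-stage parse over a single iterator; the file layout and the "#data" section marker here are invented purely for illustration. The first loop consumes lines up to the marker, and the second pass resumes exactly where the first one stopped; if the marker never appears, the second pass simply sees an exhausted iterator and yields nothing.

    def parse(path):
        with open(path) as f:
            header = []
            for line in f:                    # first pass: read until the section marker
                if line.startswith("#data"):  # invented marker separating the two sections
                    break
                header.append(line.strip())
            records = [line.split() for line in f]   # second pass resumes after the marker
        return header, records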
-Graham From steve at pearwood.info Wed Oct 14 02:08:07 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 14 Oct 2015 11:08:07 +1100 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> Message-ID: <20151014000807.GH13813@ando.pearwood.info> On Tue, Oct 13, 2015 at 04:37:43PM -0700, Raymond Hettinger wrote: > We could have (and still could) make the choice to always coerce to > decimal (every float is exactly representable in decimal). Further, > any decimal float or binary float could be losslessly coerced to a > Fraction, but that probably isn't what you really want most of the > time. I think people who work in decimal usually want to stay there > and people who work with binary floating point want to stay there as > well (invisible coercions being more likely to cause pain than relieve > pain). Further to what Raymond says, if anyone wants to persue this further, and wants to argue for such automatic promotion of float to Decimal (or vice versa), I think a good place to start would be a survey of other languages with a numeric tower. How do they handle similar situations? -- Steve From random832 at fastmail.com Wed Oct 14 02:41:54 2015 From: random832 at fastmail.com (Random832) Date: Tue, 13 Oct 2015 20:41:54 -0400 Subject: [Python-Dev] PEP 0484 - the Numeric Tower References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <20151014000807.GH13813@ando.pearwood.info> Message-ID: <87twpuwab1.fsf@fastmail.com> Steven D'Aprano writes: > On Tue, Oct 13, 2015 at 04:37:43PM -0700, Raymond Hettinger wrote: > >> We could have (and still could) make the choice to always coerce to >> decimal (every float is exactly representable in decimal). Further, >> any decimal float or binary float could be losslessly coerced to a >> Fraction, but that probably isn't what you really want most of the >> time. I think people who work in decimal usually want to stay there >> and people who work with binary floating point want to stay there as >> well (invisible coercions being more likely to cause pain than relieve >> pain). > > Further to what Raymond says, if anyone wants to persue this further, > and wants to argue for such automatic promotion of float to Decimal (or > vice versa), I think a good place to start would be a survey of other > languages with a numeric tower. How do they handle similar situations? I believe that in Scheme (which AIUI is where the term "numeric tower" comes from) has a notion of "exact" and "inexact" numbers. A "flonum" (float) would be an inexact number. And any operation between exact and inexact numbers (including e.g. min and max, even if the exact number wins) gives an inexact result. A Decimal, though, could be regarded as an exact number (a special kind of rational) or an inexact number (another kind of floating point). I suspect some people who use them intend them one way and others intend the other (especially since Fraction lacks a good way to format the value in decimal notation). If they are both inexact, then it doesn't much matter which one they're promoted to, since they're both implementation details of the abstract type "inexact real number". 
AIUI many implementations don't *actually* have any other implementations of "inexact real number" except flonum, and those that do just have them as a rational fraction with an "inexact" flag set, but the spec does allow it. Implementing a scheme-style exact/inexact numeric tower also suggests more ABCs. From chris.barker at noaa.gov Wed Oct 14 06:32:11 2015 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Tue, 13 Oct 2015 21:32:11 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510131646.t9DGktnY009595@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <201510131646.t9DGktnY009595@fido.openend.se> Message-ID: <-5006831532939498661@unknownmsgid> I merely worry about what happens if people > start relying upon the fact that a float annotation 'will handle all > the numbers I care about' to the forgotten Decimal users such as > myself. Well, that's what you get in exchange for "type safety". Which is exactly why I'm concerned about widespread use of type annotations. Might as well use a static language :-( - CHB > > Laura > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov From stephen at xemacs.org Wed Oct 14 08:23:31 2015 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Wed, 14 Oct 2015 15:23:31 +0900 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <-5006831532939498661@unknownmsgid> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <201510131646.t9DGktnY009595@fido.openend.se> <-5006831532939498661@unknownmsgid> Message-ID: <22045.62691.317706.290770@turnbull.sk.tsukuba.ac.jp> Chris Barker - NOAA Federal writes: > Laura Creighton writes: > > I merely worry about what happens if people > > start relying upon the fact that a float annotation 'will handle all > > the numbers I care about' to the forgotten Decimal users such as > > myself. > > Well, that's what you get in exchange for "type safety". AIUI, the point of type annotations is that some use cases benefit *a lot* from machine-parsable type information, not that type annotation is a universally good idea in itself. It's not type *safety* that's the aim here. It's type *auditability*. If it were about "safety", annotations would be in the interpreter, not in a separate, optional application. > Which is exactly why I'm concerned about widespread use of type > annotations. Might as well use a static language :-( No, no way would this satisfy static typing advocates. Optional, remember? In Python, widespread use of type annotations that messes up Decimal users would be un-Pythonic (infringes the "consenting adults" principle). Yes, Laura is going to run into modules that functions that have a "float" or "Real" annotation that will complain about Decimal when Decimal works perfectly well in that function. Not everybody will use annotations according to BDFL Original Intent. But if perfectly normal usage of Decimal or whatever runs into type annotation abuse, and you can't simply refuse to run the type checker, I will bet that Guido himself will be your champion.[1] Footnotes: [1] I'm not channeling anybody here, that's a statement of my personal assessment of the real risk. 
And of course it may have no effect on the developers who use type annotations in that way, but this is no different from any other hard to work around programming practice. From mal at egenix.com Wed Oct 14 11:44:40 2015 From: mal at egenix.com (M.-A. Lemburg) Date: Wed, 14 Oct 2015 11:44:40 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> Message-ID: <561E2408.9070705@egenix.com> On 14.10.2015 01:37, Raymond Hettinger wrote: > >> On Oct 13, 2015, at 9:16 AM, Random832 wrote: >> >>> ## ---------------- >>> ## Decimal has all of the methods specified by the Real abc, but it should >>> ## not be registered as a Real because decimals do not interoperate with >>> ## binary floats (i.e. Decimal('3.14') + 2.71828 is undefined). But, >>> ## abstract reals are expected to interoperate (i.e. R1 + R2 should be >>> ## expected to work if R1 and R2 are both Reals). >> >> Why? > > Q. Why is Python the way it is? > A. Because Guido said so ;-) > > IIRC, the answer is that we were being conservative with possibly unintended operations between types with differing precision and with differing notions of what numbers could be exactly representable. > > We could have (and still could) make the choice to always coerce to decimal (every float is exactly representable in decimal). Further, any decimal float or binary float could be losslessly coerced to a Fraction, but that probably isn't what you really want most of the time. I think people who work in decimal usually want to stay there and people who work with binary floating point want to stay there as well (invisible coercions being more likely to cause pain than relieve pain). I can only underline this. Conversion to decimals or fractions should not be implicit. People needing these types will know when they need them and apply the required explicit conversions to fit their use case. E.g. in accounting you'll likely use decimal, in finance and science you stick to floats. >From a theoretical point of view, it would make sense to add coercion to these types, but not from a practical point of view. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::::: Try our mxODBC.Connect Python Database Interface for free ! :::::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ From lac at openend.se Wed Oct 14 13:09:06 2015 From: lac at openend.se (Laura Creighton) Date: Wed, 14 Oct 2015 13:09:06 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <561E2408.9070705@egenix.com> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> Message-ID: <201510141109.t9EB96jF025342@fido.openend.se> In a message of Wed, 14 Oct 2015 11:44:40 +0200, "M.-A. Lemburg" writes: >I can only underline this. Conversion to decimals or fractions should >not be implicit. 
People needing these types will know when they need >them and apply the required explicit conversions to fit their use case. > >E.g. in accounting you'll likely use decimal, in finance and science >you stick to floats. > >>From a theoretical point of view, it would make sense to add coercion >to these types, but not from a practical point of view. > >-- >Marc-Andre Lemburg >eGenix.com Actually, people in Finance tend to use both. (Often while being completely unaware of what they are doing, too.) And these days anybody who is using Decimal for Money (which ought to be everybody, but again, lots of people don't know what they are doing) still wants to grab the SciPy stack so they can use pandas to analyse the data, and matplotlib to graph it, and bokeh to turn the results into a all-singing and dancing interactive graph. Laura From chris at simplistix.co.uk Wed Oct 14 15:11:46 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Wed, 14 Oct 2015 14:11:46 +0100 Subject: [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage? Message-ID: <561E5492.1030808@simplistix.co.uk> Hi All, I'm having trouble with some python processes that are using 3GB+ of memory but when I inspect them with either heapy or meliae, injected via pyrasite, those tools only report total memory usage to be 119Mb. This feels like the old "python high water mark" problem, but I thought that was fixed in 2.6/3.0? Under what circumstances can a Python process still exhibit high memory usage that tools like heapy don't know about? cheers, Chris PS: Full details here of libraries being used and versions here: https://groups.google.com/forum/#!topic/celery-users/SsTRZ7-mDMI This post feels related and seems to suggest the high water mark problem is still there: http://chase-seibert.github.io/blog/2013/08/03/diagnosing-memory-leaks-python.html From stefanrin at gmail.com Wed Oct 14 17:04:24 2015 From: stefanrin at gmail.com (Stefan Ring) Date: Wed, 14 Oct 2015 17:04:24 +0200 Subject: [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage? In-Reply-To: <561E5492.1030808@simplistix.co.uk> References: <561E5492.1030808@simplistix.co.uk> Message-ID: On Wed, Oct 14, 2015 at 3:11 PM, Chris Withers wrote: > I'm having trouble with some python processes that are using 3GB+ of memory > but when I inspect them with either heapy or meliae, injected via pyrasite, > those tools only report total memory usage to be 119Mb. > > This feels like the old "python high water mark" problem, but I thought that > was fixed in 2.6/3.0? > Under what circumstances can a Python process still exhibit high memory > usage that tools like heapy don't know about? Which Python version are you experiencing this with? I know that in Python 2.7, having many floats (and I think also ints) active at once creates a high water situation. Python 2.7 is what I have experience with -- with heap sizes around 40 GB sometimes. From chris at simplistix.co.uk Wed Oct 14 17:10:27 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Wed, 14 Oct 2015 16:10:27 +0100 Subject: [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage? 
In-Reply-To: References: <561E5492.1030808@simplistix.co.uk> Message-ID: <561E7063.4060503@simplistix.co.uk> On 14/10/2015 16:04, Stefan Ring wrote: > On Wed, Oct 14, 2015 at 3:11 PM, Chris Withers wrote: >> I'm having trouble with some python processes that are using 3GB+ of memory >> but when I inspect them with either heapy or meliae, injected via pyrasite, >> those tools only report total memory usage to be 119Mb. >> >> This feels like the old "python high water mark" problem, but I thought that >> was fixed in 2.6/3.0? >> Under what circumstances can a Python process still exhibit high memory >> usage that tools like heapy don't know about? > Which Python version are you experiencing this with? I know that in > Python 2.7, having many floats (and I think also ints) active at once > creates a high water situation. Python 2.7 is what I have experience > with -- with heap sizes around 40 GB sometimes. Python 2.7.5 on RHEL 7.1. Chris From victor.stinner at gmail.com Wed Oct 14 17:13:37 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 14 Oct 2015 17:13:37 +0200 Subject: [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage? In-Reply-To: <561E5492.1030808@simplistix.co.uk> References: <561E5492.1030808@simplistix.co.uk> Message-ID: Hi, You may also try tracemalloc to get stats of the Python memory usage ;-) The Python memory allocator was optimized in Python 3.3: it now uses mmap() when available (on UNIX), it helps to reduce the fragmentation of the heap memory. Since Python 3.4, VirtualAlloc() is used for the same purpose on Windows. Please mention your OS, OS version and Python version. The Python memory allocator allocates chunks of memory of 256 KB (see ARENA_SIZE in Objects/obmalloc.c). A chunk cannot be released to the system before all objects stored in the chunk are released. The Python memory allocator is only used for allocations smaller than 256 bytes in Python <= 3.2, or allocations smaller than 512 bytes in Python 3.3. Otherwise, malloc() and free() are used. The GNU libc uses brk() or mmap() depending on a threshold: 128 KB by default. The threshold is dynamic nowadays. Use mallopt(M_MMAP_THRESHOLD, nbytes) to change this threshold. The fragmentation of the heap memory is an hard problem not fully solved in CPython. A moving garbage collector would reduce the fragmentation of the "arenas" objects, but I don't think that it's possible to change the Python garbage collector... My test for memory fragmentation: https://bitbucket.org/haypo/misc/src/56c61c3815a6f0a604cda0314d44081f21e8a786/memory/python_memleak.py?at=default&fileviewer=file-view-default Other example in pure C: https://bitbucket.org/haypo/misc/src/56c61c3815a6f0a604cda0314d44081f21e8a786/memory/tu_malloc.c?at=default&fileviewer=file-view-default Victor 2015-10-14 15:11 GMT+02:00 Chris Withers : > Hi All, > > I'm having trouble with some python processes that are using 3GB+ of memory > but when I inspect them with either heapy or meliae, injected via pyrasite, > those tools only report total memory usage to be 119Mb. > > This feels like the old "python high water mark" problem, but I thought that > was fixed in 2.6/3.0? > Under what circumstances can a Python process still exhibit high memory > usage that tools like heapy don't know about? 
> > cheers, > > Chris > > PS: Full details here of libraries being used and versions here: > > https://groups.google.com/forum/#!topic/celery-users/SsTRZ7-mDMI > > This post feels related and seems to suggest the high water mark problem is > still there: > > http://chase-seibert.github.io/blog/2013/08/03/diagnosing-memory-leaks-python.html > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From chris at simplistix.co.uk Wed Oct 14 17:26:11 2015 From: chris at simplistix.co.uk (Chris Withers) Date: Wed, 14 Oct 2015 16:26:11 +0100 Subject: [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage? In-Reply-To: References: <561E5492.1030808@simplistix.co.uk> Message-ID: <561E7413.2070902@simplistix.co.uk> On 14/10/2015 16:13, Victor Stinner wrote: > Hi, > > You may also try tracemalloc to get stats of the Python memory usage ;-) > > The Python memory allocator was optimized in Python 3.3: it now uses > mmap() when available (on UNIX), it helps to reduce the fragmentation > of the heap memory. Since Python 3.4, VirtualAlloc() is used for the > same purpose on Windows. > > Please mention your OS, OS version and Python version. Python 2.7.5 on RHEL 7.1. Would tracemalloc still be useful here? cheers, Chris From chris.barker at noaa.gov Wed Oct 14 17:26:05 2015 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Wed, 14 Oct 2015 08:26:05 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510141109.t9EB96jF025342@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> Message-ID: <-1299548556864014518@unknownmsgid> And these days > anybody who is using Decimal for Money (which ought to be everybody, I'm not so sure about that -- using base-10 is nice, but it doesn't automatically buy you the appropriate rounding rules, etc that you need to "proper" accounting. And, as MA pointed out, in much "finance" work, the approximations of FP are just as appropriate as they are for science. (Which of course, floats are not always appropriate for...) > still wants > to grab the SciPy stack so they can use pandas to analyse the data, > and matplotlib to graph it, and bokeh to turn the results into a > all-singing and dancing interactive graph. There's no technical reason Numpy couldn't have a decimal dtype -- someone "just" has to write the code. The fact that no one has tells me that no one needs it that badly. (Or that numpy' dtype system is inscrutable :-) ) But while we're on Numpy -- there is s lesson there -- Numpy supports many different precision a of various styles - int8, int16, int32....float32, float64.... Back in the day, the coercion rules would tend to push user's arrays to larger dtypes: say you added a python float (float64) to a Numpy array of float32: you'd get a float64 array. But the fact is that people choose to use a smaller dtype for a reason -- so numpy's casting rules where changed to make it less likely that you'd accidentally upcast your arrays. A similar principle applies here. If someone is working with Decimals, they have a reason to do so. Likewise if they are Not working with Decimals... So it's all good... 
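A quick sanity check of the current rules, assuming a reasonably recent numpy is installed:

```python
import numpy as np

a = np.zeros(3, dtype=np.float32)                   # the user picked the small dtype on purpose
print((a + 2.71828).dtype)                          # float32: a plain Python float does not upcast it
print((a + np.zeros(3, dtype=np.float64)).dtype)    # float64: mixing whole arrays still widens
```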
-CHB > > Laura > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov From guido at python.org Wed Oct 14 17:38:43 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 14 Oct 2015 08:38:43 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510141109.t9EB96jF025342@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> Message-ID: Perhaps you could solve this with type variables. Here's a little demonstration program: ``` from decimal import Decimal from typing import TypeVar F = TypeVar('F', float, Decimal) def add(a: F, b: F) -> F: return a+b print(add(4.2, 3.14)) print(add(Decimal('4.2'), Decimal('3.14'))) print(add(Decimal('4.2'), 3.14)) ``` Note that the last line is invalid. mypy correctly finds this: ``` flt.py:8: error: Type argument 1 of "add" has incompatible value "object" ``` (We could work on the error message.) Now, I'm not sure that this is the best solution given the audience -- but the type variable 'F' only needs to be defined once, so this might actually work. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed Oct 14 17:47:57 2015 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Wed, 14 Oct 2015 08:47:57 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <22045.62691.317706.290770@turnbull.sk.tsukuba.ac.jp> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <201510131646.t9DGktnY009595@fido.openend.se> <-5006831532939498661@unknownmsgid> <22045.62691.317706.290770@turnbull.sk.tsukuba.ac.jp> Message-ID: <-5559682458530750964@unknownmsgid> >> Well, that's what you get in exchange for "type safety". > > AIUI, the point of type annotations is that some use cases benefit *a > lot* from machine-parsable type information, not that type annotation > is a universally good idea in itself. It's not type *safety* that's > the aim here. It's type *auditability*. If it were about "safety", > annotations would be in the interpreter, not in a separate, optional > application. Well, what's the point if you don't use it? I'm probably being paranoid here, but I can envision institutions that enforce running the type checker for any committed code, etc... And then for practical purposes, it's enforced on that project. >> Which is exactly why I'm concerned about widespread use of type >> annotations. Might as well use a static language :-( > > No, no way would this satisfy static typing advocates. Optional, > remember? Notice that I said "widespread" -- and again, paranoia: it'll always be optional in Python itself, but it may not be optional in some institutions/code bases. 
> In Python, widespread use of type annotations that messes up Decimal > users would be un-Pythonic Well yes, that's my point :-) > you can't simply refuse to run the type checker, If you can simply refuse to run the type checker, then it makes me wonder what the point of the type checker is :-) Anyway, lots of folks think type annotations will be useful without constraining the language -- and now we have a standard way to do it -- so I guess we'll see how it plays out. -CHB > I will bet that > Guido himself will be your champion.[1] > > Footnotes: > [1] I'm not channeling anybody here, that's a statement of my > personal assessment of the real risk. And of course it may have no > effect on the developers who use type annotations in that way, but > this is no different from any other hard to work around programming > practice. > > From lac at openend.se Wed Oct 14 22:56:22 2015 From: lac at openend.se (Laura Creighton) Date: Wed, 14 Oct 2015 22:56:22 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> Message-ID: <201510142056.t9EKuMiw004003@fido.openend.se> In a message of Wed, 14 Oct 2015 08:38:43 -0700, Guido van Rossum writes: >Perhaps you could solve this with type variables. Here's a little >demonstration program: >``` >from decimal import Decimal >from typing import TypeVar >F = TypeVar('F', float, Decimal) >def add(a: F, b: F) -> F: > return a+b >print(add(4.2, 3.14)) >print(add(Decimal('4.2'), Decimal('3.14'))) >print(add(Decimal('4.2'), 3.14)) >``` >Note that the last line is invalid. mypy correctly finds this: >``` >flt.py:8: error: Type argument 1 of "add" has incompatible value "object" >``` >(We could work on the error message.) > >Now, I'm not sure that this is the best solution given the audience -- but >the type variable 'F' only needs to be defined once, so this might actually >work. > >-- >--Guido van Rossum (python.org/~guido) This looks good to me. I wonder if there is anything we can do, documentation and PEP wise to encourage people who write code to use it, rather than just using float? Laura From oscar.j.benjamin at gmail.com Wed Oct 14 23:21:30 2015 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Wed, 14 Oct 2015 21:21:30 +0000 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510142056.t9EKuMiw004003@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> Message-ID: On Wed, 14 Oct 2015 21:57 Laura Creighton wrote: In a message of Wed, 14 Oct 2015 08:38:43 -0700, Guido van Rossum writes: >Perhaps you could solve this with type variables. Here's a little >demonstration program: >``` >from decimal import Decimal >from typing import TypeVar >F = TypeVar('F', float, Decimal) >def add(a: F, b: F) -> F: > return a+b >print(add(4.2, 3.14)) >print(add(Decimal('4.2'), Decimal('3.14'))) >print(add(Decimal('4.2'), 3.14)) >``` >Note that the last line is invalid. mypy correctly finds this: >``` >flt.py:8: error: Type argument 1 of "add" has incompatible value "object" >``` >(We could work on the error message.) 
> >Now, I'm not sure that this is the best solution given the audience -- but >the type variable 'F' only needs to be defined once, so this might actually >work. > >-- >--Guido van Rossum (python.org/~guido) This looks good to me. I wonder if there is anything we can do, documentation and PEP wise to encourage people who write code to use it, rather than just using float? It's not always appropriate though. If the author types it as float then they're obviously not thinking about decimal in which case it may not work correctly for decimal. Writing accurate numerical code that ducktypes with float and decimal is non-trivial even in cases where the operation is relatively trivial. Personally I just wouldn't mix them but if you want to see how tricky it can be take a look at statistics.sum. Generally if it's possible to interchange floats and decimals in your code then there's probably no need for decimals in the first place. If mypy requires you to do an explicit conversion to float then there may be some seld-documenting merit in showing that conversion up front rather than assuming that it's okay to insert decimals where they're not expected. The point of static type checking is to detect precisely these kinds of errors. -- Oscar -------------- next part -------------- An HTML attachment was scrubbed... URL: From lac at openend.se Thu Oct 15 00:04:43 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 15 Oct 2015 00:04:43 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> Message-ID: <201510142204.t9EM4hoI009104@fido.openend.se> In a message of Wed, 14 Oct 2015 21:21:30 -0000, Oscar Benjamin writes: >Generally if it's possible to interchange floats and decimals in your code >then there's probably no need for decimals in the first place. Yes, but, at least around here the common case is that you already _have_ a pile of decimals (extracted from your leger) and now you want to do something with them (like graph them and make reports out of the graphs) with other people's graphing and report generating software. >If mypy >requires you to do an explicit conversion to float then there may be some >seld-documenting merit in showing that conversion up front rather than >assuming that it's okay to insert decimals where they're not expected. The >point of static type checking is to detect precisely these kinds of errors. The thing is that there is a very big split between code written as 'this is a float using function and decimal users very much have to avoid using it' and 'this thing works perfectly well for floats and decimals'. That code writers in the scientific python world mostly never think of Decimal users, doesn't mean they don't end up writing perfectly wonderful tools and libraries we use. :) thankfully :) I was looking for a way for the Python type hinting to be expressive enough to handle this common (at least in my world) case. So then, even if the bokeh developers (just to pick some friends) forget about me in their type annotations, I can just make a pull request, send it back with some corrected annotations and the note 'remember me!' 
:) >Oscar Laura From chris.barker at noaa.gov Thu Oct 15 00:36:35 2015 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 14 Oct 2015 15:36:35 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510142204.t9EM4hoI009104@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> <201510142204.t9EM4hoI009104@fido.openend.se> Message-ID: On Wed, Oct 14, 2015 at 3:04 PM, Laura Creighton wrote: > That code writers in the scientific python world mostly > never think of Decimal users, doesn't mean they don't end up writing > perfectly wonderful tools and libraries we use. :) thankfully :) > sure -- but those are almost guaranteed to convert to float internally, as there is no decimal dtype for numpy. in fact, much numpy code is actually semi-statically typed. The common idiom is to write your functions to take "something that can be turned into a float array", which, in practice, means that calling: np.asarray(the_input_object, dtype-np.float64) Doesn't raise an exception.[1] And, often ends up returning an array with the right shape, also, so maybe: np.asarray(the_input_object, dtype-np.float64).reshape(-1, 2) I guess it would be nice if there were a way to describe that in type annotations. -CHB [1] for those not in the know, "asarray" is more or less: if the input is already an array as specified, return that array unchanged. elif the input is a buffer or memoryview that fits the specification, wrap an ndarray around that buffer. else call np.array() on it -- which will attempt make an appropriate numpy array, and copy the values from the input object into it. I was looking for a way for the Python type hinting to be expressive > enough to handle this common (at least in my world) case. So then, > even if the bokeh developers (just to pick some friends) forget about > me in their type annotations, as you see above, it's generally more complicated than a single scalar dtype... -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From lac at openend.se Thu Oct 15 00:41:08 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 15 Oct 2015 00:41:08 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> Message-ID: <201510142241.t9EMf8f8011820@fido.openend.se> I forgot something. In a message of Wed, 14 Oct 2015 21:21:30 -0000, Oscar Benjamin writes: >The point of static type checking is to detect precisely these kinds of >errors. Yes, but what I expect the type annotations to be used for, especially in the SciPy world, is to make things easier for Numba to generate fast code. I really hope that the SciPy world is not going to go nuts for useless annotations, but I have, alas, all too many years dealing with people whose desire to have faster code vastly outstrips their ability to understand just what is possible in that regard. 
In James Barrie's novel, Peter Pan, earthly children are given the ability to fly by having Fairy Dust (supplied by the unwilling Tinkerbell) sprinkled over them. I now know that what a large number of people who want faster code, really want is magic Fairy Dust. They wanted psyco to be it, they wanted pypy to be it, and now they want Numba to be it. The notion that sprinkling type annotations all over their code will make it fly _absolutely deeply resonates_ with how these people wish the world worked. They will believe that type annotations are magic fairy dust until they are forced to confront the fact poorly written code can not be made fly until it is rewritten. Laura From oscar.j.benjamin at gmail.com Thu Oct 15 00:49:33 2015 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Wed, 14 Oct 2015 23:49:33 +0100 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510142204.t9EM4hoI009104@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> <201510142204.t9EM4hoI009104@fido.openend.se> Message-ID: On 14 Oct 2015 23:06, "Laura Creighton" wrote: > > In a message of Wed, 14 Oct 2015 21:21:30 -0000, Oscar Benjamin writes: > >Generally if it's possible to interchange floats and decimals in your code > >then there's probably no need for decimals in the first place. > > Yes, but, at least around here the common case is that you already > _have_ a pile of decimals (extracted from your leger) and now you > want to do something with them (like graph them and make reports > out of the graphs) with other people's graphing and report generating > software. The graphing library will almost certainly convert your data to float so I'd say this comes under the category of: you don't need decimals here. > >If mypy > >requires you to do an explicit conversion to float then there may be some > >seld-documenting merit in showing that conversion up front rather than > >assuming that it's okay to insert decimals where they're not expected. The > >point of static type checking is to detect precisely these kinds of errors. > > The thing is that there is a very big split between code written as > 'this is a float using function and decimal users very much have to > avoid using it' and 'this thing works perfectly well for floats and > decimals'. That code writers in the scientific python world mostly > never think of Decimal users, doesn't mean they don't end up writing > perfectly wonderful tools and libraries we use. :) thankfully :) > > I was looking for a way for the Python type hinting to be expressive > enough to handle this common (at least in my world) case. So then, > even if the bokeh developers (just to pick some friends) forget about > me in their type annotations, I can just make a pull request, send it > back with some corrected annotations and the note 'remember me!' :) I'm sure the bokeh developers will be aware of the different ways that their library is used (at this level). If the input spec is "sequence of coercible to float" then I agree that they should use type annotations to match that rather than putting float and I imagine they would welcome your PR. Guido's suggestion is not general enough for that though: what about Fraction, mpf, gmpy, numpy, sympy, h5py etc? 
The ABCs in the numeric tower are unused by 3rd party types making them useless for abstract type inference (IMO). AFAIK the lowest common denominator among number types in Python is the __float__ special method. Does mypy have a way to require (a sequence of) that? -- Oscar -------------- next part -------------- An HTML attachment was scrubbed... URL: From lac at openend.se Thu Oct 15 01:12:37 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 15 Oct 2015 01:12:37 +0200 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> <201510142204.t9EM4hoI009104@fido.openend.se> Message-ID: <201510142312.t9ENCbWD014156@fido.openend.se> In a message of Wed, 14 Oct 2015 23:49:33 +0100, Oscar Benjamin writes: >I'm sure the bokeh developers will be aware of the different ways that >their library is used (at this level). If the input spec is "sequence of >coercible to float" then I agree that they should use type annotations to >match that rather than putting float and I imagine they would welcome your >PR. > >Guido's suggestion is not general enough for that though: what about >Fraction, mpf, gmpy, numpy, sympy, h5py etc? The ABCs in the numeric tower >are unused by 3rd party types making them useless for abstract type >inference (IMO). AFAIK the lowest common denominator among number types in >Python is the __float__ special method. Does mypy have a way to require (a >sequence of) that? Thank you Oscar, and Chris Barker, and Guido for improving my thoughts on this matter. I go to bed now, pondering the idea that for me, internally, a new way to do type annotation 'to-x-or-is-coercible-to-x' seems a decent-enough idea. Seems a betrayal of earlier principles. Perhaps when I wake up I will feel differently. Off to dream about 'the meaning of type annotation' then. :) Thank you. Laura From chris.barker at noaa.gov Thu Oct 15 02:45:55 2015 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Wed, 14 Oct 2015 17:45:55 -0700 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510142241.t9EMf8f8011820@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> <201510142241.t9EMf8f8011820@fido.openend.se> Message-ID: <4218209166463277854@unknownmsgid> > Yes, but what I expect the type annotations to be used for, especially > in the SciPy world, is to make things easier for Numba to generate fast > code. Well, probably not. There are two reasons to have type declarations: performance and type safety. But the current type annotations are designed only for the latter. A number of us brought this up in the discussion because we do, in fact, want the magic fairy dust of faster code, whether by numba or Cython(my favorite) and I had hoped that type annotations would be useful for that. But it was explicitly stated that that was not the intent. And indeed, if you want to preserve any of python's nifty dynamic typing ( and we all do ) it can't support truly static typing. I.e. In Python, you want to say that your function works with, say, a sequence of numbers. 
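That much is at least expressible with duck typing -- a rough sketch using the
SupportsFloat protocol that ships in the typing module (the mean() helper here is just
for illustration):

```python
from typing import Sequence, SupportsFloat

def mean(xs: Sequence[SupportsFloat]) -> float:
    # Anything defining __float__ is accepted: float, int, Decimal,
    # Fraction, numpy scalars, ... -- but it says nothing about memory layout.
    values = [float(x) for x in xs]
    return sum(values) / len(values)
```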
But for performant compilation, you'd need to know the binary layout of that sequence. Side notes: Numba does JIT compilation, so pre specifying the types isn't needed anyway. Certainly adding these type annotations will buy you nothing. Bokeh, on the other hand, has to pass everything off to the browser anyway, presumably with JSON, so in theory it shouldn't need Numpy arrays anyway. (Of course, JSON doesn't support Decimal anyway -- or, strictly speaking, only supports decimal, but JavaScript conveys it to floats...) > I really hope that the SciPy world is not going to go nuts for > useless annotations, but I have, alas, all too many years dealing with > people whose desire to have faster code vastly outstrips their ability > to understand just what is possible in that regard. > > In James Barrie's novel, Peter Pan, earthly children are given the > ability to fly by having Fairy Dust (supplied by the unwilling > Tinkerbell) sprinkled over them. I now know that what a large number > of people who want faster code, really want is magic Fairy Dust. They > wanted psyco to be it, they wanted pypy to be it, and now they want > Numba to be it. The notion that sprinkling type annotations all over > their code will make it fly _absolutely deeply resonates_ with how > these people wish the world worked. They will believe that type > annotations are magic fairy dust until they are forced to confront > the fact poorly written code can not be made fly until it is rewritten. > > Laura > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov From stephen at xemacs.org Thu Oct 15 11:13:06 2015 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Thu, 15 Oct 2015 18:13:06 +0900 Subject: [Python-Dev] PEP 0484 - the Numeric Tower In-Reply-To: <201510142241.t9EMf8f8011820@fido.openend.se> References: <201510131121.t9DBLlFa018601@fido.openend.se> <702AAFEA-B3F3-43D7-A05E-9B86D19EB046@gmail.com> <87vbaau4kv.fsf@fastmail.com> <561E2408.9070705@egenix.com> <201510141109.t9EB96jF025342@fido.openend.se> <201510142056.t9EKuMiw004003@fido.openend.se> <201510142241.t9EMf8f8011820@fido.openend.se> Message-ID: <22047.28194.365593.458904@turnbull.sk.tsukuba.ac.jp> Laura Creighton writes: > [W]hat I expect the type annotations to be used for, especially in > the SciPy world, is to make things easier for Numba to generate > fast code. I don't understand why that's a problem. *You* run mypy, and *you* decide whether to do anything about its warnings. The application you want to run is still a Python program, as are the modules it imports, and the Python compiler still simply attaches the annotations to the functions and otherwise ignores them, as does the VM. So, running the type checker is optional, just like running a profiler or pylint. How does the availability of profiling tools and linters hurt you? If they don't hurt, why would an optional analysis tool that happens to check types be a problem? It seems to me it wouldn't, but maybe I'm missing something? Unless you expect the Numba people themselves to require annotations before you can use Numba at all. In any case, that's not a problem created by PEP 484. Annotations have been available since 2006, decorator-based type-checkers since around 2003, and LBYL type- checking at runtime since the beginning of time. 
If Numba was going to require type annotations, they could have devised an annotation syntax themselves and done it long ago. Why would they, or any other project, change to a LBYL approach now? I would expect that Bokeh would be much less likely than Numba to do that, for example. > I now know that what a large number of people who want faster code, > really want is magic Fairy Dust. And a pony. And they think Santa Claus will deliver on Christmas. But I don't understand why that interferes with what you want to do. OK, if you think annotations are ugly and don't want to see them ever, fine, that I can understand (though I don't sympathize, at least not yet -- we'll see how badly they get abused in libraries I use). But how do they actually cause a problem when you're just running your code? I must be missing something .... Steve From steve at pearwood.info Fri Oct 16 02:57:11 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 16 Oct 2015 11:57:11 +1100 Subject: [Python-Dev] PEP 506 secrets module Message-ID: <20151016005711.GC11980@ando.pearwood.info> Hi, As extensively discussed on Python-Ideas, the secrets module and PEP 506 is (I hope) ready for pronouncement. https://www.python.org/dev/peps/pep-0506/ There is code and tests here: https://bitbucket.org/sdaprano/secrets or you can run hg clone https://sdaprano at bitbucket.org/sdaprano/secrets The code is written for and tested on Python 2.6, 2.7, 3.1 - 3.4. -- Steve From clp2 at rebertia.com Fri Oct 16 07:33:37 2015 From: clp2 at rebertia.com (Chris Rebert) Date: Thu, 15 Oct 2015 22:33:37 -0700 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016005711.GC11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: On Thu, Oct 15, 2015 at 5:57 PM, Steven D'Aprano wrote: > Hi, > > As extensively discussed on Python-Ideas, the secrets module and PEP 506 > is (I hope) ready for pronouncement. > > https://www.python.org/dev/peps/pep-0506/ {{{ Comparison To Other Languages [...] Javascript Based on a rather cursory search [20], there do not appear to be any well-known standard functions for producing strong random values in Javascript, [...] [20] Volunteers and patches are welcome. }}} Looks like client-side JS has window.crypto.getRandomValues() for this: https://developer.mozilla.org/en-US/docs/Web/API/RandomSource/getRandomValues Similarly, Node.js offers crypto.randomBytes(): https://nodejs.org/api/crypto.html#crypto_crypto_randombytes_size_callback Also, it's spelled "JavaScript", not "Javascript". Additionally, it looks like there's some kind of bold formatting error in the answer to "Q: What about a password generator?" in the HTML version of the PEP. > There is code and tests here: > > https://bitbucket.org/sdaprano/secrets I think there's a timing-related flaw in the current fallback implementation of equal(): https://bitbucket.org/sdaprano/secrets/pull-requests/1 Cheers, Chris -- https://github.com/cvrebert From victor.stinner at gmail.com Fri Oct 16 08:57:24 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 16 Oct 2015 08:57:24 +0200 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016005711.GC11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: Hi, I like the PEP. IMHO it's a better solution than using a CPRNG for random by default. I suggest to raise an error if token_bytes(n) if calls with n < 16 bytes (128 bits). 
Well, I'm not sure that 16 is the good compromise between performance and security, but we must enforce users to use a minimum number of bits of entropy. token_bytes(1) looks valid, even token_bytes(0), according to the Python code in the PEP. I don't like the idea how having two functions doing *almost* the same thing: randint() and randrange(). There is a risk that these functions will be misused. I consider that I know some stuff on PRNG but I'm still confused by randint() and randrange(). Usually, I open python and type: >>> x=[s.randrange(1,6) for n in range(100)] >>> min(x), max(x) (1, 5) Hum, ok, it's not a good dice :-) I probably wanted to use randint(). So I suggest to only add randint() to secrets. The PEP doesn't explain if secrets uses a "blocking" CPRNG (like /dev/random or getentropy() on Solaris) or a "non-blocking" CRPNG (like /dev/urandom). And it doesn't explain the rationale. Please explain, or I'm sure that the question will arise (ex: I just asked it ;-)) You may also be a little bit more explicit on the CPRNG: it *looks* like secrets will always use a CRPNG implemented in the kernel. Is it a property of the secrets module, or can it be ssl.RAND_bytes() for example? IMHO we must always use a CRPNG implemented in the kernel, there is still an issue with ssl.RAND_bytes() and fork() (two child process can produce exactly the same random numbers after a lot of fork()...). I understood that OpenSSL developers doesn't want to fix it. You may even be very explicit, list CPRNG that will be used on Python 3.6: * Linux: getrandom() syscall if available (Linux 3.17 or newer), or /dev/urandom * Solaris: getrandom() function if available (Solaris 11.3 or newer), or /dev/urandom * OpenBSD: getentropy() function (OpenBSD 5.6 or newer), or /dev/urandom * Windows: CryptAcquireContext(PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) and CryptGenRandom() * Other UNIXes: /dev/urandom It's still unclear to me if getentropy() on OpenBSD can block or not if the entropy is too low :-/ Victor 2015-10-16 2:57 GMT+02:00 Steven D'Aprano : > Hi, > > As extensively discussed on Python-Ideas, the secrets module and PEP 506 > is (I hope) ready for pronouncement. > > https://www.python.org/dev/peps/pep-0506/ > > There is code and tests here: > > https://bitbucket.org/sdaprano/secrets > > > or you can run > > hg clone https://sdaprano at bitbucket.org/sdaprano/secrets > > > The code is written for and tested on Python 2.6, 2.7, 3.1 - 3.4. > > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From steve at pearwood.info Fri Oct 16 12:04:58 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 16 Oct 2015 21:04:58 +1100 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: <20151016100457.GD11980@ando.pearwood.info> On Fri, Oct 16, 2015 at 08:57:24AM +0200, Victor Stinner wrote: > Hi, > > I like the PEP. IMHO it's a better solution than using a CPRNG for > random by default. > > I suggest to raise an error if token_bytes(n) if calls with n < 16 > bytes (128 bits). Well, I'm not sure that 16 is the good compromise > between performance and security, but we must enforce users to use a > minimum number of bits of entropy. token_bytes(1) looks valid, even > token_bytes(0), according to the Python code in the PEP. 
Youtube URLs have what looks like about 8 or 9 bytes of randomness: https://www.youtube.com/watch?v=B3KBuQHHKx0 (assuming it is base64 encoded, 11 chars is about 8 or 9 bytes). I don't think that we should force people to use any specific length, it would be pointless since they can always just slice it: py> secrets.token_bytes(16)[:2] b'\xd1s' Unless anyone has a good argument for why we should do this, I would be inclined to allow any value for nbytes. At most, I would be prepared to raise a warning if the nbytes was less than 16, but I don't think an error is appropriate. Thoughts? > I don't like the idea how having two functions doing *almost* the same > thing: randint() and randrange(). There is a risk that these functions > will be misused. I consider that I know some stuff on PRNG but I'm > still confused by randint() and randrange(). Usually, I open python > and type: > > >>> x=[s.randrange(1,6) for n in range(100)] > >>> min(x), max(x) > (1, 5) Wouldn't help(randrange) be easier? :-) Choose a random item from range(start, stop[, step]). This fixes the problem with randint() which includes the endpoint; in Python this is usually not what you want. I always find that comment amusing. While it is true that in slicing, half-open ranges are more useful than closed ranges, but when it comes to generating random numbers (say, simulating dice) I find randint much more useful and intuitive. But I appreciate that some people think differently. > Hum, ok, it's not a good dice :-) I probably wanted to use randint(). > So I suggest to only add randint() to secrets. On the Python-Ideas list the argument was about equally split between those who wanted only randrange and those who wanted only randint. I think if we don't supply both, we'll be forever fielding feature requests to add in the missing one. > The PEP doesn't explain if secrets uses a "blocking" CPRNG (like > /dev/random or getentropy() on Solaris) or a "non-blocking" CRPNG > (like /dev/urandom). And it doesn't explain the rationale. Please > explain, or I'm sure that the question will arise (ex: I just asked it > ;-)) secrets uses os.urandom and random.SystemRandom, which also uses os.urandom (or equivalent under Windows). The implementation section of the PEP shows that. As for the reason why urandom rather than /dev/random: http://sockpuppet.org/blog/2014/02/25/safely-generate-random-numbers/ http://www.2uo.de/myths-about-urandom/ I'm not sure if this was explicitly discussed in Python-Ideas, if it was I have forgotten it. I seem to recall that everyone who seemed to know what they were talking about just assumed we'd be using os.urandom and nobody questioned it or argued. > You may also be a little bit more explicit on the CPRNG: it *looks* > like secrets will always use a CRPNG implemented in the kernel. Is it > a property of the secrets module, or can it be ssl.RAND_bytes() for > example? IMHO we must always use a CRPNG implemented in the kernel, > there is still an issue with ssl.RAND_bytes() and fork() (two child > process can produce exactly the same random numbers after a lot of > fork()...). I understood that OpenSSL developers doesn't want to fix > it. 
> > You may even be very explicit, list CPRNG that will be used on Python 3.6: > > * Linux: getrandom() syscall if available (Linux 3.17 or newer), or /dev/urandom > * Solaris: getrandom() function if available (Solaris 11.3 or newer), > or /dev/urandom > * OpenBSD: getentropy() function (OpenBSD 5.6 or newer), or /dev/urandom > * Windows: CryptAcquireContext(PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) and > CryptGenRandom() > * Other UNIXes: /dev/urandom Does anyone else think that the secrets module should promise specific implementations? -- Steve From rosuav at gmail.com Fri Oct 16 12:32:30 2015 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 16 Oct 2015 21:32:30 +1100 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016100457.GD11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> <20151016100457.GD11980@ando.pearwood.info> Message-ID: On Fri, Oct 16, 2015 at 9:04 PM, Steven D'Aprano wrote: > On Fri, Oct 16, 2015 at 08:57:24AM +0200, Victor Stinner wrote: >> Hi, >> >> I like the PEP. IMHO it's a better solution than using a CPRNG for >> random by default. >> >> I suggest to raise an error if token_bytes(n) if calls with n < 16 >> bytes (128 bits). Well, I'm not sure that 16 is the good compromise >> between performance and security, but we must enforce users to use a >> minimum number of bits of entropy. token_bytes(1) looks valid, even >> token_bytes(0), according to the Python code in the PEP. > > Youtube URLs have what looks like about 8 or 9 bytes of randomness: > > https://www.youtube.com/watch?v=B3KBuQHHKx0 > > (assuming it is base64 encoded, 11 chars is about 8 or 9 bytes). I don't think that's randomness - it's some kind of unique ID, and I believe they lengthen with the log of the number of videos on Youtube. > Unless anyone has a good argument for why we should do this, I would be > inclined to allow any value for nbytes. At most, I would be prepared to > raise a warning if the nbytes was less than 16, but I don't think an > error is appropriate. > > Thoughts? Not even a warning, IMO. If you ask for a certain number of bytes, you should get that many bytes. Raise ValueError on token_bytes(-1), and maybe token_bytes(0), but otherwise, do what you were told. For people who want "good enough security", token_bytes() is the correct choice; if you explicitly choose, you get what you ask for. And secrets.DEFAULT_ENTROPY is intentionally public, right? I see no __all__, and it doesn't start with an underscore. So if someone wants "twice as long as the default", s/he can spell that secrets.token_bytes(secrets.DEFAULT_ENTROPY*2). There should be no reason to hard-code a number if you don't specifically want that number. >> You may also be a little bit more explicit on the CPRNG: it *looks* >> like secrets will always use a CRPNG implemented in the kernel. Is it >> a property of the secrets module, or can it be ssl.RAND_bytes() for >> example? IMHO we must always use a CRPNG implemented in the kernel, >> there is still an issue with ssl.RAND_bytes() and fork() (two child >> process can produce exactly the same random numbers after a lot of >> fork()...). I understood that OpenSSL developers doesn't want to fix >> it. 
>> >> You may even be very explicit, list CPRNG that will be used on Python 3.6: >> >> * Linux: getrandom() syscall if available (Linux 3.17 or newer), or /dev/urandom >> * Solaris: getrandom() function if available (Solaris 11.3 or newer), >> or /dev/urandom >> * OpenBSD: getentropy() function (OpenBSD 5.6 or newer), or /dev/urandom >> * Windows: CryptAcquireContext(PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) and >> CryptGenRandom() >> * Other UNIXes: /dev/urandom > > Does anyone else think that the secrets module should promise specific > implementations? I think it should NOT. The point of this is to provide whatever is believed to be secure, and that should never be promised. If RHEL is still shipping CPython 3.6 ten years from now, the best source of entropy might have changed, and the module should be able to be patched. It may be worth having the implementation introspectable, though - have some attribute that identifies the source and any CSPRNG algorithms used. Might not be much use in practice, given that you can just look at the source code. ChrisA From storchaka at gmail.com Fri Oct 16 17:35:14 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 16 Oct 2015 18:35:14 +0300 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: On 16.10.15 09:57, Victor Stinner wrote: > I suggest to raise an error if token_bytes(n) if calls with n < 16 > bytes (128 bits). Well, I'm not sure that 16 is the good compromise > between performance and security, but we must enforce users to use a > minimum number of bits of entropy. token_bytes(1) looks valid, even > token_bytes(0), according to the Python code in the PEP. This will provoke to write code token_bytes(16)[:5]. > I don't like the idea how having two functions doing *almost* the same > thing: randint() and randrange(). There is a risk that these functions > will be misused. I consider that I know some stuff on PRNG but I'm > still confused by randint() and randrange(). Usually, I open python > and type: > >>>> x=[s.randrange(1,6) for n in range(100)] >>>> min(x), max(x) > (1, 5) > > Hum, ok, it's not a good dice :-) I probably wanted to use randint(). > So I suggest to only add randint() to secrets. I suggest to add only randrange(). randint() is historical artefact, we shouldn't repeat this mistake in new module. The secrets module is not good way to generate dice rolls. In most other cases you need to generate integers in half-open interval [0; N). And randbelow() is absolute redundant. Random._randbelow() is implementation detail and I inclined to get rid of it (implementing randrange() in C instead). From status at bugs.python.org Fri Oct 16 18:08:32 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 16 Oct 2015 18:08:32 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20151016160832.336DE566EB@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-10-09 - 2015-10-16) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 5152 ( +8) closed 32026 (+55) total 37178 (+63) Open issues with patches: 2272 Issues opened (37) ================== #24931: _asdict breaks when inheriting from a namedtuple http://bugs.python.org/issue24931 reopened by doko #25188: regrtest.py improvement for Profile Guided Optimization builds http://bugs.python.org/issue25188 reopened by Arfrever #25359: io.open() fails to open ascii file if LANG env not set http://bugs.python.org/issue25359 opened by sentinel #25360: pyw should search for pythonw to implement #!/usr/bin/env pyt http://bugs.python.org/issue25360 opened by eryksun #25366: test_venv fails with --without-threads http://bugs.python.org/issue25366 opened by Arfrever #25370: Add support of pickling very large bytes and str objects with http://bugs.python.org/issue25370 opened by serhiy.storchaka #25376: KeyboardInterrupt handling and traceback broken on Windows 10 http://bugs.python.org/issue25376 opened by mdf #25377: Mention octal format of mode argument of os.chmod http://bugs.python.org/issue25377 opened by krichter #25381: Doc: Use of old description of raise in Python3 http://bugs.python.org/issue25381 opened by xiang.zhang #25385: Spurious warning when compiling extension module http://bugs.python.org/issue25385 opened by pitrou #25386: msvcrt_putch/msvcrt_putwch don't check the return value of _pu http://bugs.python.org/issue25386 opened by Alexander Riccio #25387: sound_msgbeep doesn't check the return value of MessageBeep http://bugs.python.org/issue25387 opened by Alexander Riccio #25388: tokenizer crash/misbehavior -- heap use-after-free http://bugs.python.org/issue25388 opened by Brian.Cain #25390: Can't define a typing.Union containing a typing.re.Pattern http://bugs.python.org/issue25390 opened by Mart?? 
Congost Tapias #25392: setup.py --quiet doesn't silence "no previously-included direc http://bugs.python.org/issue25392 opened by nedbat #25393: 'resource' module documentation error http://bugs.python.org/issue25393 opened by tzot #25394: CoroWrapper breaks gen.throw http://bugs.python.org/issue25394 opened by Chris Seto #25395: SIGSEGV using json.tool http://bugs.python.org/issue25395 opened by nagisa #25397: improve ac_cv_have_long_long_format GCC fallback http://bugs.python.org/issue25397 opened by vapier #25400: robotparser doesn't return crawl delay for default entry http://bugs.python.org/issue25400 opened by pwirtz #25402: More accurate estimation of the number of digits in int to dec http://bugs.python.org/issue25402 opened by serhiy.storchaka #25403: urllib.parse.urljoin is broken in python 3.5 http://bugs.python.org/issue25403 opened by Pavel Ivanov #25404: ssl.SSLcontext.load_dh_params() does not handle unicode filena http://bugs.python.org/issue25404 opened by schlenk #25405: User install of 3.5 removes py.exe from C:\Windows http://bugs.python.org/issue25405 opened by paul.moore #25407: Update PEP 4 to keep modules in Python 3 http://bugs.python.org/issue25407 opened by brett.cannon #25408: Consider dropping html5lib and spambayes from the default benc http://bugs.python.org/issue25408 opened by brett.cannon #25409: fnmatch.fnmatch normalizes slashes/backslashes on Windows http://bugs.python.org/issue25409 opened by The Compiler #25410: Clean up and fix OrderedDict http://bugs.python.org/issue25410 opened by serhiy.storchaka #25411: SMTPHandler in the logging module fails with unicode strings http://bugs.python.org/issue25411 opened by simon04 #25412: __floordiv__ in module fraction fails with TypeError instead o http://bugs.python.org/issue25412 opened by ShashkovS #25413: ctypes (libffi) fails to compile on Solaris X86 http://bugs.python.org/issue25413 opened by CristiFati #25414: Bypass unnecessary size limit test from deques on builds with http://bugs.python.org/issue25414 opened by rhettinger #25415: I can create instances of io.IOBase http://bugs.python.org/issue25415 opened by Gerrit.Holl #25416: Add encoding aliases from the (HTML5) Encoding Standard http://bugs.python.org/issue25416 opened by zwol #25417: Minor typo in Path.samefile docstring http://bugs.python.org/issue25417 opened by Antony.Lee #25419: Readline completion of module names in import statements http://bugs.python.org/issue25419 opened by martin.panter #25420: "import random" blocks on entropy collection http://bugs.python.org/issue25420 opened by matejcik Most recent 15 issues with no replies (15) ========================================== #25420: "import random" blocks on entropy collection http://bugs.python.org/issue25420 #25419: Readline completion of module names in import statements http://bugs.python.org/issue25419 #25417: Minor typo in Path.samefile docstring http://bugs.python.org/issue25417 #25416: Add encoding aliases from the (HTML5) Encoding Standard http://bugs.python.org/issue25416 #25413: ctypes (libffi) fails to compile on Solaris X86 http://bugs.python.org/issue25413 #25407: Update PEP 4 to keep modules in Python 3 http://bugs.python.org/issue25407 #25397: improve ac_cv_have_long_long_format GCC fallback http://bugs.python.org/issue25397 #25394: CoroWrapper breaks gen.throw http://bugs.python.org/issue25394 #25393: 'resource' module documentation error http://bugs.python.org/issue25393 #25387: sound_msgbeep doesn't check the return value of MessageBeep 
http://bugs.python.org/issue25387 #25366: test_venv fails with --without-threads http://bugs.python.org/issue25366 #25360: pyw should search for pythonw to implement #!/usr/bin/env pyt http://bugs.python.org/issue25360 #25355: Windows 3.5 installer does not add python to "App Paths" key http://bugs.python.org/issue25355 #25351: pyvenv activate script failure with specific bash option http://bugs.python.org/issue25351 #25348: Update pgo_build.bat to use --pgo flag for regrtest http://bugs.python.org/issue25348 Most recent 15 issues waiting for review (15) ============================================= #25419: Readline completion of module names in import statements http://bugs.python.org/issue25419 #25414: Bypass unnecessary size limit test from deques on builds with http://bugs.python.org/issue25414 #25413: ctypes (libffi) fails to compile on Solaris X86 http://bugs.python.org/issue25413 #25411: SMTPHandler in the logging module fails with unicode strings http://bugs.python.org/issue25411 #25410: Clean up and fix OrderedDict http://bugs.python.org/issue25410 #25402: More accurate estimation of the number of digits in int to dec http://bugs.python.org/issue25402 #25400: robotparser doesn't return crawl delay for default entry http://bugs.python.org/issue25400 #25394: CoroWrapper breaks gen.throw http://bugs.python.org/issue25394 #25388: tokenizer crash/misbehavior -- heap use-after-free http://bugs.python.org/issue25388 #25381: Doc: Use of old description of raise in Python3 http://bugs.python.org/issue25381 #25370: Add support of pickling very large bytes and str objects with http://bugs.python.org/issue25370 #25347: assert_has_calls output is formatted inconsistently http://bugs.python.org/issue25347 #25341: File mode wb+ appears as rb+ http://bugs.python.org/issue25341 #25338: urllib bypasses all hosts if proxyoverride includes an empty e http://bugs.python.org/issue25338 #25334: telnetlib: process_rawq() and binary data http://bugs.python.org/issue25334 Top 10 most discussed issues (10) ================================= #22005: datetime.__setstate__ fails decoding python2 pickle http://bugs.python.org/issue22005 11 msgs #25356: Idle (Python 3.4 on Ubuntu) does not allow typing accents http://bugs.python.org/issue25356 11 msgs #25194: Opt-in motivations & affiliations page for core contributors http://bugs.python.org/issue25194 10 msgs #25381: Doc: Use of old description of raise in Python3 http://bugs.python.org/issue25381 10 msgs #25359: io.open() fails to open ascii file if LANG env not set http://bugs.python.org/issue25359 9 msgs #25303: Add option to py_compile to compile for syntax checking withou http://bugs.python.org/issue25303 8 msgs #21159: configparser.InterpolationMissingOptionError is not very intui http://bugs.python.org/issue21159 6 msgs #25343: Document atomic operations on builtin types http://bugs.python.org/issue25343 6 msgs #25006: List pybind11 binding generator http://bugs.python.org/issue25006 5 msgs #25352: Add 'make this my default python' to windows installs for Pyth http://bugs.python.org/issue25352 5 msgs Issues closed (57) ================== #5380: pty.read raises IOError when slave pty device is closed http://bugs.python.org/issue5380 closed by gvanrossum #21165: Optimize str.translate() for replacement with substrings and n http://bugs.python.org/issue21165 closed by haypo #21264: test_compileall fails to build in the installed location http://bugs.python.org/issue21264 closed by brett.cannon #22179: Idle. 
Search dialog found text not highlited on Windows http://bugs.python.org/issue22179 closed by terry.reedy #22726: Idle: add help to config dialogs http://bugs.python.org/issue22726 closed by terry.reedy #24164: Support pickling objects with __new__ with keyword arguments w http://bugs.python.org/issue24164 closed by serhiy.storchaka #24402: input() uses sys.__stdout__ instead of sys.stdout for prompt http://bugs.python.org/issue24402 closed by martin.panter #24782: Merge 'configure extensions' into main IDLE config dialog http://bugs.python.org/issue24782 closed by terry.reedy #24784: Build fails with --with-pydebug and --without-threads http://bugs.python.org/issue24784 closed by python-dev #25001: Make --nowindows argument to regrtest propagate when running w http://bugs.python.org/issue25001 closed by haypo #25023: time.strftime('%a'), ValueError: embedded null byte, in ko loc http://bugs.python.org/issue25023 closed by steve.dower #25093: New 3.5.0 failure in test_tcl on win7 http://bugs.python.org/issue25093 closed by python-dev #25099: test_compileall fails when run by unprivileged user on install http://bugs.python.org/issue25099 closed by brett.cannon #25143: 3.5 install fails poorly on Windows XP http://bugs.python.org/issue25143 closed by steve.dower #25159: Regression in time to import a module http://bugs.python.org/issue25159 closed by serhiy.storchaka #25161: Missing periods at the end of sentences http://bugs.python.org/issue25161 closed by martin.panter #25163: Windows installer in AllUsers mode presents wrong installation http://bugs.python.org/issue25163 closed by steve.dower #25164: Windows default installation path is inconsistent between per- http://bugs.python.org/issue25164 closed by steve.dower #25207: ICC compiler warnings http://bugs.python.org/issue25207 closed by haypo #25210: Special-case NoneType() in do_richcompare() http://bugs.python.org/issue25210 closed by haypo #25254: Idle: debugger source line highlighting fails again http://bugs.python.org/issue25254 closed by terry.reedy #25274: sys.setrecursionlimit() must fail if the current recursion dep http://bugs.python.org/issue25274 closed by haypo #25277: test_sigwaitinfo() of test_eintr hangs on randomly on FreeBSD http://bugs.python.org/issue25277 closed by haypo #25322: contextlib.suppress not tested for nested usage http://bugs.python.org/issue25322 closed by martin.panter #25335: ast.literal_eval fails to parse numbers with leading "+" http://bugs.python.org/issue25335 closed by terry.reedy #25344: Enhancement to Logging - Logging Stack http://bugs.python.org/issue25344 closed by python-dev #25349: Use _PyBytesWriter for bytes%args http://bugs.python.org/issue25349 closed by haypo #25353: Use _PyBytesWriter for unicode escape and raw unicode escape e http://bugs.python.org/issue25353 closed by haypo #25354: test_datetime failing http://bugs.python.org/issue25354 closed by tim.peters #25357: Add an optional newline parameter to binascii.b2a_base64() to http://bugs.python.org/issue25357 closed by haypo #25358: Unexpected behaviour when converting large float to int http://bugs.python.org/issue25358 closed by eryksun #25361: Is python-3-5-0.exe compiled with SSE2 instrutions? 
If so shou http://bugs.python.org/issue25361 closed by steve.dower #25362: In threading module, use with instead of try finally http://bugs.python.org/issue25362 closed by python-dev #25363: x=[1,2,3].append([1,2,3]) bug http://bugs.python.org/issue25363 closed by eric.smith #25364: zipfile broken with --without-threads http://bugs.python.org/issue25364 closed by serhiy.storchaka #25365: test_pickle fails with --without-threads http://bugs.python.org/issue25365 closed by serhiy.storchaka #25367: test_coroutines fails with --without-threads http://bugs.python.org/issue25367 closed by python-dev #25368: test_eintr fails with --without-threads http://bugs.python.org/issue25368 closed by python-dev #25369: test_regrtest fails with --without-threads http://bugs.python.org/issue25369 closed by python-dev #25371: select.select docstring needs comma http://bugs.python.org/issue25371 closed by python-dev #25372: load_module() does not link submodule to parent package http://bugs.python.org/issue25372 closed by brett.cannon #25373: test.regrtest: -jN (with N != 1) + --slow + child error or int http://bugs.python.org/issue25373 closed by python-dev #25374: Deficiencies in type hint usage in Python standard libraries http://bugs.python.org/issue25374 closed by r.david.murray #25375: Don't mention 2.2 in the 3.x docs http://bugs.python.org/issue25375 closed by python-dev #25378: Roundoff error on OS X http://bugs.python.org/issue25378 closed by ezio.melotti #25379: Changes in traceback broke existing code (Python 3.5) http://bugs.python.org/issue25379 closed by berker.peksag #25380: Incorrect protocol for the STACK_GLOBAL opcode http://bugs.python.org/issue25380 closed by serhiy.storchaka #25382: pickletools.dis(): output memo id for MEMOIZE http://bugs.python.org/issue25382 closed by serhiy.storchaka #25383: Docs: Broken link http://bugs.python.org/issue25383 closed by berker.peksag #25384: Use _PyBytesWriter in the binascii module http://bugs.python.org/issue25384 closed by haypo #25389: It crashes as long as I press "(parenthese) http://bugs.python.org/issue25389 closed by ned.deily #25391: difflib.SequenceMatcher(...).ratio gives bad/wrong/unexpected http://bugs.python.org/issue25391 closed by tim.peters #25396: A Python runtime not could be located. http://bugs.python.org/issue25396 closed by zach.ware #25399: Optimize bytearray % args http://bugs.python.org/issue25399 closed by haypo #25401: Optimize bytes.fromhex() and bytearray.fromhex() http://bugs.python.org/issue25401 closed by haypo #25406: OrderedDict.move_to_end may cause crash in python 3.5 http://bugs.python.org/issue25406 closed by serhiy.storchaka #25418: Minor markup issue in reference/datamodel docs http://bugs.python.org/issue25418 closed by berker.peksag From steve at pearwood.info Fri Oct 16 18:26:46 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 17 Oct 2015 03:26:46 +1100 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: <20151016162645.GE11980@ando.pearwood.info> On Fri, Oct 16, 2015 at 06:35:14PM +0300, Serhiy Storchaka wrote: > I suggest to add only randrange(). randint() is historical artefact, we > shouldn't repeat this mistake in new module. The secrets module is not > good way to generate dice rolls. In most other cases you need to > generate integers in half-open interval [0; N). > > And randbelow() is absolute redundant. 
Random._randbelow() is > implementation detail and I inclined to get rid of it (implementing > randrange() in C instead). This was discussed on Python-Ideas, and there was little consensus there either. (Looks like Tim Peters' prediction is coming true :-) Putting aside your inflammatory description of randint() as a "mistake", if you are correct that in most cases people will need to generate integers in the half-open interval [0...n) then we should keep randbelow, since that is precisely what it does. randrange([start=0,] end [, step=1]) is a complex API. It can take one, two or three arguments, like range. Are there any use-cases for providing the step argument? If not, then why offer such a complex API that will never be used? Personally, I have no sense of which of the three functions will be most useful, but if you are right about the half-open [0...n) interval, then randbelow seems to be the right API to offer. But I have seen people argue in favour of randint, and others argue in favour of randrange. Given that these are just thin wrappers or aliases to methods of random.SystemRandom, I don't think there is any harm in providing all three. I've also raised this issue on the python-list mailing list. -- Steve From storchaka at gmail.com Fri Oct 16 20:29:56 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 16 Oct 2015 21:29:56 +0300 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016162645.GE11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> Message-ID: On 16.10.15 19:26, Steven D'Aprano wrote: > On Fri, Oct 16, 2015 at 06:35:14PM +0300, Serhiy Storchaka wrote: >> I suggest to add only randrange(). randint() is historical artefact, we >> shouldn't repeat this mistake in new module. The secrets module is not >> good way to generate dice rolls. In most other cases you need to >> generate integers in half-open interval [0; N). >> >> And randbelow() is absolute redundant. Random._randbelow() is >> implementation detail and I inclined to get rid of it (implementing >> randrange() in C instead). > > This was discussed on Python-Ideas, and there was little consensus there > either. (Looks like Tim Peters' prediction is coming true :-) > > Putting aside your inflammatory description of randint() as a "mistake", > if you are correct that in most cases people will need to generate > integers in the half-open interval [0...n) then we should keep > randbelow, since that is precisely what it does. Andrew explained the history of the issue (http://permalink.gmane.org/gmane.comp.python.ideas/36437). randrange was added in 61464037da53 to address a problem with unpythonic randint. > Personally, I have no sense of which of the three functions will be most > useful, but if you are right about the half-open [0...n) interval, then > randbelow seems to be the right API to offer. But I have seen people > argue in favour of randint, and others argue in favour of randrange. > Given that these are just thin wrappers or aliases to methods of > random.SystemRandom, I don't think there is any harm in providing all > three. Yes, randbelow provides simpler API, but randrange is more familiar for Python users due to similarity to range and because it is the public API in the random module (unlike to randbelow). 
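[For concreteness: the functions being argued over are thin wrappers around
random.SystemRandom, which draws from os.urandom(). A minimal sketch along
the lines of the PEP 506 draft under discussion -- the names randbelow and
token_bytes and the 32-byte DEFAULT_ENTROPY follow the thread above, but
this is an illustration, not the module's actual implementation:

    import os
    import random

    _sysrand = random.SystemRandom()

    DEFAULT_ENTROPY = 32  # default token size in bytes (assumed here)

    def randbelow(exclusive_upper_bound):
        # Uniform integer in [0, exclusive_upper_bound).  CPython's
        # SystemRandom.randrange() uses rejection sampling on
        # getrandbits(), so there is no modulo bias.
        return _sysrand.randrange(exclusive_upper_bound)

    def token_bytes(nbytes=None):
        # nbytes of raw randomness straight from the OS CSPRNG.
        if nbytes is None:
            nbytes = DEFAULT_ENTROPY
        return os.urandom(nbytes)

With this sketch, randbelow(n) picks uniformly from 0 <= k < n, which is
exactly what SystemRandom().randrange(n) already does.]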
From guido at python.org Fri Oct 16 20:33:07 2015 From: guido at python.org (Guido van Rossum) Date: Fri, 16 Oct 2015 11:33:07 -0700 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> Message-ID: Single-argument randrange(n) is the same as randbelow(n), right? I don't see any reason to have randbelow() if that's true. On Fri, Oct 16, 2015 at 11:29 AM, Serhiy Storchaka wrote: > On 16.10.15 19:26, Steven D'Aprano wrote: > >> On Fri, Oct 16, 2015 at 06:35:14PM +0300, Serhiy Storchaka wrote: >> >>> I suggest to add only randrange(). randint() is historical artefact, we >>> shouldn't repeat this mistake in new module. The secrets module is not >>> good way to generate dice rolls. In most other cases you need to >>> generate integers in half-open interval [0; N). >>> >>> And randbelow() is absolute redundant. Random._randbelow() is >>> implementation detail and I inclined to get rid of it (implementing >>> randrange() in C instead). >>> >> >> This was discussed on Python-Ideas, and there was little consensus there >> either. (Looks like Tim Peters' prediction is coming true :-) >> >> Putting aside your inflammatory description of randint() as a "mistake", >> if you are correct that in most cases people will need to generate >> integers in the half-open interval [0...n) then we should keep >> randbelow, since that is precisely what it does. >> > > Andrew explained the history of the issue ( > http://permalink.gmane.org/gmane.comp.python.ideas/36437). randrange was > added in 61464037da53 to address a problem with unpythonic randint. > > Personally, I have no sense of which of the three functions will be most >> useful, but if you are right about the half-open [0...n) interval, then >> randbelow seems to be the right API to offer. But I have seen people >> argue in favour of randint, and others argue in favour of randrange. >> Given that these are just thin wrappers or aliases to methods of >> random.SystemRandom, I don't think there is any harm in providing all >> three. >> > > Yes, randbelow provides simpler API, but randrange is more familiar for > Python users due to similarity to range and because it is the public API in > the random module (unlike to randbelow). > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sat Oct 17 11:50:50 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 17 Oct 2015 20:50:50 +1100 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016162645.GE11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> Message-ID: <20151017095050.GA3725@ando.pearwood.info> On Sat, Oct 17, 2015 at 03:26:46AM +1100, Steven D'Aprano wrote: > On Fri, Oct 16, 2015 at 06:35:14PM +0300, Serhiy Storchaka wrote: > > > I suggest to add only randrange(). randint() is historical artefact, we > > shouldn't repeat this mistake in new module. The secrets module is not > > good way to generate dice rolls. In most other cases you need to > > generate integers in half-open interval [0; N). > > > > And randbelow() is absolute redundant. 
Random._randbelow() is > > implementation detail and I inclined to get rid of it (implementing > > randrange() in C instead). > > This was discussed on Python-Ideas, and there was little consensus there > either. (Looks like Tim Peters' prediction is coming true :-) [...] > I've also raised this issue on the python-list mailing list. I've had some feedback on python-list. To summarise the various positions expressed so far: randbelow only: 3 in favour randint only: 1 neutral (neither opposed nor in favour) randrange only: 1 in favour, 1 against both randrange and randint: 2 in favour (Total number of comments is more than the total number of posts, as some people expressed more than one opinion in the same post. As in, "I prefer X, but Y would be good too".) So you can see there is nothing even close to consensus as to which API is best, which is an argument for keeping all three functions. But significanly, only *one* of the commenters has claimed to have any significant experience in crypto work, and I will quote him: Having done quite a bit of serious crypto implementation over the past 25 years, I don't recall ever wanting anything like randrange, and if I did need it, I'd probably build it inline from randbelow rather than force some hapless future code maintainer to look up the specs on randrange. My opinion, FWIW: I like randbelow, because in modern crypto one very frequently works with integers in the range [0,M-1] for some large modulus M, and there is a constant risk of asking for something in [0,M] when one meant [0,M-1]. One can eliminate this risk, as randbelow does, by building in the -1, which normally introduces a risk of making a mistake that gives you [0,M-2], but the name "randbelow" seems like a neat fix to that problem. -- Peter Pearson This matches what Serhiy suggests: in crypto, one normally only needs to generate the half-open interval [0...n). It also matches the reason why Tim Peters added randbelow in the first place. As the author of the PEP, I'm satisfied by this argument, and will now state that my preferred option is to drop randint and randrange, and just keep randbelow. My second choice is to keep all three functions. I think it is fair to say that out of the three functions, there is consensus that randbelow has the most useful functionality in a crypto context. Otherwise, people seem roughly equally split between the three functions. There doesn't seem to be any use-case for the three argument version of randrange. -- Steve From nemodevops at gmail.com Sat Oct 17 15:22:13 2015 From: nemodevops at gmail.com (Nemo Nautilius) Date: Sat, 17 Oct 2015 18:52:13 +0530 Subject: [Python-Dev] Conversion of a standard unicode string to a bit string in Python Message-ID: Hi All, I'm currently programming a set of crypto challenges in order to get a deeper understanding of python and crypto. The problem is to break a repeating key xor data (in a file). In order to do that I need a function to calculate the hamming distance between two strings. To find that one needs to find the differing number of *bits* in a string. Any ideas on how to manipulate the string at bit level? This is my first time in writing a question to the mailing list so please let me know anything that I need to keep in mind while asking questions. Thanks in advance. Gracias Nemo -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brg at gladman.plus.com Sat Oct 17 13:14:14 2015 From: brg at gladman.plus.com (Brian Gladman) Date: Sat, 17 Oct 2015 12:14:14 +0100 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151017095050.GA3725@ando.pearwood.info> References: <20151017095050.GA3725@ando.pearwood.info> Message-ID: <56222D86.1090102@gladman.plus.com> > On Sat, Oct 17, 2015 at 03:26:46AM +1100, Steven D'Aprano wrote: [snip] > But significanly, only *one* of the commenters has claimed to have > any significant experience in crypto work, and I will quote him: I didn't specifically claim the experience you requested in responding to your post on comp.lang.python because I thought that this was implied in making a response. In fact I have 30+ years of experience in implementing cryptographic code (much involving random numbers) so there were at least two respondents who could have made this claim. For the record, I consider it desirable in code involving security to exhibit the minimum functionality neccessary to get a job done. This is because funtionality and security very often work against each other in building secure systems. I hence support your conclusion that the module should offer randbelow alone. I would oppose offering randomrange (or offering more than one of them) since this will pretty well guarantee that, sooner or later, someone will make a mistake in using the extra functionality and possibly deploy an insecure application as a result. Brian Gladman From mertz at gnosis.cx Sat Oct 17 17:52:30 2015 From: mertz at gnosis.cx (David Mertz) Date: Sat, 17 Oct 2015 08:52:30 -0700 Subject: [Python-Dev] Conversion of a standard unicode string to a bit string in Python In-Reply-To: References: Message-ID: This list is for discussion of development of the Python core language and standard libraries, not for development *using* Python. It sounds like you should probably do your homework problem on your own, actually, but if you seek advice, something like StackOverflow or python-list are likely to be more appropriate. On Sat, Oct 17, 2015 at 6:22 AM, Nemo Nautilius wrote: > Hi All, > I'm currently programming a set of crypto challenges in order to get a > deeper understanding of python and crypto. The problem is to break a > repeating key xor data (in a file). In order to do that I need a function > to calculate the hamming distance between two strings. To find that one > needs to find the differing number of *bits* in a string. Any ideas on how > to manipulate the string at bit level? > > This is my first time in writing a question to the mailing list so please > let me know anything that I need to keep in mind while asking questions. > Thanks in advance. > > Gracias > Nemo > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > > -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From random832 at fastmail.com  Sat Oct 17 18:54:51 2015
From: random832 at fastmail.com (Random832)
Date: Sat, 17 Oct 2015 12:54:51 -0400
Subject: [Python-Dev] PEP 506 secrets module
References: <20151017095050.GA3725@ando.pearwood.info>
 <56222D86.1090102@gladman.plus.com>
Message-ID: <87vba5306c.fsf@fastmail.com>

Brian Gladman writes:
>> On Sat, Oct 17, 2015 at 03:26:46AM +1100, Steven D'Aprano wrote:
> I hence support your conclusion that the module should offer randbelow
> alone.  I would oppose offering randomrange (or offering more than one
> of them) since this will pretty well guarantee that, sooner or later,
> someone will make a mistake in using the extra functionality and
> possibly deploy an insecure application as a result.
>
>    Brian Gladman

Plus if someone really does want randrange, they can simply do this:

def randfrom(seq):
    # uniform choice from a sequence, built on the proposed randbelow()
    return seq[randbelow(len(seq))]

def randrange(start, stop, step=1):
    # range() does not accept step=None, and the result must be returned
    return randfrom(range(start, stop, step))

These are simple recipes that probably don't belong in the module.

From guido at python.org  Sat Oct 17 21:51:39 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 17 Oct 2015 12:51:39 -0700
Subject: [Python-Dev] PEP 506 secrets module
In-Reply-To: <20151017095050.GA3725@ando.pearwood.info>
References: <20151016005711.GC11980@ando.pearwood.info>
 <20151016162645.GE11980@ando.pearwood.info>
 <20151017095050.GA3725@ando.pearwood.info>
Message-ID: 

On Sat, Oct 17, 2015 at 2:50 AM, Steven D'Aprano wrote:

> [...]
> So you can see there is nothing even close to consensus as to which API
> is best, which is an argument for keeping all three functions.
>

No, that's not how we do it in Python. :-)


> But significanly, only *one* of the commenters has claimed to have any
> significant experience in crypto work, and I will quote him:
>
>     Having done quite a bit of serious crypto implementation over the
>     past 25 years, I don't recall ever wanting anything like randrange,
>     and if I did need it, I'd probably build it inline from randbelow
>     rather than force some hapless future code maintainer to look up the
>     specs on randrange.
>
>     My opinion, FWIW: I like randbelow, because in modern crypto one
>     very frequently works with integers in the range [0,M-1] for some
>     large modulus M, and there is a constant risk of asking for
>     something in [0,M] when one meant [0,M-1].  One can eliminate this
>     risk, as randbelow does, by building in the -1, which normally
>     introduces a risk of making a mistake that gives you [0,M-2], but
>     the name "randbelow" seems like a neat fix to that problem.
>
>     -- Peter Pearson
>

It's not clear whether this correspondent realizes that randrange(N) is
identical to randbelow(N).


> This matches what Serhiy suggests: in crypto, one normally only needs to
> generate the half-open interval [0...n). It also matches the reason why
> Tim Peters added randbelow in the first place.
>
> As the author of the PEP, I'm satisfied by this argument, and will now
> state that my preferred option is to drop randint and randrange, and
> just keep randbelow.
>
> My second choice is to keep all three functions.
>
> I think it is fair to say that out of the three functions, there is
> consensus that randbelow has the most useful functionality in a crypto
> context. Otherwise, people seem roughly equally split between the three
> functions. There doesn't seem to be any use-case for the three argument
> version of randrange.
>

I'm fine with dropping the 3rd arg. But I find the argument to introduce a
new spelling for 1-arg randrange() weak.
I definitely thing that randint() is an attractive nuisance so we should drop that. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From random832 at fastmail.com Sat Oct 17 22:30:27 2015 From: random832 at fastmail.com (Random832) Date: Sat, 17 Oct 2015 16:30:27 -0400 Subject: [Python-Dev] PEP 506 secrets module References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> <20151017095050.GA3725@ando.pearwood.info> Message-ID: <8761252q70.fsf@fastmail.com> Guido van Rossum writes: > On Sat, Oct 17, 2015 at 2:50 AM, Steven D'Aprano > wrote: > > [...] > So you can see there is nothing even close to consensus as to > which API > is best, which is an argument for keeping all three functions. > > No, that's not how we do it in Python. :-) Ooh, I know this! It's an argument for scrapping the whole thing, right? From tim.peters at gmail.com Sat Oct 17 23:13:23 2015 From: tim.peters at gmail.com (Tim Peters) Date: Sat, 17 Oct 2015 16:13:23 -0500 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> <20151017095050.GA3725@ando.pearwood.info> Message-ID: [Steven D'Aprano] >> ... >> I think it is fair to say that out of the three functions, there is >> consensus that randbelow has the most useful functionality in a crypto >> context. Otherwise, people seem roughly equally split between the three >> functions. There doesn't seem to be any use-case for the three argument >> version of randrange. [Guido] > I'm fine with dropping the 3rd arg. But I find the argument to introduce a > new spelling for 1-arg randrange() weak. Note the inherent absurdity here ;-) That is, we're proposing to add a module _all_ of whose functions are merely trivial (to the educated) respellings of functions that already exist elsewhere. Is there, e.g., really "a strong" argument for giving a different way to spell os.urandom()? That depends. I'm playing along with the basic assumption: that this module intends to be as idiot-proof as possible. In which case, sure, rename away. And in which case, no, randbelow is not a new spelling for 1-arg randrange. To the contrary, all the integer-like functions (including randrange, randint, and choice) in the random module are currently built on top of the private Random._randbelow() method. The latter is "the right" fundamental building block for implementing uniform choice free of statistical bias. It's also overwhelmingly most useful _on its own_ in the `secrets` context, and has the crushing (to me, in this context) advantage that its very name strongly suggests its argument is not a possible return value. Giving a strongly mnemonic name to a function with a dead simple "one required integer argument" signature is likely "as idiot-proof as possible" as it gets. BTW, it also gives a simple answer to "I'm used to using arc4random_uniform() in OpenBSD - how do I spell that in `secrets`?". Answer: `randbelow()` is identical, except in Python the argument isn't limited to uint32_t". > I definitely thing that randint() is an attractive nuisance so we should > drop that. `randrange()` isn't a nuisance at all in Python, but its signature is more convoluted than putative users of the proposed module seem to want, and long experience has shown its name is not idiot-proof. 
`secrets` users aren't looking to pick something uniformly from "a range" - they're looking to pick a non-negative integer less than some upper bound. Unneeded generalization beyond just that much is also an attractive nuisance, in context. "Simplest thing that can possibly suffice" is always a good way to start :-) From brg at gladman.plus.com Sat Oct 17 23:25:38 2015 From: brg at gladman.plus.com (Brian Gladman) Date: Sat, 17 Oct 2015 22:25:38 +0100 Subject: [Python-Dev] PEP 506 secrets module Message-ID: <5622BCD2.5090105@gladman.plus.com> > Guido van Rossum wrote: > I'm fine with dropping the 3rd arg. But I find the argument to > introduce a new spelling for 1-arg randrange() weak. I should stress that my preference for randbelow over randrange was based purely on their proposed functionality and not on their names. I do however have a preference for a function of minimum required functionality, i.e. one that allows only a single parameter (N) to set the range 0 .. N-1. From ericsnowcurrently at gmail.com Sat Oct 17 23:45:19 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Sat, 17 Oct 2015 15:45:19 -0600 Subject: [Python-Dev] type(obj) vs. obj.__class__ Message-ID: In a recent tracker issue about OrderedDict [1] we've had some discussion about the use of type(od) as a replacement for od.__class__. It came up because the pure Python implementation of OrderedDict uses self.__class__ in 3 different methods (__repr__, __reduce__, and copy). The patch in that issue changes the C implementation to use Py_TYPE(). [2] So I wanted to get some feedback on the practical implications of such a change and if we need to clarify the difference more formally. In this specific case [3] there are 3 questions: * Should __repr__() for a stdlib class use type(self).__name__ or self.__class__.__name__? * Should __reduce__() return type(self) or self.__class__? * Should copy() use type(self) or self.__class__? The more general question of when we use type(obj) vs. obj.__class__ applies to both the language and to all the stdlib as I expect consistency there would result in fewer surprises. I realize that there are some places where using obj.__class__ makes more sense (e.g. for some proxy support). There are other places where using type(obj) is the way to go (e.g. special method lookup). However, the difference is muddled enough that usage is inconsistent in the stdlib. For example, C-implemented types use Py_TYPE() almost exclusively. So, would it make sense to establish some concrete guidelines about when to use type(obj) vs. obj.__class__? If so, what would those be? It may also be helpful to enumerate use cases for "type(obj) is not obj.__class__". -eric [1] http://bugs.python.org/issue25410 [2] I'm going to open a separate thread about the issue of compatibility and C accelerated types. [3] https://hg.python.org/cpython/file/default/Lib/collections/__init__.py#l238 From ericsnowcurrently at gmail.com Sun Oct 18 00:20:34 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Sat, 17 Oct 2015 16:20:34 -0600 Subject: [Python-Dev] compatibility for C-accelerated types Message-ID: A recent discussion in a tracker issue [1] brought up the matter of compatibility between the pure Python implementation of OrderedDict and the new C implementation. In working on that port I stuck as closely as possible to the Python implementation. This meant some parts of the code are bit more complex than they would be otherwise. (Serhiy has been kind enough to do some cleanup.) 
Compatibility was one of the fundamental goals of the porting effort. Not only does compatibility make sense but it's also specifically required by PEP 399 [2]: Any new accelerated code must act as a drop-in replacement as close to the pure Python implementation as reasonable. Technical details of the VM providing the accelerated code are allowed to differ as necessary, e.g., a class being a type when implemented in C. For the most part I have questions about what is "reasonable", specifically in relation to OrderedDict. I've already opened up a separate thread related to my main question: type(obj) vs. obj.__class__. [3] In the tracker issue, Serhiy pointed out: There is no a difference. io, pickle, ElementTree, bz2, virtually all accelerator classes was created as replacements of pure Python implementations. All C implementations use Py_TYPE(self) for repr() and pickling. I think this deviation is common and acceptable. In a review comment on the associated patch he said: Isn't type(self) is always the same as self.__class__ for pure Python class? If right, then this change doesn't have any effect. To which he later replied: It is the same if you assigned the __class__ attribute, but can be different if set __class__ in the subclass declaration. So it isn't clear if that is a compatibility break or how much so it might be. Serhiy also noted that, as of 3.5 [4], you can no longer assign to obj.__class__ for instances of subclasses of builtin (non-heap) types. So that is another place where the two OrderedDict implementations differ. I expect there are a few others in dark corner cases. On the tracker he notes another OrderedDict compatibility break: Backward compatibility related to __class__ assignment was already broken in C implementation. In 3.4 following code works: >>> from collections import * >>> class foo(OrderedDict): ... def bark(self): return "spam" ... >>> class bar(OrderedDict): ... pass ... >>> od = bar() >>> od.__class__ = foo >>> od.bark() 'spam' In 3.5 it doesn't. As PEP 399 says, we should go as far "as reasonable" in the pursuit of compatibility. At the same time, I feel not insignificant responsibility for *any* incompatibility that comes from the C implementation of OrderedDict. The corner cases impacted by the above compatibility concerns are borderline enough that I wanted to get some feedback. Thanks! -eric [1] http://bugs.python.org/issue25410 [2] https://www.python.org/dev/peps/pep-0399/ [3] https://mail.python.org/pipermail/python-dev/2015-October/141953.html [4] http://bugs.python.org/issue24912 From storchaka at gmail.com Sun Oct 18 00:29:34 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sun, 18 Oct 2015 01:29:34 +0300 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: Message-ID: On 18.10.15 00:45, Eric Snow wrote: > In a recent tracker issue about OrderedDict [1] we've had some > discussion about the use of type(od) as a replacement for > od.__class__. It came up because the pure Python implementation of > OrderedDict uses self.__class__ in 3 different methods (__repr__, > __reduce__, and copy). The patch in that issue changes the C > implementation to use Py_TYPE(). [2] So I wanted to get some feedback > on the practical implications of such a change and if we need to > clarify the difference more formally. > > In this specific case [3] there are 3 questions: > > * Should __repr__() for a stdlib class use type(self).__name__ or > self.__class__.__name__? > * Should __reduce__() return type(self) or self.__class__? 
> * Should copy() use type(self) or self.__class__?
>
> The more general question of when we use type(obj) vs. obj.__class__
> applies to both the language and to all the stdlib as I expect
> consistency there would result in fewer surprises.  I realize that
> there are some places where using obj.__class__ makes more sense (e.g.
> for some proxy support).  There are other places where using type(obj)
> is the way to go (e.g. special method lookup).  However, the
> difference is muddled enough that usage is inconsistent in the stdlib.
> For example, C-implemented types use Py_TYPE() almost exclusively.
>
> So, would it make sense to establish some concrete guidelines about
> when to use type(obj) vs. obj.__class__?  If so, what would those be?
> It may also be helpful to enumerate use cases for "type(obj) is not
> obj.__class__".
>
> -eric
>
>
> [1] http://bugs.python.org/issue25410
> [2] I'm going to open a separate thread about the issue of
> compatibility and C accelerated types.
> [3] https://hg.python.org/cpython/file/default/Lib/collections/__init__.py#l238

I want to add that in the common case type(obj) is the same as
obj.__class__.  When you set obj.__class__ (assignment is restricted by
issue24912), type(obj) is changed as well.  You can make obj.__class__
differ from type(obj) if you set __class__ as a class attribute at class
creation time, or make __class__ a property, or the like.

>>> class A: pass
...
>>> class B: __class__ = A
...
>>> type(B())
<class '__main__.B'>
>>> B().__class__
<class '__main__.A'>

The only places where obj.__class__ is made different from type(obj) in
the stdlib, besides tests, are the mock object (hence related to tests),
and the proxy stream in xml.sax.saxutils (I'm not sure that the latter
use is correct).

About pickling.  The default implementation of __reduce_ex__ uses
obj.__class__ in protocols 0 and 1, and type(obj) in protocols 2+.

>>> B().__reduce_ex__(1)
(<function _reconstructor at 0x...>, (<class '__main__.A'>, <class 'object'>, None))
>>> B().__reduce_ex__(2)
(<function __newobj__ at 0x...>, (<class '__main__.B'>,), None, None, None)

But the pickler rejects classes with mismatched type(obj) and
obj.__class__ in protocols 2+.

>>> pickle.dumps(B(), 2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
_pickle.PicklingError: args[0] from __newobj__ args has the wrong class

From storchaka at gmail.com  Sun Oct 18 00:38:21 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sun, 18 Oct 2015 01:38:21 +0300
Subject: [Python-Dev] compatibility for C-accelerated types
In-Reply-To: 
References: 
Message-ID: 

On 18.10.15 01:20, Eric Snow wrote:
> On the tracker he notes another OrderedDict compatibility break:
>
>     Backward compatibility related to __class__ assignment was
>     already broken in C implementation.  In 3.4 following code
>     works:
>
>     >>> from collections import *
>     >>> class foo(OrderedDict):
>     ...     def bark(self): return "spam"
>     ...
>     >>> class bar(OrderedDict):
>     ...     pass
>     ...
>     >>> od = bar()
>     >>> od.__class__ = foo
>     >>> od.bark()
>     'spam'
>
>     In 3.5 it doesn't.

Sorry, I was mistaken with this example.  It works in 3.5.

From gvanrossum at gmail.com  Sun Oct 18 01:03:35 2015
From: gvanrossum at gmail.com (Guido van Rossum)
Date: Sat, 17 Oct 2015 16:03:35 -0700
Subject: [Python-Dev] PEP 506 secrets module
In-Reply-To: <5622BCD2.5090105@gladman.plus.com>
References: <5622BCD2.5090105@gladman.plus.com>
Message-ID: 

Yes, randrange(n) does that. --Guido (mobile)
On Oct 17, 2015 2:28 PM, "Brian Gladman" wrote:

> > Guido van Rossum wrote:
>
> > I'm fine with dropping the 3rd arg. But I find the argument to
> > introduce a new spelling for 1-arg randrange() weak.
> > I should stress that my preference for randbelow over randrange was > based purely on their proposed functionality and not on their names. > > I do however have a preference for a function of minimum required > functionality, i.e. one that allows only a single parameter (N) to > set the range 0 .. N-1. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sun Oct 18 01:05:42 2015 From: guido at python.org (Guido van Rossum) Date: Sat, 17 Oct 2015 16:05:42 -0700 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> <20151016162645.GE11980@ando.pearwood.info> <20151017095050.GA3725@ando.pearwood.info> Message-ID: OK, so just randbelow() then. --Guido (mobile) On Oct 17, 2015 2:13 PM, "Tim Peters" wrote: > [Steven D'Aprano] > >> ... > >> I think it is fair to say that out of the three functions, there is > >> consensus that randbelow has the most useful functionality in a crypto > >> context. Otherwise, people seem roughly equally split between the three > >> functions. There doesn't seem to be any use-case for the three argument > >> version of randrange. > > [Guido] > > I'm fine with dropping the 3rd arg. But I find the argument to introduce > a > > new spelling for 1-arg randrange() weak. > > Note the inherent absurdity here ;-) That is, we're proposing to add > a module _all_ of whose functions are merely trivial (to the educated) > respellings of functions that already exist elsewhere. Is there, > e.g., really "a strong" argument for giving a different way to spell > os.urandom()? > > That depends. I'm playing along with the basic assumption: that this > module intends to be as idiot-proof as possible. In which case, sure, > rename away. > > And in which case, no, randbelow is not a new spelling for 1-arg > randrange. To the contrary, all the integer-like functions (including > randrange, randint, and choice) in the random module are currently > built on top of the private Random._randbelow() method. The latter is > "the right" fundamental building block for implementing uniform choice > free of statistical bias. It's also overwhelmingly most useful _on > its own_ in the `secrets` context, and has the crushing (to me, in > this context) advantage that its very name strongly suggests its > argument is not a possible return value. Giving a strongly mnemonic > name to a function with a dead simple "one required integer argument" > signature is likely "as idiot-proof as possible" as it gets. > > BTW, it also gives a simple answer to "I'm used to using > arc4random_uniform() in OpenBSD - how do I spell that in `secrets`?". > Answer: `randbelow()` is identical, except in Python the argument > isn't limited to uint32_t". > > > > I definitely thing that randint() is an attractive nuisance so we should > > drop that. > > `randrange()` isn't a nuisance at all in Python, but its signature is > more convoluted than putative users of the proposed module seem to > want, and long experience has shown its name is not idiot-proof. > `secrets` users aren't looking to pick something uniformly from "a > range" - they're looking to pick a non-negative integer less than some > upper bound. 
Unneeded generalization beyond just that much is also an > attractive nuisance, in context. > > "Simplest thing that can possibly suffice" is always a good way to start > :-) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sun Oct 18 07:55:00 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Sun, 18 Oct 2015 16:55:00 +1100 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: Message-ID: <20151018055500.GD3725@ando.pearwood.info> On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: > In a recent tracker issue about OrderedDict [1] we've had some > discussion about the use of type(od) as a replacement for > od.__class__. [...] > The more general question of when we use type(obj) vs. obj.__class__ > applies to both the language and to all the stdlib as I expect > consistency there would result in fewer surprises. I realize that > there are some places where using obj.__class__ makes more sense (e.g. > for some proxy support). There are other places where using type(obj) > is the way to go (e.g. special method lookup). However, the > difference is muddled enough that usage is inconsistent in the stdlib. > For example, C-implemented types use Py_TYPE() almost exclusively. > > So, would it make sense to establish some concrete guidelines about > when to use type(obj) vs. obj.__class__? If so, what would those be? > It may also be helpful to enumerate use cases for "type(obj) is not > obj.__class__". I for one would like to see a definitive explanation for when they are different, and when you should use one or the other. The only obvious example I've seen is the RingBuffer from the Python Cookbook: http://code.activestate.com/recipes/68429-ring-buffer/ -- Steve From guido at python.org Sun Oct 18 16:55:58 2015 From: guido at python.org (Guido van Rossum) Date: Sun, 18 Oct 2015 07:55:58 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: <20151018055500.GD3725@ando.pearwood.info> References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: It's mostly a historical accident -- for classic classes, type(inst) was always `instance' while inst.__class__ was the user-defined class object. For new-style classes, the idea is that you can write a proxy class that successfully masquerades as another class. Because __class__ is an attribute, a proxy class can fake this attribute. But type() would reveal the proxy class. IIRC __class__ is used by the isinstance() implementation, although the code is complicated and I wouldn't be surprised if isinstance(x, type(x)) was also true for proxy instances. (I haven't looked at the code in a long time and it's not easy to follow, alas.) C code that checks the type instead of __class__ is probably one reason why proxy classes have never taken off -- there just are too many exceptions, so the experience is never very smooth, and everyone ends up cursing the proxy class. Maybe this kind of "strong" proxy class is just not a good idea. And maybe then we needn't worry about the distinction between type() and __class__. On Sat, Oct 17, 2015 at 10:55 PM, Steven D'Aprano wrote: > On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: > > In a recent tracker issue about OrderedDict [1] we've had some > > discussion about the use of type(od) as a replacement for > > od.__class__. > [...] > > The more general question of when we use type(obj) vs. 
obj.__class__ > > applies to both the language and to all the stdlib as I expect > > consistency there would result in fewer surprises. I realize that > > there are some places where using obj.__class__ makes more sense (e.g. > > for some proxy support). There are other places where using type(obj) > > is the way to go (e.g. special method lookup). However, the > > difference is muddled enough that usage is inconsistent in the stdlib. > > For example, C-implemented types use Py_TYPE() almost exclusively. > > > > So, would it make sense to establish some concrete guidelines about > > when to use type(obj) vs. obj.__class__? If so, what would those be? > > It may also be helpful to enumerate use cases for "type(obj) is not > > obj.__class__". > > I for one would like to see a definitive explanation for when they are > different, and when you should use one or the other. The only > obvious example I've seen is the RingBuffer from the Python Cookbook: > > http://code.activestate.com/recipes/68429-ring-buffer/ > > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From vadmium+py at gmail.com Sun Oct 18 10:07:07 2015 From: vadmium+py at gmail.com (Martin Panter) Date: Sun, 18 Oct 2015 08:07:07 +0000 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: <20151018055500.GD3725@ando.pearwood.info> References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: On 18 October 2015 at 05:55, Steven D'Aprano wrote: > On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: >> So, would it make sense to establish some concrete guidelines about >> when to use type(obj) vs. obj.__class__? If so, what would those be? >> It may also be helpful to enumerate use cases for "type(obj) is not >> obj.__class__". > > I for one would like to see a definitive explanation for when they are > different, and when you should use one or the other. The only > obvious example I've seen is the RingBuffer from the Python Cookbook: > > http://code.activestate.com/recipes/68429-ring-buffer/ It looks like this example just assigns to the existing __class__ attribute, to switch to a different class. I haven?t seen this ability mentioned in the documentation, but I suspect it is meant to be supported. However assigning to __class__ like that should automatically update the type() return value, so type(ring_buffer) == ring_buffer.__class__ is still maintained. Perhaps some of this confusion comes from Python 2. I don?t know the details, but I know in Python 2, type() can do something different, so you have to use __class__ directly if you want to be compatible with Python 2 classes. But in Python 3 code I prefer using direct function calls like type() to ?special attributes? like __class__ where practical. The documentation says that __*__ names are reserved for Python and its built-in library, rather than user code. So user code that creates a class attribute or property called __class__ is asking for trouble IMO, and we shouldn?t spend much effort accommodating such cases. For __repr__() I would use type(), which seems to agree with what object.__repr__() uses. 
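[To make the masquerading case concrete: a minimal proxy sketch in which
type(obj) and obj.__class__ deliberately disagree.  The Proxy class below
is illustrative only (this is the pattern unittest.mock uses for spec'd
mocks), not code from the stdlib:

    class Proxy:
        """Forward attribute access and masquerade as the target's class."""

        def __init__(self, target):
            self._target = target

        @property
        def __class__(self):
            # Report the wrapped object's class instead of Proxy.
            return type(self._target)

        def __getattr__(self, name):
            # Unknown attributes are delegated to the wrapped object.
            return getattr(self._target, name)

    p = Proxy([1, 2, 3])
    print(p.__class__)          # <class 'list'>  -- the masquerade
    print(isinstance(p, list))  # True: isinstance() also consults __class__
    print(type(p))              # <class '__main__.Proxy'> -- type() is not fooled

Code that checks type(p) (or Py_TYPE in C) sees through the proxy, which
is exactly the friction described above.]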
From mertz at gnosis.cx Sun Oct 18 17:45:08 2015 From: mertz at gnosis.cx (David Mertz) Date: Sun, 18 Oct 2015 08:45:08 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: <20151018055500.GD3725@ando.pearwood.info> References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: This recipe looks like a bad design to me to start with. It's too-clever-by-half, IMO. If I were to implement RingBuffer, I wouldn't futz around with the __class__ attribute to change it into another thing when it was full. A much more obvious API for users would be simply to implement a RingBuffer.isfull() method, perhaps supported by an underlying RingBuffer._full boolean attribute. That's much friendlier than expecting people to introspect the type of the thing for a question that only occasionally matters; and when it does matter, the question is always conceived exactly as "Is it full?" not "What class is this currently?" So I think I'm still waiting for a compelling example where type(x) != x.__class__ would be worthwhile (yes, of course it's *possible*) On Sat, Oct 17, 2015 at 10:55 PM, Steven D'Aprano wrote: > On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: > > In a recent tracker issue about OrderedDict [1] we've had some > > discussion about the use of type(od) as a replacement for > > od.__class__. > [...] > > The more general question of when we use type(obj) vs. obj.__class__ > > applies to both the language and to all the stdlib as I expect > > consistency there would result in fewer surprises. I realize that > > there are some places where using obj.__class__ makes more sense (e.g. > > for some proxy support). There are other places where using type(obj) > > is the way to go (e.g. special method lookup). However, the > > difference is muddled enough that usage is inconsistent in the stdlib. > > For example, C-implemented types use Py_TYPE() almost exclusively. > > > > So, would it make sense to establish some concrete guidelines about > > when to use type(obj) vs. obj.__class__? If so, what would those be? > > It may also be helpful to enumerate use cases for "type(obj) is not > > obj.__class__". > > I for one would like to see a definitive explanation for when they are > different, and when you should use one or the other. The only > obvious example I've seen is the RingBuffer from the Python Cookbook: > > http://code.activestate.com/recipes/68429-ring-buffer/ > > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pludemann at google.com Mon Oct 19 02:09:03 2015 From: pludemann at google.com (Peter Ludemann) Date: Sun, 18 Oct 2015 17:09:03 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: I re-coded the "too clever by half" RingBuffer to use the same design but with delegation ... and it ran 50% slower. 
(Code available on request) Then I changed it to switch implementations of append() and get() when it got full (the code is below) and it ran at essentially the same speed as the original. So, there's no need to be so clever with __class__. Of course, this trick of replacing a method is also "too clever by half"; but an instance variable for "full" slows it down by 15%. class RingBuffer(object): def __init__(self, size_max): self.max = size_max self.data = [] self.cur = 0 def append(self, x): self.data.append(x) if len(self.data) == self.max: self.append = self.append_full def append_full(self, x): self.data[self.cur] = x self.cur = (self.cur + 1) % self.max def get(self): return self.data[self.cur:] + self.data[:self.cur] On 18 October 2015 at 08:45, David Mertz wrote: > This recipe looks like a bad design to me to start with. It's > too-clever-by-half, IMO. > > If I were to implement RingBuffer, I wouldn't futz around with the > __class__ attribute to change it into another thing when it was full. A > much more obvious API for users would be simply to implement a > RingBuffer.isfull() method, perhaps supported by an underlying > RingBuffer._full boolean attribute. That's much friendlier than expecting > people to introspect the type of the thing for a question that only > occasionally matters; and when it does matter, the question is always > conceived exactly as "Is it full?" not "What class is this currently?" > > So I think I'm still waiting for a compelling example where type(x) != > x.__class__ would be worthwhile (yes, of course it's *possible*) > > On Sat, Oct 17, 2015 at 10:55 PM, Steven D'Aprano > wrote: > >> On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: >> > In a recent tracker issue about OrderedDict [1] we've had some >> > discussion about the use of type(od) as a replacement for >> > od.__class__. >> [...] >> > The more general question of when we use type(obj) vs. obj.__class__ >> > applies to both the language and to all the stdlib as I expect >> > consistency there would result in fewer surprises. I realize that >> > there are some places where using obj.__class__ makes more sense (e.g. >> > for some proxy support). There are other places where using type(obj) >> > is the way to go (e.g. special method lookup). However, the >> > difference is muddled enough that usage is inconsistent in the stdlib. >> > For example, C-implemented types use Py_TYPE() almost exclusively. >> > >> > So, would it make sense to establish some concrete guidelines about >> > when to use type(obj) vs. obj.__class__? If so, what would those be? >> > It may also be helpful to enumerate use cases for "type(obj) is not >> > obj.__class__". >> >> I for one would like to see a definitive explanation for when they are >> different, and when you should use one or the other. The only >> obvious example I've seen is the RingBuffer from the Python Cookbook: >> >> http://code.activestate.com/recipes/68429-ring-buffer/ >> >> >> >> -- >> Steve >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx >> > > > > -- > Keeping medicines from the bloodstreams of the sick; food > from the bellies of the hungry; books from the hands of the > uneducated; technology from the underdeveloped; and putting > advocates of freedom in prisons. 
Intellectual property is > to the 21st century what the slave trade was to the 16th. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/pludemann%40google.com > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mertz at gnosis.cx Mon Oct 19 02:35:14 2015 From: mertz at gnosis.cx (David Mertz) Date: Sun, 18 Oct 2015 17:35:14 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: I'm not sure what benchmark you used to define the speed of RingBuffer. I'm sure you are reporting numbers accurately for your tests, but there are "lies, damn lies, and benchmarks", so "how fast" has a lot of nuance to it. In any case, redefining a method in a certain situation feels a lot less magic to me than redefining .__class__, and clarity and good API are much more important than micro-optimization for something unlikely to be on a critical path. That's interesting about the `self._full` variable slowing it down, I think I'm not surprised (but obviously it depends on just how it's used). But one can also simply define RingBuffer.isfull() using `self.max==len(self.data)` if you prefer that approach. I doubt `myringbuffer.isfull()` is something you need to call in an inner loop. That said, I think my implementation of RingBuffer would probably look more like (completely untested): class RingBuffer(object): def __init__(self, size_max): self.data = [None] * size_max self.size_max = size_max self.used = 0 self.cur = 0 def append(self, val): self.data[self.cur] = val self.cur = (self.cur+1) % self.size_max self.used = max(self.used, self.cur+1) def isfull(self): self.used == self.size_max Feel free to try this version against whatever benchmark you have in mind. On Sun, Oct 18, 2015 at 5:09 PM, Peter Ludemann wrote: > I re-coded the "too clever by half" RingBuffer to use the same design but > with delegation ... and it ran 50% slower. (Code available on request) > Then I changed it to switch implementations of append() and get() when it > got full (the code is below) and it ran at essentially the same speed as > the original. So, there's no need to be so clever with __class__. Of > course, this trick of replacing a method is also "too clever by half"; but > an instance variable for "full" slows it down by 15%. > > class RingBuffer(object): > def __init__(self, size_max): > self.max = size_max > self.data = [] > self.cur = 0 > def append(self, x): > self.data.append(x) > if len(self.data) == self.max: > self.append = self.append_full > def append_full(self, x): > self.data[self.cur] = x > self.cur = (self.cur + 1) % self.max > def get(self): > return self.data[self.cur:] + self.data[:self.cur] > > > > On 18 October 2015 at 08:45, David Mertz wrote: > >> This recipe looks like a bad design to me to start with. It's >> too-clever-by-half, IMO. >> >> If I were to implement RingBuffer, I wouldn't futz around with the >> __class__ attribute to change it into another thing when it was full. A >> much more obvious API for users would be simply to implement a >> RingBuffer.isfull() method, perhaps supported by an underlying >> RingBuffer._full boolean attribute. 
That's much friendlier than expecting >> people to introspect the type of the thing for a question that only >> occasionally matters; and when it does matter, the question is always >> conceived exactly as "Is it full?" not "What class is this currently?" >> >> So I think I'm still waiting for a compelling example where type(x) != >> x.__class__ would be worthwhile (yes, of course it's *possible*) >> >> On Sat, Oct 17, 2015 at 10:55 PM, Steven D'Aprano >> wrote: >> >>> On Sat, Oct 17, 2015 at 03:45:19PM -0600, Eric Snow wrote: >>> > In a recent tracker issue about OrderedDict [1] we've had some >>> > discussion about the use of type(od) as a replacement for >>> > od.__class__. >>> [...] >>> > The more general question of when we use type(obj) vs. obj.__class__ >>> > applies to both the language and to all the stdlib as I expect >>> > consistency there would result in fewer surprises. I realize that >>> > there are some places where using obj.__class__ makes more sense (e.g. >>> > for some proxy support). There are other places where using type(obj) >>> > is the way to go (e.g. special method lookup). However, the >>> > difference is muddled enough that usage is inconsistent in the stdlib. >>> > For example, C-implemented types use Py_TYPE() almost exclusively. >>> > >>> > So, would it make sense to establish some concrete guidelines about >>> > when to use type(obj) vs. obj.__class__? If so, what would those be? >>> > It may also be helpful to enumerate use cases for "type(obj) is not >>> > obj.__class__". >>> >>> I for one would like to see a definitive explanation for when they are >>> different, and when you should use one or the other. The only >>> obvious example I've seen is the RingBuffer from the Python Cookbook: >>> >>> http://code.activestate.com/recipes/68429-ring-buffer/ >>> >>> >>> >>> -- >>> Steve >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx >>> >> >> >> >> -- >> Keeping medicines from the bloodstreams of the sick; food >> from the bellies of the hungry; books from the hands of the >> uneducated; technology from the underdeveloped; and putting >> advocates of freedom in prisons. Intellectual property is >> to the 21st century what the slave trade was to the 16th. >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/pludemann%40google.com >> >> > -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Mon Oct 19 02:41:44 2015 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 19 Oct 2015 11:41:44 +1100 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: On Mon, Oct 19, 2015 at 11:35 AM, David Mertz wrote: > That's interesting about the `self._full` variable slowing it down, I think > I'm not surprised (but obviously it depends on just how it's used). 
But one > can also simply define RingBuffer.isfull() using `self.max==len(self.data)` > if you prefer that approach. I doubt `myringbuffer.isfull()` is something > you need to call in an inner loop. > > That said, I think my implementation of RingBuffer would probably look more > like (completely untested): > > class RingBuffer(object): > def __init__(self, size_max): > self.data = [None] * size_max > self.size_max = size_max > self.used = 0 > self.cur = 0 > def append(self, val): > self.data[self.cur] = val > self.cur = (self.cur+1) % self.size_max > self.used = max(self.used, self.cur+1) > def isfull(self): > self.used == self.size_max > > Feel free to try this version against whatever benchmark you have in mind. What does this provide that collections.deque(maxlen=size_max) doesn't? I'm a little lost. ChrisA From pludemann at google.com Mon Oct 19 02:57:19 2015 From: pludemann at google.com (Peter Ludemann) Date: Sun, 18 Oct 2015 17:57:19 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: On 18 October 2015 at 17:41, Chris Angelico wrote: > On Mon, Oct 19, 2015 at 11:35 AM, David Mertz wrote: > > That's interesting about the `self._full` variable slowing it down, I > think > > I'm not surprised (but obviously it depends on just how it's used). But > one > > can also simply define RingBuffer.isfull() using > `self.max==len(self.data)` > > if you prefer that approach. I doubt `myringbuffer.isfull()` is > something > > you need to call in an inner loop. > > > > That said, I think my implementation of RingBuffer would probably look > more > > like (completely untested): > > > > class RingBuffer(object): > > def __init__(self, size_max): > > self.data = [None] * size_max > > self.size_max = size_max > > self.used = 0 > > self.cur = 0 > > def append(self, val): > > self.data[self.cur] = val > > self.cur = (self.cur+1) % self.size_max > > self.used = max(self.used, self.cur+1) > > def isfull(self): > > self.used == self.size_max > > > > Feel free to try this version against whatever benchmark you have in > mind. > > What does this provide that collections.deque(maxlen=size_max) > doesn't? I'm a little lost. > ?I was merely re-implementing the "clever" code in a slightly less clever way, for the same performance, to demonstrate that there's no need to assign to __class__. collections.deque is about 5x faster. (My simple benchmark tests the cost of x.append(i)) - p? > > ChrisA > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/pludemann%40google.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Mon Oct 19 03:47:31 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 19 Oct 2015 12:47:31 +1100 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: <20151019014730.GB3813@ando.pearwood.info> On Sun, Oct 18, 2015 at 05:35:14PM -0700, David Mertz wrote: > In any case, redefining a method in a certain situation feels a lot less > magic to me than redefining .__class__ That surprises me greatly. As published in the Python Cookbook[1], there is a one-to-one correspondence between the methods used by an object and its class. 
If you want to know what instance.spam() method does, you look at the class type(instance) or instance.__class__, and read the source code for spam. With your suggestion of re-defining the methods on the fly, you no longer have that simple relationship. If you want to know what instance.spam() method does, first you have to work out what it actually is, which may not be that easy. In the worst case, it might not be possible at all: class K: def method(self): if condition: self.method = random.choice([lambda self: ..., lambda self: ..., lambda self: ...]) Okay, that's an extreme example, and one can write bad code using any technique. But even with a relatively straight-forward version: def method(self): if condition: self.method = self.other_method I would classify "change the methods on the fly" as self-modifying code, which strikes me as much more hacky and hard to maintain than something as simple as changing the __class__ on the fly. Changing the __class__ is just a straight-forward metamorphosis: what was a caterpillar, calling methods defined in the Caterpillar class, is now a butterfly, calling methods defined in the Butterfly class. (The only change I would make from the published recipe would be to make the full Ringbuffer a subclass of the regular one, so isinstance() tests would work as expected. But given that the recipe pre-dates the wide-spread use of isinstance, the author can be forgiven for not thinking of that.) If changing the class on the fly is a metamorphosis, then it seems to me that self-modifying methods are like something from The Fly, where a horrible teleporter accident grafts body parts and DNA from one object into another object... or at least *repurposes* existing methods, so that what was your leg is now your arm. I've done that, and found it harder to reason about than the alternative: "okay, the object is an RingBuffer, but is the append method the RingBuffer.append method or the RingBuffer.full_append method?" versus "okay, the object is a RingBuffer, therefore the append method is the RingBuffer.append method". In my opinion, the only tricky thing about the metamorphosis tactic is that: obj = Caterpillar() # later assert type(obj) is Caterpillar may fail. You need a runtime introspection to see what the type of obj actually is. But that's not exactly unusual: if you consider Caterpillar to be a function rather than a class constructor (a factory perhaps?), then it's not that surprising that you can't know what *specific* type a function returns until runtime. There are many functions with polymorphic return types. [1] The first edition of the Cookbook was edited by Python luminaries Alex Martelli and David Ascher, so this recipe has their stamp of approval. This isn't some dirty hack. -- Steve From steve at pearwood.info Mon Oct 19 03:57:10 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 19 Oct 2015 12:57:10 +1100 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: <20151018055500.GD3725@ando.pearwood.info> Message-ID: <20151019015710.GC3813@ando.pearwood.info> On Mon, Oct 19, 2015 at 11:41:44AM +1100, Chris Angelico wrote: > What does this provide that collections.deque(maxlen=size_max) > doesn't? I'm a little lost. The Ringbuffer recipe predates deque by quite a few years. These days I would consider it only useful in a pedagogical context, giving a practical use for changing the class of an object on-the-fly. 
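For the simple append/get use case, a rough (untested) sketch of the modern replacement, with made-up variable names:

from collections import deque

ring = deque(maxlen=5)    # once 5 items are stored, each append drops the oldest
for i in range(8):
    ring.append(i)
print(list(ring))         # [3, 4, 5, 6, 7], oldest to newest

No class switching or method swapping is needed for that much.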
-- Steve From mertz at gnosis.cx Mon Oct 19 04:12:34 2015 From: mertz at gnosis.cx (David Mertz) Date: Sun, 18 Oct 2015 19:12:34 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: <20151019014730.GB3813@ando.pearwood.info> References: <20151018055500.GD3725@ando.pearwood.info> <20151019014730.GB3813@ando.pearwood.info> Message-ID: My intuition differs from Steven's here. But that's fine. In any case, my simple implementation of RingBuffer in this thread avoids either rebinding methods or changing .__class__. And yes, of course collections.deque is better than any of these implementations. I was just trying to show that any such magic is unlikely to be necessary... and in particular that the recipe given as an example doesn't show it is. But still, you REALLY want your `caterpillar = Caterpillar()` to become something of type "Butterfly" later?! Obviously I understand the biological metaphor. But I'd much rather have an API that provided me with .has_metamorphosed() then have to look for the type as something new. Btw. Take a look at Alex' talk with Anna at PyCon 2015. They discuss various "best practices" that have been superseded by improved language facilities. They don't say anything about this "mutate the __class__ trick", but I somehow suspect he'd put that in that category. On Sun, Oct 18, 2015 at 6:47 PM, Steven D'Aprano wrote: > On Sun, Oct 18, 2015 at 05:35:14PM -0700, David Mertz wrote: > > > In any case, redefining a method in a certain situation feels a lot less > > magic to me than redefining .__class__ > > That surprises me greatly. As published in the Python Cookbook[1], there > is a one-to-one correspondence between the methods used by an object and > its class. If you want to know what instance.spam() method does, you > look at the class type(instance) or instance.__class__, and read the > source code for spam. > > With your suggestion of re-defining the methods on the fly, you no > longer have that simple relationship. If you want to know what > instance.spam() method does, first you have to work out what it actually > is, which may not be that easy. In the worst case, it might not be > possible at all: > > class K: > def method(self): > if condition: > self.method = random.choice([lambda self: ..., > lambda self: ..., > lambda self: ...]) > > > Okay, that's an extreme example, and one can write bad code using any > technique. But even with a relatively straight-forward version: > > def method(self): > if condition: > self.method = self.other_method > > > I would classify "change the methods on the fly" as self-modifying code, > which strikes me as much more hacky and hard to maintain than something > as simple as changing the __class__ on the fly. > > Changing the __class__ is just a straight-forward metamorphosis: what > was a caterpillar, calling methods defined in the Caterpillar class, is > now a butterfly, calling methods defined in the Butterfly class. > > (The only change I would make from the published recipe would be to make > the full Ringbuffer a subclass of the regular one, so isinstance() tests > would work as expected. But given that the recipe pre-dates the > wide-spread use of isinstance, the author can be forgiven for not > thinking of that.) > > If changing the class on the fly is a metamorphosis, then it seems to me > that self-modifying methods are like something from The Fly, where a > horrible teleporter accident grafts body parts and DNA from one object > into another object... 
or at least *repurposes* existing methods, so > that what was your leg is now your arm. > > I've done that, and found it harder to reason about than the > alternative: > > "okay, the object is an RingBuffer, but is the append method the > RingBuffer.append method or the RingBuffer.full_append method?" > > versus > > "okay, the object is a RingBuffer, therefore the append method is the > RingBuffer.append method". > > > In my opinion, the only tricky thing about the metamorphosis tactic is > that: > > obj = Caterpillar() > # later > assert type(obj) is Caterpillar > > may fail. You need a runtime introspection to see what the type of obj > actually is. But that's not exactly unusual: if you consider Caterpillar > to be a function rather than a class constructor (a factory perhaps?), > then it's not that surprising that you can't know what *specific* > type a function returns until runtime. There are many functions with > polymorphic return types. > > > > > > [1] The first edition of the Cookbook was edited by Python luminaries > Alex Martelli and David Ascher, so this recipe has their stamp of > approval. This isn't some dirty hack. > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Mon Oct 19 05:10:07 2015 From: guido at python.org (Guido van Rossum) Date: Sun, 18 Oct 2015 20:10:07 -0700 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: <20151019014730.GB3813@ando.pearwood.info> References: <20151018055500.GD3725@ando.pearwood.info> <20151019014730.GB3813@ando.pearwood.info> Message-ID: Assigning __class__ is a precarious stunt (look at the implementation, it requires lots of checks for various things like __slots__ and implementation-specific special cases). The gesture that looks like "overriding a method" is merely setting a new instance attribute that hides the method, and quite tame in comparison. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Mon Oct 19 23:00:00 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 19 Oct 2015 14:00:00 -0700 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: Apart from Serhiy's detraction of the 3.5 bug report there wasn't any discussion in this thread. I also don't really see any specific questions, so maybe you don't have any. Are you just asking whether it's okay to merge your code? Or are you asking for more code review? On Sat, Oct 17, 2015 at 3:20 PM, Eric Snow wrote: > A recent discussion in a tracker issue [1] brought up the matter of > compatibility between the pure Python implementation of OrderedDict > and the new C implementation. In working on that port I stuck as > closely as possible to the Python implementation. This meant some > parts of the code are bit more complex than they would be otherwise. > (Serhiy has been kind enough to do some cleanup.) 
> > Compatibility was one of the fundamental goals of the porting effort. > Not only does compatibility make sense but it's also specifically > required by PEP 399 [2]: > > Any new accelerated code must act as a drop-in replacement > as close to the pure Python implementation as reasonable. > Technical details of the VM providing the accelerated code > are allowed to differ as necessary, e.g., a class being a type > when implemented in C. > > For the most part I have questions about what is "reasonable", > specifically in relation to OrderedDict. > > I've already opened up a separate thread related to my main question: > type(obj) vs. obj.__class__. [3] In the tracker issue, Serhiy pointed > out: > > There is no a difference. io, pickle, ElementTree, bz2, virtually > all accelerator classes was created as replacements of pure > Python implementations. All C implementations use > Py_TYPE(self) for repr() and pickling. I think this deviation is > common and acceptable. > > In a review comment on the associated patch he said: > > Isn't type(self) is always the same as self.__class__ for pure > Python class? If right, then this change doesn't have any effect. > > To which he later replied: > > It is the same if you assigned the __class__ attribute, but can > be different if set __class__ in the subclass declaration. > > So it isn't clear if that is a compatibility break or how much so it might > be. > > Serhiy also noted that, as of 3.5 [4], you can no longer assign to > obj.__class__ for instances of subclasses of builtin (non-heap) types. > So that is another place where the two OrderedDict implementations > differ. I expect there are a few others in dark corner cases. > > On the tracker he notes another OrderedDict compatibility break: > > Backward compatibility related to __class__ assignment was > already broken in C implementation. In 3.4 following code > works: > > >>> from collections import * > >>> class foo(OrderedDict): > ... def bark(self): return "spam" > ... > >>> class bar(OrderedDict): > ... pass > ... > >>> od = bar() > >>> od.__class__ = foo > >>> od.bark() > 'spam' > > In 3.5 it doesn't. > > As PEP 399 says, we should go as far "as reasonable" in the pursuit of > compatibility. At the same time, I feel not insignificant > responsibility for *any* incompatibility that comes from the C > implementation of OrderedDict. The corner cases impacted by the above > compatibility concerns are borderline enough that I wanted to get some > feedback. Thanks! > > -eric > > > [1] http://bugs.python.org/issue25410 > [2] https://www.python.org/dev/peps/pep-0399/ > [3] https://mail.python.org/pipermail/python-dev/2015-October/141953.html > [4] http://bugs.python.org/issue24912 > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericsnowcurrently at gmail.com Mon Oct 19 23:25:52 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Mon, 19 Oct 2015 15:25:52 -0600 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: On Mon, Oct 19, 2015 at 3:00 PM, Guido van Rossum wrote: > Apart from Serhiy's detraction of the 3.5 bug report there wasn't any > discussion in this thread. 
I also don't really see any specific questions, > so maybe you don't have any. Are you just asking whether it's okay to merge > your code? Or are you asking for more code review? Basically, I've had all my questions answered. PEP 399 covers the matter well enough. -eric From storchaka at gmail.com Mon Oct 19 23:47:01 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 20 Oct 2015 00:47:01 +0300 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: On 20.10.15 00:00, Guido van Rossum wrote: > Apart from Serhiy's detraction of the 3.5 bug report there wasn't any > discussion in this thread. I also don't really see any specific > questions, so maybe you don't have any. Are you just asking whether it's > okay to merge your code? Or are you asking for more code review? I think Eric asks whether it's okay to have some incompatibility between Python and C implementations. 1. Is it okay to have a difference in the effect of __class__ assignment? Pure Python and extension classes have different restrictions. For example (tested example this time) the following code works with the Python implementation in 3.4, but fails with the C implementation in 3.5: from collections import OrderedDict od = OrderedDict() class D(dict): pass od.__class__ = D 2. Is it okay to use obj.__class__ in the Python implementation and type(obj) in the C implementation for the sake of code simplification? Can we ignore subtle differences? 3. In general, is it okay to have some incompatibility between Python and C implementations for the sake of code simplification, and where does the border line lie? From guido at python.org Tue Oct 20 00:01:32 2015 From: guido at python.org (Guido van Rossum) Date: Mon, 19 Oct 2015 15:01:32 -0700 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: On Mon, Oct 19, 2015 at 2:47 PM, Serhiy Storchaka wrote: > On 20.10.15 00:00, Guido van Rossum wrote: > >> Apart from Serhiy's detraction of the 3.5 bug report there wasn't any >> discussion in this thread. I also don't really see any specific >> questions, so maybe you don't have any. Are you just asking whether it's >> okay to merge your code? Or are you asking for more code review? >> > > I think Eric asks whether it's okay to have some incompatibility between > Python and C implementations. > > 1. Is it okay to have a difference in the effect of __class__ assignment? Pure > Python and extension classes have different restrictions. For example > (tested example this time) the following code works with the Python implementation > in 3.4, but fails with the C implementation in 3.5: > > from collections import OrderedDict > od = OrderedDict() > class D(dict): pass > > od.__class__ = D > Yes. > 2. Is it okay to use obj.__class__ in the Python implementation and type(obj) > in the C implementation for the sake of code simplification? Can we ignore > subtle differences? > Yes. > 3. In general, is it okay to have some incompatibility between Python and > C implementations for the sake of code simplification, and where does the border > line lie? > I don't want to rule in general -- the above two look pretty clear-cut to me in this case, but even for __class__ vs. type() it's conceivable that it might be important in some other case (e.g. if it was for a proxy class :-). I think it's fine to ask here the next time there is some doubt about how far a C implementation would need to go. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From storchaka at gmail.com Tue Oct 20 10:21:28 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 20 Oct 2015 11:21:28 +0300 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: Message-ID: On 18.10.15 00:45, Eric Snow wrote: > So, would it make sense to establish some concrete guidelines about > when to use type(obj) vs. obj.__class__? If so, what would those be? > It may also be helpful to enumerate use cases for "type(obj) is not > obj.__class__". My conclusion of this discussion. In Python 3 type(obj) and obj.__class__ are the same in common case. Assigning obj.__class__ is a way to change type(obj). If the assignment is successful, type(obj) becomes the same as obj.__class__. This is used in importlib for lazy importing and some clever classes like the RingBuffer recipe. But __class__ assignment has many restrictions, and changing Python to C implementation or adding __slots__ for sure adds new restrictions. obj.__class__ is different from type(obj) in proxy classes like weakref or Mock. isinstance() and pickle take __class__ to account to support proxies. Unless we write proxy class or code that should handle proxy classes, we shouldn't care about the difference between type(obj) and obj.__class__, and can use what is the more convenient. In Python this is obj.__class__ (avoids globals lookup), and in C this is type(obj) (much simpler and reliable code). From fijall at gmail.com Tue Oct 20 10:38:07 2015 From: fijall at gmail.com (Maciej Fijalkowski) Date: Tue, 20 Oct 2015 10:38:07 +0200 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: For what is worth, that level of differences already exists on pypy and it's really hard to get the *exact* same semantics if things are implemented in python vs C or the other way around. Example list of differences (which I think OrderedDict already breaks if moved to C): * do methods like items call special methods like __getitem__ (I think it's undecided anyway) * what happens if you take a method and rebind it to another subclass, does it automatically become a method (there are differences between built in and pure python) * atomicity of operations. Some operations used to be non-atomic in Python will be atomic now. I personally think those (and the __class__ issue) are unavoidable On Mon, Oct 19, 2015 at 11:47 PM, Serhiy Storchaka wrote: > On 20.10.15 00:00, Guido van Rossum wrote: >> >> Apart from Serhiy's detraction of the 3.5 bug report there wasn't any >> discussion in this thread. I also don't really see any specific >> questions, so maybe you don't have any. Are you just asking whether it's >> okay to merge your code? Or are you asking for more code review? > > > I think Eric ask whether it's okay to have some incompatibility between > Python and C implementations. > > 1. Is it okay to have a difference in effect of __class__ assignment. Pure > Python and extension classes have different restrictions. For example > (tested example this time) following code works with Python implementation > in 3.4, but fails with C implementation in 3.5: > > from collections import OrderedDict > od = OrderedDict() > class D(dict): pass > > od.__class__ = D > > 2. Is it okay to use obj.__class__ in Python implementation and type(obj) in > C implementation for the sake of code simplification? Can we ignore subtle > differences? > > 3. 
In general, is it okay to have some incompatibility between Python and C > implementations for the sake of code simplification, and where the border > line lies? > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com From ncoghlan at gmail.com Tue Oct 20 11:11:33 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Oct 2015 11:11:33 +0200 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: <20151016100457.GD11980@ando.pearwood.info> References: <20151016005711.GC11980@ando.pearwood.info> <20151016100457.GD11980@ando.pearwood.info> Message-ID: On 16 October 2015 at 12:04, Steven D'Aprano wrote: > On Fri, Oct 16, 2015 at 08:57:24AM +0200, Victor Stinner wrote: >> I don't like the idea how having two functions doing *almost* the same >> thing: randint() and randrange(). There is a risk that these functions >> will be misused. I consider that I know some stuff on PRNG but I'm >> still confused by randint() and randrange(). Usually, I open python >> and type: >> >> >>> x=[s.randrange(1,6) for n in range(100)] >> >>> min(x), max(x) >> (1, 5) > > Wouldn't help(randrange) be easier? :-) > > Choose a random item from range(start, stop[, step]). > > This fixes the problem with randint() which includes the > endpoint; in Python this is usually not what you want. > > > I always find that comment amusing. While it is true that in slicing, > half-open ranges are more useful than closed ranges, but when it comes > to generating random numbers (say, simulating dice) I find randint much > more useful and intuitive. > > But I appreciate that some people think differently. Folks wanting to simulate die rolls should be using the random module rather than the secrets module anyway, so the "only offer secrets.randbelow()" approach Guido suggested in his last email makes sense to me. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Tue Oct 20 11:29:45 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Oct 2015 11:29:45 +0200 Subject: [Python-Dev] type(obj) vs. obj.__class__ In-Reply-To: References: Message-ID: On 20 October 2015 at 10:21, Serhiy Storchaka wrote: > On 18.10.15 00:45, Eric Snow wrote: >> >> So, would it make sense to establish some concrete guidelines about >> when to use type(obj) vs. obj.__class__? If so, what would those be? >> It may also be helpful to enumerate use cases for "type(obj) is not >> obj.__class__". > > > My conclusion of this discussion. In Python 3 type(obj) and obj.__class__ > are the same in common case. Assigning obj.__class__ is a way to change > type(obj). If the assignment is successful, type(obj) becomes the same as > obj.__class__. This is used in importlib for lazy importing and some clever > classes like the RingBuffer recipe. But __class__ assignment has many > restrictions, and changing Python to C implementation or adding __slots__ > for sure adds new restrictions. > > obj.__class__ is different from type(obj) in proxy classes like weakref or > Mock. isinstance() and pickle take __class__ to account to support proxies. > > Unless we write proxy class or code that should handle proxy classes, we > shouldn't care about the difference between type(obj) and obj.__class__, and > can use what is the more convenient. 
In Python this is obj.__class__ (avoids > globals lookup), and in C this is type(obj) (much simpler and reliable > code). Right, this is a good summary. Weakref proxies provide one of the simplest demonstrations of cases where the two diverge: >>> from weakref import proxy >>> class C: pass ... >>> obj = C() >>> ref = proxy(obj) >>> type(ref) <class 'weakproxy'> >>> ref.__class__ <class '__main__.C'> When we use "obj.__class__", we're treating proxies as their target, when we use "type(obj)", we're treating them as the proxy object. Which of those to use depends greatly on what we're doing. For Eric's original question that started the thread: proxy types shouldn't inherit from a concrete container type like OrderedDict, so type(self) and self.__class__ should *always* give the same answer, even in subclasses. Cheers, Nick. P.S. Proxy types are actually quite hard to write correctly, so if anyone *does* need to implement one, they're likely to be best served by starting with an existing library like wrapt: http://wrapt.readthedocs.org/en/latest/wrappers.html -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From victor.stinner at gmail.com Tue Oct 20 11:33:59 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 20 Oct 2015 11:33:59 +0200 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> <20151016100457.GD11980@ando.pearwood.info> Message-ID: 2015-10-20 11:11 GMT+02:00 Nick Coghlan : > Folks wanting to simulate die rolls should be using the random module > rather than the secrets module anyway, Hum, why? Dice are used in casinos, where security matters because it costs money. A bad API is more likely to be misused and to introduce a security vulnerability. The C rand() API is a good example: 1+rand()%6 is not uniform... Victor From ncoghlan at gmail.com Tue Oct 20 11:56:37 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 20 Oct 2015 11:56:37 +0200 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> <20151016100457.GD11980@ando.pearwood.info> Message-ID: On 20 October 2015 at 11:33, Victor Stinner wrote: > 2015-10-20 11:11 GMT+02:00 Nick Coghlan : >> Folks wanting to simulate die rolls should be using the random module >> rather than the secrets module anyway, > > Hum, why? Dice are used in casinos, where security matters because it > costs money. True, I was thinking of just-for-fun games, but in gambling games unbiased randomness can be significantly more important. > A bad API is more likely to be misused and to introduce a security > vulnerability. The C rand() API is a good example: 1+rand()%6 is not > uniform... "1 + secrets.randbelow(6)" would be uniform, though. As Tim pointed out, the *lack* of flexibility in randbelow() is a feature here, since it focuses on producing a uniformly random distribution of a given size, which can then be transformed deterministically. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ericsnowcurrently at gmail.com Tue Oct 20 17:05:04 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Tue, 20 Oct 2015 09:05:04 -0600 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: On Tue, Oct 20, 2015 at 2:38 AM, Maciej Fijalkowski wrote: > For what is worth, that level of differences already exists on pypy > and it's really hard to get the *exact* same semantics if things are > implemented in python vs C or the other way around. 
> > Example list of differences (which I think OrderedDict already breaks > if moved to C): > > * do methods like items call special methods like __getitem__ (I think > it's undecided anyway) > > * what happens if you take a method and rebind it to another subclass, > does it automatically become a method (there are differences between > built in and pure python) > > * atomicity of operations. Some operations used to be non-atomic in > Python will be atomic now. > > I personally think those (and the __class__ issue) are unavoidable Yeah, I figured as much. Thanks for pointing those out. Perhaps it would be useful to enumerate specific cases like these in PEP 399? They could go near the part that says "as close to the pure Python implementation as reasonable". -eric From guido at python.org Tue Oct 20 17:10:09 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 20 Oct 2015 08:10:09 -0700 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: Please go ahead and update PEP 399. On Tue, Oct 20, 2015 at 8:05 AM, Eric Snow wrote: > On Tue, Oct 20, 2015 at 2:38 AM, Maciej Fijalkowski > wrote: > > For what is worth, that level of differences already exists on pypy > > and it's really hard to get the *exact* same semantics if things are > > implemented in python vs C or the other way around. > > > > Example list of differences (which I think OrderedDict already breaks > > if moved to C): > > > > * do methods like items call special methods like __getitem__ (I think > > it's undecided anyway) > > > > * what happens if you take a method and rebind it to another subclass, > > does it automatically become a method (there are differences between > > built in and pure python) > > > > * atomicity of operations. Some operations used to be non-atomic in > > Python will be atomic now. > > > > I personally think those (and the __class__ issue) are unavoidable > > Yeah, I figured as much. Thanks for pointing those out. Perhaps it > would be useful to enumerate specific cases like these in PEP 399? > They could go near the part that says "as close to the pure Python > implementation as reasonable". > > -eric > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericsnowcurrently at gmail.com Tue Oct 20 17:32:35 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Tue, 20 Oct 2015 09:32:35 -0600 Subject: [Python-Dev] compatibility for C-accelerated types In-Reply-To: References: Message-ID: On Tue, Oct 20, 2015 at 9:10 AM, Guido van Rossum wrote: > Please go ahead and update PEP 399. Will do. -eric From brett at snarky.ca Tue Oct 20 20:13:20 2015 From: brett at snarky.ca (Brett Cannon) Date: Tue, 20 Oct 2015 18:13:20 +0000 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <20151020084733.34053.597@psf.io> References: <20151020084733.34053.597@psf.io> Message-ID: These leaks have been here a while. Anyone know the cause? 
On Tue, 20 Oct 2015 at 01:47 wrote: > results for d7e490db8d54 on branch "default" > -------------------------------------------- > > test_capi leaked [5411, 5411, 5411] references, sum=16233 > test_capi leaked [1421, 1423, 1423] memory blocks, sum=4267 > test_functools leaked [0, 2, 2] memory blocks, sum=4 > test_threading leaked [10820, 10820, 10820] references, sum=32460 > test_threading leaked [2842, 2844, 2844] memory blocks, sum=8530 > > > Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', > '3:3:/home/psf-users/antoine/refleaks/reflogyrNnBL', '--timeout', '7200'] > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoine at pitrou.net Wed Oct 21 00:57:00 2015 From: antoine at pitrou.net (Antoine Pitrou) Date: Wed, 21 Oct 2015 00:57:00 +0200 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 Message-ID: <5626C6BC.8090105@pitrou.net> > These leaks have been here a while. Anyone know the cause? > > On Tue, 20 Oct 2015 at 01:47 wrote: > >> results for d7e490db8d54 on branch "default" >> -------------------------------------------- >> >> test_capi leaked [5411, 5411, 5411] references, sum=16233 >> test_capi leaked [1421, 1423, 1423] memory blocks, sum=4267 >> test_functools leaked [0, 2, 2] memory blocks, sum=4 >> test_threading leaked [10820, 10820, 10820] references, sum=32460 >> test_threading leaked [2842, 2844, 2844] memory blocks, sum=8530 Bisection shows they were probably introduced by: changeset: 97413:dccc4e63aef5 user: Raymond Hettinger date: Sun Aug 16 19:43:34 2015 -0700 files: Doc/library/operator.rst Doc/whatsnew/3.6.rst Lib/operator.py Lib/test/test_operator.py description: Issue #24379: Add operator.subscript() as a convenience for building slices. If you comment out `@object.__new__` on line 411 in operator.py, or if you remove the __slots__ assignment (which is a bit worrying), the leak seems suppressed. You can reproduce using: $ ./python -m test -m SubinterpreterTest -R 3:3 test_capi [1/1] test_capi beginning 6 repetitions 123456 ...... test_capi leaked [5443, 5443, 5443] references, sum=16329 test_capi leaked [1432, 1434, 1434] memory blocks, sum=4300 1 test failed: test_capi Regards Antoine. From greg at krypto.org Wed Oct 21 03:25:38 2015 From: greg at krypto.org (Gregory P. Smith) Date: Wed, 21 Oct 2015 01:25:38 +0000 Subject: [Python-Dev] PEP-8 wart... it recommends short names because of DOS Message-ID: https://www.python.org/dev/peps/pep-0008/#names-to-avoid *"Since module names are mapped to file names, and some file systems are case insensitive and truncate long names, it is important that module names be chosen to be fairly short -- this won't be a problem on Unix, but it may be a problem when the code is transported to older Mac or Windows versions, or DOS."* There haven't been computers with less than 80 character file or path name element length limits in wide use in decades... ;) -gps -------------- next part -------------- An HTML attachment was scrubbed... URL: From mertz at gnosis.cx Wed Oct 21 03:32:34 2015 From: mertz at gnosis.cx (David Mertz) Date: Tue, 20 Oct 2015 18:32:34 -0700 Subject: [Python-Dev] PEP-8 wart... it recommends short names because of DOS In-Reply-To: References: Message-ID: DOS Python programmers probably can't use `concurrent` or `multiprocessing`. ? 
On Oct 20, 2015 6:26 PM, "Gregory P. Smith" wrote: > https://www.python.org/dev/peps/pep-0008/#names-to-avoid > > *"Since module names are mapped to file names, and some file systems are > case insensitive and truncate long names, it is important that module names > be chosen to be fairly short -- this won't be a problem on Unix, but it may > be a problem when the code is transported to older Mac or Windows versions, > or DOS."* > > There haven't been computers with less than 80 character file or path name > element length limits in wide use in decades... ;) > > -gps > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben+python at benfinney.id.au Wed Oct 21 03:59:06 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Oct 2015 12:59:06 +1100 Subject: [Python-Dev] PEP 8 recommends short module names because FAT is still common today (was: PEP-8 wart... it recommends short names because of DOS) References: Message-ID: <85pp09ouc5.fsf@benfinney.id.au> "Gregory P. Smith" writes: > There haven't been computers with less than 80 character file or path > name element length limits in wide use in decades... ;) Not true, your computer will happily mount severely-limited filesystems. Indeed, I'd wager it has done so many times this year. It is *filesystems* that limit the length of filesystem entries, and the FAT filesystem is still in very widespread use ? on devices mounted by the computers you use today. Yes, we have much better filesystems today, and your primary desktop computer will almost certainly use something better than FAT for its primary storage's filesystem. That does not mean Python programmers should assume your computer will never mount a FAT filesystem (think small flash storage), nor that a program you run will never need to load Python modules from that filesystem. You'd like FAT to go away forever? Great, me too. Now we need to convince all the vendors of every small storage device ? USB thumb drives, network routers, all manner of single-purpose devices ? to use modern filesystems instead. Then, maybe after another human generation has come and gone, we can finally expect every filesystem, in every active device that might run any Python code, to be using something with a reasonably-large limit for filesystem entries. Until then, the advice in PEP 8 to keep module names short is reasonable. -- \ ?The most common of all follies is to believe passionately in | `\ the palpably not true. It is the chief occupation of mankind.? | _o__) ?Henry L. Mencken | Ben Finney From mertz at gnosis.cx Wed Oct 21 04:02:17 2015 From: mertz at gnosis.cx (David Mertz) Date: Tue, 20 Oct 2015 19:02:17 -0700 Subject: [Python-Dev] PEP 8 recommends short module names because FAT is still common today (was: PEP-8 wart... it recommends short names because of DOS) In-Reply-To: <85pp09ouc5.fsf@benfinney.id.au> References: <85pp09ouc5.fsf@benfinney.id.au> Message-ID: Even thumb drives use VFAT. Yes it's an ugly hack, but the names aren't limited to 8.3. On Oct 20, 2015 6:59 PM, "Ben Finney" wrote: > "Gregory P. Smith" writes: > > > There haven't been computers with less than 80 character file or path > > name element length limits in wide use in decades... 
;) > > Not true, your computer will happily mount severely-limited filesystems. > Indeed, I'd wager it has done so many times this year. > > It is *filesystems* that limit the length of filesystem entries, and the > FAT filesystem is still in very widespread use ? on devices mounted by > the computers you use today. > > Yes, we have much better filesystems today, and your primary desktop > computer will almost certainly use something better than FAT for its > primary storage's filesystem. > > That does not mean Python programmers should assume your computer will > never mount a FAT filesystem (think small flash storage), nor that a > program you run will never need to load Python modules from that > filesystem. > > > You'd like FAT to go away forever? Great, me too. Now we need to > convince all the vendors of every small storage device ? USB thumb > drives, network routers, all manner of single-purpose devices ? to use > modern filesystems instead. > > Then, maybe after another human generation has come and gone, we can > finally expect every filesystem, in every active device that might run > any Python code, to be using something with a reasonably-large limit for > filesystem entries. > > Until then, the advice in PEP 8 to keep module names short is reasonable. > > -- > \ ?The most common of all follies is to believe passionately in | > `\ the palpably not true. It is the chief occupation of mankind.? | > _o__) ?Henry L. Mencken | > Ben Finney > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjamin at python.org Wed Oct 21 05:23:22 2015 From: benjamin at python.org (Benjamin Peterson) Date: Tue, 20 Oct 2015 20:23:22 -0700 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <5626C6BC.8090105@pitrou.net> References: <5626C6BC.8090105@pitrou.net> Message-ID: <1445397802.2024936.415931457.556EAB95@webmail.messagingengine.com> On Tue, Oct 20, 2015, at 15:57, Antoine Pitrou wrote: > > > These leaks have been here a while. Anyone know the cause? > > > > On Tue, 20 Oct 2015 at 01:47 wrote: > > > >> results for d7e490db8d54 on branch "default" > >> -------------------------------------------- > >> > >> test_capi leaked [5411, 5411, 5411] references, sum=16233 > >> test_capi leaked [1421, 1423, 1423] memory blocks, sum=4267 > >> test_functools leaked [0, 2, 2] memory blocks, sum=4 > >> test_threading leaked [10820, 10820, 10820] references, sum=32460 > >> test_threading leaked [2842, 2844, 2844] memory blocks, sum=8530 > > Bisection shows they were probably introduced by: > > changeset: 97413:dccc4e63aef5 > user: Raymond Hettinger > date: Sun Aug 16 19:43:34 2015 -0700 > files: Doc/library/operator.rst Doc/whatsnew/3.6.rst > Lib/operator.py Lib/test/test_operator.py > description: > Issue #24379: Add operator.subscript() as a convenience for building > slices. > > > If you comment out `@object.__new__` on line 411 in operator.py, or if > you remove the __slots__ assignment (which is a bit worrying), the leak > seems suppressed. The problem is that the "subscript" class is not a GC type, but participates in a cycle through its type. I suspect it's type to force all heap types to have GC. 
Armin Rigo found previous examples where this special case caused problems, and I don't see what it buys anyway. From guido at python.org Wed Oct 21 05:25:13 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 20 Oct 2015 20:25:13 -0700 Subject: [Python-Dev] PEP 8 recommends short module names because FAT is still common today (was: PEP-8 wart... it recommends short names because of DOS) In-Reply-To: References: <85pp09ouc5.fsf@benfinney.id.au> Message-ID: Regardless, I don't think the continued existence of FAT filesystems can be perceived as a threat to module names, so I've removed the offending paragraph from the PEP. Note that it still recommends short, all-lowercase module and package names -- it just doesn't use computers to motivate it. On Tue, Oct 20, 2015 at 7:02 PM, David Mertz wrote: > Even thumb drives use VFAT. Yes it's an ugly hack, but the names aren't > limited to 8.3. > On Oct 20, 2015 6:59 PM, "Ben Finney" wrote: > >> "Gregory P. Smith" writes: >> >> > There haven't been computers with less than 80 character file or path >> > name element length limits in wide use in decades... ;) >> >> Not true, your computer will happily mount severely-limited filesystems. >> Indeed, I'd wager it has done so many times this year. >> >> It is *filesystems* that limit the length of filesystem entries, and the >> FAT filesystem is still in very widespread use ? on devices mounted by >> the computers you use today. >> >> Yes, we have much better filesystems today, and your primary desktop >> computer will almost certainly use something better than FAT for its >> primary storage's filesystem. >> >> That does not mean Python programmers should assume your computer will >> never mount a FAT filesystem (think small flash storage), nor that a >> program you run will never need to load Python modules from that >> filesystem. >> >> >> You'd like FAT to go away forever? Great, me too. Now we need to >> convince all the vendors of every small storage device ? USB thumb >> drives, network routers, all manner of single-purpose devices ? to use >> modern filesystems instead. >> >> Then, maybe after another human generation has come and gone, we can >> finally expect every filesystem, in every active device that might run >> any Python code, to be using something with a reasonably-large limit for >> filesystem entries. >> >> Until then, the advice in PEP 8 to keep module names short is reasonable. >> >> -- >> \ ?The most common of all follies is to believe passionately in | >> `\ the palpably not true. It is the chief occupation of mankind.? | >> _o__) ?Henry L. Mencken | >> Ben Finney >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ben+python at benfinney.id.au Wed Oct 21 05:54:07 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 21 Oct 2015 14:54:07 +1100 Subject: [Python-Dev] PEP 8 recommends short module names because FAT is still common today References: <85pp09ouc5.fsf@benfinney.id.au> Message-ID: <85lhawq3kw.fsf@benfinney.id.au> Guido van Rossum writes: > [?] I've removed the offending paragraph from the PEP. Note that it > still recommends short, all-lowercase module and package names -- it > just doesn't use computers to motivate it. That suits me too. I think the justification was valid, but its absence doesn't harm the PEP. -- \ ?I busted a mirror and got seven years bad luck, but my lawyer | `\ thinks he can get me five.? ?Steven Wright | _o__) | Ben Finney From guido at python.org Wed Oct 21 06:23:01 2015 From: guido at python.org (Guido van Rossum) Date: Tue, 20 Oct 2015 21:23:01 -0700 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <1445397802.2024936.415931457.556EAB95@webmail.messagingengine.com> References: <5626C6BC.8090105@pitrou.net> <1445397802.2024936.415931457.556EAB95@webmail.messagingengine.com> Message-ID: [Adding Raymond to the thread, since he doesn't always follow the lists closely.] On Tue, Oct 20, 2015 at 8:23 PM, Benjamin Peterson wrote: > > > On Tue, Oct 20, 2015, at 15:57, Antoine Pitrou wrote: > > > > > These leaks have been here a while. Anyone know the cause? > > > > > > On Tue, 20 Oct 2015 at 01:47 wrote: > > > > > >> results for d7e490db8d54 on branch "default" > > >> -------------------------------------------- > > >> > > >> test_capi leaked [5411, 5411, 5411] references, sum=16233 > > >> test_capi leaked [1421, 1423, 1423] memory blocks, sum=4267 > > >> test_functools leaked [0, 2, 2] memory blocks, sum=4 > > >> test_threading leaked [10820, 10820, 10820] references, sum=32460 > > >> test_threading leaked [2842, 2844, 2844] memory blocks, sum=8530 > > > > Bisection shows they were probably introduced by: > > > > changeset: 97413:dccc4e63aef5 > > user: Raymond Hettinger > > date: Sun Aug 16 19:43:34 2015 -0700 > > files: Doc/library/operator.rst Doc/whatsnew/3.6.rst > > Lib/operator.py Lib/test/test_operator.py > > description: > > Issue #24379: Add operator.subscript() as a convenience for building > > slices. > > > > > > If you comment out `@object.__new__` on line 411 in operator.py, or if > > you remove the __slots__ assignment (which is a bit worrying), the leak > > seems suppressed. > > The problem is that the "subscript" class is not a GC type, but > participates in a cycle through its type. I suspect it's type to force > all heap types to have GC. Armin Rigo found previous examples where this > special case caused problems, and I don't see what it buys anyway. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Wed Oct 21 09:44:19 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 21 Oct 2015 10:44:19 +0300 Subject: [Python-Dev] PEP-8 wart... it recommends short names because of DOS In-Reply-To: References: Message-ID: On 21.10.15 04:25, Gregory P. 
Smith wrote: > https://www.python.org/dev/peps/pep-0008/#names-to-avoid > > /"Since module names are mapped to file names, and some file systems are > case insensitive and truncate long names, it is important that module > names be chosen to be fairly short -- this won't be a problem on Unix, > but it may be a problem when the code is transported to older Mac or > Windows versions, or DOS."/ > > There haven't been computers with less than 80 character file or path > name element length limits in wide use in decades... ;) We should also avoid special file names like con.py or lpt1.py. From raymond.hettinger at gmail.com Wed Oct 21 17:32:39 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Wed, 21 Oct 2015 08:32:39 -0700 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <5626C6BC.8090105@pitrou.net> References: <5626C6BC.8090105@pitrou.net> Message-ID: > On Oct 20, 2015, at 3:57 PM, Antoine Pitrou wrote: > > >> These leaks have been here a while. Anyone know the cause? >> >> On Tue, 20 Oct 2015 at 01:47 wrote: >> >>> results for d7e490db8d54 on branch "default" >>> -------------------------------------------- >>> >>> test_capi leaked [5411, 5411, 5411] references, sum=16233 >>> test_capi leaked [1421, 1423, 1423] memory blocks, sum=4267 >>> test_functools leaked [0, 2, 2] memory blocks, sum=4 >>> test_threading leaked [10820, 10820, 10820] references, sum=32460 >>> test_threading leaked [2842, 2844, 2844] memory blocks, sum=8530 > > Bisection shows they were probably introduced by: > > changeset: 97413:dccc4e63aef5 > user: Raymond Hettinger > date: Sun Aug 16 19:43:34 2015 -0700 > files: Doc/library/operator.rst Doc/whatsnew/3.6.rst > Lib/operator.py Lib/test/test_operator.py > description: > Issue #24379: Add operator.subscript() as a convenience for building slices. > > > If you comment out `@object.__new__` on line 411 in operator.py, or if > you remove the __slots__ assignment (which is a bit worrying), the leak > seems suppressed. > Thanks for hunting this down. I had seen the automated reference leak posts but didn't suspect that a pure python class would have caused the leak. I'm re-opening https://mail.python.org/pipermail/python-dev/2015-October/141993.html and will take a look at it this weekend. If I don't see an obvious fix, I'll revert Joe's patch until a correct patch is supplied and reviewed. Raymond From ericsnowcurrently at gmail.com Wed Oct 21 17:45:59 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Wed, 21 Oct 2015 09:45:59 -0600 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: References: <5626C6BC.8090105@pitrou.net> Message-ID: On Wed, Oct 21, 2015 at 9:32 AM, Raymond Hettinger wrote: > I'm re-opening https://mail.python.org/pipermail/python-dev/2015-October/141993.html Presumably you meant http://bugs.python.org/issue24379. :) -eric From random832 at fastmail.com Wed Oct 21 17:53:45 2015 From: random832 at fastmail.com (Random832) Date: Wed, 21 Oct 2015 11:53:45 -0400 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 References: <5626C6BC.8090105@pitrou.net> Message-ID: <87pp088bg6.fsf@fastmail.com> Raymond Hettinger writes: > Thanks for hunting this down. I had seen the automated reference leak > posts but didn't suspect that a pure python class would have caused > the leak. 
> > I'm re-opening > https://mail.python.org/pipermail/python-dev/2015-October/141993.html > and will take a look at it this weekend. If I don't see an obvious > fix, I'll revert Joe's patch until a correct patch is supplied and > reviewed. If a pure python class can cause a reference leak, doesn't that mean it is only a symptom rather than the real cause? Or is it that the use of @object.__new__ is considered "too clever" to be worth fixing? From antoine at python.org Wed Oct 21 18:00:52 2015 From: antoine at python.org (Antoine Pitrou) Date: Wed, 21 Oct 2015 18:00:52 +0200 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: References: <5626C6BC.8090105@pitrou.net> Message-ID: <5627B6B4.1020905@python.org> Le 21/10/2015 17:32, Raymond Hettinger a ?crit : > > Thanks for hunting this down. I had seen the automated reference leak posts > but didn't suspect that a pure python class would have caused the leak. Yes, it's a bit baffling at first. > I'm re-opening https://mail.python.org/pipermail/python-dev/2015-October/141993.html > and will take a look at it this weekend. If I don't see an obvious fix, I'll revert Joe's patch > until a correct patch is supplied and reviewed. Well, the simple workaround is to remove the __slots__ field. I don't think it matters here, especially for a singleton. Also, Benjamin's suggestion to make all heap type instances GC-enabled sounds reasonable to me. Regards Antoine. From jjevnik at quantopian.com Wed Oct 21 18:10:53 2015 From: jjevnik at quantopian.com (Joe Jevnik) Date: Wed, 21 Oct 2015 12:10:53 -0400 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <87pp088bg6.fsf@fastmail.com> References: <5626C6BC.8090105@pitrou.net> <87pp088bg6.fsf@fastmail.com> Message-ID: Sorry about introducing this. Where can I subscribe to these automated emails. Also, how do I go about running this locally? On default I tried running `./python -m test -l test_capi` did not print anything about leaks. I think that using `object.__new__` as a decorator here is the same as subclassing object, overriding __new__ and then making a call to `super().__new__` so I would imagine this bug could appear in less "clever" situations. I would love to help fix this issue; Benjamin, you mentioned that you think that maybe all heaptypes should have gc, do you have a suggestion on where I can look in the code to try to make this change? On Wed, Oct 21, 2015 at 11:53 AM, Random832 wrote: > Raymond Hettinger writes: > > Thanks for hunting this down. I had seen the automated reference leak > > posts but didn't suspect that a pure python class would have caused > > the leak. > > > > I'm re-opening > > https://mail.python.org/pipermail/python-dev/2015-October/141993.html > > and will take a look at it this weekend. If I don't see an obvious > > fix, I'll revert Joe's patch until a correct patch is supplied and > > reviewed. > > If a pure python class can cause a reference leak, doesn't that mean it > is only a symptom rather than the real cause? Or is it that the use of > @object.__new__ is considered "too clever" to be worth fixing? > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/joe%40quantopian.com > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brett at python.org Wed Oct 21 18:58:22 2015 From: brett at python.org (Brett Cannon) Date: Wed, 21 Oct 2015 16:58:22 +0000 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: References: <5626C6BC.8090105@pitrou.net> <87pp088bg6.fsf@fastmail.com> Message-ID: On Wed, 21 Oct 2015 at 09:33 Joe Jevnik wrote: > Sorry about introducing this. Where can I subscribe to these automated > emails. > The emails are sent to the python-checkins mailing list. > Also, how do I go about running this locally? > If you look at the bottom of the email that reports the leaks you will notice it tells you how the tests were run, e.g.: ['./python', '-m', 'test.regrtest', '-uall', '-R', '3:3:/home/psf-users/antoine/refleaks/reflogyrNnBL', '--timeout', '7200'] -Brett > On default I tried running > `./python -m test -l test_capi` did not print anything about leaks. I > think that > using `object.__new__` as a decorator here is the same as subclassing > object, > overriding __new__ and then making a call to `super().__new__` so I would > imagine this bug could appear in less "clever" situations. I would love to > help > fix this issue; Benjamin, you mentioned that you think that maybe all > heaptypes > should have gc, do you have a suggestion on where I can look in the code > to try > to make this change? > > On Wed, Oct 21, 2015 at 11:53 AM, Random832 > wrote: > >> Raymond Hettinger writes: >> > Thanks for hunting this down. I had seen the automated reference leak >> > posts but didn't suspect that a pure python class would have caused >> > the leak. >> > >> > I'm re-opening >> > https://mail.python.org/pipermail/python-dev/2015-October/141993.html >> > and will take a look at it this weekend. If I don't see an obvious >> > fix, I'll revert Joe's patch until a correct patch is supplied and >> > reviewed. >> >> If a pure python class can cause a reference leak, doesn't that mean it >> is only a symptom rather than the real cause? Or is it that the use of >> @object.__new__ is considered "too clever" to be worth fixing? >> > >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/joe%40quantopian.com >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Wed Oct 21 19:10:56 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Wed, 21 Oct 2015 10:10:56 -0700 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <87pp088bg6.fsf@fastmail.com> References: <5626C6BC.8090105@pitrou.net> <87pp088bg6.fsf@fastmail.com> Message-ID: <5627C720.9070609@stoneleaf.us> On 10/21/2015 08:53 AM, Random832 wrote: > If a pure python class can cause a reference leak, doesn't that mean it > is only a symptom rather than the real cause? Or is it that the use of > @object.__new__ is considered "too clever" to be worth fixing? Where can I find out more about using `object.__new__` as a decorator? 
-- ~Ethan~ From steve at pearwood.info Thu Oct 22 00:41:42 2015 From: steve at pearwood.info (Steven D'Aprano) Date: Thu, 22 Oct 2015 09:41:42 +1100 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (d7e490db8d54): sum=61494 In-Reply-To: <5627C720.9070609@stoneleaf.us> References: <5626C6BC.8090105@pitrou.net> <87pp088bg6.fsf@fastmail.com> <5627C720.9070609@stoneleaf.us> Message-ID: <20151021224141.GO3813@ando.pearwood.info> On Wed, Oct 21, 2015 at 10:10:56AM -0700, Ethan Furman wrote: > On 10/21/2015 08:53 AM, Random832 wrote: > > >If a pure python class can cause a reference leak, doesn't that mean it > >is only a symptom rather than the real cause? Or is it that the use of > >@object.__new__ is considered "too clever" to be worth fixing? > > Where can I find out more about using `object.__new__` as a decorator? How about the interactive interpreter? py> @object.__new__ ... class X: ... pass ... py> X <__main__.X object at 0xb7b4dacc> Consider the pre-decorator-syntax way of writing that: class X: pass X = object.__new__(X) That's a way of setting X = X(), except that it only works for X a class (can't decorate a function this way), and it avoids calling the __init__ method. -- Steve From guido at python.org Thu Oct 22 00:58:17 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 21 Oct 2015 15:58:17 -0700 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files Message-ID: PEP 484 (Type Hinting) currently disallows @overload outside stub files. The hope was that a PEP for multi-dispatch would emerge, but that's been slow coming. Meanwhile, over in https://github.com/ambv/typehinting/issues/72 a proposal has emerged to allow @overload in regular modules, as long as it is followed by a non-overloaded definition that serves as the default/fallback. A motivating example is __getitem__, which is often overloaded for item access and slicing. In stubs, you can use: class Foo(Generic[T]): @overload def __getitem__(self, i: int) -> T: ... @overload def __getitem__(self, s: slice) -> Foo[T]: ... (Note that the '...' are part of the actual code -- an ellipsis is how you represent the body of all functions in stub files.) However, in source files the best you can do is: class Foo(Generic[T]): def __getitem__(self, i: Union[int, slice]) -> Union[T, List[T]]: ... which will require unacceptable casts at every call site. You can work around it by having a stub file but that's cumbersome if this is the only reason to have one. The proposal is to allow this to be written as follows in implementation (non-stub) modules: class Foo(Generic[T]): @overload def __getitem__(self, i: int) -> T: ... @overload def __getitem__(self, s: slice) -> Foo[T]: ... def __getitem__(self, x): The actual implementation must be last, so at run time it will override the definition. It has to use isinstance() to distinguish the cases. A type checker would have to recognize this as a special case (so as not to complain about the non-overloaded version). Jukka thinks it would be about a day's work to implement in mypy; the work in typing.py would be a few minutes. Thoughts? -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From random832 at fastmail.com Thu Oct 22 03:45:56 2015 From: random832 at fastmail.com (Random832) Date: Wed, 21 Oct 2015 21:45:56 -0400 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files References: Message-ID: <87wpufllpn.fsf@fastmail.com> Guido van Rossum writes: > The proposal is to allow this to be written as follows in > implementation (non-stub) modules: > > class Foo(Generic[T]): > @overload > def __getitem__(self, i: int) -> T: ... > @overload > def __getitem__(self, s: slice) -> Foo[T]: ... > def __getitem__(self, x): > > > The actual implementation must be last, so at run time it will > override the definition. How about this to allow overloads to have actual implementations? @overloaded def __getitem__(self, x): @overloaded returns a function which will check the types against the overloads (or anyway any overloads that have actual implementations), call them returning the result if applicable, otherwise call the original function. Some magic with help() would improve usability, too - it could print all the overloads and their docstrings. Maybe even @overload('__getitem__') def __get_int(self, i: int), to make it so order doesn't matter. That just leaves the question of how's this all gonna work with subclasses. From guido at python.org Thu Oct 22 03:50:30 2015 From: guido at python.org (Guido van Rossum) Date: Wed, 21 Oct 2015 18:50:30 -0700 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: <87wpufllpn.fsf@fastmail.com> References: <87wpufllpn.fsf@fastmail.com> Message-ID: Well the whole point is not to have to figure out how to implement that right now. On Wed, Oct 21, 2015 at 6:45 PM, Random832 wrote: > Guido van Rossum writes: > > The proposal is to allow this to be written as follows in > > implementation (non-stub) modules: > > > > class Foo(Generic[T]): > > @overload > > def __getitem__(self, i: int) -> T: ... > > @overload > > def __getitem__(self, s: slice) -> Foo[T]: ... > > def __getitem__(self, x): > > > > > > The actual implementation must be last, so at run time it will > > override the definition. > > How about this to allow overloads to have actual implementations? > > @overloaded > def __getitem__(self, x): > > > @overloaded returns a function which will check the types against the > overloads (or anyway any overloads that have actual implementations), > call them returning the result if applicable, otherwise call the > original function. > > Some magic with help() would improve usability, too - it could print all > the overloads and their docstrings. Maybe even @overload('__getitem__') > def __get_int(self, i: int), to make it so order doesn't matter. > > That just leaves the question of how's this all gonna work with > subclasses. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Thu Oct 22 04:57:17 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Wed, 21 Oct 2015 21:57:17 -0500 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? Message-ID: It mentions fr'...' as a formatted raw string but doesn't say anything about rf'...'. 
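Concretely, the question is whether both orderings below are meant to be accepted (purely an illustration, assuming some variable `name` is in scope):

    fr'path: {name}'   # 'f' before 'r'
    rf'path: {name}'   # 'r' before 'f'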
Right now, in implementing PEP 498 support in Howl (https://github.com/howl-editor/howl/pull/118 and https://github.com/howl-editor/howl/commit/1e577da89efc1c1de780634b531f64346cf586d6#diff-851d9b84896270cc7e3bbea3014007a5R86), I assumed both were valid. Should the PEP be more specific? BTW, at the rate language-python is going, GitHub will get syntax highlighting for f-strings in 2050. :D -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From eryksun at gmail.com Thu Oct 22 06:35:50 2015 From: eryksun at gmail.com (eryksun) Date: Wed, 21 Oct 2015 23:35:50 -0500 Subject: [Python-Dev] PEP-8 wart... it recommends short names because of DOS In-Reply-To: References: Message-ID: On 10/21/15, Serhiy Storchaka wrote: > On 21.10.15 04:25, Gregory P. Smith wrote: >> https://www.python.org/dev/peps/pep-0008/#names-to-avoid >> >> /"Since module names are mapped to file names, and some file systems are >> case insensitive and truncate long names, it is important that module >> names be chosen to be fairly short -- this won't be a problem on Unix, >> but it may be a problem when the code is transported to older Mac or >> Windows versions, or DOS."/ >> >> There haven't been computers with less than 80 character file or path >> name element length limits in wide use in decades... ;) > > We should also avoid special file names like con.py or lpt1.py. Other file names to avoid on Windows are conin$.py, conout$.py, aux.py, prn.py, nul.py, lpt[1-9].py, and com[1-9].py. Using these device names in a file name requires the fully qualified wide-character path, prefixed by \\?\. Incidentally this prefix also allows paths that have up to 32768 characters, if there's concern that long module names in packages might exceed the Windows 260-character limit. Here's an example of what would actually be opened for con.py, etc, at least on my current Windows 10 machine: devs = ('aux prn com1 com9 lpt1 lpt9 ' 'nul con conin$ conout$'.split()) for dev in devs: ntpath = to_nt(r'C:\%s.py' % dev) print(ntpath.ljust(11), '=>' ,query_link(ntpath)) output: \??\aux => \DosDevices\COM1 \??\prn => \DosDevices\LPT1 \??\com1 => object name not found \??\com9 => object name not found \??\lpt1 => \Device\Parallel0 \??\lpt9 => object name not found \??\nul => \Device\Null \??\con => \Device\ConDrv\Console \??\conin$ => \Device\ConDrv\CurrentIn \??\conout$ => \Device\ConDrv\CurrentOut The \\?\ prefix avoids DOS name translation. The only change made by the system is to replace \\?\ with \?? in the path: for dev in devs: print(to_nt(r'\\?\C:\%s.py' % dev)) output: \??\C:\aux.py \??\C:\prn.py \??\C:\com1.py \??\C:\com9.py \??\C:\lpt1.py \??\C:\lpt9.py \??\C:\nul.py \??\C:\con.py \??\C:\conin$.py \??\C:\conout$.py On this machine, \??\C: is a link to \Device\HarddiskVolume2. (to_nt and query_link call the native API functions RtlDosPathNameToNtPathName_U, NtOpenSymbolicLinkObject, and NtQuerySymbolicLinkObject. Note that Microsoft doesn't support calling the native NT API from applications in user mode.) From greg at krypto.org Thu Oct 22 11:21:28 2015 From: greg at krypto.org (Gregory P. Smith) Date: Thu, 22 Oct 2015 09:21:28 +0000 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: On Wed, Oct 21, 2015 at 6:51 PM Guido van Rossum wrote: > Well the whole point is not to have to figure out how to implement that > right now. 
> > On Wed, Oct 21, 2015 at 6:45 PM, Random832 wrote: > >> Guido van Rossum writes: >> > The proposal is to allow this to be written as follows in >> > implementation (non-stub) modules: >> > >> > class Foo(Generic[T]): >> > @overload >> > def __getitem__(self, i: int) -> T: ... >> > @overload >> > def __getitem__(self, s: slice) -> Foo[T]: ... >> > def __getitem__(self, x): >> > >> > >> > The actual implementation must be last, so at run time it will >> > override the definition. >> > I think this *could* be fine. It is certainly readable. And, as is already possible in .pyi files, more accurately expressive than the Union which doesn't imply a parameter type to return value type relationship. What would it Foo.__getitem__.__annotations__ contain in this situation? It'd unfortunately be an empty dict if implemented in the most trivial fashion rather than a dict containing your Unions... Do we care? Note that it would also slow down module import time as the code for each of the earlier ... definitions with annotation structures and @overload decorator calls is executed, needlessly creating objects and structures that are immediately discarded upon each subsequent definition. -gps > >> How about this to allow overloads to have actual implementations? >> >> @overloaded >> def __getitem__(self, x): >> >> >> @overloaded returns a function which will check the types against the >> overloads (or anyway any overloads that have actual implementations), >> call them returning the result if applicable, otherwise call the >> original function. >> >> Some magic with help() would improve usability, too - it could print all >> the overloads and their docstrings. Maybe even @overload('__getitem__') >> def __get_int(self, i: int), to make it so order doesn't matter. >> >> That just leaves the question of how's this all gonna work with >> subclasses. >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > > > -- > --Guido van Rossum (python.org/~guido) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/greg%40krypto.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu Oct 22 12:44:07 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 22 Oct 2015 11:44:07 +0100 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: On 22 October 2015 at 10:21, Gregory P. Smith wrote: > On Wed, Oct 21, 2015 at 6:51 PM Guido van Rossum wrote: >> >> Well the whole point is not to have to figure out how to implement that >> right now. >> >> On Wed, Oct 21, 2015 at 6:45 PM, Random832 wrote: >>> >>> Guido van Rossum writes: >>> > The proposal is to allow this to be written as follows in >>> > implementation (non-stub) modules: >>> > >>> > class Foo(Generic[T]): >>> > @overload >>> > def __getitem__(self, i: int) -> T: ... >>> > @overload >>> > def __getitem__(self, s: slice) -> Foo[T]: ... >>> > def __getitem__(self, x): >>> > >>> > >>> > The actual implementation must be last, so at run time it will >>> > override the definition. > > > I think this could be fine. It is certainly readable. 
And, as is already > possible in .pyi files, more accurately expressive than the Union which > doesn't imply a parameter type to return value type relationship. > > What would it Foo.__getitem__.__annotations__ contain in this situation? > It'd unfortunately be an empty dict if implemented in the most trivial > fashion rather than a dict containing your Unions... Do we care? > > Note that it would also slow down module import time as the code for each of > the earlier ... definitions with annotation structures and @overload > decorator calls is executed, needlessly creating objects and structures that > are immediately discarded upon each subsequent definition. Is the idea that in future the "..." dummy declarations could be replaced by specialised implementations for the particular type combinations? If not, is there a risk that by grabbing the @overload decorator just for typing we lose the option of using the natural spelling for an actual multi-dspatch implementation? Paul From eric at trueblade.com Thu Oct 22 13:32:30 2015 From: eric at trueblade.com (Eric V. Smith) Date: Thu, 22 Oct 2015 07:32:30 -0400 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: Message-ID: <5628C94E.8010804@trueblade.com> On 10/21/2015 10:57 PM, Ryan Gonzalez wrote: > It mentions fr'...' as a formatted raw string but doesn't say anything > about rf'...'. Right now, in implementing PEP 498 support in Howl > (https://github.com/howl-editor/howl/pull/118 and > https://github.com/howl-editor/howl/commit/1e577da89efc1c1de780634b531f64346cf586d6#diff-851d9b84896270cc7e3bbea3014007a5R86), > I assumed both were valid. Should the PEP be more specific? Yes, I'll add some wording. Note that currently, there are 24 valid prefixes: ['B', 'BR', 'Br', 'F', 'FR', 'Fr', 'R', 'RB', 'RF', 'Rb', 'Rf', 'U', 'b', 'bR', 'br', 'f', 'fR', 'fr', 'r', 'rB', 'rF', 'rb', 'rf', 'u'] > BTW, at the rate language-python is going, GitHub will get syntax > highlighting for f-strings in 2050. :D Heh. If we add binary f-strings, there are 80 permutations: ['B', 'BF', 'BFR', 'BFr', 'BR', 'BRF', 'BRf', 'Bf', 'BfR', 'Bfr', 'Br', 'BrF', 'Brf', 'F', 'FB', 'FBR', 'FBr', 'FR', 'FRB', 'FRb', 'Fb', 'FbR', 'Fbr', 'Fr', 'FrB', 'Frb', 'R', 'RB', 'RBF', 'RBf', 'RF', 'RFB', 'RFb', 'Rb', 'RbF', 'Rbf', 'Rf', 'RfB', 'Rfb', 'U', 'b', 'bF', 'bFR', 'bFr', 'bR', 'bRF', 'bRf', 'bf', 'bfR', 'bfr', 'br', 'brF', 'brf', 'f', 'fB', 'fBR', 'fBr', 'fR', 'fRB', 'fRb', 'fb', 'fbR', 'fbr', 'fr', 'frB', 'frb', 'r', 'rB', 'rBF', 'rBf', 'rF', 'rFB', 'rFb', 'rb', 'rbF', 'rbf', 'rf', 'rfB', 'rfb', 'u'] I think the upper/lower ones are nuts, but it's probably too late to do anything about it. 'FbR', really? Eric. From eric at trueblade.com Thu Oct 22 14:27:41 2015 From: eric at trueblade.com (Eric V. Smith) Date: Thu, 22 Oct 2015 08:27:41 -0400 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <5628C94E.8010804@trueblade.com> References: <5628C94E.8010804@trueblade.com> Message-ID: <5628D63D.9010200@trueblade.com> On 10/22/2015 7:32 AM, Eric V. Smith wrote: > On 10/21/2015 10:57 PM, Ryan Gonzalez wrote: >> It mentions fr'...' as a formatted raw string but doesn't say anything >> about rf'...'. Right now, in implementing PEP 498 support in Howl >> (https://github.com/howl-editor/howl/pull/118 and >> https://github.com/howl-editor/howl/commit/1e577da89efc1c1de780634b531f64346cf586d6#diff-851d9b84896270cc7e3bbea3014007a5R86), >> I assumed both were valid. Should the PEP be more specific? 
> > Yes, I'll add some wording. Now that I check, in the Specification section, the PEP already says "'f' may be combined with 'r', in either order, to produce raw f-string literals". So I think this case is covered, no? Eric. From rymg19 at gmail.com Thu Oct 22 16:34:59 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 22 Oct 2015 09:34:59 -0500 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <5628D63D.9010200@trueblade.com> References: <5628C94E.8010804@trueblade.com> <5628D63D.9010200@trueblade.com> Message-ID: Ah, I missed that part. Sorry! :/ On October 22, 2015 7:27:41 AM CDT, "Eric V. Smith" wrote: >On 10/22/2015 7:32 AM, Eric V. Smith wrote: >> On 10/21/2015 10:57 PM, Ryan Gonzalez wrote: >>> It mentions fr'...' as a formatted raw string but doesn't say >anything >>> about rf'...'. Right now, in implementing PEP 498 support in Howl >>> (https://github.com/howl-editor/howl/pull/118 and >>> >https://github.com/howl-editor/howl/commit/1e577da89efc1c1de780634b531f64346cf586d6#diff-851d9b84896270cc7e3bbea3014007a5R86), >>> I assumed both were valid. Should the PEP be more specific? >> >> Yes, I'll add some wording. > >Now that I check, in the Specification section, the PEP already says >"'f' may be combined with 'r', in either order, to produce raw f-string >literals". So I think this case is covered, no? > >Eric. -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From srkunze at mail.de Thu Oct 22 18:10:48 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Thu, 22 Oct 2015 18:10:48 +0200 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <5628C94E.8010804@trueblade.com> References: <5628C94E.8010804@trueblade.com> Message-ID: <56290A88.4090607@mail.de> On 22.10.2015 13:32, Eric V. Smith wrote: > ['B', 'BF', 'BFR', 'BFr', 'BR', 'BRF', 'BRf', 'Bf', 'BfR', 'Bfr', 'Br', > 'BrF', 'Brf', 'F', 'FB', 'FBR', 'FBr', 'FR', 'FRB', 'FRb', 'Fb', 'FbR', > 'Fbr', 'Fr', 'FrB', 'Frb', 'R', 'RB', 'RBF', 'RBf', 'RF', 'RFB', 'RFb', > 'Rb', 'RbF', 'Rbf', 'Rf', 'RfB', 'Rfb', 'U', 'b', 'bF', 'bFR', 'bFr', > 'bR', 'bRF', 'bRf', 'bf', 'bfR', 'bfr', 'br', 'brF', 'brf', 'f', 'fB', > 'fBR', 'fBr', 'fR', 'fRB', 'fRb', 'fb', 'fbR', 'fbr', 'fr', 'frB', > 'frb', 'r', 'rB', 'rBF', 'rBf', 'rF', 'rFB', 'rFb', 'rb', 'rbF', 'rbf', > 'rf', 'rfB', 'rfb', 'u'] > > I think the upper/lower ones are nuts, but it's probably too late to do > anything about it. 'FbR', really? Why not disallowing them? I for one could live with all-lower-case AND a predefined order. Best, Sven From rymg19 at gmail.com Thu Oct 22 18:17:25 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 22 Oct 2015 11:17:25 -0500 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <56290A88.4090607@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> Message-ID: <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> On October 22, 2015 11:10:48 AM CDT, "Sven R. Kunze" wrote: >On 22.10.2015 13:32, Eric V. 
Smith wrote: >> ['B', 'BF', 'BFR', 'BFr', 'BR', 'BRF', 'BRf', 'Bf', 'BfR', 'Bfr', >'Br', >> 'BrF', 'Brf', 'F', 'FB', 'FBR', 'FBr', 'FR', 'FRB', 'FRb', 'Fb', >'FbR', >> 'Fbr', 'Fr', 'FrB', 'Frb', 'R', 'RB', 'RBF', 'RBf', 'RF', 'RFB', >'RFb', >> 'Rb', 'RbF', 'Rbf', 'Rf', 'RfB', 'Rfb', 'U', 'b', 'bF', 'bFR', 'bFr', >> 'bR', 'bRF', 'bRf', 'bf', 'bfR', 'bfr', 'br', 'brF', 'brf', 'f', >'fB', >> 'fBR', 'fBr', 'fR', 'fRB', 'fRb', 'fb', 'fbR', 'fbr', 'fr', 'frB', >> 'frb', 'r', 'rB', 'rBF', 'rBf', 'rF', 'rFB', 'rFb', 'rb', 'rbF', >'rbf', >> 'rf', 'rfB', 'rfb', 'u'] >> >> I think the upper/lower ones are nuts, but it's probably too late to >do >> anything about it. 'FbR', really? > >Why not disallowing them? > >I for one could live with all-lower-case AND a predefined order. > Well, now it's backwards-compatibility. >Best, >Sven >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. From stephane at wirtel.be Thu Oct 22 18:36:49 2015 From: stephane at wirtel.be (=?utf-8?q?St=C3=A9phane?= Wirtel) Date: Thu, 22 Oct 2015 18:36:49 +0200 Subject: [Python-Dev] Generated Bytecode ... Message-ID: Hi all, When we compile a python script # test.py if 0: x = 1 python -mdis test.py There is no byte code for the condition. So my question is, the byte code generator removes the unused functions, variables etc?, is it right? What are the cases where the generator does not generate the byte codes ? Thank you, St?phane -- St?phane Wirtel - http://wirtel.be - @matrixise -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: OpenPGP digital signature URL: From srkunze at mail.de Thu Oct 22 19:02:00 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Thu, 22 Oct 2015 19:02:00 +0200 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> Message-ID: <56291688.9060604@mail.de> On 22.10.2015 18:17, Ryan Gonzalez wrote: >> >>> anything about it. 'FbR', really? >> Why not disallowing them? >> >> I for one could live with all-lower-case AND a predefined order. >> > Well, now it's backwards-compatibility. Huh? There are no fb strings yet. Best, Sven From stephane at wirtel.be Thu Oct 22 19:05:53 2015 From: stephane at wirtel.be (=?utf-8?q?St=C3=A9phane?= Wirtel) Date: Thu, 22 Oct 2015 19:05:53 +0200 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: <8DC8F75D-E468-4D14-A14E-298DB823966E@wirtel.be> Thank you Brett, I am going to read the source code, I am going to give a presentation at PyCon.IE about this part and I wanted to be sure about the dead branches. Thanks On 22 Oct 2015, at 19:02, Brett Cannon wrote: > On Thu, 22 Oct 2015 at 09:37 St?phane Wirtel wrote: > >> Hi all, >> >> When we compile a python script >> >> # test.py >> if 0: >> x = 1 >> >> python -mdis test.py >> >> There is no byte code for the condition. >> >> So my question is, the byte code generator removes the unused functions, >> variables etc?, is it right? >> > > Technically the peepholer removes the dead branch, but since the peepholer > is run on all bytecode you can't avoid it. 
> > >> >> What are the cases where the generator does not generate the byte codes ? >> > > It's not specified anywhere; it's just what the peepholer decides to > remove. The exact code can be found at > https://hg.python.org/cpython/file/default/Python/peephole.c . There has > been talk in the past for adding a -X flag to disable the peepholer, but it > never went any farther beyond that. -- St?phane Wirtel - http://wirtel.be - @matrixise -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: OpenPGP digital signature URL: From rymg19 at gmail.com Thu Oct 22 19:09:31 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 22 Oct 2015 12:09:31 -0500 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <56291688.9060604@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> Message-ID: But it'd be weird now if fR worked but fbR didn't. On Thu, Oct 22, 2015 at 12:02 PM, Sven R. Kunze wrote: > On 22.10.2015 18:17, Ryan Gonzalez wrote: > >> >>> anything about it. 'FbR', really? >>>> >>> Why not disallowing them? >>> >>> I for one could live with all-lower-case AND a predefined order. >>> >>> Well, now it's backwards-compatibility. >> > > Huh? There are no fb strings yet. > > Best, > Sven > -- Ryan [ERROR]: Your autotools build scripts are 200 lines longer than your program. Something?s wrong. http://kirbyfan64.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu Oct 22 19:02:48 2015 From: brett at python.org (Brett Cannon) Date: Thu, 22 Oct 2015 17:02:48 +0000 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: On Thu, 22 Oct 2015 at 09:37 St?phane Wirtel wrote: > Hi all, > > When we compile a python script > > # test.py > if 0: > x = 1 > > python -mdis test.py > > There is no byte code for the condition. > > So my question is, the byte code generator removes the unused functions, > variables etc?, is it right? > Technically the peepholer removes the dead branch, but since the peepholer is run on all bytecode you can't avoid it. > > What are the cases where the generator does not generate the byte codes ? > It's not specified anywhere; it's just what the peepholer decides to remove. The exact code can be found at https://hg.python.org/cpython/file/default/Python/peephole.c . There has been talk in the past for adding a -X flag to disable the peepholer, but it never went any farther beyond that. -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric at trueblade.com Thu Oct 22 19:12:48 2015 From: eric at trueblade.com (Eric V. Smith) Date: Thu, 22 Oct 2015 13:12:48 -0400 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> Message-ID: <56291910.1090303@trueblade.com> On 10/22/2015 1:09 PM, Ryan Gonzalez wrote: > But it'd be weird now if fR worked but fbR didn't. Or bR (which is currently allowed) but not fbR in the future. Eric. > > On Thu, Oct 22, 2015 at 12:02 PM, Sven R. Kunze > wrote: > > On 22.10.2015 18:17, Ryan Gonzalez wrote: > > > anything about it. 'FbR', really? > > Why not disallowing them? > > I for one could live with all-lower-case AND a predefined order. 
> > Well, now it's backwards-compatibility. > > > Huh? There are no fb strings yet. > > Best, > Sven > > > > > -- > Ryan > [ERROR]: Your autotools build scripts are 200 lines longer than your > program. Something?s wrong. > http://kirbyfan64.github.io/ > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/eric%2Ba-python-dev%40trueblade.com > From srkunze at mail.de Thu Oct 22 19:18:41 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Thu, 22 Oct 2015 19:18:41 +0200 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> Message-ID: <56291A71.4040106@mail.de> Yeah, that would be weird. Really? That's ridiculous. We don't allow DEF or DeF for function definitions either. So, I don't see any value in it. IMHO, It's time for a clean up again. On 22.10.2015 19:09, Ryan Gonzalez wrote: > But it'd be weird now if fR worked but fbR didn't. > > On Thu, Oct 22, 2015 at 12:02 PM, Sven R. Kunze > wrote: > > On 22.10.2015 18:17, Ryan Gonzalez wrote: > > > anything about it. 'FbR', really? > > Why not disallowing them? > > I for one could live with all-lower-case AND a predefined > order. > > Well, now it's backwards-compatibility. > > > Huh? There are no fb strings yet. > Best, Sven -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Thu Oct 22 19:51:15 2015 From: guido at python.org (Guido van Rossum) Date: Thu, 22 Oct 2015 10:51:15 -0700 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: On Thu, Oct 22, 2015 at 2:21 AM, Gregory P. Smith wrote: > > > On Wed, Oct 21, 2015 at 6:51 PM Guido van Rossum wrote: > >> Well the whole point is not to have to figure out how to implement that >> right now. >> >> On Wed, Oct 21, 2015 at 6:45 PM, Random832 >> wrote: >> >>> Guido van Rossum writes: >>> > The proposal is to allow this to be written as follows in >>> > implementation (non-stub) modules: >>> > >>> > class Foo(Generic[T]): >>> > @overload >>> > def __getitem__(self, i: int) -> T: ... >>> > @overload >>> > def __getitem__(self, s: slice) -> Foo[T]: ... >>> > def __getitem__(self, x): >>> > >>> > >>> > The actual implementation must be last, so at run time it will >>> > override the definition. >>> >> > I think this *could* be fine. It is certainly readable. And, as is > already possible in .pyi files, more accurately expressive than the Union > which doesn't imply a parameter type to return value type relationship. > Right, which is how this got started. > What would it Foo.__getitem__.__annotations__ contain in this situation? > It'd unfortunately be an empty dict if implemented in the most trivial > fashion rather than a dict containing your Unions... Do we care? > Initially it would indeed be {}. Once we have a true multi-dispatch PEP we can iterate, both on how to spell it (perhaps the final __getitem__ needs an @overload as well) and on what happens in the annotations (or at least, what typing.get_type_hints() returns). We could also wait for a multidispatch PEP to land -- but I'm worried that we'll be waiting past 3.6. 
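To make the runtime side concrete, here is a sketch of what the proposed spelling would look like in an implementation module -- assuming typing.overload is relaxed as proposed so that it may appear outside a stub, and with the second return annotation quoted because Foo is not yet defined at that point:

    from typing import Generic, TypeVar, overload

    T = TypeVar('T')

    class Foo(Generic[T]):
        @overload
        def __getitem__(self, i: int) -> T: ...
        @overload
        def __getitem__(self, s: slice) -> 'Foo[T]': ...
        def __getitem__(self, x):
            # the real implementation dispatches by hand
            if isinstance(x, slice):
                ...  # slice case
            ...      # index case

Since only the last definition is bound to the name at run time, Foo.__getitem__.__annotations__ starts out as {} unless something copies the overload information over.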
Then again I don't see how true multidispatch would be able to deal with the syntax proposed here -- you need some kind of decorator on the fallback implementation. > Note that it would also slow down module import time as the code for each > of the earlier ... definitions with annotation structures and @overload > decorator calls is executed, needlessly creating objects and structures > that are immediately discarded upon each subsequent definition. > Yes, but I don't think this is going to make a noticeable difference. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rdmurray at bitdance.com Thu Oct 22 19:56:53 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Thu, 22 Oct 2015 13:56:53 -0400 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: <20151022175654.146DDB180EF@webabinitio.net> On Thu, 22 Oct 2015 17:02:48 -0000, Brett Cannon wrote: > On Thu, 22 Oct 2015 at 09:37 St??phane Wirtel wrote: > > > Hi all, > > > > When we compile a python script > > > > # test.py > > if 0: > > x = 1 > > > > python -mdis test.py > > > > There is no byte code for the condition. > > > > So my question is, the byte code generator removes the unused functions, > > variables etc???, is it right? > > > > Technically the peepholer removes the dead branch, but since the peepholer > is run on all bytecode you can't avoid it. There's an issue (http://bugs.python.org/issue2506) for being able to disable all optimizations (that Ned Batchelder, among others, would really like to see happen :). Raymond rejected it as not being worthwhile. I still agree with Ned and others that there should, just on principle, be a way to disable all optimizations. Most (all?) compilers have such a feature, for debugging reasons if nothing else. We even have a way to spell it in the generated byte code files now (opt-0). But, someone would have to champion it and write a patch proposal. --David From status at bugs.python.org Fri Oct 23 12:08:29 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 23 Oct 2015 18:08:29 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20151023160829.D98B456A37@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-10-16 - 2015-10-23) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 5181 (+29) closed 32043 (+17) total 37224 (+46) Open issues with patches: 2287 Issues opened (40) ================== #24379: operator.subscript http://bugs.python.org/issue24379 reopened by rhettinger #25154: Drop the pyvenv script http://bugs.python.org/issue25154 reopened by brett.cannon #25234: test_eintr.test_os_open hangs under Xcode 7 http://bugs.python.org/issue25234 reopened by brett.cannon #25421: Make __sizeof__ for builtin types more subclass friendly http://bugs.python.org/issue25421 opened by serhiy.storchaka #25423: Deprecate benchmarks that execute too quickly http://bugs.python.org/issue25423 opened by brett.cannon #25425: white-spaces encountered in 3.4 http://bugs.python.org/issue25425 opened by pavan kumar Dharmavarapu #25426: Deprecate the regex_compile benchmark http://bugs.python.org/issue25426 opened by brett.cannon #25427: Remove the pyvenv script in Python 3.8 http://bugs.python.org/issue25427 opened by brett.cannon #25430: speed up ipaddress __contain__ method http://bugs.python.org/issue25430 opened by gescheit #25431: implement address in network in ipaddress module http://bugs.python.org/issue25431 opened by gescheit #25432: isinstance documentation doesn't explain what happens when typ http://bugs.python.org/issue25432 opened by Michael Crouch #25433: whitespace in strip()/lstrip()/rstrip() http://bugs.python.org/issue25433 opened by Dimitri Papadopoulos Orfanos #25435: Wrong function calls and referring to not removed concepts in http://bugs.python.org/issue25435 opened by David Becher #25436: argparse.ArgumentError missing error message in __repr__ http://bugs.python.org/issue25436 opened by dfortunov #25437: Issue with ftplib.FTP_TLS and server forcing SSL connection re http://bugs.python.org/issue25437 opened by dwaites #25438: document what codec PyMemberDef T_STRING decodes the char * as http://bugs.python.org/issue25438 opened by gregory.p.smith #25439: Add type checks to urllib.request.Request http://bugs.python.org/issue25439 opened by ezio.melotti #25442: Shelve consistency issues http://bugs.python.org/issue25442 opened by Yanyan Jiang #25443: Add a count of how many benchmarks are left to run http://bugs.python.org/issue25443 opened by brett.cannon #25444: Py Launch Icon http://bugs.python.org/issue25444 opened by Nils-Hero #25446: smtplib.py AUTH LOGIN code messed up sending login and passwor http://bugs.python.org/issue25446 opened by merkel #25447: TypeError invoking deepcopy on lru_cache http://bugs.python.org/issue25447 opened by jason.coombs #25449: Test OrderedDict subclass http://bugs.python.org/issue25449 opened by serhiy.storchaka #25450: Python 3.5 starts in C:\Windows\system32 as current directory http://bugs.python.org/issue25450 opened by Maja Tomic #25451: tkinter: PhotoImage transparency methods http://bugs.python.org/issue25451 opened by None Becoming #25452: Add __bool__() method to subprocess.CompletedProcess http://bugs.python.org/issue25452 opened by conqp #25453: Arithmetics with complex infinities is inconsistent with C/C++ http://bugs.python.org/issue25453 opened by Francesco Biscani #25454: operator.methodcaller should accept additional arguments durin http://bugs.python.org/issue25454 opened by abacabadabacaba #25455: Some repr implementations don't check for self-referential str http://bugs.python.org/issue25455 opened by abacabadabacaba #25456: 3.4 _tkinter build requires undocumented manual steps to be us http://bugs.python.org/issue25456 opened by terry.reedy #25457: json dump fails for 
mixed-type keys when sort_keys is specifie http://bugs.python.org/issue25457 opened by tanzer at swing.co.at #25458: ftplib: command response shift - mismatch http://bugs.python.org/issue25458 opened by peterpan #25459: EAGAIN errors in Python logging module http://bugs.python.org/issue25459 opened by Henrique Andrade #25460: Misc/.gdbinit uses preprocessor macro http://bugs.python.org/issue25460 opened by philtweir #25461: Unclear language (the word ineffective) in the documentation f http://bugs.python.org/issue25461 opened by Bernt.R??skar.Brenna #25462: Avoid repeated hash calculation in C implementation of Ordered http://bugs.python.org/issue25462 opened by serhiy.storchaka #25463: 2.7.10 glibc double free detected http://bugs.python.org/issue25463 opened by wcolburnnrao #25464: Tix HList header_exists should be "exist" http://bugs.python.org/issue25464 opened by rtw #25465: Pickle uses O(n) memory overhead http://bugs.python.org/issue25465 opened by prinsherbert #25466: offer "from __future__ import" option for "raise... from" http://bugs.python.org/issue25466 opened by alanf Most recent 15 issues with no replies (15) ========================================== #25466: offer "from __future__ import" option for "raise... from" http://bugs.python.org/issue25466 #25462: Avoid repeated hash calculation in C implementation of Ordered http://bugs.python.org/issue25462 #25458: ftplib: command response shift - mismatch http://bugs.python.org/issue25458 #25455: Some repr implementations don't check for self-referential str http://bugs.python.org/issue25455 #25454: operator.methodcaller should accept additional arguments durin http://bugs.python.org/issue25454 #25451: tkinter: PhotoImage transparency methods http://bugs.python.org/issue25451 #25443: Add a count of how many benchmarks are left to run http://bugs.python.org/issue25443 #25433: whitespace in strip()/lstrip()/rstrip() http://bugs.python.org/issue25433 #25431: implement address in network in ipaddress module http://bugs.python.org/issue25431 #25430: speed up ipaddress __contain__ method http://bugs.python.org/issue25430 #25427: Remove the pyvenv script in Python 3.8 http://bugs.python.org/issue25427 #25421: Make __sizeof__ for builtin types more subclass friendly http://bugs.python.org/issue25421 #25416: Add encoding aliases from the (HTML5) Encoding Standard http://bugs.python.org/issue25416 #25413: ctypes (libffi) fails to compile on Solaris X86 http://bugs.python.org/issue25413 #25397: improve ac_cv_have_long_long_format GCC fallback http://bugs.python.org/issue25397 Most recent 15 issues waiting for review (15) ============================================= #25464: Tix HList header_exists should be "exist" http://bugs.python.org/issue25464 #25462: Avoid repeated hash calculation in C implementation of Ordered http://bugs.python.org/issue25462 #25461: Unclear language (the word ineffective) in the documentation f http://bugs.python.org/issue25461 #25460: Misc/.gdbinit uses preprocessor macro http://bugs.python.org/issue25460 #25449: Test OrderedDict subclass http://bugs.python.org/issue25449 #25447: TypeError invoking deepcopy on lru_cache http://bugs.python.org/issue25447 #25446: smtplib.py AUTH LOGIN code messed up sending login and passwor http://bugs.python.org/issue25446 #25439: Add type checks to urllib.request.Request http://bugs.python.org/issue25439 #25436: argparse.ArgumentError missing error message in __repr__ http://bugs.python.org/issue25436 #25435: Wrong function calls and referring to not removed concepts in 
http://bugs.python.org/issue25435 #25431: implement address in network in ipaddress module http://bugs.python.org/issue25431 #25430: speed up ipaddress __contain__ method http://bugs.python.org/issue25430 #25421: Make __sizeof__ for builtin types more subclass friendly http://bugs.python.org/issue25421 #25420: "import random" blocks on entropy collection on Linux with low http://bugs.python.org/issue25420 #25419: Readline completion of module names in import statements http://bugs.python.org/issue25419 Top 10 most discussed issues (10) ================================= #25410: Clean up and fix OrderedDict http://bugs.python.org/issue25410 22 msgs #25453: Arithmetics with complex infinities is inconsistent with C/C++ http://bugs.python.org/issue25453 14 msgs #25461: Unclear language (the word ineffective) in the documentation f http://bugs.python.org/issue25461 12 msgs #25420: "import random" blocks on entropy collection on Linux with low http://bugs.python.org/issue25420 11 msgs #25446: smtplib.py AUTH LOGIN code messed up sending login and passwor http://bugs.python.org/issue25446 8 msgs #25395: SIGSEGV using json.tool: highly nested OrderedDict http://bugs.python.org/issue25395 7 msgs #23735: Readline not adjusting width after resize with 6.3 http://bugs.python.org/issue23735 6 msgs #25154: Drop the pyvenv script http://bugs.python.org/issue25154 6 msgs #25447: TypeError invoking deepcopy on lru_cache http://bugs.python.org/issue25447 6 msgs #25439: Add type checks to urllib.request.Request http://bugs.python.org/issue25439 5 msgs Issues closed (18) ================== #13195: subprocess: args with shell=True is not documented on Windows http://bugs.python.org/issue13195 closed by martin.panter #23981: Update test_unicodedata.py to use script_helpers http://bugs.python.org/issue23981 closed by berker.peksag #24885: StreamReaderProtocol docs recommend using private API http://bugs.python.org/issue24885 closed by gvanrossum #25390: Can't define a typing.Union containing a typing.re.Pattern http://bugs.python.org/issue25390 closed by gvanrossum #25407: Update PEP 4 to keep modules in Python 3 http://bugs.python.org/issue25407 closed by brett.cannon #25408: Consider dropping html5lib and spambayes from the default benc http://bugs.python.org/issue25408 closed by brett.cannon #25411: SMTPHandler in the logging module fails with unicode strings http://bugs.python.org/issue25411 closed by python-dev #25414: Bypass unnecessary size limit test from deques on builds with http://bugs.python.org/issue25414 closed by rhettinger #25417: Minor typo in Path.samefile docstring http://bugs.python.org/issue25417 closed by berker.peksag #25422: tokenize: add tests for multi-line strings http://bugs.python.org/issue25422 closed by eric.smith #25424: Deprecate older versions of benchmarks http://bugs.python.org/issue25424 closed by brett.cannon #25428: Have `datetime` understand integer arguments for timezones http://bugs.python.org/issue25428 closed by r.david.murray #25429: Can segfault Python with itertools.chain.from_iterable http://bugs.python.org/issue25429 closed by rhettinger #25434: Fix typo in whatsnew/3.5 http://bugs.python.org/issue25434 closed by berker.peksag #25440: python3.4-config --extension-suffix not expanded http://bugs.python.org/issue25440 closed by doko #25441: StreamWriter.drain() unreliably reports closed sockets http://bugs.python.org/issue25441 closed by gvanrossum #25445: type xterm in python http://bugs.python.org/issue25445 closed by r.david.murray #25448: Exception ABC doesn't 
work in Python 3 (but does in Python 2.7 http://bugs.python.org/issue25448 closed by eryksun From stephane at wirtel.be Sun Oct 25 16:10:39 2015 From: stephane at wirtel.be (Stephane Wirtel) Date: Sun, 25 Oct 2015 20:10:39 +0000 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: <700118D0-6804-4860-93AE-8353B0D936E2@gmail.com> Message-ID: Thank you for your confirmation, I am going to read the devguide. > On 25 oct. 2015, at 7:50 PM, Raymond Hettinger wrote: > > >>> On Oct 25, 2015, at 12:33 PM, Raymond Hettinger wrote: >>> >>> On Oct 22, 2015, at 10:02 AM, Brett Cannon wrote: >>> >>> So my question is, the byte code generator removes the unused functions, variables etc?, is it right? >>> >>> Technically the peepholer removes the dead branch, but since the peepholer is run on all bytecode you can't avoid it. >> >> IIRC, the code was never generated in the first place (before the peephole pass). This used to be true before the AST branch was added and I think it may still be true. > > I just verified this. So Brett's post was incorrect and misleading. > > > Raymond > > > ----------- Verify by turning-off the optimizations ---------- > cpython $ hg diff Python/peephole.c > diff --git a/Python/peephole.c b/Python/peephole.c > --- a/Python/peephole.c > +++ b/Python/peephole.c > @@ -383,7 +383,7 @@ > /* Avoid situations where jump retargeting could overflow */ > assert(PyBytes_Check(code)); > codelen = PyBytes_GET_SIZE(code); > - if (codelen > 32700) > + if (codelen > 0) > goto exitUnchanged; > > -------- Then run a simple disassembly ----------------------- > > from dis import dis > > def f(x): > if 0: > print('First') > print('Second') > > dis(f) > > -------- The output is --------------------------------------------------- > > $ py tmp.py > 6 0 LOAD_GLOBAL 0 (print) > 3 LOAD_CONST 1 ('Second') > 6 CALL_FUNCTION 1 (1 positional, 0 keyword pair) > 9 POP_TOP > 10 LOAD_CONST 0 (None) > 13 RETURN_VALUE > From stephane at wirtel.be Sat Oct 24 09:53:37 2015 From: stephane at wirtel.be (=?utf-8?q?St=C3=A9phane?= Wirtel) Date: Sat, 24 Oct 2015 14:53:37 +0100 Subject: [Python-Dev] Where is defined the grammar of Python? Message-ID: Hi all, Just to understand, we have the Parser/Python.asdl and Grammar/Grammar files. Which one is used for the AST ? I would like to understand this part of Python, could you help me? Thank you St?phane -- St?phane Wirtel - http://wirtel.be - @matrixise -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 801 bytes Desc: OpenPGP digital signature URL: From ncoghlan at gmail.com Fri Oct 23 11:38:30 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 23 Oct 2015 17:38:30 +0200 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: On 22 October 2015 at 19:51, Guido van Rossum wrote: > On Thu, Oct 22, 2015 at 2:21 AM, Gregory P. Smith wrote: >> What would it Foo.__getitem__.__annotations__ contain in this situation? >> It'd unfortunately be an empty dict if implemented in the most trivial >> fashion rather than a dict containing your Unions... Do we care? > > Initially it would indeed be {}. Once we have a true multi-dispatch PEP we > can iterate, both on how to spell it (perhaps the final __getitem__ needs an > @overload as well) and on what happens in the annotations (or at least, what > typing.get_type_hints() returns). 
Just ensuring I understand the problem with using a third @overload in the spelling from the start: class Foo(Generic[T]): @overload def __getitem__(self, i: int) -> T: ... @overload def __getitem__(self, s: slice) -> Foo[T]: ... @overload def __getitem__(self, x): If we did this, the implied annotation on the last method would be: @overload def __getitem__(self, x: Any) -> Any: which gets the signature wrong - this isn't an Any:Any mapping, it's a sequence. Leaving the "@overload" out thus indicates that the definition is an implementation of the preceding type based dispatch declaration, rather than a new overload. Assuming a future multidispatch implementation used "functools.multidispatch" as the decorator (to complement the existing functools.singledispatch) rather than "typing.overload", this seems like a reasonable short term solution to me. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From raymond.hettinger at gmail.com Sun Oct 25 15:50:40 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Sun, 25 Oct 2015 12:50:40 -0700 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: <700118D0-6804-4860-93AE-8353B0D936E2@gmail.com> References: <700118D0-6804-4860-93AE-8353B0D936E2@gmail.com> Message-ID: > On Oct 25, 2015, at 12:33 PM, Raymond Hettinger wrote: > >> On Oct 22, 2015, at 10:02 AM, Brett Cannon wrote: >> >> So my question is, the byte code generator removes the unused functions, variables etc?, is it right? >> >> Technically the peepholer removes the dead branch, but since the peepholer is run on all bytecode you can't avoid it. > > IIRC, the code was never generated in the first place (before the peephole pass). This used to be true before the AST branch was added and I think it may still be true. I just verified this. So Brett's post was incorrect and misleading. Raymond ----------- Verify by turning-off the optimizations ---------- cpython $ hg diff Python/peephole.c diff --git a/Python/peephole.c b/Python/peephole.c --- a/Python/peephole.c +++ b/Python/peephole.c @@ -383,7 +383,7 @@ /* Avoid situations where jump retargeting could overflow */ assert(PyBytes_Check(code)); codelen = PyBytes_GET_SIZE(code); - if (codelen > 32700) + if (codelen > 0) goto exitUnchanged; -------- Then run a simple disassembly ----------------------- from dis import dis def f(x): if 0: print('First') print('Second') dis(f) -------- The output is --------------------------------------------------- $ py tmp.py 6 0 LOAD_GLOBAL 0 (print) 3 LOAD_CONST 1 ('Second') 6 CALL_FUNCTION 1 (1 positional, 0 keyword pair) 9 POP_TOP 10 LOAD_CONST 0 (None) 13 RETURN_VALUE From raymond.hettinger at gmail.com Sun Oct 25 15:33:50 2015 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Sun, 25 Oct 2015 12:33:50 -0700 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: <700118D0-6804-4860-93AE-8353B0D936E2@gmail.com> > On Oct 22, 2015, at 10:02 AM, Brett Cannon wrote: > > So my question is, the byte code generator removes the unused functions, variables etc?, is it right? > > Technically the peepholer removes the dead branch, but since the peepholer is run on all bytecode you can't avoid it. IIRC, the code was never generated in the first place (before the peephole pass). This used to be true before the AST branch was added and I think it may still be true. 
Raymond Hettinger From guido at python.org Sun Oct 25 18:54:13 2015 From: guido at python.org (Guido van Rossum) Date: Sun, 25 Oct 2015 15:54:13 -0700 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: On Fri, Oct 23, 2015 at 8:38 AM, Nick Coghlan wrote: > On 22 October 2015 at 19:51, Guido van Rossum wrote: > > On Thu, Oct 22, 2015 at 2:21 AM, Gregory P. Smith > wrote: > >> What would it Foo.__getitem__.__annotations__ contain in this situation? > >> It'd unfortunately be an empty dict if implemented in the most trivial > >> fashion rather than a dict containing your Unions... Do we care? > > > > Initially it would indeed be {}. Once we have a true multi-dispatch PEP > we > > can iterate, both on how to spell it (perhaps the final __getitem__ > needs an > > @overload as well) and on what happens in the annotations (or at least, > what > > typing.get_type_hints() returns). > > Just ensuring I understand the problem with using a third @overload in > the spelling from the start: > > class Foo(Generic[T]): > @overload > def __getitem__(self, i: int) -> T: ... > > @overload > def __getitem__(self, s: slice) -> Foo[T]: ... > > @overload > def __getitem__(self, x): > > > If we did this, the implied annotation on the last method would be: > > @overload > def __getitem__(self, x: Any) -> Any: > > > which gets the signature wrong - this isn't an Any:Any mapping, it's a > sequence. > Well, a type checker could handle the special case of the last overload. There should be a rule that overloads are handled in the order in which they are processed; it's not explicit in the PEP but it's meant to be that way, in case there's overlap between signatures. (This differs from singledispatch: when overloading on multiple types it's not always possible to disambiguate by using the most derived type.) But allowing this in code without having a full-fledged multi-dispatch implementation in @overload would cause confusion in readers, which is why we decided to disallow it outside stubs. > Leaving the "@overload" out thus indicates that the definition is an > implementation of the preceding type based dispatch declaration, > rather than a new overload. > Yeah, that was the proposal. But I no longer think it's worth it. > Assuming a future multidispatch implementation used > "functools.multidispatch" as the decorator (to complement the existing > functools.singledispatch) rather than "typing.overload", this seems > like a reasonable short term solution to me. > But once we have a functools.multidispatch, why would we also need typing.overload? (Outside stubs, that is.) Given that a short-term solution is already possible using a stub, I'm not sure that adding another short-term solution is worth it, if we don't intend to keep it around. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sat Oct 24 23:59:03 2015 From: guido at python.org (Guido van Rossum) Date: Sat, 24 Oct 2015 20:59:03 -0700 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: I've thought this over and I don't think it's worth it. We need to wait for a working proposal for multi-dispatch first. Otherwise we'll just end up having to support this interim syntax *and* whatever the new multi-dispatch is. 
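(For comparison, the single-argument dispatch already in the stdlib looks roughly like the sketch below; a future functools.multidispatch, if one ever lands, would have to generalize the same registration pattern to several argument types. The function names here are purely illustrative.)

    from functools import singledispatch

    @singledispatch
    def describe(obj):
        # fallback used when no more specific implementation is registered
        return "something else"

    @describe.register(int)
    def _(obj):
        return "an int"

    @describe.register(list)
    def _(obj):
        return "a list"

    print(describe(3), describe([]))  # -> an int a list
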
Keeping @overload restricted to stub files makes it much more tractable. On Thu, Oct 22, 2015 at 10:51 AM, Guido van Rossum wrote: > On Thu, Oct 22, 2015 at 2:21 AM, Gregory P. Smith wrote: > >> >> >> On Wed, Oct 21, 2015 at 6:51 PM Guido van Rossum >> wrote: >> >>> Well the whole point is not to have to figure out how to implement that >>> right now. >>> >>> On Wed, Oct 21, 2015 at 6:45 PM, Random832 >>> wrote: >>> >>>> Guido van Rossum writes: >>>> > The proposal is to allow this to be written as follows in >>>> > implementation (non-stub) modules: >>>> > >>>> > class Foo(Generic[T]): >>>> > @overload >>>> > def __getitem__(self, i: int) -> T: ... >>>> > @overload >>>> > def __getitem__(self, s: slice) -> Foo[T]: ... >>>> > def __getitem__(self, x): >>>> > >>>> > >>>> > The actual implementation must be last, so at run time it will >>>> > override the definition. >>>> >>> >> I think this *could* be fine. It is certainly readable. And, as is >> already possible in .pyi files, more accurately expressive than the Union >> which doesn't imply a parameter type to return value type relationship. >> > > Right, which is how this got started. > > >> What would it Foo.__getitem__.__annotations__ contain in this situation? >> It'd unfortunately be an empty dict if implemented in the most trivial >> fashion rather than a dict containing your Unions... Do we care? >> > > Initially it would indeed be {}. Once we have a true multi-dispatch PEP we > can iterate, both on how to spell it (perhaps the final __getitem__ needs > an @overload as well) and on what happens in the annotations (or at least, > what typing.get_type_hints() returns). > > We could also wait for a multidispatch PEP to land -- but I'm worried that > we'll be waiting past 3.6. > > Then again I don't see how true multidispatch would be able to deal with > the syntax proposed here -- you need some kind of decorator on the fallback > implementation. > > >> Note that it would also slow down module import time as the code for each >> of the earlier ... definitions with annotation structures and @overload >> decorator calls is executed, needlessly creating objects and structures >> that are immediately discarded upon each subsequent definition. >> > > Yes, but I don't think this is going to make a noticeable difference. > > > -- > --Guido van Rossum (python.org/~guido) > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From lac at openend.se Sun Oct 25 19:11:24 2015 From: lac at openend.se (Laura Creighton) Date: Mon, 26 Oct 2015 00:11:24 +0100 Subject: [Python-Dev] PEP 484 -- proposal to allow @overload in non-stub files In-Reply-To: References: <87wpufllpn.fsf@fastmail.com> Message-ID: <201510252311.t9PNBOq2001083@fido.openend.se> All these overloads makes the code hard to read. The whole idea of 'i have to know which decorator got called before the other one' is a smell that you have too many decorators. This whole idea reeks 'i can be very, very clever here'. Laura From tjreedy at udel.edu Fri Oct 23 15:34:34 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 23 Oct 2015 15:34:34 -0400 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: On 10/23/2015 4:23 AM, Victor Stinner wrote: > Hi, > > 2015-10-22 19:02 GMT+02:00 Brett Cannon : >> It's not specified anywhere; it's just what the peepholer decides to remove. >> The exact code can be found at >> https://hg.python.org/cpython/file/default/Python/peephole.c . 
There has >> been talk in the past for adding a -X flag to disable the peepholer, but it >> never went any farther beyond that. > > Yeah, I remember that I had the same issue than St?phane when I worked > on my astoptimizer project :-) I wanted to completly disable the > peephole optimizer because I wanted to reimplement the optimizations > on the AST instead of rewriting the bytecode. > > I would be nice to have a "-X noopt" command line option for that. How about -x nopeep to specifically skip the peephole optimizer? -- Terry Jan Reedy From ben+python at benfinney.id.au Sun Oct 25 17:04:27 2015 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 26 Oct 2015 08:04:27 +1100 Subject: [Python-Dev] Where is defined the grammar of Python? References: Message-ID: <85k2qaoe1w.fsf@benfinney.id.au> "St?phane Wirtel" writes: > I would like to understand this part of Python, could you help me? You should also know the Python Language Reference documentation , especially the Full Grammar specification . -- \ ?Holy astringent plum-like fruit, Batman!? ?Robin | `\ | _o__) | Ben Finney From ncoghlan at gmail.com Fri Oct 23 11:20:45 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 23 Oct 2015 17:20:45 +0200 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <56291910.1090303@trueblade.com> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> Message-ID: On 22 October 2015 at 19:12, Eric V. Smith wrote: > On 10/22/2015 1:09 PM, Ryan Gonzalez wrote: >> But it'd be weird now if fR worked but fbR didn't. > > Or bR (which is currently allowed) but not fbR in the future. My own objection isn't to allowing "fR" or "fbR", it's to allowing the uppercase "F". I also don't understand why we can't say "if 'f' is part of a string prefix, it must be first". That would mean we kept the current 14 combinations: ['B', 'BR', 'Br', 'R', 'RB', 'Rb', 'U', 'b', 'bR', 'br', 'r', 'rB', 'rb', 'u'] And added only 13 more possibilities, being a lowercase 'f' prefix on its own, and as a prefix for the various b/r combinations: ['fB', 'fBR', 'fBr', 'fR', 'fRB', 'fRb', 'fb', 'fbR', 'fbr', 'fr', 'frB', 'frb'] I don't think it would ever be worth the compatibility break to require lowercase for 'rbu', or to enforce a particular relative order (although those could be good pylint rules, if they aren't already), but there's no such restrictions for the new 'f' prefix. Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Mon Oct 26 06:37:33 2015 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 26 Oct 2015 11:37:33 +0100 Subject: [Python-Dev] Where is defined the grammar of Python? In-Reply-To: References: Message-ID: On 24 October 2015 at 15:53, St?phane Wirtel wrote: > Hi all, > > Just to understand, we have the Parser/Python.asdl and Grammar/Grammar files. > > Which one is used for the AST ? > > I would like to understand this part of Python, could you help me? An overview of all the moving parts is at https://docs.python.org/devguide/grammar.html A more detailed description of the compiler toolchain in CPython is at https://docs.python.org/devguide/compiler.html Cheers, Nick. 
> > Thank you > > St?phane > > -- > St?phane Wirtel - http://wirtel.be - @matrixise > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ncoghlan%40gmail.com > -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ethan at stoneleaf.us Mon Oct 26 11:22:40 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Mon, 26 Oct 2015 08:22:40 -0700 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> Message-ID: <562E4540.8080308@stoneleaf.us> On 10/23/2015 08:20 AM, Nick Coghlan wrote: > On 22 October 2015 at 19:12, Eric V. Smith wrote: >> On 10/22/2015 1:09 PM, Ryan Gonzalez wrote: >>> But it'd be weird now if fR worked but fbR didn't. >> >> Or bR (which is currently allowed) but not fbR in the future. > My own objection isn't to allowing "fR" or "fbR", it's to allowing the > uppercase "F". > > I also don't understand why we can't say "if 'f' is part of a string > prefix, it must be first". Sometimes order matters, and sometimes it does not. If the order does not have an impact on the final code, it does not matter, and making us have to remember an order that does not matter is a waste. -- ~Ethan~ From brett at python.org Mon Oct 26 12:41:48 2015 From: brett at python.org (Brett Cannon) Date: Mon, 26 Oct 2015 16:41:48 +0000 Subject: [Python-Dev] Where is defined the grammar of Python? In-Reply-To: References: Message-ID: On Sun, 25 Oct 2015 at 19:51 St?phane Wirtel wrote: > Hi all, > > Just to understand, we have the Parser/Python.asdl and Grammar/Grammar > files. > > Which one is used for the AST ? > > I would like to understand this part of Python, could you help me? > > See https://docs.python.org/devguide/grammar.html and https://docs.python.org/devguide/compiler.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From srkunze at mail.de Mon Oct 26 14:45:11 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Mon, 26 Oct 2015 19:45:11 +0100 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562E4540.8080308@stoneleaf.us> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> Message-ID: <562E74B7.4060409@mail.de> On 26.10.2015 16:22, Ethan Furman wrote: > On 10/23/2015 08:20 AM, Nick Coghlan wrote: >> My own objection isn't to allowing "fR" or "fbR", it's to allowing the >> uppercase "F". >> >> I also don't understand why we can't say "if 'f' is part of a string >> prefix, it must be first". > > Sometimes order matters, and sometimes it does not. If the order does > not have an impact on the final code, it does not matter, and making > us have to remember an order that does not matter is a waste. Order that matters? You must be kidding. That would turn different types of string extremely hard to understand because semantics differ. That is, btw., one reason, why I favor a fixed order (alphabetically or something). Easy to remember and no way to misinterpret it. 
Best, Sven From python at mrabarnett.plus.com Mon Oct 26 15:43:53 2015 From: python at mrabarnett.plus.com (MRAB) Date: Mon, 26 Oct 2015 19:43:53 +0000 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562E74B7.4060409@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> Message-ID: <562E8279.2060305@mrabarnett.plus.com> On 2015-10-26 18:45, Sven R. Kunze wrote: > On 26.10.2015 16:22, Ethan Furman wrote: >> On 10/23/2015 08:20 AM, Nick Coghlan wrote: >>> My own objection isn't to allowing "fR" or "fbR", it's to >>> allowing the uppercase "F". >>> >>> I also don't understand why we can't say "if 'f' is part of a >>> string prefix, it must be first". >> >> Sometimes order matters, and sometimes it does not. If the order >> does not have an impact on the final code, it does not matter, and >> making us have to remember an order that does not matter is a >> waste. > > Order that matters? You must be kidding. That would turn different > types of string extremely hard to understand because semantics > differ. > > That is, btw., one reason, why I favor a fixed order (alphabetically > or something). Easy to remember and no way to misinterpret it. > In Python 2, how often have you seen prefix "ur" rather than "ru"? I always used "ur". How often is alphabetical order used in the prefixes? If the order isn't alphabetical, then it's going to be some order that's harder to remember, so I agree with Ethan here. From ethan at stoneleaf.us Mon Oct 26 15:54:42 2015 From: ethan at stoneleaf.us (Ethan Furman) Date: Mon, 26 Oct 2015 12:54:42 -0700 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562E74B7.4060409@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> Message-ID: <562E8502.1050303@stoneleaf.us> On 10/26/2015 11:45 AM, Sven R. Kunze wrote: > On 26.10.2015 16:22, Ethan Furman wrote: >> On 10/23/2015 08:20 AM, Nick Coghlan wrote: >>> My own objection isn't to allowing "fR" or "fbR", it's to allowing the >>> uppercase "F". >>> >>> I also don't understand why we can't say "if 'f' is part of a string >>> prefix, it must be first". >> >> Sometimes order matters, and sometimes it does not. If the order does >> not have an impact on the final code, it does not matter, and making >> us have to remember an order that does not matter is a waste. > > Order that matters? You must be kidding. You misunderstand -- the order of string prefixes does *not* matter, so forcing us to use one is silly. -- ~Ethan~ From soltysh at gmail.com Mon Oct 26 16:21:29 2015 From: soltysh at gmail.com (Maciej Szulik) Date: Mon, 26 Oct 2015 21:21:29 +0100 Subject: [Python-Dev] (no subject) Message-ID: Thanks to Nick Coghlan and Barry Warsaw we've setup a new SIG dedicated to discussing python features from different distributions point of view. Here is Nick's reasoning: > With the Python 3 migration, and the growth in interest in user level > package management for development purposes, what do you think of the idea > of setting up a new Linux SIG to have those discussions? 
I know it's a case > of "yet another mailing list", but I think it will be worthwhile to have a > clear point of collaboration within the Python ecosystem, rather than > expecting Pythonistas to know how to reach out to (other) distros directly. The list is available @ https://mail.python.org/mailman/listinfo/linux-sig Maciej -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Mon Oct 26 22:36:54 2015 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 27 Oct 2015 11:36:54 +0900 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: 2015-10-24 4:34 GMT+09:00 Terry Reedy : > How about -x nopeep to specifically skip the peephole optimizer? Raymond wrote "IIRC, the code was never generated in the first place (before the peephole pass)." So "nopeep" would have a different purpose. Victor From tjreedy at udel.edu Tue Oct 27 00:47:36 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 27 Oct 2015 00:47:36 -0400 Subject: [Python-Dev] Generated Bytecode ... In-Reply-To: References: Message-ID: On 10/26/2015 10:36 PM, Victor Stinner wrote: > 2015-10-24 4:34 GMT+09:00 Terry Reedy : >> How about -x nopeep to specifically skip the peephole optimizer? > > Raymond wrote "IIRC, the code was never generated in the first place > (before the peephole pass)." I based that suggestion on what others said about why code was not generated. I think the actual situation supports my point that turning off 'all optimizations' means reviewing the entire process of generating code, from start to finish, to find anything that might be called an 'optimization' and that could be disabled or given a 'un-optimized' alternative. -- Terry Jan Reedy From francismb at email.de Mon Oct 26 16:17:02 2015 From: francismb at email.de (francismb) Date: Mon, 26 Oct 2015 21:17:02 +0100 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562E74B7.4060409@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> Message-ID: <562E8A3E.3040802@email.de> >> Sometimes order matters, and sometimes it does not. If the order does >> not have an impact on the final code, it does not matter, and making >> us have to remember an order that does not matter is a waste. > > Order that matters? You must be kidding. That would turn different types > of string extremely hard to understand because semantics differ. > > That is, btw., one reason, why I favor a fixed order (alphabetically or > something). Easy to remember and no way to misinterpret it. > Well if the order is not important and all permutations are allowed then, at least those permutations can be used as an alphabet to build a hidden channel over it :-) Regards, francis From vadmium+py at gmail.com Tue Oct 27 00:15:00 2015 From: vadmium+py at gmail.com (Martin Panter) Date: Tue, 27 Oct 2015 04:15:00 +0000 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? 
In-Reply-To: <562E8279.2060305@mrabarnett.plus.com> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8279.2060305@mrabarnett.plus.com> Message-ID: On 26 October 2015 at 19:43, MRAB wrote: > On 2015-10-26 18:45, Sven R. Kunze wrote: >> >> On 26.10.2015 16:22, Ethan Furman wrote: >>> >>> On 10/23/2015 08:20 AM, Nick Coghlan wrote: >>>> >>>> My own objection isn't to allowing "fR" or "fbR", it's to >>>> allowing the uppercase "F". >>>> >>>> I also don't understand why we can't say "if 'f' is part of a >>>> string prefix, it must be first". >>> >>> >>> Sometimes order matters, and sometimes it does not. If the order >>> does not have an impact on the final code, it does not matter, and >>> making us have to remember an order that does not matter is a >>> waste. >> >> >> Order that matters? You must be kidding. That would turn different >> types of string extremely hard to understand because semantics >> differ. >> >> That is, btw., one reason, why I favor a fixed order (alphabetically >> or something). Easy to remember and no way to misinterpret it. >> > In Python 2, how often have you seen prefix "ur" rather than "ru"? > > I always used "ur". > > How often is alphabetical order used in the prefixes? > > If the order isn't alphabetical, then it's going to be some order > that's harder to remember, so I agree with Ethan here. In Python 2, ru". . ." is a SyntaxError, despite R coming before U in the alphabet. And rb". . ." is also a SyntaxError, but in Python 3 it was made legal. I don?t see much point restricting the order of rf". . ." versus fr". . .". Neither flag is particularly more important than the other, and even if one were, should that one be at the front, or at the end closer to the string? From srkunze at mail.de Tue Oct 27 14:39:26 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Tue, 27 Oct 2015 19:39:26 +0100 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562E8502.1050303@stoneleaf.us> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us> Message-ID: <562FC4DE.4000904@mail.de> On 26.10.2015 20:54, Ethan Furman wrote: > You misunderstand -- the order of string prefixes does *not* matter, > so forcing us to use one is silly. I don't think so. It's better to have less possibilities in the beginning. So later on, we can easily relax the restrictions without breaking compatibility. Best, Sven From breamoreboy at yahoo.co.uk Tue Oct 27 16:39:33 2015 From: breamoreboy at yahoo.co.uk (Mark Lawrence) Date: Tue, 27 Oct 2015 20:39:33 +0000 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562FC4DE.4000904@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us> <562FC4DE.4000904@mail.de> Message-ID: On 27/10/2015 18:39, Sven R. Kunze wrote: > On 26.10.2015 20:54, Ethan Furman wrote: >> You misunderstand -- the order of string prefixes does *not* matter, >> so forcing us to use one is silly. 
> > I don't think so. > > It's better to have less possibilities in the beginning. So later on, we > can easily relax the restrictions without breaking compatibility. > > Best, > Sven Please no. Having a memory much like a bottomless sieve I have no intention of reading the manual every time I need to use string prefixes. Would the table listing "string prefix precedence" in The Fine Docs? be unique in the programming world? -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence From eric at trueblade.com Tue Oct 27 17:19:02 2015 From: eric at trueblade.com (Eric V. Smith) Date: Tue, 27 Oct 2015 17:19:02 -0400 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us> <562FC4DE.4000904@mail.de> Message-ID: On Oct 27, 2015, at 4:39 PM, Mark Lawrence wrote: > >> On 27/10/2015 18:39, Sven R. Kunze wrote: >>> On 26.10.2015 20:54, Ethan Furman wrote: >>> You misunderstand -- the order of string prefixes does *not* matter, >>> so forcing us to use one is silly. >> >> I don't think so. >> >> It's better to have less possibilities in the beginning. So later on, we >> can easily relax the restrictions without breaking compatibility. >> >> Best, >> Sven > > Please no. Having a memory much like a bottomless sieve I have no intention of reading the manual every time I need to use string prefixes. Would the table listing "string prefix precedence" in The Fine Docs? be unique in the programming world? Plus, it's too late for this. What would the manual say: "either br or rb is okay, but you if you want fbr you can't spell it frb."? (Assuming we add binary f-strings.) Let's just leave it as any order, and with any capitalization. Eric. From srkunze at mail.de Tue Oct 27 17:54:12 2015 From: srkunze at mail.de (Sven R. Kunze) Date: Tue, 27 Oct 2015 22:54:12 +0100 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us> <562FC4DE.4000904@mail.de> Message-ID: <562FF284.2030504@mail.de> On 27.10.2015 22:19, Eric V. Smith wrote: > On Oct 27, 2015, at 4:39 PM, Mark Lawrence wrote: >>> On 27/10/2015 18:39, Sven R. Kunze wrote: >>>> On 26.10.2015 20:54, Ethan Furman wrote: >>>> You misunderstand -- the order of string prefixes does *not* matter, >>>> so forcing us to use one is silly. >>> I don't think so. >>> >>> It's better to have less possibilities in the beginning. So later on, we >>> can easily relax the restrictions without breaking compatibility. >>> >>> Best, >>> Sven >> Please no. Having a memory much like a bottomless sieve I have no intention of reading the manual every time I need to use string prefixes. Would the table listing "string prefix precedence" in The Fine Docs? be unique in the programming world? > Plus, it's too late for this. What would the manual say: "either br or rb is okay, but you if you want fbr you can't spell it frb."? (Assuming we add binary f-strings.) > > Let's just leave it as any order, and with any capitalization. 
It was just for the protocol. The harm is already done. ;-) Best, Sven From eric at trueblade.com Wed Oct 28 08:35:42 2015 From: eric at trueblade.com (Eric V. Smith) Date: Wed, 28 Oct 2015 08:35:42 -0400 Subject: [Python-Dev] Bytcode "magic tag" Message-ID: <5630C11E.9060009@trueblade.com> In issue 25483 I'm adding an opcode to make f-string formatting more robust and faster. As part of that, I'm bumping the .pyc magic number. While doing that, I notice Lib/importlib/_bootstrap_external.h includes this comment: # Starting with the adoption of PEP 3147 in Python 3.2, every bump in magic # number also includes a new "magic tag", i.e. a human readable string used # to represent the magic number in __pycache__ directories. When you change # the magic number, you must also set a new unique magic tag. Generally this # can be named after the Python major version of the magic number bump, but # it can really be anything, as long as it's different than anything else # that's come before. The tags are included in the following table, starting # with Python 3.2a0. The "following table" is a comment, that contains a few references to the tag "cpython-", specifically cpython-32. It doesn't seem that the tag is routinely updated in the comment. sys.implementation.cache_tag returns 'cpython-36', and is in fact implemented as 'cpython-{PY_MAJOR_VERSION}{PY_MINOR_VERSION}'. Do I need to do anything else? Unlike what the comment in _boostrap_external.py suggests, this "magic tag" will not change every time a bytecode is added, but only on every minor release. Which implies that if we have a micro release that adds an opcode, we'll in fact break the promise in the comment. >From my understanding on how this tag is used, this wouldn't be a problem (because the magic number in the file also changes). But I want to make sure I'm not misunderstanding something. I think the comment above is probably just misleading. Eric. From barry at python.org Wed Oct 28 10:19:44 2015 From: barry at python.org (Barry Warsaw) Date: Wed, 28 Oct 2015 10:19:44 -0400 Subject: [Python-Dev] Bytcode "magic tag" In-Reply-To: <5630C11E.9060009@trueblade.com> References: <5630C11E.9060009@trueblade.com> Message-ID: <20151028101944.542e182d@limelight.wooz.org> On Oct 28, 2015, at 08:35 AM, Eric V. Smith wrote: >The "following table" is a comment, that contains a few references to >the tag "cpython-", specifically cpython-32. It doesn't seem >that the tag is routinely updated in the comment. IIRC, it used to have to be changed in the code, but with this... >sys.implementation.cache_tag returns 'cpython-36', and is in fact >implemented as 'cpython-{PY_MAJOR_VERSION}{PY_MINOR_VERSION}'. ...I don't believe it does any more. >Do I need to do anything else? I don't think so. >Unlike what the comment in _boostrap_external.py suggests, this "magic tag" >will not change every time a bytecode is added, but only on every minor >release. Which implies that if we have a micro release that adds an opcode, >we'll in fact break the promise in the comment. Right. Have we ever done that though? We shouldn't! >From my understanding on how this tag is used, this wouldn't be a >problem (because the magic number in the file also changes). But I want >to make sure I'm not misunderstanding something. I think the comment >above is probably just misleading. Yeah, it's just out of date. Cheers, -Barry From eric at trueblade.com Wed Oct 28 10:24:30 2015 From: eric at trueblade.com (Eric V. 
Smith) Date: Wed, 28 Oct 2015 10:24:30 -0400 Subject: [Python-Dev] Bytcode "magic tag" In-Reply-To: <20151028101944.542e182d@limelight.wooz.org> References: <5630C11E.9060009@trueblade.com> <20151028101944.542e182d@limelight.wooz.org> Message-ID: <5630DA9E.20603@trueblade.com> On 10/28/2015 10:19 AM, Barry Warsaw wrote: > On Oct 28, 2015, at 08:35 AM, Eric V. Smith wrote: > >> The "following table" is a comment, that contains a few references to >> the tag "cpython-", specifically cpython-32. It doesn't seem >> that the tag is routinely updated in the comment. > > IIRC, it used to have to be changed in the code, but with this... > >> sys.implementation.cache_tag returns 'cpython-36', and is in fact >> implemented as 'cpython-{PY_MAJOR_VERSION}{PY_MINOR_VERSION}'. > > ...I don't believe it does any more. Okay. Maybe I'll update that comment, then. >> Unlike what the comment in _boostrap_external.py suggests, this "magic tag" >> will not change every time a bytecode is added, but only on every minor >> release. Which implies that if we have a micro release that adds an opcode, >> we'll in fact break the promise in the comment. > > Right. Have we ever done that though? We shouldn't! And maybe I'll add that to the updated comment! Eric. From eric at trueblade.com Wed Oct 28 10:28:33 2015 From: eric at trueblade.com (Eric V. Smith) Date: Wed, 28 Oct 2015 10:28:33 -0400 Subject: [Python-Dev] Bytcode "magic tag" In-Reply-To: References: <5630C11E.9060009@trueblade.com> Message-ID: <5630DB91.1010108@trueblade.com> On 10/28/2015 10:22 AM, Eric Snow wrote: > On Wed, Oct 28, 2015 at 6:35 AM, Eric V. Smith wrote: >> Do I need to do anything else? Unlike what the comment in >> _boostrap_external.py suggests, this "magic tag" will not change every >> time a bytecode is added, but only on every minor release. Which implies >> that if we have a micro release that adds an opcode, we'll in fact break >> the promise in the comment. > > You will want to bump the number on the following line: > https://hg.python.org/cpython/file/default/Lib/importlib/_bootstrap_external.py#l231 Thanks. That part I've done (but forgot to mention). I was just concerned about the "magic tag" part, which Barry cleared up. Eric. From ericsnowcurrently at gmail.com Wed Oct 28 10:22:19 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Wed, 28 Oct 2015 08:22:19 -0600 Subject: [Python-Dev] Bytcode "magic tag" In-Reply-To: <5630C11E.9060009@trueblade.com> References: <5630C11E.9060009@trueblade.com> Message-ID: On Wed, Oct 28, 2015 at 6:35 AM, Eric V. Smith wrote: > In issue 25483 I'm adding an opcode to make f-string formatting more > robust and faster. As part of that, I'm bumping the .pyc magic number. > > While doing that, I notice Lib/importlib/_bootstrap_external.h includes > this comment: > > # Starting with the adoption of PEP 3147 in Python 3.2, every bump in magic > # number also includes a new "magic tag", i.e. a human readable string used > # to represent the magic number in __pycache__ directories. When you change > # the magic number, you must also set a new unique magic tag. Generally > this > # can be named after the Python major version of the magic number bump, but > # it can really be anything, as long as it's different than anything else > # that's come before. The tags are included in the following table, > starting > # with Python 3.2a0. > > The "following table" is a comment, that contains a few references to > the tag "cpython-", specifically cpython-32. 
It doesn't seem > that the tag is routinely updated in the comment. > > sys.implementation.cache_tag returns 'cpython-36', and is in fact > implemented as 'cpython-{PY_MAJOR_VERSION}{PY_MINOR_VERSION}'. > > Do I need to do anything else? Unlike what the comment in > _boostrap_external.py suggests, this "magic tag" will not change every > time a bytecode is added, but only on every minor release. Which implies > that if we have a micro release that adds an opcode, we'll in fact break > the promise in the comment. You will want to bump the number on the following line: https://hg.python.org/cpython/file/default/Lib/importlib/_bootstrap_external.py#l231 -eric > > From my understanding on how this tag is used, this wouldn't be a > problem (because the magic number in the file also changes). But I want > to make sure I'm not misunderstanding something. I think the comment > above is probably just misleading. > > Eric. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com From jab at math.brown.edu Wed Oct 28 11:45:31 2015 From: jab at math.brown.edu (jab at math.brown.edu) Date: Wed, 28 Oct 2015 11:45:31 -0400 Subject: [Python-Dev] Intended Usage of collections.abc for Custom Collections Message-ID: Dear Python-Dev, I am the author of bidict, a bidirectional map implementation for Python. A user recently filed a bug that bidict should be a subclass of dict, so that isinstance(mybidict, dict) would return True. I replied that the user should instead use isinstance(mybidict, collections.abc.Mapping), which does already return True, and is more polymorphic to boot. But before I put the issue to bed, I want to make sure I'm correctly understanding the intended usage of collections.abc, as well as any relevant interfaces I'm not currently using (collections.UserDict? __subclasshook__?), since the documentation leaves me with some doubt. Could any collections experts on this list please confirm whether bidict is implemented as the language intends it should be? Some quick references: https://bidict.readthedocs.org/en/latest/other-bidict-types.html#bidict-type-hierarchy https://github.com/jab/bidict/blob/master/bidict/_bidict.py I would be happy to try to capture what I learn from this thread and write up a guide for collections library authors in the future, or otherwise pay your help forward however I can. Thanks and best wishes. -jab -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericsnowcurrently at gmail.com Wed Oct 28 12:36:51 2015 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Wed, 28 Oct 2015 10:36:51 -0600 Subject: [Python-Dev] Bytcode "magic tag" In-Reply-To: <5630DB91.1010108@trueblade.com> References: <5630C11E.9060009@trueblade.com> <5630DB91.1010108@trueblade.com> Message-ID: On Wed, Oct 28, 2015 at 8:28 AM, Eric V. Smith wrote: > Thanks. That part I've done (but forgot to mention). I was just > concerned about the "magic tag" part, which Barry cleared up. Ah, I misread. :) Yeah, that comment is out of date. 
-eric From brett at python.org Wed Oct 28 13:16:08 2015 From: brett at python.org (Brett Cannon) Date: Wed, 28 Oct 2015 17:16:08 +0000 Subject: [Python-Dev] Intended Usage of collections.abc for Custom Collections In-Reply-To: References: Message-ID: On Wed, 28 Oct 2015 at 08:47 wrote: > Dear Python-Dev, > > I am the author of bidict, a bidirectional map implementation for Python. > A user recently filed a bug that bidict should be a subclass of dict, so > that isinstance(mybidict, dict) would return True. I replied that the user > should instead use isinstance(mybidict, collections.abc.Mapping), which > does already return True, and is more polymorphic to boot. > I would argue that chances are they don't need isinstance() in either case. :) > > But before I put the issue to bed, I want to make sure I'm correctly > understanding the intended usage of collections.abc, as well as any > relevant interfaces I'm not currently using (collections.UserDict? > __subclasshook__?), since the documentation leaves me with some doubt. > Could any collections experts on this list please confirm whether bidict is > implemented as the language intends it should be? > ABCs are meant to make sure you implement key methods for an interface/protocol. So in the case of collections.abc.Mapping, it's to make sure you implement __getitem__. In exchange for subclassing the ABC you also gain some methods for free like get(). So you subclass an ABC because you want your object to be acceptable in any code that expects an object that implements that interface/protocol and you want the help ABCs provide in making sure you don't accidentally miss some key method. Subclassing a concrete implementation of the Mapping ABC -- which is what dict is -- should be done if it is beneficial to you, but not simply to satisfy an isinstance() check. I think the ABC registration is the right thing to do and the user requesting the dict subclass should actually be doing what you suggested and testing for the interface/protocol and not the concrete implementation. And if you want another way to hit this point home, with type hints people should only be expecting abstract types like typing.Mapping as input: https://docs.python.org/3/library/typing.html#typing.Mapping . Restricting yourself to only a dict locks out other completely viable types that implement the mapping interface/protocol. Much like working with data, you should be as flexible as possible on your inputs (e.g., specifying typing.Mapping as the parameter type), but as strict as possible on the return type (.e.g, specifying dict/typing.Dict as the return type). I honestly would want to know why the user cares about an isinstance() check to begin with since they might want to go with a try/except when using the object how they want it to be and erroring out if they get passed an object that doesn't quack like a dict thanks to duck typing. -Brett > > Some quick references: > > https://bidict.readthedocs.org/en/latest/other-bidict-types.html#bidict-type-hierarchy > https://github.com/jab/bidict/blob/master/bidict/_bidict.py > > I would be happy to try to capture what I learn from this thread and write > up a guide for collections library authors in the future, or otherwise pay > your help forward however I can. > > Thanks and best wishes. 
> > -jab > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jab at math.brown.edu Wed Oct 28 14:21:14 2015 From: jab at math.brown.edu (jab at math.brown.edu) Date: Wed, 28 Oct 2015 14:21:14 -0400 Subject: [Python-Dev] Intended Usage of collections.abc for Custom Collections In-Reply-To: References: Message-ID: On Wed, Oct 28, 2015 at 1:16 PM, Brett Cannon wrote: > On Wed, 28 Oct 2015 at 08:47 wrote: > >> Dear Python-Dev, >> >> I am the author of bidict, a bidirectional map implementation for Python. >> A user recently filed a bug that bidict should be a subclass of dict, so >> that isinstance(mybidict, dict) would return True. I replied that the user >> should instead use isinstance(mybidict, collections.abc.Mapping), which >> does already return True, and is more polymorphic to boot. >> > > I would argue that chances are they don't need isinstance() in either > case. :) > > >> But before I put the issue to bed, I want to make sure I'm correctly >> understanding the intended usage of collections.abc, as well as any >> relevant interfaces I'm not currently using (collections.UserDict? >> __subclasshook__?), since the documentation leaves me with some doubt. >> Could any collections experts on this list please confirm whether bidict is >> implemented as the language intends it should be? >> > > ABCs are meant to make sure you implement key methods for an > interface/protocol. So in the case of collections.abc.Mapping, it's to make > sure you implement __getitem__. In exchange for subclassing the ABC you > also gain some methods for free like get(). So you subclass an ABC because > you want your object to be acceptable in any code that expects an object > that implements that interface/protocol and you want the help ABCs provide > in making sure you don't accidentally miss some key method. > > Subclassing a concrete implementation of the Mapping ABC -- which is what > dict is -- should be done if it is beneficial to you, but not simply to > satisfy an isinstance() check. I think the ABC registration is the right > thing to do and the user requesting the dict subclass should actually be > doing what you suggested and testing for the interface/protocol and not the > concrete implementation. > > And if you want another way to hit this point home, with type hints people > should only be expecting abstract types like typing.Mapping as input: > https://docs.python.org/3/library/typing.html#typing.Mapping . > Restricting yourself to only a dict locks out other completely viable types > that implement the mapping interface/protocol. Much like working with data, > you should be as flexible as possible on your inputs (e.g., specifying > typing.Mapping as the parameter type), but as strict as possible on the > return type (.e.g, specifying dict/typing.Dict as the return type). > > I honestly would want to know why the user cares about an isinstance() > check to begin with since they might want to go with a try/except when > using the object how they want it to be and erroring out if they get passed > an object that doesn't quack like a dict thanks to duck typing. > > -Brett > Thanks very much for the thorough and thoughtful reply. 
I'll take this as authoritative approval of the current design, barring any further recommendations to the contrary. As for the isinstance check, it turned out that this wasn't actually in the user's code; the offending code is actually in the pandas library, which he was using. I just submitted a PR there in case anyone is interested: https://github.com/pydata/pandas/pull/11461 Thanks again for making my first experience on this list so positive. -jab -------------- next part -------------- An HTML attachment was scrubbed... URL: From tritium-list at sdamon.com Wed Oct 28 15:05:21 2015 From: tritium-list at sdamon.com (Alexander Walters) Date: Wed, 28 Oct 2015 15:05:21 -0400 Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid? In-Reply-To: <562FC4DE.4000904@mail.de> References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de> <540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de> <56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us> <562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us> <562FC4DE.4000904@mail.de> Message-ID: <56311C71.3050505@sdamon.com> Have you ever used a command line application that --accepted --Boolean --flags? Have you ever found one that required the flags to be in order? You remember how much you hated that application for being so arbitrary about the input? That is exactly how I feel about the order mattering for string prefixes in 2.7. 3.x got rid of that, and that is one of the few undeniably good choices made in python 3. for the love of all that is holy, don't un-fix that. On 10/27/2015 14:39, Sven R. Kunze wrote: > On 26.10.2015 20:54, Ethan Furman wrote: >> You misunderstand -- the order of string prefixes does *not* matter, >> so forcing us to use one is silly. > > I don't think so. > > It's better to have less possibilities in the beginning. So later on, > we can easily relax the restrictions without breaking compatibility. > > Best, > Sven > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com From lac at openend.se Thu Oct 29 11:59:44 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 16:59:44 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen Message-ID: <201510291559.t9TFxi4i003479@fido.openend.se> see the following: lac at smartwheels:~/junk$ echo "print ('hello there')" >string.py lac at smartwheels:~/junk$ idle-python3.5 hello there Traceback (most recent call last): File "", line 1, in File "/usr/lib/python3.5/idlelib/run.py", line 10, in from idlelib import CallTips File "/usr/lib/python3.5/idlelib/CallTips.py", line 16, in from idlelib.HyperParser import HyperParser File "/usr/lib/python3.5/idlelib/HyperParser.py", line 14, in _ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") AttributeError: module 'string' has no attribute 'ascii_letters' IDLE then produces a popup that says: IDLE's subprocess didn't make connection. Either IDLE can't stat a subprocess por personal firewall software is blocking the connection. 
-------- I think that life would be a whole lot easier for people if instead we got a message: Warning: local file /u/lac/junk/string.py shadows module named string in the Standard Library I think that it is python exec that would have to do this -- though of course the popup could also warn about shadowing in general, instead of sending people on wild goose chases over their firewalls. Would this be hard to do? Laura From brett at python.org Thu Oct 29 13:20:31 2015 From: brett at python.org (Brett Cannon) Date: Thu, 29 Oct 2015 17:20:31 +0000 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510291559.t9TFxi4i003479@fido.openend.se> References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On Thu, 29 Oct 2015 at 09:00 Laura Creighton wrote: > > see the following: > lac at smartwheels:~/junk$ echo "print ('hello there')" >string.py > lac at smartwheels:~/junk$ idle-python3.5 > hello there > Traceback (most recent call last): > File "", line 1, in > File "/usr/lib/python3.5/idlelib/run.py", line 10, in > from idlelib import CallTips > File "/usr/lib/python3.5/idlelib/CallTips.py", line 16, in > from idlelib.HyperParser import HyperParser > File "/usr/lib/python3.5/idlelib/HyperParser.py", line 14, in > _ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") > AttributeError: module 'string' has no attribute 'ascii_letters' > > IDLE then produces a popup that says: > > IDLE's subprocess didn't make connection. Either IDLE can't stat a > subprocess por personal firewall software is blocking the connection. > > -------- > > I think that life would be a whole lot easier for people if instead we got > a message: > > Warning: local file /u/lac/junk/string.py shadows module named string in > the > Standard Library > > I think that it is python exec that would have to do this -- though of > course the popup could also warn about shadowing in general, instead of > sending people on wild goose chases over their firewalls. > > Would this be hard to do? > It would require a custom importer or overriding __import__ but it's doable. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Oct 29 13:30:42 2015 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 29 Oct 2015 10:30:42 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On Oct 29, 2015 10:22 AM, "Brett Cannon" wrote: > > > > On Thu, 29 Oct 2015 at 09:00 Laura Creighton wrote: >> >> >> see the following: >> lac at smartwheels:~/junk$ echo "print ('hello there')" >string.py >> lac at smartwheels:~/junk$ idle-python3.5 >> hello there >> Traceback (most recent call last): >> File "", line 1, in >> File "/usr/lib/python3.5/idlelib/run.py", line 10, in >> from idlelib import CallTips >> File "/usr/lib/python3.5/idlelib/CallTips.py", line 16, in >> from idlelib.HyperParser import HyperParser >> File "/usr/lib/python3.5/idlelib/HyperParser.py", line 14, in >> _ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") >> AttributeError: module 'string' has no attribute 'ascii_letters' >> >> IDLE then produces a popup that says: >> >> IDLE's subprocess didn't make connection. Either IDLE can't stat a subprocess por personal firewall software is blocking the connection. 
>> >> -------- >> >> I think that life would be a whole lot easier for people if instead we got >> a message: >> >> Warning: local file /u/lac/junk/string.py shadows module named string in the >> Standard Library >> >> I think that it is python exec that would have to do this -- though of >> course the popup could also warn about shadowing in general, instead of >> sending people on wild goose chases over their firewalls. >> >> Would this be hard to do? > > > It would require a custom importer or overriding __import__ but it's doable. Is there any reason not to issue such warnings by default in the standard importer? The issue of accidentally shadowing stdlib modules isn't restricted to idle, it's difficult for idle to handle correctly (how do you define a custom importer if you don't yet have access to the stdlib?), and it's not like there's any legitimate reason to want string.py in the working directory to auto-monkeypatch stdlib string. (I know saying that last part out loud will probably just cause someone to pop out of the woodwork and explain how shadowing the sys module is a great idea and they do it all the time or whatever, but I guess I'll take that risk :-).) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From donald at stufft.io Thu Oct 29 13:36:00 2015 From: donald at stufft.io (Donald Stufft) Date: Thu, 29 Oct 2015 13:36:00 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On October 29, 2015 at 1:32:31 PM, Nathaniel Smith (njs at pobox.com) wrote: > > (I know saying that last part out loud will probably just cause > someone to pop out of the woodwork and explain how shadowing the > sys module is a great idea and they do it all the time or whatever, > but I guess I'll take that risk :-).) How about someone saying that I wish the standard library was more easily shadow-able? ;) In 2.x it?s easy to do it by accident because of the implicit relative imports which is of course, crummy. I think it?d be nice if a package could override the standard library in a sane way though. Like how pdb++ does. This is already possible if you install stuff as .eggs and the world hasn?t burned down, it?s just not easily possible if you don?t install as eggs. ----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA From p.f.moore at gmail.com Thu Oct 29 14:27:59 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 29 Oct 2015 18:27:59 +0000 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On 29 October 2015 at 17:36, Donald Stufft wrote: > On October 29, 2015 at 1:32:31 PM, Nathaniel Smith (njs at pobox.com) wrote: >> > (I know saying that last part out loud will probably just cause >> someone to pop out of the woodwork and explain how shadowing the >> sys module is a great idea and they do it all the time or whatever, >> but I guess I'll take that risk :-).) > > How about someone saying that I wish the standard library was more easily shadow-able? ;) In 2.x it?s easy to do it by accident because of the implicit relative imports which is of course, crummy. I think it?d be nice if a package could override the standard library in a sane way though. Like how pdb++ does. 
> > This is already possible if you install stuff as .eggs and the world hasn't burned down, it's just not easily possible if you don't install as eggs.

The idle issues seem to me to demonstrate that shadowing the stdlib is a bad idea. Of course, consenting adults, and if you override you're responsible for correctly replacing the functionality, and all that, but honestly, I don't think it needs to be *easy* to shadow the stdlib - there's nothing wrong with it being an "advanced" technique that people have to understand in order to use. And I don't see pdb++ as a good motivating use case. When are you going to find code with pdb calls in, that you can't modify to add "import pdbpp as pdb" at the top, if you want the advanced functionality? (And if you can't modify the code, what's the point of debugging it anyway?)

I think the reason the world hasn't burned down because you can shadow the stdlib with eggs, is because people *don't*. Or to rephrase, because people don't actually need to shadow the stdlib. But I have no hard data to support that contention, so if you do, I'll concede the point (but I don't think there are many packages that *have* to be shipped as eggs these days, if they did they wouldn't work with pip, so doesn't that mean that nobody is using the ability of eggs to shadow the stdlib?)

Paul

From donald at stufft.io Thu Oct 29 14:45:09 2015 From: donald at stufft.io (Donald Stufft) Date: Thu, 29 Oct 2015 14:45:09 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID:

On October 29, 2015 at 2:28:02 PM, Paul Moore (p.f.moore at gmail.com) wrote:
> On 29 October 2015 at 17:36, Donald Stufft wrote:
> > On October 29, 2015 at 1:32:31 PM, Nathaniel Smith (njs at pobox.com) wrote:
> >> > (I know saying that last part out loud will probably just cause
> >> someone to pop out of the woodwork and explain how shadowing the
> >> sys module is a great idea and they do it all the time or whatever,
> >> but I guess I'll take that risk :-).)
> >
> > How about someone saying that I wish the standard library was more easily shadow-able? ;) In 2.x it's easy to do it by accident because of the implicit relative imports which is of course, crummy. I think it'd be nice if a package could override the standard library in a sane way though. Like how pdb++ does.
> >
> > This is already possible if you install stuff as .eggs and the world hasn't burned down, it's just not easily possible if you don't install as eggs.
>
> The idle issues seem to me to demonstrate that shadowing the stdlib is
> a bad idea. Of course, consenting adults, and if you override you're
> responsible for correctly replacing the functionality, and all that,
> but honestly, I don't think it needs to be *easy* to shadow the stdlib
> - there's nothing wrong with it being an "advanced" technique that
> people have to understand in order to use.

I think the idle problem demonstrates that shadowing the stdlib accidentally is a bad idea, and the old implicit relative imports made it extremely trivial to do so. Having `.` first on the path still makes it trivial to do so, but less so than the old implicit relative imports did.

> And I don't see pdb++ as a good motivating use case.
> When are you going to find code with pdb calls in, that you can't modify to add "import pdbpp as pdb" at the top, if you want the advanced functionality? (And if you can't modify the code, what's the point of debugging it anyway?)

Every test runner that includes a --pdb flag that will automatically invoke pdb at the point of failure. If pdb++ didn't force the shadowing of stdlib, then every single test runner would need an option like --pdb++ instead of --pdb (or it would need to unconditionally prefer pdb++ over pdb if you had --pdb). But what if I don't like pdb++ and I want to use some other pdb replacement? Possibly one I'm writing myself? Either I can convince every test runner to support every pdb replacement or I need some sort of central registry where I can insert my pdb as a replacement to the "real" pdb. The sys.path and import system is one such registry that already exists and is (my opinion) the logical choice.

An example close to home for me where it would have been immensely useful is for adding pip to the standard library. I was opposed to doing that because I wanted people to be able to upgrade their pip without having to upgrade their Python (because we've seen how well it worked for distutils). Because there's not a great way to do that, we ended up having to go with ensurepip instead which is more complicated and caused policy problems for a lot of downstream in a way that just adding pip to the standard library wouldn't have. Instead we could have simply added pip to the standard library, and then upgrading pip would have dropped it into site-packages and that could have taken precedence, allowing people to upgrade easily.

> I think the reason the world hasn't burned down because you can shadow
> the stdlib with eggs, is because people *don't*. Or to rephrase,
> because people don't actually need to shadow the stdlib. But I have no
> hard data to support that contention, so if you do, I'll concede the
> point (but I don't think there are many packages that *have* to be
> shipped as eggs these days, if they did they wouldn't work with pip,
> so doesn't that mean that nobody is using the ability of eggs to
> shadow the stdlib?)

People route around technical limitations. In pdb++'s case they used to break the ability of pip to prevent installation of eggs by munging sys.argv to remove the --single-version-externally-managed option. Now they install a .pth hack to force it to the front of sys.path. Other examples of ways people route around this include monkey patching (either explicitly invoked or implicitly via shenanigans); an example of this is setuptools monkeypatching distutils on import (which pip uses to override what distutils is). Another example is a more explicit shadowing which is done by having some slightly differently named thing available and expecting all users to do something like try: import unittest2 as unittest except ImportError: import unittest.
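The ".pth hack" mentioned above works because site.py executes any line in a .pth file that begins with "import"; a minimal sketch of the mechanism (the file name and target directory are invented, and this is not necessarily what pdb++ actually ships):

    # contents of e.g. zzz_shadow.pth, dropped into site-packages at install time;
    # site.py exec's this line at interpreter startup because it starts with "import":
    import sys; sys.path.insert(0, '/path/to/a/directory/containing/pdb.py')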
So I don't think it's true that people don't shadow the standard library, they just have various ways to do it that have several gotchas and require people to generally hack around the limitation.

----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

-------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 836 bytes Desc: Message signed with OpenPGP using AMPGpg URL:

From lac at openend.se Thu Oct 29 14:46:31 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 19:46:31 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: <201510291846.t9TIkV2r015999@fido.openend.se>

In a message of Thu, 29 Oct 2015 18:27:59 +0000, Paul Moore writes:
>The idle issues seem to me to demonstrate that shadowing the stdlib is
>a bad idea. Of course, consenting adults, and if you override you're
>responsible for correctly replacing the functionality, and all that,
>but honestly, I don't think it needs to be *easy* to shadow the stdlib
>- there's nothing wrong with it being an "advanced" technique that
>people have to understand in order to use.

I am actually sick of the 'consenting adults' argument. I am dealing with '11 year old children trying to write their first, third and tenth python programs'. For the life of me I cannot see how convenience for the sort of person who has a legitimate reason to shadow the syslib should get a higher priority over these mites who are doing their damndest to write python despite natural language barriers and the fact that their peers and parents think they are nuts to want to do so.

(a grumpy comment from a teacher at a Swedish 'coding for kids' club. Disregard if too grumpy.)

Laura

From p.f.moore at gmail.com Thu Oct 29 15:13:08 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 29 Oct 2015 19:13:08 +0000 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510291846.t9TIkV2r015999@fido.openend.se> References: <201510291559.t9TFxi4i003479@fido.openend.se> <201510291846.t9TIkV2r015999@fido.openend.se> Message-ID:

On 29 October 2015 at 18:46, Laura Creighton wrote:
> In a message of Thu, 29 Oct 2015 18:27:59 +0000, Paul Moore writes:
>>The idle issues seem to me to demonstrate that shadowing the stdlib is
>>a bad idea. Of course, consenting adults, and if you override you're
>>responsible for correctly replacing the functionality, and all that,
>>but honestly, I don't think it needs to be *easy* to shadow the stdlib
>>- there's nothing wrong with it being an "advanced" technique that
>>people have to understand in order to use.
>
> I am actually sick of the 'consenting adults' argument.
> I am dealing with '11 year old children trying to write their
> first, third and tenth python programs'. For the life of me
> I cannot see how convenience for the sort of person who has a
> legitimate reason to shadow the syslib should get a higher priority
> over these mites who are doing their damndest to write python
> despite natural language barriers and the fact that their peers
> and parents think they are nuts to want to do so.

That's actually a very good point, and I agree totally.
To my mind, the point about "consenting adults" (and when I referred to that I was anticipating others using that argument, not proposing it myself) is that we don't *prevent* people from doing weird and wonderful things. But conversely, it's not a reason for making it *easy* to do such things. Quite the opposite - a "consenting adult" should be assumed to be capable of writing an import hook, or manipulating sys.path, or whatever. Paul From njs at pobox.com Thu Oct 29 15:21:12 2015 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 29 Oct 2015 12:21:12 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On Oct 29, 2015 11:45 AM, "Donald Stufft" wrote: > > Every test runner that includes a =E2=80=94pdb flag that will automatical= > ly invoke pdb at the point of failure. If pdb++ didn=E2=80=99t force the = > shadowing of stdlib, then every single test runner would need an option l= > ike =E2=80=94pdb++ instead of =E2=80=94pdb (or it would need to unconditi= > onally prefer pdb++ over pdb if you had =E2=80=94pdb). But what if I don=E2= > =80=99t like pdb++ and I want to use some other pdb replacement=3F Possib= > ly one I=E2=80=99m writing myself=3F Either I can convince every test run= > ner to support every pdb replacement or I need some sort of central regis= > try where I can insert my pdb as a replacement to the =E2=80=9Creal=E2=80= > =9D pdb. The sys.path and import system is one such registry that already= > exists and is (my opinion) the logical choice. This strikes me as an argument for adding some sort of explicit plugin support to pdb, or defining a standard interface that test tools could use to query for installed debuggers. (Presumably via some entry point that alternative debugger packages would export?) In any case it's orthogonal to the issue of accidental shadowing. It seems unlikely that at this late date we'll be swapping the order of sys.path so site-packages comes before the stdlib, and I doubt there are a lot of people who are using pdb++ via the mechanism of making sure that it's always present in their working directory... -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From lac at openend.se Thu Oct 29 15:23:42 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 20:23:42 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> <201510291846.t9TIkV2r015999@fido.openend.se> Message-ID: <201510291923.t9TJNgw0018921@fido.openend.se> In a message of Thu, 29 Oct 2015 19:13:08 +0000, Paul Moore writes: >> I am actually sick of the 'consenting adults' argument. >> I am dealing with '11 year old children trying to write their >> first, third and tenth python programs'. For the life of me >> I cannot see how convenience for the sort of person who has a >> legitimate reason to shadow the syslib should get a higher priority >> over these mites who are doing their damndest to write python >> despite natural language barriers and the fact that their peers >> and parents think they are nuts to want to do so. > >That's actually a very good point, and I agree totally. 
To my mind, >the point about "consenting adults" (and when I referred to that I was >anticipating others using that argument, not proposing it myself) is >that we don't *prevent* people from doing weird and wonderful things. >But conversely, it's not a reason for making it *easy* to do such >things. Quite the opposite - a "consenting adult" should be assumed to >be capable of writing an import hook, or manipulating sys.path, or >whatever. > >Paul Hmmm, I think the set of 'consenting adults who cannot write an import hook' is rather large. But all I am asking for is a warning -- and it would be good if Idle noticed the warning and before it fell over dead with its message of firewalls it would mention the warning again, as a problem to look for. It will bugger up doctests for people who legitimately shadow the stdlib and now get a new warning. Anybody else be seriously inconvenienced if we do this? I cannot think of any, but then legitimately shadowing the stdlib in not on the list of things I have done. Perhaps Dstufft has ideas on this line. Laura From p.f.moore at gmail.com Thu Oct 29 15:30:09 2015 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 29 Oct 2015 19:30:09 +0000 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On 29 October 2015 at 18:45, Donald Stufft wrote: > So I don=E2=80=99t think it=E2=80=99s true that people don=E2=80=99t shad= > ow the standard library, they just have various ways to do it that have s= > everal gotchas and require people to generally hack around the limitation= > .=C2=A0 (Your mailer or mine seems to have gone weird with encoding...) There's a difference between the opting into using (say) pdb++ at runtime, and at install time. The problem with the sort of shadowing eggs did (and people try to emulate) is that simply *installing* a package changes the behaviour of programs that don't use that package, by virtue of the fact that it's there at all. I'd much rather pdb++ or anything similar allowed the user to opt in on demand. For example, it could put a file pdb.py containing "from pdbpp import *" into a subdirectory, and direct users who want to replace pdb with pdbpp to add that directory to PYTHONPATH. Just don't do that at install time, leaving the user needing to uninstall pdbpp to opt out of the behaviour. As I said, I'm not against people being *able* to shadow the stdlib, I just don't want to see it being *easy*, nor do I want it to be the norm. Paul PS My experience with a similar case is pyreadline on Windows, which hooks into the stdlib readline functionality when installed. That's a PITA, because mostly I don't want pyreadline (I prefer the default Windows command line editing, despite it having less functionality) but I want pyreadline when using ipython. There's no way for me to get that behaviour because (py)readline is enabled by installing it, not by user choice at runtime. 
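A concrete sketch of the opt-in layout Paul describes (the directory name is invented; "from pdbpp import *" is the line he suggests):

    import os

    shadow_dir = os.path.expanduser('~/pdb-shadow')
    os.makedirs(shadow_dir, exist_ok=True)
    with open(os.path.join(shadow_dir, 'pdb.py'), 'w') as f:
        f.write('from pdbpp import *\n')

    # Opt in only when you want the replacement, e.g.:
    #   PYTHONPATH=~/pdb-shadow python -m pytest --pdb
    # Without PYTHONPATH set, the stdlib pdb is used as normal.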
From lac at openend.se Thu Oct 29 15:33:59 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 20:33:59 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: <201510291933.t9TJXxpM019701@fido.openend.se> In a message of Thu, 29 Oct 2015 19:30:09 +0000, Paul Moore writes: >On 29 October 2015 at 18:45, Donald Stufft wrote: >> So I don=E2=80=99t think it=E2=80=99s true that people don=E2=80=99t shad= >> ow the standard library, they just have various ways to do it that have s= >> everal gotchas and require people to generally hack around the limitation= >> .=C2=A0 > >(Your mailer or mine seems to have gone weird with encoding...) Dstuffts. I see this problem too Laura From bussonniermatthias at gmail.com Thu Oct 29 15:07:44 2015 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Thu, 29 Oct 2015 12:07:44 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510291846.t9TIkV2r015999@fido.openend.se> References: <201510291559.t9TFxi4i003479@fido.openend.se> <201510291846.t9TIkV2r015999@fido.openend.se> Message-ID: <2C1EEDFE-927A-4F86-B6BB-52DBC54722F7@gmail.com> > On Oct 29, 2015, at 11:46, Laura Creighton wrote: > > In a message of Thu, 29 Oct 2015 18:27:59 +0000, Paul Moore writes: >> The idle issues seem to me to demonstrate that shadowing the stdlib is >> a bad idea. Of course, consenting adults, and if you override you're >> responsible for correctly replacing the functionality, and all that, >> but honestly, I don't think it needs to be *easy* to shadow the stdlib >> - there's nothing wrong with it being an "advanced" technique that >> people have to understand in order to use. > > I am actually sick of the 'consenting adults' argument. > I am dealing with '11 year old children trying to write their > first, third and tenth python programs'. For the life of me > I cannot see how convenience for the sort of person who has a > legitimate reason to shadow the syslib should get a higher priority > over these mites who are doing their damndest to write python > despite natural language barriers and the fact that their peers > and parents think they are nuts to want to do so. > > (a grumpy comment from a teacher at a Swedish 'coding for > kids' club. Disregard if too grumpy.) +1 on **warning**, warning still allow people to shadow stdlib, and for people who have **legitimate** reasons to shadow, we can always find a solution to to tag a module as ?yes I know I am shadowing, I am doing that on purpose?. StdlibShadowWarning and warning filter ? Also a warning would be useful for people to discover that some Stdlib modules exist, and maybe explore them. -- M From mark at markroseman.com Thu Oct 29 16:26:08 2015 From: mark at markroseman.com (Mark Roseman) Date: Thu, 29 Oct 2015 13:26:08 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen Message-ID: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> Laura, I think what you want should actually be more-or-less doable in IDLE. The main routine that starts IDLE should be able to detect if it starts correctly (something unlikely to happen if a significant stdlib module is shadowed), watch for an attribute error of that form and try to determine if shadowing is the cause, and if so, reissue a saner error message. 
The subprocess/firewall error is occurring because the ?string? problem in your example isn?t being hit right away so a few startup things already are happening. The point where we?re showing that error (as a result of a timeout) should be able to check as per the above that IDLE was able to start alright, and if not, change or ignore the timeout error. There?ll probably be some cases (depending on exactly what gets shadowed) that may be difficult to get to work, but it should be able to handle most things. Mark From lac at openend.se Thu Oct 29 16:35:36 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 21:35:36 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> Message-ID: <201510292035.t9TKZa0p024287@fido.openend.se> In a message of Thu, 29 Oct 2015 13:26:08 -0700, Mark Roseman writes: >Laura, I think what you want should actually be more-or-less doable in IDLE. > >The main routine that starts IDLE should be able to detect if it starts correctly (something unlikely to happen if a significant stdlib module is shadowed), watch for an attribute error of that form and try to determine if shadowing is the cause, and if so, reissue a saner error message. > >The subprocess/firewall error is occurring because the ?string? problem in your example isn?t being hit right away so a few startup things already are happening. The point where we?re showing that error (as a result of a timeout) should be able to check as per the above that IDLE was able to start alright, and if not, change or ignore the timeout error. > >There?ll probably be some cases (depending on exactly what gets shadowed) that may be difficult to get to work, but it should be able to handle most things. > >Mark Mark, how splendid. Need I submit a bug report/feature request to get this happening? Very, very pleased to have mentioned it ... Laura From v+python at g.nevcal.com Thu Oct 29 16:33:48 2015 From: v+python at g.nevcal.com (Glenn Linderman) Date: Thu, 29 Oct 2015 13:33:48 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: <563282AC.1050102@g.nevcal.com> On 10/29/2015 12:30 PM, Paul Moore wrote: > On 29 October 2015 at 18:45, Donald Stufft wrote: >> So I don=E2=80=99t think it=E2=80=99s true that people don=E2=80=99t shad= >> ow the standard library, they just have various ways to do it that have s= >> everal gotchas and require people to generally hack around the limitation= >> .=C2=A0 > (Your mailer or mine seems to have gone weird with encoding...) > > There's a difference between the opting into using (say) pdb++ at > runtime, and at install time. The problem with the sort of shadowing > eggs did (and people try to emulate) is that simply *installing* a > package changes the behaviour of programs that don't use that package, > by virtue of the fact that it's there at all. > > I'd much rather pdb++ or anything similar allowed the user to opt in > on demand. For example, it could put a file pdb.py containing "from > pdbpp import *" into a subdirectory, and direct users who want to > replace pdb with pdbpp to add that directory to PYTHONPATH. Just don't > do that at install time, leaving the user needing to uninstall pdbpp > to opt out of the behaviour. 
> > As I said, I'm not against people being *able* to shadow the stdlib, I > just don't want to see it being *easy*, nor do I want it to be the > norm. > > Paul > > PS My experience with a similar case is pyreadline on Windows, which > hooks into the stdlib readline functionality when installed. That's a > PITA, because mostly I don't want pyreadline (I prefer the default > Windows command line editing, despite it having less functionality) > but I want pyreadline when using ipython. There's no way for me to get > that behaviour because (py)readline is enabled by installing it, not > by user choice at runtime. +1 Although, I wouldn't mind if it were easy & documented, like say having both shadow-packages before stdlib and site-packages after it, so that multiple, super-tricky and annoying ways of doing the shadowing were not needed. But, say, packages shadow-packages would only be used if there were an extra directive somewhere to enable it, or a subset of the packages that reside there. Glenn -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Thu Oct 29 16:50:30 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 29 Oct 2015 15:50:30 -0500 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> Message-ID: <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> Why not just check the path of the imported modules and compare it with the Python library directory? On October 29, 2015 3:26:08 PM CDT, Mark Roseman wrote: >Laura, I think what you want should actually be more-or-less doable in >IDLE. > >The main routine that starts IDLE should be able to detect if it starts >correctly (something unlikely to happen if a significant stdlib module >is shadowed), watch for an attribute error of that form and try to >determine if shadowing is the cause, and if so, reissue a saner error >message. > >The subprocess/firewall error is occurring because the ?string? problem >in your example isn?t being hit right away so a few startup things >already are happening. The point where we?re showing that error (as a >result of a timeout) should be able to check as per the above that IDLE >was able to start alright, and if not, change or ignore the timeout >error. > >There?ll probably be some cases (depending on exactly what gets >shadowed) that may be difficult to get to work, but it should be able >to handle most things. > >Mark > >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mark at markroseman.com Thu Oct 29 16:53:06 2015 From: mark at markroseman.com (Mark Roseman) Date: Thu, 29 Oct 2015 13:53:06 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510292035.t9TKZa0p024287@fido.openend.se> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <201510292035.t9TKZa0p024287@fido.openend.se> Message-ID: > Need I submit a bug report/feature request to get this happening? > Very, very pleased to have mentioned it ? I took care of the bug report. 
Glad you mentioned it. My personal opinion is that many of the error messages in IDLE (to say nothing of large parts of the overall user experience) are a bit of a horror show, but it?s slowly fixable! :-) Don?t hesitate to share other snags, as the situation you?re in is what I?d like to see much better supported. Mark From lac at openend.se Thu Oct 29 17:18:46 2015 From: lac at openend.se (Laura Creighton) Date: Thu, 29 Oct 2015 22:18:46 +0100 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> Message-ID: <201510292118.t9TLIk8b027422@fido.openend.se> In a message of Thu, 29 Oct 2015 15:50:30 -0500, Ryan Gonzalez writes: >Why not just check the path of the imported modules and compare it with the Python library directory? My friend ?sa who is 12 years old suggested exactly this at the club. If this works then I will be certain to mention this to her. I said that I would ask 'how hard is this?' Laura From rymg19 at gmail.com Thu Oct 29 18:16:56 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 29 Oct 2015 17:16:56 -0500 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510292118.t9TLIk8b027422@fido.openend.se> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> <201510292118.t9TLIk8b027422@fido.openend.se> Message-ID: Well, tell your friend that that means middle and high schoolers must think alike! :D On Thu, Oct 29, 2015 at 4:18 PM, Laura Creighton wrote: > In a message of Thu, 29 Oct 2015 15:50:30 -0500, Ryan Gonzalez writes: > >Why not just check the path of the imported modules and compare it with > the Python library directory? > > My friend ?sa who is 12 years old suggested exactly this at the club. If > this > works then I will be certain to mention this to her. I said that I would > ask 'how hard is this?' > > Laura > -- Ryan [ERROR]: Your autotools build scripts are 200 lines longer than your program. Something?s wrong. http://kirbyfan64.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Oct 29 19:56:38 2015 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 29 Oct 2015 16:56:38 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> Message-ID: On Thu, Oct 29, 2015 at 1:50 PM, Ryan Gonzalez wrote: > Why not just check the path of the imported modules and compare it with the > Python library directory? It works, but it requires that everyone who could run into this problem carefully add some extra guard code to every stdlib import statement, and in practice nobody will (or at least, not until after they've already gotten bitten by this at least once... at which point they no longer need it). Given that AFAICT there's no reason this couldn't be part of the default import system's functionality and "just work" for everyone, if I were going to spend time on trying to fix this I'd probably target that :-). 
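A very rough sketch of a helper such a default check could use to decide whether a name belongs to the stdlib (it probes the stdlib directory on disk, ignores extension modules in lib-dynload, and is certainly not the efficient version a real implementation would want):

    import os
    import sys
    import sysconfig

    _STDLIB_DIR = sysconfig.get_path('stdlib')

    def is_stdlib_name(name):
        # Built-in modules (sys, builtins, ...) never live on the filesystem.
        if name in sys.builtin_module_names:
            return True
        base = os.path.join(_STDLIB_DIR, name)
        return os.path.isdir(base) or os.path.exists(base + '.py')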
(I guess the trickiest bit would be to find an efficient and maintainable way to check whether a given package name is present in the stdlib.) -n -- Nathaniel J. Smith -- http://vorpus.org From rdmurray at bitdance.com Thu Oct 29 20:06:51 2015 From: rdmurray at bitdance.com (R. David Murray) Date: Thu, 29 Oct 2015 20:06:51 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> Message-ID: <20151030000651.E4DCFB1408D@webabinitio.net> On Thu, 29 Oct 2015 16:56:38 -0700, Nathaniel Smith wrote: > On Thu, Oct 29, 2015 at 1:50 PM, Ryan Gonzalez wrote: > > Why not just check the path of the imported modules and compare it with the > > Python library directory? > > It works, but it requires that everyone who could run into this > problem carefully add some extra guard code to every stdlib import > statement, and in practice nobody will (or at least, not until after > they've already gotten bitten by this at least once... at which point > they no longer need it). > > Given that AFAICT there's no reason this couldn't be part of the > default import system's functionality and "just work" for everyone, if > I were going to spend time on trying to fix this I'd probably target > that :-). > > (I guess the trickiest bit would be to find an efficient and > maintainable way to check whether a given package name is present in > the stdlib.) For Idle, though, it sounds like a very viable strategy, and that's what Laura is concerned about. --David From rymg19 at gmail.com Thu Oct 29 21:00:43 2015 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 29 Oct 2015 20:00:43 -0500 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <20151030000651.E4DCFB1408D@webabinitio.net> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> <20151030000651.E4DCFB1408D@webabinitio.net> Message-ID: <953CA20E-FCE0-4750-89E6-06BB1F7AE2B0@gmail.com> Well, this works on Python 2 only (I'm on a phone with only access to 2 right now), but this is a start (apologies for the screwy syntax highlighting): import sys, imp, logging, os modules = ''' imp string ... '''.split() class StdlibTester(object): base = os.path.dirname(os.__file__) # hack; is there a better way to do this? def find_module(self, fullname, path=None): if fullname in modules: self.path = path return self return None def load_module(self, name): if name in sys.modules: return sys.modules[name] module_info = imp.find_module(name, self.path) module = imp.load_module(name, *module_info) sys.modules[name] = module if hasattr(module, '__file__') and not os.path.dirname(module.__file__).startswith(self.base): logging.warning('stdlib module %s was overriden', name) return module sys.meta_path.append(StdlibTester()) import string On October 29, 2015 7:06:51 PM CDT, "R. David Murray" wrote: >On Thu, 29 Oct 2015 16:56:38 -0700, Nathaniel Smith >wrote: >> On Thu, Oct 29, 2015 at 1:50 PM, Ryan Gonzalez >wrote: >> > Why not just check the path of the imported modules and compare it >with the >> > Python library directory? 
>> >> It works, but it requires that everyone who could run into this >> problem carefully add some extra guard code to every stdlib import >> statement, and in practice nobody will (or at least, not until after >> they've already gotten bitten by this at least once... at which point >> they no longer need it). >> >> Given that AFAICT there's no reason this couldn't be part of the >> default import system's functionality and "just work" for everyone, >if >> I were going to spend time on trying to fix this I'd probably target >> that :-). >> >> (I guess the trickiest bit would be to find an efficient and >> maintainable way to check whether a given package name is present in >> the stdlib.) > >For Idle, though, it sounds like a very viable strategy, and that's >what Laura is concerned about. > >--David >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Thu Oct 29 21:17:25 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 29 Oct 2015 21:17:25 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <201510292035.t9TKZa0p024287@fido.openend.se> Message-ID: On 10/29/2015 4:53 PM, Mark Roseman wrote: >> Need I submit a bug report/feature request to get this happening? >> Very, very pleased to have mentioned it ? > > I took care of the bug report. The idle issue is https://bugs.python.org/issue25514 As I said there, I think that removing '' from sys.path, so that IDLE can run, is better than a nicer warning that it cannot run. This, of course, requires that sys not be shadowed, so that sys.path can be accessed. -- Terry Jan Reedy From tjreedy at udel.edu Thu Oct 29 21:22:15 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 29 Oct 2015 21:22:15 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510291559.t9TFxi4i003479@fido.openend.se> References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On 10/29/2015 11:59 AM, Laura Creighton wrote: > > see the following: > lac at smartwheels:~/junk$ echo "print ('hello there')" >string.py > lac at smartwheels:~/junk$ idle-python3.5 > hello there > Traceback (most recent call last): > File "", line 1, in > File "/usr/lib/python3.5/idlelib/run.py", line 10, in > from idlelib import CallTips > File "/usr/lib/python3.5/idlelib/CallTips.py", line 16, in > from idlelib.HyperParser import HyperParser > File "/usr/lib/python3.5/idlelib/HyperParser.py", line 14, in > _ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") > AttributeError: module 'string' has no attribute 'ascii_letters' > > IDLE then produces a popup that says: > > IDLE's subprocess didn't make connection. Either IDLE can't stat a subprocess por personal firewall software is blocking the connection. 
> > -------- > > I think that life would be a whole lot easier for people if instead we got > a message: > > Warning: local file /u/lac/junk/string.py shadows module named string in the > Standard Library > > I think that it is python exec that would have to do this -- though of > course the popup could also warn about shadowing in general, instead of > sending people on wild goose chases over their firewalls. > > Would this be hard to do? Leaving IDLE aside, the reason '' is added to sys.path is so that people can import their own modules. This is very useful. Shadowing is the result of putting it at the front. I have long thought this a dubious choice. If '' were instead appended, people could still import modules that did not duplicate stdlib names. Anyone who wanted shadowing could move '' to the front. But then shadowing would be intentional, not an accident. -- Terry Jan Reedy From tjreedy at udel.edu Thu Oct 29 22:18:29 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 29 Oct 2015 22:18:29 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: <201510292118.t9TLIk8b027422@fido.openend.se> References: <1F1586DD-6864-4896-AF7E-FF448C79CC4C@markroseman.com> <6FC405D4-56AC-4ECF-8929-0D71507FA489@gmail.com> <201510292118.t9TLIk8b027422@fido.openend.se> Message-ID: On 10/29/2015 5:18 PM, Laura Creighton wrote: > In a message of Thu, 29 Oct 2015 15:50:30 -0500, Ryan Gonzalez writes: >> Why not just check the path of the imported modules and compare it with the Python library directory? > > My friend ?sa who is 12 years old suggested exactly this at the club. This was my first idea, until I realized that it would be even better to avoid shadowing in the first place. > If this works then I will be certain to mention this to her. As far as I can tell, comparison in not foolproof, even if done carefully. This is a proper stdlib import. >>> import string >>> string.__file__ 'C:\\Programs\\Python35\\lib\\string.py' If we look at suffixes, the only part guaranteed, after changing Windows' '\\' to '/', is '/lib/string.py'. Now suppose someone runs python in another 'lib' directory containing string.py. >>> import string >>> string.__file__ 'C:\\Users\\Terry\\lib\\string.py' Same suffix. Let's try prefixes. >>> import os.path >>> import sys >>> os.path.dirname(string.__file__) in sys.path False This is True for the stdlib import. Hooray. But this requires more imports, which also might be shadowed. Having '' at the front of sys.path is a real nuisance when one wants to guaranteed authentic stdlib imports. -- Terry Jan Reedy From storchaka at gmail.com Fri Oct 30 03:21:26 2015 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 30 Oct 2015 09:21:26 +0200 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On 30.10.15 03:22, Terry Reedy wrote: > Leaving IDLE aside, the reason '' is added to sys.path is so that people > can import their own modules. This is very useful. Shadowing is the > result of putting it at the front. I have long thought this a dubious > choice. If '' were instead appended, people could still import modules > that did not duplicate stdlib names. Anyone who wanted shadowing could > move '' to the front. But then shadowing would be intentional, not an > accident. LGTM. AFAIK the sys module can't be shadowed. 
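A sketch of the reordering Terry suggests, runnable from user code today (it assumes the interactive or -c case, where sys.path[0] is the empty string):

    import sys

    # Demote the current-directory entry so the stdlib wins name clashes;
    # anyone who really wants shadowing can move '' back to the front.
    if sys.path and sys.path[0] == '':
        sys.path.append(sys.path.pop(0))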
From njs at pobox.com Fri Oct 30 03:57:13 2015 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 30 Oct 2015 00:57:13 -0700 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On Oct 29, 2015 6:26 PM, "Terry Reedy" wrote: > > On 10/29/2015 11:59 AM, Laura Creighton wrote: >> >> >> see the following: >> lac at smartwheels:~/junk$ echo "print ('hello there')" >string.py >> lac at smartwheels:~/junk$ idle-python3.5 >> hello there >> Traceback (most recent call last): >> File "", line 1, in >> File "/usr/lib/python3.5/idlelib/run.py", line 10, in >> from idlelib import CallTips >> File "/usr/lib/python3.5/idlelib/CallTips.py", line 16, in >> from idlelib.HyperParser import HyperParser >> File "/usr/lib/python3.5/idlelib/HyperParser.py", line 14, in >> _ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") >> AttributeError: module 'string' has no attribute 'ascii_letters' >> >> IDLE then produces a popup that says: >> >> IDLE's subprocess didn't make connection. Either IDLE can't stat a subprocess por personal firewall software is blocking the connection. >> >> -------- >> >> I think that life would be a whole lot easier for people if instead we got >> a message: >> >> Warning: local file /u/lac/junk/string.py shadows module named string in the >> Standard Library >> >> I think that it is python exec that would have to do this -- though of >> course the popup could also warn about shadowing in general, instead of >> sending people on wild goose chases over their firewalls. >> >> Would this be hard to do? > > > Leaving IDLE aside, the reason '' is added to sys.path is so that people can import their own modules. This is very useful. Shadowing is the result of putting it at the front. I have long thought this a dubious choice. If '' were instead appended, people could still import modules that did not duplicate stdlib names. Anyone who wanted shadowing could move '' to the front. But then shadowing would be intentional, not an accident. Unfortunately I think that (among other things) there are a lot of scripts out there that blindly do sys.path.pop(0) to remove the "" entry, so the backcompat costs of changing this would probably be catastrophic. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From status at bugs.python.org Fri Oct 30 13:08:23 2015 From: status at bugs.python.org (Python tracker) Date: Fri, 30 Oct 2015 18:08:23 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20151030170823.216DB5619B@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-10-23 - 2015-10-30) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 5204 (+23) closed 32073 (+30) total 37277 (+53) Open issues with patches: 2298 Issues opened (37) ================== #25467: Put ???deprecated??? warnings first http://bugs.python.org/issue25467 opened by Tony R. 
#25468: PyTraceBack_Print()/_Py_DisplaySourceLine() should take custom
     http://bugs.python.org/issue25468  opened by shiz
#25469: multiprocessing .Condition.notify(_all) function has O(N) time
     http://bugs.python.org/issue25469  opened by vilnis.termanis
#25470: Random Malloc error raised
     http://bugs.python.org/issue25470  opened by augustin rieunier
#25472: Typing: Specialized subclasses of generics cannot be unpickled
     http://bugs.python.org/issue25472  opened by maatt
#25474: Weird behavior when setting f_trace in a context manager
     http://bugs.python.org/issue25474  opened by Fred Gansevles
#25476: close() behavior on non-blocking BufferedIO objects with socke
     http://bugs.python.org/issue25476  opened by dabeaz
#25477: text mode for pkgutil.get_data
     http://bugs.python.org/issue25477  opened by Antony.Lee
#25478: Consider adding a normalize() method to collections.Counter()
     http://bugs.python.org/issue25478  opened by rhettinger
#25479: Increase unit test coverage for abc.py
     http://bugs.python.org/issue25479  opened by szymon
#25481: PermissionError in subprocess.check_output() when an inaccessi
     http://bugs.python.org/issue25481  opened by jaystrict
#25482: signal.set_wakeup_fd() doesn't work if the signal don't have h
     http://bugs.python.org/issue25482  opened by ishimoto
#25483: Improve f-string implementation: FORMAT_VALUE opcode
     http://bugs.python.org/issue25483  opened by eric.smith
#25485: Add a context manager to telnetlib.Telnet
     http://bugs.python.org/issue25485  opened by desbma
#25486: Resurrect inspect.getargspec() in 3.6
     http://bugs.python.org/issue25486  opened by yselivanov
#25487: imp module DeprecationWarning in test suite
     http://bugs.python.org/issue25487  opened by martin.panter
#25488: IDLE: Remove idlelib from sys.path when added
     http://bugs.python.org/issue25488  opened by terry.reedy
#25489: sys.exit() caught in exception handler
     http://bugs.python.org/issue25489  opened by jinty
#25491: ftplib.sendcmd only accepts string
     http://bugs.python.org/issue25491  opened by wozlaf
#25492: subprocess with redirection fails after FreeConsole
     http://bugs.python.org/issue25492  opened by George Prekas
#25493: warnings.warn: wrong stacklevel causes import of local file "s
     http://bugs.python.org/issue25493  opened by bevan
#25495: binascii documentation incorrect
     http://bugs.python.org/issue25495  opened by mouse07410
#25496: tarfile: Default value for compresslevel is not documented
     http://bugs.python.org/issue25496  opened by Sworddragon
#25497: Rewrite test_robotparser
     http://bugs.python.org/issue25497  opened by berker.peksag
#25498: Python 3.4.3 core dump with simple sample code
     http://bugs.python.org/issue25498  opened by JakeMont
#25500: docs claim __import__ checked for in globals, but IMPORT_NAME
     http://bugs.python.org/issue25500  opened by superbobry
#25505: undefined name 'window' in Tools/scripts/fixdiv.py
     http://bugs.python.org/issue25505  opened by John.Mark.Vandenberg
#25507: IDLE: user code 'import tkinter; tkinter.font' should fail
     http://bugs.python.org/issue25507  opened by terry.reedy
#25508: LogRecord attributes are not tuple, when logging only dict
     http://bugs.python.org/issue25508  opened by sejvlond
#25509: PyImport_ImportModule inaccurately described
     http://bugs.python.org/issue25509  opened by memeplex
#25510: fileinput.FileInput.readline() always returns str object at th
     http://bugs.python.org/issue25510  opened by Ryosuke Ito
#25511: multiprocessing pool blocks SIGTERM from being handled
     http://bugs.python.org/issue25511  opened by djones
#25514: better startup error messages in IDLE when stdlib modules shad
     http://bugs.python.org/issue25514  opened by markroseman
#25516: threading.Condition._is_owned() is wrong when using threading.
     http://bugs.python.org/issue25516  opened by nirs
#25517: regex howto example in "Lookahead Assertions"
     http://bugs.python.org/issue25517  opened by Pavel
#25518: Investigate implementation of PyOS_CheckStack on OSX
     http://bugs.python.org/issue25518  opened by ronaldoussoren
#25519: Minor difflib documentation bug
     http://bugs.python.org/issue25519  opened by lurchman


Most recent 15 issues with no replies (15)
==========================================

#25519: Minor difflib documentation bug
     http://bugs.python.org/issue25519
#25518: Investigate implementation of PyOS_CheckStack on OSX
     http://bugs.python.org/issue25518
#25511: multiprocessing pool blocks SIGTERM from being handled
     http://bugs.python.org/issue25511
#25505: undefined name 'window' in Tools/scripts/fixdiv.py
     http://bugs.python.org/issue25505
#25497: Rewrite test_robotparser
     http://bugs.python.org/issue25497
#25491: ftplib.sendcmd only accepts string
     http://bugs.python.org/issue25491
#25487: imp module DeprecationWarning in test suite
     http://bugs.python.org/issue25487
#25486: Resurrect inspect.getargspec() in 3.6
     http://bugs.python.org/issue25486
#25479: Increase unit test coverage for abc.py
     http://bugs.python.org/issue25479
#25477: text mode for pkgutil.get_data
     http://bugs.python.org/issue25477
#25474: Weird behavior when setting f_trace in a context manager
     http://bugs.python.org/issue25474
#25468: PyTraceBack_Print()/_Py_DisplaySourceLine() should take custom
     http://bugs.python.org/issue25468
#25462: Avoid repeated hash calculation in C implementation of Ordered
     http://bugs.python.org/issue25462
#25458: ftplib: command response shift - mismatch
     http://bugs.python.org/issue25458
#25455: Some repr implementations don't check for self-referential str
     http://bugs.python.org/issue25455


Most recent 15 issues waiting for review (15)
=============================================

#25516: threading.Condition._is_owned() is wrong when using threading.
     http://bugs.python.org/issue25516
#25510: fileinput.FileInput.readline() always returns str object at th
     http://bugs.python.org/issue25510
#25505: undefined name 'window' in Tools/scripts/fixdiv.py
     http://bugs.python.org/issue25505
#25498: Python 3.4.3 core dump with simple sample code
     http://bugs.python.org/issue25498
#25497: Rewrite test_robotparser
     http://bugs.python.org/issue25497
#25495: binascii documentation incorrect
     http://bugs.python.org/issue25495
#25489: sys.exit() caught in exception handler
     http://bugs.python.org/issue25489
#25485: Add a context manager to telnetlib.Telnet
     http://bugs.python.org/issue25485
#25483: Improve f-string implementation: FORMAT_VALUE opcode
     http://bugs.python.org/issue25483
#25479: Increase unit test coverage for abc.py
     http://bugs.python.org/issue25479
#25469: multiprocessing .Condition.notify(_all) function has O(N) time
     http://bugs.python.org/issue25469
#25467: Put "deprecated" warnings first
     http://bugs.python.org/issue25467
#25464: Tix HList header_exists should be "exist"
     http://bugs.python.org/issue25464
#25462: Avoid repeated hash calculation in C implementation of Ordered
     http://bugs.python.org/issue25462
#25461: Unclear language (the word ineffective) in the documentation f
     http://bugs.python.org/issue25461


Top 10 most discussed issues (10)
=================================

#25489: sys.exit() caught in exception handler
     http://bugs.python.org/issue25489  17 msgs
#8231: Unable to run IDLE without write-access to home directory
     http://bugs.python.org/issue8231  12 msgs
#25482: signal.set_wakeup_fd() doesn't work if the signal don't have h
     http://bugs.python.org/issue25482  11 msgs
#25467: Put "deprecated" warnings first
     http://bugs.python.org/issue25467  10 msgs
#25481: PermissionError in subprocess.check_output() when an inaccessi
     http://bugs.python.org/issue25481  10 msgs
#25483: Improve f-string implementation: FORMAT_VALUE opcode
     http://bugs.python.org/issue25483  8 msgs
#25485: Add a context manager to telnetlib.Telnet
     http://bugs.python.org/issue25485  8 msgs
#25156: shutil.copyfile should internally use os.sendfile when possibl
     http://bugs.python.org/issue25156  6 msgs
#25495: binascii documentation incorrect
     http://bugs.python.org/issue25495  6 msgs
#25498: Python 3.4.3 core dump with simple sample code
     http://bugs.python.org/issue25498  6 msgs


Issues closed (28)
==================

#16785: Document the fact that constructing OSError with erron returns
     http://bugs.python.org/issue16785  closed by martin.panter
#21160: incorrect comments in nturl2path.py
     http://bugs.python.org/issue21160  closed by serhiy.storchaka
#21827: textwrap.dedent() fails when largest common whitespace is a su
     http://bugs.python.org/issue21827  closed by serhiy.storchaka
#23391: Documentation of EnvironmentError (OSError) arguments disappea
     http://bugs.python.org/issue23391  closed by martin.panter
#24765: Move .idlerc to %APPDATA%\IDLE on Windows
     http://bugs.python.org/issue24765  closed by terry.reedy
#25193: itertools.accumulate should have an optional initializer argum
     http://bugs.python.org/issue25193  closed by rhettinger
#25311: Add f-string support to tokenize.py
     http://bugs.python.org/issue25311  closed by eric.smith
#25356: Idle (Python 3.4 on Ubuntu) does not allow typing accents
     http://bugs.python.org/issue25356  closed by terry.reedy
#25425: white-spaces encountered in 3.4
     http://bugs.python.org/issue25425  closed by terry.reedy
#25432: isinstance documentation: explain behavior when type is a tupl
     http://bugs.python.org/issue25432  closed by terry.reedy
#25447: TypeError invoking deepcopy on lru_cache
     http://bugs.python.org/issue25447  closed by serhiy.storchaka
#25466: offer "from __future__ import" option for "raise... from"
     http://bugs.python.org/issue25466  closed by terry.reedy
from" http://bugs.python.org/issue25466 closed by terry.reedy #25471: socket.recv() blocks while it's gettimeout() returns 0.0 http://bugs.python.org/issue25471 closed by python-dev #25473: socket.recv() raises correct code with the misleading descript http://bugs.python.org/issue25473 closed by benjamin.peterson #25475: use argparse instead of getopt http://bugs.python.org/issue25475 closed by SilentGhost #25480: string format in large number http://bugs.python.org/issue25480 closed by eric.smith #25484: Operator issue with "in" on same level and preceding == http://bugs.python.org/issue25484 closed by ezio.melotti #25490: small mistake in example for random.choice() http://bugs.python.org/issue25490 closed by serhiy.storchaka #25494: Four quotes used to begin docstring http://bugs.python.org/issue25494 closed by python-dev #25499: use argparse for the calendar module http://bugs.python.org/issue25499 closed by serhiy.storchaka #25501: Use async/await through asyncio docs http://bugs.python.org/issue25501 closed by brett.cannon #25502: unnecessary re-imports http://bugs.python.org/issue25502 closed by python-dev #25503: inspect.getdoc does find inherited property __doc__ http://bugs.python.org/issue25503 closed by serhiy.storchaka #25504: undefined name 'modules' in Tools/freeze/checkextensions_win32 http://bugs.python.org/issue25504 closed by benjamin.peterson #25506: test_pprint reuses test_user_dict http://bugs.python.org/issue25506 closed by serhiy.storchaka #25512: apparent memory leak using ctypes http://bugs.python.org/issue25512 closed by blindgoat #25513: collections.abc.Iterable don't implement __bool__ http://bugs.python.org/issue25513 closed by r.david.murray #25515: Always use os.urandom for generating uuid4s http://bugs.python.org/issue25515 closed by python-dev From tjreedy at udel.edu Sat Oct 31 00:42:09 2015 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 31 Oct 2015 00:42:09 -0400 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On 10/30/2015 3:21 AM, Serhiy Storchaka wrote: > AFAIK the sys module can't be shadowed. I tried it and it seems to be true of builtins in general. -- Terry Jan Reedy From brett at python.org Sat Oct 31 01:13:28 2015 From: brett at python.org (Brett Cannon) Date: Sat, 31 Oct 2015 05:13:28 +0000 Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen In-Reply-To: References: <201510291559.t9TFxi4i003479@fido.openend.se> Message-ID: On Fri, 30 Oct 2015 at 21:42 Terry Reedy wrote: > On 10/30/2015 3:21 AM, Serhiy Storchaka wrote: > > > AFAIK the sys module can't be shadowed. > > I tried it and it seems to be true of builtins in general. > The importer on sys.meta_path that handles built-ins is earlier than the one that handles sys.path (technically you can shadow them if you changed sys.meta_path). -------------- next part -------------- An HTML attachment was scrubbed... 
From storchaka at gmail.com  Sat Oct 31 02:07:48 2015
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Sat, 31 Oct 2015 08:07:48 +0200
Subject: [Python-Dev] If you shadow a module in the standard library that IDLE depends on, bad things happen
In-Reply-To: 
References: <201510291559.t9TFxi4i003479@fido.openend.se>
Message-ID: 

On 30.10.15 09:57, Nathaniel Smith wrote:
> Unfortunately I think that (among other things) there are a lot of
> scripts out there that blindly do sys.path.pop(0) to remove the ""
> entry, so the backcompat costs of changing this would probably be
> catastrophic.

You are right. There are too many occurrences even in public libraries.

https://code.openhub.net/search?s=%22sys.path.pop(0)%22&p=0

A possible workaround is to add a fake path (or a duplicate of a system
path) at the start of sys.path. Then dropping the first element will not
break the script.


From ncoghlan at gmail.com  Sat Oct 31 20:48:53 2015
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 1 Nov 2015 01:48:53 +0100
Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid?
In-Reply-To: <56311C71.3050505@sdamon.com>
References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de>
	<540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de>
	<56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us>
	<562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us>
	<562FC4DE.4000904@mail.de> <56311C71.3050505@sdamon.com>
Message-ID: 

On 28 October 2015 at 20:05, Alexander Walters wrote:
> Have you ever used a command line application that --accepted --Boolean
> --flags? Have you ever found one that required the flags to be in order?
> You remember how much you hated that application for being so arbitrary
> about the input?

Given that "f" is standing for a runtime transformation (unlike the
purely declarative "b" and "r"), it makes sense to me to mentally
translate it as
"magic_format_call_that_needs_compiler_assistance(<the literal>)", so
requiring the "f" to be first isn't arbitrary, it's slotting in where the
function name would be for a call to a builtin.

I'd also like to leave the door open to i-strings in the future, so my
answer to Eric's "What would the docs say?" question is that string
prefixes can contain imperative runtime flags (which appear first, are
mutually exclusive, are always lowercase, and cause a runtime
transformation by changing the actual code generated at compile time) and
declarative compile time flags (which can appear in any order after the
imperative flag, may be in upper or lower case, and only cause a compile
time transformation in the stored constant without changing the code to
load that constant at runtime).

Currently the only imperative prefix we have is "f", while "b", "u", and
"r" are available as declarative prefixes. "i" has been proposed as a
second imperative prefix, but is currently deferred until 3.7 at the
earliest (after we have a release's worth of experience with "f").

It's only a mild preference, but the main benefit I see is reining in the
combinatorial explosion of possible string prefixes before it even has a
chance to start getting out of hand.

> one of the few undeniably good choices made in python 3.

There's no need here for passive-aggressive snark directed at the
designers providing you with a free programming language. Folks have the
entire internet to complain about how much they dislike our work (where
we can pick and choose whose feedback we want to listen to), so we don't
need to accept it here.

Regards,
Nick.

--
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
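For concreteness, the runtime/declarative split would be directly visible
with dis on a build implementing the FORMAT_VALUE approach from issue
#25483 (a rough sketch; the opcode names come from that proposal rather
than from anything already released):

    import dis

    # A declarative prefix only changes the stored constant; the
    # compiled code is still a single LOAD_CONST of that constant.
    dis.dis(compile("rb'\\n'", "<demo>", "eval"))

    # An imperative prefix changes the generated code itself: the
    # f-string compiles to LOAD_NAME / FORMAT_VALUE / BUILD_STRING
    # rather than one constant load.
    dis.dis(compile("f'x = {x}'", "<demo>", "eval"))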
From tjreedy at udel.edu  Sat Oct 31 22:32:38 2015
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 31 Oct 2015 22:32:38 -0400
Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid?
In-Reply-To: 
References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de>
	<540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de>
	<56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us>
	<562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us>
	<562FC4DE.4000904@mail.de> <56311C71.3050505@sdamon.com>
Message-ID: 

On 10/31/2015 8:48 PM, Nick Coghlan wrote:
> Given that "f" is standing for a runtime transformation (unlike the
> purely declarative "b" and "r"), it makes sense to me to mentally
> translate it as
> "magic_format_call_that_needs_compiler_assistance(<the literal>)", so
> requiring the "f" to be first isn't arbitrary, it's slotting in where
> the function name would be for a call to a builtin.
>
> I'd also like to leave the door open to i-strings in the future, so my
> answer to Eric's "What would the docs say?" question is that string
> prefixes can contain imperative runtime flags (which appear first, are
> mutually exclusive, are always lowercase, and cause a runtime
> transformation by changing the actual code generated at compile time)
> and declarative compile time flags (which can appear in any order
> after the imperative flag, may be in upper or lower case,

I think either order for b|u versus r is ok, even though it is a nuisance
to specify in a grammar that assumes order significance. But given that
Python is case-sensitive, I think the exception here was a mistake that
need not be copied.

> and only cause a compile time transformation in the stored constant
> without changing the code to load that constant at runtime)

It makes sense to me that f should be kept logically distinct from the
other two.

> Currently the only imperative prefix we have is "f", while "b", "u",
> and "r" are available as declarative prefixes. "i" has been proposed
> as a second imperative prefix, but is currently deferred until 3.7 at
> the earliest (after we have a release's worth of experience with "f").

--
Terry Jan Reedy


From guido at python.org  Sat Oct 31 23:46:27 2015
From: guido at python.org (Guido van Rossum)
Date: Sat, 31 Oct 2015 20:46:27 -0700
Subject: [Python-Dev] Should PEP 498 specify if rf'...' is valid?
In-Reply-To: 
References: <5628C94E.8010804@trueblade.com> <56290A88.4090607@mail.de>
	<540BA627-65FF-47B2-813F-3ED0E69701AE@gmail.com> <56291688.9060604@mail.de>
	<56291910.1090303@trueblade.com> <562E4540.8080308@stoneleaf.us>
	<562E74B7.4060409@mail.de> <562E8502.1050303@stoneleaf.us>
	<562FC4DE.4000904@mail.de> <56311C71.3050505@sdamon.com>
Message-ID: 

Eh. I don't see the point of arguing about the order. String literals may
have one or more character prefixes that modify the meaning. Some of those
prefixes may be combined; others may not. Given that we allow combining
the r and b prefixes in either order, and we allow combining r and f, I
don't think we should be picky about the order in which those can appear.
Saying that f must come first because it has a different kind of effect
(call it "runtime") doesn't make sense to me.
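For example, the existing prefixes already combine in either order, and a
build with PEP 498 implemented would presumably extend the same freedom
to f (a sketch, not checked against the reference implementation):

    # b and r combine both ways today (checked on 3.5):
    assert rb'\n' == br'\n' == b'\\n'

    # With PEP 498, rf'...' and fr'...' would denote the same raw,
    # formatted literal:
    path = 'C:\\temp'
    print(rf'{path}\new')   # C:\temp\new -- \n is not an escape here
    print(fr'{path}\new')   # same result, prefix letters swapped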
On Sat, Oct 31, 2015 at 7:32 PM, Terry Reedy wrote:

> On 10/31/2015 8:48 PM, Nick Coghlan wrote:
>
>> Given that "f" is standing for a runtime transformation (unlike the
>> purely declarative "b" and "r"), it makes sense to me to mentally
>> translate it as
>> "magic_format_call_that_needs_compiler_assistance(<the literal>)", so
>> requiring the "f" to be first isn't arbitrary, it's slotting in where
>> the function name would be for a call to a builtin.
>>
>> I'd also like to leave the door open to i-strings in the future, so my
>> answer to Eric's "What would the docs say?" question is that string
>> prefixes can contain imperative runtime flags (which appear first, are
>> mutually exclusive, are always lowercase, and cause a runtime
>> transformation by changing the actual code generated at compile time)
>> and declarative compile time flags (which can appear in any order
>> after the imperative flag, may be in upper or lower case,
>
> I think either order for b|u versus r is ok, even though it is a
> nuisance to specify in a grammar that assumes order significance. But
> given that Python is case-sensitive, I think the exception here was a
> mistake that need not be copied.
>
>> and only cause a compile time transformation in the stored constant
>> without changing the code to load that constant at runtime)
>
> It makes sense to me that f should be kept logically distinct from the
> other two.
>
>> Currently the only imperative prefix we have is "f", while "b", "u",
>> and "r" are available as declarative prefixes. "i" has been proposed
>> as a second imperative prefix, but is currently deferred until 3.7 at
>> the earliest (after we have a release's worth of experience with "f").
>
> --
> Terry Jan Reedy

--
--Guido van Rossum (python.org/~guido)