From tim.peters at gmail.com Tue May 1 00:06:07 2018
From: tim.peters at gmail.com (Tim Peters)
Date: Mon, 30 Apr 2018 23:06:07 -0500
Subject: [Python-Dev] PEP 572: Usage of assignment expressions in C
In-Reply-To: 
References: <20180428174524.7e122239@fsol>
Message-ID: 

[Raymond Hettinger ]
> Thanks Antoine, this is an important point that I hope doesn't get lost.
> In a language with exceptions, assignment expressions are less needful.
> Also, the pattern of having mutating methods return None
> further limits the utility.

It doesn't diminish the utility one whit in cases where binding expressions are helpful ;-) What you're saying is that there are _fewer_ such opportunities in Python than in C. Which may or may not be true (depending on the code you're working with). If you believe it is true, fine, then that also argues against that people will rush to abuse the feature (to the extent that it's even plausibly useful less often, to that extent also will there be less temptation to use it at all).

But then I only care about use cases at heart, and have presented real-life examples wherein binding expressions read both better and worse than what they're replacing. I intend to limit myself to the cases where they read better :-)

Which are most of the cases I even considered, BTW - in the vast majority of cases in real code I'd use them, they'd be replacing the annoyingly bare-bones yet somehow repetitive anyway:

    value = f()
    if value:
        doing something with value

with the still bare-bones but minimally repetitive:

    if value := f():
        doing something with value

For example, tons of functions I write and use return None or 0 or False when they want to communicate "I have nothing useful to return in this case - but since you expected that might happen, I'm not going to annoy you with an exception". That pattern isn't solely limited to regexp search and match functions. The "win" above is minor but frequent. It adds up.

There are other cases where binding expressions really shine, but they're much rarer in all the code I looked at (e.g., see the uselessly ever-increasing indentation levels near the end of `copy()` in the std library's copy.py).

In all, I expect I'd use them significantly more often than ternary `if`, but far less often than augmented assignments. If the PEP is accepted, that's what all Pythoneers will be saying 5 years from now ;-)

From benjamin at python.org Tue May 1 00:09:55 2018
From: benjamin at python.org (Benjamin Peterson)
Date: Mon, 30 Apr 2018 21:09:55 -0700
Subject: [Python-Dev] [RELEASE] Python 2.7.15
Message-ID: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com>

Greetings,
I'm pleased to announce the immediate availability of Python 2.7.15, the latest bug fix release in the senescent Python 2.7 series.

Source and binary downloads may be found on python.org:

https://www.python.org/downloads/release/python-2715/

Bugs should be reported to https://bugs.python.org/

The source tarball contains a complete changelog in the Misc/NEWS file. The only change since the release candidate is a fix for undefined C behavior that newer compilers (including GCC 8) have started to exploit.

Users of the macOS binaries should note that all python.org macOS installers now ship with a builtin copy of OpenSSL. Additionally, there is a new additional installer variant for macOS 10.9+ that includes a built-in version of Tcl/Tk 8.6. See the installer README for more information.
Happy May, Benjamin 2.7 release manager From hasan.diwan at gmail.com Tue May 1 00:45:17 2018 From: hasan.diwan at gmail.com (Hasan Diwan) Date: Mon, 30 Apr 2018 21:45:17 -0700 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> Message-ID: Congrats to all involved! -- H On 30 April 2018 at 21:09, Benjamin Peterson wrote: > Greetings, > I'm pleased to announce the immediate availability of Python 2.7.15, the > latest bug fix release in the senescent Python 2.7 series. > > Source and binary downloads may be found on python.org: > > https://www.python.org/downloads/release/python-2715/ > > Bugs should be reported to https://bugs.python.org/ > > The source tarball contains a complete changelog in the Misc/NEWS file. > The only change since the release candidate is a fix for undefined C > behavior that newer compilers (including GCC 8) have started to exploit. > > Users of the macOS binaries should note that all python.org macOS > installers now ship with a builtin copy of OpenSSL. Additionally, there is > a new additional installer variant for macOS 10.9+ that includes a built-in > version of Tcl/Tk 8.6. See the installer README for more information. > > Happy May, > Benjamin > 2.7 release manager > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > hasan.diwan%40gmail.com > -- OpenPGP: https://sks-keyservers.net/pks/lookup?op=get&search=0xFEBAD7FFD041BBA1 If you wish to request my time, please do so using http://bit.ly/hd1ScheduleRequest. Si vous voudrais faire connnaisance, allez a http://bit.ly/hd1ScheduleRequest. Sent from my mobile device Envoye de mon portable -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Tue May 1 02:44:40 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 01 May 2018 18:44:40 +1200 Subject: [Python-Dev] PEP 572: Assignment Expressions In-Reply-To: <20180501005903.GH7400@ando.pearwood.info> References: <9abfb93b-423b-2d1a-b11c-b1d926611d2f@hastings.org> <20180501005903.GH7400@ando.pearwood.info> Message-ID: <5AE80CD8.4020008@canterbury.ac.nz> Steven D'Aprano wrote: > "Not sure"? Given that you listed seven ways, how much more evidence do > you need to conclude that it is simply wrong to say that Python has a > single way to assign a value to a name? Yes, but six of those are very specialised, and there's rarely any doubt about when it's appropriate to use them. The proposed :=, on the other hand, would overlap a lot with the functionality of =. It's not a case of "Python already has seven ways of assigning, so one more can't hurt." -- Greg From steve at pearwood.info Tue May 1 03:00:53 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 1 May 2018 17:00:53 +1000 Subject: [Python-Dev] PEP 572: Assignment Expressions In-Reply-To: References: <9abfb93b-423b-2d1a-b11c-b1d926611d2f@hastings.org> <20180501005903.GH7400@ando.pearwood.info> Message-ID: <20180501070053.GI7400@ando.pearwood.info> On Tue, May 01, 2018 at 11:04:55AM +1000, Chris Angelico wrote: > To be fair, I don't see many people replacing "x = 1" with "for x in > [1]: pass". Even though it IS going to have the same effect. 
:-) Aside from the pass, that is precisely one of the current work-arounds for lack of binding-expressions inside comprehensions: # inefficient, and wrong when f(x) has side-effects [f(x) for x in iterable if f(x) > 1] # what we'd like [y for x in iterable if (y := f(x)) > 1] # a work-around [y for x in iterable for y in [f(x)] if y > 1] I think that's even in your PEP, isn't it? -- Steve From steve at holdenweb.com Tue May 1 05:14:47 2018 From: steve at holdenweb.com (Steve Holden) Date: Tue, 1 May 2018 10:14:47 +0100 Subject: [Python-Dev] PEP 572: Assignment Expressions In-Reply-To: References: <5add0911.81b7500a.d7e15.acb1SMTPIN_ADDED_MISSING@mx.google.com> <20180423092806.530b5996@fsol> <83e528a7-9758-fcbf-eb91-b6f1763b5e2e@mail.de> <20180424231911.GN11616@ando.pearwood.info> Message-ID: On Tue, May 1, 2018 at 3:36 AM, Chris Jerdonek wrote: > On Thu, Apr 26, 2018 at 10:33 AM, Sven R. Kunze wrote: > > On 25.04.2018 01:19, Steven D'Aprano wrote: > >> > >> Sorry, gcd(diff, n) is not the "perfect name", and I will tell you that > >> sometimes g is better. [...] > > > > We were talking about the real-world code snippet of Tim (as a > justification > > of := ) and alternative rewritings of it without resorting to new syntax. > > Apologies if this idea has already been discussed (I might have missed > the relevant email), but thinking back to Tim's earlier example-- > > if (diff := x - x_base) and (g := gcd(diff, n)) > 1: > return g > > it occurs to me this could be implemented with current syntax using a > pattern like the following: > > stashed = [None] > > def stash(x): > stashed[0] = x > return x > > if stash(x - x_base) and stash(gcd(stashed[0], n)) > 1: > return stashed[0] > > There are many variations to this idea, obviously. For example, one > could allow passing a "name" to stash(), or combine stash / stashed > into a single, callable object that allows setting and reading from > its store. I wonder if one of them could be made into a worthwhile > pattern or API.. > ?. > ?I hope you don't think this recasting, is in any way less confusing to a beginner than an inline assignment. This is language abuse! In any case, what advantages would it have over simply declaring "stashed" as a global inside the function and omitting the confusing subscripting? regards Steve? -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.jerdonek at gmail.com Tue May 1 05:32:39 2018 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Tue, 1 May 2018 02:32:39 -0700 Subject: [Python-Dev] PEP 572: Assignment Expressions In-Reply-To: References: <5add0911.81b7500a.d7e15.acb1SMTPIN_ADDED_MISSING@mx.google.com> <20180423092806.530b5996@fsol> <83e528a7-9758-fcbf-eb91-b6f1763b5e2e@mail.de> <20180424231911.GN11616@ando.pearwood.info> Message-ID: On Tue, May 1, 2018 at 2:14 AM, Steve Holden wrote: > On Tue, May 1, 2018 at 3:36 AM, Chris Jerdonek > wrote: >> >> On Thu, Apr 26, 2018 at 10:33 AM, Sven R. Kunze wrote: >> > On 25.04.2018 01:19, Steven D'Aprano wrote: >> >> >> >> Sorry, gcd(diff, n) is not the "perfect name", and I will tell you that >> >> sometimes g is better. [...] >> > >> > We were talking about the real-world code snippet of Tim (as a >> > justification >> > of := ) and alternative rewritings of it without resorting to new >> > syntax. 
>> >> Apologies if this idea has already been discussed (I might have missed >> the relevant email), but thinking back to Tim's earlier example-- >> >> if (diff := x - x_base) and (g := gcd(diff, n)) > 1: >> return g >> >> it occurs to me this could be implemented with current syntax using a >> pattern like the following: >> >> stashed = [None] >> >> def stash(x): >> stashed[0] = x >> return x >> >> if stash(x - x_base) and stash(gcd(stashed[0], n)) > 1: >> return stashed[0] >> >> There are many variations to this idea, obviously. For example, one >> could allow passing a "name" to stash(), or combine stash / stashed >> into a single, callable object that allows setting and reading from >> its store. I wonder if one of them could be made into a worthwhile >> pattern or API.. > > I hope you don't think this recasting, is in any way less confusing to a > beginner than an inline assignment. This is language abuse! I didn't make any claims that it wouldn't be confusing (especially as is). It was just an _idea_. I mentioned it because (1) it uses current syntax, (2) it doesn't require intermediate assignments or extra indents in the main body of code, (3) it doesn't even require choosing intermediate names, and (4) I didn't see it mentioned in any of the previous discussion. All three of the first points have been major sources of discussion in the thread. So I thought it might be of interest. > In any case, what advantages would it have over simply declaring "stashed" > as a global inside the function and omitting the confusing subscripting? Right. Like I said, there are many variations. I just listed one to convey the general idea. --Chris From tritium-list at sdamon.com Tue May 1 05:58:14 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Tue, 1 May 2018 05:58:14 -0400 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> Message-ID: <4a04a01d3e132$ec45fb20$c4d1f160$@sdamon.com> I've gotten some mixed signals on the status of this release, notably from the BDFL: https://twitter.com/gvanrossum/status/991170064417153025 "Python 2.7.15 released -- the last 2.7 release!" (and a link to this thread) I was under the impression that 2.7 was being supported until 2020. If this is the final release of 2.7, what would "support" constitute? My assumption was that the final release of 2.7 would be sometime in 2020 (or much closer to 2020 than 19 months). > -----Original Message----- > From: Python-Dev list=sdamon.com at python.org> On Behalf Of Benjamin Peterson > Sent: Tuesday, May 1, 2018 12:10 AM > To: python-list at python.org; python-announce at python.org; python- > dev at python.org > Subject: [Python-Dev] [RELEASE] Python 2.7.15 > > Greetings, > I'm pleased to announce the immediate availability of Python 2.7.15, the > latest bug fix release in the senescent Python 2.7 series. > > Source and binary downloads may be found on python.org: > > https://www.python.org/downloads/release/python-2715/ > > Bugs should be reported to https://bugs.python.org/ > > The source tarball contains a complete changelog in the Misc/NEWS file. The > only change since the release candidate is a fix for undefined C behavior that > newer compilers (including GCC 8) have started to exploit. > > Users of the macOS binaries should note that all python.org macOS installers > now ship with a builtin copy of OpenSSL. 
Additionally, there is a new > additional installer variant for macOS 10.9+ that includes a built-in version of > Tcl/Tk 8.6. See the installer README for more information. > > Happy May, > Benjamin > 2.7 release manager > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium- > list%40sdamon.com From guido at python.org Tue May 1 10:35:05 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 1 May 2018 07:35:05 -0700 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: <4a04a01d3e132$ec45fb20$c4d1f160$@sdamon.com> References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> <4a04a01d3e132$ec45fb20$c4d1f160$@sdamon.com> Message-ID: Simple. I misread "latest" for "last" and was hopeful that no new bugs would need to be fixed between now and 2020. I will post a correction on Twitter now. On Tue, May 1, 2018 at 2:58 AM, Alex Walters wrote: > I've gotten some mixed signals on the status of this release, notably from > the BDFL: > > https://twitter.com/gvanrossum/status/991170064417153025 > "Python 2.7.15 released -- the last 2.7 release!" (and a link to this > thread) > > I was under the impression that 2.7 was being supported until 2020. If > this > is the final release of 2.7, what would "support" constitute? My > assumption > was that the final release of 2.7 would be sometime in 2020 (or much closer > to 2020 than 19 months). > > > -----Original Message----- > > From: Python-Dev > list=sdamon.com at python.org> On Behalf Of Benjamin Peterson > > Sent: Tuesday, May 1, 2018 12:10 AM > > To: python-list at python.org; python-announce at python.org; python- > > dev at python.org > > Subject: [Python-Dev] [RELEASE] Python 2.7.15 > > > > Greetings, > > I'm pleased to announce the immediate availability of Python 2.7.15, the > > latest bug fix release in the senescent Python 2.7 series. > > > > Source and binary downloads may be found on python.org: > > > > https://www.python.org/downloads/release/python-2715/ > > > > Bugs should be reported to https://bugs.python.org/ > > > > The source tarball contains a complete changelog in the Misc/NEWS file. > The > > only change since the release candidate is a fix for undefined C behavior > that > > newer compilers (including GCC 8) have started to exploit. > > > > Users of the macOS binaries should note that all python.org macOS > installers > > now ship with a builtin copy of OpenSSL. Additionally, there is a new > > additional installer variant for macOS 10.9+ that includes a built-in > version of > > Tcl/Tk 8.6. See the installer README for more information. > > > > Happy May, > > Benjamin > > 2.7 release manager > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium- > > list%40sdamon.com > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ericsnowcurrently at gmail.com Tue May 1 10:46:33 2018 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Tue, 1 May 2018 08:46:33 -0600 Subject: [Python-Dev] Every Release Can Be a Mini "Python 4000", Within Reason (was (name := expression) doesn't fit the narrative of PEP 20) In-Reply-To: References: <5AE2D36E.7060801@canterbury.ac.nz> <3193344c-24c1-5224-c40e-91d26687b51c@farowl.co.uk> <5AE6B62F.9050902@canterbury.ac.nz> <4fcfb653-4d8b-f209-588f-fe1b88689ee7@farowl.co.uk> Message-ID: FWIW, this thread is about what "Python 4000" means and does not mean. Namely, Python feature deprecation and removal is not prohibited but the bar is high (as always), especially for syntax. While I appreciate the level of interest in certain under-consideration proposals, you'd be better served by continuing discussion about that proposal in other threads. Thanks! -eric On Mon, Apr 30, 2018 at 7:25 PM, Terry Reedy wrote: > On 4/30/2018 4:00 PM, Jeff Allen wrote: > >> They were not "statements", but "formulas" while '=' was assignment (sec >> 8) *and* comparison (sec 10B). So conversely to our worry, they actually >> wanted users to think of assignment initially as a mathematical formula >> (page 2) in order to exploit the similarity to a familiar concept, albeit >> a=a+i makes no sense from this perspective. > > > When explaining iterative algorithms, such as Newton's method, > mathematicians write things like a' = a+1 or a(sub i+1 or sub new) = f(a(sub > i or sub old)) . For computer, we drop the super/subscript. Or one can > write more circuitously, > anew = update(aold) > aold = anew > The abbreviations should be explained when teaching loops. > > For proving that the body of a loop maintains a loop constant, one may > reinstate the super- or sub-script. > > -- > Terry Jan Reedy > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com From bill at baddogconsulting.com Tue May 1 11:00:12 2018 From: bill at baddogconsulting.com (Bill Deegan) Date: Tue, 1 May 2018 11:00:12 -0400 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> <4a04a01d3e132$ec45fb20$c4d1f160$@sdamon.com> Message-ID: Is it possible to get the release notes included on the download page(s)? On Tue, May 1, 2018 at 10:35 AM, Guido van Rossum wrote: > Simple. I misread "latest" for "last" and was hopeful that no new bugs > would need to be fixed between now and 2020. I will post a correction on > Twitter now. > > On Tue, May 1, 2018 at 2:58 AM, Alex Walters > wrote: > > > I've gotten some mixed signals on the status of this release, notably > from > > the BDFL: > > > > https://twitter.com/gvanrossum/status/991170064417153025 > > "Python 2.7.15 released -- the last 2.7 release!" (and a link to this > > thread) > > > > I was under the impression that 2.7 was being supported until 2020. If > > this > > is the final release of 2.7, what would "support" constitute? My > > assumption > > was that the final release of 2.7 would be sometime in 2020 (or much > closer > > to 2020 than 19 months). 
> > > > > -----Original Message----- > > > From: Python-Dev > > list=sdamon.com at python.org> On Behalf Of Benjamin Peterson > > > Sent: Tuesday, May 1, 2018 12:10 AM > > > To: python-list at python.org; python-announce at python.org; python- > > > dev at python.org > > > Subject: [Python-Dev] [RELEASE] Python 2.7.15 > > > > > > Greetings, > > > I'm pleased to announce the immediate availability of Python 2.7.15, > the > > > latest bug fix release in the senescent Python 2.7 series. > > > > > > Source and binary downloads may be found on python.org: > > > > > > https://www.python.org/downloads/release/python-2715/ > > > > > > Bugs should be reported to https://bugs.python.org/ > > > > > > The source tarball contains a complete changelog in the Misc/NEWS file. > > The > > > only change since the release candidate is a fix for undefined C > behavior > > that > > > newer compilers (including GCC 8) have started to exploit. > > > > > > Users of the macOS binaries should note that all python.org macOS > > installers > > > now ship with a builtin copy of OpenSSL. Additionally, there is a new > > > additional installer variant for macOS 10.9+ that includes a built-in > > version of > > > Tcl/Tk 8.6. See the installer README for more information. > > > > > > Happy May, > > > Benjamin > > > 2.7 release manager > > > _______________________________________________ > > > Python-Dev mailing list > > > Python-Dev at python.org > > > https://mail.python.org/mailman/listinfo/python-dev > > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > tritium- > > > list%40sdamon.com > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > > guido%40python.org > > > > > > -- > --Guido van Rossum (python.org/~guido) > -- > https://mail.python.org/mailman/listinfo/python-list > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gregory.szorc at gmail.com Tue May 1 23:26:54 2018 From: gregory.szorc at gmail.com (Gregory Szorc) Date: Tue, 1 May 2018 20:26:54 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On 7/19/2017 12:15 PM, Larry Hastings wrote: > > > On 07/19/2017 05:59 AM, Victor Stinner wrote: >> Mercurial startup time is already 45.8x slower than Git whereas tested >> Mercurial runs on Python 2.7.12. Now try to sell Python 3 to Mercurial >> developers, with a startup time 2x - 3x slower... > > When Matt Mackall spoke at the Python Language Summit some years back, I > recall that he specifically complained about Python startup time.? He > said Python 3 "didn't solve any problems for [them]"--they'd already > solved their Unicode hygiene problems--and that Python's slow startup > time was already a big problem for them.? Python 3 being /even slower/ > to start was absolutely one of the reasons why they didn't want to upgrade. > > You might think "what's a few milliseconds matter".? But if you run > hundreds of commands in a shell script it adds up.? git's speed is one > of the few bright spots in its UX, and hg's comparative slowness here is > a palpable disadvantage. 
> > >> So please continue efforts for make Python startup even faster to beat >> all other programming languages, and finally convince Mercurial to >> upgrade ;-) > > I believe Mercurial is, finally, slowly porting to Python 3. > > https://www.mercurial-scm.org/wiki/Python3 > > Nevertheless, I can't really be annoyed or upset at them moving slowly > to adopt Python 3, as Matt's objections were entirely legitimate. I just now found found this thread when searching the archive for threads about startup time. And I was searching for threads about startup time because Mercurial's startup time has been getting slower over the past few months and this is causing substantial pain. As I posted back in 2014 [1], CPython's startup overhead was >10% of the total CPU time in Mercurial's test suite. And when you factor in the time to import modules that get Mercurial to a point where it can run commands, it was more like 30%! Mercurial's full test suite currently runs `hg` ~25,000 times. Using Victor's startup time numbers of 6.4ms for 2.7 and 14.5ms for 3.7/master, Python startup overhead contributes ~160s on 2.7 and ~360s on 3.7/master. Even if you divide this by the number of available CPU cores, we're talking dozens of seconds of wall time just waiting for CPython to get to a place where Mercurial's first bytecode can execute. And the problem is worse when you factor in the time it takes to import Mercurial's own modules. As a concrete example, I recently landed a Mercurial patch [2] that stubs out zope.interface to prevent the import of 9 modules on every `hg` invocation. This "only" saved ~6.94ms for a typical `hg` invocation. But this decreased the CPU time required to run the test suite on my i7-6700K from ~4450s to ~3980s (~89.5% of original) - a reduction of almost 8 minutes of CPU time (and over 1 minute of wall time)! By the time CPython gets Mercurial to a point where we can run useful code, we've already blown most of or past the time budget where humans perceive an action/command as instantaneous. If you ignore startup overhead, Mercurial's performance compares quite well to Git's for many operations. But the reality is that CPython startup overhead makes it look like Mercurial is non-instantaneous before Mercurial even has the opportunity to execute meaningful code! Mercurial provides a `chg` program that essentially spins up a daemon `hg` process running a "command server" so the `chg` program [written in C - no startup overhead] can dispatch commands to an already-running Python/`hg` process and avoid paying the startup overhead cost. When you run Mercurial's test suite using `chg`, it completes *minutes* faster. `chg` exists mainly as a workaround for slow startup overhead. Changing gears, my day job is maintaining Firefox's build system. We use Python heavily in the build system. And again, Python startup overhead is problematic. I don't have numbers offhand, but we invoke likely a few hundred Python processes as part of building Firefox. It should be several thousand. But, we've had to "hack" parts of the build system to "batch" certain build actions in single process invocations in order to avoid Python startup overhead. This undermines the ability of some build tools to formulate a reasonable understanding of the DAG and it causes a bit of pain for build system developers and makes it difficult to achieve "no-op" and fast incremental builds because we're always invoking certain Python processes because we've had to move DAG awareness out of the build backend and into Python. 
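The common thread in both workarounds - chg's command server and the batched build actions - is amortizing interpreter startup across many small requests. A toy sketch of that idea, assuming a line-based protocol and a hypothetical dispatch() (the real chg protocol is far richer and uses Unix domain sockets):

    import socketserver

    def dispatch(command):
        # Hypothetical dispatcher; a real tool would route to its own command table.
        return "ran %r in a long-lived interpreter" % command

    class CommandHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # One command per line in, one result line out.
            command = self.rfile.readline().decode("utf-8").strip()
            self.wfile.write((dispatch(command) + "\n").encode("utf-8"))

    if __name__ == "__main__":
        # A thin client (like chg, written in C) connects here instead of
        # paying Python startup for every invocation.
        with socketserver.TCPServer(("127.0.0.1", 8765), CommandHandler) as server:
            server.serve_forever()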
At some point, we'll likely replace Python code with Rust so the build system is more "pure" and easier to maintain and reason about. I've seen posts in this thread and elsewhere in the CPython development universe that challenge whether milliseconds in startup time matter. Speaking as a Mercurial and Firefox build system developer, *milliseconds absolutely matter*. Going further, *fractions of milliseconds matter*. For Mercurial's test suite with its ~25,000 Python process invocations, 1ms translates to ~25s of CPU time. With 2.7, Mercurial can dispatch commands in ~50ms. When you load common extensions, it isn't uncommon to see process startup overhead of 100-150ms! A millisecond here. A millisecond there. Before you know it, we're talking *minutes* of CPU (and potentially wall) time in order to run Mercurial's test suite (or build Firefox, or ...). >From my perspective, Python process startup and module import overhead is a severe problem for Python. I don't say this lightly, but in my mind the problem causes me to question the viability of Python for popular use cases, such as CLI applications. When choosing a programming language, I want one that will scale as a project grows. Vanilla process overhead has Python starting off significantly slower than compiled code (or even Perl) and adding module import overhead into the mix makes Python slower and slower as projects grow. As someone who has to deal with this slowness on a daily basis, I can tell you that it is extremely frustrating and it does matter. I hope that the importance of the problem will be acknowledged (milliseconds *do* matter) and that creative minds will band together to address it. Since I am disproportionately impacted by this issue, if there's anything I can do to help, let me know. Gregory [1] https://mail.python.org/pipermail/python-dev/2014-May/134528.html [2] https://www.mercurial-scm.org/repo/hg/rev/856f381ad74b From mingw.android at gmail.com Wed May 2 02:55:05 2018 From: mingw.android at gmail.com (Ray Donnelly) Date: Wed, 02 May 2018 06:55:05 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Wed, May 2, 2018, 4:53 AM Gregory Szorc wrote: > On 7/19/2017 12:15 PM, Larry Hastings wrote: > > > > > > On 07/19/2017 05:59 AM, Victor Stinner wrote: > >> Mercurial startup time is already 45.8x slower than Git whereas tested > >> Mercurial runs on Python 2.7.12. Now try to sell Python 3 to Mercurial > >> developers, with a startup time 2x - 3x slower... > > > > When Matt Mackall spoke at the Python Language Summit some years back, I > > recall that he specifically complained about Python startup time. He > > said Python 3 "didn't solve any problems for [them]"--they'd already > > solved their Unicode hygiene problems--and that Python's slow startup > > time was already a big problem for them. Python 3 being /even slower/ > > to start was absolutely one of the reasons why they didn't want to > upgrade. > > > > You might think "what's a few milliseconds matter". But if you run > > hundreds of commands in a shell script it adds up. git's speed is one > > of the few bright spots in its UX, and hg's comparative slowness here is > > a palpable disadvantage. > > > > > >> So please continue efforts for make Python startup even faster to beat > >> all other programming languages, and finally convince Mercurial to > >> upgrade ;-) > > > > I believe Mercurial is, finally, slowly porting to Python 3. 
> > > > https://www.mercurial-scm.org/wiki/Python3 > > > > Nevertheless, I can't really be annoyed or upset at them moving slowly > > to adopt Python 3, as Matt's objections were entirely legitimate. > > I just now found found this thread when searching the archive for > threads about startup time. And I was searching for threads about > startup time because Mercurial's startup time has been getting slower > over the past few months and this is causing substantial pain. > > As I posted back in 2014 [1], CPython's startup overhead was >10% of the > total CPU time in Mercurial's test suite. And when you factor in the > time to import modules that get Mercurial to a point where it can run > commands, it was more like 30%! > > Mercurial's full test suite currently runs `hg` ~25,000 times. Using > Victor's startup time numbers of 6.4ms for 2.7 and 14.5ms for > 3.7/master, Python startup overhead contributes ~160s on 2.7 and ~360s > on 3.7/master. Even if you divide this by the number of available CPU > cores, we're talking dozens of seconds of wall time just waiting for > CPython to get to a place where Mercurial's first bytecode can execute. > > And the problem is worse when you factor in the time it takes to import > Mercurial's own modules. > > As a concrete example, I recently landed a Mercurial patch [2] that > stubs out zope.interface to prevent the import of 9 modules on every > `hg` invocation. This "only" saved ~6.94ms for a typical `hg` > invocation. But this decreased the CPU time required to run the test > suite on my i7-6700K from ~4450s to ~3980s (~89.5% of original) - a > reduction of almost 8 minutes of CPU time (and over 1 minute of wall time)! > > By the time CPython gets Mercurial to a point where we can run useful > code, we've already blown most of or past the time budget where humans > perceive an action/command as instantaneous. If you ignore startup > overhead, Mercurial's performance compares quite well to Git's for many > operations. But the reality is that CPython startup overhead makes it > look like Mercurial is non-instantaneous before Mercurial even has the > opportunity to execute meaningful code! > > Mercurial provides a `chg` program that essentially spins up a daemon > `hg` process running a "command server" so the `chg` program [written in > C - no startup overhead] can dispatch commands to an already-running > Python/`hg` process and avoid paying the startup overhead cost. When you > run Mercurial's test suite using `chg`, it completes *minutes* faster. > `chg` exists mainly as a workaround for slow startup overhead. > > Changing gears, my day job is maintaining Firefox's build system. We use > Python heavily in the build system. And again, Python startup overhead > is problematic. I don't have numbers offhand, but we invoke likely a few > hundred Python processes as part of building Firefox. It should be > several thousand. But, we've had to "hack" parts of the build system to > "batch" certain build actions in single process invocations in order to > avoid Python startup overhead. This undermines the ability of some build > tools to formulate a reasonable understanding of the DAG and it causes a > bit of pain for build system developers and makes it difficult to > achieve "no-op" and fast incremental builds because we're always > invoking certain Python processes because we've had to move DAG > awareness out of the build backend and into Python. 
At some point, we'll > likely replace Python code with Rust so the build system is more "pure" > and easier to maintain and reason about. > > I've seen posts in this thread and elsewhere in the CPython development > universe that challenge whether milliseconds in startup time matter. > Speaking as a Mercurial and Firefox build system developer, > *milliseconds absolutely matter*. Going further, *fractions of > milliseconds matter*. For Mercurial's test suite with its ~25,000 Python > process invocations, 1ms translates to ~25s of CPU time. With 2.7, > Mercurial can dispatch commands in ~50ms. When you load common > extensions, it isn't uncommon to see process startup overhead of > 100-150ms! A millisecond here. A millisecond there. Before you know it, > we're talking *minutes* of CPU (and potentially wall) time in order to > run Mercurial's test suite (or build Firefox, or ...). > > From my perspective, Python process startup and module import overhead > is a severe problem for Python. I don't say this lightly, but in my mind > the problem causes me to question the viability of Python for popular > use cases, such as CLI applications. When choosing a programming > language, I want one that will scale as a project grows. Vanilla process > overhead has Python starting off significantly slower than compiled code > (or even Perl) and adding module import overhead into the mix makes > Python slower and slower as projects grow. As someone who has to deal > with this slowness on a daily basis, I can tell you that it is extremely > frustrating and it does matter. I hope that the importance of the > problem will be acknowledged (milliseconds *do* matter) and that > creative minds will band together to address it. Since I am > disproportionately impacted by this issue, if there's anything I can do > to help, let me know > Is your Python interpreter statically linked? The Python 3 ones from the anaconda distribution (use Miniconda!) are for Linux and macOS and that roughly halved our startup times. > Gregory > > [1] https://mail.python.org/pipermail/python-dev/2014-May/134528.html > [2] https://www.mercurial-scm.org/repo/hg/rev/856f381ad74b > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mingw.android%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Wed May 2 05:11:16 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 2 May 2018 11:11:16 +0200 Subject: [Python-Dev] Process to remove a Python feature Message-ID: Hi, As a follow-up to the "[Python-Dev] (Looking for) A Retrospective on the Move to Python 3" thread, I will like to clarify how a feature should be removed from Python. We have multiple tools: * Emit a PendingDeprecationWarning warning at runtime for *slow* deprecation (remove a feature in at least 3 cycles) * Emit a DeprecationWarning warning at runtime for fast deprecation (remove a feature in 2 cycles) * Document that a feature is deprecated in the module documentation * "What's New in Python X.Y?" documents: especially Deprecated, Removed and Porting to Python X.Y sections. * Communicate on python-dev, Twitter, Python Insider blog, etc. * Collaborate with major Python projects to help them to migrate the alternative IMHO a feature should not be removed if there is no alternative, or if the alternative is too new. 
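As a minimal sketch of what the runtime side of such a deprecation looks like (the function names here are hypothetical, not taken from the standard library):

    import warnings

    def old_helper():
        """Deprecated since version N; scheduled for removal in version N+1."""
        warnings.warn(
            "old_helper() is deprecated, use new_helper() instead",
            DeprecationWarning,
            stacklevel=2,  # attribute the warning to the caller, not this module
        )
        return new_helper()

    def new_helper():
        return "the supported replacement"

Swapping in PendingDeprecationWarning for version N, then DeprecationWarning in version N+1, gives the slower track described next. Note that DeprecationWarning is hidden by default outside of test runners, which is part of why deprecations go unnoticed.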
The usual process is:

* Document the deprecation and emit DeprecationWarning in version N
* Document the removal and remove the feature in version N+1

Slow deprecation:

* Document the deprecation and emit PendingDeprecationWarning in version N
* Emit a DeprecationWarning in version N+1
* Document the removal and remove the feature in version N+2

The hidden ghost is the old rule: "keep old APIs around to make porting from Python 2 easier". Is this rule still applicable? Does it mean that the Python 3 release following Python 2 end-of-life (2020) will be our next feared "Python 4"? Are we going to remove all deprecated features at once, to maximize incompatibilities and make users unhappy?

Should we always document in which version a feature will be removed? Some features have been deprecated for many versions, and the deprecated features are still there. In most cases, it's because of the Python 2 rule. Sometimes, we forget to remove features which have been scheduled for removal in a specific version. Maybe we should create a tool to list features scheduled for removal, and open a discussion to check each removal?

Ten years ago, I wanted to remove most modules and many functions from the standard library. Now my experience has shown me that *each* removal is very painful, hurts more projects than expected, and takes longer than 3 years to be fully effective (no longer used in most common 3rd party modules). The usual issue is writing a single code base that works on all Python versions. For example, the new alternative is only available on recent Python versions, which requires at least two code paths to support all versions. Sometimes, there are 3 code paths... For a recent example, see "remove platform.linux_distribution()": https://bugs.python.org/issue28167

Removing a feature from the C API is more complex, since there is no portable way to emit a deprecation warning at compilation. There is Py_DEPRECATED() which seems to only be available on GCC (3.1 and newer).

Maybe we should stop removing features, unless there is really a good reason to do so?

Victor

From solipsis at pitrou.net Wed May 2 05:17:36 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 2 May 2018 11:17:36 +0200
Subject: [Python-Dev] Process to remove a Python feature
References: 
Message-ID: <20180502111736.73c37b1b@fsol>

On Wed, 2 May 2018 11:11:16 +0200 Victor Stinner wrote:
>
> Removing a feature from the C API is more complex, since there is no
> portable way to emit a deprecation warning at compilation. There is
> Py_DEPRECATED() which seems to only be available on GCC (3.1 and
> newer).

It's at least possible with gcc, clang and MSVC:
https://stackoverflow.com/questions/295120/c-mark-as-deprecated/295229#295229

You can even add a deprecation message.

Regards

Antoine.

From vstinner at redhat.com Wed May 2 05:26:35 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Wed, 2 May 2018 11:26:35 +0200
Subject: [Python-Dev] Python startup time
In-Reply-To: 
References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org>
Message-ID: 

What do you propose to make Python startup faster?

As I wrote in my previous emails, many Python core developers care of the startup time and we are working on making it faster.

INADA Naoki added -X importtime to identify slow imports and understand where Python spent its startup time.
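For a rough, manual complement to -X importtime - for example on interpreters that predate it - a single import can be timed in-process; the module name below is just an example:

    import importlib
    import time

    def time_import(name):
        # Run this in a fresh interpreter so the module is not already cached
        # in sys.modules; it times one top-level import, not the per-module
        # breakdown that -X importtime gives.
        start = time.perf_counter()
        importlib.import_module(name)
        return (time.perf_counter() - start) * 1000.0

    print("import json: %.2f ms" % time_import("json"))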
Recent example: Barry Warsaw identified that pkg_resources is slow and added importlib.resources to Python 3.7: https://docs.python.org/dev/library/importlib.html#module-importlib.resources Brett Cannon is also working on a standard solution for lazy imports since many years: https://pypi.org/project/modutil/ https://snarky.ca/lazy-importing-in-python-3-7/ Nick Coghlan is working on the C API to configure Python startup: PEP 432. When it will be ready, maybe Mercurial could use a custom Python optimized for its use case. IMHO Python import system is inefficient. We try too many alternative names. Example with Python 3.8 $ ./python -vv: >>> import dontexist # trying /home/vstinner/prog/python/master/dontexist.cpython-38dm-x86_64-linux-gnu.so # trying /home/vstinner/prog/python/master/dontexist.abi3.so # trying /home/vstinner/prog/python/master/dontexist.so # trying /home/vstinner/prog/python/master/dontexist.py # trying /home/vstinner/prog/python/master/dontexist.pyc # trying /home/vstinner/prog/python/master/Lib/dontexist.cpython-38dm-x86_64-linux-gnu.so # trying /home/vstinner/prog/python/master/Lib/dontexist.abi3.so # trying /home/vstinner/prog/python/master/Lib/dontexist.so # trying /home/vstinner/prog/python/master/Lib/dontexist.py # trying /home/vstinner/prog/python/master/Lib/dontexist.pyc # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64-3.8-pydebug/dontexist.cpython-38dm-x86_64-linux-gnu.so # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64-3.8-pydebug/dontexist.abi3.so # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64-3.8-pydebug/dontexist.so # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64-3.8-pydebug/dontexist.py # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64-3.8-pydebug/dontexist.pyc # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.cpython-38dm-x86_64-linux-gnu.so # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.abi3.so # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.so # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.py # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.pyc Traceback (most recent call last): File "", line 1, in File "", line 983, in _find_and_load File "", line 965, in _find_and_load_unlocked ModuleNotFoundError: No module named 'dontexist' Why do we still check for the .pyc file outside __pycache__ directories? Why do we have to check for 3 different names for .so files? Does Mercurial need all directories of sys.path? What's the status of the "system python" project? :-) I also would prefer Python without the site module. Can we rewrite this module in C maybe? Until recently, the site module was needed on Python to create the "mbcs" encoding alias. Hopefully, the feature has been removed into Lib/encodings/__init__.py (new private _alias_mbcs() function). 
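A crude way to see what site (and general interpreter startup) costs on a given machine is to time empty interpreter runs; the numbers include process-spawn overhead, so treat them as relative rather than absolute:

    import subprocess
    import sys
    import time

    def startup_ms(args, runs=20):
        # Average wall time of `python <args> -c pass`, spawn overhead included.
        start = time.perf_counter()
        for _ in range(runs):
            subprocess.run([sys.executable] + args + ["-c", "pass"], check=True)
        return (time.perf_counter() - start) * 1000.0 / runs

    print("default      : %.1f ms" % startup_ms([]))
    print("-S (no site) : %.1f ms" % startup_ms(["-S"]))
    print("-I (isolated): %.1f ms" % startup_ms(["-I"]))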
Python 3.7b3+: $ python3.7 -X importtime -c pass import time: self [us] | cumulative | imported package import time: 95 | 95 | zipimport import time: 589 | 589 | _frozen_importlib_external import time: 67 | 67 | _codecs import time: 498 | 565 | codecs import time: 425 | 425 | encodings.aliases import time: 641 | 1629 | encodings import time: 228 | 228 | encodings.utf_8 import time: 143 | 143 | _signal import time: 335 | 335 | encodings.latin_1 import time: 58 | 58 | _abc import time: 265 | 322 | abc import time: 298 | 619 | io import time: 69 | 69 | _stat import time: 196 | 265 | stat import time: 169 | 169 | genericpath import time: 336 | 505 | posixpath import time: 1190 | 1190 | _collections_abc import time: 600 | 2557 | os import time: 223 | 223 | _sitebuiltins import time: 214 | 214 | sitecustomize import time: 74 | 74 | usercustomize import time: 477 | 3544 | site Victor From vstinner at redhat.com Wed May 2 05:35:46 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 2 May 2018 11:35:46 +0200 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> Message-ID: Sadly, Python 2.7.15 still miss the implementation of the "PEP 546 -- Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7": https://www.python.org/dev/peps/pep-0546/ Last time I checked, the tests failed on Travis CI and I failed to reproduce the issue: https://bugs.python.org/issue22559 I expected Cory Benfield to jump into this issue since his "PEP 543 -- A Unified TLS API for Python" was my motivation for the PEP 546, but it seems like he is busy and the TLS PEP doesn't move anymore :-( https://www.python.org/dev/peps/pep-0543/ Victor 2018-05-01 6:09 GMT+02:00 Benjamin Peterson : > Greetings, > I'm pleased to announce the immediate availability of Python 2.7.15, the latest bug fix release in the senescent Python 2.7 series. > > Source and binary downloads may be found on python.org: > > https://www.python.org/downloads/release/python-2715/ > > Bugs should be reported to https://bugs.python.org/ > > The source tarball contains a complete changelog in the Misc/NEWS file. The only change since the release candidate is a fix for undefined C behavior that newer compilers (including GCC 8) have started to exploit. > > Users of the macOS binaries should note that all python.org macOS installers now ship with a builtin copy of OpenSSL. Additionally, there is a new additional installer variant for macOS 10.9+ that includes a built-in version of Tcl/Tk 8.6. See the installer README for more information. > > Happy May, > Benjamin > 2.7 release manager > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From vstinner at redhat.com Wed May 2 05:32:39 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 2 May 2018 11:32:39 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: <20180502111736.73c37b1b@fsol> References: <20180502111736.73c37b1b@fsol> Message-ID: 2018-05-02 11:17 GMT+02:00 Antoine Pitrou : > It's at least possible with gcc, clang and MSVC: > https://stackoverflow.com/questions/295120/c-mark-as-deprecated/295229#295229 > > You can even add a deprecation message. 
Aha, I opened an issue: https://bugs.python.org/issue33407 "Implement Py_DEPRECATED() macro for Visual Studio" Victor From solipsis at pitrou.net Wed May 2 05:43:41 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 2 May 2018 11:43:41 +0200 Subject: [Python-Dev] Python startup time References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <20180502114341.35790502@fsol> On Wed, 2 May 2018 11:26:35 +0200 Victor Stinner wrote: > > Brett Cannon is also working on a standard solution for lazy imports > since many years: > https://pypi.org/project/modutil/ > https://snarky.ca/lazy-importing-in-python-3-7/ AFAIK, Mercurial already has its own lazy importer. > Nick Coghlan is working on the C API to configure Python startup: PEP > 432. When it will be ready, maybe Mercurial could use a custom Python > optimized for its use case. > > IMHO Python import system is inefficient. We try too many alternative names. The overhead of importing is not in trying too many names, but in loading the module and executing its bytecode. > Why do we still check for the .pyc file outside __pycache__ directories? Because we support sourceless distributions. > Why do we have to check for 3 different names for .so files? See https://bugs.python.org/issue32387 Regards Antoine. From benjamin at python.org Wed May 2 10:05:30 2018 From: benjamin at python.org (Benjamin Peterson) Date: Wed, 02 May 2018 07:05:30 -0700 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> Message-ID: <1525269930.1196415.1358244048.5E673230@webmail.messagingengine.com> The lack of movement for a year makes me wonder if PEP 546 should be moved to Withdrawn status. On Wed, May 2, 2018, at 02:35, Victor Stinner wrote: > Sadly, Python 2.7.15 still miss the implementation of the "PEP 546 -- > Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7": > https://www.python.org/dev/peps/pep-0546/ > > Last time I checked, the tests failed on Travis CI and I failed to > reproduce the issue: > https://bugs.python.org/issue22559 > > I expected Cory Benfield to jump into this issue since his "PEP 543 -- > A Unified TLS API for Python" was my motivation for the PEP 546, but > it seems like he is busy and the TLS PEP doesn't move anymore :-( > https://www.python.org/dev/peps/pep-0543/ > > Victor > > 2018-05-01 6:09 GMT+02:00 Benjamin Peterson : > > Greetings, > > I'm pleased to announce the immediate availability of Python 2.7.15, the latest bug fix release in the senescent Python 2.7 series. > > > > Source and binary downloads may be found on python.org: > > > > https://www.python.org/downloads/release/python-2715/ > > > > Bugs should be reported to https://bugs.python.org/ > > > > The source tarball contains a complete changelog in the Misc/NEWS file. The only change since the release candidate is a fix for undefined C behavior that newer compilers (including GCC 8) have started to exploit. > > > > Users of the macOS binaries should note that all python.org macOS installers now ship with a builtin copy of OpenSSL. Additionally, there is a new additional installer variant for macOS 10.9+ that includes a built-in version of Tcl/Tk 8.6. See the installer README for more information. 
> > > > Happy May, > > Benjamin > > 2.7 release manager > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/benjamin%40python.org From gregory.szorc at gmail.com Wed May 2 12:42:55 2018 From: gregory.szorc at gmail.com (Gregory Szorc) Date: Wed, 2 May 2018 09:42:55 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Tue, May 1, 2018 at 11:55 PM, Ray Donnelly wrote: > Is your Python interpreter statically linked? The Python 3 ones from the anaconda distribution (use Miniconda!) are for Linux and macOS and that roughly halved our startup times. My Python interpreters use a shared library. I'll definitely investigate the performance of a statically-linked interpreter. Correct me if I'm wrong, but aren't there downsides with regards to C extension compatibility to not having a shared libpython? Or does all the packaging tooling "just work" without a libpython? (It's possible I have my wires crossed up with something else regarding a statically linked Python.) On Wed, May 2, 2018 at 2:26 AM, Victor Stinner wrote: > What do you propose to make Python startup faster? > That's a very good question. I'm not sure I'm able to answer it because I haven't dug too much into CPython's internals much farther than what is required to implement C extensions. But I can share insight from what the Mercurial project has collectively learned. > > As I wrote in my previous emails, many Python core developers care of > the startup time and we are working on making it faster. > > INADA Naoki added -X importtime to identify slow imports and > understand where Python spent its startup time. > -X importtime is a great start! For a follow-up enhancement, it would be useful to see what aspects of import are slow. Is it finding modules (involves filesystem I/O)? Is it unmarshaling pyc files? Is it executing the module code? If executing code, what part is slow? Inline statements/expressions? Compiling types? Printing the microseconds it takes to import a module is useful. But it only gives me a general direction: I want to know what parts of the import made it slow so I know if I should be focusing on code running during module import, slimming down the size of a module, eliminating the module import from fast paths, pursuing alternative module importers, etc. > > Recent example: Barry Warsaw identified that pkg_resources is slow and > added importlib.resources to Python 3.7: > https://docs.python.org/dev/library/importlib.html#module- > importlib.resources > > Brett Cannon is also working on a standard solution for lazy imports > since many years: > https://pypi.org/project/modutil/ > https://snarky.ca/lazy-importing-in-python-3-7/ > Mercurial has used lazy module imports for years. On 2.7.14, it reduces `hg version` from ~160ms to ~55ms (~34% of original). On Python 3, we're using `importlib.util.LazyLoader` and it reduces `hg version` on 3.7 from ~245ms to ~120ms (~49% of original). I'm not sure why Python 3's built-in module importer doesn't yield the speedup that our custom Python 2 importer does. 
One explanation is our custom importer is more advanced than importlib. Another is that Python 3's import mechanism is slower (possibly due to being written in Python instead of C). We haven't yet spent much time optimizing Mercurial for Python 3: our immediate goal is to get it working first. Given the startup performance problem on Python 3, it is only a matter of time before we dig into this further. It's worth noting that lazy module importing can be undone via common patterns. Most commonly, `from foo import X`. It's *really* difficult to implement a proper object proxy. Mercurial's lazy importer gives up in this case and imports the module and exports the symbol. (But if the imported module is a package, we detect that and make the module exports proxies to a lazy module.) Another common undermining of the lazy importer is code that runs during import time module exec that accesses an attribute. e.g. ``` import foo class myobject(foo.Foo): pass ``` Mercurial goes out of its way to avoid these patterns so modules can be delay imported as much as possible. As long as import times are problematic, it would be helpful if the standard library adopted similar patterns. Although I recognize there are backwards compatibility concerns that tie your hands a bit. > Nick Coghlan is working on the C API to configure Python startup: PEP > 432. When it will be ready, maybe Mercurial could use a custom Python > optimized for its use case. > That looks great! The direction Mercurial is going in is that `hg` will likely become a Rust binary (instead of a #!python script) that will use an embedded Python interpreter. So we will have low-level control over the interpreter via the C API. I'd also like to see us distribute a copy of Python in our official builds. This will allow us to take various shortcuts, such as not having to probe various sys.path entries since certain packages can only exist in one place. I'd love to get to the state Google is at where they have self-contained binaries with ELF sections containing Python modules. But that requires a bit of very low-level hacking. We'll likely have a Rust binary (that possibly static links libpython) and a separate JAR/zip-like file containing resources. But many people obtain Python via their system package manager and no matter how hard we scream that Mercurial is a standalone application, they will configure their packages to link against the system libpython and use the system Python's standard library. This will potentially undo many of our startup time wins. > > IMHO Python import system is inefficient. We try too many alternative > names. 
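For reference, the `importlib.util.LazyLoader` mentioned above is wired up roughly as in the importlib documentation's recipe; module execution is deferred until the first attribute access:

    import importlib.util
    import sys

    def lazy_import(name):
        spec = importlib.util.find_spec(name)
        loader = importlib.util.LazyLoader(spec.loader)
        spec.loader = loader
        module = importlib.util.module_from_spec(spec)
        sys.modules[name] = module
        loader.exec_module(module)
        return module

    json = lazy_import("json")         # nothing executed yet
    print(json.dumps({"lazy": True}))  # module body runs here, on first attribute access

As noted earlier, a plain `from foo import X` at import time forces the module to load immediately, which is why Mercurial goes out of its way to avoid that pattern.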
> > Example with Python 3.8 > > $ ./python -vv: > >>> import dontexist > # trying /home/vstinner/prog/python/master/dontexist.cpython-38dm- > x86_64-linux-gnu.so > # trying /home/vstinner/prog/python/master/dontexist.abi3.so > # trying /home/vstinner/prog/python/master/dontexist.so > # trying /home/vstinner/prog/python/master/dontexist.py > # trying /home/vstinner/prog/python/master/dontexist.pyc > # trying /home/vstinner/prog/python/master/Lib/dontexist.cpython- > 38dm-x86_64-linux-gnu.so > # trying /home/vstinner/prog/python/master/Lib/dontexist.abi3.so > # trying /home/vstinner/prog/python/master/Lib/dontexist.so > # trying /home/vstinner/prog/python/master/Lib/dontexist.py > # trying /home/vstinner/prog/python/master/Lib/dontexist.pyc > # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64- > 3.8-pydebug/dontexist.cpython-38dm-x86_64-linux-gnu.so > # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64- > 3.8-pydebug/dontexist.abi3.so > # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64- > 3.8-pydebug/dontexist.so > # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64- > 3.8-pydebug/dontexist.py > # trying /home/vstinner/prog/python/master/build/lib.linux-x86_64- > 3.8-pydebug/dontexist.pyc > # trying /home/vstinner/.local/lib/python3.8/site-packages/dontex > ist.cpython-38dm-x86_64-linux-gnu.so > # trying /home/vstinner/.local/lib/python3.8/site-packages/dontex > ist.abi3.so > # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.so > # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.py > # trying /home/vstinner/.local/lib/python3.8/site-packages/dontexist.pyc > Traceback (most recent call last): > File "", line 1, in > File "", line 983, in _find_and_load > File "", line 965, in > _find_and_load_unlocked > ModuleNotFoundError: No module named 'dontexist' > > Why do we still check for the .pyc file outside __pycache__ directories? > > Why do we have to check for 3 different names for .so files? > Yes, I also cringe every time I trace Python's system calls and see these needless stats and file opens. Unless Python adds the ability to tell the import mechanism what type of module to import, Mercurial will likely modify our custom importer to only look for specific files. We do provide pure Python modules for modules that have C implementations. But we have code that ensures that the C version is loaded for certain Python configurations because we don't want users accidentally using the non-C modules and then complaining about Mercurial's performance! We already denote the set of modules backed by C. What we're missing (but is certainly possible to implement) is code that limits the module finding search depending on whether the module is backed by Python or C. But this only really works for Mercurial's modules: we don't really know what the standard library is doing and coding assumptions into Mercurial about standard library behavior feels dangerous. If we ship our own Python distribution, we'll likely have a jar-like file containing all modules. Determining which file to load will read an in-memory file index and not require any expensive system calls to look for files. > > Does Mercurial need all directories of sys.path? > No and yes. Mercurial by itself can get by with just the standard library and Mercurial's own packages. But extensions change everything. An extension could modify sys.path though. So limiting sys.path inside Mercurial is somewhat reasonable. 
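As a concrete illustration of that kind of trimming, a sketch along these lines could run at startup (the allowed roots here are invented; a real application would derive them from its own installation layout):

```
import os
import sys

# Keep only the stdlib directory and the application's own package root.
# "/usr/lib/myapp" is a made-up path used purely for illustration.
ALLOWED_ROOTS = (os.path.dirname(os.__file__), "/usr/lib/myapp")

sys.path[:] = [entry for entry in sys.path
               if entry and any(entry.startswith(root) for root in ALLOWED_ROOTS)]
```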
Although it's definitely unexpected for a Python application to be removing entries from sys.path when the application starts. > > What's the status of the "system python" project? :-) > > I also would prefer Python without the site module. Can we rewrite > this module in C maybe? Until recently, the site module was needed on > Python to create the "mbcs" encoding alias. Hopefully, the feature has > been removed into Lib/encodings/__init__.py (new private _alias_mbcs() > function). > I also lament the startup time effects of site.py. When `hg` is a Rust binary, we will almost certainly skip site.py and manually perform any required actions that it was performing. > > Python 3.7b3+: > > $ python3.7 -X importtime -c pass > import time: self [us] | cumulative | imported package > import time: 95 | 95 | zipimport > import time: 589 | 589 | _frozen_importlib_external > import time: 67 | 67 | _codecs > import time: 498 | 565 | codecs > import time: 425 | 425 | encodings.aliases > import time: 641 | 1629 | encodings > import time: 228 | 228 | encodings.utf_8 > import time: 143 | 143 | _signal > import time: 335 | 335 | encodings.latin_1 > import time: 58 | 58 | _abc > import time: 265 | 322 | abc > import time: 298 | 619 | io > import time: 69 | 69 | _stat > import time: 196 | 265 | stat > import time: 169 | 169 | genericpath > import time: 336 | 505 | posixpath > import time: 1190 | 1190 | _collections_abc > import time: 600 | 2557 | os > import time: 223 | 223 | _sitebuiltins > import time: 214 | 214 | sitecustomize > import time: 74 | 74 | usercustomize > import time: 477 | 3544 | site As for things Python could do to make things better, one idea is for "package bundles." Instead of using .py, .pyc, .so, etc files as separate files on the filesystem, allow Python packages to be distributed as standalone "archive" files. Like Java's jar files. This has the advantage that there is only a single place to look for files in a given Python package. And since the bundle is immutable, you can index it so imports don't need to touch the filesystem to discover what is present: you do a quick memory lookup and jump straight to the available file. If you go this route, please don't require the use of zlib for file compression, as zlib is painfully slow compared to alternatives like lz4 and zstandard. I know this kinda/sorta exists with zipimporter. But zipimporter uses zlib (slow) and only allows .py/.pyc files. And I think some Python application distribution tools have also solved this problem. I'd *really* like to see a proper/robust solution in Python itself. Along that vein, it would be really nice if the "standalone Python application" story were a bit more formalized. From my perspective, it is insanely difficult to package and distribute an application that happens to use Python. It requires vastly different solutions for different platforms. I want to declare a minimal boilerplate somewhere (perhaps in setup.py) and run a command that produces an as-self-contained-as-possible application complete with platform-native installers. Presumably such a self-contained application could take many shortcuts with regards to process startup and mitigate this general problem. Again, Mercurial is trending in the direction of making `hg` a Rust binary and distributing its own Python. Since we have to solve this packaging+distribution problem on multiple platforms, I'll try to keep an eye towards making whatever solution we concoct reusable by other projects. 
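To make the "package bundle" idea a bit more concrete, here is a rough sketch of a meta path finder that serves modules out of an in-memory index instead of stat()ing the filesystem; the module name and source below are invented, and a real bundle would map names to offsets inside a single archive file:

```
import importlib.abc
import importlib.util
import sys

class BundleFinder(importlib.abc.MetaPathFinder, importlib.abc.Loader):
    """Import modules from a prebuilt {name: source} index, with no filesystem probing."""

    def __init__(self, index):
        self._index = index

    def find_spec(self, fullname, path=None, target=None):
        if fullname not in self._index:
            return None          # fall through to the normal finders
        return importlib.util.spec_from_loader(fullname, self)

    def create_module(self, spec):
        return None              # use the default module object

    def exec_module(self, module):
        source = self._index[module.__name__]
        exec(compile(source, "<bundle:%s>" % module.__name__, "exec"), module.__dict__)

bundle = {"greeting": "def hello():\n    return 'hi from the bundle'\n"}
sys.meta_path.insert(0, BundleFinder(bundle))

import greeting
print(greeting.hello())
```

A production version would also want origin metadata, bytecode caching, and a compression codec faster than zlib, as noted above.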
-------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed May 2 13:24:05 2018 From: brett at python.org (Brett Cannon) Date: Wed, 02 May 2018 17:24:05 +0000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: On Wed, 2 May 2018 at 02:12 Victor Stinner wrote: > Hi, > > As a follow-up to the "[Python-Dev] (Looking for) A Retrospective on > the Move to Python 3" thread, I will like to clarify how a feature > should be removed from Python. > > We have multiple tools: > > * Emit a PendingDeprecationWarning warning at runtime for *slow* > deprecation (remove a feature in at least 3 cycles) > * Emit a DeprecationWarning warning at runtime for fast deprecation > (remove a feature in 2 cycles) > * Document that a feature is deprecated in the module documentation > * "What's New in Python X.Y?" documents: especially Deprecated, > Removed and Porting to Python X.Y sections. > * Communicate on python-dev, Twitter, Python Insider blog, etc. > * Collaborate with major Python projects to help them to migrate the > alternative > > IMHO a feature should not be removed if there is no alternative, or if > the alternative is too new. > > The usual process is: > > * Document the deprecation and emit DeprecationWarning in version N > * Document the removal and remove the feature in version N+1 > > Slow deprecation: > > * Document the deprecation and emit PendingDeprecationWarning in version N > * Emit a DeprecationWarning in version N+1 > * Document the removal and remove the feature in version N+2 > > The hidden ghost is the old rule: > > "keep old APIs around to make porting from Python 2 easier" > > Is this rule still applicable? > Python 2 is still supported, so I assume so. > > Does it mean that the Python 3 release following Python 2 end-of-life > (2020) will be our next feared "Python 4"? Are we going to remove all > deprecated features at once, to maximize incompatibilities and make > users unhappy? > I don't see why removing features that already raise a DeprecationWarning would require bumping the major version number. Personally, I assumed either Python 3.9 or 3.10 would be the version where we were okay clearing out the stuff that had been raising DeprecationWarning for years. > > Should we always document in which version a feature will be removed? > We should at least open an issue to track when the removal is scheduled. But if we know ahead of time then I so no reason not to document it. > Some features are deprecated since many versions, and the deprecated > features are still there. In most cases, it's because of the Python 2 > rule. > > Sometimes, we forget to remove features which has been scheduled for > removal in a specific version. > Right, which is why we should open an issue immediately. I had a "remove pyvenv" issue open and assigned to myself for years while I waited for Python 3.8 development to open. Staring at that issue for so long made sure I didn't forget. ;) > > Maybe we should create a tool to list features scheduled for removal, > and open a discussion to check each removal? > I don't know if a tool is necessary. We could have a meta issue or text file somewhere to track what's to be removed in a certain version. > > Ten years ago, I wanted to remove most modules and many functions from > the standard library. 
Now my experience showed me that *each* removal > is very painful, hurt more projects than expected, and takes longer > than 3 years to be fully effective (no longer used in most common 3rd > party modules). > > The usual issue is to write a single code base working on all Python > versions. For example, the new alternative is only available on recent > Python versions, which requires to have at least two code paths to > support all versions. Sometimes, there are 3 code paths... > > For a recent example, see "remove platform.linux_distribution()": > https://bugs.python.org/issue28167 > > Removing a feature from the C API is more complex, since there is no > portable way to emit a deprecation warning at compilation. There is > Py_DEPRECATED() which seems to only be available on GCC (3.1 and > newer). > > Maybe we should stop to remove features, except if there is really a > good reason to do that? > I thought that already was the policy. ;) I think the real question is what is people's definition of a "good reason". For instance, I'm all for removing unsupported code so we don't have to maintain it, even if it's just from code modernization and such (i.e. pruning down the stdlib). But I also know others disagree with me and are fine just having modules sit there as long as the issue rate is low enough to not notice. It's just one of those things we don't have an official policy on (yet?). -Brett > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Wed May 2 13:55:53 2018 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 02 May 2018 17:55:53 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Wed, May 2, 2018, 09:51 Gregory Szorc wrote: > Correct me if I'm wrong, but aren't there downsides with regards to C > extension compatibility to not having a shared libpython? Or does all the > packaging tooling "just work" without a libpython? (It's possible I have my > wires crossed up with something else regarding a statically linked Python.) > IIRC, the rule on Linux is that if you build an extension on a statically built python, then it can be imported on a shared python, but not vice-versa. Manylinux wheels are therefore always built on a static python so that they'll work everywhere. (We should probably clean this up upstream at some point, but there's not a lot of appetite for touching this stuff ? very obscure, very easy to break things without realizing it, not much upside.) On Windows I don't think there is such a thing as a static build, because extensions have to link to the python dll to work at all. And on MacOS I'm not sure, though from knowing how their linker works my guess is that all extensions act like static extensions do on Linux. -n -------------- next part -------------- An HTML attachment was scrubbed... 
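For anyone unsure which flavour of interpreter they are running, a quick probe of the build configuration can be done from Python itself; this is only a heuristic, and the config vars shown below may be empty or None on some platforms:

```
import sysconfig

# 1 when CPython was configured with --enable-shared, i.e. a libpython DSO exists.
print("Py_ENABLE_SHARED:", sysconfig.get_config_var("Py_ENABLE_SHARED"))

# Library that embedding applications link against (e.g. a libpython .so or .a).
print("LDLIBRARY:", sysconfig.get_config_var("LDLIBRARY"))

# Filename suffix expected for compiled extension modules on this build.
print("EXT_SUFFIX:", sysconfig.get_config_var("EXT_SUFFIX"))
```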
URL: From njs at pobox.com Wed May 2 14:14:41 2018 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 02 May 2018 18:14:41 +0000 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: <1525269930.1196415.1358244048.5E673230@webmail.messagingengine.com> References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> <1525269930.1196415.1358244048.5E673230@webmail.messagingengine.com> Message-ID: I would guess that the folks who end up supporting python 2 past 2020 (either as distributors or as library authors) will have an easier time of it if python 2's ssl module gets resynced with python 3 before the eol. But I suppose it's up to them to do the work... and probably other changes like tls 1.3 support are more important than MemoryBIO? The only way I'd expect the MemoryBIO backport to really matter is if there ends up being a period when pip drops support for everything except the last python 2 micro release, and wants to switch to PEP 543 mode everywhere. (Pip is special because everyone else who needs fancy SSL features can 'pip install pyopenssl', but that doesn't work for pip itself.) But so far pip has never done anything like this, and I don't think keeping support for pre-PEP 543 Pythons will be difficult either, so this doesn't seem super compelling. On Wed, May 2, 2018, 07:07 Benjamin Peterson wrote: > The lack of movement for a year makes me wonder if PEP 546 should be moved > to Withdrawn status. > > On Wed, May 2, 2018, at 02:35, Victor Stinner wrote: > > Sadly, Python 2.7.15 still miss the implementation of the "PEP 546 -- > > Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7": > > https://www.python.org/dev/peps/pep-0546/ > > > > Last time I checked, the tests failed on Travis CI and I failed to > > reproduce the issue: > > https://bugs.python.org/issue22559 > > > > I expected Cory Benfield to jump into this issue since his "PEP 543 -- > > A Unified TLS API for Python" was my motivation for the PEP 546, but > > it seems like he is busy and the TLS PEP doesn't move anymore :-( > > https://www.python.org/dev/peps/pep-0543/ > > > > Victor > > > > 2018-05-01 6:09 GMT+02:00 Benjamin Peterson : > > > Greetings, > > > I'm pleased to announce the immediate availability of Python 2.7.15, > the latest bug fix release in the senescent Python 2.7 series. > > > > > > Source and binary downloads may be found on python.org: > > > > > > https://www.python.org/downloads/release/python-2715/ > > > > > > Bugs should be reported to https://bugs.python.org/ > > > > > > The source tarball contains a complete changelog in the Misc/NEWS > file. The only change since the release candidate is a fix for undefined C > behavior that newer compilers (including GCC 8) have started to exploit. > > > > > > Users of the macOS binaries should note that all python.org macOS > installers now ship with a builtin copy of OpenSSL. Additionally, there is > a new additional installer variant for macOS 10.9+ that includes a built-in > version of Tcl/Tk 8.6. See the installer README for more information. 
> > > > > > Happy May, > > > Benjamin > > > 2.7 release manager > > > _______________________________________________ > > > Python-Dev mailing list > > > Python-Dev at python.org > > > https://mail.python.org/mailman/listinfo/python-dev > > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > https://mail.python.org/mailman/options/python-dev/benjamin%40python.org > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nas-python at arctrix.com Wed May 2 16:10:25 2018 From: nas-python at arctrix.com (Neil Schemenauer) Date: Wed, 2 May 2018 13:10:25 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <20180502114341.35790502@fsol> Message-ID: <20180502201025.dgz7rxbdsvzukz4x@python.ca> Antoine: > The overhead of importing is not in trying too many names, but in > loading the module and executing its bytecode. That was my conclusion as well when I did some profiling last fall at the Python core sprint. My lazy execution experiments are an attempt to solve this: https://github.com/python/cpython/pull/6194 I expect that Mercurial is already doing a lot of tricks to make execution more lazy. They have a lazy module import hook but they probably do other things to not execute more bytecode at startup then is needed. My lazy execution idea is that this could happen more automatically. I.e. don't pay for something you don't use. Right now, with eager module imports, you usually pay a price for every bit of bytecode that your program potentially uses. Another idea, suggested to me by Carl Shapiro, is to store unmarshalled Python data in the heap section of the executable (or in DLLs). Then, the OS page fault handling would take care of only loading the data into RAM that is actually being used. The linker would take care of fixing up pointer references. There are a lot of details to work out with this idea but I have heard that Jeethu Rao (Carl's colleague at Instagram) has a prototype implementation that shows promise. Regards, Neil From larry at hastings.org Wed May 2 16:25:11 2018 From: larry at hastings.org (Larry Hastings) Date: Wed, 2 May 2018 13:25:11 -0700 Subject: [Python-Dev] [RELEASE] Python 2.7.15 In-Reply-To: References: <1525147795.1134689.1356448000.51663AA1@webmail.messagingengine.com> <1525269930.1196415.1358244048.5E673230@webmail.messagingengine.com> Message-ID: <8df3c821-99ec-a13a-6e8b-3d39bbd75318@hastings.org> On 05/02/2018 11:14 AM, Nathaniel Smith wrote: > I would guess that the folks who end up supporting python 2 past 2020 > (either as distributors or as library authors) will have an easier > time of it if python 2's ssl module gets resynced with python 3 before > the eol. But I suppose it's up to them to do the work... You mean feature-wise, or do you mean "use more modern SSL libraries"?? IIUC Windows doesn't use the system-provided SSL libraries, *and* 2.7 is built with a nasty old compiler.? It's entirely possible that on Windows 2.7 *can't* use current SSL libraries because they won't build under said nasty old compiler. 
(summoning Steve Dower etc here) //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Wed May 2 16:38:45 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 2 May 2018 23:38:45 +0300 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action Message-ID: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> The bottom line is: Tkinter is currently broken -- as in, it's not thread-safe (in both Py2 and Py3) despite being designed and advertizing itself as such. All the fix options require some redesign of either `_tkinter', or some of the core as well. So, I'd like to get some kind of core team's feedback and/or approval before pursuing any of them. The options are outlined in https://bugs.python.org/issue33257#msg316087 . If anyone of you is in Moscow, we can meet up and discuss this in a more time-efficient manner. -- Regards, Ivan From vano at mail.mipt.ru Wed May 2 16:51:13 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 2 May 2018 23:51:13 +0300 Subject: [Python-Dev] Drop/deprecate Tkinter? Message-ID: As https://bugs.python.org/issue33257 and https://bugs.python.org/issue33316 showed, Tkinter is broken, for both Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at least, and no-one gives a damn. This seems to be a testament that very few people are actually interested in or are using it. If that's so, there's no use keeping it in the standard library -- if anything, because there's not enough incentive and/or resources to support it. And to avoid screwing people (=me) up when they have the foolishness to think they can rely on it in their projects -- nowhere in the docs it is said that the module is only partly functional. -- Regards, Ivan From guido at python.org Wed May 2 17:02:40 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 2 May 2018 14:02:40 -0700 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: Message-ID: Wow. I guess your code was broken and now you seem really upset. Go punch a bag or something, and then propose something a little more constructive, like adding a warning to the docs. I can assure you that there are many people using apps written using Tkinter (e.g. IDLE) and there's a mailing list as well (tkinter-discuss at python.org). On Wed, May 2, 2018 at 1:51 PM, Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > As https://bugs.python.org/issue33257 and https://bugs.python.org/issue3 > 3316 showed, Tkinter is broken, for both Py2 and Py3, with both threaded > and non-threaded Tcl, since 2002 at least, and no-one gives a damn. > > This seems to be a testament that very few people are actually interested > in or are using it. > > If that's so, there's no use keeping it in the standard library -- if > anything, because there's not enough incentive and/or resources to support > it. And to avoid screwing people (=me) up when they have the foolishness to > think they can rely on it in their projects -- nowhere in the docs it is > said that the module is only partly functional. > > -- > > Regards, > Ivan > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/guido% > 40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
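For background on the threading discussion above: the usual workaround is to confine every Tk call to the thread that created the widgets and hand results from worker threads over a queue polled with `after()`. A minimal sketch follows (the widget layout and the 100 ms polling interval are arbitrary choices, and it assumes a display is available):

```
import queue
import threading
import tkinter as tk

def worker(out):
    out.put("done computing")      # no Tk calls on this thread

root = tk.Tk()
label = tk.Label(root, text="working...")
label.pack()

results = queue.Queue()
threading.Thread(target=worker, args=(results,), daemon=True).start()

def poll():
    try:
        label.config(text=results.get_nowait())
    except queue.Empty:
        root.after(100, poll)      # nothing yet, check again later

root.after(100, poll)
root.mainloop()
```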
URL: From barry at python.org Wed May 2 17:13:45 2018 From: barry at python.org (Barry Warsaw) Date: Wed, 2 May 2018 14:13:45 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: Thanks for bringing this topic up again. At $day_job, this is a highly visible and important topic, since the majority of our command line tools are written in Python (of varying versions from 2.7 to 3.6). Some of those tools can take upwards of 5 seconds or more just to respond to ?help, which causes lots of pain for developers, who complain (rightly so) up the management chain. ;) We?ve done a fair bit of work to bring those numbers down without super radical workarounds. Often there are problems not strictly related to the Python interpreter that contribute to this. Python gets blamed, but it?s not always the interpreter?s fault. Common issues include: * Modules that have import-time side effects, such as network access or expensive creation of data structures. Python 3.7?s `-X importtime` switch is a really wonderful way to identify the worst offenders. Once 3.7 is released, I do plan to spend some time using this to collect data internally so we can attack our own libraries, and perhaps put automated performance testing into our build stack, to identify start up time regressions. * pkg_resources. When you have tons of entries on sys.path, pkg_resources does a lot of work at import time, and because of common patterns which tend to use pkg_resources namespace package support in __init__.py files, this just kills start up times. Of course, pkg_resources has other uses too, so even in a purely Python 3 world (where your namespace packages can omit the __init__.py), you?ll often get clobbered as soon as you want to use the Basic Resource Access API. This is also pretty common, and it?s the main reason why Brett and I created importlib.resources for 3.7 (with a standalone API-compatible library for older Pythons). That?s one less reason to use pkg_resources, but it doesn?t address the __init__.py use. Brett and I have been talking about addressing that for 3.8. * pex - which we use as our single file zipapp tool. Especially the interaction between pex and pkg_resources introduces pretty significant overhead. My colleague Loren Carvalho created a tool called shiv which requires at least Python 3.6, avoids the use of pkg_resources, and implements other tricks to be much more performant than pex. Shiv is now open source and you can find it on RTD and GitHub. The switch to shiv and importlib.resources can shave 25-50% off of warm cache start up times for zipapp style executables. Another thing we?ve done, although I?m much less sanguine about them as a general approach, is to move imports into functions, but we?re trying to only use that trick on the most critical cases. Some import time effects can?t be changed. Decorators come to mind, and click is a popular library for CLIs that provides some great features, but decorators do prevent a lazy loading approach. > On May 1, 2018, at 20:26, Gregory Szorc wrote: >> You might think "what's a few milliseconds matter". But if you run >> hundreds of commands in a shell script it adds up. git's speed is one >> of the few bright spots in its UX, and hg's comparative slowness here is >> a palpable disadvantage. Oh, for command line tools, milliseconds absolutely matter. 
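To make the pkg_resources point concrete, the resource-access switch looks roughly like this; "mypkg" and "data.txt" are placeholders for a real package and data file:

```
# Old style: merely importing pkg_resources scans every distribution on
# sys.path at import time, which is where much of the startup cost comes from.
import pkg_resources
blob = pkg_resources.resource_string("mypkg", "data.txt")

# New style (Python 3.7+, or the importlib_resources backport): no scan needed.
from importlib import resources
blob = resources.read_binary("mypkg", "data.txt")
```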
> As a concrete example, I recently landed a Mercurial patch [2] that > stubs out zope.interface to prevent the import of 9 modules on every > `hg` invocation. I have a similar dastardly plan to provide a pkg_resources stub :). > Mercurial provides a `chg` program that essentially spins up a daemon > `hg` process running a "command server" so the `chg` program [written in > C - no startup overhead] can dispatch commands to an already-running > Python/`hg` process and avoid paying the startup overhead cost. When you > run Mercurial's test suite using `chg`, it completes *minutes* faster. > `chg` exists mainly as a workaround for slow startup overhead. A couple of our developers demoed a similar approach for one of our CLIs that almost everyone uses. It?s a big application with lots of dependencies, so particularly vulnerable to pex and pkg_resources overhead. While it was just a prototype, it was darn impressive to see subsequent invocations produce output almost immediately. It?s unfortunate that we have to utilize all these tricks to get even moderately performant Python CLIs. A few of us spent some time at last year?s core Python dev talking about other things we could do to improve Python?s start up time, not just with the interpreter itself, but within the larger context of the Python ecosystem. Many ideas seem promising until you dive into the details, so it?s definitely hard to imagine maintaining all of Python?s dynamic semantics and still making it an order of magnitude faster to start up. But that?s not an excuse to give up, and I?m hoping we can continue to attack the problem, both in the micro and the macro, for 3.8 and beyond, because the alternative is that Python becomes less popular as an implementation language for CLIs. That would be sad, and definitely has a long term impact on Python?s popularity. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: Message signed with OpenPGP URL: From robertwb at gmail.com Wed May 2 17:14:42 2018 From: robertwb at gmail.com (Robert Bradshaw) Date: Wed, 2 May 2018 14:14:42 -0700 Subject: [Python-Dev] PEP 575: Unifying function/method classes In-Reply-To: <5AE73C79.6090409@UGent.be> References: <5ACF8552.6020607@UGent.be> <43fe216c540847aa838bba67987b9657@xmail101.UGent.be> <5AE73C79.6090409@UGent.be> Message-ID: This would be really useful for Cython, as well as a nice cleanup in general (e.g. replacing 4 special cases with one check). It seems the main concern is the user-visible change in types. If this is determined to be too backwards incompatible (I would be surprised if many projects are impacted, but also surprised if none are--more data is warranted) I think the main points of this proposal could be addressed by introducing the common superclass(es) while keeping the "leaf" types of builtin_function_or_method, etc. exactly the same similar to the two-phase proposal (though of course it'd be a nice to split this up, as well as unify normal-method and c-defined-method if that's palatable). - Robert On Mon, Apr 30, 2018 at 8:55 AM, Jeroen Demeyer wrote: > On 2018-04-30 15:38, Mark Shannon wrote: > >> While a unified *interface* makes sense, a unified class hierarchy and >> implementation, IMO, do not. 
>> > > The main reason for the common base class is performance: in the bytecode > interpreter, when we call an object, CPython currently has a special case > for calling Python functions, a special case for calling methods, a special > case for calling method descriptors, a special case for calling built-in > functions. > > By introducing a common base class, we reduce the number of special cases. > Second, we allow using this fast path for custom classes. With PEP 575, it > is possible to create new classes with the same __call__ performance as the > current built-in function class. > > Bound-methods may be callables, but they are not functions, they are a >> pair of a function and a "self" object. >> > > From the Python language point of view, that may be true but that's not > how you want to implement methods. When I write a method in C, I want that > it can be called either as unbound method or as bound method: the C code > shouldn't see the difference between the calls X.foo(obj) or obj.foo(). And > you want both calls to be equally fast, so you don't want that the bound > method just wraps the unbound method. For this reason, it makes sense to > unify functions and methods. > > IMO, there are so many versions of "function" and "bound-method", that a >> unified class hierarchy and the resulting restriction to the >> implementation will make implementing a unified interface harder, not >> easier. >> > > PEP 575 does not add any restrictions: I never claimed that all callables > should inherit from base_function. Regardless, why would the common base > class add restrictions? You can still add attributes and customize whatever > you want in subclasses. > > > > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/robertwb% > 40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Wed May 2 17:24:05 2018 From: barry at python.org (Barry Warsaw) Date: Wed, 2 May 2018 14:24:05 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On May 2, 2018, at 09:42, Gregory Szorc wrote: > As for things Python could do to make things better, one idea is for "package bundles." Instead of using .py, .pyc, .so, etc files as separate files on the filesystem, allow Python packages to be distributed as standalone "archive" files. Of course, .so files have to be extracted to the file system, because we have to live with dlopen()?s API. In our first release of shiv, we had a loader that did exactly that for just .so files. We ended up just doing .pyz file unpacking unconditionally, ignoring zip-safe, mostly because too many packages still use __file__, which doesn?t work in a zipapp. I?ll plug shiv and importlib.resources (and the standalone importlib_resources) again here. :) > If you go this route, please don't require the use of zlib for file compression, as zlib is painfully slow compared to alternatives like lz4 and zstandard. shiv works in a similar manner to pex, although it?s a completely new implementation that doesn?t suffer from huge sys.paths or the use of pkg_resources. shiv + importlib.resources saves us 25-50% of warm cache startup time. That makes things better but still not ideal. 
Ultimately though that means we don?t suffer from the slowness of zlib since we don?t count cold cache times (i.e. before the initial pyz unpacking operation). Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 195 bytes Desc: Message signed with OpenPGP URL: From brian at python.org Wed May 2 17:24:07 2018 From: brian at python.org (Brian Curtin) Date: Wed, 02 May 2018 21:24:07 +0000 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: Message-ID: On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > As https://bugs.python.org/issue33257 and > https://bugs.python.org/issue33316 showed, Tkinter is broken, for both > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > least, and no-one gives a damn. > > This seems to be a testament that very few people are actually > interested in or are using it. > > If that's so, there's no use keeping it in the standard library -- if > anything, because there's not enough incentive and/or resources to > support it. And to avoid screwing people (=me) up when they have the > foolishness to think they can rely on it in their projects -- nowhere in > the docs it is said that the module is only partly functional. For the future, this is not how you communicate with the development mailing list of any open source software project. I would suggest reading https://www.python.org/psf/codeofconduct/ for some pointers on how people typically behave around here in particular. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Wed May 2 17:28:22 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 2 May 2018 23:28:22 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? References: Message-ID: <20180502232822.01778283@fsol> On Wed, 02 May 2018 21:24:07 +0000 Brian Curtin wrote: > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > python-dev at python.org> wrote: > > > As https://bugs.python.org/issue33257 and > > https://bugs.python.org/issue33316 showed, Tkinter is broken, for both > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > least, and no-one gives a damn. > > > > This seems to be a testament that very few people are actually > > interested in or are using it. > > > > If that's so, there's no use keeping it in the standard library -- if > > anything, because there's not enough incentive and/or resources to > > support it. And to avoid screwing people (=me) up when they have the > > foolishness to think they can rely on it in their projects -- nowhere in > > the docs it is said that the module is only partly functional. > > > For the future, this is not how you communicate with the development > mailing list of any open source software project. I would suggest reading > https://www.python.org/psf/codeofconduct/ for some pointers on how people > typically behave around here in particular. Perhaps it would be more constructive to address the OP's point than to play speech police. Regards Antoine. From solipsis at pitrou.net Wed May 2 17:37:05 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 2 May 2018 23:37:05 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? 
References: <20180502232822.01778283@fsol> Message-ID: <20180502233705.7249da5f@fsol> On Wed, 2 May 2018 23:28:22 +0200 Antoine Pitrou wrote: > On Wed, 02 May 2018 21:24:07 +0000 > Brian Curtin wrote: > > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > > python-dev at python.org> wrote: > > > > > As https://bugs.python.org/issue33257 and > > > https://bugs.python.org/issue33316 showed, Tkinter is broken, for both > > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > > least, and no-one gives a damn. > > > > > > This seems to be a testament that very few people are actually > > > interested in or are using it. > > > > > > If that's so, there's no use keeping it in the standard library -- if > > > anything, because there's not enough incentive and/or resources to > > > support it. And to avoid screwing people (=me) up when they have the > > > foolishness to think they can rely on it in their projects -- nowhere in > > > the docs it is said that the module is only partly functional. > > > > > > For the future, this is not how you communicate with the development > > mailing list of any open source software project. I would suggest reading > > https://www.python.org/psf/codeofconduct/ for some pointers on how people > > typically behave around here in particular. > > Perhaps it would be more constructive to address the OP's point than to > play speech police. To elaborate a bit: the OP, while angry, produced both a detailed analysis *and* a PR. It's normal to be angry when an advertised feature doesn't work and it makes you lose hours of work (or, even, forces you to a wholesale redesign). Producing a detailed analysis and a PR is more than most people will ever do. Regards Antoine. From guido at python.org Wed May 2 17:41:41 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 2 May 2018 14:41:41 -0700 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180502233705.7249da5f@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: So what do *you* think. Do you agree with the OP that Tkinter (and hence IDLE) should be scrapped? On Wed, May 2, 2018 at 2:37 PM, Antoine Pitrou wrote: > On Wed, 2 May 2018 23:28:22 +0200 > Antoine Pitrou wrote: > > On Wed, 02 May 2018 21:24:07 +0000 > > Brian Curtin wrote: > > > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > > > python-dev at python.org> wrote: > > > > > > > As https://bugs.python.org/issue33257 and > > > > https://bugs.python.org/issue33316 showed, Tkinter is broken, for > both > > > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > > > least, and no-one gives a damn. > > > > > > > > This seems to be a testament that very few people are actually > > > > interested in or are using it. > > > > > > > > If that's so, there's no use keeping it in the standard library -- if > > > > anything, because there's not enough incentive and/or resources to > > > > support it. And to avoid screwing people (=me) up when they have the > > > > foolishness to think they can rely on it in their projects -- > nowhere in > > > > the docs it is said that the module is only partly functional. > > > > > > > > > For the future, this is not how you communicate with the development > > > mailing list of any open source software project. I would suggest > reading > > > https://www.python.org/psf/codeofconduct/ for some pointers on how > people > > > typically behave around here in particular. 
> > > > Perhaps it would be more constructive to address the OP's point than to > > play speech police. > > To elaborate a bit: the OP, while angry, produced both a detailed > analysis *and* a PR. It's normal to be angry when an advertised > feature doesn't work and it makes you lose hours of work (or, even, > forces you to a wholesale redesign). Producing a detailed analysis and a > PR is more than most people will ever do. > > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoine at python.org Wed May 2 17:43:40 2018 From: antoine at python.org (Antoine Pitrou) Date: Wed, 2 May 2018 23:43:40 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: I have no opinion about scrapping IDLE and Tkinter, but if we don't, I think his concerns deserve addressing instead of being dismissed by wielding the CoC magic wand. Regards Antoine. Le 02/05/2018 ? 23:41, Guido van Rossum a ?crit?: > So what do *you* think. Do you agree with the OP that Tkinter (and hence > IDLE) should be scrapped? > > On Wed, May 2, 2018 at 2:37 PM, Antoine Pitrou > wrote: > > On Wed, 2 May 2018 23:28:22 +0200 > Antoine Pitrou > wrote: > > On Wed, 02 May 2018 21:24:07 +0000 > > Brian Curtin > wrote: > > > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > > > python-dev at python.org > wrote: > > >? ? > > > > As https://bugs.python.org/issue33257 > and > > > > https://bugs.python.org/issue33316 > showed, Tkinter is broken, for both > > > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > > > least, and no-one gives a damn. > > > > > > > > This seems to be a testament that very few people are actually > > > > interested in or are using it. > > > > > > > > If that's so, there's no use keeping it in the standard library -- if > > > > anything, because there's not enough incentive and/or resources to > > > > support it. And to avoid screwing people (=me) up when they have the > > > > foolishness to think they can rely on it in their projects -- nowhere in > > > > the docs it is said that the module is only partly functional.? ? > > > > > > > > > For the future, this is not how you communicate with the development > > > mailing list of any open source software project. I would suggest reading > > > https://www.python.org/psf/codeofconduct/ > for some pointers on how > people > > > typically behave around here in particular.? > > > > Perhaps it would be more constructive to address the OP's point than to > > play speech police. > > To elaborate a bit: the OP, while angry, produced both a detailed > analysis *and* a PR.? It's normal to be angry when an advertised > feature doesn't work and it makes you lose hours of work (or, even, > forces you to a wholesale redesign). Producing a detailed analysis and a > PR is more than most people will ever do. > > Regards > > Antoine. 
> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > > -- > --Guido van Rossum (python.org/~guido ) From brian at python.org Wed May 2 17:51:00 2018 From: brian at python.org (Brian Curtin) Date: Wed, 02 May 2018 21:51:00 +0000 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180502233705.7249da5f@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: On Wed, May 2, 2018 at 5:37 PM Antoine Pitrou wrote: > On Wed, 2 May 2018 23:28:22 +0200 > Antoine Pitrou wrote: > > On Wed, 02 May 2018 21:24:07 +0000 > > Brian Curtin wrote: > > > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > > > python-dev at python.org> wrote: > > > > > > > As https://bugs.python.org/issue33257 and > > > > https://bugs.python.org/issue33316 showed, Tkinter is broken, for > both > > > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > > > least, and no-one gives a damn. > > > > > > > > This seems to be a testament that very few people are actually > > > > interested in or are using it. > > > > > > > > If that's so, there's no use keeping it in the standard library -- if > > > > anything, because there's not enough incentive and/or resources to > > > > support it. And to avoid screwing people (=me) up when they have the > > > > foolishness to think they can rely on it in their projects -- > nowhere in > > > > the docs it is said that the module is only partly functional. > > > > > > > > > For the future, this is not how you communicate with the development > > > mailing list of any open source software project. I would suggest > reading > > > https://www.python.org/psf/codeofconduct/ for some pointers on how > people > > > typically behave around here in particular. > > > > Perhaps it would be more constructive to address the OP's point than to > > play speech police. > > To elaborate a bit: the OP, while angry, produced both a detailed > analysis *and* a PR. It's normal to be angry when an advertised > feature doesn't work and it makes you lose hours of work (or, even, > forces you to a wholesale redesign). Producing a detailed analysis and a > PR is more than most people will ever do. > It may be normal to be angry when something doesn't work the way it should, but analyzing and creating a PR aren't the gateway to normalizing this behavior. Sending thousands of people this type of email isn't how it works. To address their point: no, next topic. -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Wed May 2 17:54:04 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 2 May 2018 22:54:04 +0100 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180502233705.7249da5f@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: On 2 May 2018 at 22:37, Antoine Pitrou wrote: > To elaborate a bit: the OP, while angry, produced both a detailed > analysis *and* a PR. It's normal to be angry when an advertised > feature doesn't work and it makes you lose hours of work (or, even, > forces you to a wholesale redesign). Producing a detailed analysis and a > PR is more than most people will ever do. His *other* email seems reasonable, and warrants a response, yes. 
But are we to take the suggestion made here (to drop tkinter) seriously, based on the fact that there's a (rare - at least it appears that the many IDLE users haven't hit it yet) race condition that causes a crash in Python 2.7? (It appears that the problem doesn't happen in the python.org 3.x builds, if I understand the description of the issue). I don't have an opinion on the proposed fixes to tkinter, but I definitely don't think that dropping it is a reasonable option. Nor do I think the tone of his message here is acceptable - regardless of how annoyed he is, posting insults ("no-one gives a damn") about volunteer contributors in a public mailing list isn't reasonable or constructive. Call that "playing speech police" if you want, but I think that being offended or annoyed and saying so is perfectly reasonable. Paul From bsdtux at gmail.com Wed May 2 17:56:59 2018 From: bsdtux at gmail.com (Josh Stephens) Date: Wed, 2 May 2018 14:56:59 -0700 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: Hello list, ? If I may voice my opinion I would like to say that I just built an application using Tkinter using python3. I used it because it was included in python by default and I didn't have to using something like PyQT or any other framework that was heavy. While I agree that the docs can sometimes be confusing, I am not sure that it warrants tossing it out. I am not even sure that my opinion gives much weight but I figured I would just toss in a quick here is my vote and my story about using Tkinter with SqlAlchemy and Py2App to build a native Mac OS X app as of last month. Best Regards, Josh Stephens On May 2, 2018 at 4:46:29 PM, Antoine Pitrou (antoine at python.org) wrote: > > I have no opinion about scrapping IDLE and Tkinter, but if we don't, I > think his concerns deserve addressing instead of being dismissed by > wielding the CoC magic wand. > > Regards > > Antoine. > > > Le 02/05/2018 ? 23:41, Guido van Rossum a ?crit : > > So what do *you* think. Do you agree with the OP that Tkinter (and hence > > IDLE) should be scrapped? > > > > On Wed, May 2, 2018 at 2:37 PM, Antoine Pitrou > > > wrote: > > > > On Wed, 2 May 2018 23:28:22 +0200 > > Antoine Pitrou > wrote: > > > On Wed, 02 May 2018 21:24:07 +0000 > > > Brian Curtin > wrote: > > > > On Wed, May 2, 2018 at 16:55 Ivan Pozdeev via Python-Dev < > > > > python-dev at python.org > wrote: > > > > > > > > > As https://bugs.python.org/issue33257 > > and > > > > > https://bugs.python.org/issue33316 > > showed, Tkinter is broken, for both > > > > > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > > > > > least, and no-one gives a damn. > > > > > > > > > > This seems to be a testament that very few people are actually > > > > > interested in or are using it. > > > > > > > > > > If that's so, there's no use keeping it in the standard library -- if > > > > > anything, because there's not enough incentive and/or resources to > > > > > support it. And to avoid screwing people (=me) up when they have the > > > > > foolishness to think they can rely on it in their projects -- nowhere in > > > > > the docs it is said that the module is only partly functional. > > > > > > > > > > > > For the future, this is not how you communicate with the development > > > > mailing list of any open source software project. 
I would suggest reading > > > > https://www.python.org/psf/codeofconduct/ > > for some pointers on how > > people > > > > typically behave around here in particular. > > > > > > Perhaps it would be more constructive to address the OP's point than to > > > play speech police. > > > > To elaborate a bit: the OP, while angry, produced both a detailed > > analysis *and* a PR. It's normal to be angry when an advertised > > feature doesn't work and it makes you lose hours of work (or, even, > > forces you to a wholesale redesign). Producing a detailed analysis and a > > PR is more than most people will ever do. > > > > Regards > > > > Antoine. > > > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > > > Unsubscribe: > > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > > > > > > > -- > > --Guido van Rossum (python.org/~guido ) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/bsdtux%40gmail.com > From solipsis at pitrou.net Wed May 2 18:01:53 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Thu, 3 May 2018 00:01:53 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: <20180503000153.6f8d3b33@fsol> On Wed, 2 May 2018 22:54:04 +0100 Paul Moore wrote: > On 2 May 2018 at 22:37, Antoine Pitrou wrote: > > To elaborate a bit: the OP, while angry, produced both a detailed > > analysis *and* a PR. It's normal to be angry when an advertised > > feature doesn't work and it makes you lose hours of work (or, even, > > forces you to a wholesale redesign). Producing a detailed analysis and a > > PR is more than most people will ever do. > > His *other* email seems reasonable, and warrants a response, yes. But > are we to take the suggestion made here (to drop tkinter) seriously, > based on the fact that there's a (rare - at least it appears that the > many IDLE users haven't hit it yet) race condition that causes a crash > in Python 2.7? (It appears that the problem doesn't happen in the > python.org 3.x builds, if I understand the description of the issue). I and others actually suggested it seriously in the past. Now, admittedly, at least IDLE seems better maintained than it used to be -- not sure about Tkinter itself. > Nor do I think the tone of his message here is acceptable - regardless > of how annoyed he is, posting insults ("no-one gives a damn") about > volunteer contributors in a public mailing list isn't reasonable or > constructive. Call that "playing speech police" if you want, but I > think that being offended or annoyed and saying so is perfectly > reasonable. Will all due respect, it's sometimes unpredictable what kind of wording Anglo-Saxons will take as an insult, as there's lot of obsequiosity there that doesn't exist in other cultures. To me, "not give a damn" reads like a familiar version of "not care about something", but apparently it can be offensive. Regards Antoine. From ronaldoussoren at mac.com Wed May 2 17:19:42 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Wed, 02 May 2018 23:19:42 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? 
In-Reply-To: References: Message-ID: <47BD1AED-E123-4560-AC84-B8671E08A95C@mac.com> > On 2 May 2018, at 22:51, Ivan Pozdeev via Python-Dev wrote: > > As https://bugs.python.org/issue33257 and https://bugs.python.org/issue33316 showed, Tkinter is broken, for both Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at least, and no-one gives a damn. The second issue number doesn?t refer to a Tkinter issue, the former is about a month old and has reactions from a core developer. That?s not ?nobody cares?. > > This seems to be a testament that very few people are actually interested in or are using it. Not necessarily, it primarily reflects that CPython is volunteer-driven project. This appears to be related to the interaction of Tkinter and threads, and requires hacking on C code. That seriously shrinks the pool of people that feel qualified to work on this. > > If that's so, there's no use keeping it in the standard library -- if anything, because there's not enough incentive and/or resources to support it. And to avoid screwing people (=me) up when they have the foolishness to think they can rely on it in their projects -- nowhere in the docs it is said that the module is only partly functional. Tkinter is used fairly often as an easily available GUI library and is not much as you imply. I don?t know how save calling GUI code from multiple threads is in general (separate from this Tkinter issue), but do know that this is definitely not save across platforms: at least on macOS calling GUI methods in Apple?s libraries from secondary threads is unsafe unless those methods are explicitly documented as thread-safe. Ronald > > -- > > Regards, > Ivan > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ronaldoussoren%40mac.com From gregory.szorc at gmail.com Wed May 2 18:24:21 2018 From: gregory.szorc at gmail.com (Gregory Szorc) Date: Wed, 2 May 2018 15:24:21 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <6ede5055-e89e-407c-a6b7-10677da0c886@gmail.com> On 5/2/18 2:24 PM, Barry Warsaw wrote: > On May 2, 2018, at 09:42, Gregory Szorc wrote: > >> As for things Python could do to make things better, one idea is for "package bundles." Instead of using .py, .pyc, .so, etc files as separate files on the filesystem, allow Python packages to be distributed as standalone "archive" files. > > Of course, .so files have to be extracted to the file system, because we have to live with dlopen()?s API. In our first release of shiv, we had a loader that did exactly that for just .so files. We ended up just doing .pyz file unpacking unconditionally, ignoring zip-safe, mostly because too many packages still use __file__, which doesn?t work in a zipapp. FWIW, Google has a patched glibc that implements dlopen_with_offset(). It allows you to do things like memory map the current binary and then dlopen() a shared library embedded in an ELF section. I've seen the code in the branch at https://sourceware.org/git/?p=glibc.git;a=shortlog;h=refs/heads/google/grte/v4-2.19/master. It likely exists elsewhere. An attempt to upstream it occurred at https://sourceware.org/bugzilla/show_bug.cgi?id=11767. 
It is probably well worth someone's time to pick up the torch and get this landed in glibc so everyone can be a massive step closer to self-contained, single binary applications. Of course, it will take years before you can rely on a glibc version with this API being deployed universally. But the sooner this lands... > > I?ll plug shiv and importlib.resources (and the standalone importlib_resources) again here. :) > >> If you go this route, please don't require the use of zlib for file compression, as zlib is painfully slow compared to alternatives like lz4 and zstandard. > > shiv works in a similar manner to pex, although it?s a completely new implementation that doesn?t suffer from huge sys.paths or the use of pkg_resources. shiv + importlib.resources saves us 25-50% of warm cache startup time. That makes things better but still not ideal. Ultimately though that means we don?t suffer from the slowness of zlib since we don?t count cold cache times (i.e. before the initial pyz unpacking operation). > > Cheers, > -Barry > > > From greg.ewing at canterbury.ac.nz Wed May 2 18:26:25 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 03 May 2018 10:26:25 +1200 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: <5AEA3B11.1020509@canterbury.ac.nz> Guido van Rossum wrote: > So what do *you* think. Do you agree with the OP that Tkinter (and hence > IDLE) should be scrapped? I don't have an opinion on that, but the issue of whether tkinter should be in the stdlib has been debated at least once before, and I took the OP as saying "maybe we should talk about that again". -- Greg From barry at python.org Wed May 2 19:11:50 2018 From: barry at python.org (Barry Warsaw) Date: Wed, 2 May 2018 16:11:50 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <6ede5055-e89e-407c-a6b7-10677da0c886@gmail.com> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <6ede5055-e89e-407c-a6b7-10677da0c886@gmail.com> Message-ID: <7CE27A91-AF96-4AD8-BE6B-0CAB39852E22@python.org> On May 2, 2018, at 15:24, Gregory Szorc wrote: > > FWIW, Google has a patched glibc that implements dlopen_with_offset(). > It allows you to do things like memory map the current binary and then > dlopen() a shared library embedded in an ELF section. > > I've seen the code in the branch at > https://sourceware.org/git/?p=glibc.git;a=shortlog;h=refs/heads/google/grte/v4-2.19/master. > It likely exists elsewhere. An attempt to upstream it occurred at > https://sourceware.org/bugzilla/show_bug.cgi?id=11767. It is probably > well worth someone's time to pick up the torch and get this landed in > glibc so everyone can be a massive step closer to self-contained, single > binary applications. Of course, it will take years before you can rely > on a glibc version with this API being deployed universally. But the > sooner this lands... Oh, I?m well aware of the history of this patch. :) I?d love to see it available on the platforms I use, and agree it?s well worth someone?s time to continue to shepherd this through the processes to make that happen. Even if it did take years to roll out, Python could use it with the proper compile-time checks. -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: Message signed with OpenPGP URL: From vano at mail.mipt.ru Wed May 2 19:12:48 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 3 May 2018 02:12:48 +0300 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180503000153.6f8d3b33@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> Message-ID: On 03.05.2018 1:01, Antoine Pitrou wrote: > On Wed, 2 May 2018 22:54:04 +0100 > Paul Moore wrote: >> On 2 May 2018 at 22:37, Antoine Pitrou wrote: >>> To elaborate a bit: the OP, while angry, produced both a detailed >>> analysis *and* a PR. It's normal to be angry when an advertised >>> feature doesn't work and it makes you lose hours of work (or, even, >>> forces you to a wholesale redesign). Producing a detailed analysis and a >>> PR is more than most people will ever do. >> His *other* email seems reasonable, and warrants a response, yes. But >> are we to take the suggestion made here (to drop tkinter) seriously, >> based on the fact that there's a (rare - at least it appears that the >> many IDLE users haven't hit it yet) race condition that causes a crash >> in Python 2.7? (It appears that the problem doesn't happen in the >> python.org 3.x builds, if I understand the description of the issue). In 3.x, Tkinter+threads is broken too, albeit in a different way -- see https://bugs.python.org/issue33412 (this should've been the 2nd link in the initial message, sorry for the mix-up). The 2.x bug also shows in 3.x if it's linked with a nonthreaded version of Tcl (dunno how rare that is, but the code still supports this setup). > I and others actually suggested it seriously in the past. Now, > admittedly, at least IDLE seems better maintained than it used to > be -- not sure about Tkinter itself. > >> Nor do I think the tone of his message here is acceptable - regardless >> of how annoyed he is, posting insults ("no-one gives a damn") about >> volunteer contributors in a public mailing list isn't reasonable or >> constructive. Call that "playing speech police" if you want, but I >> think that being offended or annoyed and saying so is perfectly >> reasonable. > Will all due respect, it's sometimes unpredictable what kind of wording > Anglo-Saxons will take as an insult, as there's lot of obsequiosity > there that doesn't exist in other cultures. To me, "not give a damn" > reads like a familiar version of "not care about something", but > apparently it can be offensive. Confirm, never meant this as an insult. I had to use emotional language to drive the point home that it's not some nitpick, it really causes people serious trouble (I lost a source of income, for the record). Without the emotional impact, my message could easily be ignored as some noise not worth attention. This time, it's just too damn important to allow this possibility. The module being abandoned and unused is truly the only explanation I could think of when seeing that glaring bugs have stayed unfixed for 15 years (an infinity in IT), in an actively developed and highly used software. This may be flattering for my ego, but if the module really is in any production use to speak of, then in all these years, with all this humongous user base, someone, somewhere in the world, at some point, should have looked into this. I don't even program in C professionally, yet was able to diagnose it and make a PR! 
--- I'll make a PR with the doc warning as Guido suggested unless there are any better ideas. Meanwhile, I'd really appreciate any response to my other message -- it is about actually fixing the issue, and I do need feedback to be able to proceed. No need to delve all the way in and give an official authorization or something. I'm only looking for an opinion poll on which redesign option (if any) looks like the most reasonable way to proceed and/or in line with the big picture (the last one -- to provide a unifying vision -- is _the_ job of a BDFL IIRC). > Regards > > Antoine. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From skip.montanaro at gmail.com Wed May 2 19:33:34 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Wed, 02 May 2018 23:33:34 +0000 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: Message-ID: I still use it a bit, in simple contexts to be sure, but I do find it useful. Others think so as well. I think TkAgg is probably the most commonly used backend in Matplotlib. I wrote a single Matplotlib-using program which plots columns from CSV files. I use it almost daily with no problems. Again, I use the TkAgg backend by default. So, does it have problems? Almost certainly. It seems you've encountered some. If you want to see something change though, just screaming at the developers is almost certainly the least productive thing you could do. Here are some things you *could* do: * submit some bug reports * review patches related to the problems, assuming there are some * write some patches for the documentation adding warnings about sketchy bits Skip -------------- next part -------------- An HTML attachment was scrubbed... URL: From python at mrabarnett.plus.com Wed May 2 19:54:14 2018 From: python at mrabarnett.plus.com (MRAB) Date: Thu, 3 May 2018 00:54:14 +0100 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: <6773b068-62ef-9983-c88b-b0f83922ca1e@mrabarnett.plus.com> On 2018-05-02 22:56, Josh Stephens wrote: > Hello list, > > ? If I may voice my opinion I would like to say that I just built an > application using Tkinter using python3. I used it because it was > included in python by default and I didn't have to using something > like PyQT or any other framework that was heavy. While I agree that > the docs can sometimes be confusing, I am not sure that it warrants > tossing it out. I am not even sure that my opinion gives much weight > but I figured I would just toss in a quick here is my vote and my > story about using Tkinter with SqlAlchemy and Py2App to build a native > Mac OS X app as of last month. > I have a few applications that use tkiner. Whilst it has its limitations, it's too useful to throw out. It's a battery, not a power station, but batteries have their uses. 
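The thread-safe usage pattern that keeps coming up in this thread (all tkinter calls stay in the main thread, worker threads hand results back over a queue, and the GUI polls that queue with after()) looks roughly like the minimal sketch below; the label text, the 2-second sleep and the 100 ms polling interval are made up for illustration.

    import queue
    import threading
    import time
    import tkinter as tk

    def worker(results):
        # Slow work happens off the GUI thread; no tkinter calls here.
        time.sleep(2)
        results.put("worker finished after 2s")

    def poll(root, label, results):
        # All widget updates happen here, in the main (GUI) thread.
        try:
            label.config(text=results.get_nowait())
        except queue.Empty:
            pass
        root.after(100, poll, root, label, results)  # re-check every 100 ms

    root = tk.Tk()
    label = tk.Label(root, text="working...")
    label.pack(padx=20, pady=20)

    results = queue.Queue()
    threading.Thread(target=worker, args=(results,), daemon=True).start()
    root.after(100, poll, root, label, results)
    root.mainloop()

Keeping every widget call on one thread is what sidesteps the races described in the issues above, whether or not the underlying Tcl build is threaded.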
From tjreedy at udel.edu Wed May 2 20:21:26 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 2 May 2018 20:21:26 -0400 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action In-Reply-To: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> References: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> Message-ID: On 5/2/2018 4:38 PM, Ivan Pozdeev via Python-Dev wrote: > The bottom line is: Tkinter is currently broken This is way over-stated. Many modules have bugs, somethings in features more central to their main purpose. > -- as in, it's not thread-safe (in both Py2 and Py3) Meaning that tkinter calls in threads sometimes work, and sometimes not. Most people do not think of trying this, and are therefore not affected. Others who want do either play it safe and desist or experiment to find out what does dependable work on their system. > despite being designed [as such]. Martin Loewis said this on a tracker issue several years ago, when he invited submission of patches he could review. Too bad he is not active now that someone (you) finally submitted one. The intention for Tkinter to be thread safe may have predated tcl/tk having a decent Mac version. > and advertizing itself as such. Where? According to Firefox, the current 3.6 tkinter chapter does not contain the string 'thread'. > All the fix options require some redesign of either `_tkinter', or some > of the core as well. This should be discussed on the tracker. Posting here was premature. > So, I'd like to get some kind of core team's feedback and/or approval > before pursuing any of them. Serhiy Storchaka is the tkinter maintainer. He is aware of your patch, https://github.com/python/cpython/pull/6444 having added himself as a reviewer. Your comments since then on https://bugs.python.org/issue33257 suggest that this is a first-draft patch that you yourself consider obsolete. In particular, your message today seems to, in effect, cancel the patch pending discussion of which fix option to pursue. He might be waiting for you to push updates. In any case, Serhiy is an extremely productive core developer, either submitting or merging nearly a patch a day. I am sure he saw fixing other issues, including some other and older tkinter issues, before the releases this week, as higher priority. -- Terry Jan Reedy From nad at python.org Wed May 2 20:16:25 2018 From: nad at python.org (Ned Deily) Date: Wed, 2 May 2018 20:16:25 -0400 Subject: [Python-Dev] [RELEASE] Python 3.7.0b4, final 3.7 beta, now available for testing Message-ID: <82F6CAB9-4144-4937-B73B-914AD6518173@python.org> Python 3.7.0b4 is the final beta preview of Python 3.7, the next feature release of Python. Beta releases are intended to give you the opportunity to test new features and bug fixes and to prepare your projects to support the new feature release. We strongly encourage you to test your projects with 3.7 during the beta phase and report issues found to bugs.python.org as soon as possible. While the release is feature complete entering the beta phase, it is possible that features may be modified or, in rare cases, deleted up until the start of the release candidate phase. Please keep in mind that this is a preview release and its use is not recommended for production environments. Attention macOS users: there is now a new installer variant for macOS 10.9+ that includes a built-in version of Tcl/Tk 8.6. This variant is expected to become the default version when 3.7.0 releases. Check it out! 
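For anyone who wants to check which Tcl/Tk a given Python build actually picked up (for instance, to confirm the bundled 8.6 in the new macOS installer variant), a quick check from the interpreter is:

    import tkinter
    print(tkinter.TkVersion)                       # e.g. 8.6
    print(tkinter.Tcl().eval("info patchlevel"))   # Tcl patch level, e.g. '8.6.8'

The exact patch level printed will of course depend on the installer and platform.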
The next preview release will be the release candidate and is planned for 2018-05-21 followed by the official release of 3.7.0, planned for 2018-06-15. You can find Python 3.7.0b4 and more information here: https://www.python.org/downloads/release/python-370b4/ -- Ned Deily nad at python.org -- [] From tjreedy at udel.edu Wed May 2 21:37:56 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 2 May 2018 21:37:56 -0400 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: Message-ID: On 5/2/2018 4:51 PM, Ivan Pozdeev via Python-Dev wrote: > As https://bugs.python.org/issue33257 As I report there, the 'crasher' does not crash on my Win 10 with either installed 3.7 or built 3.8. > https://bugs.python.org/issue33316 showed, nothing about tkinter > Tkinter is broken, for both One can crash CPython with legal Python code. I don't think that the language community should deprecate and drop CPython ;-). > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > least, and no-one gives a damn. The experience of perhaps hundred of thousands of people successfully writing or running tkinter-based programs says otherwise. > This seems to be a testament that very few people are actually > interested in or are using it. It is a testament that most people write sane code, for which tkinter (and Python) work well. > If that is so But it is not. Tkinter is actively maintained. > there's no use keeping it in the standard library Ridiculous. All you have done with this post is distract attention from real problems. -- Terry Jan Reedy From tjreedy at udel.edu Wed May 2 21:59:39 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 2 May 2018 21:59:39 -0400 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180503000153.6f8d3b33@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> Message-ID: On 5/2/2018 6:01 PM, Antoine Pitrou wrote: > On Wed, 2 May 2018 22:54:04 +0100 > Paul Moore wrote: >> His *other* email seems reasonable, and warrants a response, yes. But >> are we to take the suggestion made here (to drop tkinter) seriously, >> based on the fact that there's a (rare - at least it appears that the >> many IDLE users haven't hit it yet) race condition that causes a crash >> in Python 2.7? (It appears that the problem doesn't happen in the >> python.org 3.x builds, if I understand the description of the issue). I got the same impression, and indeed got no crashes in 25 tries. > I and others actually suggested it seriously in the past. Now, > admittedly, at least IDLE seems better maintained than it used to > be -- not sure about Tkinter itself. Serhiy continues to work on tkinter even though he now works on much else. We just made an overlooked 2to3 fix last week. -- Terry Jan Reedy From steve at pearwood.info Wed May 2 22:26:20 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Thu, 3 May 2018 12:26:20 +1000 Subject: [Python-Dev] Drop/deprecate Tkinter? 
In-Reply-To: <20180503000153.6f8d3b33@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> Message-ID: <20180503022619.GB9562@ando.pearwood.info> On Thu, May 03, 2018 at 12:01:53AM +0200, Antoine Pitrou wrote: > On Wed, 2 May 2018 22:54:04 +0100 > Paul Moore wrote: > > Nor do I think the tone of his message here is acceptable - regardless > > of how annoyed he is, posting insults ("no-one gives a damn") about > > volunteer contributors in a public mailing list isn't reasonable or > > constructive. Call that "playing speech police" if you want, but I > > think that being offended or annoyed and saying so is perfectly > > reasonable. > > Will all due respect, it's sometimes unpredictable what kind of wording > Anglo-Saxons will take as an insult, as there's lot of obsequiosity > there that doesn't exist in other cultures. To me, "not give a damn" > reads like a familiar version of "not care about something", but > apparently it can be offensive. I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to the point of ludicrousness to say that "no-one gives a damn" is an insult. This isn't 1939 when Clark Gable's famous line "Frankly my dear, I don't give a damn" was considered shocking. Its 2018 and to not give a damn is a more forceful way of saying that people don't care, that they are indifferent. It is a truism on the internet that nobody gets to decide for anyone else what they do or don't find offensive, but I think that the respectful and kind response is to interpret Ivan's statement as a cry of anguish and pain, to read it with at least a modicum of sympathy, rather than to read it as an insult and offensive accusation of indifference. (And why should being accused of indifference be offensive? The world is full of things I have neither the time nor inclination to give a damn about. I deny that I ought to feel guilty or ashamed by that fact.) I think Guido's response was great: acknowledge Ivan's pain (apparently he lost a job or some income) without attacking him, neither dismissing Ivan's feelings nor validating them as a tactic for getting his way. Thank you Guido for leading by example. With respect to Paul, I literally cannot imagine why he thinks that *anyone*, not even the tkinter maintainers or developers themselves, ought to feel *offended* by Ivan's words. But I think a clue might be his subsequent use of the word *annoyed*. Is it annoying to be told that "no-one cares" when in fact you care? Of course it can be. It is a perfectly reasonable to feel annoyed. But it isn't reasonable to lash out at every little annoyance. All interpersonal interactions can involve annoyances. And none of us are purely on the receiving end, we all also cause them. None of us are so perfect that we can afford to lash out each time somebody causes some tiny little annoyance. We ought to gloss over the little ones, just as we hope others will swallow *their* annoyance at the things we do. If we're going to be open, respectful and considerate, we have a duty not to have a hair-trigger "I'm offended" response at tiny annoyances. "That's offensive!", in this day and age, is the nuclear weapon of interpersonal conflict, and nothing Ivan said was so terrible that it deserved such an attack. Not if we are to be open, considerate and respectful. We ought to start by respecting the clear emotional pain in his email and not responding by going on the attack. "A soft answer turns away wrath". [1] By culture, not genetics. 
-- Steve From benjamin at python.org Wed May 2 23:26:25 2018 From: benjamin at python.org (Benjamin Peterson) Date: Wed, 02 May 2018 20:26:25 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> On Wed, May 2, 2018, at 09:42, Gregory Szorc wrote: > The direction Mercurial is going in is that `hg` will likely become a Rust > binary (instead of a #!python script) that will use an embedded Python > interpreter. So we will have low-level control over the interpreter via the > C API. I'd also like to see us distribute a copy of Python in our official > builds. This will allow us to take various shortcuts, such as not having to > probe various sys.path entries since certain packages can only exist in one > place. I'd love to get to the state Google is at where they have > self-contained binaries with ELF sections containing Python modules. But > that requires a bit of very low-level hacking. We'll likely have a Rust > binary (that possibly static links libpython) and a separate JAR/zip-like > file containing resources. I'm curious about the rust binary. I can see that would give you startup time benefits similar to the ones you could get hacking the interpreter directly; e.g., you can use a zipfile for everything and not have site.py. But it seems like the Python-side wins would stop there. Is this all a prelude to incrementally rewriting hg in rust? (Mercuric oxide?) From songofacandy at gmail.com Wed May 2 23:57:28 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 03 May 2018 03:57:28 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: Recently, I reported how stdlib slows down `import requests`. https://github.com/requests/requests/issues/4315#issuecomment-385584974 For Python 3.8, my ideas for faster startup time are: * Add lazy compiling API or flag in `re` module. The pattern is compiled when first used. * Add IntEnum and IntFlag alternative in C, like PyStructSequence for namedtuple. It will make importing `socket` and `ssl` module much faster. (Both module has huge enum/flag). * Add special casing for UTF-8 and ASCII in TextIOWrapper. When application uses only UTF-8 or ASCII, we can skip importing codecs and encodings package entirely. * Add faster and simpler http.parser (maybe, based on h11 [1]) and avoid using email module in http module. [1]: https://h11.readthedocs.io/en/latest/ I don't have significant estimate how they can make `import requests` faster, but I believe most of these ideas are worth enough. Regards, From tjreedy at udel.edu Thu May 3 00:01:10 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 3 May 2018 00:01:10 -0400 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On 5/2/2018 12:42 PM, Gregory Szorc wrote: > I know this kinda/sorta exists with zipimporter. But zipimporter uses > zlib (slow) and only allows .py/.pyc files. And I think some Python > application distribution tools have also solved this problem. I'd > *really* like to see a proper/robust solution in Python itself. Along > that vein, it would be really nice if the "standalone Python > application" story were a bit more formalized. From my perspective, it > is insanely difficult to package and distribute an application that > happens to use Python. 
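As a rough illustration of INADA Naoki's first idea above (deferring re.compile() until a pattern is actually used, so that module import stays cheap), application code can already approximate it with a small wrapper; the class below is only a sketch, not a proposed stdlib API, and the pattern string is made up:

    import re

    class LazyPattern:
        # Defer re.compile() until first use; illustration only.
        def __init__(self, pattern, flags=0):
            self._args = (pattern, flags)
            self._compiled = None

        def __getattr__(self, name):
            if self._compiled is None:
                self._compiled = re.compile(*self._args)
            return getattr(self._compiled, name)

    HEADER_RE = LazyPattern(r"^([A-Za-z-]+):\s*(.*)$", re.MULTILINE)
    # Importing the defining module costs nothing; the first search() compiles.
    match = HEADER_RE.search("Content-Type: text/plain")
    print(match.group(1))   # Content-Type

Measuring where startup time actually goes is what motivates ideas like this; `python3 -X importtime -c "import requests"` (the -X importtime flag is new in 3.7) prints a per-module import timing tree.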
It requires vastly different solutions for > different platforms. I want to declare a minimal boilerplate somewhere > (perhaps in setup.py) and run a command that produces an > as-self-contained-as-possible application complete with platform-native > installers. I few years ago I helped my wife create a tutorial in the Renpy visual storytelling engine. It is free and open source. https://www.renpy.org It is written in Python, while users write scripts in both Python and a custom scripting language. When we were done, we pressed a button and it generated self-contained zip files for Windows, Linux, and Mac. This can be done from any of the three platforms. After we tested all three files, she created a web page with links to the three files for download. There have been no complaints so far. Perhaps the file generators could be adapted to packaging a project directory into a self-contained app. -- Terry Jan Reedy From gregory.szorc at gmail.com Wed May 2 23:56:50 2018 From: gregory.szorc at gmail.com (Gregory Szorc) Date: Wed, 2 May 2018 20:56:50 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> Message-ID: On Wed, May 2, 2018 at 8:26 PM, Benjamin Peterson wrote: > > > On Wed, May 2, 2018, at 09:42, Gregory Szorc wrote: > > The direction Mercurial is going in is that `hg` will likely become a > Rust > > binary (instead of a #!python script) that will use an embedded Python > > interpreter. So we will have low-level control over the interpreter via > the > > C API. I'd also like to see us distribute a copy of Python in our > official > > builds. This will allow us to take various shortcuts, such as not having > to > > probe various sys.path entries since certain packages can only exist in > one > > place. I'd love to get to the state Google is at where they have > > self-contained binaries with ELF sections containing Python modules. But > > that requires a bit of very low-level hacking. We'll likely have a Rust > > binary (that possibly static links libpython) and a separate JAR/zip-like > > file containing resources. > > I'm curious about the rust binary. I can see that would give you startup > time benefits similar to the ones you could get hacking the interpreter > directly; e.g., you can use a zipfile for everything and not have site.py. > But it seems like the Python-side wins would stop there. Is this all a > prelude to incrementally rewriting hg in rust? (Mercuric oxide?) > The plans are recorded at https://www.mercurial-scm.org/wiki/OxidationPlan. tl;dr we want to write some low-level bits in Rust but we anticipate the bulk of the application logic remaining in Python. Nobody in the project is seriously talking about a complete rewrite in Rust. Contributors to the project have varying opinions on how aggressively Rust should be utilized. People who contribute to the C code, low-level primitives (like storage, deltas, etc), and those who care about performance tend to want more Rust. One thing we almost universally agree on is that we want to rewrite all of Mercurial's C code in Rust. I anticipate that figuring out the balance between Rust and Python in Mercurial will be an ongoing conversation/process for the next few years. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From mcepl at cepl.eu Thu May 3 01:39:05 2018 From: mcepl at cepl.eu (=?UTF-8?Q?Mat=C4=9Bj?= Cepl) Date: Thu, 03 May 2018 07:39:05 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID:

On 2018-05-02, 21:41 GMT, Guido van Rossum wrote:
> So what do *you* think. Do you agree with the OP that Tkinter (and hence > IDLE) should be scrapped?

It is absolutely impossible to remove Tkinter IMHO (it has been part of the stdlib since like forever and people expect it there; its removal would be a betrayal on the level of switching = to :=). I have my doubts about IDLE though. I know, the same argument applies, but really, does anybody use IDLE for development for a long time? What is its real value for the community? Although even this argument is questionable, because Python has some affinity with learning, and IDLE is nice for first steps nibbling into Python.

Best,

Matěj

-- https://matej.ceplovi.cz/blog/, Jabber: mcepl at ceplovi.cz GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8 Only two of my personalities are schizophrenic, but one of them is paranoid and the other one is out to get him.

From steve at pearwood.info Thu May 3 02:33:38 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Thu, 3 May 2018 16:33:38 +1000 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: <20180503063338.GF9562@ando.pearwood.info>

On Thu, May 03, 2018 at 07:39:05AM +0200, Matěj Cepl wrote:
> I have my doubts about IDLE though. I know, the same > argument applies, but really, does anybody use IDLE for > development for a long time

Yes, tons of beginners use it. On the tutor and python-list mailing lists, there are plenty of questions from people using IDLE.

-- Steve

From v+python at g.nevcal.com Thu May 3 01:56:38 2018 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 2 May 2018 22:56:38 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> Message-ID:

On 5/2/2018 8:56 PM, Gregory Szorc wrote:
> Nobody in the project is seriously talking about a complete rewrite in > Rust. Contributors to the project have varying opinions on how > aggressively Rust should be utilized. People who contribute to the C > code, low-level primitives (like storage, deltas, etc), and those who > care about performance tend to want more Rust. One thing we almost > universally agree on is that we want to rewrite all of Mercurial's C > code in Rust. I anticipate that figuring out the balance between Rust > and Python in Mercurial will be an ongoing conversation/process for > the next few years.

Have you considered simply rewriting CPython in Rust?

And yes, the 4th word in that question was intended to produce peals of shocked laughter. But why Rust? Why not Go? http://esr.ibiblio.org/?p=7724

-------------- next part -------------- An HTML attachment was scrubbed... URL: 

From p.f.moore at gmail.com Thu May 3 04:26:22 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 3 May 2018 09:26:22 +0100 Subject: [Python-Dev] Drop/deprecate Tkinter?
In-Reply-To: <20180503022619.GB9562@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> Message-ID: On 3 May 2018 at 03:26, Steven D'Aprano wrote: >> Will all due respect, it's sometimes unpredictable what kind of wording >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity >> there that doesn't exist in other cultures. To me, "not give a damn" >> reads like a familiar version of "not care about something", but >> apparently it can be offensive. > > I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to > the point of ludicrousness to say that "no-one gives a damn" is an > insult. This isn't 1939 when Clark Gable's famous line "Frankly my dear, > I don't give a damn" was considered shocking. Its 2018 and to not give a > damn is a more forceful way of saying that people don't care, that they > are indifferent. Sigh. That's not what I was saying at all. I was trying to point out that Antoine's claim that people should ignore the rhetoric and that complaining about the attitude was unreasonable, was in itself unfair. People have a right to point out that a mail like the OP's was badly worded. > With respect to Paul, I literally cannot imagine why he thinks that > *anyone*, not even the tkinter maintainers or developers themselves, > ought to feel *offended* by Ivan's words. Personally, they didn't offend me. I don't pretend to know how others might take them. But they *did* annoy me. I'm frankly sick of people (not on this list) complaining that people who work on projects in their own time, free of charge, "don't care enough" or "are ignoring my requirement". We all do it, to an extent, and it's natural to get frustrated, but the onus is on the person asking for help to be polite and fair. And maybe this response was the one where I finally let that frustration show through. I may read less email for a week or two, just to get a break. > But I think a clue might be his subsequent use of the word *annoyed*. Is > it annoying to be told that "no-one cares" when in fact you care? Of > course it can be. It is a perfectly reasonable to feel annoyed. But it > isn't reasonable to lash out at every little annoyance. Correct, I'm personally annoyed rather than offended. And maybe I reacted strongly (although my reaction was mainly a defense of people's right to be annoyed or offended, not a direct response to the OP). I *hope* it wasn't "lashing out", but I concede that others may view it differently than I do. It *certainly* isn't against "every little annoyance" though - I've been dealing politely and calmly with *many* entitled and misguided complaints recently (in many lists and fora - no point in trying to go hunting down what I'm referring to ;-)) and this was one too many. I should probably have just shut up and deleted the thread. I *will* stop at this point and not respond again on this thread. > All interpersonal interactions can involve annoyances. And none of us > are purely on the receiving end, we all also cause them. None of us are > so perfect that we can afford to lash out each time somebody causes some > tiny little annoyance. We ought to gloss over the little ones, just as > we hope others will swallow *their* annoyance at the things we do. > > If we're going to be open, respectful and considerate, we have a duty > not to have a hair-trigger "I'm offended" response at tiny annoyances. 
While true, this is biased in favour of people who start new threads, allowing them the freedom to not consider other's feelings or situations while expecting the recipients to be forgiving of hyperbole and overstated rhetoric. Relevant xkcd: https://xkcd.com/1984/ > "That's offensive!", in this day and age, is the nuclear weapon of > interpersonal conflict, and nothing Ivan said was so terrible that it > deserved such an attack. Not if we are to be open, considerate and > respectful. We ought to start by respecting the clear emotional pain in > his email and not responding by going on the attack. "A soft answer > turns away wrath". If that's directed at me, it's unfair. Personally, I consider "it's offensive" to be a mild expression of distaste at what someone says - often used somewhat jokingly, in reference to "political correctness" style jokes. Antoine clearly took it otherwise - his mention of "Anglo Saxon" suggests to me that he feels it's a cultural thing - although if so, he's misinterpreted the relevant cultures as far as I can see. British informal culture in my experience tends to be similar to what I describe above as my view. Or maybe he's just thinking of "people taking too much care to stick within the letter of codes of conduct", I don't know. I'm happy to try to avoid the word "offensive" in my future posts - it clearly has connotations that don't match what I intend. I've said my piece, so I'll leave it at that. I don't want mailing lists to become sterile places where everyone feels unable to speak their mind for fear of upsetting others, but I do think we *all* need to consider the other person's perspective. And I *particularly* think that people who start new threads need to be careful of the tone they want to set for the thread. For me, having to deal with a huge range of people with radically different backgrounds and experiences is one of the biggest benefits of participating in open source, and of the internet in general - I'd hate for that benefit to be stifled by excessive need for "careful" speech. But conversely, we need to respect and learn from those differences, not ignore them. Sermon over, sorry. Paul From stefan_ml at behnel.de Thu May 3 05:22:58 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Thu, 3 May 2018 11:22:58 +0200 Subject: [Python-Dev] PEP 575: Unifying function/method classes In-Reply-To: References: <5ACF8552.6020607@UGent.be> <5AD1C725.5090603@UGent.be> Message-ID: Hi, let me start by saying that I'm much in favour of this change. It cleans up a lot of the function implementation and makes it much easier to integrate efficiently with external wrapper tools. Guido van Rossum schrieb am 14.04.2018 um 23:14: > On Sat, Apr 14, 2018 at 2:17 AM, Jeroen Demeyer wrote: >> On 2018-04-13 21:30, Raymond Hettinger wrote: >> >>> It would be nice to have a section that specifically discusses the >>> implications with respect to other existing function-like tooling: >>> classmethod, staticmethod, partial, itemgetter, attrgetter, methodgetter, >>> etc. >>> >> >> My hope is that there are no such implications. An important design goal >> of this PEP (which I believe I achieved) is that as long as you're doing >> duck typing, you should be safe. I believe that the tools in your list do >> exactly that. >> >> It's only when you use inspect or when you do type checks that you will >> see the difference with this PEP. > > That actually sounds like a pretty big problem. 
I'm sure there is lots of > code that doesn't *just* duck-type nor calls inspect but uses isinstance() > to decide how to extract the desired information. After some discussion, it seems that we can avoid the backwards incompatibility by going half of the way first. We can keep the existing "builtin_function_or_method" type for now, and mostly just add the common base function type at the top. That provides most of the benefits, including fast integration with native external function implementations and most of the cleanup, while not requiring changes to type tests in user code. The final split could then be done later, e.g. for Py4.0, where people would be less surprised about minor breakages. The problem is that this change does not really fit into the deprecation cycle since there is no specific use case to warn about. Most code will simply keep working, and there is no specific code pattern that would break. It really depends on the exact reasons why some piece of code (thinks it) needs to do a type check. Such code is usually easy to fix, and also improve along the way, probably even by reducing the type checking or by starting to use "inspect" instead of direct (fragile) type checks. Much of the code that could potentially break probably only exists to work around the quirks of the current function types implementation in CPython, which this PEP specifically aims to clean up. Stefan From vstinner at redhat.com Thu May 3 05:30:31 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 3 May 2018 11:30:31 +0200 Subject: [Python-Dev] PEP 575: Unifying function/method classes In-Reply-To: References: <5ACF8552.6020607@UGent.be> <5AD1C725.5090603@UGent.be> Message-ID: 2018-05-03 11:22 GMT+02:00 Stefan Behnel : > The final split could then be done later, e.g. for Py4.0, where people > would be less surprised about minor breakages. Please don't queue backward incompatible changes for Python 4.0. You should use the regular deprecation process. (I didn't read the full thread, sorry :-p) Victor From J.Demeyer at UGent.be Thu May 3 06:22:38 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 3 May 2018 12:22:38 +0200 Subject: [Python-Dev] PEP 575: Unifying function/method classes In-Reply-To: <9e0f7462d6e4431faee21ba5f997f32a@xmail101.UGent.be> References: <5ACF8552.6020607@UGent.be> <5AD1C725.5090603@UGent.be> <9e0f7462d6e4431faee21ba5f997f32a@xmail101.UGent.be> Message-ID: <5AEAE2EE.9040704@UGent.be> On 2018-05-03 11:30, Victor Stinner wrote: > Please don't queue backward incompatible changes for Python 4.0. You > should use the regular deprecation process. I don't really see how that can be done here. As Stefan said > The problem is that this > change does not really fit into the deprecation cycle since there is no > specific use case to warn about. The PEP proposes to change an implementation detail. It's really hard to determine at runtime whether code is relying on that implementation detail. We could insert a DeprecationWarning in some places, but those would mostly be false positives (a DeprecationWarning is shown but the code won't break). On top of that, there is no way to show a DeprecationWarning for code like "type(x) is foo". Jeroen. From skip.montanaro at gmail.com Thu May 3 07:43:05 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Thu, 03 May 2018 11:43:05 +0000 Subject: [Python-Dev] Drop/deprecate Tkinter? Message-ID: One other small bit... There is some precedent for retaining modules where the underlying library was known to be buggy. 
The dearly departed bsddb module exposed libdb 1.85 (as I recall) which had an unfixable bug. Still, bsddb supported that broken version of the library for quite awhile before itself being deprecated, then removed, from the stdlib. I believe it was the sole persistent key/value store for most of the early years. So, bugs or not (& fixable or not) it's not like we haven't encountered this kind of case before. Skip -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at holdenweb.com Thu May 3 08:24:17 2018 From: steve at holdenweb.com (Steve Holden) Date: Thu, 3 May 2018 13:24:17 +0100 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> Message-ID: On Thu, May 3, 2018 at 12:12 AM, Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > On 03.05.2018 1:01, Antoine Pitrou wrote: > >> On Wed, 2 May 2018 22:54:04 +0100 >> Paul Moore wrote: >> >>> On 2 May 2018 at 22:37, Antoine Pitrou wrote: >>> >>>> To elaborate a bit: the OP, while angry, produced both a detailed >>>> analysis *and* a PR. It's normal to be angry when an advertised >>>> feature doesn't work and it makes you lose hours of work (or, even, >>>> forces you to a wholesale redesign). Producing a detailed analysis and a >>>> PR is more than most people will ever do. >>>> >>> His *other* email seems reasonable, and warrants a response, yes. But >>> are we to take the suggestion made here (to drop tkinter) seriously, >>> based on the fact that there's a (rare - at least it appears that the >>> many IDLE users haven't hit it yet) race condition that causes a crash >>> in Python 2.7? (It appears that the problem doesn't happen in the >>> python.org 3.x builds, if I understand the description of the issue). >>> >> In 3.x, Tkinter+threads is broken too, albeit in a different way -- see > https://bugs.python.org/issue33412 (this should've been the 2nd link in > the initial message, sorry for the mix-up). > ?The observation in t?hat issue that tkinter and threads should be handled in specific ways is certainly a given for old hands, who have long put the GUI code in one thread with one or more concurrent worker threads typically communicating through queues. But I haven't built anything like that recently, so I couldn't say how helpful the current documenation might be. The 2.x bug also shows in 3.x if it's linked with a nonthreaded version of > Tcl (dunno how rare that is, but the code still supports this setup). > >> I and others actually suggested it seriously in the past. Now, >> admittedly, at least IDLE seems better maintained than it used to >> be -- not sure about Tkinter itself. >> >> Nor do I think the tone of his message here is acceptable - regardless >>> of how annoyed he is, posting insults ("no-one gives a damn") about >>> volunteer contributors in a public mailing list isn't reasonable or >>> constructive. Call that "playing speech police" if you want, but I >>> think that being offended or annoyed and saying so is perfectly >>> reasonable. >>> >> Will all due respect, it's sometimes unpredictable what kind of wording >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity >> there that doesn't exist in other cultures. To me, "not give a damn" >> reads like a familiar version of "not care about something", but >> apparently it can be offensive. >> > Confirm, never meant this as an insult. 
> > I had to use emotional language to drive the point home that it's not some > nitpick, it really causes people serious trouble (I lost a source of > income, for the record). > Without the emotional impact, my message could easily be ignored as some > noise not worth attention. This time, it's just too damn important to allow > this possibility. > > With respect, I would say you CHOSE to use emotional language. I don't see that much indication that its absence had failed to produce responses, though they may not have been the responses you wanted. Unfortunately the developers are rather too used to this kind of gratuitous abuse and so many of them may have overlooked your detailed analysis of the issues you were experiencing, since constructive contributions don't normally accompany such rants. The module being abandoned and unused is truly the only explanation I could > think of when seeing that glaring bugs have stayed unfixed for 15 years (an > infinity in IT), in an actively developed and highly used software. > This may be flattering for my ego, but if the module really is in any > production use to speak of, then in all these years, with all this > humongous user base, someone, somewhere in the world, at some point, should > have looked into this. I don't even program in C professionally, yet was > able to diagnose it and make a PR! > > ?I think the fact that alarm bells haven't clanged is likely a product of ?tkinter's relatively small user base, perhaps amplified by dwindling availability of "GYU in one thread" lore. Anyway they have certainly clanged now. --- > > I'll make a PR with the doc warning as Guido suggested unless there are > any better ideas. > > ?In the absence of other actions this would be a good first step. Thank you. ? > Meanwhile, I'd really appreciate any response to my other message -- it is > about actually fixing the issue, and I do need feedback to be able to > proceed. > No need to delve all the way in and give an official authorization or > something. I'm only looking for an opinion poll on which redesign option > (if any) looks like the most reasonable way to proceed and/or in line with > the big picture (the last one -- to provide a unifying vision -- is _the_ > job of a BDFL IIRC). > ?I wouldn't presume to tell Guido his job, given that I've never done it and wouldn't be capable of it. Do you want the ??5 opinion poll or the ?10 opinion poll? Let's hope nobody here wants an argument ;-) regards Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric at trueblade.com Thu May 3 07:58:49 2018 From: eric at trueblade.com (Eric V. Smith) Date: Thu, 3 May 2018 07:58:49 -0400 Subject: [Python-Dev] PEP 575: Unifying function/method classes In-Reply-To: <5AEAE2EE.9040704@UGent.be> References: <5ACF8552.6020607@UGent.be> <5AD1C725.5090603@UGent.be> <9e0f7462d6e4431faee21ba5f997f32a@xmail101.UGent.be> <5AEAE2EE.9040704@UGent.be> Message-ID: <8c143fb6-9d2f-f870-3485-0fa926b695d4@trueblade.com> On 5/3/2018 6:22 AM, Jeroen Demeyer wrote: > On 2018-05-03 11:30, Victor Stinner wrote: >> Please don't queue backward incompatible changes for Python 4.0. You >> should use the regular deprecation process. > > I don't really see how that can be done here. As Stefan said > >> The problem is that this >> change does not really fit into the deprecation cycle since there is no >> specific use case to warn about. > > The PEP proposes to change an implementation detail. 
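For readers skimming the thread, the checks under discussion are exact type tests against the C-level function type, as opposed to duck typing or inspect-based tests; a small hedged illustration follows, where math.sqrt stands in as an arbitrary C-implemented function:

    import inspect
    import types
    from math import sqrt   # an arbitrary C-implemented function

    # Exact-type checks of this flavour are the ones that would notice
    # a change in the function/method class layout:
    print(type(sqrt) is types.BuiltinFunctionType)      # True on CPython today
    print(isinstance(sqrt, types.BuiltinFunctionType))  # True on CPython today

    # Duck-typed / inspect-based checks keep working either way:
    print(callable(sqrt))           # True
    print(inspect.isroutine(sqrt))  # True for Python- and C-level routines alike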
It's really hard to > determine at runtime whether code is relying on that implementation > detail. We could insert a DeprecationWarning in some places, but those > would mostly be false positives (a DeprecationWarning is shown but the > code won't break). > > On top of that, there is no way to show a DeprecationWarning for code > like "type(x) is foo". Deprecating doesn't necessarily involve a DeprecationWarning, although if possible it should, of course. It could just be a documented deprecation. We've done this before, although I can't think of an example off the top of my head (which I realize is not exactly helpful). "If you're doing a type check involving C functions, and you're doing it , change it to because we're going to deprecate the old way in version x.y". Of course this assumes both and can coexist for several versions, which itself might not be possible. Eric From rymg19 at gmail.com Thu May 3 08:41:56 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 03 May 2018 07:41:56 -0500 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> Message-ID: <1632605fa20.2837.db5b03704c129196a4e9415e55413ce6@gmail.com> I'm hardly an expert, but AFAIK CPython's start-up issues are more due to a mix of architectural issues and the fact that it's hard to optimize imports while maintaining backwards compatibility with Python's dynamism. -- Ryan (????) Yoko Shimomura, ryo (supercell/EGOIST), Hiroyuki Sawano >> everyone else https://refi64.com/ On May 3, 2018 1:37:57 AM Glenn Linderman wrote: > On 5/2/2018 8:56 PM, Gregory Szorc wrote: >> Nobody in the project is seriously talking about a complete rewrite in >> Rust. Contributors to the project have varying opinions on how >> aggressively Rust should be utilized. People who contribute to the C >> code, low-level primitives (like storage, deltas, etc), and those who >> care about performance tend to want more Rust. One thing we almost >> universally agree on is that we want to rewrite all of Mercurial's C >> code in Rust. I anticipate that figuring out the balance between Rust >> and Python in Mercurial will be an ongoing conversation/process for >> the next few years. > Have you considered simply rewriting CPython in Rust? > > And yes, the 4th word in that question was intended to produce peals of > shocked laughter. But why Rust? Why not Go? http://esr.ibiblio.org/?p=7724 > > > > ---------- > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > From ncoghlan at gmail.com Thu May 3 10:29:55 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 4 May 2018 00:29:55 +1000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> Message-ID: On 3 May 2018 at 15:56, Glenn Linderman wrote: > On 5/2/2018 8:56 PM, Gregory Szorc wrote: > > Nobody in the project is seriously talking about a complete rewrite in > Rust. Contributors to the project have varying opinions on how aggressively > Rust should be utilized. People who contribute to the C code, low-level > primitives (like storage, deltas, etc), and those who care about > performance tend to want more Rust. 
One thing we almost universally agree > on is that we want to rewrite all of Mercurial's C code in Rust. I > anticipate that figuring out the balance between Rust and Python in > Mercurial will be an ongoing conversation/process for the next few years. > > Have you considered simply rewriting CPython in Rust? > FWIW, I'd actually like to see Rust approved as a language for writing stdlib extension modules, but actually ever making that change in policy would require a concrete motivating use case. > And yes, the 4th word in that question was intended to produce peals of > shocked laughter. But why Rust? Why not Go? > Trying to get two different garbage collection engines to play nice with each other is a recipe for significant pain, since you can easily end up with uncollectable cycles that neither GC system has complete visibility into (all it needs is a loop from PyObject A -> Go Object B -> back to PyObject A). Combining Python and Rust can still get into that kind of trouble when using reference counting on the Rust side, but it's a lot easier to avoid than it is in runtimes with mandatory GC. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From tim.peters at gmail.com Thu May 3 11:56:41 2018 From: tim.peters at gmail.com (Tim Peters) Date: Thu, 3 May 2018 10:56:41 -0500 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> Message-ID: [Mat?j Cepl ] > It absolutely impossible to remove Tkinter IMHO (it has been > part of stdlib since like forever and people expect it there; > its removal would be betrayal on the level of switching = to > :=), I have my doubts about IDLE though. I know, the same > argument applies, but really, does anybody use IDLE for > development for long time, what is its real value for the > community? Although, even this argument is questionable, because > Python has some affinity with the learning, and IDLE is a nice > for first steps nibbling into Python. IDLE isn't just for eager beginners, but also for those so old & senile they're incapable of learning anything new ever again. As proof, IDLE is still _my_ primary Python development environment, used multiple times every day, and I'm so old & out-of-it that I'm +1 on the binding expressions PEP ;-) From python at mrabarnett.plus.com Thu May 3 12:55:08 2018 From: python at mrabarnett.plus.com (MRAB) Date: Thu, 3 May 2018 17:55:08 +0100 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> Message-ID: <59fc47a7-5948-6895-6aba-a4c8e6669a31@mrabarnett.plus.com> On 2018-05-03 13:24, Steve Holden wrote: > On Thu, May 3, 2018 at 12:12 AM, Ivan Pozdeev via Python-Dev > > wrote: > > On 03.05.2018 1:01, Antoine Pitrou wrote: > > On Wed, 2 May 2018 22:54:04 +0100 > Paul Moore > wrote: > > On 2 May 2018 at 22:37, Antoine Pitrou > wrote: > > To elaborate a bit: the OP, while angry, produced both a > detailed > analysis *and* a PR.? It's normal to be angry when an > advertised > feature doesn't work and it makes you lose hours of work > (or, even, > forces you to a wholesale redesign). Producing a > detailed analysis and a > PR is more than most people will ever do. > > His *other* email seems reasonable, and warrants a response, > yes. 
But > are we to take the suggestion made here (to drop tkinter) > seriously, > based on the fact that there's a (rare - at least it appears > that the > many IDLE users haven't hit it yet) race condition that > causes a crash > in Python 2.7? (It appears that the problem doesn't happen > in the > python.org 3.x builds, if I understand > the description of the issue). > > In 3.x, Tkinter+threads is broken too, albeit in a different way -- > see https://bugs.python.org/issue33412 > (this should've been the 2nd > link in the initial message, sorry for the mix-up). > > > ?The observation in t?hat issue that tkinter and threads should be > handled in specific ways is certainly a given for old hands, who have > long put the GUI code in one thread with one or more concurrent worker > threads typically communicating through queues. But I haven't built > anything like that recently, so I couldn't say how helpful the current > documenation might be. > Interacting with the GUI only in the main thread is something that I've had to do in other languages (it is/was the recommended practice), so I naturally do the same with Python and tkinter. It's also easier to reason about because you don't get elements of the GUI changing unexpectedly. [snip] From arj.python at gmail.com Thu May 3 13:06:11 2018 From: arj.python at gmail.com (Abdur-Rahmaan Janhangeer) Date: Thu, 03 May 2018 17:06:11 +0000 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: References: Message-ID: Maybe the only 1 thing needs an update : some nice ui else, i'm glad python has a gui library as one interested in languages, py is just crazy (though i miss some android apps using it). in a GPL, a gui library is one of those extra goodies if you would browse the source codes, you'd see good old compiler theories being used (no ANTLR for example) tkinter is pretty good. as one that still believes in tkinter and has as a result of it explored many apps, what you can do with tkinter is crazy. as to no one using it; did you consider production: installing a 3rd party package for what is already integrated ah just the docs, they are not as candy as python's docs to drop it, we use what? maybe we'll all join punching some bags before we get a stable gui package in production Abdur-Rahmaan Janhangeer https://github.com/Abdur-rahmaanJ On Thu, 3 May 2018, 00:55 Ivan Pozdeev via Python-Dev, < python-dev at python.org> wrote: > As https://bugs.python.org/issue33257 and > https://bugs.python.org/issue33316 showed, Tkinter is broken, for both > Py2 and Py3, with both threaded and non-threaded Tcl, since 2002 at > least, and no-one gives a damn. > > This seems to be a testament that very few people are actually > interested in or are using it. > > If that's so, there's no use keeping it in the standard library -- if > anything, because there's not enough incentive and/or resources to > support it. And to avoid screwing people (=me) up when they have the > foolishness to think they can rely on it in their projects -- nowhere in > the docs it is said that the module is only partly functional. > > -- > > Regards, > Ivan > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/arj.python%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rymg19 at gmail.com Thu May 3 13:11:55 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 03 May 2018 12:11:55 -0500 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <59fc47a7-5948-6895-6aba-a4c8e6669a31@mrabarnett.plus.com> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <59fc47a7-5948-6895-6aba-a4c8e6669a31@mrabarnett.plus.com> Message-ID: <16326fd2778.2837.db5b03704c129196a4e9415e55413ce6@gmail.com> On May 3, 2018 11:56:24 AM MRAB wrote: > On 2018-05-03 13:24, Steve Holden wrote: >> On Thu, May 3, 2018 at 12:12 AM, Ivan Pozdeev via Python-Dev >> > wrote: >> >> On 03.05.2018 1:01, Antoine Pitrou wrote: >> >> On Wed, 2 May 2018 22:54:04 +0100 >> Paul Moore > wrote: >> >> On 2 May 2018 at 22:37, Antoine Pitrou > > wrote: >> >> To elaborate a bit: the OP, while angry, produced both a >> detailed >> analysis *and* a PR.? It's normal to be angry when an >> advertised >> feature doesn't work and it makes you lose hours of work >> (or, even, >> forces you to a wholesale redesign). Producing a >> detailed analysis and a >> PR is more than most people will ever do. >> >> His *other* email seems reasonable, and warrants a response, >> yes. But >> are we to take the suggestion made here (to drop tkinter) >> seriously, >> based on the fact that there's a (rare - at least it appears >> that the >> many IDLE users haven't hit it yet) race condition that >> causes a crash >> in Python 2.7? (It appears that the problem doesn't happen >> in the >> python.org 3.x builds, if I understand >> the description of the issue). >> >> In 3.x, Tkinter+threads is broken too, albeit in a different way -- >> see https://bugs.python.org/issue33412 >> (this should've been the 2nd >> link in the initial message, sorry for the mix-up). >> >> >> ?The observation in t?hat issue that tkinter and threads should be >> handled in specific ways is certainly a given for old hands, who have >> long put the GUI code in one thread with one or more concurrent worker >> threads typically communicating through queues. But I haven't built >> anything like that recently, so I couldn't say how helpful the current >> documenation might be. >> > Interacting with the GUI only in the main thread is something that I've > had to do in other languages (it is/was the recommended practice), so I > naturally do the same with Python and tkinter. It's also easier to > reason about because you don't get elements of the GUI changing > unexpectedly. To add to this, most GUI frameworks disallow modifications outside the main thread altogether. IIRC both GTK+ and Qt require this, or else it's undefined altogether. > > [snip] > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Ryan (????) 
Yoko Shimomura, ryo (supercell/EGOIST), Hiroyuki Sawano >> everyone else https://refi64.com/ From brett at python.org Thu May 3 14:08:22 2018 From: brett at python.org (Brett Cannon) Date: Thu, 03 May 2018 18:08:22 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <1525317985.4183457.1359051176.6E8A4764@webmail.messagingengine.com> Message-ID: On Thu, 3 May 2018 at 07:31 Nick Coghlan wrote: > On 3 May 2018 at 15:56, Glenn Linderman wrote: > >> On 5/2/2018 8:56 PM, Gregory Szorc wrote: >> >> Nobody in the project is seriously talking about a complete rewrite in >> Rust. Contributors to the project have varying opinions on how aggressively >> Rust should be utilized. People who contribute to the C code, low-level >> primitives (like storage, deltas, etc), and those who care about >> performance tend to want more Rust. One thing we almost universally agree >> on is that we want to rewrite all of Mercurial's C code in Rust. I >> anticipate that figuring out the balance between Rust and Python in >> Mercurial will be an ongoing conversation/process for the next few years. >> >> Have you considered simply rewriting CPython in Rust? >> > > FWIW, I'd actually like to see Rust approved as a language for writing > stdlib extension modules, but actually ever making that change in policy > would require a concrete motivating use case. > Eric Snow, Barry Warsaw, and I have actually discussed this as part of our weekly open source office hours as work where we tend to talk about massive ideas that would take multiple people full-time to accomplish. :) > > >> And yes, the 4th word in that question was intended to produce peals of >> shocked laughter. But why Rust? Why not Go? >> > > Trying to get two different garbage collection engines to play nice with > each other is a recipe for significant pain, since you can easily end up > with uncollectable cycles that neither GC system has complete visibility > into (all it needs is a loop from PyObject A -> Go Object B -> back to > PyObject A). > > Combining Python and Rust can still get into that kind of trouble when > using reference counting on the Rust side, but it's a lot easier to avoid > than it is in runtimes with mandatory GC. > Rust supports RAII so it shouldn't be that bad. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Thu May 3 14:24:02 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 3 May 2018 21:24:02 +0300 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <16326fd2778.2837.db5b03704c129196a4e9415e55413ce6@gmail.com> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <59fc47a7-5948-6895-6aba-a4c8e6669a31@mrabarnett.plus.com> <16326fd2778.2837.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: <37045e97-d113-38d1-5eb4-1347b14adaa6@mail.mipt.ru> On 03.05.2018 20:11, Ryan Gonzalez wrote: > On May 3, 2018 11:56:24 AM MRAB wrote: > >> On 2018-05-03 13:24, Steve Holden wrote: >>> On Thu, May 3, 2018 at 12:12 AM, Ivan Pozdeev via Python-Dev >>> > wrote: >>> >>> ??? On 03.05.2018 1:01, Antoine Pitrou wrote: >>> >>> ??????? On Wed, 2 May 2018 22:54:04 +0100 >>> ??????? Paul Moore >> > wrote: >>> >>> ??????????? On 2 May 2018 at 22:37, Antoine Pitrou >> ??????????? > wrote: >>> >>> ??????????????? To elaborate a bit: the OP, while angry, produced >>> both a >>> ??????????????? detailed >>> ??????????????? analysis *and* a PR.? 
It's normal to be angry when an >>> ??????????????? advertised >>> ??????????????? feature doesn't work and it makes you lose hours of >>> work >>> ??????????????? (or, even, >>> ??????????????? forces you to a wholesale redesign). Producing a >>> ??????????????? detailed analysis and a >>> ??????????????? PR is more than most people will ever do. >>> >>> ??????????? His *other* email seems reasonable, and warrants a >>> response, >>> ??????????? yes. But >>> ??????????? are we to take the suggestion made here (to drop tkinter) >>> ??????????? seriously, >>> ??????????? based on the fact that there's a (rare - at least it >>> appears >>> ??????????? that the >>> ??????????? many IDLE users haven't hit it yet) race condition that >>> ??????????? causes a crash >>> ??????????? in Python 2.7? (It appears that the problem doesn't happen >>> ??????????? in the >>> ??????????? python.org 3.x builds, if I understand >>> ??????????? the description of the issue). >>> >>> ??? In 3.x, Tkinter+threads is broken too, albeit in a different way -- >>> ??? see https://bugs.python.org/issue33412 >>> ??? (this should've been the 2nd >>> ??? link in the initial message, sorry for the mix-up). >>> >>> >>> ?The observation in t?hat issue that tkinter and threads should be >>> handled in specific ways is certainly a given for old hands, who have >>> long put the GUI code in one thread with one or more concurrent worker >>> threads typically communicating through queues. But I haven't built >>> anything like that recently, so I couldn't say how helpful the current >>> documenation might be. >>> >> Interacting with the GUI only in the main thread is something that I've >> had to do in other languages (it is/was the recommended practice), so I >> naturally do the same with Python and tkinter. It's also easier to >> reason about because you don't get elements of the GUI changing >> unexpectedly. > > To add to this, most GUI frameworks disallow modifications outside the > main thread altogether. IIRC both GTK+ and Qt require this, or else > it's undefined altogether. > You still need some facility (*cough*SendMessage*cough*) to send update commands to the GUI (the model->view link in MVC, presenter->view in MVP). Who and how specifically carries out these commands is unimportant, an implementation detail. Every GUI has an event/message queue exactly for that, that other threads can sent work requests into: https://doc.qt.io/qt-5.10/qcoreapplication.html#postEvent , https://developer.gnome.org/gdk3/stable/gdk3-Threads.html#gdk3-Threads.description , https://en.wikipedia.org/wiki/Event_dispatching_thread#Submitting_user_code_to_the_EDT , the aforementioned WinAPI's SendMessage() and PostMessage() just to name a few. Tcl/Tk, being arguably the oldest usable GUI toolkit in existence, has an event queue likewise but doesn't provide a complete event loop implementation, only the building blocks for it. Tkinter fills that gap with its `tk.mainloop()`. It fails to provide a working means to send work into it though. Having to use a second, duplicating event queue and poll it (=busy loop) instead is an obvious crutch. >> >> [snip] >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > > > -- > Ryan (????) 
> Yoko Shimomura, ryo (supercell/EGOIST), Hiroyuki Sawano >> everyone else > https://refi64.com/ > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From brett at python.org Thu May 3 14:31:03 2018 From: brett at python.org (Brett Cannon) Date: Thu, 03 May 2018 18:31:03 +0000 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> Message-ID: On Thu, 3 May 2018 at 01:27 Paul Moore wrote: > On 3 May 2018 at 03:26, Steven D'Aprano wrote: > > >> Will all due respect, it's sometimes unpredictable what kind of wording > >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity > >> there that doesn't exist in other cultures. To me, "not give a damn" > >> reads like a familiar version of "not care about something", but > >> apparently it can be offensive. > > > > I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to > > the point of ludicrousness to say that "no-one gives a damn" is an > > insult. This isn't 1939 when Clark Gable's famous line "Frankly my dear, > > I don't give a damn" was considered shocking. Its 2018 and to not give a > > damn is a more forceful way of saying that people don't care, that they > > are indifferent. > > Sigh. That's not what I was saying at all. I was trying to point out > that Antoine's claim that people should ignore the rhetoric and that > complaining about the attitude was unreasonable, was in itself unfair. > People have a right to point out that a mail like the OP's was badly > worded. > > > With respect to Paul, I literally cannot imagine why he thinks that > > *anyone*, not even the tkinter maintainers or developers themselves, > > ought to feel *offended* by Ivan's words. > > Personally, they didn't offend me. I don't pretend to know how others > might take them. But they *did* annoy me. I'm frankly sick of people > (not on this list) complaining that people who work on projects in > their own time, free of charge, "don't care enough" or "are ignoring > my requirement". We all do it, to an extent, and it's natural to get > frustrated, but the onus is on the person asking for help to be polite > and fair. And maybe this response was the one where I finally let that > frustration show through. I may read less email for a week or two, > just to get a break. > I had the same response as Paul: annoyed. And while Ivan thought he was using "emotional language to drive the point home that it's not some nitpick", it actually had the reverse effect on me and caused me not to care because I don't need to invite annoyance into my life when putting in my personal time into something. No one is saying people can't be upset and if you are ever upset there's something wrong; we're human beings after all. But those of us speaking up about the tone are saying that you can also wait until you're not so upset to write an email. This was never going to be resolved in an hour, so waiting an hour until you're in a better place to write an email that wasn't quite so inflammatory seems like a reasonable thing to ask. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vano at mail.mipt.ru Thu May 3 14:45:34 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 3 May 2018 21:45:34 +0300 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> Message-ID: <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> On 03.05.2018 21:31, Brett Cannon wrote: > > > On Thu, 3 May 2018 at 01:27 Paul Moore > wrote: > > On 3 May 2018 at 03:26, Steven D'Aprano > wrote: > > >> Will all due respect, it's sometimes unpredictable what kind of > wording > >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity > >> there that doesn't exist in other cultures. To me, "not give a > damn" > >> reads like a familiar version of "not care about something", but > >> apparently it can be offensive. > > > > I'm Anglo-Saxon[1], and honestly I believe that it is > thin-skinned to > > the point of ludicrousness to say that "no-one gives a damn" is an > > insult. This isn't 1939 when Clark Gable's famous line "Frankly > my dear, > > I don't give a damn" was considered shocking. Its 2018 and to > not give a > > damn is a more forceful way of saying that people don't care, > that they > > are indifferent. > > Sigh. That's not what I was saying at all. I was trying to point out > that Antoine's claim that people should ignore the rhetoric and that > complaining about the attitude was unreasonable, was in itself unfair. > People have a right to point out that a mail like the OP's was badly > worded. > > > With respect to Paul, I literally cannot imagine why he thinks that > > *anyone*, not even the tkinter maintainers or developers themselves, > > ought to feel *offended* by Ivan's words. > > Personally, they didn't offend me. I don't pretend to know how others > might take them. But they *did* annoy me. I'm frankly sick of people > (not on this list) complaining that people who work on projects in > their own time, free of charge, "don't care enough" or "are ignoring > my requirement". We all do it, to an extent, and it's natural to get > frustrated, but the onus is on the person asking for help to be polite > and fair. And maybe this response was the one where I finally let that > frustration show through. I may read less email for a week or two, > just to get a break. > > > I had the same response as Paul: annoyed. And while Ivan thought he > was using "emotional language to drive the point home that it's not > some nitpick", it actually had the reverse effect on me and caused me > not to care because I don't need to invite annoyance into my life when > putting in my personal time into something. > > No one is saying people can't be upset and if you are ever upset > there's something wrong; we're human beings after all. But those of us > speaking up about the tone are saying that you can also wait until > you're not so upset to write an email. This was never going to be > resolved in an hour, so waiting an hour until you're in a better place > to write an email that wasn't quite so inflammatory seems like a > reasonable thing to ask. > Let me express things right from the horse's mouth. The sole purpose of the tone was to not let the mesage be flat-out ignored. I had my neutral-toned, to-the-point messages to mailing lists flat-out ignored one too many times for reasons that I can only guess about. This time, the situation was too important to let that happen. Whatever anyone may think of this, it worked. 
I got my message through, and got the feedback on the topic that I needed to proceed in resolving the problem that caused it. I seriously doubt I could achieve that with a neutral-toned message just stating the facts: dry facts would not show ppl how this could be important ("ah, just another n00b struggling with Tkinter basics" or something). > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From brian at python.org Thu May 3 14:55:19 2018 From: brian at python.org (Brian Curtin) Date: Thu, 03 May 2018 18:55:19 +0000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: On Thu, May 3, 2018 at 2:45 PM Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > On 03.05.2018 21:31, Brett Cannon wrote: > > > > On Thu, 3 May 2018 at 01:27 Paul Moore wrote: > >> On 3 May 2018 at 03:26, Steven D'Aprano wrote: >> >> >> Will all due respect, it's sometimes unpredictable what kind of wording >> >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity >> >> there that doesn't exist in other cultures. To me, "not give a damn" >> >> reads like a familiar version of "not care about something", but >> >> apparently it can be offensive. >> > >> > I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to >> > the point of ludicrousness to say that "no-one gives a damn" is an >> > insult. This isn't 1939 when Clark Gable's famous line "Frankly my dear, >> > I don't give a damn" was considered shocking. Its 2018 and to not give a >> > damn is a more forceful way of saying that people don't care, that they >> > are indifferent. >> >> Sigh. That's not what I was saying at all. I was trying to point out >> that Antoine's claim that people should ignore the rhetoric and that >> complaining about the attitude was unreasonable, was in itself unfair. >> People have a right to point out that a mail like the OP's was badly >> worded. >> >> > With respect to Paul, I literally cannot imagine why he thinks that >> > *anyone*, not even the tkinter maintainers or developers themselves, >> > ought to feel *offended* by Ivan's words. >> >> Personally, they didn't offend me. I don't pretend to know how others >> might take them. But they *did* annoy me. I'm frankly sick of people >> (not on this list) complaining that people who work on projects in >> their own time, free of charge, "don't care enough" or "are ignoring >> my requirement". We all do it, to an extent, and it's natural to get >> frustrated, but the onus is on the person asking for help to be polite >> and fair. And maybe this response was the one where I finally let that >> frustration show through. I may read less email for a week or two, >> just to get a break. >> > > I had the same response as Paul: annoyed. 
And while Ivan thought he was > using "emotional language to drive the point home that it's not some > nitpick", it actually had the reverse effect on me and caused me not to > care because I don't need to invite annoyance into my life when putting in > my personal time into something. > > No one is saying people can't be upset and if you are ever upset there's > something wrong; we're human beings after all. But those of us speaking up > about the tone are saying that you can also wait until you're not so upset > to write an email. This was never going to be resolved in an hour, so > waiting an hour until you're in a better place to write an email that > wasn't quite so inflammatory seems like a reasonable thing to ask. > > Let me express things right from the horse's mouth. > > The sole purpose of the tone was to not let the mesage be flat-out ignored. > I had my neutral-toned, to-the-point messages to mailing lists flat-out > ignored one too many times for reasons that I can only guess about. > This time, the situation was too important to let that happen. > > Whatever anyone may think of this, it worked. I got my message through, > and got the feedback on the topic that I needed to proceed in resolving the > problem that caused it. > I seriously doubt I could achieve that with a neutral-toned message just > stating the facts: dry facts would not show ppl how this could be important > ("ah, just another n00b struggling with Tkinter basics" or something). > As I said on the other thread, that doesn't make it any more acceptable as over time it normalizes the behavior. If enough people want results?because yes, sometimes things break, it's not fun, and sometimes things don't receive response in the most timely fashion?they'll take that tone and sometimes get what they want. Eventually it'll work enough that it becomes more acceptable to behave that way, and eventually the people who are willing to accept that type of behavior will be gone. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu May 3 14:59:33 2018 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 03 May 2018 18:59:33 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Wed, May 2, 2018, 20:59 INADA Naoki wrote: > Recently, I reported how stdlib slows down `import requests`. > https://github.com/requests/requests/issues/4315#issuecomment-385584974 [...] > * Add faster and simpler http.parser (maybe, based on h11 [1]) and avoid > using email module in http module. > It's always risky making predictions, but hopefully by the time 3.8 is out, requests will have switched to using h11 directly instead of the http module. (Kenneth wants the big headline feature for the next major requests release to be async support, and that pretty much requires switching to something like h11.) I don't know how fast importing h11 is though... It does currently compile a bunch of regexps at import time :-). -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From facundobatista at gmail.com Thu May 3 15:01:11 2018 From: facundobatista at gmail.com (Facundo Batista) Date: Thu, 3 May 2018 16:01:11 -0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: 2018-05-02 14:24 GMT-03:00 Brett Cannon : >> Maybe we should create a tool to list features scheduled for removal, >> and open a discussion to check each removal? 
> > I don't know if a tool is necessary. We could have a meta issue or text file > somewhere to track what's to be removed in a certain version. Maybe a specific PEP that list all removals that happened, those that will happen next version, and next ones after that? IOW, a single point where one can see which features/details will fly away and when. Probably it's a good idea to see everything at once and evaluate if "we're removing too many things at once", and have a better deprecation/removal cadence. -- . Facundo Blog: http://www.taniquetil.com.ar/plog/ PyAr: http://www.python.org/ar/ Twitter: @facundobatista From gvanrossum at gmail.com Thu May 3 15:27:49 2018 From: gvanrossum at gmail.com (Guido van Rossum) Date: Thu, 03 May 2018 19:27:49 +0000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: EVENTUALLY WE'LL ALL BE SHOUTING ALL THE TIME. Sad. On Thu, May 3, 2018, 11:57 Brian Curtin wrote: > On Thu, May 3, 2018 at 2:45 PM Ivan Pozdeev via Python-Dev < > python-dev at python.org> wrote: > >> On 03.05.2018 21:31, Brett Cannon wrote: >> >> >> >> On Thu, 3 May 2018 at 01:27 Paul Moore wrote: >> >>> On 3 May 2018 at 03:26, Steven D'Aprano wrote: >>> >>> >> Will all due respect, it's sometimes unpredictable what kind of >>> wording >>> >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity >>> >> there that doesn't exist in other cultures. To me, "not give a damn" >>> >> reads like a familiar version of "not care about something", but >>> >> apparently it can be offensive. >>> > >>> > I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to >>> > the point of ludicrousness to say that "no-one gives a damn" is an >>> > insult. This isn't 1939 when Clark Gable's famous line "Frankly my >>> dear, >>> > I don't give a damn" was considered shocking. Its 2018 and to not give >>> a >>> > damn is a more forceful way of saying that people don't care, that they >>> > are indifferent. >>> >>> Sigh. That's not what I was saying at all. I was trying to point out >>> that Antoine's claim that people should ignore the rhetoric and that >>> complaining about the attitude was unreasonable, was in itself unfair. >>> People have a right to point out that a mail like the OP's was badly >>> worded. >>> >>> > With respect to Paul, I literally cannot imagine why he thinks that >>> > *anyone*, not even the tkinter maintainers or developers themselves, >>> > ought to feel *offended* by Ivan's words. >>> >>> Personally, they didn't offend me. I don't pretend to know how others >>> might take them. But they *did* annoy me. I'm frankly sick of people >>> (not on this list) complaining that people who work on projects in >>> their own time, free of charge, "don't care enough" or "are ignoring >>> my requirement". We all do it, to an extent, and it's natural to get >>> frustrated, but the onus is on the person asking for help to be polite >>> and fair. And maybe this response was the one where I finally let that >>> frustration show through. I may read less email for a week or two, >>> just to get a break. >>> >> >> I had the same response as Paul: annoyed. 
And while Ivan thought he was >> using "emotional language to drive the point home that it's not some >> nitpick", it actually had the reverse effect on me and caused me not to >> care because I don't need to invite annoyance into my life when putting in >> my personal time into something. >> >> No one is saying people can't be upset and if you are ever upset there's >> something wrong; we're human beings after all. But those of us speaking up >> about the tone are saying that you can also wait until you're not so upset >> to write an email. This was never going to be resolved in an hour, so >> waiting an hour until you're in a better place to write an email that >> wasn't quite so inflammatory seems like a reasonable thing to ask. >> >> Let me express things right from the horse's mouth. >> >> The sole purpose of the tone was to not let the mesage be flat-out >> ignored. >> I had my neutral-toned, to-the-point messages to mailing lists flat-out >> ignored one too many times for reasons that I can only guess about. >> This time, the situation was too important to let that happen. >> >> Whatever anyone may think of this, it worked. I got my message through, >> and got the feedback on the topic that I needed to proceed in resolving the >> problem that caused it. >> I seriously doubt I could achieve that with a neutral-toned message just >> stating the facts: dry facts would not show ppl how this could be important >> ("ah, just another n00b struggling with Tkinter basics" or something). >> > > As I said on the other thread, that doesn't make it any more acceptable as > over time it normalizes the behavior. If enough people want results?because > yes, sometimes things break, it's not fun, and sometimes things don't > receive response in the most timely fashion?they'll take that tone and > sometimes get what they want. Eventually it'll work enough that it becomes > more acceptable to behave that way, and eventually the people who are > willing to accept that type of behavior will be gone. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu May 3 15:55:26 2018 From: brett at python.org (Brett Cannon) Date: Thu, 03 May 2018 19:55:26 +0000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: On Thu, 3 May 2018 at 12:01 Facundo Batista wrote: > 2018-05-02 14:24 GMT-03:00 Brett Cannon : > > >> Maybe we should create a tool to list features scheduled for removal, > >> and open a discussion to check each removal? > > > > I don't know if a tool is necessary. We could have a meta issue or text > file > > somewhere to track what's to be removed in a certain version. > > Maybe a specific PEP that list all removals that happened, those that > will happen next version, and next ones after that? > I don't know if we need a history section as the What's New docs cover that. But having a PEP or some other file/doc that acts like a future What's New but for removals seems like a reasonable idea. -Brett > > IOW, a single point where one can see which features/details will fly > away and when. > > Probably it's a good idea to see everything at once and evaluate if > "we're removing too many things at once", and have a better > deprecation/removal cadence. > > -- > . 
Facundo
>
> Blog: http://www.taniquetil.com.ar/plog/
> PyAr: http://www.python.org/ar/
> Twitter: @facundobatista
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From carl.shapiro at gmail.com  Thu May  3 16:13:35 2018
From: carl.shapiro at gmail.com (Carl Shapiro)
Date: Thu, 3 May 2018 13:13:35 -0700
Subject: [Python-Dev] A fast startup patch (was: Python startup time)
Message-ID: 

Hello,

Yesterday Neil Schemenauer mentioned some work that a colleague of mine
(CCed) and I have done to improve CPython start-up time.  Given the recent
discussion, it seems timely to discuss what we are doing and whether it is
of interest to other people hacking on the CPython runtime.

There are many ways to reduce the start-up time overhead.  For this
experiment, we are specifically targeting the cost of unmarshaling heap
objects from compiled Python bytecode.  Our measurements show this
specific cost to represent 10% to 25% of the start-up time among the
applications we have examined.

Our approach to eliminating this overhead is to store unmarshaled objects
into the data segment of the python executable.  We do this by processing
the compiled python bytecode for a module, creating native object code
with the unmarshaled objects in their in-memory representation, and
linking this into the python executable.

When a module is imported, we simply return a pointer to the top-level
code object in the data segment directly without invoking the unmarshaling
code or touching the file system.  What we are doing is conceptually
similar to the existing capability to freeze a module, but we avoid
non-trivial unmarshaling costs.

The patch is still under development and there is still a little bit more
work to do.  With that in mind, the numbers look good but please take
these with a grain of salt

Baseline

$ bench "./python.exe -c ''"
benchmarking ./python.exe -c ''
time                 31.46 ms   (31.24 ms .. 31.78 ms)
                     1.000 R²   (0.999 R² .. 1.000 R²)
mean                 32.08 ms   (31.82 ms .. 32.63 ms)
std dev              778.1 μs   (365.6 μs .. 1.389 ms)

$ bench "./python.exe -c 'import difflib'"
benchmarking ./python.exe -c 'import difflib'
time                 32.82 ms   (32.64 ms .. 33.02 ms)
                     1.000 R²   (1.000 R² .. 1.000 R²)
mean                 33.17 ms   (33.01 ms .. 33.44 ms)
std dev              430.7 μs   (233.8 μs .. 675.4 μs)


With our patch

$ bench "./python.exe -c ''"
benchmarking ./python.exe -c ''
time                 24.86 ms   (24.62 ms .. 25.08 ms)
                     0.999 R²   (0.999 R² .. 1.000 R²)
mean                 25.58 ms   (25.36 ms .. 25.94 ms)
std dev              592.8 μs   (376.2 μs .. 907.8 μs)

$ bench "./python.exe -c 'import difflib'"
benchmarking ./python.exe -c 'import difflib'
time                 25.30 ms   (25.00 ms .. 25.55 ms)
                     0.999 R²   (0.998 R² .. 1.000 R²)
mean                 26.78 ms   (26.30 ms .. 27.64 ms)
std dev              1.413 ms   (747.5 μs .. 2.250 ms)
variance introduced by outliers: 20% (moderately inflated)


Here are some numbers with the patch but with the stat calls preserved to
isolate just the marshaling effects

Baseline

$ bench "./python.exe -c 'import difflib'"
benchmarking ./python.exe -c 'import difflib'
time                 34.67 ms   (33.17 ms .. 36.52 ms)
                     0.995 R²   (0.990 R² .. 1.000 R²)
mean                 35.36 ms   (34.81 ms .. 36.25 ms)
std dev              1.450 ms   (1.045 ms .. 2.133 ms)
variance introduced by outliers: 12% (moderately inflated)


With our patch (and calls to stat)

$ bench "./python.exe -c 'import difflib'"
benchmarking ./python.exe -c 'import difflib'
time                 30.24 ms   (29.02 ms .. 32.66 ms)
                     0.988 R²   (0.968 R² .. 1.000 R²)
mean                 31.86 ms   (31.13 ms .. 32.75 ms)
std dev              1.789 ms   (1.329 ms ..
2.437 ms) variance introduced by outliers: 17% (moderately inflated) (This work was done in CPython 3.6 and we are exploring back-porting to 2.7 so we can run the hg startup benchmarks in the performance test suite.) This is effectively a drop-in replacement for the frozen module capability and (so far) required only minimal changes to the runtime. To us, it seems like a very nice win without compromising on compatibility or complexity. I am happy to discuss more of the technical details until we have a public patch available. I hope this provides some optimism around the possibility of improving the start-up time of CPython. What do you all think? Kindly, Carl -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu May 3 16:13:54 2018 From: brett at python.org (Brett Cannon) Date: Thu, 03 May 2018 20:13:54 +0000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: On Thu, 3 May 2018 at 12:29 Guido van Rossum wrote: > EVENTUALLY WE'LL ALL BE SHOUTING ALL THE TIME. Sad. > Yep. And that leads to burn-out. So while Ivan may have lucked out in getting the attention of people who are helped him (and given the wrong kind of positive reinforcement that this approach is reasonable), this can lead to people quitting open source and not being available to help you next time (e.g. notice how it drove Paul Moore over the edge to pull back for a week or so and he may have been the expert you needed for packaging or me for imports; IOW I would say Ivan was lucky this time and may not be so lucky next time). -Brett > > On Thu, May 3, 2018, 11:57 Brian Curtin wrote: > >> On Thu, May 3, 2018 at 2:45 PM Ivan Pozdeev via Python-Dev < >> python-dev at python.org> wrote: >> >>> On 03.05.2018 21:31, Brett Cannon wrote: >>> >>> >>> >>> On Thu, 3 May 2018 at 01:27 Paul Moore wrote: >>> >>>> On 3 May 2018 at 03:26, Steven D'Aprano wrote: >>>> >>>> >> Will all due respect, it's sometimes unpredictable what kind of >>>> wording >>>> >> Anglo-Saxons will take as an insult, as there's lot of obsequiosity >>>> >> there that doesn't exist in other cultures. To me, "not give a damn" >>>> >> reads like a familiar version of "not care about something", but >>>> >> apparently it can be offensive. >>>> > >>>> > I'm Anglo-Saxon[1], and honestly I believe that it is thin-skinned to >>>> > the point of ludicrousness to say that "no-one gives a damn" is an >>>> > insult. This isn't 1939 when Clark Gable's famous line "Frankly my >>>> dear, >>>> > I don't give a damn" was considered shocking. Its 2018 and to not >>>> give a >>>> > damn is a more forceful way of saying that people don't care, that >>>> they >>>> > are indifferent. >>>> >>>> Sigh. That's not what I was saying at all. I was trying to point out >>>> that Antoine's claim that people should ignore the rhetoric and that >>>> complaining about the attitude was unreasonable, was in itself unfair. >>>> People have a right to point out that a mail like the OP's was badly >>>> worded. >>>> >>>> > With respect to Paul, I literally cannot imagine why he thinks that >>>> > *anyone*, not even the tkinter maintainers or developers themselves, >>>> > ought to feel *offended* by Ivan's words. >>>> >>>> Personally, they didn't offend me. I don't pretend to know how others >>>> might take them. 
But they *did* annoy me. I'm frankly sick of people >>>> (not on this list) complaining that people who work on projects in >>>> their own time, free of charge, "don't care enough" or "are ignoring >>>> my requirement". We all do it, to an extent, and it's natural to get >>>> frustrated, but the onus is on the person asking for help to be polite >>>> and fair. And maybe this response was the one where I finally let that >>>> frustration show through. I may read less email for a week or two, >>>> just to get a break. >>>> >>> >>> I had the same response as Paul: annoyed. And while Ivan thought he was >>> using "emotional language to drive the point home that it's not some >>> nitpick", it actually had the reverse effect on me and caused me not to >>> care because I don't need to invite annoyance into my life when putting in >>> my personal time into something. >>> >>> No one is saying people can't be upset and if you are ever upset there's >>> something wrong; we're human beings after all. But those of us speaking up >>> about the tone are saying that you can also wait until you're not so upset >>> to write an email. This was never going to be resolved in an hour, so >>> waiting an hour until you're in a better place to write an email that >>> wasn't quite so inflammatory seems like a reasonable thing to ask. >>> >>> Let me express things right from the horse's mouth. >>> >>> The sole purpose of the tone was to not let the mesage be flat-out >>> ignored. >>> I had my neutral-toned, to-the-point messages to mailing lists flat-out >>> ignored one too many times for reasons that I can only guess about. >>> This time, the situation was too important to let that happen. >>> >>> Whatever anyone may think of this, it worked. I got my message through, >>> and got the feedback on the topic that I needed to proceed in resolving the >>> problem that caused it. >>> I seriously doubt I could achieve that with a neutral-toned message just >>> stating the facts: dry facts would not show ppl how this could be important >>> ("ah, just another n00b struggling with Tkinter basics" or something). >>> >> >> As I said on the other thread, that doesn't make it any more acceptable >> as over time it normalizes the behavior. If enough people want >> results?because yes, sometimes things break, it's not fun, and sometimes >> things don't receive response in the most timely fashion?they'll take that >> tone and sometimes get what they want. Eventually it'll work enough that it >> becomes more acceptable to behave that way, and eventually the people who >> are willing to accept that type of behavior will be gone. >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mcepl at cepl.eu Thu May 3 16:39:34 2018 From: mcepl at cepl.eu (=?UTF-8?Q?Mat=C4=9Bj?= Cepl) Date: Thu, 03 May 2018 22:39:34 +0200 Subject: [Python-Dev] Drop/deprecate Tkinter? 
References: <20180502232822.01778283@fsol>
 <20180502233705.7249da5f@fsol>
Message-ID: 

On 2018-05-03, 15:56 GMT, Tim Peters wrote:
> IDLE isn't just for eager beginners, but also for those so old
> & senile they're incapable of learning anything new ever again.  As
> proof, IDLE is still _my_ primary Python development environment, used
> multiple times every day, and I'm so old & out-of-it that I'm +1 on
> the binding expressions PEP ;-)

Glad to find that such a person exists! I thought that you were
just a mythical legend; I am glad to be shown otherwise.

Best,

Matěj Cepl
--
https://matej.ceplovi.cz/blog/, Jabber: mcepl at ceplovi.cz
GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8

To err is human, to purr feline.

From tjreedy at udel.edu  Thu May  3 19:44:13 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 3 May 2018 19:44:13 -0400
Subject: [Python-Dev] Dealing with tone in an email
In-Reply-To: <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru>
References: <20180502232822.01778283@fsol>
 <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol>
 <20180503022619.GB9562@ando.pearwood.info>
 <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru>
Message-ID: 

On 5/3/2018 2:45 PM, Ivan Pozdeev via Python-Dev wrote:

> Let me express things right from the horse's mouth.

Ditto, as the only person who responded on the tracker before you
posted here, the only person other than Guido to respond on the tracker
since, and the only person to collect data which mostly refute your
claims.

> The sole purpose of the tone was to not let the mesage be flat-out ignored.
> I had my neutral-toned, to-the-point messages to mailing lists flat-out
> ignored one too many times for reasons that I can only guess about.

For 24 years, from 1994 to 2017, _tkinter.c has gotten 333 revisions at
a steady rate, an average of about 1 patch per month.  Tkinter's .py
files have gotten even more -- 426 -- including 5 in the first 4 months
of this year.  I think at least 20 different core developers, including
me, have been involved over the years.  Calling this 'indifference' is
dog poo.  I believe these patches mostly or all happened from people
making relatively calm reports, like you originally did.  Other people
wait at least a month, or two or three, without *any* response to a
patch.  If they care, they write a polite note to core-mentorship or
pydev.  Then someone nearly always responds and reviews their patch.

> This time, the situation was too important to let that happen.

The situation was that the tkinter maintainer, Serhiy Storchaka, and a
tkinter user, me, had already given you initial responses.  I indicated
that I considered the issue real and Serhiy promised a review of your
patch.  The tkinter situation is that if one makes tk calls to the main
thread from other threads when using non-thread tcl/tk, bad things can
happen.  No surprise.  It turns out that some of the people above have
tried but only partly succeeded in making thread calls work even with
non-thread tcl/tk.  This was generally known, but you added the fact
that tcl has a thread compile switch.  I do appreciate knowing a
concrete reason why different people get different results with the
same code.  If your patch improves the situation, great, but very few
people will be affected.  This is really not important to people who
don't do the above, and not worth disrupting pydev.  You appear to
claim that your 2nd example, sending fake events to the main thread,
reposted to https://bugs.python.org/issue33412, fails even with thread
tcl/tk.
However, when I tested with 3.6 repository, 3.7 installed, and 3.8 repository builds, it nearly finished rather than failing immediately. When I fixed the deadlock bug, IT RAN TO COMPLETION. TO EVERYONE ELSE: If you want to be helpful, ignore the pydev threads. If you have a tcl thread build, as indicated by the 't' in 'tcl86t.dll' in /dlls, test the example, with my fix, on your machine and report on the tracker. > Whatever anyone may think of this, it worked. This makes me want to scream. I was about to respond to msg316087 on https://bugs.python.org/issue33257 when I saw the crap about removing tkinter. I considered washing my hands of the issue then and there, to avoid 'rewarding' your nastiness. But because I care about tkinter, I decided to continue on the tracker and then inform you in yesterday's response on the thread that I did so in spite of, not because of, 'this'. Since you ignored me and continue to defend 'this', I say it again, and request that you retract your 'delete tkinter' post. -- Terry Jan Reedy From greg at krypto.org Thu May 3 20:02:34 2018 From: greg at krypto.org (Gregory P. Smith) Date: Thu, 3 May 2018 17:02:34 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Wed, May 2, 2018 at 2:13 PM, Barry Warsaw wrote: > Thanks for bringing this topic up again. At $day_job, this is a highly > visible and important topic, since the majority of our command line tools > are written in Python (of varying versions from 2.7 to 3.6). Some of those > tools can take upwards of 5 seconds or more just to respond to ?help, which > causes lots of pain for developers, who complain (rightly so) up the > management chain. ;) > > We?ve done a fair bit of work to bring those numbers down without super > radical workarounds. Often there are problems not strictly related to the > Python interpreter that contribute to this. Python gets blamed, but it?s > not always the interpreter?s fault. Common issues include: > > * Modules that have import-time side effects, such as network access or > expensive creation of data structures. Python 3.7?s `-X importtime` switch > is a really wonderful way to identify the worst offenders. Once 3.7 is > released, I do plan to spend some time using this to collect data > internally so we can attack our own libraries, and perhaps put automated > performance testing into our build stack, to identify start up time > regressions. > > * pkg_resources. When you have tons of entries on sys.path, pkg_resources > does a lot of work at import time, and because of common patterns which > tend to use pkg_resources namespace package support in __init__.py files, > this just kills start up times. Of course, pkg_resources has other uses > too, so even in a purely Python 3 world (where your namespace packages can > omit the __init__.py), you?ll often get clobbered as soon as you want to > use the Basic Resource Access API. This is also pretty common, and it?s > the main reason why Brett and I created importlib.resources for 3.7 (with a > standalone API-compatible library for older Pythons). That?s one less > reason to use pkg_resources, but it doesn?t address the __init__.py use. > Brett and I have been talking about addressing that for 3.8. > > * pex - which we use as our single file zipapp tool. Especially the > interaction between pex and pkg_resources introduces pretty significant > overhead. 
My colleague Loren Carvalho created a tool called shiv which > requires at least Python 3.6, avoids the use of pkg_resources, and > implements other tricks to be much more performant than pex. Shiv is now > open source and you can find it on RTD and GitHub. > > The switch to shiv and importlib.resources can shave 25-50% off of warm > cache start up times for zipapp style executables. > > Another thing we?ve done, although I?m much less sanguine about them as a > general approach, is to move imports into functions, but we?re trying to > only use that trick on the most critical cases. > > Some import time effects can?t be changed. Decorators come to mind, and > click is a popular library for CLIs that provides some great features, but > decorators do prevent a lazy loading approach. > > > On May 1, 2018, at 20:26, Gregory Szorc wrote: > > >> You might think "what's a few milliseconds matter". But if you run > >> hundreds of commands in a shell script it adds up. git's speed is one > >> of the few bright spots in its UX, and hg's comparative slowness here is > >> a palpable disadvantage. > > Oh, for command line tools, milliseconds absolutely matter. > > > As a concrete example, I recently landed a Mercurial patch [2] that > > stubs out zope.interface to prevent the import of 9 modules on every > > `hg` invocation. > > I have a similar dastardly plan to provide a pkg_resources stub :). > > > Mercurial provides a `chg` program that essentially spins up a daemon > > `hg` process running a "command server" so the `chg` program [written in > > C - no startup overhead] can dispatch commands to an already-running > > Python/`hg` process and avoid paying the startup overhead cost. When you > > run Mercurial's test suite using `chg`, it completes *minutes* faster. > > `chg` exists mainly as a workaround for slow startup overhead. > > A couple of our developers demoed a similar approach for one of our CLIs > that almost everyone uses. It?s a big application with lots of > dependencies, so particularly vulnerable to pex and pkg_resources > overhead. While it was just a prototype, it was darn impressive to see > subsequent invocations produce output almost immediately. It?s unfortunate > that we have to utilize all these tricks to get even moderately performant > Python CLIs. > > Note that this kind of "trick" is not unique to Python. I see it used by large Java tools at work. In effect emacs has done similar things for many decades with its saved core-dump at build time. It saves a snapshot of initialized elisp interpreter state and loads that into memory instead of rerunning initialization to reproduce the state. I don't know if anyone has looked at making a similar concept of saved post-startup interpreter state for rapid loading as a memory image possible in Python. I'm don't believe we're even at the point where all state can actually accurately be captured from CPython (extension modules can do anything). When you do that kind of trick things like hash randomization tend to complicate matters and might need to be disabled. That feature may not matter for all CLI tools. -gps A few of us spent some time at last year?s core Python dev talking about > other things we could do to improve Python?s start up time, not just with > the interpreter itself, but within the larger context of the Python > ecosystem. 
Many ideas seem promising until you dive into the details, so > it?s definitely hard to imagine maintaining all of Python?s dynamic > semantics and still making it an order of magnitude faster to start up. > But that?s not an excuse to give up, and I?m hoping we can continue to > attack the problem, both in the micro and the macro, for 3.8 and beyond, > because the alternative is that Python becomes less popular as an > implementation language for CLIs. That would be sad, and definitely has a > long term impact on Python?s popularity. > > Cheers, > -Barry > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/greg% > 40krypto.org > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mingw.android at gmail.com Thu May 3 20:21:54 2018 From: mingw.android at gmail.com (Ray Donnelly) Date: Fri, 04 May 2018 00:21:54 +0000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: On Wed, May 2, 2018 at 6:55 PM, Nathaniel Smith wrote: > On Wed, May 2, 2018, 09:51 Gregory Szorc wrote: >> >> Correct me if I'm wrong, but aren't there downsides with regards to C >> extension compatibility to not having a shared libpython? Or does all the >> packaging tooling "just work" without a libpython? (It's possible I have my >> wires crossed up with something else regarding a statically linked Python.) > > > IIRC, the rule on Linux is that if you build an extension on a statically > built python, then it can be imported on a shared python, but not > vice-versa. Manylinux wheels are therefore always built on a static python > so that they'll work everywhere. (We should probably clean this up upstream > at some point, but there's not a lot of appetite for touching this stuff ? > very obscure, very easy to break things without realizing it, not much > upside.) > > On Windows I don't think there is such a thing as a static build, because > extensions have to link to the python dll to work at all. And on MacOS I'm > not sure, though from knowing how their linker works my guess is that all > extensions act like static extensions do on Linux. Yes, on Windows there's always a python?.dll. macOS is an interesting one. For Anaconda 5.0 I read somewhere (how's that for a useless reference - and perhaps I got the wrong end of the stick) that Python for all Unixen should use a statically linked interpreter so I happily went ahead and did that. Of course I tested it against a good few wheels at the time and everything seemed fine (well, no worse than the usual binary compatibility woes at least) so I went ahead with it. Now that Python 3.7 is around the corner we have a chance to re-evaluate this decision. We have received no binary compat. bugs whatsoever due to this change (we got a few bugs where people used python-config incorrectly either directly or via swig or CMake), were we just lucky? Anyway, it is obviously safer for us to do what upstream does and I will try to post some benchmarks of static vs shared to the list so we can discuss it. I guess it is a little late in the release schedule to propose any such change for 3.7? If not I will try to prepare something. I will discuss it in depth with the rest of the AD team soon too. 
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mingw.android%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From lukasz at langa.pl Thu May 3 20:22:39 2018 From: lukasz at langa.pl (Lukasz Langa) Date: Thu, 3 May 2018 17:22:39 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> > On May 2, 2018, at 8:57 PM, INADA Naoki wrote: > > Recently, I reported how stdlib slows down `import requests`. > https://github.com/requests/requests/issues/4315#issuecomment-385584974 > > For Python 3.8, my ideas for faster startup time are: > > * Add lazy compiling API or flag in `re` module. The pattern is compiled > when first used. How about go the other way and allow compiling at Python *compile*-time? That would actually make things faster instead of just moving the time spent around. I do see value in being less eager in Python but I think the real wins are hiding behind ahead-of-time compilation. - ? From greg at krypto.org Thu May 3 20:24:03 2018 From: greg at krypto.org (Gregory P. Smith) Date: Thu, 3 May 2018 17:24:03 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: +1 to the concept! On Thu, May 3, 2018 at 1:13 PM, Carl Shapiro wrote: > Hello, > > Yesterday Neil Schemenauer mentioned some work that a colleague of mine > (CCed) and I have done to improve CPython start-up time. Given the recent > discussion, it seems timely to discuss what we are doing and whether it is > of interest to other people hacking on the CPython runtime. > > There are many ways to reduce the start-up time overhead. For this > experiment, we are specifically targeting the cost of unmarshaling heap > objects from compiled Python bytecode. Our measurements show this specific > cost to represent 10% to 25% of the start-up time among the applications we > have examined. > > Our approach to eliminating this overhead is to store unmarshaled objects > into the data segment of the python executable. We do this by processing > the compiled python bytecode for a module, creating native object code with > the unmarshaled objects in their in-memory representation, and linking this > into the python executable. > > When a module is imported, we simply return a pointer to the top-level > code object in the data segment directly without invoking the unmarshaling > code or touching the file system. What we are doing is conceptually > similar to the existing capability to freeze a module, but we avoid > non-trivial unmarshaling costs. > > The patch is still under development and there is still a little bit more > work to do. With that in mind, the numbers look good but please take these > with a grain of salt > > Baseline > > $ bench "./python.exe -c ''" > benchmarking ./python.exe -c '' > time 31.46 ms (31.24 ms .. 31.78 ms) > 1.000 R? (0.999 R? .. 1.000 R?) > mean 32.08 ms (31.82 ms .. 32.63 ms) > std dev 778.1 ?s (365.6 ?s .. 1.389 ms) > > $ bench "./python.exe -c 'import difflib'" > benchmarking ./python.exe -c 'import difflib' > time 32.82 ms (32.64 ms .. 33.02 ms) > 1.000 R? (1.000 R? .. 1.000 R?) > mean 33.17 ms (33.01 ms .. 33.44 ms) > std dev 430.7 ?s (233.8 ?s .. 
675.4 µs) > > > With our patch > > $ bench "./python.exe -c ''" > benchmarking ./python.exe -c '' > time 24.86 ms (24.62 ms .. 25.08 ms) > 0.999 R² (0.999 R² .. 1.000 R²) > mean 25.58 ms (25.36 ms .. 25.94 ms) > std dev 592.8 µs (376.2 µs .. 907.8 µs) > > $ bench "./python.exe -c 'import difflib'" > benchmarking ./python.exe -c 'import difflib' > time 25.30 ms (25.00 ms .. 25.55 ms) > 0.999 R² (0.998 R² .. 1.000 R²) > mean 26.78 ms (26.30 ms .. 27.64 ms) > std dev 1.413 ms (747.5 µs .. 2.250 ms) > variance introduced by outliers: 20% (moderately inflated) > > > Here are some numbers with the patch but with the stat calls preserved to > isolate just the marshaling effects > > Baseline > > $ bench "./python.exe -c 'import difflib'" > benchmarking ./python.exe -c 'import difflib' > time 34.67 ms (33.17 ms .. 36.52 ms) > 0.995 R² (0.990 R² .. 1.000 R²) > mean 35.36 ms (34.81 ms .. 36.25 ms) > std dev 1.450 ms (1.045 ms .. 2.133 ms) > variance introduced by outliers: 12% (moderately inflated) > > > With our patch (and calls to stat) > > $ bench "./python.exe -c 'import difflib'" > benchmarking ./python.exe -c 'import difflib' > time 30.24 ms (29.02 ms .. 32.66 ms) > 0.988 R² (0.968 R² .. 1.000 R²) > mean 31.86 ms (31.13 ms .. 32.75 ms) > std dev 1.789 ms (1.329 ms .. 2.437 ms) > variance introduced by outliers: 17% (moderately inflated) > > > (This work was done in CPython 3.6 and we are exploring back-porting to > 2.7 so we can run the hg startup benchmarks in the performance test suite.) > > This is effectively a drop-in replacement for the frozen module capability > and (so far) required only minimal changes to the runtime. To us, it seems > like a very nice win without compromising on compatibility or complexity. > I am happy to discuss more of the technical details until we have a public > patch available. > > I hope this provides some optimism around the possibility of improving the > start-up time of CPython. What do you all think? > > Kindly, > > Carl > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > greg%40krypto.org > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg at krypto.org Thu May 3 20:43:26 2018 From: greg at krypto.org (Gregory P. Smith) Date: Thu, 3 May 2018 17:43:26 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> Message-ID: On Thu, May 3, 2018 at 5:22 PM, Lukasz Langa wrote: > > > On May 2, 2018, at 8:57 PM, INADA Naoki wrote: > > > > Recently, I reported how stdlib slows down `import requests`. > > https://github.com/requests/requests/issues/4315#issuecomment-385584974 > > > > For Python 3.8, my ideas for faster startup time are: > > > > * Add lazy compiling API or flag in `re` module. The pattern is compiled > > when first used. > > How about go the other way and allow compiling at Python *compile*-time? > That would actually make things faster instead of just moving the time > spent around. > > I do see value in being less eager in Python but I think the real wins are > hiding behind ahead-of-time compilation. > Agreed in concept. We've got a lot of unused letters that could be new string prefixes...
(ugh) I'd also like to see this concept somehow extended to decorators so that the results of the decoration can be captured in the compiled pyc rather than requiring execution at import time. I realize that limits what decorators can do, but the evil things they could do that this would eliminate are things they just shouldn't be doing in most situations. meaning: there would probably be two types of decorators... colons seem to be all the rage these days so we could add an @: operator for that. :P ... Along with a from __future__ import to change the behavior or all decorators in a file from runtime to compile time by default. from __future__ import compile_time_decorators # we'd be unlikely to ever change the default and break things, __future__ seems wrong @this_happens_at_compile_time(3) def ... @:this_waits_until_runtime(5) def ... Just a not-so-wild idea, no idea if this should become a PEP for 3.8. (the : syntax is a joke - i'd prefer @@ so it looks like eyeballs) If this were done to decorators, you can imagine extending that concept to something similar to allow compile time re.compile calls as some form of assignment decorator: GREGS_RE = @re.compile(r'A regex compiled at compile time\. number = \d+') -gps > - ? > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > greg%40krypto.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Thu May 3 20:55:42 2018 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 4 May 2018 10:55:42 +1000 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> Message-ID: On Fri, May 4, 2018 at 10:43 AM, Gregory P. Smith wrote: > I'd also like to see this concept somehow extended to decorators so that the > results of the decoration can be captured in the compiled pyc rather than > requiring execution at import time. I realize that limits what decorators > can do, but the evil things they could do that this would eliminate are > things they just shouldn't be doing in most situations. meaning: there > would probably be two types of decorators... colons seem to be all the rage > these days so we could add an @: operator for that. :P ... Along with a from > __future__ import to change the behavior or all decorators in a file from > runtime to compile time by default. > > from __future__ import compile_time_decorators # we'd be unlikely to ever > change the default and break things, __future__ seems wrong > > @this_happens_at_compile_time(3) > def ... > > @:this_waits_until_runtime(5) > def ... > > Just a not-so-wild idea, no idea if this should become a PEP for 3.8. (the > : syntax is a joke - i'd prefer @@ so it looks like eyeballs) At this point, we're squarely in python-ideas territory, but there are some possibilities. Imagine popping this line of code at the bottom of your file: import importlib; importlib.freeze_module() as a declaration that the dictionary for this module is now locked in and can be dumped out in whatever form is most efficient. Effectively, you're stating that you do not need any sort of dynamism (that call could be easily disabled for testing), and that, if the optimization breaks anything, you accept responsibility for it. 
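Just to sketch the declaration side with what exists today (the freeze_module() call above is hypothetical; reassigning a module's __class__ to a ModuleType subclass is real and supported), something like this could enforce the "no dynamism" promise at runtime, while the pyc-level dumping would remain up to the interpreter:

    import sys
    import types

    class _FrozenModule(types.ModuleType):
        """Module whose namespace refuses any further rebinding."""
        def __setattr__(self, name, value):
            raise AttributeError("module %r is frozen" % self.__name__)

    def freeze_module(name):
        # Call as freeze_module(__name__) at the bottom of a module.
        # This only enforces the declaration; it does not by itself
        # enable any dump-the-dict optimization.
        sys.modules[name].__class__ = _FrozenModule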
How this would be implemented, I'm not sure, but that's no different from the @: idea. ChrisA From wes.turner at gmail.com Thu May 3 20:58:28 2018 From: wes.turner at gmail.com (Wes Turner) Date: Thu, 3 May 2018 20:58:28 -0400 Subject: [Python-Dev] Drop/deprecate Tkinter? In-Reply-To: <20180503063338.GF9562@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503063338.GF9562@ando.pearwood.info> Message-ID: On Thursday, May 3, 2018, Steven D'Aprano wrote: > On Thu, May 03, 2018 at 07:39:05AM +0200, Mat?j Cepl wrote: > > > I have my doubts about IDLE though. I know, the same > > argument applies, but really, does anybody use IDLE for > > development for long time > > Yes, tons of beginners use it. On the tutor and python-list mailing > lists, there are plenty of questions from people using IDLE. Turtle is built with Tkinter: https://docs.python.org/3/library/turtle.html IIRC, I used IDLE when first learning Python. Dive Into Python 3 recommends IDLE. http://www.diveintopython3.net/installing-python.html#idle Hopefully this bug is fixed by someone. PyQt and PySide: - are more accessible to screen readers - support QThread, QThreadPool, QRunnable - quamash is an asyncio event loop with/for Qt - PyQtGraph > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > wes.turner%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.jerdonek at gmail.com Thu May 3 21:44:39 2018 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Thu, 3 May 2018 18:44:39 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> Message-ID: FYI, a lot of these ideas were discussed back in September and October of 2017 on this list if you search the subject lines for "startup" e.g. starting here and here: https://mail.python.org/pipermail/python-dev/2017-September/149150.html https://mail.python.org/pipermail/python-dev/2017-October/149670.html At the end Guido kicked (at least part of) the discussion back to python-ideas. --Chris On Thu, May 3, 2018 at 5:55 PM, Chris Angelico wrote: > On Fri, May 4, 2018 at 10:43 AM, Gregory P. Smith wrote: > > I'd also like to see this concept somehow extended to decorators so that > the > > results of the decoration can be captured in the compiled pyc rather than > > requiring execution at import time. I realize that limits what > decorators > > can do, but the evil things they could do that this would eliminate are > > things they just shouldn't be doing in most situations. meaning: there > > would probably be two types of decorators... colons seem to be all the > rage > > these days so we could add an @: operator for that. :P ... Along with a > from > > __future__ import to change the behavior or all decorators in a file from > > runtime to compile time by default. > > > > from __future__ import compile_time_decorators # we'd be unlikely to > ever > > change the default and break things, __future__ seems wrong > > > > @this_happens_at_compile_time(3) > > def ... > > > > @:this_waits_until_runtime(5) > > def ... > > > > Just a not-so-wild idea, no idea if this should become a PEP for 3.8. 
> (the > > : syntax is a joke - i'd prefer @@ so it looks like eyeballs) > > At this point, we're squarely in python-ideas territory, but there are > some possibilities. Imagine popping this line of code at the bottom of > your file: > > import importlib; importlib.freeze_module() > > as a declaration that the dictionary for this module is now locked in > and can be dumped out in whatever form is most efficient. Effectively, > you're stating that you do not need any sort of dynamism (that call > could be easily disabled for testing), and that, if the optimization > breaks anything, you accept responsibility for it. > > How this would be implemented, I'm not sure, but that's no different > from the @: idea. > > ChrisA > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > chris.jerdonek%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From paddy3118 at gmail.com Fri May 4 03:17:48 2018 From: paddy3118 at gmail.com (Paddy McCarthy) Date: Fri, 04 May 2018 07:17:48 +0000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: > > Whatever anyone may think of this, it worked. I help on other forums and have two practises that I work at: When asking a question I try to be polite. It may be more challenging to be ultra polite but sometimes it's worthwhile. Being told I am polite online is a much rarer accolade and can lift my day :-) On those being brusk/abusive in asking for help then I don't like to reward that behaviour and someone will tell them it's not necessary. If repeated then the best one can do is ignore them leaving them knowing why. Hopefully it leads to a better community. People have to understand **"Help on the helpers terms".** From steve at holdenweb.com Fri May 4 03:23:19 2018 From: steve at holdenweb.com (Steve Holden) Date: Fri, 4 May 2018 08:23:19 +0100 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: On Thu, May 3, 2018 at 9:13 PM, Brett Cannon wrote: > > > On Thu, 3 May 2018 at 12:29 Guido van Rossum wrote: > >> EVENTUALLY WE'LL ALL BE SHOUTING ALL THE TIME. Sad. >> > > Yep. And that leads to burn-out. So while Ivan may have lucked out in > getting the attention of people who are helped him (and given the wrong > kind of positive reinforcement that this approach is reasonable), this can > lead to people quitting open source and not being available to help you > next time (e.g. notice how it drove Paul Moore over the edge to pull back > for a week or so and he may have been the expert you needed for packaging > or me for imports; IOW I would say Ivan was lucky this time and may not be > so lucky next time). > > -Brett > > ?Yup. Tolerance has to have its limits, and this came close to abusive behaviour. ? 
?I suspect others among us might have been guilty of similar behaviours in the past I certainly couldn't cast the first stone), but times change as do standards, and it's certainly not the tone the majority of the readers of this list would want, I believe. Let's hope this message continues to come across loud and clear. ?regards Steve ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Fri May 4 06:00:10 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 4 May 2018 12:00:10 +0200 Subject: [Python-Dev] Python linkage on macOS References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <20180504120010.00546151@fsol> On Fri, 04 May 2018 00:21:54 +0000 Ray Donnelly wrote: > > Yes, on Windows there's always a python?.dll. > > macOS is an interesting one. For Anaconda 5.0 I read somewhere (how's that > for a useless reference - and perhaps I got the wrong end of the stick) > that Python for all Unixen should use a statically linked interpreter so I > happily went ahead and did that. A statically linked Python can also be significantly faster (10 to 20% IIRC, more perhaps on ARM). I think you already know about that :-) > Anyway, it is obviously safer for us to do what upstream does and I will > try to post some benchmarks of static vs shared to the list so we can > discuss it. I have no idea what our default builds do on macOS, I'll let Ned Deily or another mac expert answer (changing the topic in the hope he notices this subthread :-)). Regards Antoine. From solipsis at pitrou.net Fri May 4 06:14:21 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 4 May 2018 12:14:21 +0200 Subject: [Python-Dev] Dealing with tone in an email References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> Message-ID: <20180504121421.6e6f06e2@fsol> On Fri, 4 May 2018 08:23:19 +0100 Steve Holden wrote: > > ?Yup. Tolerance has to have its limits, and this came close to abusive > behaviour. ? > > ?I suspect others among us might have been guilty of similar behaviours in > the past I certainly couldn't cast the first stone), but times change as do > standards, and it's certainly not the tone the majority of the readers of > this list would want, I believe. > > Let's hope this message continues to come across loud and clear. Since Terry Reedy (the IDLE maintainer) responded and refuted (some of?) the OP's assertions, I guess the case is settled. But that's not the same thing as threatening a poster with some accusations of CoC violation, just because the poster happened to use a familiar expression. Personally, I don't want people to be intimidated away from contributing because their English expression differs from that of the dominant (or, rather, most vocal and/or better organized) fraction. Regards Antoine. From vstinner at redhat.com Fri May 4 06:56:57 2018 From: vstinner at redhat.com (Victor Stinner) Date: Fri, 4 May 2018 12:56:57 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: 2018-05-02 19:24 GMT+02:00 Brett Cannon : > On Wed, 2 May 2018 at 02:12 Victor Stinner wrote: >> Does it mean that the Python 3 release following Python 2 end-of-life >> (2020) will be our next feared "Python 4"? Are we going to remove all >> deprecated features at once, to maximize incompatibilities and make >> users unhappy? 
> > I don't see why removing features that already raise a DeprecationWarning > would require bumping the major version number. Personally, I assumed either > Python 3.9 or 3.10 would be the version where we were okay clearing out the > stuff that had been raising DeprecationWarning for years. Sorry, when I wrote "Python 4" I mean "the new Python release which introduces a lot of backward incompatible changes and will annoy everyone". It can be Python 3.9 or 3.10, or whatever version (including 4.3 if you want) :-) My point is that deprecating a feature is one thing, removing it is something else. We should slow down feature removal, or more generally reduce the number of backward incompatible changes per release. Maybe keep a deprecating warning for 10 years is just fine. Extract of the Zen of Python: "Although practicality beats purity." ;-) Victor From mingw.android at gmail.com Fri May 4 08:10:46 2018 From: mingw.android at gmail.com (Ray Donnelly) Date: Fri, 4 May 2018 13:10:46 +0100 Subject: [Python-Dev] Python linkage on macOS In-Reply-To: <20180504120010.00546151@fsol> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <20180504120010.00546151@fsol> Message-ID: On Fri, May 4, 2018 at 11:00 AM, Antoine Pitrou wrote: > On Fri, 04 May 2018 00:21:54 +0000 > Ray Donnelly wrote: >> >> Yes, on Windows there's always a python?.dll. >> >> macOS is an interesting one. For Anaconda 5.0 I read somewhere (how's that >> for a useless reference - and perhaps I got the wrong end of the stick) >> that Python for all Unixen should use a statically linked interpreter so I >> happily went ahead and did that. > > A statically linked Python can also be significantly faster (10 to 20% > IIRC, more perhaps on ARM). I think you already know about that :-) > Indeed, and it worked out well on Intel too. Thanks for the recommendation. >> Anyway, it is obviously safer for us to do what upstream does and I will >> try to post some benchmarks of static vs shared to the list so we can >> discuss it. > > I have no idea what our default builds do on macOS, I'll let Ned Deily > or another mac expert answer (changing the topic in the hope he notices > this subthread :-)). > And thanks for doing this. For the benchmarks I think I should build Python 3.6.5 (or would 3.7.0b4 be better?) from pyperformance built each way using the AD scripts and reply here with the results. If I do not get it done today then I hope to get them ready by Monday. > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mingw.android%40gmail.com From ncoghlan at gmail.com Fri May 4 08:14:20 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 4 May 2018 22:14:20 +1000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On 4 May 2018 at 06:13, Carl Shapiro wrote: > Hello, > > Yesterday Neil Schemenauer mentioned some work that a colleague of mine > (CCed) and I have done to improve CPython start-up time. Given the recent > discussion, it seems timely to discuss what we are doing and whether it is > of interest to other people hacking on the CPython runtime. > > There are many ways to reduce the start-up time overhead. For this > experiment, we are specifically targeting the cost of unmarshaling heap > objects from compiled Python bytecode. 
Our measurements show this specific > cost to represent 10% to 25% of the start-up time among the applications we > have examined. > > Our approach to eliminating this overhead is to store unmarshaled objects > into the data segment of the python executable. We do this by processing > the compiled python bytecode for a module, creating native object code with > the unmarshaled objects in their in-memory representation, and linking this > into the python executable. > > When a module is imported, we simply return a pointer to the top-level > code object in the data segment directly without invoking the unmarshaling > code or touching the file system. What we are doing is conceptually > similar to the existing capability to freeze a module, but we avoid > non-trivial unmarshaling costs. > This definitely seems interesting, but is it something you'd be seeing us being able to take advantage of for conventional Python installations, or is it more something you'd expect to be useful for purpose-built interpreter instances? (e.g. if Mercurial were running their own Python, they could precache the heap objects for their commonly imported modules in their custom interpreter binary, regardless of whether those were standard library modules or not). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at holdenweb.com Fri May 4 08:33:15 2018 From: steve at holdenweb.com (Steve Holden) Date: Fri, 4 May 2018 13:33:15 +0100 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <20180504121421.6e6f06e2@fsol> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <5a7c46ef-122d-acdc-8507-a8afc22b0001@mail.mipt.ru> <20180504121421.6e6f06e2@fsol> Message-ID: Me neither, but I do want people to accept that there are norms, which should usually be observed. S Steve Holden On Fri, May 4, 2018 at 11:14 AM, Antoine Pitrou wrote: > On Fri, 4 May 2018 08:23:19 +0100 > Steve Holden wrote: > > > > ?Yup. Tolerance has to have its limits, and this came close to abusive > > behaviour. ? > > > > ?I suspect others among us might have been guilty of similar behaviours > in > > the past I certainly couldn't cast the first stone), but times change as > do > > standards, and it's certainly not the tone the majority of the readers of > > this list would want, I believe. > > > > Let's hope this message continues to come across loud and clear. > > Since Terry Reedy (the IDLE maintainer) responded and refuted (some > of?) the OP's assertions, I guess the case is settled. > > But that's not the same thing as threatening a poster with some > accusations of CoC violation, just because the poster happened to use a > familiar expression. > > Personally, I don't want people to be intimidated away from contributing > because their English expression differs from that of the dominant (or, > rather, most vocal and/or better organized) fraction. > > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > steve%40holdenweb.com > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve at pearwood.info Fri May 4 09:14:03 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 4 May 2018 23:14:03 +1000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: <20180504131403.GJ9562@ando.pearwood.info> On Fri, May 04, 2018 at 12:56:57PM +0200, Victor Stinner wrote: > Sorry, when I wrote "Python 4" I mean "the new Python release which > introduces a lot of backward incompatible changes and will annoy > everyone". It can be Python 3.9 or 3.10, or whatever version > (including 4.3 if you want) :-) I call that "Python 4000", in analogy with Python 3000 which became Python 3, and to further emphasise how far away it is, I've started calling it "Python 5000". As I understand it, Guido has said that we won't be doing a repeat of the 2 to 3 break-a-lot-of-stuff-at-once transition unless there is some unforeseen necessity. > My point is that deprecating a feature is one thing, removing it is > something else. > > We should slow down feature removal, or more generally reduce the > number of backward incompatible changes per release. Have there been many features removed since 3.1? I know there were some features removed in 3.0, like callable(), which were later put back in, but I can't think of anythin removed since then. If there were, the pace of it is pretty slow. > Maybe keep a deprecating warning for 10 years is just fine. > > Extract of the Zen of Python: "Although practicality beats purity." ;-) Aside from things that need to be removed for security reasons, this seems good to me. -- Steve From nad at python.org Fri May 4 09:52:17 2018 From: nad at python.org (Ned Deily) Date: Fri, 4 May 2018 09:52:17 -0400 Subject: [Python-Dev] Python linkage on macOS In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <20180504120010.00546151@fsol> Message-ID: <68D4F6C9-83BF-4D8B-A5B4-9334D49A828E@python.org> On May 4, 2018, at 08:10, Ray Donnelly wrote: > On Fri, May 4, 2018 at 11:00 AM, Antoine Pitrou wrote: >> On Fri, 04 May 2018 00:21:54 +0000 >> Ray Donnelly wrote: >>> Anyway, it is obviously safer for us to do what upstream does and I will >>> try to post some benchmarks of static vs shared to the list so we can >>> discuss it. >> I have no idea what our default builds do on macOS, I'll let Ned Deily >> or another mac expert answer (changing the topic in the hope he notices >> this subthread :-)). > And thanks for doing this. For the benchmarks I think I should build > Python 3.6.5 (or would 3.7.0b4 be better?) from pyperformance built > each way using the AD scripts and reply here with the results. If I do > not get it done today then I hope to get them ready by Monday. The macOS python interpreters provided by python.org binary installers have always (for a very long time of always) been built as shared, in particular the special macOS framework build configuration. It would be very interesting to do Apple to Apple comparisons of shared vs static builds on macOS. I would look forward to seeing any results you have, Ray, and your methodology. Static builds is on my list of things to look at for 3.8. -- Ned Deily nad at python.org -- [] From steve at pearwood.info Fri May 4 11:43:00 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 5 May 2018 01:43:00 +1000 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) 
In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> Message-ID: <20180504154243.GA9539@ando.pearwood.info> On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > No one is saying people can't be upset and if you are ever upset there's > something wrong; we're human beings after all. But those of us speaking up > about the tone are saying that you can also wait until you're not so upset > to write an email. This was never going to be resolved in an hour, so > waiting an hour until you're in a better place to write an email that > wasn't quite so inflammatory seems like a reasonable thing to ask. Certainly! I'm not defending Ivan's initial email. His tantrum *was* annoying, unreasonable, and unfair to those who do care about tkinter. He could have done better. But *we* should be better too. Our response to Ivan has not been welcoming, and as a community we haven't lived up to our own standards, as we have piled onto him to express our rightous indignation: 1. Guido responded telling Ivan to calm down and work off his frustration elsewhere. And that's where things should have stopped, unless Ivan had persisted in his impoliteness. 2. Brian upped the ante by bringing the CoC into discussion. 3. Paul raised it again by describing Ivan's post as "offensive". 4. And now, Steve H has claimed that Ivan's initial post was bordering on "abusive". We've gone from rightly treating Ivan's post as intemperate and impolite, and telling him to chill, to calling his post "offensive", to "abusive". (Next, I presume, someone will claim to be traumatised by Ivan's email.) Just as Ivan should have waited until he had calmed down before firing off his rant, so we ought to resist the temptation to strike back with hostility at trivial social transgressions, especially from newcomers. This is what Ivan actually said: - Tkinter is broken and partly functional (an opinion with only the most tenuous connection with fact, but hardly abusive); - that nobody cares (factually wrong, but not abusive); - that possibly nobody is using it (factually wrong, but not abusive); - that if that's the case (it isn't), then it should be removed from the std lib (a reasonable suggestion if only the premise had been correct). Intemperate and impolite it certainly was, as well as full of factual inaccuracies, but to call it "close to abusive" is a hostile over- reaction. We ought to be kinder than that. Our response to Ivan has been more hostile, and less open and respectful, than his email that triggered the response. Brett is right to say people can afford to wait a little while before firing off an angry email. But the same applies to us: we too can afford to wait a little while before raising the threat of the CoC over a minor social faux pas. This community isn't so fragile that we have to jump down the throat of a newcomer lest the community immediately collapses into Call Of Duty gamer culture. -- Steve From storchaka at gmail.com Fri May 4 12:00:17 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 4 May 2018 19:00:17 +0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: <20180504131403.GJ9562@ando.pearwood.info> References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: 04.05.18 16:14, Steven D'Aprano ????: > Have there been many features removed since 3.1? 
I know there were some > features removed in 3.0, like callable(), which were later put back in, > but I can't think of anythin removed since then. If there were, the pace > of it is pretty slow. Read "What's New" documents, sections "Removed" and "Porting to Python X.Y". There is a long list in every release. From guido at python.org Fri May 4 12:04:20 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 4 May 2018 09:04:20 -0700 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: <20180504154243.GA9539@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: Thank you Steven! I assume that Brian hadn't seen my response (such crossed messages due to delivery delays are very common in this mailing list). I'd like to use your email (nearly) verbatim to start off the discussion about civility we're going to have at the Language Summit. On Fri, May 4, 2018 at 8:43 AM, Steven D'Aprano wrote: > On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > > > No one is saying people can't be upset and if you are ever upset there's > > something wrong; we're human beings after all. But those of us speaking > up > > about the tone are saying that you can also wait until you're not so > upset > > to write an email. This was never going to be resolved in an hour, so > > waiting an hour until you're in a better place to write an email that > > wasn't quite so inflammatory seems like a reasonable thing to ask. > > Certainly! > > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. He could > have done better. > > But *we* should be better too. Our response to Ivan has not been > welcoming, and as a community we haven't lived up to our own standards, > as we have piled onto him to express our rightous indignation: > > 1. Guido responded telling Ivan to calm down and work off his > frustration elsewhere. And that's where things should have > stopped, unless Ivan had persisted in his impoliteness. > > 2. Brian upped the ante by bringing the CoC into discussion. > > 3. Paul raised it again by describing Ivan's post as "offensive". > > 4. And now, Steve H has claimed that Ivan's initial post was > bordering on "abusive". > > We've gone from rightly treating Ivan's post as intemperate and > impolite, and telling him to chill, to calling his post "offensive", to > "abusive". (Next, I presume, someone will claim to be traumatised by > Ivan's email.) > > Just as Ivan should have waited until he had calmed down before firing > off his rant, so we ought to resist the temptation to strike back with > hostility at trivial social transgressions, especially from newcomers. > This is what Ivan actually said: > > - Tkinter is broken and partly functional (an opinion with only the > most tenuous connection with fact, but hardly abusive); > > - that nobody cares (factually wrong, but not abusive); > > - that possibly nobody is using it (factually wrong, but not abusive); > > - that if that's the case (it isn't), then it should be removed > from the std lib (a reasonable suggestion if only the premise had > been correct). > > Intemperate and impolite it certainly was, as well as full of factual > inaccuracies, but to call it "close to abusive" is a hostile over- > reaction. We ought to be kinder than that. 
Our response to Ivan has been > more hostile, and less open and respectful, than his email that > triggered the response. > > Brett is right to say people can afford to wait a little while before > firing off an angry email. But the same applies to us: we too can afford > to wait a little while before raising the threat of the CoC over a minor > social faux pas. This community isn't so fragile that we have to jump > down the throat of a newcomer lest the community immediately collapses > into Call Of Duty gamer culture. > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From status at bugs.python.org Fri May 4 12:09:55 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 4 May 2018 18:09:55 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180504160955.3224E562BA@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-04-27 - 2018-05-04) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 6629 (+14) closed 38547 (+37) total 45176 (+51) Open issues with patches: 2593 Issues opened (34) ================== #33375: warnings: get filename from frame.f_code.co_filename https://bugs.python.org/issue33375 opened by takluyver #33376: [pysqlite] Duplicate rows can be returned after rolling back a https://bugs.python.org/issue33376 opened by cary #33379: PyImport_Cleanup is called with builtins_copy == NULL in test_ https://bugs.python.org/issue33379 opened by serhiy.storchaka #33381: Incorrect documentation for strftime()/strptime() format code https://bugs.python.org/issue33381 opened by jujuwoman #33384: Build does not work with closed stdin https://bugs.python.org/issue33384 opened by MartinHusemann #33385: setdefault() with a single argument doesn't work for dbm.gnu a https://bugs.python.org/issue33385 opened by serhiy.storchaka #33387: Simplify bytecodes for try-finally, try-except and with blocks https://bugs.python.org/issue33387 opened by Mark.Shannon #33388: Support PEP 566 metadata in dist.py https://bugs.python.org/issue33388 opened by rbricheno #33389: argparse redundant help string https://bugs.python.org/issue33389 opened by stefan #33392: pathlib .glob('*/') returns files as well as directories https://bugs.python.org/issue33392 opened by robbuckley #33393: update config.guess and config.sub https://bugs.python.org/issue33393 opened by doko #33394: the build of the shared modules is quiet/non-visible when GNU https://bugs.python.org/issue33394 opened by doko #33396: IDLE: Improve and document help doc viewer https://bugs.python.org/issue33396 opened by terry.reedy #33397: IDLE help viewer: let users control font size https://bugs.python.org/issue33397 opened by terry.reedy #33398: From, To, Cc lines break when calling send_message() https://bugs.python.org/issue33398 opened by _savage #33399: site.abs_paths should handle None __cached__ type https://bugs.python.org/issue33399 opened by demian.brecht #33400: logging.Formatter does not default to ISO8601 date format https://bugs.python.org/issue33400 opened by paulc #33403: asyncio.tasks.wait does not allow to set custom exception when 
https://bugs.python.org/issue33403 opened by pyneda #33406: [ctypes] increase the refcount of a callback function https://bugs.python.org/issue33406 opened by Zuzu_Typ #33407: Implement Py_DEPRECATED() macro for Visual Studio https://bugs.python.org/issue33407 opened by vstinner #33408: AF_UNIX is now supported in Windows https://bugs.python.org/issue33408 opened by filips123 #33409: Clarify the interaction between locale coercion & UTF-8 mode https://bugs.python.org/issue33409 opened by ncoghlan #33411: All console message are in the error output in bash interpreto https://bugs.python.org/issue33411 opened by Quentin Millardet #33413: asyncio.gather should not use special Future https://bugs.python.org/issue33413 opened by Martin.Teichmann #33414: Make shutil.copytree use os.scandir to take advantage of cache https://bugs.python.org/issue33414 opened by adelfino #33415: When add_mutually_exclusive_group is built without argument, t https://bugs.python.org/issue33415 opened by ariel-anieli #33416: Add endline and endcolumn to every AST node https://bugs.python.org/issue33416 opened by levkivskyi #33417: Isinstance() behavior is not consistent with the document https://bugs.python.org/issue33417 opened by hongweipeng #33418: Memory leaks in functions https://bugs.python.org/issue33418 opened by jdemeyer #33419: Add functools.partialclass https://bugs.python.org/issue33419 opened by NeilGirdhar #33420: __origin__ invariant broken https://bugs.python.org/issue33420 opened by apaszke #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 opened by Travis DePrato #33422: Fix and update string/byte literals in help() https://bugs.python.org/issue33422 opened by adelfino #33423: [logging] Improve consistency of logger mechanism. 
https://bugs.python.org/issue33423 opened by Daehee Kim Most recent 15 issues with no replies (15) ========================================== #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 #33418: Memory leaks in functions https://bugs.python.org/issue33418 #33417: Isinstance() behavior is not consistent with the document https://bugs.python.org/issue33417 #33416: Add endline and endcolumn to every AST node https://bugs.python.org/issue33416 #33415: When add_mutually_exclusive_group is built without argument, t https://bugs.python.org/issue33415 #33413: asyncio.gather should not use special Future https://bugs.python.org/issue33413 #33409: Clarify the interaction between locale coercion & UTF-8 mode https://bugs.python.org/issue33409 #33408: AF_UNIX is now supported in Windows https://bugs.python.org/issue33408 #33403: asyncio.tasks.wait does not allow to set custom exception when https://bugs.python.org/issue33403 #33399: site.abs_paths should handle None __cached__ type https://bugs.python.org/issue33399 #33398: From, To, Cc lines break when calling send_message() https://bugs.python.org/issue33398 #33396: IDLE: Improve and document help doc viewer https://bugs.python.org/issue33396 #33394: the build of the shared modules is quiet/non-visible when GNU https://bugs.python.org/issue33394 #33389: argparse redundant help string https://bugs.python.org/issue33389 #33388: Support PEP 566 metadata in dist.py https://bugs.python.org/issue33388 Most recent 15 issues waiting for review (15) ============================================= #33422: Fix and update string/byte literals in help() https://bugs.python.org/issue33422 #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 #33419: Add functools.partialclass https://bugs.python.org/issue33419 #33413: asyncio.gather should not use special Future https://bugs.python.org/issue33413 #33403: asyncio.tasks.wait does not allow to set custom exception when https://bugs.python.org/issue33403 #33399: site.abs_paths should handle None __cached__ type https://bugs.python.org/issue33399 #33397: IDLE help viewer: let users control font size https://bugs.python.org/issue33397 #33394: the build of the shared modules is quiet/non-visible when GNU https://bugs.python.org/issue33394 #33393: update config.guess and config.sub https://bugs.python.org/issue33393 #33388: Support PEP 566 metadata in dist.py https://bugs.python.org/issue33388 #33387: Simplify bytecodes for try-finally, try-except and with blocks https://bugs.python.org/issue33387 #33375: warnings: get filename from frame.f_code.co_filename https://bugs.python.org/issue33375 #33365: http/client.py does not print correct headers in debug https://bugs.python.org/issue33365 #33358: [EASY] x86 Ubuntu Shared 3.x: test_embed.test_pre_initializati https://bugs.python.org/issue33358 #33354: Python2: test_ssl fails on non-ASCII path https://bugs.python.org/issue33354 Top 10 most discussed issues (10) ================================= #33257: Race conditions in Tkinter with non-threaded Tcl https://bugs.python.org/issue33257 15 msgs #32797: Tracebacks from Cython modules no longer work https://bugs.python.org/issue32797 11 msgs #20104: expose posix_spawn(p) https://bugs.python.org/issue20104 8 msgs #21822: KeyboardInterrupt during Thread.join hangs that Thread https://bugs.python.org/issue21822 8 msgs #33419: Add functools.partialclass https://bugs.python.org/issue33419 8 msgs #33422: Fix and update string/byte literals in 
help() https://bugs.python.org/issue33422 7 msgs #32769: Add 'annotations' to the glossary https://bugs.python.org/issue32769 5 msgs #28167: remove platform.linux_distribution() https://bugs.python.org/issue28167 4 msgs #33012: Invalid function cast warnings with gcc 8 for METH_NOARGS https://bugs.python.org/issue33012 4 msgs #33038: GzipFile doesn't always ignore None as filename https://bugs.python.org/issue33038 4 msgs Issues closed (35) ================== #6270: Menu deletecommand fails if command is already deleted https://bugs.python.org/issue6270 closed by csabella #21474: Idle: updata fixwordbreaks() for unicode identifiers https://bugs.python.org/issue21474 closed by terry.reedy #25198: Idle: improve idle.html help viewer. https://bugs.python.org/issue25198 closed by terry.reedy #31026: test_dbm fails when run directly https://bugs.python.org/issue31026 closed by serhiy.storchaka #31802: 'import posixpath' fails if 'os.path' has not be imported alre https://bugs.python.org/issue31802 closed by Mark.Shannon #33191: Refleak in posix_spawn https://bugs.python.org/issue33191 closed by ned.deily #33254: Have importlib.resources.contents() return an interable instea https://bugs.python.org/issue33254 closed by brett.cannon #33256: module is not displayed by cgitb.html https://bugs.python.org/issue33256 closed by serhiy.storchaka #33281: ctypes.util.find_library not working on macOS https://bugs.python.org/issue33281 closed by ned.deily #33328: pdb error when stepping through generator https://bugs.python.org/issue33328 closed by Ricyteach #33332: Expose valid signal set (sigfillset()): add signal.valid_signa https://bugs.python.org/issue33332 closed by pitrou #33343: [argparse] Add subcommand abbreviations https://bugs.python.org/issue33343 closed by terry.reedy #33352: [EASY] Windows: test_regrtest fails on installed Python https://bugs.python.org/issue33352 closed by vstinner #33363: async for statement is not a syntax error in sync context https://bugs.python.org/issue33363 closed by yselivanov #33366: `contextvars` documentation incorrectly refers to "non-local s https://bugs.python.org/issue33366 closed by yselivanov #33370: Addition of mypy cache to gitignore https://bugs.python.org/issue33370 closed by brett.cannon #33373: Tkinter ttk Label background ignored on MacOS https://bugs.python.org/issue33373 closed by serhiy.storchaka #33374: generate-posix-vars failed when building Python 2.7.14 on Linu https://bugs.python.org/issue33374 closed by benjamin.peterson #33377: add new triplets for mips r6 and riscv variants https://bugs.python.org/issue33377 closed by ned.deily #33378: Add Korean to the language switcher https://bugs.python.org/issue33378 closed by mdk #33380: Update module attribute on namedtuple methods for introspectio https://bugs.python.org/issue33380 closed by a-feld #33382: make [profile-opt] failed with --enable-optimizations https://bugs.python.org/issue33382 closed by kmahyyg #33383: Crash in the get() method a single argument in dbm.ndbm https://bugs.python.org/issue33383 closed by serhiy.storchaka #33386: IDLE: Double clicking only recognizes ascii chars as identifie https://bugs.python.org/issue33386 closed by terry.reedy #33390: matmul @ operator precedence https://bugs.python.org/issue33390 closed by steven.daprano #33391: leak in set_symmetric_difference? 
https://bugs.python.org/issue33391 closed by inada.naoki #33395: TypeError: unhashable type: 'instancemethod' https://bugs.python.org/issue33395 closed by r.david.murray #33401: `exec` attribute can't be set to objects in Python2 (SyntaxErr https://bugs.python.org/issue33401 closed by ebarry #33402: Change the fractions.Fraction class to convert to a unicode fr https://bugs.python.org/issue33402 closed by gappleto97 #33404: Phone Number Generator https://bugs.python.org/issue33404 closed by zach.ware #33405: PYTHONCOERCECLOCALE no longer being respected https://bugs.python.org/issue33405 closed by vstinner #33410: Using type in a format with padding causes TypeError https://bugs.python.org/issue33410 closed by ebarry #33412: Tkinter hangs if using multiple threads and event handlers https://bugs.python.org/issue33412 closed by gvanrossum #33424: 4.4. break and continue Statements, and else Clauses on Loops https://bugs.python.org/issue33424 closed by tim.peters #33425: Library glob : Can't find a specific year with glob https://bugs.python.org/issue33425 closed by Robin Champavier From steve at pearwood.info Fri May 4 12:50:38 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 5 May 2018 02:50:38 +1000 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: <20180504165037.GL9562@ando.pearwood.info> On Fri, May 04, 2018 at 09:04:20AM -0700, Guido van Rossum wrote: > Thank you Steven! I assume that Brian hadn't seen my response (such crossed > messages due to delivery delays are very common in this mailing list). > > I'd like to use your email (nearly) verbatim to start off the discussion > about civility we're going to have at the Language Summit. Please do. I would be honoured! -- Steve From steve at holdenweb.com Fri May 4 12:53:53 2018 From: steve at holdenweb.com (Steve Holden) Date: Fri, 4 May 2018 17:53:53 +0100 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: <20180504154243.GA9539@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: On Fri, May 4, 2018 at 4:43 PM, Steven D'Aprano wrote: > On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > > > No one is saying people can't be upset and if you are ever upset there's > > something wrong; we're human beings after all. But those of us speaking > up > > about the tone are saying that you can also wait until you're not so > upset > > to write an email. This was never going to be resolved in an hour, so > > waiting an hour until you're in a better place to write an email that > > wasn't quite so inflammatory seems like a reasonable thing to ask. > > Certainly! > > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. He could > have done better. > > But *we* should be better too. Our response to Ivan has not been > welcoming, and as a community we haven't lived up to our own standards, > as we have piled onto him to express our rightous indignation: > > 1. Guido responded telling Ivan to calm down and work off his > frustration elsewhere. 
And that's where things should have > stopped, unless Ivan had persisted in his impoliteness. > > 2. Brian upped the ante by bringing the CoC into discussion. > > 3. Paul raised it again by describing Ivan's post as "offensive". > > 4. And now, Steve H has claimed that Ivan's initial post was > bordering on "abusive". > > We've gone from rightly treating Ivan's post as intemperate and > impolite, and telling him to chill, to calling his post "offensive", to > "abusive". (Next, I presume, someone will claim to be traumatised by > Ivan's email.) > > I for one hold my hand up, and will simply not respond the next time I am tempted to respond in this way. I do not wish to enter into discussion on the semantics of abuse, neither do I want to sidetrack the list from its intended purpose. regards Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From bussonniermatthias at gmail.com Fri May 4 13:57:33 2018 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Fri, 04 May 2018 17:57:33 +0000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: I would like to take a step back: I think that before removing a feature you need to make sure that the "old way" is no longer in common use. I think that in many contexts, users of Python see DeprecationWarnings as a stick. A deprecation warning means you'll have to do some work. A PEP or a document that lists all the removals may be useful, but I'm unsure whether, by reading "What's New", you'll catch all the removals of functionality you use and convince yourself (or your management) to thoroughly go through your code base and fix these things. Maybe to help the transition you need to soften the stick by being more explicit about since when things are deprecated and what can be done. Personally I find deprecations when I run test suites and use libraries interactively, not by reading documents ahead of time. Python has made huge progress in its deprecation messages. The usage of "X is deprecated, use Y" has become common, and this is already tremendously more helpful than just "X is deprecated". But when I hit a DeprecationWarning message there is one crucial piece of information missing most of the time: since which version number it's deprecated (and sometimes since when the replacement became available, which could be good if the functionality overlapped). Yes, it is available in the docs (or in the docstring), but if I have to switch context to search the documentation I might not bother. In particular, when the version is in the deprecation message, as I know the Python requirement of my project, I can immediately know whether the fix is a search-and-replace away or whether I need conditional branches. That is, at least to me, the biggest change that would push me to update the deprecated features in my own projects, and make me do drive-by contributions on the open-source projects I use where I spot one of those. There are other improvements possible to deprecation messages, but I think this one would have the largest impact for the many developers who do not closely follow Python development, and for whom DeprecationWarnings are just an extra hassle. Also, if there is a standard or conventional way _in warnings_ to say since when something is deprecated, then it's relatively easy to search for it when version N+p is released and remove what you believe is old enough.
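To make that concrete, here is a minimal sketch of the kind of message I have in mind (old_api()/new_api() and the version numbers are made-up placeholders, not an existing stdlib API):

    import warnings

    def new_api(*args, **kwargs):
        return (args, kwargs)  # stand-in for the real replacement

    def old_api(*args, **kwargs):
        warnings.warn(
            "old_api() is deprecated since 3.6 and is scheduled for removal "
            "in 3.9; use new_api() instead",
            DeprecationWarning,
            stacklevel=2,  # point the warning at the caller, not this shim
        )
        return new_api(*args, **kwargs)

Grepping for "deprecated since" (or filtering warnings on it) then gives downstream projects exactly the version cue described above.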
Thanks, -- Matthias On Fri, 4 May 2018 at 09:03, Serhiy Storchaka wrote: > 04.05.18 16:14, Steven D'Aprano ????: > > Have there been many features removed since 3.1? I know there were some > > features removed in 3.0, like callable(), which were later put back in, > > but I can't think of anythin removed since then. If there were, the pace > > of it is pretty slow. > > Read "What's New" documents, sections "Removed" and "Porting to Python > X.Y". There is a long list in every release. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Fri May 4 14:48:55 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 4 May 2018 21:48:55 +0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: 04.05.18 20:57, Matthias Bussonnier ????: > But when I hit a DeprecationWarning message there is one crucial piece of > information missing most of the time: Since which version number it's > deprecated > (and sometime since when the replacement is available could be good if > overlap > between functionality there was). I think the information about since which version number it will be removed is more useful. Different cases need different deprecation periods. The more common the case, the longer deprecation period should be. Some recently added warnings contain this information. Ideally any deprecated feature should have a replacement, and this replacement should be available in at least one version before adding the deprecation warning. X.Y: added a replacement X.Y+1: added a deprecation warning. Many users need to support only two recent versions and can move to using the replacement now. X.Y+3 (or X.Y+2): removed the deprecated feature. Versions older than X.Y should grew out of use at that moment. From njs at pobox.com Fri May 4 14:55:35 2018 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 04 May 2018 18:55:35 +0000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: On Fri, May 4, 2018, 11:50 Serhiy Storchaka wrote: > > Ideally any deprecated feature should have a replacement, and this > replacement should be available in at least one version before adding > the deprecation warning. > > X.Y: added a replacement > > X.Y+1: added a deprecation warning. Many users need to support only two > recent versions and can move to using the replacement now. > Isn't the whole point of making DeprecationWarnings hidden by default that it lets us add the new thing and deprecate the old thing in the same release, without creating too much disruption? -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Fri May 4 14:59:25 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 4 May 2018 11:59:25 -0700 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: No, the reason they're hidden by default is that for most users they're not actionable most of the time. 
On Fri, May 4, 2018 at 11:55 AM, Nathaniel Smith wrote: > On Fri, May 4, 2018, 11:50 Serhiy Storchaka wrote: > >> >> Ideally any deprecated feature should have a replacement, and this >> replacement should be available in at least one version before adding >> the deprecation warning. >> >> X.Y: added a replacement >> >> X.Y+1: added a deprecation warning. Many users need to support only two >> recent versions and can move to using the replacement now. >> > > Isn't the whole point of making DeprecationWarnings hidden by default that > it lets us add the new thing and deprecate the old thing in the same > release, without creating too much disruption? > > -n > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > > -- --Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bussonniermatthias at gmail.com Fri May 4 15:08:06 2018
From: bussonniermatthias at gmail.com (Matthias Bussonnier)
Date: Fri, 04 May 2018 19:08:06 +0000
Subject: [Python-Dev] Process to remove a Python feature
In-Reply-To: 
References: <20180504131403.GJ9562@ando.pearwood.info>
Message-ID: 

On Fri, 4 May 2018 at 11:49, Serhiy Storchaka wrote: > 04.05.18 20:57, Matthias Bussonnier ????: > > But when I hit a DeprecationWarning message there is one crucial piece of > > information missing most of the time: Since which version number it's > > deprecated > > (and sometime since when the replacement is available could be good if > > overlap > > between functionality there was). > > I think the information about since which version number it will be > removed is more useful. Different cases need different deprecation > periods. The more common the case, the longer deprecation period should > be. Some recently added warnings contain this information. >

Maybe to push people forward, but from experience it is hard to predict the future, so saying when it _will_ be removed is hard. Saying when you _want_ to remove it, probably. Victor's first mail in this thread is a good example. The functionality was marked to be removed in 3.7, but I think it is likely too late now. You can always update, but I hate giving differing information between software versions.

I'm curious about your use case for the version of removal; I usually don't care when it's going to be removed, I prefer to know since when the functionality has been deprecated.

if pyversion < deprecated_version:
    old_stuff
else:
    new_stuff

And as soon as my project drops deprecated_version, I remove the conditional. I do not use try/except, on purpose, to be able to grep for when to remove the code. Could you share your use case, or be more detailed?

We can also be more generic and say that if DeprecationWarning messages could contain timeline information it would likely encourage the migration.

One related question is how stable are DeprecationWarning messages between versions? Would updates to many of these be accepted or refused because users might be filtering on them?
-- 
Matthias

> > Ideally any deprecated feature should have a replacement, and this > replacement should be available in at least one version before adding > the deprecation warning. > > X.Y: added a replacement > > X.Y+1: added a deprecation warning. Many users need to support only two > recent versions and can move to using the replacement now.
> > X.Y+3 (or X.Y+2): removed the deprecated feature. Versions older than > X.Y should grew out of use at that moment. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Fri May 4 15:21:28 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 4 May 2018 15:21:28 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <20180504154243.GA9539@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: On 5/4/2018 11:43 AM, Steven D'Aprano wrote: > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. Ivan's email was a disinformation troll intended to jump the attention queue of core developers. He is proud of its apparent success, and seemingly unconcerned about equally apparent side-effects. In form, it reminds me of Ranting Rick Johnson's performances on python-list, where such things are more acceptable. At least Rick knows to not do the same on pydev. > He could have done better. I tried to persuade him of this by explaining how others have had the same success with respectful and factual emails. In either case, any success is because some of us care deeply about the quality of the CPython distribution and want to encourage others to join in the enterprise. > We've gone from rightly treating Ivan's post as intemperate and > impolite, and telling him to chill, to calling his post "offensive", to > "abusive". Discursive writing has two components: content and tone. I agree with you that the tone was not that bad. But I am more concerned with content. To me, posting disinformation to pydev *is* abusive of its 'authoritativeness'. This forum is read and mirrored around the world. People tend to believe what they read here. Based on past events, I will not be surprised if someone somewhere, in a blog, talk, or SO answer, quotes Ivan as a reason to not use tkinter. > (Next, I presume, someone will claim to be traumatised by > Ivan's email.) Yep, but not my concern here. > Just as Ivan should have waited until he had calmed down before firing > off his rant, so we ought to resist the temptation to strike back with > hostility at trivial social transgressions, especially from newcomers. Is disinformation and FUD really trivial? > This is what Ivan actually said: > > - Tkinter is broken and partly functional (an opinion with only the > most tenuous connection with fact, but hardly abusive); > > - that nobody cares (factually wrong, but not abusive); > > - that possibly nobody is using it (factually wrong, but not abusive); I previously responded to the first and second points above. As to the third, not only do many people use the tkinter-based IDLE and turtle frameworks (and many others), but as evidenced by Stackoverflow questions, some Python beginners dive into writing GUIs with tkinter after as little as 2 weeks exposure to Python. Given that I ignored tkinter for over a decade, I am impressed at their boldness. The following is a paraphrase of a combination of multiple things I have read and heard . "Don't use IDLE. 
Its buggy, not used much, and not maintained. Someone said so on pydev." Ivan saying the same about tkinter will possibly prompt others to imitate him. > - that if that's the case (it isn't), then it should be removed > from the std lib (a reasonable suggestion if only the premise had > been correct). I am dubious that Ivan actually believed what he wrote. It looks more like rhetorical devices rather than factual claims. Yet many people responding here treated 'the case' as plausible. This supports my contention that people tend to treat claims posted here as plausible and made in good faith. > Intemperate and impolite it certainly was, as well as full of factual > inaccuracies, but to call it "close to abusive" is a hostile over- > reaction. Dismissing as non-existent the hard work of volunteers tends to result in less volunteer work. Given that 'factual inaccuracies' can have negative consequences for the future of CPython, I think a bit of hostility is appropriate. > We ought to be kinder than that. Our response to Ivan has been > more hostile, and less open and respectful, than his email that > triggered the response. I agree that too much attention was give to 'tone'. I think too little was given to the validity of the claims. -- Terry Jan Reedy From tjreedy at udel.edu Fri May 4 15:24:19 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 4 May 2018 15:24:19 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: On 5/4/2018 12:04 PM, Guido van Rossum wrote: > Thank you Steven! > I'd like to use your email (nearly) verbatim to start off the discussion > about civility we're going to have at the Language Summit. I won't be there but sounds like a good idea. I hope you consider that bad content as well as bad tone can be uncivil and negatively impact Python development. -- Terry Jan Reedy From brett at python.org Fri May 4 15:28:09 2018 From: brett at python.org (Brett Cannon) Date: Fri, 04 May 2018 19:28:09 +0000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: On Fri, 4 May 2018 at 12:09 Matthias Bussonnier < bussonniermatthias at gmail.com> wrote: > On Fri, 4 May 2018 at 11:49, Serhiy Storchaka wrote: > >> 04.05.18 20:57, Matthias Bussonnier ????: >> > But when I hit a DeprecationWarning message there is one crucial piece >> of >> > information missing most of the time: Since which version number it's >> > deprecated >> > (and sometime since when the replacement is available could be good if >> > overlap >> > between functionality there was). >> >> I think the information about since which version number it will be >> removed is more useful. Different cases need different deprecation >> periods. The more common the case, the longer deprecation period should >> be. Some recently added warnings contain this information. >> > > Maybe to push people forward, but from experience it is hard to predict > future, so saying when > it _will_ be remove is hard. When you _want_ to remove it probably. > Victor's first mail in > this thread is a good example. The functionality was marked to be removed > from 3.7, but I think > it is likely too late now. You can always update, but I hate giving > differing information between software version. 
> So there is actually an opportunity here to programmatically help prevent missing a removal. If we attach a version number for when a DeprecationWarning is expected to go away due to removal we can then have the warning itself raise a warning that while the DeprecationWarning was supposed to be gone by e.g. 3.7, it's still being raised. The drawback to this is it will make the first cut-over to a newer version of Python a bit more painful, but as long as warnings don't raise exceptions it should just be a noisy. > > I'm curious about your use case for the version of removal, > I usually don't care when it's going to be removed, I prefer since > when the functionality is deprecated. > > if pyversion < deprecated_version: > old_stuff > else: > new_stuff > > And I soon as my project drop deprecated_version, I remove the > conditional. > I do not try/except on purpose to be able to grep for when to remove the > code. > The other way to look at that is: if pyversion < replacement: old_stuff else: new_stuff In which case you don't care about when the deprecation happened, just when it's replacement was introduced (which Serhiy suggested being a version before the deprecation is added). > Could you share you use case ? Of be ore detailed ? > > We can also be more generic and say that if DeprecationWarning messages > could contain > timeline informations it would likely encourage the migration. > > One related question is how much are DeprecationWarning messages stables > between versions ? > Same as any other exception; typically left alone unless there's a better way to phrase things (IOW don't parse the message ;) . So at this point there are basically two threads going on here. One is an official deprecation policy. E.g. X.Y something happens, X.Y+1 something else happens, etc. The other one is how to make what gets deprecated more discoverable. E.g. an informational PEP that tracks what's planned, warnings having more metadata, etc. Both of these things seem independent, although one of them will require a PEP while the other just takes some work (I leave it as an exercise for the reader to figure out which one is which ;) . -Brett > Would any update to many of these be accepted of refused because users > might be filtering them ? > -- > Matthias > > >> >> Ideally any deprecated feature should have a replacement, and this >> replacement should be available in at least one version before adding >> the deprecation warning. >> >> X.Y: added a replacement >> >> X.Y+1: added a deprecation warning. Many users need to support only two >> recent versions and can move to using the replacement now. >> >> X.Y+3 (or X.Y+2): removed the deprecated feature. Versions older than >> X.Y should grew out of use at that moment. >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brett at python.org Fri May 4 15:40:22 2018 From: brett at python.org (Brett Cannon) Date: Fri, 04 May 2018 19:40:22 +0000 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: On Fri, 4 May 2018 at 09:07 Guido van Rossum wrote: > Thank you Steven! I assume that Brian hadn't seen my response (such > crossed messages due to delivery delays are very common in this mailing > list). > > I'd like to use your email (nearly) verbatim to start off the discussion > about civility we're going to have at the Language Summit. > And I will also want to say thanks for the email, Steven! I agree with everything you said and that any expectations we have of others should apply equally to anyone on this list in any interaction. And I think it's especially true in our responses as we should try to keep the moral high ground by being examples of what we expect and want from others. > > On Fri, May 4, 2018 at 8:43 AM, Steven D'Aprano > wrote: > >> On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: >> >> > No one is saying people can't be upset and if you are ever upset there's >> > something wrong; we're human beings after all. But those of us speaking >> up >> > about the tone are saying that you can also wait until you're not so >> upset >> > to write an email. This was never going to be resolved in an hour, so >> > waiting an hour until you're in a better place to write an email that >> > wasn't quite so inflammatory seems like a reasonable thing to ask. >> >> Certainly! >> >> I'm not defending Ivan's initial email. His tantrum *was* annoying, >> unreasonable, and unfair to those who do care about tkinter. He could >> have done better. >> >> But *we* should be better too. Our response to Ivan has not been >> welcoming, and as a community we haven't lived up to our own standards, >> as we have piled onto him to express our rightous indignation: >> >> 1. Guido responded telling Ivan to calm down and work off his >> frustration elsewhere. And that's where things should have >> stopped, unless Ivan had persisted in his impoliteness. >> >> 2. Brian upped the ante by bringing the CoC into discussion. >> >> 3. Paul raised it again by describing Ivan's post as "offensive". >> >> 4. And now, Steve H has claimed that Ivan's initial post was >> bordering on "abusive". >> >> We've gone from rightly treating Ivan's post as intemperate and >> impolite, and telling him to chill, to calling his post "offensive", to >> "abusive". (Next, I presume, someone will claim to be traumatised by >> Ivan's email.) >> >> Just as Ivan should have waited until he had calmed down before firing >> off his rant, so we ought to resist the temptation to strike back with >> hostility at trivial social transgressions, especially from newcomers. >> This is what Ivan actually said: >> >> - Tkinter is broken and partly functional (an opinion with only the >> most tenuous connection with fact, but hardly abusive); >> >> - that nobody cares (factually wrong, but not abusive); >> >> - that possibly nobody is using it (factually wrong, but not abusive); >> >> - that if that's the case (it isn't), then it should be removed >> from the std lib (a reasonable suggestion if only the premise had >> been correct). 
>> >> Intemperate and impolite it certainly was, as well as full of factual >> inaccuracies, but to call it "close to abusive" is a hostile over- >> reaction. We ought to be kinder than that. Our response to Ivan has been >> more hostile, and less open and respectful, than his email that >> triggered the response. >> >> Brett is right to say people can afford to wait a little while before >> firing off an angry email. But the same applies to us: we too can afford >> to wait a little while before raising the threat of the CoC over a minor >> social faux pas. This community isn't so fragile that we have to jump >> down the throat of a newcomer lest the community immediately collapses >> into Call Of Duty gamer culture. >> > >> >> -- >> Steve >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > > > -- > --Guido van Rossum (python.org/~guido) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexander.belopolsky at gmail.com Fri May 4 16:27:16 2018 From: alexander.belopolsky at gmail.com (Alexander Belopolsky) Date: Fri, 4 May 2018 16:27:16 -0400 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) In-Reply-To: <20180504154243.GA9539@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: On Fri, May 4, 2018 at 11:43 AM, Steven D'Aprano wrote: > On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > .. > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. He could > have done better. > > But *we* should be better too. Our response to Ivan has not been > welcoming, and as a community we haven't lived up to our own standards, > as we have piled onto him to express our rightous indignation: +1 It may be a reflection of me sharing the cultural roots with Ivan, but his original post did not sound particularly offensive to me. I've seen worse. When it comes to communication on public fora, I am a strong believer in Postel's principle of robustness: "be conservative in what you do, be liberal in what you accept from others." From tjreedy at udel.edu Fri May 4 17:59:42 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 4 May 2018 17:59:42 -0400 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: On 5/2/2018 5:11 AM, Victor Stinner wrote: > As a follow-up to the "[Python-Dev] (Looking for) A Retrospective on > the Move to Python 3" thread, I will like to clarify how a feature > should be removed from Python. Would it be possible (and sensible) to use the 2to3 machinery to produce 36to37.py, etc., to do mechanical replacements when possible and flag other things when necessary? 
-- Terry Jan Reedy From vstinner at redhat.com Fri May 4 18:18:59 2018 From: vstinner at redhat.com (Victor Stinner) Date: Sat, 5 May 2018 00:18:59 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: 2018-05-04 23:59 GMT+02:00 Terry Reedy : > Would it be possible (and sensible) to use the 2to3 machinery to produce > 36to37.py, etc., to do mechanical replacements when possible and flag other > things when necessary? I suggest you to watch Daniele Esposti's talk "Evolution or stagnation programming languages". He explains that Javascript is more successful than Python to introduce *language* evolutions thanks to transpiling (things like babel and polyfill): https://www.pycon.it/conference/talks/evolution-or-stagnation-programming-languages Victor From vstinner at redhat.com Fri May 4 18:16:01 2018 From: vstinner at redhat.com (Victor Stinner) Date: Sat, 5 May 2018 00:16:01 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: 2018-05-04 20:48 GMT+02:00 Serhiy Storchaka : > I think the information about since which version number it will be removed > is more useful. About deprecation, there is the funny story of bytes filenames on Windows. I deprecated this feature in Windows 3.3 since it was broken. I really wanted hard to remove support for bytes filenames on Windows. One day, Steve Dower showed up with the PEP 529 "Change Windows filesystem encoding to UTF-8" and he just *removed the deprecation warning* from Python 3.6, since bytes filenames are now working as intended :-) Victor From carl.shapiro at gmail.com Fri May 4 18:06:38 2018 From: carl.shapiro at gmail.com (Carl Shapiro) Date: Fri, 4 May 2018 15:06:38 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Fri, May 4, 2018 at 5:14 AM, Nick Coghlan wrote: > This definitely seems interesting, but is it something you'd be seeing us > being able to take advantage of for conventional Python installations, or > is it more something you'd expect to be useful for purpose-built > interpreter instances? (e.g. if Mercurial were running their own Python, > they could precache the heap objects for their commonly imported modules in > their custom interpreter binary, regardless of whether those were standard > library modules or not). > Yes, this would be a win for a conventional Python installation as well. Specifically, users and their scripts would enjoy a reduction in cold-startup time. In the numbers I showed yesterday, the version of the interpreter with our patch applied included unmarshaled data for the modules that always appear on the sys.modules list after an ordinary interpreter cold-start. I believe it is worthwhile to including that set of modules in the standard CPython interpreter build. Expanding that set to include the commonly imported modules might be an additional win, especially for short-running scripts. -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri May 4 21:58:27 2018 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 05 May 2018 01:58:27 +0000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: What are the obstacles to including "preloaded" objects in regular .pyc files, so that everyone can take advantage of this without rebuilding the interpreter? 
Off the top of my head: We'd be making the in-memory layout of those objects part of the .pyc format, so we couldn't change that within a minor release. I suspect this wouldn't be a big change though, since we already commit to ABI compatibility for C extensions within a minor release? In principle there are some cases where this would be different (e.g. adding new fields at the end of an object is generally ABI compatible), but this might not be an issue for the types of objects we're talking about. There's some memory management concern, since these are, y'know, heap objects, and we wouldn't be heap allocating them. The main constraint would be that you couldn't free them one at a time, but would have to free the whole block at once. But I think it at least wouldn't be too hard to track whether any of the objects in the block are still alive, and free the whole block if there aren't any. E.g., we could have an object flag that means "when this object is freed, don't call free(), instead find the containing block and decrement its live-object count. You probably need this flag even in the current version, right? (And the flag could also be an escape hatch if we did need to change object size: check for the flag before accessing the new fields.) Or maybe you could get clever tracking object liveness on an page by page basis; not sure it's worth it though. Unloading module-level objects is pretty rare. I'm assuming these objects can have pointers to each other, and to well known constants like None, so you need some kind of relocation engine to fix those up. Right now I guess you're probably using the one built into the dynamic loader? In theory it shouldn't be too hard to write our own ? basically just a list of offsets in the block where we need to add the base address or write the address of a well known constant, I think? Anything else I'm missing? On Fri, May 4, 2018, 16:06 Carl Shapiro wrote: > On Fri, May 4, 2018 at 5:14 AM, Nick Coghlan wrote: > >> This definitely seems interesting, but is it something you'd be seeing us >> being able to take advantage of for conventional Python installations, or >> is it more something you'd expect to be useful for purpose-built >> interpreter instances? (e.g. if Mercurial were running their own Python, >> they could precache the heap objects for their commonly imported modules in >> their custom interpreter binary, regardless of whether those were standard >> library modules or not). >> > > Yes, this would be a win for a conventional Python installation as well. > Specifically, users and their scripts would enjoy a reduction in > cold-startup time. > > In the numbers I showed yesterday, the version of the interpreter with our > patch applied included unmarshaled data for the modules that always appear > on the sys.modules list after an ordinary interpreter cold-start. I > believe it is worthwhile to including that set of modules in the standard > CPython interpreter build. Expanding that set to include the commonly > imported modules might be an additional win, especially for short-running > scripts. 
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com >
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From steve at pearwood.info Fri May 4 22:07:50 2018
From: steve at pearwood.info (Steven D'Aprano)
Date: Sat, 5 May 2018 12:07:50 +1000
Subject: [Python-Dev] Dealing with tone in an email
In-Reply-To: 
References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info>
Message-ID: <20180505020748.GM9562@ando.pearwood.info>

On Fri, May 04, 2018 at 03:21:28PM -0400, Terry Reedy wrote: > On 5/4/2018 11:43 AM, Steven D'Aprano wrote: > > >I'm not defending Ivan's initial email. His tantrum *was* annoying, > >unreasonable, and unfair to those who do care about tkinter. > > Ivan's email was a disinformation troll intended to jump the attention > queue of core developers. He is proud of its apparent success, and > seemingly unconcerned about equally apparent side-effects.

Terry, please, to persist in attacking Ivan's past behaviour when he has not repeated it is not open, considerate or respectful. At this point, *our reaction* to Ivan's transgression has been much worse and more disruptive than anything he has done.

People had already jumped down Ivan's throat long before he rationalised the tone of his initial email as necessary to get people's attention. Whether that rationalisation was an excuse he came up with afterwards, or a deliberate plan he started with, is impossible to tell. I wouldn't even expect Ivan to be 100% in his own mind which it was. Psychology isn't that cut and dried. But it doesn't matter. *One* impolite email, even if deliberate, should not be enough to condemn a newbie. If he persists with this pattern of behaviour, that is a different story.
I understand your concern about deliberate disinformation, but even if your worst fears come true and "someone somewhere, in a blog, talk, or SO answer, quotes Ivan as a reason to not use tkinter", that hardly makes giving the opinion "tkinter is broken" an unforgiveable sin. As you point out yourself, that opinion is hardly new or rare. And frankly, people are allowed to be wrong. Let's not have thought police here, please. Everyone, can we *PLEASE* stop attacking this newbie now, and give him a chance to show by his future actions that he either has or hasn't changed his ways? Wrongly suggesting that nobody uses a software package should not be a Zero Tolerance offense. -- Steve From tjreedy at udel.edu Fri May 4 22:42:09 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 4 May 2018 22:42:09 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <20180505020748.GM9562@ando.pearwood.info> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <20180505020748.GM9562@ando.pearwood.info> Message-ID: On 5/4/2018 10:07 PM, Steven D'Aprano wrote: > Terry, please, to persist in attacking Ivan's past behaviour when he has > not repeated it is not open, considerate or respectful. I did not do that. My first sentence was background for a *discussion* about a partial disagreement with what you said. I won't repeat any of it. > At this point, > *our reaction* to Ivan's transgression has been much worse and more > disruptive than anything he has done. I partially agree with this and said so in my last summary line. And I agree that these threads should die. -- Terry Jan Reedy From tjreedy at udel.edu Fri May 4 23:45:52 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 4 May 2018 23:45:52 -0400 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: On 5/4/2018 6:18 PM, Victor Stinner wrote: > 2018-05-04 23:59 GMT+02:00 Terry Reedy : >> Would it be possible (and sensible) to use the 2to3 machinery to produce >> 36to37.py, etc., to do mechanical replacements when possible and flag other >> things when necessary? > > I suggest you to watch Daniele Esposti's talk "Evolution or stagnation > programming languages". He explains that Javascript is more successful > than Python to introduce *language* evolutions thanks to transpiling > (things like babel and polyfill): > https://www.pycon.it/conference/talks/evolution-or-stagnation-programming-languages I ran through the slides and found the babelsite. What I found: Babel translates new code back to a sufficiently powerful and presumably ubiquitous older version. It does so on a selectable feature basis rather than a language version basis. (In other words, define your own 'new' version.) Polyfill supplies the backported new objects needed to make the back translations run with new semantics. This would be equivalent of defining, for instance, 2.7 as a base version and having 3xbytesto27, 35asyncto27, etc for every new 3.x feature. Some people wanted this, but, of course, 2.7 is *not* installed everywhere. If Microsoft were to treat Python like it once did Basic, and install it on all Windows machines, it would start with recent 3.x. Neither the slides nor site said anything about bug fixes and about the need to have multiple versions of every function touched. 
Because of the unique features of how Javascript is distributed and used, I don't see how the Babel example would apply very well to Python. -- Terry Jan Reedy From steve.dower at python.org Sat May 5 00:48:44 2018 From: steve.dower at python.org (Steve Dower) Date: Fri, 4 May 2018 21:48:44 -0700 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: To be fair, if they hadn?t already been deprecated we would have had to deprecate the old behaviour for a couple of releases before changing it. It just so happened that we did the deprecation first without knowing what the fix would be :) Top-posted from my Windows phone From: Victor Stinner Sent: Friday, May 4, 2018 15:24 To: Serhiy Storchaka Cc: python-dev Subject: Re: [Python-Dev] Process to remove a Python feature 2018-05-04 20:48 GMT+02:00 Serhiy Storchaka : > I think the information about since which version number it will be removed > is more useful. About deprecation, there is the funny story of bytes filenames on Windows. I deprecated this feature in Windows 3.3 since it was broken. I really wanted hard to remove support for bytes filenames on Windows. One day, Steve Dower showed up with the PEP 529 "Change Windows filesystem encoding to UTF-8" and he just *removed the deprecation warning* from Python 3.6, since bytes filenames are now working as intended :-) Victor _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Sat May 5 04:55:02 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Sat, 5 May 2018 10:55:02 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update Message-ID: <5AED7166.1010008@UGent.be> Hello all, I have updated PEP 575 in response to some posts on this mailing list and to some discussions in person with the core Cython developers. See https://www.python.org/dev/peps/pep-0575/ The main differences with respect to the previous version are: * "builtin_function" was renamed to "cfunction". Since we are changing the name anyway, "cfunction" looked like a better choice because the word "built-in" typically refers to things from the builtins module. * defined_function now only defines an API (it must support all attributes that a Python function has) without specifying the implementation. * The "Two-phase Implementation" proposal for better backwards compatibility has been expanded and now offers 100% backwards compatibility for the classes and for the inspect functions. Jeroen. From ncoghlan at gmail.com Sat May 5 12:16:24 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 6 May 2018 02:16:24 +1000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On 5 May 2018 at 11:58, Nathaniel Smith wrote: > What are the obstacles to including "preloaded" objects in regular .pyc > files, so that everyone can take advantage of this without rebuilding the > interpreter? > > Off the top of my head: > > We'd be making the in-memory layout of those objects part of the .pyc > format, so we couldn't change that within a minor release. I suspect this > wouldn't be a big change though, since we already commit to ABI > compatibility for C extensions within a minor release? 
In principle there > are some cases where this would be different (e.g. adding new fields at the > end of an object is generally ABI compatible), but this might not be an > issue for the types of objects we're talking about. > I'd frame this one a bit differently: what if we had a platform-specific variant of the pyc format that was essentially a frozen module packaged as an extension module? We probably couldn't quite do that for arbitrary Python modules *today* (due to the remaining capability differences between regular modules and extension modules), but multi-phase initialisation gets things *much* closer to parity, and running embedded bytecode instead of accessing the C API directly should avoid the limitations that exist for classes defined in C. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sat May 5 12:30:55 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 6 May 2018 02:30:55 +1000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: On 5 May 2018 at 07:59, Terry Reedy wrote: > On 5/2/2018 5:11 AM, Victor Stinner wrote: > > As a follow-up to the "[Python-Dev] (Looking for) A Retrospective on >> the Move to Python 3" thread, I will like to clarify how a feature >> should be removed from Python. >> > > Would it be possible (and sensible) to use the 2to3 machinery to produce > 36to37.py, etc., to do mechanical replacements when possible and flag other > things when necessary? > That capability actually exists in the python-future project (in the form of their "pasteurize" script, which takes idiomatic Python 3 code and turns it into Python 2.7 compatible code). One major issue though is that without a JIT compiler to speed things back up, the polyfills needed to target old versions in the general case (rather than when restricting yourself to a "backport friendly Python subset") are often either going to be unacceptably slow, or else unacceptably irritating to maintain, so folks have invested their time into figuring out how to ship their own runtimes in order to avoid being restricted by redistributors with slow runtime update cycles, and in being able to rely on PyPI packages rather than standard library packages. (That's very different from the incentives for browser vendors, who have users that expect them to be able to make arbitrary websites work, and blame the browser rather than the site when particular sites they want to use don't work properly) The other thing worth keeping in mind is that one of the core assumptions of JavaScript tooling is that debugging tools will be able to reach out to the internet and download additional files (most notably source maps) to help make sense of the currently running code. Without that ability to trace from compiled code back to the preferred source form for editing, making sense of an otherwise straightforward traceback can become a complex debugging exercise. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Sat May 5 12:49:15 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 5 May 2018 18:49:15 +0200 Subject: [Python-Dev] Dealing with tone in an email (was: Drop/deprecate Tkinter?) 
References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: <20180505184915.3f7ec9ca@fsol> On Sat, 5 May 2018 01:43:00 +1000 Steven D'Aprano wrote: > On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > > > No one is saying people can't be upset and if you are ever upset there's > > something wrong; we're human beings after all. But those of us speaking up > > about the tone are saying that you can also wait until you're not so upset > > to write an email. This was never going to be resolved in an hour, so > > waiting an hour until you're in a better place to write an email that > > wasn't quite so inflammatory seems like a reasonable thing to ask. > > Certainly! > > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. He could > have done better. > > But *we* should be better too. Our response to Ivan has not been > welcoming, and as a community we haven't lived up to our own standards, > as we have piled onto him to express our rightous indignation: [...] Well summed up. Regards Antoine. From a.badger at gmail.com Sat May 5 13:30:33 2018 From: a.badger at gmail.com (Toshio Kuratomi) Date: Sat, 05 May 2018 17:30:33 +0000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: > What are the obstacles to including "preloaded" objects in regular .pyc > files, so that everyone can take advantage of this without rebuilding the > interpreter? > Would this make .pyc files arch specific? -Toshio -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericfahlgren at gmail.com Sat May 5 13:40:27 2018 From: ericfahlgren at gmail.com (Eric Fahlgren) Date: Sat, 5 May 2018 10:40:27 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi wrote: > On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: > >> What are the obstacles to including "preloaded" objects in regular .pyc >> files, so that everyone can take advantage of this without rebuilding the >> interpreter? >> > > Would this make .pyc files arch specific? > Or have parallel "pyh" (Python "heap") files, that are architecture specific... (But that would cost more stat calls.) -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.badger at gmail.com Sat May 5 14:33:55 2018 From: a.badger at gmail.com (Toshio Kuratomi) Date: Sat, 05 May 2018 18:33:55 +0000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Sat, May 5, 2018, 10:40 AM Eric Fahlgren wrote: > On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi > wrote: > >> On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: >> >>> What are the obstacles to including "preloaded" objects in regular .pyc >>> files, so that everyone can take advantage of this without rebuilding the >>> interpreter? >>> >> >> Would this make .pyc files arch specific? >> > > Or have parallel "pyh" (Python "heap") files, that are architecture > specific... (But that would cost more stat calls.) > I ask because arch specific byte code files are a big change in consumers expectations. 
It's not necessarily a bad change but it should be communicated to downstreams so they can decide how to adjust to it. Linux distros which ship byte code files will need to build them for each arch, for instance. People who ship just the byte code as an obfuscation of the source code will need to decide whether to ship packages for each arch they care about or change how they distribute. -Toshio > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sat May 5 15:00:10 2018 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 05 May 2018 19:00:10 +0000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Sat, May 5, 2018, 11:34 Toshio Kuratomi wrote: > > > On Sat, May 5, 2018, 10:40 AM Eric Fahlgren > wrote: > >> On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi >> wrote: >> >>> On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: >>> >>>> What are the obstacles to including "preloaded" objects in regular .pyc >>>> files, so that everyone can take advantage of this without rebuilding the >>>> interpreter? >>>> >>> >>> Would this make .pyc files arch specific? >>> >> >> Or have parallel "pyh" (Python "heap") files, that are architecture >> specific... (But that would cost more stat calls.) >> > > I ask because arch specific byte code files are a big change in consumers > expectations. It's not necessarily a bad change but it should be > communicated to downstreams so they can decide how to adjust to it. > > Linux distros which ship byte code files will need to build them for each > arch, for instance. People who ship just the byte code as an obfuscation > of the source code will need to decide whether to ship packages for each > arch they care about or change how they distribute. > That's a good point. One way to minimize the disruption would be to include both the old and new info in the .pyc files, so at load time if the new version is incompatible then you can fall back on the old way, even if it's a bit slower. I think in the vast majority of cases currently .pyc files are built on the same architecture where they're used? Pip and Debian/Ubuntu and the interpreter's automatic compilation-on-import all build .pyc files on the computer where they'll be run. It might also be worth double checking much the memory layout of these objects even varies. Obviously it'll be different for 32- and 64-bit systems, but beyond that, most ISAs and OSes and compilers use pretty similar struct layout rules AFAIK... we're not talking about actual machine code. -n > -------------- next part -------------- An HTML attachment was scrubbed... URL: From v+python at g.nevcal.com Sat May 5 14:25:00 2018 From: v+python at g.nevcal.com (Glenn Linderman) Date: Sat, 5 May 2018 11:25:00 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: <22cbd040-c9ec-4de3-623a-a33a56e17d09@g.nevcal.com> On 5/5/2018 10:30 AM, Toshio Kuratomi wrote: > On Fri, May 4, 2018, 7:00 PM Nathaniel Smith > wrote: > > What are the obstacles to including "preloaded" objects in regular > .pyc files, so that everyone can take advantage of this without > rebuilding the interpreter? > > > Would this make .pyc files arch specific? Lots of room in the __pycache__ folder. As compilation of the .py module proceeds, could it be determined if there is anything that needs to be architecture specific, and emit an architecture-specific one or an architecture-independent one as appropriate?? 
Data structures are mostly bitness-dependent, no? But if an architecture-specific .pyc is required, could/should it be structured and named according to the OS conventions also: .dll, .so, etc.? Even if it doesn't contain executable code, the bytecode could be contained in appropriate data sections, and there has been talk about doing relocation of pointers in such pre-compiled data structures, and the linker _already_ can do that sort of thing...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mhroncok at redhat.com Sat May 5 15:17:17 2018
From: mhroncok at redhat.com (=?UTF-8?Q?Miro_Hron=c4=8dok?=)
Date: Sat, 5 May 2018 21:17:17 +0200
Subject: [Python-Dev] A fast startup patch (was: Python startup time)
In-Reply-To: 
References: 
Message-ID: <9a568bba-4bbc-8ba3-7b2d-ab75f37c1b46@redhat.com>

On 5.5.2018 21:00, Nathaniel Smith wrote: > I think in the vast majority of cases currently .pyc files are built on > the same architecture where they're used?

On Fedora (and by extension also on RHEL and CentOS) this is not true. When the package is noarch (no extension module shipped, only pure Python) it is built and bytecompiled on a random architecture. Bytecompilation happens during build time.

If bytecode gets arch specific, we'd need to make all our Python packages arch specific or switch to install-time bytecompilation.

-- 
Miro Hrončok
--
Phone: +420777974800
IRC: mhroncok

From rob.cliffe at btinternet.com Sat May 5 15:49:26 2018
From: rob.cliffe at btinternet.com (Rob Cliffe)
Date: Sat, 5 May 2018 20:49:26 +0100
Subject: [Python-Dev] (name := expression) doesn't fit the narrative of PEP 20
In-Reply-To: 
References: 
Message-ID: <70153d09-65a9-2f05-2b97-a77f11460500@btinternet.com>

Reading this sub-thread, it struck me that a good way to make PEP 572 more likely to be accepted is to launch an over-the-top attack on it. Then more moderate people - who were/are not necessarily in favour of the PEP - feel pressurised into defending it.
Hah! Watch this space for my vicious, vitriolic, withering attack on PEP 463 (Exception-catching expressions)! :-)
Best wishes
Rob Cliffe

From tjreedy at udel.edu Sat May 5 15:56:48 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 5 May 2018 15:56:48 -0400
Subject: [Python-Dev] A fast startup patch (was: Python startup time)
In-Reply-To: 
References: 
Message-ID: 

On 5/5/2018 2:33 PM, Toshio Kuratomi wrote: > > > On Sat, May 5, 2018, 10:40 AM Eric Fahlgren > wrote: > > On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi >wrote: > > On Fri, May 4, 2018, 7:00 PM Nathaniel Smith > wrote: > > What are the obstacles to including "preloaded" objects in > regular .pyc files, so that everyone can take advantage of > this without rebuilding the interpreter? > > > Would this make .pyc files arch specific? > > > Or have parallel "pyh" (Python "heap") files, that are architecture > specific... (But that would cost more stat calls.) > > > I ask because arch specific byte code files are a big change in > consumers expectations. It's not necessarily a bad change but it should > be communicated to downstreams so they can decide how to adjust to it. > > Linux distros which ship byte code files will need to build them for > each arch, for instance. People who ship just the byte code as an > obfuscation of the source code will need to decide whether to ship > packages for each arch they care about or change how they distribute.

It is an advertised feature that CPython *can* produce cross-platform version-specific .pyc files.
I believe this should continue, at least for a few releases. They are currently named modname.cpython-xy.pyc, with optional '.opt-1', '.opt-2', and '.opt-4' tags inserted before in __pycache__. These name formats should continue to mean what they do now. I believe *can* should not mean *always*. Architecture-specific files will need an additional architecture tag anyway, such as win32 and win64, anyway. Or would bitness and endianess be sufficient across platforms? If we make architecture-specific the default, we could add startup and compile and compile_all options for the cross-platform format. Or maybe add a recompile function that imports cross-platform .pycs and outputs local-architecture .pycs. -- Terry Jan Reedy From brett at python.org Sat May 5 15:34:25 2018 From: brett at python.org (Brett Cannon) Date: Sat, 05 May 2018 19:34:25 +0000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Sat, 5 May 2018 at 10:41 Eric Fahlgren wrote: > On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi > wrote: > >> On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: >> >>> What are the obstacles to including "preloaded" objects in regular .pyc >>> files, so that everyone can take advantage of this without rebuilding the >>> interpreter? >>> >> >> Would this make .pyc files arch specific? >> > > Or have parallel "pyh" (Python "heap") files, that are architecture > specific... > .pyc files have tags to specify details about them (e.g. were they compiled with -OO), so this isn't an "all or nothing" option, nor does it require a different file extension. There just needs to be an appropriate finder that knows how to recognize a .pyc file with the appropriate tag that can be used, and then a loader that knows how to read that .pyc. > (But that would cost more stat calls.) > Nope, we actually cache directory contents so file lookup existence is essentially free (this is why importlib.invalidate_caches() exists specifically to work around when the timestamp is too coarse for a directory content mutation). -Brett -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Sat May 5 20:22:23 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Sun, 6 May 2018 03:22:23 +0300 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> Message-ID: <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> On 04.05.2018 19:04, Guido van Rossum wrote: > Thank you Steven! I assume that Brian hadn't seen my response (such > crossed messages due to delivery delays are very common in this > mailing list). > > I'd like to use your email (nearly) verbatim to start off the > discussion about civility we're going to have at the Language Summit. > Since I won't be present at the summit to tell my side of the story, you can see it below. It's up to you to judge it, but as least you need to know what to judge. In a nutshell, this is an exceptional situation, and I saw no better way that was guaranteed to work. I never meant or mean to use this as a standard tactic, this is only the second such case in my life. 
> On Fri, May 4, 2018 at 8:43 AM, Steven D'Aprano > wrote: > > On Thu, May 03, 2018 at 06:31:03PM +0000, Brett Cannon wrote: > > > No one is saying people can't be upset and if you are ever upset > there's > > something wrong; we're human beings after all. But those of us > speaking up > > about the tone are saying that you can also wait until you're > not so upset > > to write an email. This was never going to be resolved in an > hour, so > > waiting an hour until you're in a better place to write an email > that > > wasn't quite so inflammatory seems like a reasonable thing to ask. > > Certainly! > > I'm not defending Ivan's initial email. His tantrum *was* annoying, > unreasonable, and unfair to those who do care about tkinter. He could > have done better. > > But *we* should be better too. Our response to Ivan has not been > welcoming, and as a community we haven't lived up to our own > standards, > as we have piled onto him to express our rightous indignation: > > 1. Guido responded telling Ivan to calm down and work off his > ? ?frustration elsewhere. And that's where things should have > ? ?stopped, unless Ivan had persisted in his impoliteness. > > 2. Brian upped the ante by bringing the CoC into discussion. > > 3. Paul raised it again by describing Ivan's post as "offensive". > > 4. And now, Steve H has claimed that Ivan's initial post was > ? ?bordering on "abusive". > > We've gone from rightly treating Ivan's post as intemperate and > impolite, and telling him to chill, to calling his post > "offensive", to > "abusive". (Next, I presume, someone will claim to be traumatised by > Ivan's email.) > > Just as Ivan should have waited until he had calmed down before > firing > off his rant, so we ought to resist the temptation to strike back > with > hostility at trivial social transgressions, especially from > newcomers. > This is what Ivan actually said: > > - Tkinter is broken and partly functional (an opinion with only the > ? most tenuous connection with fact, but hardly abusive); > > - that nobody cares (factually wrong, but not abusive); > > - that possibly nobody is using it (factually wrong, but not abusive); > > - that if that's the case (it isn't), then it should be removed > ? from the std lib (a reasonable suggestion if only the premise had > ? been correct). > As I suspected. This is a classic scenario that is occasionally seen anywhere: "everyone is underestimating a problem until a disaster strikes". The team's perception of Tkinter is basically: "well, there are slight issues, and the docs are lacking, but no big deal." Well, this _is_ a big deal. As in, "with 15+ years of experience, 5+ with Python, I failed to produce a working GUI in a week; no-one on the Net, regardless of experience, (including Terry) is ever sure how to do things right; every online tutorial says: "all the industry-standard and expected ways are broken/barred, we have to resort to ugly workarounds to accomplish just about anything"" big deal. This is anything but normal, and all the more shocking in Python where the opposite is the norm. And now, a disaster striked. Not knowing this, I've relied on Tkinter with very much at stake (my income for the two following months, basically), and lost. If that's not a testament just how much damage Tkinter's current state actually does, I dunno what is. Of course, it's up to me to write fixes and all since this is a volunteer project. But I can't do this alone, I must recruit the team's cooperation if I hope to ever be successful. 
Unless I shatter their current outlook on the matter first, any fixes I provide will likely be dismissed as unneeded or deferred indefinitely as unimportant. There are precedents of that, including with no response whatsoever, and the messages were written neutrally, with a thorough explanation, patch/PR etc. (I do believe the maintainers are doing their best. Still, the mere fact that they chose to work with other tickers over mine shows that they considered those more important. So it does matter if they underestimate a topic.) That's why I had to resort to shock value. First, it would guarantee that my message won't fall on deaf ears this time as well. Second, I had to express, somehow, that there indeed was a systemic disaster, not just your average newbie conundrum, in graphic details to shock the team and disrupt their current way of thinking about the Tkinter case. Putting the question point-blank: "drop/deprecate" -- also helped the shock value, and would also force the team to reassess if they really have the will or resources to bring the module up to Python standards or at least prevent any more such damage. I also did require the team's feedback on this question to assess the perspectives for results of my efforts -- thus if they're worth the time -- as explained in https://mail.python.org/pipermail/python-dev/2018-May/153330.html . By no means I consider this a standard way of action. This is only the second such case in my life. The previous one was when I created a translation project for the GPL and found that I cannot legally do that the way that was required to make it work due to overly defensive FSF's terms. This is the justification. It's up to you to judge how sound it is, but you need to know what to judge, at least. I wasn't happy to have to resort to this, but found no better way that would be guaranteed to work. Now that the social issues are out of the way and I got the required feedback to continue, I can finally concentrate on the patches, with confidence that my efforts won't go to waste. > > > Intemperate and impolite it certainly was, as well as full of factual > inaccuracies, but to call it "close to abusive" is a hostile over- > reaction. We ought to be kinder than that. Our response to Ivan > has been > more hostile, and less open and respectful, than his email that > triggered the response. > > Brett is right to say people can afford to wait a little while before > firing off an angry email. But the same applies to us: we too can > afford > to wait a little while before raising the threat of the CoC over a > minor > social faux pas. This community isn't so fragile that we have to jump > down the throat of a newcomer lest the community immediately > collapses > into Call Of Duty gamer culture. > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > > > -- > --Guido van Rossum (python.org/~guido ) > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ncoghlan at gmail.com Sat May 5 20:44:24 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 6 May 2018 10:44:24 +1000 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On 6 May 2018 at 05:34, Brett Cannon wrote: > > > On Sat, 5 May 2018 at 10:41 Eric Fahlgren wrote: > >> On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi >> wrote: >> >>> On Fri, May 4, 2018, 7:00 PM Nathaniel Smith wrote: >>> >>>> What are the obstacles to including "preloaded" objects in regular .pyc >>>> files, so that everyone can take advantage of this without rebuilding the >>>> interpreter? >>>> >>> >>> Would this make .pyc files arch specific? >>> >> >> Or have parallel "pyh" (Python "heap") files, that are architecture >> specific... >> > > .pyc files have tags to specify details about them (e.g. were they > compiled with -OO), so this isn't an "all or nothing" option, nor does it > require a different file extension. There just needs to be an appropriate > finder that knows how to recognize a .pyc file with the appropriate tag > that can be used, and then a loader that knows how to read that .pyc. > Right, this is the kind of change I had in mind (perhaps in combination with Diana Clarke's suggestion from several months back to make pyc tagging more feature-flag centric, rather than the current focus on a numeric optimisation level). We also wouldn't ever generate this hypothetical format implicitly - similar to the new deterministic pyc's in 3.7, they'd be something you had to explicitly request via a compileall invocation. In the Linux distro use case then, the relevant distro packaging helper scripts and macros could generate traditional cross-platform pyc files for no-arch packages, but automatically switch to the load-time optimised arch-specific format if the package was already arch-specific. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From mertz at gnosis.cx Sat May 5 21:03:51 2018 From: mertz at gnosis.cx (David Mertz) Date: Sat, 5 May 2018 21:03:51 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> Message-ID: The below is really just making this whole situation worse. On Sat, May 5, 2018 at 8:22 PM, Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > As I suspected. This is a classic scenario that is occasionally seen > anywhere: "everyone is underestimating a problem until a disaster strikes". > The team's perception of Tkinter is basically: "well, there are slight > issues, and the docs are lacking, but no big deal." > Well, this _is_ a big deal. As in, "with 15+ years of experience, 5+ with > Python, I failed to produce a working GUI in a week; no-one on the Net, > regardless of experience, (including Terry) is ever sure how to do things > right; every online tutorial says: "all the industry-standard and expected > ways are broken/barred, we have to resort to ugly workarounds to accomplish > just about anything"" big deal. This is anything but normal, and all the > more shocking in Python where the opposite is the norm. 
> This is simply objectively wrong, and still rather insulting to the core developers. The real-world fact is that many people?including the authors of IDLE, which is included with Python itself?use Tkinter to develop friendly, working, GUIs. Obviously, there *is* a way to make Tkinter work. I confess I haven't worked with it for a while, and even when I had, it was fairly toy apps. I never saw any terrible problems, but I confess I also never pushed the edges of it. It's quite possible, even likely, that some sufficiently complicated GUI apps are better off eschewing Tkinter and using a different GUI library. It's also quite possible that the documentation around Tkinter could be improved to convey more accurate messaging around this (and to convey the common pattern of "GUI in one thread, workers in other threads." > And now, a disaster striked. Not knowing this, I've relied on Tkinter with > very much at stake (my income for the two following months, basically), and > lost. If that's not a testament just how much damage Tkinter's current > state actually does, I dunno what is. > I've sunk two months each into trying to wrestle quite a large number of frameworks or libraries to do what I want. Sometimes I finally made it work, other times not. That's the reality of software development. Sometimes the problems were bugs per se, other times limits of my understanding. Often the problems were with extremely widely used and "solid" libraries (not just in Python, across numerous languages). There are a few recurring posters here and on python-ideas of whom I roll my eyes when I see a post is from them... I think most actual core contributors simply have them on auto-delete filters by now. I don't know where the threshold is exactly, but I suspect you're getting close to that with this post. Yours, David... -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Sat May 5 21:09:21 2018 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 6 May 2018 11:09:21 +1000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> References: <20180502232822.01778283@fsol> <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> Message-ID: On Sun, May 6, 2018 at 10:22 AM, Ivan Pozdeev via Python-Dev wrote: > Well, this _is_ a big deal. As in, "with 15+ years of experience, 5+ with > Python, I failed to produce a working GUI in a week; no-one on the Net, > regardless of experience, (including Terry) is ever sure how to do things > right; every online tutorial says: "all the industry-standard and expected > ways are broken/barred, we have to resort to ugly workarounds to accomplish > just about anything"" big deal. This is anything but normal, and all the > more shocking in Python where the opposite is the norm. > > And now, a disaster striked. Not knowing this, I've relied on Tkinter with > very much at stake (my income for the two following months, basically), and > lost. 
If that's not a testament just how much damage Tkinter's current state > actually does, I dunno what is. What exactly didn't work? I don't understand. What online tutorials are telling you that everything is broken, and how can you lose two months' income because things are exactly as broken as everything tells you? As far as I can tell, you ran into problems when you tried to put GUI operations onto a thread other than the main thread. Okay, so maybe that's a limitation that bit you, but I can't accept that this is "industry-standard". In fact, I would be much more inclined to say that the industry standard is single-threaded code, given how terrified a lot of people are of concurrency in general. (Not everyone, but a lot of people.) As long as your GUI operations happen on your main thread, you should be fine. I told my brother about that consideration recently, and the solution was simple: instead of doing socket operations on the main thread and GUI operations on a secondary thread, just switch them around. It's really that simple. So what actually went wrong when you tried, and how did you manage to get so far into it before disaster that you lost two months' income? Tkinter is an important part of Python's ecosystem. It isn't the greatest GUI toolkit (and it isn't trying to be), and it isn't stopping people from using GTK or Qt or wxWindows, but Tk and Tkinter can be depended on much more easily, since that's part of the standard library. If they were "broken" or "unusable", people would have figured that out by now. so that part is very definitely exaggeration. So if your words are not just empty hyperbole, be specific: what is broken? If I'm considering using Tkinter for something, what *precisely* do I need to be aware of? Is it just using Tkinter from threads other than the main thread? ChrisA From steve at pearwood.info Sat May 5 21:39:51 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sun, 6 May 2018 11:39:51 +1000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> Message-ID: <20180506013950.GO9562@ando.pearwood.info> On Sun, May 06, 2018 at 11:09:21AM +1000, Chris Angelico wrote: > What exactly didn't work? I don't understand. https://bugs.python.org/issue33412 -- Steve From rosuav at gmail.com Sat May 5 21:45:38 2018 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 6 May 2018 11:45:38 +1000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: <20180506013950.GO9562@ando.pearwood.info> References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> <20180506013950.GO9562@ando.pearwood.info> Message-ID: On Sun, May 6, 2018 at 11:39 AM, Steven D'Aprano wrote: > On Sun, May 06, 2018 at 11:09:21AM +1000, Chris Angelico wrote: > >> What exactly didn't work? I don't understand. > > https://bugs.python.org/issue33412 > I've read it and I still don't fully understand the problem. Is it ALL of Tkinter that fails in threaded mode? Is it just certain specific calls? Are there calls that fail in single-threaded programs? If the given test-case is the only thing that fails, it's a horrific exaggeration to say that Tkinter is broken. But I don't know how far it can be generalized. 
ChrisA From ncoghlan at gmail.com Sun May 6 03:35:54 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 6 May 2018 17:35:54 +1000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AED7166.1010008@UGent.be> References: <5AED7166.1010008@UGent.be> Message-ID: On 5 May 2018 at 18:55, Jeroen Demeyer wrote: > Hello all, > > I have updated PEP 575 in response to some posts on this mailing list and > to some discussions in person with the core Cython developers. > See https://www.python.org/dev/peps/pep-0575/ > > The main differences with respect to the previous version are: > > * "builtin_function" was renamed to "cfunction". Since we are changing the > name anyway, "cfunction" looked like a better choice because the word > "built-in" typically refers to things from the builtins module. > > * defined_function now only defines an API (it must support all attributes > that a Python function has) without specifying the implementation. > > * The "Two-phase Implementation" proposal for better backwards > compatibility has been expanded and now offers 100% backwards compatibility > for the classes and for the inspect functions. > Thanks for this update Jeroen! If it doesn't come up otherwise, I'll try to claim one of the lightning talk slots at the Language Summit to discuss this with folks in person :) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Sun May 6 04:54:39 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 6 May 2018 04:54:39 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> <20180506013950.GO9562@ando.pearwood.info> Message-ID: On 5/5/2018 9:45 PM, Chris Angelico wrote: > On Sun, May 6, 2018 at 11:39 AM, Steven D'Aprano wrote: >> On Sun, May 06, 2018 at 11:09:21AM +1000, Chris Angelico wrote: >> >>> What exactly didn't work? I don't understand. >> >> https://bugs.python.org/issue33412 > I've read it and I still don't fully understand the problem. Chris, this is an excellent series of questions. I think I have enough knowledge to answer somewhat adequately. This is partly thanks to things people like you and Steven have posted on python-list about threads, deadlocks, and volatile conditions, and partly thanks to Ivan's contribution on the thread above and https://bugs.python.org/issue33257. > Is it ALL of Tkinter that fails in threaded mode? No. It is non-threaded tcl that fails in threaded mode, along with tkinter's attempt to make non-thread tcl work anyway. There are at least two different cases. Ivan has clarified the following. 1. Tcl has a 'threads' compile switch. 2. The default changed from 'off' for 8.5 and before to 'on' for 8.6. 3. When compiled with thread support, the resulting library file has t suffix. 4. The Windows installer for 2.7 installs tcl85.dll while at least 3.6 and later install tcl86t.dll. Hence things work that did not work before. 5. Changing 2.7 to tcl85t.dll and tk85t.dll could break third-party code that interfaces to the .dlls. _tkinter is written with #ifdefs to accommodate thread or no thread compiles, but any code written just for the no-t version could fail with the t version. > Is it just certain specific calls?
33257 is about calling widget modification methods from threads. The _tkinter code to make this work with non-t tcl fails haphazardly in maybe 1 in 10000 calls. Ivan signed the CA and submitted a PR which he claims fixes the Python and Tcl locking. Serhiy has not reviewed this yet. 33412 is about calling event_generate from threads. For non-t tcl and more than one thread making such calls, the example fails immediately. With thread tcl, the same example ran until it deadlocked during shutdown cleanup. I found one way to almost certainly avoid it. Ivan found a better way, which I am thinking about making part of a new doc section on tkinter and threads. > Are there calls that fail in single-threaded programs? This is a different issue. I don't know of much of anything except a few things on MacOS, due to tcl/tk bugs or Apple changing the graphics system. > If the given test-case is the only thing that fails, Neither fails with properly written code when using the thread build of tcl. > it's a horrific exaggeration to say that Tkinter is broken. Yep. But I don't care any more what Ivan writes here. I am sorry he lost a job bid or whatever. I appreciate what he has contributed on the tracker, in his more rational mode. -- Terry Jan Reedy From stefan at bytereef.org Sun May 6 08:51:45 2018 From: stefan at bytereef.org (Stefan Krah) Date: Sun, 6 May 2018 14:51:45 +0200 Subject: [Python-Dev] Use a queue in Tkinter (was: Dealing with tone in an email) Message-ID: <20180506125145.GA2777@bytereef.org> Steven D'Aprano wrote: >> What exactly didn't work? I don't understand. > > https://bugs.python.org/issue33412 Isn't the standard solution to use a queue for updating the GUI? At least I didn't have any problems at all with my one TKinter app, I think the method is described in the Python Cookbook. Stefan Krah From rosuav at gmail.com Sun May 6 10:03:27 2018 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 7 May 2018 00:03:27 +1000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> <20180506013950.GO9562@ando.pearwood.info> Message-ID: On Sun, May 6, 2018 at 6:54 PM, Terry Reedy wrote: >> Is it ALL of Tkinter that fails in threaded mode? > > No. It is non-threaded tcl that fails in threaded mode, along with > tkinter's attempt to make non-thread tcl work anyway. There are at least > two different cases. > > Ivan has clarified the following. > 1. Tcl has a 'threads' compile switch. > 2. The default changed from 'off' for 8.5 and before to 'on' for 8.6. > 3. When compiled with thread support, the resulting library file has t > suffix. Okay, that makes a HUGE difference. Thank you for clarifying. So, in theory, threads SHOULD be supported, which means that bug reports of the nature of "this fails in threaded mode" are 100% valid. If it were up to me, I would deprecate non-threaded mode immediately, with a view to just paying whatever price threaded mode incurs (presumably performance) starting in Python 3.9 or thereabouts. Supporting threads is a Good Thing. Thank you for that explanation. 
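For anyone who wants to check which flavour of Tcl a given Python is actually linked against, a minimal sketch (purely illustrative, not taken from either issue) might look like this; it assumes a working tkinter and relies on Tcl's tcl_platform array, whose "threaded" element is only set on thread-enabled builds:

import tkinter
from tkinter import TclError

# Create a bare Tcl interpreter (no Tk window needed) and ask it how the
# Tcl library it loaded was built.
interp = tkinter.Tcl()
print("Tcl version:", interp.eval("info patchlevel"))
try:
    threaded = interp.eval("set tcl_platform(threaded)") == "1"
except TclError:
    # Older non-threaded builds may not define the element at all.
    threaded = False
print("thread-enabled Tcl:", threaded)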
ChrisA From general at vultaire.net Sun May 6 04:05:50 2018 From: general at vultaire.net (Paul Goins) Date: Sun, 6 May 2018 01:05:50 -0700 Subject: [Python-Dev] Windows 10 build agent failures Message-ID: Hi, Just kind of "looking around" at stuff I can help with, and I noticed a few days ago that Windows 10 AMD64 builds of Python 3.6/3.7/3.x are generally failing. It seems like the failures started April 16th around 1am per BuildBot and went from consistently passing to consistently failing. The errors appear to be timeouts; the test runs are going over 15 minutes. I can run the tests locally and things seem fine. The one thing which seems to have changed is, under the "10 slowest tests" header, test_io has shot up in terms of time to complete: 3.6: 2 min 13 sec to 9 min 25 sec 3.7: 2 min 10 sec to 9 min 26 sec 3.x: 3 min 39 sec to 9 min 17 sec Locally this test suite runs in around 36 seconds. I see no real change between running one of the last "good" changesets versus the current head of master. I'm suspecting an issue on the build agent perhaps? Thoughts? Best Regards, Paul Goins -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Sun May 6 13:45:48 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 6 May 2018 13:45:48 -0400 Subject: [Python-Dev] Use a queue in Tkinter (was: Dealing with tone in an email) In-Reply-To: <20180506125145.GA2777@bytereef.org> References: <20180506125145.GA2777@bytereef.org> Message-ID: On 5/6/2018 8:51 AM, Stefan Krah wrote: > > Steven D'Aprano wrote: >>> What exactly didn't work? I don't understand. >> >> https://bugs.python.org/issue33412 > > Isn't the standard solution to use a queue for updating the GUI? > > At least I didn't have any problems at all with my one TKinter app, I > think the method is described in the Python Cookbook. The 2nd edition, Recipe 11.9 Combining GUIs and Asynchronous I/O with Threads gives an example that processes items in the queue 5 times a second. I can understand the desire to avoid the put and get overhead and the poll wait by calling a gui method directly. This works when one uses tcl/tk compiled with thread support and avoids inter-thread deadlocks. See my answer to Chris A. for details. -- Terry Jan Reedy From zachary.ware+pydev at gmail.com Sun May 6 14:35:33 2018 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Sun, 6 May 2018 13:35:33 -0500 Subject: [Python-Dev] Windows 10 build agent failures In-Reply-To: References: Message-ID: On Sun, May 6, 2018 at 3:05 AM, Paul Goins wrote: > Hi, > > Just kind of "looking around" at stuff I can help with, and I noticed a few > days ago that Windows 10 AMD64 builds of Python 3.6/3.7/3.x are generally > failing. > > It seems like the failures started April 16th around 1am per BuildBot and > went from consistently passing to consistently failing. The errors appear > to be timeouts; the test runs are going over 15 minutes. > > I can run the tests locally and things seem fine. > > The one thing which seems to have changed is, under the "10 slowest tests" > header, test_io has shot up in terms of time to complete: > > 3.6: 2 min 13 sec to 9 min 25 sec > 3.7: 2 min 10 sec to 9 min 26 sec > 3.x: 3 min 39 sec to 9 min 17 sec > > Locally this test suite runs in around 36 seconds. I see no real change > between running one of the last "good" changesets versus the current head of > master. I'm suspecting an issue on the build agent perhaps? Thoughts? We have an open issue for this at https://bugs.python.org/issue33355. 
-- Zach From tjreedy at udel.edu Sun May 6 16:04:45 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 6 May 2018 16:04:45 -0400 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> <20180506013950.GO9562@ando.pearwood.info> Message-ID: On 5/6/2018 10:03 AM, Chris Angelico wrote: > On Sun, May 6, 2018 at 6:54 PM, Terry Reedy wrote: >>> Is it ALL of Tkinter that fails in threaded mode? >> >> No. It is non-threaded tcl that fails in threaded mode, along with >> tkinter's attempt to make non-thread tcl work anyway. There are at least >> two different cases. >> >> Ivan has clarified the following. >> 1. Tcl has a 'threads' compile switch. >> 2. The default changed from 'off' for 8.5 and before to 'on' for 8.6. >> 3. When compiled with thread support, the resulting library file has t >> suffix. > > Okay, that makes a HUGE difference. Thank you for clarifying. So, in > theory, threads SHOULD be supported, which means that bug reports of > the nature of "this fails in threaded mode" are 100% valid. If the reported code does not have bugs, including thread deadlock and shutdown bugs, which are non-trivial to avoid, yes. I dug up more information. Though he did not say so, Ivan's first issue, https://bugs.python.org/issue33257, reopens and continues https://bugs.python.org/issue11077, using a slightly modified version of the original ballistic launch example. Ivan added discussion of the tcl thread-support compile switch, which was not mentioned in the original. Ditto for a patch. The originator of the original, Scott M., taught new programmers how to display data from multiple sources, some blocking. It seemed most natural to him and students to use threads to access sources and use the documented widget.method(*args) APIs to display data, rather than have to invent a protocol to pass such calls through a queue. See the initial message. In that older issue, I mentioned, as I did in response to Ivan, that 'thread' does not appear in the tkinter docs. Martin Löwis replied "My claim is that Tkinter is thread-safe as it stands. A lot of thought has been put into making Tkinter thread-safe, so if there is any claim to the contrary, we would need more details: what exact Python version is being used, what exact operating system is being used, what exact code is run, and what exact output is produced." and later said "It's supported on Unix since 1.5.1, and on Windows since 2.3." > If it were up to me, I would deprecate non-threaded mode immediately, Given that 99% of tkinter users do not need threaded tcl, why cut some of them off? When tkinter is imported and a root is created, tkinter cannot know whether the user is going to later make failing calls from threads. Tkinter has traditionally been slow to remove support of old versions; it still supports 8.4. It will eventually become a moot point, at least on Windows, as current Windows installers install threaded tcl. I presume the same is true for the new Mac installers. I have no idea what people have on Linux.
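For the 99% case, the queue-based approach mentioned in the other thread is usually enough: worker threads never touch tk at all, and the main thread polls a queue.Queue with after(). A minimal sketch, with all names and the 200 ms interval invented purely for illustration:

import queue
import threading
import time
import tkinter as tk

def worker(outbox):
    # Pretend to be a slow, blocking data source (e.g. a socket reader).
    for i in range(5):
        time.sleep(1)
        outbox.put("result %d" % i)

def poll(root, label, outbox):
    # Runs only in the main thread: drain the queue, update the GUI.
    try:
        while True:
            label["text"] = outbox.get_nowait()
    except queue.Empty:
        pass
    root.after(200, poll, root, label, outbox)  # ~5 polls per second

if __name__ == "__main__":
    root = tk.Tk()
    label = tk.Label(root, text="waiting...")
    label.pack()
    outbox = queue.Queue()
    threading.Thread(target=worker, args=(outbox,), daemon=True).start()
    poll(root, label, outbox)
    root.mainloop()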
-- Terry Jan Reedy From rosuav at gmail.com Sun May 6 16:10:57 2018 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 7 May 2018 06:10:57 +1000 Subject: [Python-Dev] Dealing with tone in an email In-Reply-To: References: <20180502233705.7249da5f@fsol> <20180503000153.6f8d3b33@fsol> <20180503022619.GB9562@ando.pearwood.info> <20180504154243.GA9539@ando.pearwood.info> <93bae6b3-f5f5-ab7d-a375-a6b1e8971d4b@mail.mipt.ru> <20180506013950.GO9562@ando.pearwood.info> Message-ID: On Mon, May 7, 2018 at 6:04 AM, Terry Reedy wrote: > On 5/6/2018 10:03 AM, Chris Angelico wrote: >> If it were up to me, I would deprecate non-threaded mode immediately, > > Given that 99% of tkinter users do not need threaded tcl, why cut some of > them off? "Non-threaded" really just means "non-thread-safe". There's nothing wrong with using thread-safe APIs when you're using only a single thread, other than the performance overhead. Is that significant enough to require the distinction? > When tkinter is imported and a root is created, tkinter cannot know > whether the user is going to later make failing calls from threads. Tkinter > has traditionally been slow to remove support of old versions; it still > supports 8.4. It will eventually become a moot point, at least on Windows, > as current Windows installers install threaded tcl. I presume the same is > true for the new Mac installers. I have no idea what people have on Linux. That's what I'm hoping for, yes. Eventually threaded will be the only way to do things. ChrisA From vano at mail.mipt.ru Sun May 6 16:33:04 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Sun, 6 May 2018 23:33:04 +0300 Subject: [Python-Dev] Windows 10 build agent failures In-Reply-To: References: Message-ID: <45aed143-9882-fc15-a5ce-187bbcd8be59@mail.mipt.ru> For me, Tcl/Tk failed to build with SDK 10.0.16299.0, and I had to explicitly fall back to 10.0.15063.0 (https://stackoverflow.com/questions/48559337/error-when-building-tcltk-in-visual-studio-2017). May be related if VS was (auto)updated on the builders. On 06.05.2018 11:05, Paul Goins wrote: > Hi, > > Just kind of "looking around" at stuff I can help with, and I noticed > a few days ago that Windows 10 AMD64 builds of Python 3.6/3.7/3.x are > generally failing. > > It seems like the failures started April 16th around 1am per BuildBot > and went from consistently passing to consistently failing. The > errors appear to be timeouts; the test runs are going over 15 minutes. > > I can run the tests locally and things seem fine. > > The one thing which seems to have changed is, under the "10 slowest > tests" header, test_io has shot up in terms of time to complete: > > 3.6: 2 min 13 sec to 9 min 25 sec > 3.7: 2 min 10 sec to 9 min 26 sec > 3.x: 3 min 39 sec to 9 min 17 sec > > Locally this test suite runs in around 36 seconds. I see no real > change between running one of the last "good" changesets versus the > current head of master. I'm suspecting an issue on the build agent > perhaps? Thoughts? > > Best Regards, > Paul Goins > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vultairejp at gmail.com Sun May 6 17:14:36 2018 From: vultairejp at gmail.com (Paul Goins) Date: Sun, 06 May 2018 21:14:36 +0000 Subject: [Python-Dev] Windows 10 build agent failures In-Reply-To: References: Message-ID: Thanks for the heads-up; I skimmed python-dev but did not check bpo. - Paul On Sun, May 6, 2018, 11:35 AM Zachary Ware wrote: > On Sun, May 6, 2018 at 3:05 AM, Paul Goins wrote: > > Hi, > > > > Just kind of "looking around" at stuff I can help with, and I noticed a > few > > days ago that Windows 10 AMD64 builds of Python 3.6/3.7/3.x are generally > > failing. > > > > It seems like the failures started April 16th around 1am per BuildBot and > > went from consistently passing to consistently failing. The errors > appear > > to be timeouts; the test runs are going over 15 minutes. > > > > I can run the tests locally and things seem fine. > > > > The one thing which seems to have changed is, under the "10 slowest > tests" > > header, test_io has shot up in terms of time to complete: > > > > 3.6: 2 min 13 sec to 9 min 25 sec > > 3.7: 2 min 10 sec to 9 min 26 sec > > 3.x: 3 min 39 sec to 9 min 17 sec > > > > Locally this test suite runs in around 36 seconds. I see no real change > > between running one of the last "good" changesets versus the current > head of > > master. I'm suspecting an issue on the build agent perhaps? Thoughts? > > We have an open issue for this at https://bugs.python.org/issue33355. > > -- > Zach > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/general%40vultaire.net > -------------- next part -------------- An HTML attachment was scrubbed... URL: From drsalists at gmail.com Sun May 6 21:30:23 2018 From: drsalists at gmail.com (Dan Stromberg) Date: Sun, 6 May 2018 18:30:23 -0700 Subject: [Python-Dev] Slow down... Message-ID: When I think of why Python is so far ahead of Perl in language design, I think it's simply that Python is the result of cautious design, and Perl is the result of exuberant design. I think Python is in danger of becoming a large language - which isn't a good thing. A great language, like Scheme or Python or even C, is a language that has a small, very useful core, and large _libraries_. I'd very much like a live in a world where Jython and IronPython and MicroPython and Cython and Pyjamas can all catch up and implement Python 3.7, 3.8, and so forth. From ncoghlan at gmail.com Sun May 6 22:25:46 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 7 May 2018 12:25:46 +1000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On 7 May 2018 at 11:30, Dan Stromberg wrote: > I'd very much like a live in a world where Jython and IronPython and > MicroPython and Cython and Pyjamas can all catch up and implement > Python 3.7, 3.8, and so forth. > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 language moratorium could be a very good idea. Between matrix multiplication, enhanced tuple unpacking, native coroutines, f-strings, and type hinting for variable assignments, we've had quite a bit of syntactic churn in the past few releases, and the rest of the ecosystem really hasn't caught up on it all yet (and that's not just other implementations - it's training material, online courses, etc, etc). 
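For concreteness, a rough sketch of the additions in question; every name below is invented for illustration, and it assumes Python 3.6 or later:

import asyncio

class Vec:
    """Toy vector type, just to give the @ operator something to act on."""
    def __init__(self, *xs):
        self.xs = xs
    def __matmul__(self, other):        # matrix multiplication operator (3.5)
        return sum(a * b for a, b in zip(self.xs, other.xs))

async def ping():                       # native coroutines (3.5)
    await asyncio.sleep(0)
    return "pong"

first, *rest = [1, 2, 3, 4]             # unpacking in assignment targets
merged = [*rest, 5]                     # additional unpacking generalizations (3.5)
total: int = Vec(1, 2) @ Vec(3, 4)      # variable annotations (3.6)
print(f"first={first}, merged={merged}, total={total}")  # f-strings (3.6)
print(asyncio.get_event_loop().run_until_complete(ping()))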
If we're going to take such a step, now's also the time to do it, since 3.8 feature development is only just getting under way, and if we did decide to repeat the language moratorium, we could co-announce it with the Python 3.7 release. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Mon May 7 03:29:18 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 7 May 2018 09:29:18 +0200 Subject: [Python-Dev] Slow down... References: Message-ID: <20180507092918.540c9042@fsol> On Mon, 7 May 2018 12:25:46 +1000 Nick Coghlan wrote: > On 7 May 2018 at 11:30, Dan Stromberg wrote: > > > I'd very much like a live in a world where Jython and IronPython and > > MicroPython and Cython and Pyjamas can all catch up and implement > > Python 3.7, 3.8, and so forth. > > > > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 > language moratorium could be a very good idea. Between matrix > multiplication, enhanced tuple unpacking, native coroutines, f-strings, and > type hinting for variable assignments, we've had quite a bit of syntactic > churn in the past few releases, and the rest of the ecosystem really hasn't > caught up on it all yet (and that's not just other implementations - it's > training material, online courses, etc, etc). Not to mention people themselves. Regards Antoine. From levkivskyi at gmail.com Mon May 7 05:17:30 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Mon, 7 May 2018 10:17:30 +0100 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On 7 May 2018 at 03:25, Nick Coghlan wrote: > On 7 May 2018 at 11:30, Dan Stromberg wrote: > >> I'd very much like a live in a world where Jython and IronPython and >> MicroPython and Cython and Pyjamas can all catch up and implement >> Python 3.7, 3.8, and so forth. >> > > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 > language moratorium could be a very good idea. Between matrix > multiplication, enhanced tuple unpacking, native coroutines, f-strings, and > type hinting for variable assignments, we've had quite a bit of syntactic > churn in the past few releases, and the rest of the ecosystem really hasn't > caught up on it all yet (and that's not just other implementations - it's > training material, online courses, etc, etc). > > If we're going to take such a step, now's also the time to do it, since > 3.8 feature development is only just getting under way, and if we did > decide to repeat the language moratorium, we could co-announce it with the > Python 3.7 release. > > These are all god points. I think it will be a good idea to take a little pause with syntactic additions and other "cognitively loaded" changes. On the other hand, I think it is fine to work on performance improvements (start-up time, import system etc.), internal APIs (like simplifying start-up sequence and maybe even C API), and polishing corner cases/simplifying existing constructs (like scoping in comprehensions that many people find confusing). IOW, I think the PEP should describe precisely what is OK, and what is not OK during the moratorium. -- Ivan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From solipsis at pitrou.net Mon May 7 06:59:10 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 7 May 2018 12:59:10 +0200 Subject: [Python-Dev] static linking Python References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> Message-ID: <20180507125910.0ac4ae75@fsol> On Fri, 04 May 2018 00:21:54 +0000 Ray Donnelly wrote: > > Now that Python 3.7 is around the corner we have a chance to re-evaluate > this decision. We have received no binary compat. bugs whatsoever due to > this change (we got a few bugs where people used python-config incorrectly > either directly or via swig or CMake), were we just lucky? As a sidenote, it seems there may be issues when static linking against Python to embed it: https://bugs.python.org/issue33438 Regards Antoine. From ncoghlan at gmail.com Mon May 7 07:19:48 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 7 May 2018 21:19:48 +1000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On 7 May 2018 at 19:17, Ivan Levkivskyi wrote: > These are all god points. I think it will be a good idea to take a little > pause with syntactic additions and other "cognitively loaded" changes. On > the other hand, I think it is fine to work on performance improvements > (start-up time, import system etc.), internal APIs (like simplifying > start-up sequence and maybe even C API), and polishing corner > cases/simplifying existing constructs (like scoping in comprehensions that > many people find confusing). > > IOW, I think the PEP should describe precisely what is OK, and what is not > OK during the moratorium. > Aye, for folks that haven't read it before, https://www.python.org/dev/peps/pep-3003/#details is worth a look as to the specifics of what a language moratorium entails. While only 3.2 was specifically covered by the moratorium, from 3.2 through to 3.4, the only language level changes we made were to allow "yield from x" as a coroutine-friendly alternative to "for x in iterable: yield x", and to reinstate support for the "u" prefix on strings. The main builtin changes in that period were to rework the exception hierarchy to make it more descriptive of what *caused* an error, rather than where the error was raised, and those were deliberately designed to be almost entirely backwards compatible. Since that last period of calm, 3.5 and 3.6 both introduced a broad selection of syntax changes, and even 3.7 has introduced a new __future__ statement to change the way annotations are handled. And as the current python-ideas discussion about accessing paths relative to __file__ shows, even core devs (i.e. me) are still digesting the full implications that pathlib and the os.fspath protocol may have for the recommendations that we should be giving to new (and existing!) developers that don't need to worry about Python 2.7 compatibility. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Mon May 7 09:20:10 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 7 May 2018 16:20:10 +0300 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: 07.05.18 14:19, Nick Coghlan ????: > And as the current python-ideas discussion about accessing paths > relative to __file__ shows, I can't believe this is discussed seriously. Forgot about the Python 2 legacy, just use importlib. From jsbueno at python.org.br Mon May 7 09:50:49 2018 From: jsbueno at python.org.br (Joao S. O. 
Bueno) Date: Mon, 7 May 2018 10:50:49 -0300 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: Maybe it is important to note that Python 3.7 already has very few syntactic changes. Actually, there are no new syntax changes, with PEP 563 (Postponed Evaluation of Annotations) being maybe the only change to existing behavior, and PEP 562 as a new "non-library-dependent" feature, even though no new syntax is added. (contextvars - PEP 567 - is a big change, but is constrained to a new stdlib module). On the other hand, one of the hottest topics being discussed now - PEP 572 - would be quite a change that would violate such a moratorium. What would happen if we'd take such a moratorium now? Have Python with no real syntactic changes since f-strings up to version 3.9 (~4.5 years after 3.6)? Allow just whatever is already being discussed - introducing a major change (local assignments) in 3.8, but just bar others because people did not think of them before this moratorium was proposed? Maybe simply there is just no need for such a moratorium mechanism, and the natural barriers for new features can work to keep things at a manageable pace, as we can see by the release getting ready right now. best regards, js -><- On 7 May 2018 at 10:20, Serhiy Storchaka wrote: > 07.05.18 14:19, Nick Coghlan пише: >> >> And as the current python-ideas discussion about accessing paths relative >> to __file__ shows, > > > I can't believe this is discussed seriously. Forgot about the Python 2 > legacy, just use importlib. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/jsbueno%40python.org.br From storchaka at gmail.com Mon May 7 10:19:28 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 7 May 2018 17:19:28 +0300 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: 07.05.18 16:50, Joao S. O. Bueno пише: > Maybe it is important to note that > Python 3.7 already has very few syntactic changes. > Actually, there are no new syntax changes, with PEP 563 > (Postponed Evaluation of Annotations) being maybe > the only change to existing behavior, and PEP 562 as a new > "non-library-dependent" feature, even though no new syntax is > added. (contextvars - PEP 567 - is a big change, but is constrained to > a new stdlib module). It looks to me that Python 3.7 may cause the largest breakage since 3.2 because of "async". It is a pretty popular name for arguments, attributes and methods. Many third-party projects needed to fix this now (despite it being deprecated since 3.5). Many third-party projects depend on libraries which needed the "async" fix for 3.7. But in general this breakage is very small. It just looks larger than in previous versions. From ericsnowcurrently at gmail.com Mon May 7 10:59:14 2018 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Mon, 7 May 2018 08:59:14 -0600 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On Sun, May 6, 2018 at 8:25 PM, Nick Coghlan wrote: > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 > language moratorium could be a very good idea. Note that the PEP specifically applies to "syntax, semantics, and built-ins".
Here's the full abstract [1]: This PEP proposes a temporary moratorium (suspension) of all changes to the Python language syntax, semantics, and built-ins for a period of at least two years from the release of Python 3.1. In particular, the moratorium would include Python 3.2 (to be released 18-24 months after 3.1) but allow Python 3.3 (assuming it is not released prematurely) to once again include language changes. This suspension of features is designed to allow non-CPython implementations to "catch up" to the core implementation of the language, help ease adoption of Python 3.x, and provide a more stable base for the community. -eric [1] https://www.python.org/dev/peps/pep-3003/#abstract From jmcs at jsantos.eu Mon May 7 05:24:15 2018 From: jmcs at jsantos.eu (=?UTF-8?B?Sm/Do28gU2FudG9z?=) Date: Mon, 07 May 2018 09:24:15 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: Hi, I would like to see this go even further and have a tick-tock approach to python versions, i.e. adopt new syntax and other large changes on one version (for example odd versions) and polish everything up in the next (even versions). Best regards, Jo?o Santos On Mon, 7 May 2018 at 11:19 Ivan Levkivskyi wrote: > On 7 May 2018 at 03:25, Nick Coghlan wrote: > >> On 7 May 2018 at 11:30, Dan Stromberg wrote: >> >>> I'd very much like a live in a world where Jython and IronPython and >>> MicroPython and Cython and Pyjamas can all catch up and implement >>> Python 3.7, 3.8, and so forth. >>> >> >> I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 >> language moratorium could be a very good idea. Between matrix >> multiplication, enhanced tuple unpacking, native coroutines, f-strings, and >> type hinting for variable assignments, we've had quite a bit of syntactic >> churn in the past few releases, and the rest of the ecosystem really hasn't >> caught up on it all yet (and that's not just other implementations - it's >> training material, online courses, etc, etc). >> >> If we're going to take such a step, now's also the time to do it, since >> 3.8 feature development is only just getting under way, and if we did >> decide to repeat the language moratorium, we could co-announce it with the >> Python 3.7 release. >> >> > These are all god points. I think it will be a good idea to take a little > pause with syntactic additions and other "cognitively loaded" changes. On > the other hand, I think it is fine to work on performance improvements > (start-up time, import system etc.), internal APIs (like simplifying > start-up sequence and maybe even C API), and polishing corner > cases/simplifying existing constructs (like scoping in comprehensions that > many people find confusing). > > IOW, I think the PEP should describe precisely what is OK, and what is not > OK during the moratorium. > > -- > Ivan > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/jmcs%40jsantos.eu > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nas-python at arctrix.com Mon May 7 12:28:46 2018 From: nas-python at arctrix.com (Neil Schemenauer) Date: Mon, 7 May 2018 10:28:46 -0600 Subject: [Python-Dev] Python startup time In-Reply-To: <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> Message-ID: <20180507162846.oxkkag27zgoah4wb@python.ca> On 2018-05-03, Lukasz Langa wrote: > > On May 2, 2018, at 8:57 PM, INADA Naoki wrote: > > * Add lazy compiling API or flag in `re` module. The pattern is compiled > > when first used. > > How about go the other way and allow compiling at Python > *compile*-time? That would actually make things faster instead of > just moving the time spent around. Lisp has a special form 'eval-when'. It can be used to cause evaluation of the body expression at compile time. In Carl's "A fast startup patch" post, he talks about getting rid of the unmarshal step and storing objects in the heap segment of the executable. Those would be the objects necessary to evaluate code. The marshal module has a limited number of types that it handle. I believe they are: bool, bytes, code objects, complex, Ellipsis float, frozenset, int, None, tuple and str. If the same mechanism could handle more types, rather than storing the code to be evaluated, we could store the objects created after evaluation of the top-level module body. Or, have a mechanism to mark which code should be evaluated at compile time (much like the eval-when form). For the re.compile example, the compiled regex could be what is stored after compiling the Python module (i.e. the re.compile gets run at compile time). The objects created by re.compile (e.g. SRE_Pattern) would have to be something that the heap dumper could handle. Traditionally, Python has had the model "there is only runtime". So, starting to do things at compile time complicates that model. Regards, Neil From brett at python.org Mon May 7 12:32:48 2018 From: brett at python.org (Brett Cannon) Date: Mon, 07 May 2018 16:32:48 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On Mon, 7 May 2018 at 08:18 Jo?o Santos wrote: > Hi, > > I would like to see this go even further and have a tick-tock approach to > python versions, i.e. adopt new syntax and other large changes on one > version (for example odd versions) and polish everything up in the next > (even versions). > That's basically an LTS release cycle and discussions of changing how we do releases is a massive discussion that ultimately goes nowhere, so I would advise we stick to the discussion on a moratorium before trying to change how we release Python. :) -Brett > > Best regards, > Jo?o Santos > > On Mon, 7 May 2018 at 11:19 Ivan Levkivskyi wrote: > >> On 7 May 2018 at 03:25, Nick Coghlan wrote: >> >>> On 7 May 2018 at 11:30, Dan Stromberg wrote: >>> >>>> I'd very much like a live in a world where Jython and IronPython and >>>> MicroPython and Cython and Pyjamas can all catch up and implement >>>> Python 3.7, 3.8, and so forth. >>>> >>> >>> I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP >>> 3003 language moratorium could be a very good idea. 
Between matrix >>> multiplication, enhanced tuple unpacking, native coroutines, f-strings, and >>> type hinting for variable assignments, we've had quite a bit of syntactic >>> churn in the past few releases, and the rest of the ecosystem really hasn't >>> caught up on it all yet (and that's not just other implementations - it's >>> training material, online courses, etc, etc). >>> >>> If we're going to take such a step, now's also the time to do it, since >>> 3.8 feature development is only just getting under way, and if we did >>> decide to repeat the language moratorium, we could co-announce it with the >>> Python 3.7 release. >>> >>> >> These are all god points. I think it will be a good idea to take a little >> pause with syntactic additions and other "cognitively loaded" changes. On >> the other hand, I think it is fine to work on performance improvements >> (start-up time, import system etc.), internal APIs (like simplifying >> start-up sequence and maybe even C API), and polishing corner >> cases/simplifying existing constructs (like scoping in comprehensions that >> many people find confusing). >> >> IOW, I think the PEP should describe precisely what is OK, and what is >> not OK during the moratorium. >> >> -- >> Ivan >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/jmcs%40jsantos.eu >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From levkivskyi at gmail.com Mon May 7 12:55:54 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Mon, 7 May 2018 17:55:54 +0100 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On 7 May 2018 at 17:32, Brett Cannon wrote: > > > On Mon, 7 May 2018 at 08:18 Jo?o Santos wrote: > >> Hi, >> >> I would like to see this go even further and have a tick-tock approach to >> python versions, i.e. adopt new syntax and other large changes on one >> version (for example odd versions) and polish everything up in the next >> (even versions). >> > > [...], so I would advise we stick to the discussion on a moratorium [...] > > Btw the upcoming Language Summit may be a good opportunity for such discussion. -- Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Mon May 7 12:57:46 2018 From: brett at python.org (Brett Cannon) Date: Mon, 07 May 2018 16:57:46 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On Mon, 7 May 2018 at 09:55 Ivan Levkivskyi wrote: > On 7 May 2018 at 17:32, Brett Cannon wrote: > >> On Mon, 7 May 2018 at 08:18 Jo?o Santos wrote: >> >>> Hi, >>> >>> I would like to see this go even further and have a tick-tock approach >>> to python versions, i.e. adopt new syntax and other large changes on one >>> version (for example odd versions) and polish everything up in the next >>> (even versions). >>> >> >> [...], so I would advise we stick to the discussion on a moratorium [...] >> >> > Btw the upcoming Language Summit may be a good opportunity for such > discussion. 
> If it's not already on the schedule for discussion then the best you are going to get is a lightning talk to bring up the idea which will definitely not enough time ;) . Otherwise the schedule is full and locked down at this point. -------------- next part -------------- An HTML attachment was scrubbed... URL: From lukasz at langa.pl Mon May 7 13:06:07 2018 From: lukasz at langa.pl (Lukasz Langa) Date: Mon, 7 May 2018 10:06:07 -0700 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: > On May 7, 2018, at 9:57 AM, Brett Cannon wrote: > > > > On Mon, 7 May 2018 at 09:55 Ivan Levkivskyi > wrote: > On 7 May 2018 at 17:32, Brett Cannon > wrote: > On Mon, 7 May 2018 at 08:18 Jo?o Santos > wrote: > Hi, > > I would like to see this go even further and have a tick-tock approach to python versions, i.e. adopt new syntax and other large changes on one version (for example odd versions) and polish everything up in the next (even versions). > > [...], so I would advise we stick to the discussion on a moratorium [...] > > > Btw the upcoming Language Summit may be a good opportunity for such discussion. > > If it's not already on the schedule for discussion then the best you are going to get is a lightning talk to bring up the idea which will definitely not enough time ;) . Otherwise the schedule is full and locked down at this point. FWIW I'm hearing the 3.8 release manager has a short talk related to this subject! - ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From v+python at g.nevcal.com Mon May 7 13:21:13 2018 From: v+python at g.nevcal.com (Glenn Linderman) Date: Mon, 7 May 2018 10:21:13 -0700 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: <15824bb9-d765-d645-9516-201ef2c52324@g.nevcal.com> On 5/7/2018 7:59 AM, Eric Snow wrote: > On Sun, May 6, 2018 at 8:25 PM, Nick Coghlan wrote: >> I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 >> language moratorium could be a very good idea. > Note that the PEP specifically applies to "syntax, semantics, and > built-ins". Here's the full abstract [1]: > > This PEP proposes a temporary moratorium (suspension) of all changes to the > Python language syntax, semantics, and built-ins for a period of > at least two > years from the release of Python 3.1. In particular, the moratorium would > include Python 3.2 (to be released 18-24 months after 3.1) but allow Python > 3.3 (assuming it is not released prematurely) to once again include language > changes. > > This suspension of features is designed to allow non-CPython implementations > to "catch up" to the core implementation of the language, help ease adoption > of Python 3.x, and provide a more stable base for the community. > > -eric Here's my "lightning" response to a "lightning talk" about a moratorium: So if other implementations didn't catch up during the last moratorium, either the moratorium then was lifted too soon, or the other implementations don't really want to catch up, or the thought that they should catch up was deemed less important than making forward progress with the language. Have any of those opinions changed? While async is a big change that I personally haven't grasped, but which has little impact (a couple keywords) on code that doesn't use it, a new moratorium wouldn't impact it, and nothing else that is happening seems too much or too fast from my perspective. Dan's original comment about language versus library is interesting, though. 
It is probably true that one should resist adding language where library suffices, but sometimes a lack of certain expressiveness in the language causes cumbersome implementations of library routines to achieve the goal. f-strings and binding expressions are cases where (1) the functionality is certainly possible via library (2) there is a large amount of code that uses the functionality, and (3) said code is more cumbersome without the expressiveness of the newer syntax. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rodrigc at crodrigues.org Mon May 7 14:49:57 2018 From: rodrigc at crodrigues.org (Craig Rodrigues) Date: Mon, 07 May 2018 18:49:57 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On Sun, May 6, 2018 at 7:35 PM Nick Coghlan wrote: > > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 > language moratorium could be a very good idea. Between matrix > multiplication, enhanced tuple unpacking, native coroutines, f-strings, and > type hinting for variable assignments, we've had quite a bit of syntactic > churn in the past few releases, and the rest of the ecosystem really hasn't > caught up on it all yet (and that's not just other implementations - it's > training material, online courses, etc, etc). > > If we're going to take such a step, now's also the time to do it, since > 3.8 feature development is only just getting under way, and if we did > decide to repeat the language moratorium, we could co-announce it with the > Python 3.7 release. > > Would it be reasonable to request a 10 year moratorium on making changes to the core Python language, and for the next 10 years only focus on things that do not require core language changes, such as improving/bugfixing existing libraries, writing new libraries, improving tooling, improving infrastructure (PyPI), improving performance, etc., etc.? There are still many companies still stuck on Python 2, so giving 10 years of breathing room for these companies to catch up to Python 3 core language, even past 2020 would be very helpful. -- Craig -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon May 7 15:19:28 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 07 May 2018 19:19:28 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: 10 years feels like a simultaneously long and arbitrary limit. IMO a policy of "try to avoid major language features for a while" would work better. On Mon, May 7, 2018 at 2:11 PM Craig Rodrigues wrote: > > > On Sun, May 6, 2018 at 7:35 PM Nick Coghlan wrote: > >> >> I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 >> language moratorium could be a very good idea. Between matrix >> multiplication, enhanced tuple unpacking, native coroutines, f-strings, and >> type hinting for variable assignments, we've had quite a bit of syntactic >> churn in the past few releases, and the rest of the ecosystem really hasn't >> caught up on it all yet (and that's not just other implementations - it's >> training material, online courses, etc, etc). >> >> If we're going to take such a step, now's also the time to do it, since >> 3.8 feature development is only just getting under way, and if we did >> decide to repeat the language moratorium, we could co-announce it with the >> Python 3.7 release. 
>> >> > Would it be reasonable to request a 10 year moratorium on making changes > to the core Python language, > and for the next 10 years only focus on things that do not require core > language changes, > such as improving/bugfixing existing libraries, writing new libraries, > improving tooling, improving infrastructure (PyPI), > improving performance, etc., etc.? > > There are still many companies still stuck on Python 2, so giving 10 years > of breathing room > for these companies to catch up to Python 3 core language, even past 2020 > would be very helpful. > > -- > Craig > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > -- Ryan (????) Yoko Shimomura, ryo (supercell/EGOIST), Hiroyuki Sawano >> everyone else https://refi64.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Mon May 7 16:23:32 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 7 May 2018 22:23:32 +0200 Subject: [Python-Dev] Slow down... References: Message-ID: <20180507222332.662ae3da@fsol> On Mon, 07 May 2018 19:19:28 +0000 Ryan Gonzalez wrote: > 10 years feels like a simultaneously long and arbitrary limit. IMO a policy > of "try to avoid major language features for a while" would work better. I would remove "for a while". "Try to avoid major language features" sounds good. Regards Antoine. From barry at python.org Mon May 7 17:23:28 2018 From: barry at python.org (Barry Warsaw) Date: Mon, 7 May 2018 14:23:28 -0700 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On May 7, 2018, at 11:49, Craig Rodrigues wrote: > > Would it be reasonable to request a 10 year moratorium on making changes to the core Python language, > and for the next 10 years only focus on things that do not require core language changes, > such as improving/bugfixing existing libraries, writing new libraries, improving tooling, improving infrastructure (PyPI), > improving performance, etc., etc.? IMHO, no. I don?t believe that the way for Python to remain relevant and useful for the next 10 years is to cease all language evolution. Who knows what the computing landscape will look like in 5 years, let alone 10? Something as arbitrary as a 10 year moratorium is (again, IMHO) a death sentence for the language. But I do think it makes sense to think about how Python-the-language and CPython-the-reference implementation can better balance the desire to evolve vs the need to concentrate on improving what we?ve got. With that in mind, it does make sense to occasionally use a moratorium release to focus on tech debt, cleaning up the stdlib, improve performance, etc. CPython?s 18 month release cycle has served us well for a long time, but I do think it?s time to discuss whether it will still be appropriate moving forward. I?m not saying it is or isn?t, but with the release of 3.7, I think it?s a great time to explore our options. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: Message signed with OpenPGP URL: From carl.shapiro at gmail.com Mon May 7 17:32:09 2018 From: carl.shapiro at gmail.com (Carl Shapiro) Date: Mon, 7 May 2018 14:32:09 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: On Fri, May 4, 2018 at 6:58 PM, Nathaniel Smith wrote: > What are the obstacles to including "preloaded" objects in regular .pyc > files, so that everyone can take advantage of this without rebuilding the > interpreter? > The system we have developed can create a shared object file for each compiled Python file. However, such a representation is not directly usable. First, certain shared constants, such as interned strings, must be kept globally unique across object code files. Second, some marshaled objects, such as the hashed collections, must be initialized with randomization state that is not available until after the hosting runtime has been initialized. We are able to work around the first issue by generating a heap image with the transitive closure of all modules that will be loaded which allows us to easily maintain uniqueness guarantees. We are able to work around the second issue with some unobservable changes to the affected data structures. Based on our numbers, it appears there should be some hesitancy--at this time--to changing the format of compiled Python file for the sake of load-time performance. In contrast, the data shows that a focused change to address file system inefficiencies has the potential to broadly and transparently deliver benefit to users without affecting existing code or workflows. -------------- next part -------------- An HTML attachment was scrubbed... URL: From python at mrabarnett.plus.com Mon May 7 18:58:11 2018 From: python at mrabarnett.plus.com (MRAB) Date: Mon, 7 May 2018 23:58:11 +0100 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: <0b036b5e-52ad-80f6-4da0-276aec1e5a3d@mrabarnett.plus.com> On 2018-05-07 19:49, Craig Rodrigues wrote: > [snip] > > Would it be reasonable to request a 10 year moratorium on making changes > to the core Python language, > and for the next 10 years only focus on things that do not require core > language changes, > such as improving/bugfixing existing libraries, writing new libraries, > improving tooling, improving infrastructure (PyPI), > improving performance, etc., etc.? > > There are still many companies still stuck on Python 2, so giving 10 > years of breathing room > for these companies to catch up to Python 3 core language, even past > 2020 would be very helpful. > [snip] I don't see how a 10 year moratorium will help those still on Python 2, given that Python 2.7 has been around for almost 8 years already, by 2020 it will have been 10 years, and it was made clear that it would be the last in the Python 2 line. From steve.dower at python.org Mon May 7 21:30:02 2018 From: steve.dower at python.org (Steve Dower) Date: Mon, 7 May 2018 18:30:02 -0700 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: A moratorium on new features to focus on cleaning up and planning for transition away from the 2.7 compatibility features that still exist? The most obvious being the libraries that we promised not to remove until 2.7 EOL. Top-posted from my Windows phone From: Barry Warsaw Sent: Monday, May 7, 2018 14:26 To: Python Dev Subject: Re: [Python-Dev] Slow down... 
On May 7, 2018, at 11:49, Craig Rodrigues wrote: > > Would it be reasonable to request a 10 year moratorium on making changes to the core Python language, > and for the next 10 years only focus on things that do not require core language changes, > such as improving/bugfixing existing libraries, writing new libraries, improving tooling, improving infrastructure (PyPI), > improving performance, etc., etc.? IMHO, no. I don?t believe that the way for Python to remain relevant and useful for the next 10 years is to cease all language evolution. Who knows what the computing landscape will look like in 5 years, let alone 10? Something as arbitrary as a 10 year moratorium is (again, IMHO) a death sentence for the language. But I do think it makes sense to think about how Python-the-language and CPython-the-reference implementation can better balance the desire to evolve vs the need to concentrate on improving what we?ve got. With that in mind, it does make sense to occasionally use a moratorium release to focus on tech debt, cleaning up the stdlib, improve performance, etc. CPython?s 18 month release cycle has served us well for a long time, but I do think it?s time to discuss whether it will still be appropriate moving forward. I?m not saying it is or isn?t, but with the release of 3.7, I think it?s a great time to explore our options. Cheers, -Barry -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Mon May 7 22:14:15 2018 From: steve.dower at python.org (Steve Dower) Date: Mon, 7 May 2018 19:14:15 -0700 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: References: Message-ID: ?the data shows that a focused change to address file system inefficiencies has the potential to broadly and transparently deliver benefit to users without affecting existing code or workflows.? This is consistent with a Node.js experiment I heard about where they compiled an entire application in a single (HUGE!) .js file. Reading a single large file from disk is quicker than many small files on every significant file system I?m aware of. Is there benefit to supporting import of .tar files as we currently do .zip? Or perhaps having a special fast-path for uncompressed .zip files? Top-posted from my Windows phone From: Carl Shapiro Sent: Monday, May 7, 2018 14:36 To: Nathaniel Smith Cc: Nick Coghlan; Python Dev Subject: Re: [Python-Dev] A fast startup patch (was: Python startup time) On Fri, May 4, 2018 at 6:58 PM, Nathaniel Smith wrote: What are the obstacles to including "preloaded" objects in regular .pyc files, so that everyone can take advantage of this without rebuilding the interpreter? The system we have developed can create a shared object file for each compiled Python file.? However, such a representation is not directly usable.? First, certain shared constants, such as interned strings, must be kept globally unique across object code files.? Second, some marshaled objects, such as the hashed collections, must be initialized with randomization state that is not available until after the hosting runtime has been initialized. We are able to work around the first issue by generating a heap image with the transitive closure of all modules that will be loaded which allows us to easily maintain uniqueness guarantees.? We are able to work around the second issue with some unobservable changes to the affected data structures. ? 
Based on our numbers, it appears there should be some hesitancy--at this time--to changing the format of compiled Python file for the sake of load-time performance.? In contrast, the data shows that a focused change to address file system inefficiencies has the potential to broadly and transparently deliver benefit to users without affecting existing code or workflows.? -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon May 7 22:47:31 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 07 May 2018 21:47:31 -0500 Subject: [Python-Dev] A fast startup patch (was: Python startup time) In-Reply-To: <40g3443RKHzFr3d@mail.python.org> References: <40g3443RKHzFr3d@mail.python.org> Message-ID: <1633da591b8.2837.db5b03704c129196a4e9415e55413ce6@gmail.com> On May 7, 2018 9:15:32 PM Steve Dower wrote: > ?the data shows that a focused change to address file system inefficiencies > has the potential to broadly and transparently deliver benefit to users > without affecting existing code or workflows.? > > This is consistent with a Node.js experiment I heard about where they > compiled an entire application in a single (HUGE!) .js file. Reading a > single large file from disk is quicker than many small files on every > significant file system I?m aware of. Is there benefit to supporting import > of .tar files as we currently do .zip? Or perhaps having a special > fast-path for uncompressed .zip files? I kind of built something like this, though I haven't really put in the effort to make it overly usable yet: https://github.com/kirbyfan64/bluesnow (Bonus points to anyone who gets the character reference in the name, though I seriously doubt it.) Main thing I noticed was that reading compiled .pyc files is far faster than uncompiled Python code, even if you eliminate the disk access. Kind of obvious in retrospect, but still something to note However, there are more obstacles to this in the Python world than the JS world. C extensions have a heavier prevalence here, distribution is a bit weirder (sorry, even with Pipfiles), and JavaScript already has an entire ecosystem built around packing files together from the web world. > > Top-posted from my Windows phone > > From: Carl Shapiro > Sent: Monday, May 7, 2018 14:36 > To: Nathaniel Smith > Cc: Nick Coghlan; Python Dev > Subject: Re: [Python-Dev] A fast startup patch (was: Python startup time) > > On Fri, May 4, 2018 at 6:58 PM, Nathaniel Smith wrote: > What are the obstacles to including "preloaded" objects in regular .pyc > files, so that everyone can take advantage of this without rebuilding the > interpreter? > > The system we have developed can create a shared object file for each > compiled Python file.? However, such a representation is not directly > usable.? First, certain shared constants, such as interned strings, must be > kept globally unique across object code files.? Second, some marshaled > objects, such as the hashed collections, must be initialized with > randomization state that is not available until after the hosting runtime > has been initialized. > > We are able to work around the first issue by generating a heap image with > the transitive closure of all modules that will be loaded which allows us > to easily maintain uniqueness guarantees.? We are able to work around the > second issue with some unobservable changes to the affected data structures. > ? 
> Based on our numbers, it appears there should be some hesitancy--at this > time--to changing the format of compiled Python file for the sake of > load-time performance.? In contrast, the data shows that a focused change > to address file system inefficiencies has the potential to broadly and > transparently deliver benefit to users without affecting existing code or > workflows.? > > > > > ---------- > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > -- Ryan (????) Yoko Shimomura, ryo (supercell/EGOIST), Hiroyuki Sawano >> everyone else https://refi64.com/ From J.Demeyer at UGent.be Tue May 8 11:22:02 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 8 May 2018 17:22:02 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> Message-ID: <5AF1C09A.9060503@UGent.be> On 2018-05-06 09:35, Nick Coghlan wrote: > Thanks for this update Jeroen! If it doesn't come up otherwise, I'll try > to claim one of the lightning talk slots at the Language Summit to > discuss this with folks in person :) Sounds great! I'd love to hear what people think. As an example of how the new functionality of PEP 575 can be used, I changed functools.lru_cache to implement the _lru_cache_wrapper class as subclass of base_function. I added this to the reference implementation https://github.com/jdemeyer/cpython/tree/pep575 From stephane.blondon at gmail.com Tue May 8 09:20:50 2018 From: stephane.blondon at gmail.com (=?UTF-8?Q?St=c3=a9phane_Blondon?=) Date: Tue, 8 May 2018 15:20:50 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: Message-ID: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> Le 02/05/2018 ? 11:11, Victor Stinner a ?crit?: > * Communicate on python-dev, Twitter, Python Insider blog, etc. > * Collaborate with major Python projects to help them to migrate the alternative I wonder if it would be interesting to have a package available by pypi.org which would provide the removed features. In your example, the developers would have to update their source code: # giving 'obsolete' as name for this package platform.linux_distribution() -> obsolete.platform.linux_distribution() The code in 'obsolete' package could come from the removed code from cpython or a wrapper around a third-party package ('distro' package in the example). Plus: - quick temporary fix for users -> the removal is less painful - the name of the import is a hint that something has to be fixed -> useful for new comers on the user source code Cons: - it pushes the question to how many times the previous behavior should be maintained from python language to 'obsolete' package. So it's not completely solved. - it adds a step to removal procedure. - I guess there are some features not movable into a package. -- St?phane -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: OpenPGP digital signature URL: From rodrigc at crodrigues.org Tue May 8 03:03:30 2018 From: rodrigc at crodrigues.org (Craig Rodrigues) Date: Tue, 08 May 2018 07:03:30 +0000 Subject: [Python-Dev] Slow down... 
In-Reply-To: References: Message-ID: On Mon, May 7, 2018 at 2:24 PM Barry Warsaw wrote: > On May 7, 2018, at 11:49, Craig Rodrigues wrote: > > > > Would it be reasonable to request a 10 year moratorium on making changes > to the core Python language, > > and for the next 10 years only focus on things that do not require core > language changes, > > such as improving/bugfixing existing libraries, writing new libraries, > improving tooling, improving infrastructure (PyPI), > > improving performance, etc., etc.? > > IMHO, no. > > I don?t believe that the way for Python to remain relevant and useful for > the next 10 years is to cease all language evolution. Who knows what the > computing landscape will look like in 5 years, let alone 10? Something as > arbitrary as a 10 year moratorium is (again, IMHO) a death sentence for the > language. > > But I do think it makes sense to think about how Python-the-language and > CPython-the-reference implementation can better balance the desire to > evolve vs the need to concentrate on improving what we?ve got. With that > in mind, it does make sense to occasionally use a moratorium release to > focus on tech debt, cleaning up the stdlib, improve performance, etc. > > CPython?s 18 month release cycle has served us well for a long time, but I > do think it?s time to discuss whether it will still be appropriate moving > forward. I?m not saying it is or isn?t, but with the release of 3.7, I > think it?s a great time to explore our options. > > 10 years is a long time for many types of applications, such as web server and desktop applications where regular and somewhat frequent upgrades can happen. However, I have worked on embedding Python in networking and storage devices. It is true that many of these types of devices can also have their software upgraded, but often the frequency of such upgrades is much slower than for conventional web server and desktop applications. Upgrades of these devices usually spans user-space and kernel/device drivers, so upgrades are usually done more cautiously and less frequently. For Python 2.x, the ship has sailed. However, 10 years from now, if the Python language is pretty much the same as Python 3.7 today, that would be nice. I'm not stuck on the number "10 years", but I am just throwing it out there as a draft proposal. So even 5-8 year moratorium would be nice to strive for. Outside of the embedded space, I will give another example where folks in industry are behind. I don't want to pick on a particular vendor, but from April 24-26, I attended training sessions at RedisConf18 in San Francisco. During the training sessions, multiple sample Python code examples were given for accessing the Redis database. The instructor thought that the code examples worked in Python 3, but in fact, they only worked in Python 2 mostly due to bytes/unicode issues. During the class, I fixed the examples for Python 3 and submitted the patches to the instructor, who gratefully accepted my patches. However, there are going to be many, many users of Redis out there who maybe will not upgrade their Python code for newer versions of Python for many years. Besides Redis users, I am seeing all sorts of communities and companies which are behind in terms of having working code and documentation that works on latest Python 3. It is going to take YEARS for all these communities and companies to catch up (if ever). I understand that Python as a language must evolve over time to succeed and thrive. 
But I would request that at least for the next 5-10 years, a moratorium on language changes be considered, to allow the world to catch up in terms of code, documentation, and mind/understanding. Looking back at how the Python 2.7 EOL was extended by 5 years, I would remind the core Python developers that it is very easy to underestimate how slow the world is to change their code, documentation, training, and understanding to adapt to new Python versions. Going slow or imposing a moratorium wouldn't be such a bad thing. -- Craig -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Tue May 8 11:48:54 2018 From: brett at python.org (Brett Cannon) Date: Tue, 08 May 2018 15:48:54 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: On Tue, 8 May 2018 at 08:26 Craig Rodrigues wrote: > On Mon, May 7, 2018 at 2:24 PM Barry Warsaw wrote: > >> On May 7, 2018, at 11:49, Craig Rodrigues wrote: >> > >> > Would it be reasonable to request a 10 year moratorium on making >> changes to the core Python language, >> > and for the next 10 years only focus on things that do not require core >> language changes, >> > such as improving/bugfixing existing libraries, writing new libraries, >> improving tooling, improving infrastructure (PyPI), >> > improving performance, etc., etc.? >> >> IMHO, no. >> >> I don?t believe that the way for Python to remain relevant and useful for >> the next 10 years is to cease all language evolution. Who knows what the >> computing landscape will look like in 5 years, let alone 10? Something as >> arbitrary as a 10 year moratorium is (again, IMHO) a death sentence for the >> language. >> >> But I do think it makes sense to think about how Python-the-language and >> CPython-the-reference implementation can better balance the desire to >> evolve vs the need to concentrate on improving what we?ve got. With that >> in mind, it does make sense to occasionally use a moratorium release to >> focus on tech debt, cleaning up the stdlib, improve performance, etc. >> >> CPython?s 18 month release cycle has served us well for a long time, but >> I do think it?s time to discuss whether it will still be appropriate moving >> forward. I?m not saying it is or isn?t, but with the release of 3.7, I >> think it?s a great time to explore our options. >> >> > 10 years is a long time for many types of applications, such as web server > and desktop applications > where regular and somewhat frequent upgrades can happen. > However, I have worked on embedding Python in networking and storage > devices. > It is true that many of these types of devices can also have their > software upgraded, > but often the frequency of such upgrades is much slower than for > conventional web server and desktop applications. > Upgrades of these devices usually spans user-space and kernel/device > drivers, so > upgrades are usually done more cautiously and less frequently. > > For Python 2.x, the ship has sailed. However, 10 years from now, if the > Python language > is pretty much the same as Python 3.7 today, that would be nice. > Then feel free to stay on Python 3.7. We have versioned releases so people can choose to do that. :) > > I'm not stuck on the number "10 years", but I am just throwing it out > there as a draft proposal. > So even 5-8 year moratorium would be nice to strive for. > Timespans of that length are still too long to freeze the language. Look at it this way: node.js 0.10.0 was released 5 years ago and now it's a thing. 
If we had not moved forward and added async/await in Python 3.5 -- which was only 3 years ago -- but instead froze ourselves for 5 years would we be considered relevant in the networking world like we are, or viewed as somewhat as a dinosaur? I realize the embedded world moves at a different pace (as well as other groups of developers), but that doesn't mean we have to move at the speed of our slowest adopters to punish those willing and wanting newer features. > > > Outside of the embedded space, I will give another example where folks in > industry are behind. > I don't want to pick on a particular vendor, but from April 24-26, I > attended training sessions at RedisConf18 in San Francisco. > During the training sessions, multiple sample Python code examples were > given for accessing the Redis database. > The instructor thought that the code examples worked in Python 3, but in > fact, they only worked in Python 2 mostly due to > bytes/unicode issues. During the class, I fixed the examples for Python 3 > and submitted the patches to the instructor, > who gratefully accepted my patches. However, there are going to be many, > many users of Redis out there who > maybe will not upgrade their Python code for newer versions of Python for > many years. > Why is Redis specifically going to be behind specifically? Are they embedding the interpreter? > > Besides Redis users, I am seeing all sorts of communities and companies > which are behind in terms of having working > code and documentation that works on latest Python 3. It is going to take > YEARS for all these communities and companies > to catch up (if ever). > And that's fine. If they want to continue to maintain Python 2 and stay on it, or simply stick with our final release and never worry about potential security issues, that's their prerogative. But once again, we shouldn't have to hold up the entire language for the slowest adopters. > > I understand that Python as a language must evolve over time to succeed > and thrive. But I would request that > at least for the next 5-10 years, a moratorium on language changes be > considered, to allow the world to catch > up in terms of code, documentation, and mind/understanding. > 5 years is 3-4 releases, 10 years is 6-7. That's basically saying we should still be like 3.3/3.2 or 2.7, both of which I don't think the majority of people want (I know I am a happier programmer in 3.6 than I am in any of those versions). > > > Looking back at how the Python 2.7 EOL was extended by 5 years, I would > remind the core Python developers > that it is very easy to underestimate how slow the world is to change > their code, documentation, training, > and understanding to adapt to new Python versions. Going slow or > imposing a moratorium wouldn't be such a bad thing. > I think a better way to phrase this is, "should we not change the language because there are still people on Python 3.3? We've already stated many times that there won't be a major language upheaval like 2/3 ever again, so we are only talking about changes on the order of e.g. 3.5/3.6. And for me, I like what we have added. I am certainly not about to ask anyone to give up f-strings and deal with those pitchforks. ;) And if people don't upgrade then people don't upgrade. We have all the old versions of libraries on PyPI, so people can continue to use the libraries that they were depending on when they choose to not move forward with Python releases and continue to function. 
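To make that concrete, here is a minimal sketch (the project name and version numbers are invented) of how a library can spell out which interpreters it supports using setuptools' python_requires metadata; pip has honoured that field since pip 9, so users who stay on an older Python keep resolving the last compatible release from PyPI:

    # setup.py -- illustrative only
    from setuptools import setup, find_packages

    setup(
        name="examplelib",            # hypothetical project name
        version="2.0.0",              # first release that drops older Pythons
        packages=find_packages(),
        # pip skips releases whose requirement the running interpreter does
        # not satisfy, so a Python 3.3 user keeps getting the last 1.x release.
        python_requires=">=3.5",
    )

That way the slowest adopters keep a working dependency set without the rest of the ecosystem having to stand still.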
-------------- next part -------------- An HTML attachment was scrubbed... URL: From j.orponen at 4teamwork.ch Tue May 8 11:51:06 2018 From: j.orponen at 4teamwork.ch (Joni Orponen) Date: Tue, 8 May 2018 17:51:06 +0200 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> References: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> Message-ID: On Tue, May 8, 2018 at 3:20 PM, St?phane Blondon wrote: > Le 02/05/2018 ? 11:11, Victor Stinner a ?crit : > > * Communicate on python-dev, Twitter, Python Insider blog, etc. > > * Collaborate with major Python projects to help them to migrate the > alternative > > I wonder if it would be interesting to have a package available by > pypi.org which would provide the removed features. In your example, the > developers would have to update their source code: > > # giving 'obsolete' as name for this package > platform.linux_distribution() -> obsolete.platform.linux_distribution() If one can import from the future can one can also import from the past? -- Joni Orponen -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Tue May 8 12:10:13 2018 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 08 May 2018 09:10:13 -0700 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: <5AF1CBE5.3060806@stoneleaf.us> On 05/04/2018 11:48 AM, Serhiy Storchaka wrote: > 04.05.18 20:57, Matthias Bussonnier ????: >> But when I hit a DeprecationWarning message there is one crucial piece of >> information missing most of the time: Since which version number it's deprecated >> (and sometime since when the replacement is available could be good if overlap >> between functionality there was). > > I think the information about since which version number it will be removed is more useful. Different cases need > different deprecation periods. The more common the case, the longer deprecation period should be. Some recently added > warnings contain this information. If we are going to provide extra metadata, we may as well supply it all. Some folks will need the start date, some the end date, and probably some both dates. > X.Y+1: added a deprecation warning. Many users need to support only two recent versions and can move to using the > replacement now. I'm curious how you arrived at this conclusion? I know I've only worked at two different Python-using companies, but neither aggressively tracks the latest Python minor version, and as a library author I support more than the two most recent versions. -- ~Ethan~ From storchaka at gmail.com Tue May 8 12:13:15 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 8 May 2018 19:13:15 +0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> Message-ID: 08.05.18 18:51, Joni Orponen ????: > If one can import from the future can one can also import from the past? One can move removed feature to a third-party module and import them from it. 
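A minimal sketch of what such a shim could look like for the linux_distribution() example (the 'obsolete' package name is hypothetical, and this assumes the third-party 'distro' package, which provides a compatible linux_distribution() function):

    # obsolete/platform.py -- illustrative shim, not a real PyPI package
    import platform as _platform

    try:
        import distro as _distro      # third-party replacement: pip install distro
    except ImportError:
        _distro = None

    def linux_distribution(full_distribution_name=True):
        # Use the stdlib function for as long as it exists...
        if hasattr(_platform, "linux_distribution"):
            return _platform.linux_distribution(
                full_distribution_name=full_distribution_name)
        # ...and delegate to the 'distro' package once it has been removed.
        if _distro is None:
            raise ImportError(
                "platform.linux_distribution() is gone; install the "
                "'distro' package to keep using this shim")
        return _distro.linux_distribution(
            full_distribution_name=full_distribution_name)

User code would then write obsolete.platform.linux_distribution(), as in the earlier example, and keep working on both sides of the removal.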
From steve at pearwood.info Tue May 8 12:26:30 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 9 May 2018 02:26:30 +1000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> References: <0aebfeb3-ce43-8686-7bae-13429f78b7fd@gmail.com> Message-ID: <20180508162629.GF9562@ando.pearwood.info> On Tue, May 08, 2018 at 03:20:50PM +0200, St?phane Blondon wrote: > I wonder if it would be interesting to have a package available by > pypi.org which would provide the removed features. [...] > Cons: > - it pushes the question to how many times the previous behavior should > be maintained from python language to 'obsolete' package. So it's not > completely solved. Its not solved at all. Who is responsible for moving code to this third-party package? Who is responsible for maintaining that third-party package? If it is the core developers, then it isn't really removed at all. If nobody is maintaining it, then its going to suffer bitrot and become unusable. Python is open source, so anyone can fork a module or library from the std lib and move it to PyPI, they don't need the core dev's blessing, they just need to follow the licence requirements. -- Steve From storchaka at gmail.com Tue May 8 12:40:03 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 8 May 2018 19:40:03 +0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> Message-ID: 04.05.18 22:08, Matthias Bussonnier ????: > Maybe to push people forward, but from experience it is hard to predict > future, so saying when > it _will_ be remove is hard. Right. But the data of removing is usually specified when the code for removing already is written, or even merged in the next branch. This is true for most warnings introduced in 3.7 that warns about the removal in 3.8. This is true for Py3k warnings backported to 2.7. In short-term deprecations you have the only one release in which the warning is emitted. From storchaka at gmail.com Tue May 8 12:53:16 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 8 May 2018 19:53:16 +0300 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: <5AF1CBE5.3060806@stoneleaf.us> References: <20180504131403.GJ9562@ando.pearwood.info> <5AF1CBE5.3060806@stoneleaf.us> Message-ID: 08.05.18 19:10, Ethan Furman ????: >> X.Y+1: added a deprecation warning. Many users need to support only >> two recent versions and can move to using the >> replacement now. > > I'm curious how you arrived at this conclusion?? I know I've only worked > at two different Python-using companies, but neither aggressively tracks > the latest Python minor version, and as a library author I support more > than the two most recent versions. Maybe I was too optimistic. ;-) Libraries need to support more Python versions of course. But two versions is a minimum, and I thing that for many applications (if they are targeted to the specific OS version or shipped with an own Python) this is enough. Even if their number is not large, they will get a benefit from introducing a replacement before adding a warning. If you support versions X.Y-1 and X.Y, you just use the old feature. If you support versions X.Y and X.Y+1, you replace it with the new feature. If you support versions X.Y-1, X.Y and X.Y+1 you need either to ignore varnings, or to add a runtime check for switching between using the old and the new feature. This complicates the code. 
But at least the code is clear in the first two cases. From rosuav at gmail.com Tue May 8 13:59:18 2018 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 9 May 2018 03:59:18 +1000 Subject: [Python-Dev] Process to remove a Python feature In-Reply-To: References: <20180504131403.GJ9562@ando.pearwood.info> <5AF1CBE5.3060806@stoneleaf.us> Message-ID: On Wed, May 9, 2018 at 2:53 AM, Serhiy Storchaka wrote: > 08.05.18 19:10, Ethan Furman ????: >>> >>> X.Y+1: added a deprecation warning. Many users need to support only two >>> recent versions and can move to using the >>> replacement now. >> >> >> I'm curious how you arrived at this conclusion? I know I've only worked >> at two different Python-using companies, but neither aggressively tracks the >> latest Python minor version, and as a library author I support more than the >> two most recent versions. > > > Maybe I was too optimistic. ;-) Libraries need to support more Python > versions of course. But two versions is a minimum, and I thing that for many > applications (if they are targeted to the specific OS version or shipped > with an own Python) this is enough. Even if their number is not large, they > will get a benefit from introducing a replacement before adding a warning. > > If you support versions X.Y-1 and X.Y, you just use the old feature. If you > support versions X.Y and X.Y+1, you replace it with the new feature. If you > support versions X.Y-1, X.Y and X.Y+1 you need either to ignore varnings, or > to add a runtime check for switching between using the old and the new > feature. This complicates the code. But at least the code is clear in the > first two cases. Here in this house, we have: * 3.8, with or without various patches as are being proposed upstream * 3.7, a slightly old alpha build, as a secondary on the laptop * 3.6 on the latest Ubuntu * 3.5 on the Raspberry Pi * 3.4 as shipped by Debian, as the laptop's primary Python 3 * And I'm not even counting the various different 2.7s. My brother and I built a TCP-managed doorbell involving the rPi. At an absolute minimum, it has to support 3.4, 3.5, and 3.6; and supporting 3.7 is important, given how close it is to release. Supporting 3.8 is no harder than supporting 3.7, but anything that did actual version number checks would need to be aware of it. So that's potentially five different versions. (Fortunately, I have all five installed on one computer, so testing isn't hard.) Supporting just two versions seems a bit too hopeful. Supporting three would be a minimum for an in-house app; and if your users run different Linux distros with different release cadences, four wouldn't be unlikely, even among current releases. As a general rule, I prefer to avoid writing "before" and "after" code. For a lot of deprecations, that's easy - when "await" started becoming a keyword, I renamed a function to "wait" [1], and it was 100% compatible with all versions. If there absolutely has to be code doing two different things, I'd rather catch ImportError than do actual version checks. But if it has to check versions, it'll need to be aware of quite a few. ChrisA [1] https://github.com/Rosuav/LetMeKnow/commit/2ecbbdcc3588139932525140ceb8c2cb66930284 From eric at trueblade.com Tue May 8 14:35:41 2018 From: eric at trueblade.com (Eric V. Smith) Date: Tue, 8 May 2018 14:35:41 -0400 Subject: [Python-Dev] Slow down... 
In-Reply-To: References: Message-ID: <6B512059-16B3-4898-A26B-775ED4627B7C@trueblade.com> > On May 8, 2018, at 11:48 AM, Brett Cannon wrote: > > > >> On Tue, 8 May 2018 at 08:26 Craig Rodrigues wrote: >>> On Mon, May 7, 2018 at 2:24 PM Barry Warsaw wrote: >>> On May 7, 2018, at 11:49, Craig Rodrigues wrote: >>> > >>> > Would it be reasonable to request a 10 year moratorium on making changes to the core Python language, >>> > and for the next 10 years only focus on things that do not require core language changes, >>> > such as improving/bugfixing existing libraries, writing new libraries, improving tooling, improving infrastructure (PyPI), >>> > improving performance, etc., etc.? >>> >>> IMHO, no. >>> >>> I don?t believe that the way for Python to remain relevant and useful for the next 10 years is to cease all language evolution. Who knows what the computing landscape will look like in 5 years, let alone 10? Something as arbitrary as a 10 year moratorium is (again, IMHO) a death sentence for the language. >>> >>> But I do think it makes sense to think about how Python-the-language and CPython-the-reference implementation can better balance the desire to evolve vs the need to concentrate on improving what we?ve got. With that in mind, it does make sense to occasionally use a moratorium release to focus on tech debt, cleaning up the stdlib, improve performance, etc. >>> >>> CPython?s 18 month release cycle has served us well for a long time, but I do think it?s time to discuss whether it will still be appropriate moving forward. I?m not saying it is or isn?t, but with the release of 3.7, I think it?s a great time to explore our options. >>> >> >> 10 years is a long time for many types of applications, such as web server and desktop applications >> where regular and somewhat frequent upgrades can happen. >> However, I have worked on embedding Python in networking and storage devices. >> It is true that many of these types of devices can also have their software upgraded, >> but often the frequency of such upgrades is much slower than for conventional web server and desktop applications. >> Upgrades of these devices usually spans user-space and kernel/device drivers, so >> upgrades are usually done more cautiously and less frequently. >> >> For Python 2.x, the ship has sailed. However, 10 years from now, if the Python language >> is pretty much the same as Python 3.7 today, that would be nice. > > Then feel free to stay on Python 3.7. We have versioned releases so people can choose to do that. :) Also, you can pay people to support old versions, and sometimes even backport features you need. So I think this use case is already covered. You just can?t expect to hold up language development because you want to have a stable supported version. Eric > >> >> I'm not stuck on the number "10 years", but I am just throwing it out there as a draft proposal. >> So even 5-8 year moratorium would be nice to strive for. > > Timespans of that length are still too long to freeze the language. Look at it this way: node.js 0.10.0 was released 5 years ago and now it's a thing. If we had not moved forward and added async/await in Python 3.5 -- which was only 3 years ago -- but instead froze ourselves for 5 years would we be considered relevant in the networking world like we are, or viewed as somewhat as a dinosaur? 
> > I realize the embedded world moves at a different pace (as well as other groups of developers), but that doesn't mean we have to move at the speed of our slowest adopters to punish those willing and wanting newer features. > >> >> >> Outside of the embedded space, I will give another example where folks in industry are behind. >> I don't want to pick on a particular vendor, but from April 24-26, I attended training sessions at RedisConf18 in San Francisco. >> During the training sessions, multiple sample Python code examples were given for accessing the Redis database. >> The instructor thought that the code examples worked in Python 3, but in fact, they only worked in Python 2 mostly due to >> bytes/unicode issues. During the class, I fixed the examples for Python 3 and submitted the patches to the instructor, >> who gratefully accepted my patches. However, there are going to be many, many users of Redis out there who >> maybe will not upgrade their Python code for newer versions of Python for many years. > > Why is Redis specifically going to be behind specifically? Are they embedding the interpreter? > >> >> Besides Redis users, I am seeing all sorts of communities and companies which are behind in terms of having working >> code and documentation that works on latest Python 3. It is going to take YEARS for all these communities and companies >> to catch up (if ever). > > And that's fine. If they want to continue to maintain Python 2 and stay on it, or simply stick with our final release and never worry about potential security issues, that's their prerogative. But once again, we shouldn't have to hold up the entire language for the slowest adopters. > >> >> I understand that Python as a language must evolve over time to succeed and thrive. But I would request that >> at least for the next 5-10 years, a moratorium on language changes be considered, to allow the world to catch >> up in terms of code, documentation, and mind/understanding. > > 5 years is 3-4 releases, 10 years is 6-7. That's basically saying we should still be like 3.3/3.2 or 2.7, both of which I don't think the majority of people want (I know I am a happier programmer in 3.6 than I am in any of those versions). > >> >> >> Looking back at how the Python 2.7 EOL was extended by 5 years, I would remind the core Python developers >> that it is very easy to underestimate how slow the world is to change their code, documentation, training, >> and understanding to adapt to new Python versions. Going slow or imposing a moratorium wouldn't be such a bad thing. > > I think a better way to phrase this is, "should we not change the language because there are still people on Python 3.3? We've already stated many times that there won't be a major language upheaval like 2/3 ever again, so we are only talking about changes on the order of e.g. 3.5/3.6. And for me, I like what we have added. I am certainly not about to ask anyone to give up f-strings and deal with those pitchforks. ;) > > And if people don't upgrade then people don't upgrade. We have all the old versions of libraries on PyPI, so people can continue to use the libraries that they were depending on when they choose to not move forward with Python releases and continue to function. 
> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/eric%2Ba-python-dev%40trueblade.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From mertz at gnosis.cx Tue May 8 14:45:01 2018 From: mertz at gnosis.cx (David Mertz) Date: Tue, 08 May 2018 18:45:01 +0000 Subject: [Python-Dev] Slow down... In-Reply-To: References: Message-ID: This seems like a rather bad idea. None of the core changes in the last few versions were on the radar 10 years in advance. And likewise, no one really knows what new issues will become critical over the next 10. The asyncio module and the async/await keywords only developed as important concerns for a year or two before they became part of the language. Likewise for type annotations. Neither is used by most Python programmers, but for a subset, they are very important. I supposed f-strings are incidental. We could have lived without them (I myself doughty opposed "another way to do it"). But they do make code nicer at the cost of incompatible syntax. Likewise underscores in numbers like 17_527_103. Not everyone needs the __mmul__() operator. But for linear algebra, 'a @ b.T' is better than 'np.dot(a, b.T)'. On Mon, May 7, 2018, 3:10 PM Craig Rodrigues wrote: > > > On Sun, May 6, 2018 at 7:35 PM Nick Coghlan wrote: > >> >> I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 >> language moratorium could be a very good idea. Between matrix >> multiplication, enhanced tuple unpacking, native coroutines, f-strings, and >> type hinting for variable assignments, we've had quite a bit of syntactic >> churn in the past few releases, and the rest of the ecosystem really hasn't >> caught up on it all yet (and that's not just other implementations - it's >> training material, online courses, etc, etc). >> >> If we're going to take such a step, now's also the time to do it, since >> 3.8 feature development is only just getting under way, and if we did >> decide to repeat the language moratorium, we could co-announce it with the >> Python 3.7 release. >> >> > Would it be reasonable to request a 10 year moratorium on making changes > to the core Python language, > and for the next 10 years only focus on things that do not require core > language changes, > such as improving/bugfixing existing libraries, writing new libraries, > improving tooling, improving infrastructure (PyPI), > improving performance, etc., etc.? > > There are still many companies still stuck on Python 2, so giving 10 years > of breathing room > for these companies to catch up to Python 3 core language, even past 2020 > would be very helpful. > > -- > Craig > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Tue May 8 23:54:50 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 9 May 2018 13:54:50 +1000 Subject: [Python-Dev] Slow down... 
In-Reply-To: <15824bb9-d765-d645-9516-201ef2c52324@g.nevcal.com> References: <15824bb9-d765-d645-9516-201ef2c52324@g.nevcal.com> Message-ID: <20180509035450.GH9562@ando.pearwood.info> On Mon, May 07, 2018 at 10:21:13AM -0700, Glenn Linderman wrote: > Dan's original comment about language versus library is interesting, > though. It is probably true that one should resist adding language where > library suffices, but sometimes a lack of certain expressiveness in the > language causes cumbersome implementations of library routines to > achieve the goal. +1 -- Steve From steve at pearwood.info Wed May 9 00:25:06 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 9 May 2018 14:25:06 +1000 Subject: [Python-Dev] Slow down... In-Reply-To: <20180507222332.662ae3da@fsol> References: <20180507222332.662ae3da@fsol> Message-ID: <20180509042506.GI9562@ando.pearwood.info> On Mon, May 07, 2018 at 10:23:32PM +0200, Antoine Pitrou wrote: > On Mon, 07 May 2018 19:19:28 +0000 > Ryan Gonzalez wrote: > > 10 years feels like a simultaneously long and arbitrary limit. IMO a policy > > of "try to avoid major language features for a while" would work better. > > I would remove "for a while". "Try to avoid major language features" > sounds good. It sounds good, until you ask about "What if we had that policy from the beginning?" Let's see what sort of language features we would miss out on if we avoided language features because there was an existing alternative: - async/await ("use the asyncio library") - comprehensions ("just write a loop") - True/False builtins ("just define your own constants at the top of the module") - f-strings ("just use str.format") - yield from ("most of the time you can just use ``for x in thing: yield x``, the rest of the cases are too obscure and unimportant to justify new syntax") - with statements and context managers ("just use a try... finally") - ternary if ("just re-write it as a if...else block, or use the ``x or y and z`` trick") - closures ("just write a class") - yield as an expression for sending values into a generator (a pre-async kind of coroutine) ("just write a class") - function annotations ("just use one of three or four competing standards for docstring annotations") The only language feature I can think of that had no way of doing it before being added to the library was Unicode support. So we'd effectively have Python 1.5 plus Unicode. If we could look forward to 2028, when we're running Python 3.14 or so (4.7 if you prefer), how many fantastic language features that we cannot bear to give up would we be missing out on? -- Steve From storchaka at gmail.com Wed May 9 00:50:10 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 9 May 2018 07:50:10 +0300 Subject: [Python-Dev] Slow down... In-Reply-To: <20180509042506.GI9562@ando.pearwood.info> References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info> Message-ID: 09.05.18 07:25, Steven D'Aprano ????: > If we could look forward to 2028, when we're running Python 3.14 or so > (4.7 if you prefer), how many fantastic language features that we cannot > bear to give up would we be missing out on? Like tab-delimited tables. From tim.peters at gmail.com Wed May 9 00:50:02 2018 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 8 May 2018 23:50:02 -0500 Subject: [Python-Dev] Slow down... In-Reply-To: <20180509042506.GI9562@ando.pearwood.info> References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info> Message-ID: [Steven D'Aprano ] > ... 
> If we could look forward to 2028, when we're running Python 3.14 or so
> (4.7 if you prefer), how many fantastic language features that we cannot
> bear to give up would we be missing out on?

This, for just one:

    k = 6 if >!{myobj.meth(arg)[2]} elsenone 7 elsenan 8 else 5

Which is really annoying to write today, but would be much clearer with binding expressions:

    if myobj is None:
        k = 7
    elif (t := myobj.meth) is None:
        k = 7
    elif (t := t(arg)) is None:
        k = 7
    elif (t := t[2]) is None:
        k = 7
    elif math.isnan(t):
        k = 8
    elif t:
        k = 6
    else:
        k = 5

The future is blindingly bright :-)

From ben+python at benfinney.id.au  Wed May  9 01:05:16 2018
From: ben+python at benfinney.id.au (Ben Finney)
Date: Wed, 09 May 2018 15:05:16 +1000
Subject: [Python-Dev] Slow down...
References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info>
Message-ID: <85po25o1v7.fsf@benfinney.id.au>

Serhiy Storchaka writes:

> 09.05.18 07:25, Steven D'Aprano writes:
> > If we could look forward to 2028, when we're running Python 3.14 or so
> > (4.7 if you prefer), how many fantastic language features that we cannot
> > bear to give up would we be missing out on?
>
> Like tab-delimited tables.

Cheeky :-)

(For those who are not closely following comp.lang.python, I am pretty sure Serhiy is wryly referring to an ongoing thread in that forum. Someone is arguing that Python should grow a U+0009-delimited syntax for tabular data, and quickly demonstrated a lack of interest in the familiar reasons why this is a bad idea. The thread is already past the point of calm discussion, so I am not linking to it because it's not worth wading into.)

-- 
 "I wish I had a dollar for every time I spent a dollar, because
  then, yahoo!, I'd have all my money back." -Jack Handey

Ben Finney

From greg.ewing at canterbury.ac.nz  Wed May  9 01:28:06 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 09 May 2018 17:28:06 +1200
Subject: [Python-Dev] Slow down...
In-Reply-To: References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info>
Message-ID: <5AF286E6.7040800@canterbury.ac.nz>

Serhiy Storchaka wrote:
> Like tab-delimited tables.

No, the obvious way to do tables is to allow HTML-marked-up source code. There would be other benefits to that as well. For example, we could decree that all keywords must be formatted in bold, and then there would be no trouble adding new keywords.

-- 
Greg

From solipsis at pitrou.net  Wed May  9 07:35:53 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 9 May 2018 13:35:53 +0200
Subject: [Python-Dev] Slow down...
References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info>
Message-ID: <20180509133553.6408e47f@fsol>

On Wed, 9 May 2018 14:25:06 +1000 Steven D'Aprano wrote:
> On Mon, May 07, 2018 at 10:23:32PM +0200, Antoine Pitrou wrote:
> > On Mon, 07 May 2018 19:19:28 +0000 Ryan Gonzalez wrote:
> > > 10 years feels like a simultaneously long and arbitrary limit. IMO a policy
> > > of "try to avoid major language features for a while" would work better.
> >
> > I would remove "for a while". "Try to avoid major language features"
> > sounds good.
>
> It sounds good, until you ask about "What if we had that policy from
> the beginning?"
>
> Let's see what sort of language features we would miss out on if we
> avoided language features because there was an existing alternative:
[...]

"Try to avoid" would make it more of a general guideline than a hard rule, IMHO.
I proposed the idea in another thread that Python had reached "peak syntax" and maybe it was time to consider the core language essentially mature. Of course, we don't know what the future will bring and perhaps some fundamentally new programming idiom will appear in the next years that deserve implementing in Python. Regards Antoine. From steve at pearwood.info Wed May 9 07:50:18 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 9 May 2018 21:50:18 +1000 Subject: [Python-Dev] Slow down... In-Reply-To: <20180509133553.6408e47f@fsol> References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info> <20180509133553.6408e47f@fsol> Message-ID: <20180509115018.GJ9562@ando.pearwood.info> On Wed, May 09, 2018 at 01:35:53PM +0200, Antoine Pitrou wrote: > > > I would remove "for a while". "Try to avoid major language features" > > > sounds good. > > > > It sounds good, until you ask about "What if we had that policy from > > the beginning?" [...] > "Try to avoid" would make it more of a general guideline than a hard > rule, IMHO. Fair enough. But it really depends on how hard we try: too hard, or not hard enough? *wink* -- Steve From storchaka at gmail.com Wed May 9 08:11:46 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 9 May 2018 15:11:46 +0300 Subject: [Python-Dev] Slow down... In-Reply-To: <20180509133553.6408e47f@fsol> References: <20180507222332.662ae3da@fsol> <20180509042506.GI9562@ando.pearwood.info> <20180509133553.6408e47f@fsol> Message-ID: 09.05.18 14:35, Antoine Pitrou ????: > I proposed the idea in another thread that Python had reached "peak > syntax" and maybe it was time to consider the core language essentially > mature. Of course, we don't know what the future will bring and > perhaps some fundamentally new programming idiom will appear in the > next years that deserve implementing in Python. "If you want to add a new syntax feature, first remove two old syntax features." From jdhardy at gmail.com Wed May 9 12:30:00 2018 From: jdhardy at gmail.com (Jeff Hardy) Date: Wed, 9 May 2018 09:30:00 -0700 Subject: [Python-Dev] Slow down... In-Reply-To: <15824bb9-d765-d645-9516-201ef2c52324@g.nevcal.com> References: <15824bb9-d765-d645-9516-201ef2c52324@g.nevcal.com> Message-ID: On Mon, May 7, 2018 at 10:21 AM, Glenn Linderman wrote: > On 5/7/2018 7:59 AM, Eric Snow wrote: > > On Sun, May 6, 2018 at 8:25 PM, Nick Coghlan wrote: > > I'm inclined to agree that a Python 3.8 PEP in the spirit of the PEP 3003 > language moratorium could be a very good idea. > > Note that the PEP specifically applies to "syntax, semantics, and > built-ins". Here's the full abstract [1]: > > This PEP proposes a temporary moratorium (suspension) of all changes to > the > Python language syntax, semantics, and built-ins for a period of > at least two > years from the release of Python 3.1. In particular, the moratorium > would > include Python 3.2 (to be released 18-24 months after 3.1) but allow > Python > 3.3 (assuming it is not released prematurely) to once again include > language > changes. > > This suspension of features is designed to allow non-CPython > implementations > to "catch up" to the core implementation of the language, help ease > adoption > of Python 3.x, and provide a more stable base for the community. 
> > -eric > > Here's my "lightning" response to a "lightning talk" about a moratorium: > > So if other implementations didn't catch up during the last moratorium, > either the moratorium then was lifted too soon, or the other implementations > don't really want to catch up, or the thought that they should catch up was > deemed less important than making forward progress with the language. Have > any of those opinions changed? Speaking as the maintainer of IronPython during the last moratorium: while catching up was certainly desirable, there simply wasn't enough person-power to do it any reasonable amount of time (I'm not sure any implementation besides PyPy even has yield-from, let alone async). Between fixing issues in 2.x branches, trying to implement 3.x features, and dealing with underlying platform churn I don't think even two years was ever realistic. Plus, every feature has to be considered with how it works in Python and the other platform (like, what sort of fun .NET interop can we do with type annotations?). Another moratorium would probably have the same (lack of) effect. Better, IMO, to just raise the bar on expensive features and let them catch up naturally. - Jeff From marybutlerub395 at yahoo.com Wed May 9 12:58:56 2018 From: marybutlerub395 at yahoo.com (Mary Butler) Date: Wed, 9 May 2018 16:58:56 +0000 (UTC) Subject: [Python-Dev] [Python-checkins] bpo-33038: Fix gzip.GzipFile for file objects with a non-string name attribute. (GH-6095) In-Reply-To: <40gsfc52NqzFrCp@mail.python.org> References: <40gsfc52NqzFrCp@mail.python.org> Message-ID: <1915014398.548763.1525885136284@mail.yahoo.com> From: Serhiy Storchaka To: python-checkins at python.org Sent: Wednesday, 9 May 2018, 10:14 Subject: [Python-checkins] bpo-33038: Fix gzip.GzipFile for file objects with a non-string name attribute. (GH-6095) https://github.com/python/cpython/commit/afe5f633e49e0e873d42088ae56819609c803ba0 commit: afe5f633e49e0e873d42088ae56819609c803ba0 branch: 2.7 author: Bo Bayles committer: Serhiy Storchaka date: 2018-05-09T13:14:40+03:00 summary: bpo-33038: Fix gzip.GzipFile for file objects with a non-string name attribute. (GH-6095) files: A Misc/NEWS.d/next/Library/2018-03-10-20-14-36.bpo-33038.yA6CP5.rst M Lib/gzip.py M Lib/test/test_gzip.py M Misc/ACKS diff --git a/Lib/gzip.py b/Lib/gzip.py index 07c6db493b0b..76ace394f482 100644 --- a/Lib/gzip.py +++ b/Lib/gzip.py @@ -95,9 +95,8 @@ def __init__(self, filename=None, mode=None, ? ? ? ? if filename is None: ? ? ? ? ? ? # Issue #13781: os.fdopen() creates a fileobj with a bogus name ? ? ? ? ? ? # attribute. Avoid saving this in the gzip header's filename field. -? ? ? ? ? ? if hasattr(fileobj, 'name') and fileobj.name != '': -? ? ? ? ? ? ? ? filename = fileobj.name -? ? ? ? ? ? else: +? ? ? ? ? ? filename = getattr(fileobj, 'name', '') +? ? ? ? ? ? if not isinstance(filename, basestring) or filename == '': ? ? ? ? ? ? ? ? filename = '' ? ? ? ? if mode is None: ? ? ? ? ? ? if hasattr(fileobj, 'mode'): mode = fileobj.mode diff --git a/Lib/test/test_gzip.py b/Lib/test/test_gzip.py index 902d93fe043f..cdb1af5c3d13 100644 --- a/Lib/test/test_gzip.py +++ b/Lib/test/test_gzip.py @@ -6,6 +6,7 @@ import os import io import struct +import tempfile gzip = test_support.import_module('gzip') data1 = """? int length=DEFAULTALLOC, err = Z_OK; @@ -331,6 +332,12 @@ def test_fileobj_from_fdopen(self): ? ? ? ? ? ? with gzip.GzipFile(fileobj=f, mode="w") as g: ? ? ? ? ? ? ? ? self.assertEqual(g.name, "") +? ? def test_fileobj_from_io_open(self): +? ? ? ? 
fd = os.open(self.filename, os.O_WRONLY | os.O_CREAT) +? ? ? ? with io.open(fd, "wb") as f: +? ? ? ? ? ? with gzip.GzipFile(fileobj=f, mode="w") as g: +? ? ? ? ? ? ? ? self.assertEqual(g.name, "") + ? ? def test_fileobj_mode(self): ? ? ? ? gzip.GzipFile(self.filename, "wb").close() ? ? ? ? with open(self.filename, "r+b") as f: @@ -359,6 +366,14 @@ def test_read_with_extra(self): ? ? ? ? with gzip.GzipFile(fileobj=io.BytesIO(gzdata)) as f: ? ? ? ? ? ? self.assertEqual(f.read(), b'Test') +? ? def test_fileobj_without_name(self): +? ? ? ? # Issue #33038: GzipFile should not assume that file objects that have +? ? ? ? # a .name attribute use a non-None value. +? ? ? ? with tempfile.SpooledTemporaryFile() as f: +? ? ? ? ? ? with gzip.GzipFile(fileobj=f, mode='wb') as archive: +? ? ? ? ? ? ? ? archive.write(b'data') +? ? ? ? ? ? ? ? self.assertEqual(archive.name, '') + def test_main(verbose=None): ? ? test_support.run_unittest(TestGzip) diff --git a/Misc/ACKS b/Misc/ACKS index 580b0c5bf76d..458f31e6a6b7 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -94,6 +94,7 @@ Michael R Bax Anthony Baxter Mike Bayer Samuel L. Bayer +Bo Bayles Donald Beaudry David Beazley Carlo Beccarini diff --git a/Misc/NEWS.d/next/Library/2018-03-10-20-14-36.bpo-33038.yA6CP5.rst b/Misc/NEWS.d/next/Library/2018-03-10-20-14-36.bpo-33038.yA6CP5.rst new file mode 100644 index 000000000000..22d394b85ab7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-03-10-20-14-36.bpo-33038.yA6CP5.rst @@ -0,0 +1,2 @@ +gzip.GzipFile no longer produces an AttributeError exception when used with +a file object with a non-string name attribute. Patch by Bo Bayles. _______________________________________________ Python-checkins mailing list Python-checkins at python.org https://mail.python.org/mailman/listinfo/python-checkins -------------- next part -------------- An HTML attachment was scrubbed... URL: From paddy3118 at gmail.com Thu May 10 03:11:37 2018 From: paddy3118 at gmail.com (Paddy McCarthy) Date: Thu, 10 May 2018 07:11:37 +0000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update: limit? In-Reply-To: <5AF1C09A.9060503@UGent.be> References: <5AED7166.1010008@UGent.be> <5AF1C09A.9060503@UGent.be> Message-ID: On Tue, 8 May 2018, 16:33 Jeroen Demeyer, wrote: > On 2018-05-06 09:35, Nick Coghlan wrote: > > Thanks for this update Jeroen! If it doesn't come up otherwise, I'll try > > to claim one of the lightning talk slots at the Language Summit to > > discuss this with folks in person :) > > Sounds great! I'd love to hear what people think. > > As an example of how the new functionality of PEP 575 can be used, I > changed functools.lru_cache to implement the _lru_cache_wrapper class as > subclass of base_function. I added this to the reference implementation > https://github.com/jdemeyer/cpython/tree/pep575 > _______________________________________________ > It is not so much the use, but the abuse that I am worried about. If their was a way to limit the complexity of expression allowed around the use of the name assigned to by :=, then that would be a start. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Fri May 11 07:33:11 2018 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 11 May 2018 21:33:11 +1000 Subject: [Python-Dev] Associated images in PEPs broken? Message-ID: https://www.python.org/dev/peps/pep-0495/ All the images seem to be missing - showing up 404. They're in the peps repository, but aren't showing up in the page. 
Who's in charge of the HTML rendering there? Infrastructure? ChrisA From rosuav at gmail.com Fri May 11 07:34:14 2018 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 11 May 2018 21:34:14 +1000 Subject: [Python-Dev] Associated images in PEPs broken? In-Reply-To: References: Message-ID: On Fri, May 11, 2018 at 9:33 PM, Chris Angelico wrote: > https://www.python.org/dev/peps/pep-0495/ > > All the images seem to be missing - showing up 404. They're in the > peps repository, but aren't showing up in the page. Who's in charge of > the HTML rendering there? Infrastructure? > > ChrisA Never mind! They just came good. I probably saw the page right as a build was taking place or something. ChrisA From chris.barker at noaa.gov Fri May 11 10:38:05 2018 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Fri, 11 May 2018 07:38:05 -0700 Subject: [Python-Dev] Python startup time In-Reply-To: <20180507162846.oxkkag27zgoah4wb@python.ca> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> Message-ID: Inspired by chg: Could one make a little startup utility that, when invoked the first time, starts up a raw python interpreter, keeps it running somewhere, and then forks it to run the actual python code. Then every invocation after that would make a new fork. I presume forking is a LOT faster than re-invoking the entire startup. I suspect that many of the cases where startup time really matters is when a command line utility is likely to be invoked many times ? often in the same shell instance. So having a ?pre-built? warm interpreter ready to go could really help. This is way past my technical expertise to know if it?s possible, or to try to prototype it, but I?m sure many of you would know. -CHB Sent from my iPhone > On May 7, 2018, at 12:28 PM, Neil Schemenauer wrote: > > On 2018-05-03, Lukasz Langa wrote: >>> On May 2, 2018, at 8:57 PM, INADA Naoki wrote: >>> * Add lazy compiling API or flag in `re` module. The pattern is compiled >>> when first used. >> >> How about go the other way and allow compiling at Python >> *compile*-time? That would actually make things faster instead of >> just moving the time spent around. > > Lisp has a special form 'eval-when'. It can be used to cause > evaluation of the body expression at compile time. > > In Carl's "A fast startup patch" post, he talks about getting rid of > the unmarshal step and storing objects in the heap segment of the > executable. Those would be the objects necessary to evaluate code. > The marshal module has a limited number of types that it handle. > I believe they are: bool, bytes, code objects, complex, Ellipsis > float, frozenset, int, None, tuple and str. > > If the same mechanism could handle more types, rather than storing > the code to be evaluated, we could store the objects created after > evaluation of the top-level module body. Or, have a mechanism to > mark which code should be evaluated at compile time (much like the > eval-when form). > > For the re.compile example, the compiled regex could be what is > stored after compiling the Python module (i.e. the re.compile gets > run at compile time). The objects created by re.compile (e.g. > SRE_Pattern) would have to be something that the heap dumper could > handle. > > Traditionally, Python has had the model "there is only runtime". > So, starting to do things at compile time complicates that model. 
> > Regards, > > Neil > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov From rymg19 at gmail.com Fri May 11 11:05:04 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Fri, 11 May 2018 10:05:04 -0500 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> Message-ID: <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> https://refi64.com/uprocd/ On May 11, 2018 9:39:28 AM Chris Barker - NOAA Federal via Python-Dev wrote: > Inspired by chg: > > Could one make a little startup utility that, when invoked the first > time, starts up a raw python interpreter, keeps it running somewhere, > and then forks it to run the actual python code. > > Then every invocation after that would make a new fork. I presume > forking is a LOT faster than re-invoking the entire startup. > > I suspect that many of the cases where startup time really matters is > when a command line utility is likely to be invoked many times ? often > in the same shell instance. > > So having a ?pre-built? warm interpreter ready to go could really help. > > This is way past my technical expertise to know if it?s possible, or > to try to prototype it, but I?m sure many of you would know. > > -CHB > > Sent from my iPhone > >> On May 7, 2018, at 12:28 PM, Neil Schemenauer wrote: >> >> On 2018-05-03, Lukasz Langa wrote: >>>> On May 2, 2018, at 8:57 PM, INADA Naoki wrote: >>>> * Add lazy compiling API or flag in `re` module. The pattern is compiled >>>> when first used. >>> >>> How about go the other way and allow compiling at Python >>> *compile*-time? That would actually make things faster instead of >>> just moving the time spent around. >> >> Lisp has a special form 'eval-when'. It can be used to cause >> evaluation of the body expression at compile time. >> >> In Carl's "A fast startup patch" post, he talks about getting rid of >> the unmarshal step and storing objects in the heap segment of the >> executable. Those would be the objects necessary to evaluate code. >> The marshal module has a limited number of types that it handle. >> I believe they are: bool, bytes, code objects, complex, Ellipsis >> float, frozenset, int, None, tuple and str. >> >> If the same mechanism could handle more types, rather than storing >> the code to be evaluated, we could store the objects created after >> evaluation of the top-level module body. Or, have a mechanism to >> mark which code should be evaluated at compile time (much like the >> eval-when form). >> >> For the re.compile example, the compiled regex could be what is >> stored after compiling the Python module (i.e. the re.compile gets >> run at compile time). The objects created by re.compile (e.g. >> SRE_Pattern) would have to be something that the heap dumper could >> handle. >> >> Traditionally, Python has had the model "there is only runtime". >> So, starting to do things at compile time complicates that model. 
>> >> Regards, >> >> Neil >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com From phd at phdru.name Fri May 11 11:27:35 2018 From: phd at phdru.name (Oleg Broytman) Date: Fri, 11 May 2018 17:27:35 +0200 Subject: [Python-Dev] Python startup time - daemon In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> Message-ID: <20180511152735.s7dpmifkd2gvz7zc@phdru.name> On Fri, May 11, 2018 at 07:38:05AM -0700, Chris Barker - NOAA Federal via Python-Dev wrote: > Could one make a little startup utility that, when invoked the first > time, starts up a raw python interpreter, keeps it running somewhere, > and then forks it to run the actual python code. > > Then every invocation after that would make a new fork. Used to be implemented (and discussed in this list) many times. Just a few examples: http://readyexec.sourceforge.net/ https://blogs.gnome.org/johan/2007/01/18/introducing-python-launcher/ Proven to be hard and never gain any traction. a) you don't want the daemon to import all possible modules so you need to run a separate copy of the daemon for every Python version, every user and every client program; b) you need to find "your" daemon - using TCP? unix sockets? named pipes? b) need to redirect stdio to/from the daemon; c) need to redirect signals and exceptions; d) have problems with elevated privileges (how do you elevate the daemon if the client was started with `sudo -H`?); e) not portable (there is a popular GUI that cannot fork). > -CHB > Sent from my iPhone Oleg. -- Oleg Broytman http://phdru.name/ phd at phdru.name Programmers don't die, they just GOSUB without RETURN. From solipsis at pitrou.net Fri May 11 11:34:26 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 11 May 2018 17:34:26 +0200 Subject: [Python-Dev] Python startup time - daemon References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <20180511152735.s7dpmifkd2gvz7zc@phdru.name> Message-ID: <20180511173426.41059616@fsol> Yes, you don't want this to be a generic utility, rather a helper library that people can integrate into their command-line applications to enable such startup caching. Regards Antoine. On Fri, 11 May 2018 17:27:35 +0200 Oleg Broytman wrote: > On Fri, May 11, 2018 at 07:38:05AM -0700, Chris Barker - NOAA Federal via Python-Dev wrote: > > Could one make a little startup utility that, when invoked the first > > time, starts up a raw python interpreter, keeps it running somewhere, > > and then forks it to run the actual python code. > > > > Then every invocation after that would make a new fork. > > Used to be implemented (and discussed in this list) many times. Just > a few examples: > > http://readyexec.sourceforge.net/ > https://blogs.gnome.org/johan/2007/01/18/introducing-python-launcher/ > > Proven to be hard and never gain any traction. 
> > a) you don't want the daemon to import all possible modules so you need > to run a separate copy of the daemon for every Python version, every > user and every client program; > b) you need to find "your" daemon - using TCP? unix sockets? named pipes? > b) need to redirect stdio to/from the daemon; > c) need to redirect signals and exceptions; > d) have problems with elevated privileges (how do you elevate the daemon > if the client was started with `sudo -H`?); > e) not portable (there is a popular GUI that cannot fork). > > > -CHB > > Sent from my iPhone > > Oleg. From status at bugs.python.org Fri May 11 12:09:55 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 11 May 2018 18:09:55 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180511160955.659DA5672E@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-05-04 - 2018-05-11) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 6644 (+15) closed 38571 (+24) total 45215 (+39) Open issues with patches: 2604 Issues opened (34) ================== #14384: Add "default" kw argument to operator.itemgetter and operator. https://bugs.python.org/issue14384 reopened by gvanrossum #33426: Behavior of os.path.join does not match documentation https://bugs.python.org/issue33426 opened by Michael Klatt #33427: Dead link in "The Python Standard Library" page https://bugs.python.org/issue33427 opened by lighthawk #33428: pathlib.Path.glob does not follow symlinks https://bugs.python.org/issue33428 opened by brianmsheldon #33430: Import secrets module in secrets examples https://bugs.python.org/issue33430 opened by dchimeno #33431: Change description about doc in programming, faq. 
https://bugs.python.org/issue33431 opened by lvhuiyang #33433: ipaddress is_private misleading for IPv4 mapped IPv6 addresses https://bugs.python.org/issue33433 opened by Thomas Kriechbaumer #33435: incorrect detection of information of some distributions https://bugs.python.org/issue33435 opened by mrandybu #33436: Add an interactive shell for Sqlite3 https://bugs.python.org/issue33436 opened by rhettinger #33437: Defining __init__ in enums https://bugs.python.org/issue33437 opened by killerrex #33438: pkg-config file misses flags for static linking https://bugs.python.org/issue33438 opened by pitrou #33439: python-config.py should be part of the stdlib https://bugs.python.org/issue33439 opened by pitrou #33440: Possible lazy import opportunities in `pathlib` https://bugs.python.org/issue33440 opened by ncoghlan #33441: Expose the sigset_t converter via private API https://bugs.python.org/issue33441 opened by serhiy.storchaka #33442: Python 3 doc sidebar dosnt follow page scrolling like 2.7 doc https://bugs.python.org/issue33442 opened by pete312 #33443: Typo in Python/import.c https://bugs.python.org/issue33443 opened by ukwksk #33446: destructors of local variables are not traced https://bugs.python.org/issue33446 opened by xdegaye #33447: Asynchronous lambda syntax https://bugs.python.org/issue33447 opened by Noah Simon #33448: Output_Typo_Error https://bugs.python.org/issue33448 opened by vishva_11 #33449: Documentation for email.charset confusing about the location o https://bugs.python.org/issue33449 opened by Francois Labelle #33450: unexpected EPROTOTYPE returned by sendto on MAC OSX https://bugs.python.org/issue33450 opened by racitup #33451: Start pyc file lock the file https://bugs.python.org/issue33451 opened by Jean-Louis Tamburini #33452: add user notification that parent init will not be called in d https://bugs.python.org/issue33452 opened by Ricyteach #33453: from __future__ import annotations breaks dataclasses ClassVar https://bugs.python.org/issue33453 opened by Ricyteach #33454: Mismatched C function signature in _xxsubinterpreters.channel_ https://bugs.python.org/issue33454 opened by serhiy.storchaka #33456: site.py: by default, a virtual environment is *not* isolated f https://bugs.python.org/issue33456 opened by meribold #33457: python-config ldflags, PEP 513 and explicit linking to libpyth https://bugs.python.org/issue33457 opened by dimi #33458: pdb.run() does not trace destructors of __main__ https://bugs.python.org/issue33458 opened by xdegaye #33459: Define "tuple display" in the docs https://bugs.python.org/issue33459 opened by adelfino #33460: ... used to indicate many different things in chapter 3, some https://bugs.python.org/issue33460 opened by lew18 #33461: json.loads(encoding=) does not emit deprecation warn https://bugs.python.org/issue33461 opened by mbussonn #33462: reversible dict https://bugs.python.org/issue33462 opened by selik #33463: Can namedtuple._asdict return a regular dict instead of Ordere https://bugs.python.org/issue33463 opened by selik #33464: AIX and "specialized downloads" links https://bugs.python.org/issue33464 opened by Michael.Felt Most recent 15 issues with no replies (15) ========================================== #33462: reversible dict https://bugs.python.org/issue33462 #33460: ... 
used to indicate many different things in chapter 3, some https://bugs.python.org/issue33460 #33457: python-config ldflags, PEP 513 and explicit linking to libpyth https://bugs.python.org/issue33457 #33456: site.py: by default, a virtual environment is *not* isolated f https://bugs.python.org/issue33456 #33454: Mismatched C function signature in _xxsubinterpreters.channel_ https://bugs.python.org/issue33454 #33451: Start pyc file lock the file https://bugs.python.org/issue33451 #33443: Typo in Python/import.c https://bugs.python.org/issue33443 #33442: Python 3 doc sidebar dosnt follow page scrolling like 2.7 doc https://bugs.python.org/issue33442 #33439: python-config.py should be part of the stdlib https://bugs.python.org/issue33439 #33435: incorrect detection of information of some distributions https://bugs.python.org/issue33435 #33433: ipaddress is_private misleading for IPv4 mapped IPv6 addresses https://bugs.python.org/issue33433 #33431: Change description about doc in programming, faq. https://bugs.python.org/issue33431 #33428: pathlib.Path.glob does not follow symlinks https://bugs.python.org/issue33428 #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 #33418: Memory leaks in functions https://bugs.python.org/issue33418 Most recent 15 issues waiting for review (15) ============================================= #33461: json.loads(encoding=) does not emit deprecation warn https://bugs.python.org/issue33461 #33459: Define "tuple display" in the docs https://bugs.python.org/issue33459 #33456: site.py: by default, a virtual environment is *not* isolated f https://bugs.python.org/issue33456 #33454: Mismatched C function signature in _xxsubinterpreters.channel_ https://bugs.python.org/issue33454 #33446: destructors of local variables are not traced https://bugs.python.org/issue33446 #33443: Typo in Python/import.c https://bugs.python.org/issue33443 #33441: Expose the sigset_t converter via private API https://bugs.python.org/issue33441 #33437: Defining __init__ in enums https://bugs.python.org/issue33437 #33435: incorrect detection of information of some distributions https://bugs.python.org/issue33435 #33431: Change description about doc in programming, faq. 
https://bugs.python.org/issue33431 #33430: Import secrets module in secrets examples https://bugs.python.org/issue33430 #33427: Dead link in "The Python Standard Library" page https://bugs.python.org/issue33427 #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 #33413: asyncio.gather should not use special Future https://bugs.python.org/issue33413 #33403: asyncio.tasks.wait does not allow to set custom exception when https://bugs.python.org/issue33403 Top 10 most discussed issues (10) ================================= #33453: from __future__ import annotations breaks dataclasses ClassVar https://bugs.python.org/issue33453 12 msgs #20087: Mismatch between glibc and X11 locale.alias https://bugs.python.org/issue20087 7 msgs #20104: expose posix_spawn(p) https://bugs.python.org/issue20104 6 msgs #33012: Invalid function cast warnings with gcc 8 for METH_NOARGS https://bugs.python.org/issue33012 4 msgs #33355: Windows 10 buildbot: 15 min timeout on test_mmap.test_large_fi https://bugs.python.org/issue33355 4 msgs #33389: argparse redundant help string https://bugs.python.org/issue33389 4 msgs #3692: improper scope in list comprehension, when used in class decla https://bugs.python.org/issue3692 3 msgs #13044: pdb throws AttributeError at end of debugging session https://bugs.python.org/issue13044 3 msgs #14384: Add "default" kw argument to operator.itemgetter and operator. https://bugs.python.org/issue14384 3 msgs #21983: segfault in ctypes.cast https://bugs.python.org/issue21983 3 msgs Issues closed (23) ================== #13525: Tutorial: Example of Source Code Encoding triggers error https://bugs.python.org/issue13525 closed by serhiy.storchaka #16653: reference kept in f_locals prevents the tracing/profiling of a https://bugs.python.org/issue16653 closed by xdegaye #25109: test_code_module tests fail when run from the installed locati https://bugs.python.org/issue25109 closed by doko #32189: SyntaxError for yield expressions inside comprehensions & gene https://bugs.python.org/issue32189 closed by serhiy.storchaka #32362: multiprocessing.connection.Connection misdocumented as multipr https://bugs.python.org/issue32362 closed by serhiy.storchaka #32626: Subscript unpacking raises SyntaxError https://bugs.python.org/issue32626 closed by serhiy.storchaka #32717: Document PEP 560 https://bugs.python.org/issue32717 closed by levkivskyi #32857: tkinter after_cancel does not behave correctly when called wit https://bugs.python.org/issue32857 closed by serhiy.storchaka #33038: GzipFile doesn't always ignore None as filename https://bugs.python.org/issue33038 closed by serhiy.storchaka #33296: datetime.datetime.utcfromtimestamp call decimal causes precisi https://bugs.python.org/issue33296 closed by serhiy.storchaka #33311: cgitb: remove parentheses when the error is in module https://bugs.python.org/issue33311 closed by serhiy.storchaka #33392: pathlib .glob('*/') returns files as well as directories https://bugs.python.org/issue33392 closed by SilentGhost #33414: Make shutil.copytree use os.scandir to take advantage of cache https://bugs.python.org/issue33414 closed by serhiy.storchaka #33417: Isinstance() behavior is not consistent with the document https://bugs.python.org/issue33417 closed by terry.reedy #33419: Add functools.partialclass https://bugs.python.org/issue33419 closed by ncoghlan #33420: [typing] __origin__ invariant broken https://bugs.python.org/issue33420 closed by levkivskyi #33422: Fix and update string/byte literals in help() 
https://bugs.python.org/issue33422 closed by serhiy.storchaka #33429: IDLE tooltips stopped working between 2.7.14 and 2.7.15 on Mac https://bugs.python.org/issue33429 closed by ned.deily #33432: No locale alias mapping key for en_IL https://bugs.python.org/issue33432 closed by serhiy.storchaka #33434: ^L character in Lib/email/generator.py https://bugs.python.org/issue33434 closed by serhiy.storchaka #33444: Memory leak/high usage on copy in different thread https://bugs.python.org/issue33444 closed by pitrou #33445: test_cprofile should fail instead of displaying a message https://bugs.python.org/issue33445 closed by benjamin.peterson #33455: test.test_posix.TestPosixSpawn::test_specify_environment fails https://bugs.python.org/issue33455 closed by serhiy.storchaka From guido at python.org Fri May 11 12:23:31 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 11 May 2018 12:23:31 -0400 Subject: [Python-Dev] Python startup time - daemon In-Reply-To: <20180511173426.41059616@fsol> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <20180511152735.s7dpmifkd2gvz7zc@phdru.name> <20180511173426.41059616@fsol> Message-ID: Indeed, we have an implementation of this specific to mypy. On Fri, May 11, 2018 at 11:34 AM, Antoine Pitrou wrote: > > Yes, you don't want this to be a generic utility, rather a helper > library that people can integrate into their command-line applications > to enable such startup caching. > > Regards > > Antoine. > > > On Fri, 11 May 2018 17:27:35 +0200 > Oleg Broytman wrote: > > On Fri, May 11, 2018 at 07:38:05AM -0700, Chris Barker - NOAA Federal > via Python-Dev wrote: > > > Could one make a little startup utility that, when invoked the first > > > time, starts up a raw python interpreter, keeps it running somewhere, > > > and then forks it to run the actual python code. > > > > > > Then every invocation after that would make a new fork. > > > > Used to be implemented (and discussed in this list) many times. Just > > a few examples: > > > > http://readyexec.sourceforge.net/ > > https://blogs.gnome.org/johan/2007/01/18/introducing-python-launcher/ > > > > Proven to be hard and never gain any traction. > > > > a) you don't want the daemon to import all possible modules so you need > > to run a separate copy of the daemon for every Python version, every > > user and every client program; > > b) you need to find "your" daemon - using TCP? unix sockets? named pipes? > > b) need to redirect stdio to/from the daemon; > > c) need to redirect signals and exceptions; > > d) have problems with elevated privileges (how do you elevate the daemon > > if the client was started with `sudo -H`?); > > e) not portable (there is a popular GUI that cannot fork). > > > > > -CHB > > > Sent from my iPhone > > > > Oleg. > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Fri May 11 14:15:11 2018 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Fri, 11 May 2018 11:15:11 -0700 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: <20180428175002.62163b10@fsol> References: <20180428175002.62163b10@fsol> Message-ID: > while the changes introduced by Python 3 > affect pretty much everyone, even people who only write small simple > scripts. Sure they do, but the *hard stuff* not so much. I have found 2to3 conversion to be remarkably easy and painless. And the whole Unicode thing is much easier. CHB > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov From barry at python.org Fri May 11 23:57:07 2018 From: barry at python.org (Barry Warsaw) Date: Fri, 11 May 2018 23:57:07 -0400 Subject: [Python-Dev] Python startup time - daemon In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <20180511152735.s7dpmifkd2gvz7zc@phdru.name> <20180511173426.41059616@fsol> Message-ID: <2183B04D-58FC-4F8F-AD2E-214141CC7D6C@python.org> On May 11, 2018, at 12:23, Guido van Rossum wrote: > > Indeed, we have an implementation of this specific to mypy. Is there anything in mypy?s implementation that can be generalized into a library? -Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: Message signed with OpenPGP URL: From guido at python.org Sat May 12 00:08:40 2018 From: guido at python.org (Guido van Rossum) Date: Sat, 12 May 2018 00:08:40 -0400 Subject: [Python-Dev] Python startup time - daemon In-Reply-To: <2183B04D-58FC-4F8F-AD2E-214141CC7D6C@python.org> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <20180511152735.s7dpmifkd2gvz7zc@phdru.name> <20180511173426.41059616@fsol> <2183B04D-58FC-4F8F-AD2E-214141CC7D6C@python.org> Message-ID: On Fri, May 11, 2018 at 11:57 PM, Barry Warsaw wrote: > On May 11, 2018, at 12:23, Guido van Rossum wrote: > > > > Indeed, we have an implementation of this specific to mypy. > > Is there anything in mypy?s implementation that can be generalized into a > library? > Not sure, here's the code: https://github.com/python/mypy/blob/master/mypy/dmypy.py https://github.com/python/mypy/blob/master/mypy/dmypy_server.py (also dmypy_util.py there) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sat May 12 02:39:45 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 12 May 2018 16:39:45 +1000 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID: <20180512063944.GA12683@ando.pearwood.info> On Fri, May 11, 2018 at 11:15:11AM -0700, Chris Barker - NOAA Federal via Python-Dev wrote: > > while the changes introduced by Python 3 > > affect pretty much everyone, even people who only write small simple > > scripts. > > Sure they do, but the *hard stuff* not so much. > > I have found 2to3 conversion to be remarkably easy and painless. 
For what it's worth, one of the programmers at my former employer decided to port their entire code base from 2.7 to 3.x without telling anyone. He got bored waiting for management permission, so he stayed back late after work one night and ported the whole thing in his own time; it took him about four hours, and then he casually announced it over IRC the next day. Now this was a small team of coders with a relatively small code base, maybe fifty to a hundred modules or so, ranging in size from a few lines to maybe a thousand. And they did have a fair set of unit tests. (Not as many as I would like, but some.) So not all Python 3 migration stories turn into horror stories :-) -- Steve

From esr at thyrsus.com Sat May 12 03:05:02 2018
From: esr at thyrsus.com (Eric S. Raymond) Date: Sat, 12 May 2018 03:05:02 -0400 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: <20180512063944.GA12683@ando.pearwood.info> References: <20180428175002.62163b10@fsol> <20180512063944.GA12683@ando.pearwood.info> Message-ID: <20180512070502.GA8794@thyrsus.com>

Steven D'Aprano : > So not all Python 3 migration stories turn into horror stories :-) Peter Donis and I wrote "Practical Python porting for systems programmers": http://www.catb.org/esr/faqs/practical-python-porting/ We developed and applied these techniques on src (a lightweight version-control system for single-contributor projects), reposurgeon (a tool for surgery on version-control repositories, 14KLOC), doclifter (man-page to XML-DocBook markup lifter, 8KLOC), the Python components of GPSD (9KLOC) and the Python components of NTPSec (secure network time service, 16KLOC). All this code runs under either 2 or 3 without requiring six or any other shim library. Applying the techniques is not particularly difficult. There were no horror stories at any point. I expect to keep writing Python in this polyglot idiom until 2 is obsolete enough to fall off the radar. -- Eric S. Raymond My work is funded by the Internet Civil Engineering Institute: https://icei.org Please visit their site and donate: the civilization you save might be your own.

From eric at trueblade.com Sat May 12 07:11:16 2018
From: eric at trueblade.com (Eric V. Smith) Date: Sat, 12 May 2018 07:11:16 -0400 Subject: [Python-Dev] PEP 572 and f-strings Message-ID: <661c3447-e52e-aa66-b028-63330c8fb279@trueblade.com>

I don't think it matters to its acceptance, but PEP 572 should at least mention that the := syntax means that you cannot use assignment expressions in f-strings. As I wrote in a python-ideas email, f'{x:=4}' already has a defined meaning (even if no one is using it). Eric

From skip.montanaro at gmail.com Sat May 12 08:14:09 2018
From: skip.montanaro at gmail.com (Skip Montanaro) Date: Sat, 12 May 2018 07:14:09 -0500 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID:

> I have found 2to3 conversion to be remarkably easy and painless. > And the whole Unicode thing is much easier. The intersection of bytes, str and unicode has been the only pain point for me. Everything else I've encountered has been pretty trivial.
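(The flavour of that pain point, for anyone who hasn't hit it yet - a couple of illustrative snippets, not taken from any particular code base:

    >>> b"abc"[0]          # Python 2 gives 'a', Python 3 gives the int 97
    >>> u"caf\xe9" + b"!"  # Python 2 quietly coerces, Python 3 raises TypeError

Anything that mixes text and bytes forces an explicit decision about encoding, and that is where the porting effort tends to concentrate.)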
Skip From rosuav at gmail.com Sat May 12 09:03:22 2018 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 12 May 2018 23:03:22 +1000 Subject: [Python-Dev] PEP 572 and f-strings In-Reply-To: <661c3447-e52e-aa66-b028-63330c8fb279@trueblade.com> References: <661c3447-e52e-aa66-b028-63330c8fb279@trueblade.com> Message-ID: On Sat, May 12, 2018 at 9:11 PM, Eric V. Smith wrote: > I don't think it matters to its acceptance, but PEP 572 should at least > mention that the := syntax means that you cannot use assignment expressions > in f-strings. > > As I wrote in a python-ideas email, f'{x:=4}' already has a defined meaning > (even if no one is using it). As with lambda functions, you can't write them the simple way. However, you can parenthesize it, and then it works fine. >>> f"@{(lambda: 42)}@" '@ at 0x7f09e18c4268>@' >>> f"@{(y := 1)}@" '@1@' >>> y 1 ChrisA From eric at trueblade.com Sat May 12 09:20:21 2018 From: eric at trueblade.com (Eric V. Smith) Date: Sat, 12 May 2018 09:20:21 -0400 Subject: [Python-Dev] PEP 572 and f-strings In-Reply-To: References: <661c3447-e52e-aa66-b028-63330c8fb279@trueblade.com> Message-ID: > On May 12, 2018, at 9:03 AM, Chris Angelico wrote: > >> On Sat, May 12, 2018 at 9:11 PM, Eric V. Smith wrote: >> I don't think it matters to its acceptance, but PEP 572 should at least >> mention that the := syntax means that you cannot use assignment expressions >> in f-strings. >> >> As I wrote in a python-ideas email, f'{x:=4}' already has a defined meaning >> (even if no one is using it). > > As with lambda functions, you can't write them the simple way. > However, you can parenthesize it, and then it works fine. > >>>> f"@{(lambda: 42)}@" > '@ at 0x7f09e18c4268>@' >>>> f"@{(y := 1)}@" > '@1@' >>>> y > 1 > Yes, but just as PEP 498 mentions lambdas, I think 572 should mention f-strings, and point out this workaround. Eric From steve at holdenweb.com Sat May 12 10:20:11 2018 From: steve at holdenweb.com (Steve Holden) Date: Sat, 12 May 2018 15:20:11 +0100 Subject: [Python-Dev] PEP 554 - strange random boldface Message-ID: Does anyone know why some lines in the tables of https://www.python.org/dev/peps/pep-0554/ appear in bold when the markup doesn't seem to call for it? I can't find a way to stop this, and it's bugging me! regards ? ? Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Sat May 12 10:26:34 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 12 May 2018 16:26:34 +0200 Subject: [Python-Dev] PEP 554 - strange random boldface References: Message-ID: <20180512162634.18e805f5@fsol> On Sat, 12 May 2018 15:20:11 +0100 Steve Holden wrote: > Does anyone know why some lines in the tables of > https://www.python.org/dev/peps/pep-0554/ appear in bold when the markup > doesn't seem to call for it? I can't find a way to stop this, and it's > bugging me! Apparently this may have to do with the given table cells being laid out in multiple lines? Regards Antoine. From steve at holdenweb.com Sat May 12 10:48:47 2018 From: steve at holdenweb.com (Steve Holden) Date: Sat, 12 May 2018 15:48:47 +0100 Subject: [Python-Dev] PEP 554 - strange random boldface In-Reply-To: <20180512162634.18e805f5@fsol> References: <20180512162634.18e805f5@fsol> Message-ID: It's certainly true that when I split any of those left-hand cells the bizarre emphasis occurs. Still looking for a workaround. 
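(Schematically - this isn't the literal PEP 554 table, just the shape of the problem - the emphasis appears whenever the second line of a left-hand cell is indented further than the first:

    +--------------------+--------------------------+
    | run(source_str,    | Run the given source in  |
    |     channels=None) | the interpreter.         |
    +--------------------+--------------------------+

reStructuredText then parses that cell as a definition list and renders the first line as a bold term.)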
regards Steve Steve Holden On Sat, May 12, 2018 at 3:26 PM, Antoine Pitrou wrote: > On Sat, 12 May 2018 15:20:11 +0100 > Steve Holden wrote: > > Does anyone know why some lines in the tables of > > https://www.python.org/dev/peps/pep-0554/ appear in bold when the markup > > doesn't seem to call for it? I can't find a way to stop this, and it's > > bugging me! > > Apparently this may have to do with the given table cells being laid out > in multiple lines? > > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > steve%40holdenweb.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Sat May 12 11:03:54 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 12 May 2018 17:03:54 +0200 Subject: [Python-Dev] PEP 554 - strange random boldface In-Reply-To: References: <20180512162634.18e805f5@fsol> Message-ID: <20180512170354.342cf75f@fsol> On Sat, 12 May 2018 15:48:47 +0100 Steve Holden wrote: > It's certainly true that when I split any of those left-hand cells the > bizarre emphasis occurs. Still looking for a workaround. Depending on whether you indent the second line it might be interpreted as a definition list item: http://docutils.sourceforge.net/docs/user/rst/quickref.html#definition-lists Regards Antoine. From storchaka at gmail.com Sat May 12 11:57:02 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 12 May 2018 18:57:02 +0300 Subject: [Python-Dev] PEP 554 - strange random boldface In-Reply-To: References: <20180512162634.18e805f5@fsol> Message-ID: 12.05.18 17:48, Steve Holden ????: > It's certainly true that when I split any of those left-hand cells the > bizarre emphasis occurs. Still looking for a workaround. Remove an extra indentation of the second line. From ericfahlgren at gmail.com Sat May 12 13:13:24 2018 From: ericfahlgren at gmail.com (Eric Fahlgren) Date: Sat, 12 May 2018 10:13:24 -0700 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID: [esr] > All this code runs under either 2 nor 3 without requiring six or any other > shim library. We've got an application that's about 500k loc, runs under both 2 and 3. It has only one shim, a 'metaclass' decorator similar to what six provides, other than that it's all quite clean 2- and 3-wise. We long ago adopted "from __future__" as a standard part of every source file, so we have internalized the Py3 print, division and import behaviors as the norm. An occasion scan with 2to3 kept us honest about list-producing vs iterator-producing functions, and renamings and such. Our major pain point was getting extension libraries that worked with 3, notably VTK and wxPython, which weren't ported completely until last year. We had been ready to switch over completely to Py3 almost four years ago, but those major pieces were missing. We have a production ready version running under 3.6, but are going to wait for the 3.7 release before cutting off support for Python 2 altogether. Of note, we did not have any Unicode issues, as we adopted wxPython's Unicode version as soon as it was available (6-7 years ago?), and had virtually no issues then or since. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve at holdenweb.com Sun May 13 06:34:24 2018 From: steve at holdenweb.com (Steve Holden) Date: Sun, 13 May 2018 11:34:24 +0100 Subject: [Python-Dev] PEP 554 - strange random boldface In-Reply-To: References: <20180512162634.18e805f5@fsol> Message-ID: On Sat, May 12, 2018 at 4:57 PM, Serhiy Storchaka wrote: > 12.05.18 17:48, Steve Holden ????: > >> It's certainly true that when I split any of those left-hand cells the >> bizarre emphasis occurs. Still looking for a workaround. >> > > Remove an extra indentation of the second line. > > ?That's nailed it, thanks!? -------------- next part -------------- An HTML attachment was scrubbed... URL: From mmangoba at python.org Sun May 13 11:44:59 2018 From: mmangoba at python.org (Mark Mangoba) Date: Sun, 13 May 2018 08:44:59 -0700 Subject: [Python-Dev] Bugs Migration to OpenShift In-Reply-To: References: Message-ID: Hi All, Victor made a good point here. After discussion with Maciej, we will postpone this migration to OpenShift until after sprints since bpo will be heavily used. Maciej and I will update everyone on the timeline after sprints. Best regards, Mark On Mon, Apr 30, 2018 at 12:54 AM, Victor Stinner wrote: > Does it mean that the final switch will happen during the sprints? > Would it be possible to do it before or after? If bugs.python.org > doesn't work during the sprint, it will be much harder to contribute > to CPython during the sprints. > > (If I misunderstood, ignore my message :-)) > > Victor > > 2018-04-29 19:07 GMT+02:00 Mark Mangoba : >> Hi All, >> >> We?re planning to finish up the bugs.python.org migration to Red Hat >> OpenShift by May 14th (US Pycon Sprints). For the most part >> everything will stay same, with the exception of cleaning up some old >> URL?s and redirects from the previous hosting provider: Upfront >> Software. >> >> We will post a more concrete timeline here by May 1st, but wanted to >> share this exciting news to move bugs.python.org into a more stable >> and optimal state. >> >> Thank you all for your patience and feedback. A special thanks to >> Maciej Szulik and Red Hat for helping the PSF with this project. >> >> Best regards, >> Mark >> >> -- >> Mark Mangoba | PSF IT Manager | Python Software Foundation | >> mmangoba at python.org | python.org | Infrastructure Staff: >> infrastructure-staff at python.org | GPG: 2DE4 D92B 739C 649B EBB8 CCF6 >> DC05 E024 5F4C A0D1 >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com -- Mark Mangoba | PSF IT Manager | Python Software Foundation | mmangoba at python.org | python.org | Infrastructure Staff: infrastructure-staff at python.org | GPG: 2DE4 D92B 739C 649B EBB8 CCF6 DC05 E024 5F4C A0D1 From christian at python.org Sun May 13 13:42:48 2018 From: christian at python.org (Christian Heimes) Date: Sun, 13 May 2018 13:42:48 -0400 Subject: [Python-Dev] bpo-28055: Fix unaligned accesses in siphash24(). (GH-6123) In-Reply-To: <40kLQS4ymzzFr4Z@mail.python.org> References: <40kLQS4ymzzFr4Z@mail.python.org> Message-ID: On 2018-05-13 06:57, Serhiy Storchaka wrote: > https://github.com/python/cpython/commit/1e2ec8a996daec65d8d5a3d43b66a9909c6d0653 > commit: 1e2ec8a996daec65d8d5a3d43b66a9909c6d0653 > branch: master > author: Rolf Eike Beer > committer: Serhiy Storchaka > date: 2018-05-13T13:57:31+03:00 > summary: > > bpo-28055: Fix unaligned accesses in siphash24(). 
(GH-6123) > > The hash implementation casts the input pointer to uint64_t* and directly reads > from this, which may cause unaligned accesses. Use memcpy() instead so this code > will not crash with SIGBUS on sparc. > > https://bugs.gentoo.org/show_bug.cgi?id=636400 > > files: > A Misc/NEWS.d/next/Core and Builtins/2018-04-25-20-44-42.bpo-28055.f49kfC.rst > M Python/pyhash.c

Hi Serhiy, I was against the approach for a good reason. The PR adds additional CPU instructions and changes the memory access pattern in a critical path of CPython. There is no reason to add additional overhead for the majority of users on X86 and X86_64 architectures. The memcpy() call should only be used on architectures that do not support unaligned memory access. See comment https://bugs.python.org/issue28055#msg276257 At least for latest GCC, the change seems to be fine. GCC emits the same assembly code for X86_64 before and after your change. Did you check the output on other CPU architectures as well as clang and MSVC, too? Christian

From rosuav at gmail.com Mon May 14 03:30:40 2018
From: rosuav at gmail.com (Chris Angelico) Date: Mon, 14 May 2018 17:30:40 +1000 Subject: [Python-Dev] Looking for examples: proof that a list comp is a function Message-ID:

Guido has stated that this parallel is desired and important:

    result = [f(x) for x in iter if g(x)]
    result = list(f(x) for x in iter if g(x))

Obviously the genexp has to be implemented with a nested function, since there's no guarantee that it'll be iterated over in this way. With current semantics, you can easily prove that a list comp is implemented with a function by looking at how it interacts with other scopes (mainly class scope), but Tim's proposal may change that. So I'm looking for examples that prove that a list comp is executed inside an implicit function. Ideally, examples that are supported by language guarantees, but something that's "CPython has done it this way since 3.0" is important too. I'm aware of just two: the name lookup interaction that may be changing, and the fact that there's an extra line in a traceback. And the latter, as far as I know, is not guaranteed (and I doubt anyone would care if it changed).
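For concreteness, those two look like this in CPython 3.x (quick sketches typed from memory rather than pasted from a session):

    class C:
        x = 10
        ys = [x for i in range(3)]   # NameError: name 'x' is not defined,
                                     # because the comprehension body runs in
                                     # its own function scope, which skips the
                                     # class scope

and the extra frame in the traceback:

    >>> [1/0 for i in range(1)]
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "<stdin>", line 1, in <listcomp>
    ZeroDivisionError: division by zero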
Are there any other provable points? > Related to the traceback one: the extra stack frame shows up in a debugger, and a profiler counts the extra frame separately. The first often confuses me because I don't immediately see which frame I'm in just by seeing the line of code. There are odd interactions between `yield`/`yield from` and comprehensions that was discussed some months ago: "[Python-Dev] Tricky way of of creating a generator via a comprehension expression". Wait, is this a continuation of that discussion? > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tim.peters at gmail.com Mon May 14 04:05:27 2018 From: tim.peters at gmail.com (Tim Peters) Date: Mon, 14 May 2018 03:05:27 -0500 Subject: [Python-Dev] Looking for examples: proof that a list comp is a function In-Reply-To: References: Message-ID: [Chris Angelico ... > With current semantics, you can easily prove that a list comp is > implemented with a function by looking at how it interacts with other > scopes (mainly class scope), but Tim's proposal may change that. Absolutely not. I haven't considered for a nanosecond that anything _essential_ would change in the current implementation. In effect, my proposal to bind assignment statement targets that appear in a listcomp or genexp in the blocks that immediately contain their synthetic functions "merely" sprinkles in some `nonlocal` and/or `global` declarations to change the targets' scopes. Indeed, it _relies_ on that they're implemented as (potentially nested) synthetic functions today. And if you haven't read my proposed changes to the reference manual, they explicitly state that they're talking about the synthetic functions created to implement genexps and listcomps. > So I'm looking for examples that prove that a list comp is executed > inside an implicit function. Ideally, examples that are supported by > language guarantees, but something that's "CPython has done it this > way since 3.0" is important too. I don't believe you'll find that - but, of course, may be wrong about that. > I'm aware of just two: the name lookup interaction that may be > changing, and the fact that there's an extra line in a traceback. And > the latter, as far as I know, is not guaranteed (and I doubt anyone > would care if it changed). Are there any other provable points? Nick pointed me to these future docs that _will_ pretty much imply it: https://docs.python.org/dev/reference/expressions.html#displays-for-lists-sets-and-dictionaries In part: """ However, aside from the iterable expression in the leftmost for clause, the comprehension is executed in a separate implicitly nested scope. This ensures that names assigned to in the target list don?t ?leak? into the enclosing scope. The iterable expression in the leftmost for clause is evaluated directly in the enclosing scope and then passed as an argument to the implictly nested scope. """ I say "pretty much" because, for whatever reason(s), it seems to be trying hard _not_ to use the word "function". But I can't guess what "then passed as an argument to the implicitly nested scope" could possibly mean otherwise (it doesn't make literal sense to "pass an argument" to "a scope"). 
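For the record, the expansion that wording is dancing around is roughly the following (a sketch of the semantics with made-up helper names, not what the compiler literally emits):

    # result = [f(x) for x in iterable if g(x)]  behaves like:
    def _listcomp(outermost_iter):
        result = []
        for x in outermost_iter:
            if g(x):
                result.append(f(x))
        return result

    result = _listcomp(iter(iterable))

Only `iterable` is evaluated in the enclosing scope; the iterator built from it is what gets "passed as an argument" to the synthetic function.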
From ncoghlan at gmail.com Mon May 14 08:16:29 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 14 May 2018 08:16:29 -0400 Subject: [Python-Dev] Looking for examples: proof that a list comp is a function In-Reply-To: References: Message-ID: On 14 May 2018 at 04:05, Tim Peters wrote: > I say "pretty much" because, for whatever reason(s), it seems to be > trying hard _not_ to use the word "function". But I can't guess what > "then passed as an argument to the implicitly nested scope" could > possibly mean otherwise (it doesn't make literal sense to "pass an > argument" to "a scope"). > I think my motivation was to avoid promising *exact* equivalence with a regular nested function, since the define-and-call may allow us opportunities for optimization that don't exist when those two are separated (e.g. Guido's point in another thread that we actually avoid calling "iter" twice even though the nominal expansion implies that we should). However, you're right that just calling it a function may be clearer than relying on the ill-defined phrase "implicitly nested scope". For Chris's actual question, this is part of why I think adding "parentlocal" would actually make the scoping proposal easier to explain, as it means the name binding semantics aren't a uniquely magical property of binding expressions (however spelled), they're just a new form of target scope declaration that the compiler understands, and the binding expression form implies. Note: eas*ier*, not easy ;) It also occurs to me that we could do something pretty neat for class scopes: have parent local declarations in methods target the implicit lexical scope where __class__ lives (to support zero-arg super), *not* the class body. That would entail adding a "classlocal" declaration to target that implied scope, though. That would give the following definition for "lexical scopes that parent local scoping can target": - module globals (parentlocal -> global) - function locals, including lambda expression locals (parentlocal -> nonlocal) - implicit class closure, where __class__ lives (parentlocal -> nonlocal in current scope, classlocal in class scope) Most notably, in the synthetic functions created for generator expressions and comprehensions, a parentlocal declaration in a child scope would imply a parentlocal declaration in the synthetic function as well, propagating back up the chain of nested lexical scopes until it terminated in one of the above three permitted targets. Using the explicit forms would then look like: from __future import parent_scopes # Enable the explicit declaration forms class C: classlocal _n # Declares _n as a cell akin to __class__ rather than a class attribute _n = [] @staticmethod def get_count(): return len(_n) assert not hasattr(C, "_n") assert C.get_count() == 0 def _writes_to_parent_scope(): parentlocal outer_name outer_name = 42 assert outer_name == 42 I'm still doubtful the complexity of actually doing that is warranted, but I'm now satisfied the semantics can be well specified in a way that allows us to retain the explanation of generator expressions and comprehensions in terms of their statement level counterparts (with the added bonus of making "__class__" a little less of a magically unique snowflake along the way). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Mon May 14 12:20:01 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 14 May 2018 12:20:01 -0400 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action In-Reply-To: References: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> Message-ID: On Wed, May 2, 2018 at 8:21 PM, Terry Reedy wrote: > On 5/2/2018 4:38 PM, Ivan Pozdeev via Python-Dev wrote: > >> The bottom line is: Tkinter is currently broken >> > > This is way over-stated. Many modules have bugs, somethings in features > more central to their main purpose. I'll suggest a re-statement: tkinter is not thread safe, and yet it is documented as being thread safe (or at least not documented as NOT being thread safe) This is either a bug(s) in the implementation or the docs. So what are the solutions? 1) fix the docs -- unless tkInter is made thread safe really soon, and fixes are back-ported, this seems like a no brainer -- at least temporarily. 2) fix the issues that make tkInter not thread safe -- apparently there is a thread safe tcl/tk, so it should be possible, though I have to say I'm really surprised that that's the case for an old C code base -- but great! The problem here is that we'll need qualified people to submit and review the code, and it really should get some extensive testing -- that's a lot of work. And it's going to take time, so see (1) above. Another issue: Many GUI toolkits are not thread safe (I have personal experience with wxPython), so it's not so bad to simply say "don't do that" for tkInter -- that is, don't make tkInter calls from more than one thread. However, wxPython (for example) makes this not-too-bad for multi-threaded programs by providing thread-safe ways to put events on the event queue -- whether with wx.PostEvent, or the utilities wx.CallAfter() and wx.CallLater(). This makes it pretty easy to keep the GUI in one thread while taking advantage of threads for long running tasks, etc. IIUC, tkinter does not have such an easy way to interact with the GUI from another thread -- so maybe adding that would be a good first step. I neither use tkinter, nor have the expertise to contribute this -- but clearly Ivan does -- and maybe others would want to help. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Mon May 14 12:26:19 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 14 May 2018 12:26:19 -0400 Subject: [Python-Dev] Python startup time In-Reply-To: <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: On Fri, May 11, 2018 at 11:05 AM, Ryan Gonzalez wrote: > https://refi64.com/uprocd/ very cool -- but *nix only, of course :-( But it seems that there is a demand for this sort of thing, and a few major projects are rolling their own. So maybe it makes sense to put something into the standard library that everyone could contribute to and use. With regard to forking -- is there another way? 
I don't have the expertise to have any idea if this is possible, but: start up python capture the entire runtime image as a single binary blob. could that blob be simply loaded into memory and run? (hmm -- probably not -- memory addresses would be hard-coded then, yes?) or is memory virtualized enough these days? -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From songofacandy at gmail.com Mon May 14 12:33:18 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 15 May 2018 01:33:18 +0900 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: On Tue, May 15, 2018 at 1:29 AM Chris Barker via Python-Dev < python-dev at python.org> wrote: > On Fri, May 11, 2018 at 11:05 AM, Ryan Gonzalez wrote: >> https://refi64.com/uprocd/ > very cool -- but *nix only, of course :-( > But it seems that there is a demand for this sort of thing, and a few major projects are rolling their own. So maybe it makes sense to put something into the standard library that everyone could contribute to and use. > With regard to forking -- is there another way? I don't have the expertise to have any idea if this is possible, but: > start up python > capture the entire runtime image as a single binary blob. > could that blob be simply loaded into memory and run? > (hmm -- probably not -- memory addresses would be hard-coded then, yes?) or is memory virtualized enough these days? > -CHB It will broke hash randomization. See also: https://www.cvedetails.com/cve/CVE-2017-11499/ Regards, -- Inada Naoki From chris.barker at noaa.gov Mon May 14 12:38:21 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 14 May 2018 12:38:21 -0400 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: On Mon, May 14, 2018 at 12:33 PM, INADA Naoki wrote: > It will broke hash randomization. > > See also: https://www.cvedetails.com/cve/CVE-2017-11499/ I'm not enough of a security expert to know how much that matters in this case, but I suppose one could do a bit of post-proccessing on the image to randomize the hashes? or is that just insane? Also -- I wasn't thinking it would be a pre-build binary blob that everyone used -- but one built on the fly on an individual system, maybe once per reboot, or once per shell instance even. So if you are running, e.g. hg a bunch of times in a shell, does it matter that the instances are all identical? -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chris.barker at noaa.gov Mon May 14 12:34:22 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 14 May 2018 12:34:22 -0400 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID: On Sat, May 12, 2018 at 8:14 AM, Skip Montanaro wrote: > > I have found 2to3 conversion to be remarkably easy and painless. > > > And the whole Unicode thing is much easier. > Another point here: between 3.0 and 3.6 (.5?) -- py3 grew a lot of minor features that made it easier to write py2/py3 compatible code. u"string", b'bytes %i' % something -- and when where the various __future__ imports made available? If these had been in place in 3.0, the whole process would have been easier :-( -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Mon May 14 12:58:44 2018 From: ethan at stoneleaf.us (Ethan Furman) Date: Mon, 14 May 2018 09:58:44 -0700 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID: <5AF9C044.5080001@stoneleaf.us> On 05/14/2018 09:34 AM, Chris Barker via Python-Dev wrote: > between 3.0 and 3.6 (.5?) -- py3 grew a lot of minor features that made it easier to write py2/py3 compatible code. > u"string", b'bytes %i' % something -- and when where the various __future__ imports made available? > > If these had been in place in 3.0, the whole process would have been easier :-( You'll need to be more specific. __future__ has been around for a looooong time. -- ~Ethan~ From solipsis at pitrou.net Mon May 14 12:57:50 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 14 May 2018 18:57:50 +0200 Subject: [Python-Dev] Python startup time References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: <20180514185750.7525d1eb@fsol> On Tue, 15 May 2018 01:33:18 +0900 INADA Naoki wrote: > > It will broke hash randomization. > > See also: https://www.cvedetails.com/cve/CVE-2017-11499/ I don't know why it would. The mechanism of pre-initializing a process which is re-used accross many requests is how most server applications of Python already work (you don't want to bear the cost of spawning a new interpreter for each request, as antiquated CGI does). I have not heard that it breaks hash randomization, so a similar mechanism on the CLI side shouldn't break it either. Regards Antoine. From storchaka at gmail.com Mon May 14 13:03:22 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 14 May 2018 20:03:22 +0300 Subject: [Python-Dev] bpo-28055: Fix unaligned accesses in siphash24(). (GH-6123) In-Reply-To: References: <40kLQS4ymzzFr4Z@mail.python.org> Message-ID: 13.05.18 20:42, Christian Heimes ????: > I was against the approach a good reason. The PR adds additional CPU > instructions and changes memory access pattern in a critical path of > CPython. There is no reason to add additional overhead for the majority > of users or X86 and X86_64 architectures. The memcpy() call should only > be used on architectures that do not support unaligned memory access. 
> See comment https://bugs.python.org/issue28055#msg276257 > > At least for latest GCC, the change seems to be fine. GCC emits the same > assembly code for X86_64 before and after your change. Did you check the > output on other CPU architectures as well as clang and MSVC, too? For the initial implementation of pyhash.c [1] I proposed a patch that use memcpy() conditionally to avoid an overhead on Windows: +#ifdef _MSC_VER + block.value = *(const Py_uhash_t*)p; +#else + memcpy(block.bytes, p, SIZEOF_PY_UHASH_T); +#endif (and similar code for FNV). But many developers confirmed that all modern compilers including latest versions of MS VS optimize memcpy() with a constant size into a single CPU instruction. Seems avoiding to use memcpy() no longer needed. If using memcpy() adds an overhead on some platforms we can return to using a compiler/platform depending code. [1] https://bugs.python.org/issue19183 From songofacandy at gmail.com Mon May 14 13:12:18 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 15 May 2018 02:12:18 +0900 Subject: [Python-Dev] Python startup time In-Reply-To: <20180514185750.7525d1eb@fsol> References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <20180514185750.7525d1eb@fsol> Message-ID: I'm sorry, the word *will* may be stronger than I thought. I meant if memory image dumped on disk is used casually, it may make easier to make security hole. For example, if `hg` memory image is reused, and it can be leaked in some way, hg serve will be hashdos weak. I don't deny that it's useful and safe when it's used carefully. Regards, On Tue, May 15, 2018 at 1:58 AM Antoine Pitrou wrote: > On Tue, 15 May 2018 01:33:18 +0900 > INADA Naoki wrote: > > > > It will broke hash randomization. > > > > See also: https://www.cvedetails.com/cve/CVE-2017-11499/ > I don't know why it would. The mechanism of pre-initializing a process > which is re-used accross many requests is how most server applications > of Python already work (you don't want to bear the cost of spawning > a new interpreter for each request, as antiquated CGI does). I have not > heard that it breaks hash randomization, so a similar mechanism on the > CLI side shouldn't break it either. > Regards > Antoine. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com -- -- INADA Naoki From antoine at python.org Mon May 14 13:17:32 2018 From: antoine at python.org (Antoine Pitrou) Date: Mon, 14 May 2018 19:17:32 +0200 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <20180514185750.7525d1eb@fsol> Message-ID: Le 14/05/2018 ? 19:12, INADA Naoki a ?crit?: > I'm sorry, the word *will* may be stronger than I thought. > > I meant if memory image dumped on disk is used casually, > it may make easier to make security hole. > > For example, if `hg` memory image is reused, and it can be leaked in some > way, > hg serve will be hashdos weak. 
This discussion subthread is not about having a memory image dumped on disk, but a daemon utility that preloads a new Python process when you first start up your CLI application. Each time a new process is preloaded, it will by construction use a new hash seed. (by contrast, the Node.js CVE issue you linked to is about having the same hash seed accross a Node.js version; that's disastrous) Also you add a reuse limit to ensure that the hash seed is rotated (e.g. every 100 invocations). Regards Antoine. From songofacandy at gmail.com Mon May 14 13:34:11 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 15 May 2018 02:34:11 +0900 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <20180514185750.7525d1eb@fsol> Message-ID: 2018?5?15?(?) 2:17 Antoine Pitrou : > > Le 14/05/2018 ? 19:12, INADA Naoki a ?crit : > > I'm sorry, the word *will* may be stronger than I thought. > > > > I meant if memory image dumped on disk is used casually, > > it may make easier to make security hole. > > > > For example, if `hg` memory image is reused, and it can be leaked in some > > way, > > hg serve will be hashdos weak. > > This discussion subthread is not about having a memory image dumped on > disk, but a daemon utility that preloads a new Python process when you > first start up your CLI application. Each time a new process is > preloaded, it will by construction use a new hash seed. > My reply was to: > capture the entire runtime image as a single binary blob. > could that blob be simply loaded into memory and run? So I thought about reusing memory image undeterministic times. Of course, prefork is much safer because hash initial vector is only in process ram. Regards, -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Mon May 14 13:36:34 2018 From: wes.turner at gmail.com (Wes Turner) Date: Mon, 14 May 2018 13:36:34 -0400 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: <5AF9C044.5080001@stoneleaf.us> References: <20180428175002.62163b10@fsol> <5AF9C044.5080001@stoneleaf.us> Message-ID: On Monday, May 14, 2018, Ethan Furman wrote: > On 05/14/2018 09:34 AM, Chris Barker via Python-Dev wrote: > > between 3.0 and 3.6 (.5?) -- py3 grew a lot of minor features that made it >> easier to write py2/py3 compatible code. >> u"string", b'bytes %i' % something -- and when where the various >> __future__ imports made available? >> >> If these had been in place in 3.0, the whole process would have been >> easier :-( >> > > You'll need to be more specific. __future__ has been around for a > looooong time. https://github.com/python/cpython/blame/master/Lib/__future__.py > > -- > ~Ethan~ > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/wes. > turner%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From phd at phdru.name Mon May 14 13:51:35 2018 From: phd at phdru.name (Oleg Broytman) Date: Mon, 14 May 2018 19:51:35 +0200 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: <20180514175135.yjsccannratvn5pl@phdru.name> On Mon, May 14, 2018 at 12:26:19PM -0400, Chris Barker via Python-Dev wrote: > With regard to forking -- is there another way? I don't have the expertise > to have any idea if this is possible, but: > > start up python > > capture the entire runtime image as a single binary blob. > could that blob be simply loaded into memory and run? Like emacs unexec? https://www.google.com/search?q=emacs+unexec > -CHB > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov Oleg. -- Oleg Broytman http://phdru.name/ phd at phdru.name Programmers don't die, they just GOSUB without RETURN. From encukou at gmail.com Mon May 14 13:56:56 2018 From: encukou at gmail.com (Petr Viktorin) Date: Mon, 14 May 2018 13:56:56 -0400 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AED7166.1010008@UGent.be> References: <5AED7166.1010008@UGent.be> Message-ID: <55ce70f5-4c6a-566a-712a-ec0f4c9405f1@gmail.com> On 05/05/18 04:55, Jeroen Demeyer wrote: > Hello all, > > I have updated PEP 575 in response to some posts on this mailing list > and to some discussions in person with the core Cython developers. > See https://www.python.org/dev/peps/pep-0575/ > > The main differences with respect to the previous version are: > > * "builtin_function" was renamed to "cfunction". Since we are changing > the name anyway, "cfunction" looked like a better choice because the > word "built-in" typically refers to things from the builtins module. > > * defined_function now only defines an API (it must support all > attributes that a Python function has) without specifying the > implementation. > > * The "Two-phase Implementation" proposal for better backwards > compatibility has been expanded and now offers 100% backwards > compatibility for the classes and for the inspect functions. Hi, I'm reading the PEP thoroughly, trying to "swap it into my brain" for the next few days. It does quite a lot of things, and the changes are all intertwined, which will make it hard to get reviewed and accepted. Are there parts that can be left to a subsequent PEP, to simplify the document (and implementation)? It seems to me that the current complexity is (partly) due to the fact that how functions are *called* is tied to how they are *introspected*. Perhaps starting to separate that is a better way to untangle things than arranging a class hierarchy? Can the problem of allowing introspection ("It is currently not possible to implement a function efficiently in C (only built-in functions can do that) while still allowing introspection like inspect.signature or inspect.getsourcefile (only Python functions can do that)") be solved in a better way? Maybe we can change `inspect` to use duck-typing instead of isinstance? 
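For illustration, such a duck-typed check could look something like the sketch below; the helper name is made up and this is not an existing inspect API:

    def looks_like_python_function(obj):
        # Treat anything that is callable and carries the usual introspection
        # attributes as a "function", regardless of its concrete class.
        return (callable(obj)
                and hasattr(obj, '__code__')
                and hasattr(obj, '__defaults__')
                and hasattr(obj, '__kwdefaults__'))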
Then, if built-in functions were subclassable, Cython functions could need to provide appropriate __code__/__defaults__/__kwdefaults__ attributes that inspect would pick up. Maybe we could eve add more attributes (__isgenerator__?) to separate how a function is called from how it should be introspected -- e.g. make inspect not consult co_flags. From robb at datalogics.com Mon May 14 11:59:57 2018 From: robb at datalogics.com (Rob Boehne) Date: Mon, 14 May 2018 15:59:57 +0000 Subject: [Python-Dev] bpo-28055: Fix unaligned accesses in siphash24(). (GH-6123) In-Reply-To: References: <40kLQS4ymzzFr4Z@mail.python.org> Message-ID: ?On 5/13/18, 12:44 PM, "Python-Dev on behalf of Christian Heimes" wrote: On 2018-05-13 06:57, Serhiy Storchaka wrote: > https://github.com/python/cpython/commit/1e2ec8a996daec65d8d5a3d43b66a9909c6d0653 > commit: 1e2ec8a996daec65d8d5a3d43b66a9909c6d0653 > branch: master > author: Rolf Eike Beer > committer: Serhiy Storchaka > date: 2018-05-13T13:57:31+03:00 > summary: > > bpo-28055: Fix unaligned accesses in siphash24(). (GH-6123) > > The hash implementation casts the input pointer to uint64_t* and directly reads > from this, which may cause unaligned accesses. Use memcpy() instead so this code > will not crash with SIGBUS on sparc. > > https://bugs.gentoo.org/show_bug.cgi?id=636400 > > files: > A Misc/NEWS.d/next/Core and Builtins/2018-04-25-20-44-42.bpo-28055.f49kfC.rst > M Python/pyhash.c Hi Serhiy, I was against the approach a good reason. The PR adds additional CPU instructions and changes memory access pattern in a critical path of CPython. There is no reason to add additional overhead for the majority of users or X86 and X86_64 architectures. The memcpy() call should only be used on architectures that do not support unaligned memory access. See comment https://bugs.python.org/issue28055#msg276257 X86 won't *directly* write misaligned data either, it will intrinsically copy it out to a properly aligned location. In C this is also "undefined behavior", so technically the C implementation can do whatever it wants - like raise an exception - which is will on SPARC. While X86 users may not notice any problems, depending on undefined behavior working in any particular way has many drawbacks. Often C compilers will optimize code in ways that assume there is no undefined behavior in ways that breaks code that does. At least for latest GCC, the change seems to be fine. GCC emits the same assembly code for X86_64 before and after your change. Did you check the output on other CPU architectures as well as clang and MSVC, too? Christian _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/robb%40datalogics.com From tjreedy at udel.edu Mon May 14 14:58:49 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 14 May 2018 14:58:49 -0400 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action In-Reply-To: References: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> Message-ID: On 5/14/2018 12:20 PM, Chris Barker via Python-Dev wrote: > On Wed, May 2, 2018 at 8:21 PM, Terry Reedy > wrote: > > On 5/2/2018 4:38 PM, Ivan Pozdeev via Python-Dev wrote: > > The bottom line is: Tkinter is currently broken > > > This is way over-stated.? Many modules have bugs, somethings in > features more central to their main purpose. 
> > I'll suggest a re-statement: > > tkinter is not thread safe, Still over-stated. If one uses tcl/tk compiled with thread support, tkinter *is* thread-safe. This is 'as far as I know' from running posted 'failing' examples (possible with bug fixes) with 3.5+ on Windows, which is installed with tcl/tk 8.6, which defaults to thread-safe. Tkinter was intended to also be thread-safe when using tcl/tk without thread support, which was the default for tcl/tk 8.5 and before. The posted examples can fail on 2.x on Windows, which comes with tcl/tk 8.5 or before. _tkinter.c has some different #ifdefs for the two situations. > and yet it is documented as being thread safe True in https://docs.python.org/3/library/tk.html Unspecified in https://docs.python.org/3/library/tkinter.html > This is either a bug(s) in the implementation or the docs. Both > So what are the solutions? > > 1) fix the docs -- unless tkInter is made thread safe really soon, and > fixes are back-ported, this seems like a no brainer -- at least temporarily. https://bugs.python.org/issue33479 'Document tkinter and threads' > 2) fix the issues that make tkInter not thread safe with non-thread tcl/tk. https://bugs.python.org/issue33257 has a patch that might improve the situation for one type of call. Fixing everything might not be possible. AFAIK, there are currently no tests of thread safety. -- Terry Jan Reedy From vano at mail.mipt.ru Mon May 14 15:05:12 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Mon, 14 May 2018 22:05:12 +0300 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action In-Reply-To: References: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> Message-ID: <6f5f8ad3-e583-cd6c-ea55-59a49e9ed7ac@mail.mipt.ru> On 14.05.2018 21:58, Terry Reedy wrote: > On 5/14/2018 12:20 PM, Chris Barker via Python-Dev wrote: >> On Wed, May 2, 2018 at 8:21 PM, Terry Reedy > > wrote: >> >> ??? On 5/2/2018 4:38 PM, Ivan Pozdeev via Python-Dev wrote: >> >> ??????? The bottom line is: Tkinter is currently broken >> >> >> ??? This is way over-stated.? Many modules have bugs, somethings in >> ??? features more central to their main purpose. >> >> I'll suggest a re-statement: >> >> tkinter is not thread safe, > > Still over-stated.? If one uses tcl/tk compiled with thread support, > tkinter *is* thread-safe.? This is 'as far as I know' from running > posted 'failing' examples (possible with bug fixes) with 3.5+ on > Windows, which is installed with tcl/tk 8.6, which defaults to > thread-safe. > This means that you didn't (yet) read the letter that I attached to https://bugs.python.org/issue33479 . Reciting the relevant section: === The reality is that with neither flavor of Tcl is Tkinter completely thread-safe, but with threaded flavor, it's more so: * with nonthreaded Tcl, making concurrent Tcl calls leads to crashes due to incorrect management of the "Tcl lock" as per https://bugs.python.org/issue33257 * with threaded Tcl, the only issue that I found so far is that a few APIs must be called from the interpreter's thread (https://bugs.python.org/issue33412#msg316152; so far, I know `mainloop()` and `destroy()` to be this) -- while most can be called from anywhere. Whether the exceptions are justified is a matter of discussion (e.g. at first glance, `destroy()` can be fixed). === > Tkinter was intended to also be thread-safe when using tcl/tk without > thread support, which was the default for tcl/tk 8.5 and before. The > posted examples can fail on 2.x on Windows, which comes with tcl/tk > 8.5 or before. 
_tkinter.c has some different #ifdefs for the two > situations. > >> and yet it is documented as being thread safe > > True in https://docs.python.org/3/library/tk.html > Unspecified in https://docs.python.org/3/library/tkinter.html > >> This is either a bug(s) in the implementation or the docs. > > Both > >> So what are the solutions? >> >> 1) fix the docs -- unless tkInter is made thread safe really soon, >> and fixes are back-ported, this seems like a no brainer -- at least >> temporarily. > > https://bugs.python.org/issue33479 'Document tkinter and threads' > >> 2) fix the issues that make tkInter not thread safe > > with non-thread tcl/tk. > > https://bugs.python.org/issue33257 has a patch that might improve the > situation for one type of call.? Fixing everything might not be > possible.? AFAIK, there are currently no tests of thread safety. > -- Regards, Ivan From vano at mail.mipt.ru Mon May 14 15:10:14 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Mon, 14 May 2018 22:10:14 +0300 Subject: [Python-Dev] bpo-33257: seeking advice & approval on the course of action In-Reply-To: <6f5f8ad3-e583-cd6c-ea55-59a49e9ed7ac@mail.mipt.ru> References: <0a89a3c6-65f5-c0e0-826b-7db3a79a7ef0@mail.mipt.ru> <6f5f8ad3-e583-cd6c-ea55-59a49e9ed7ac@mail.mipt.ru> Message-ID: <971b93b6-4750-c002-bc7c-37563306d432@mail.mipt.ru> On 14.05.2018 22:05, Ivan Pozdeev wrote: > On 14.05.2018 21:58, Terry Reedy wrote: >> On 5/14/2018 12:20 PM, Chris Barker via Python-Dev wrote: >>> On Wed, May 2, 2018 at 8:21 PM, Terry Reedy >> > wrote: >>> >>> ??? On 5/2/2018 4:38 PM, Ivan Pozdeev via Python-Dev wrote: >>> >>> ??????? The bottom line is: Tkinter is currently broken >>> >>> >>> ??? This is way over-stated.? Many modules have bugs, somethings in >>> ??? features more central to their main purpose. >>> >>> I'll suggest a re-statement: >>> >>> tkinter is not thread safe, >> >> Still over-stated.? If one uses tcl/tk compiled with thread support, >> tkinter *is* thread-safe.? This is 'as far as I know' from running >> posted 'failing' examples (possible with bug fixes) with 3.5+ on >> Windows, which is installed with tcl/tk 8.6, which defaults to >> thread-safe. >> > This means that you didn't (yet) read the letter that I attached to > https://bugs.python.org/issue33479 . > Reciting the relevant section: > > === > The reality is that with neither flavor of Tcl is Tkinter completely > thread-safe, but with threaded flavor, it's more so: > > * with nonthreaded Tcl, making concurrent Tcl calls leads to crashes > due to incorrect management of the "Tcl lock" as per > https://bugs.python.org/issue33257 > * with threaded Tcl, the only issue that I found so far is that a few > APIs must be called from the interpreter's thread > (https://bugs.python.org/issue33412#msg316152; so far, I know > `mainloop()` and `destroy()` to be this) -- while most can be called > from anywhere. Whether the exceptions are justified is a matter of > discussion (e.g. at first glance, `destroy()` can be fixed). And another undocumented limitation for threaded Tcl: when calling anything from outside the interpreter thread, `mainloop()` must be running in the interpreter threads, or the call will either raise or hang (dunno any more details atm). > === >> Tkinter was intended to also be thread-safe when using tcl/tk without >> thread support, which was the default for tcl/tk 8.5 and before. The >> posted examples can fail on 2.x on Windows, which comes with tcl/tk >> 8.5 or before. _tkinter.c has some different #ifdefs for the two >> situations. 
>> >>> and yet it is documented as being thread safe >> >> True in https://docs.python.org/3/library/tk.html >> Unspecified in https://docs.python.org/3/library/tkinter.html >> >>> This is either a bug(s) in the implementation or the docs. >> >> Both >> >>> So what are the solutions? >>> >>> 1) fix the docs -- unless tkInter is made thread safe really soon, >>> and fixes are back-ported, this seems like a no brainer -- at least >>> temporarily. >> >> https://bugs.python.org/issue33479 'Document tkinter and threads' >> >>> 2) fix the issues that make tkInter not thread safe >> >> with non-thread tcl/tk. >> >> https://bugs.python.org/issue33257 has a patch that might improve the >> situation for one type of call.? Fixing everything might not be >> possible.? AFAIK, there are currently no tests of thread safety. >> > -- Regards, Ivan From mal at egenix.com Mon May 14 15:41:53 2018 From: mal at egenix.com (M.-A. Lemburg) Date: Mon, 14 May 2018 21:41:53 +0200 Subject: [Python-Dev] Python startup time In-Reply-To: References: <12e547e7-de6f-2dca-d3fe-47b63e108a8b@hastings.org> <5C4A992D-891A-4278-82E9-A1625EF4BA3E@langa.pl> <20180507162846.oxkkag27zgoah4wb@python.ca> <1634fbbe500.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: On 14.05.2018 18:26, Chris Barker via Python-Dev wrote: > > > On Fri, May 11, 2018 at 11:05 AM, Ryan Gonzalez > wrote: > > https://refi64.com/uprocd/ > > > very cool -- but *nix only, of course :-( > > But it seems that there is a demand for this sort of thing, and a few > major projects are rolling their own. So maybe it makes sense to put > something into the standard library that everyone could contribute to > and use. > > With regard to forking -- is there another way? I don't have the > expertise to have any idea if this is possible, but: > > start up python > > capture the entire runtime image as a single binary blob. > > could that blob be simply loaded into memory and run? > > (hmm -- probably not -- memory addresses would be hard-coded then, yes?) > or is memory virtualized enough these days? You might want to look into combining this with PyRun: https://www.egenix.com/products/python/PyRun/ which takes care of mmap'ing the byte code of the stdlib into memory. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ From encukou at gmail.com Mon May 14 16:38:36 2018 From: encukou at gmail.com (Petr Viktorin) Date: Mon, 14 May 2018 16:38:36 -0400 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AED7166.1010008@UGent.be> References: <5AED7166.1010008@UGent.be> Message-ID: On 05/05/18 04:55, Jeroen Demeyer wrote: > Hello all, > > I have updated PEP 575 in response to some posts on this mailing list > and to some discussions in person with the core Cython developers. 
> See https://www.python.org/dev/peps/pep-0575/ > > The main differences with respect to the previous version are: > > * "builtin_function" was renamed to "cfunction". Since we are changing > the name anyway, "cfunction" looked like a better choice because the > word "built-in" typically refers to things from the builtins module. > > * defined_function now only defines an API (it must support all > attributes that a Python function has) without specifying the > implementation. > > * The "Two-phase Implementation" proposal for better backwards > compatibility has been expanded and now offers 100% backwards > compatibility for the classes and for the inspect functions. The PEP says: > User flags: METH_CUSTOM and METH_USRx > These flags are meant for applications that want to use tp_methods for an extension type or m_methods for a module but that do not want the default built-in functions to be created. Those applications would set METH_CUSTOM. The application is also free to use METH_USR0, ..., METH_USR7 for its own purposes, for example to customize the creation of special function instances. > > There is no immediate concrete use case, but we expect that tools which auto-generate functions or extension types may want to define custom flags. Given that it costs essentially nothing to have these flags, it seems like a good idea to allow it. Why are these flags added? They aren't free ? the space of available flags is not infinite. If something (Cython?) needs eight of them, it would be nice to mention the use case, at least as an example. What should Python do with a m_methods entry that has METH_CUSTOM set? Again it would be nice to have an example or use case. From chris.barker at noaa.gov Mon May 14 18:33:14 2018 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Mon, 14 May 2018 18:33:14 -0400 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> <5AF9C044.5080001@stoneleaf.us> Message-ID: > between 3.0 and 3.6 (.5?) -- py3 grew a lot of minor features that made it >> easier to write py2/py3 compatible code. >> u"string", b'bytes %i' % something -- and when where the various >> __future__ imports made available? >> > > You'll need to be more specific. __future__ has been around for a > looooong time. I meant the various ones that support py2/3 compatibility ? I know division predates py3, not sure about the others. But it was a rhetorical question anyway :-) -CHB https://github.com/python/cpython/blame/master/Lib/__future__.py > > -- > ~Ethan~ > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/wes. > turner%40gmail.com > _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at eitanadler.com Tue May 15 00:01:12 2018 From: lists at eitanadler.com (Eitan Adler) Date: Mon, 14 May 2018 21:01:12 -0700 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues Message-ID: Hi all, Hope this is an appropriate list to send this message to; if not, please redirect me. 
I am working on updating, fixing, or otherwise changing python's configure.ac. This work is complex, lacks dedicated unit tests, and is easy to miss corner cases. As these changes make it into master I'll be watching the various build bots and be on the look out for related failures. That said, I will miss things. Please feel free to tag me in related PRs or bugs or emails over the next few weeks. Thanks -- Eitan Adler From vstinner at redhat.com Tue May 15 00:36:53 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 15 May 2018 00:36:53 -0400 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues In-Reply-To: References: Message-ID: Hi Eitan, 2018-05-15 0:01 GMT-04:00 Eitan Adler : > I am working on updating, fixing, or otherwise changing python's > configure.ac. This work is complex, (...) Is your work public? Is there an open issue on bugs.python.org or an open pull request? If not, would you mind to at least describe the changes that you plan to do? > Please feel free to tag me in > related PRs or bugs or emails over the next few weeks. Hopefully, we only rarely need to modify configure.ac. Victor From lists at eitanadler.com Tue May 15 01:58:41 2018 From: lists at eitanadler.com (Eitan Adler) Date: Mon, 14 May 2018 22:58:41 -0700 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues In-Reply-To: References: Message-ID: On Monday, 14 May 2018, Victor Stinner wrote: > Hi Eitan, > > 2018-05-15 0:01 GMT-04:00 Eitan Adler : > > I am working on updating, fixing, or otherwise changing python's > > configure.ac. This work is complex, (...) > > Is your work public? Is there an open issue on bugs.python.org or an > open pull request? I'm opening bugs and PRs as I Go. Some examples are: https://github.com/python/cpython/commit/98929b545e86e7c7296c912d8f34e8e8d3fd6439 https://github.com/python/cpython/pull/6845 https://github.com/python/cpython/pull/6848 https://github.com/python/cpython/pull/6849 https://bugs.python.org/issue33485 And so on > > If not, would you mind to at least describe the changes that you plan to > do? > > > Please feel free to tag me in > > related PRs or bugs or emails over the next few weeks. > > Hopefully, we only rarely need to modify configure.ac I'm primarily worried about breaking arcane platforms I don't have direct access to. > > Victor > -- Sent from my Turing Machine -------------- next part -------------- An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Tue May 15 04:45:57 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 15 May 2018 10:45:57 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <97d56787b9094d5cb88401ea6ff4c722@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <97d56787b9094d5cb88401ea6ff4c722@xmail101.UGent.be> Message-ID: <5AFA9E45.9070106@UGent.be> On 2018-05-14 22:38, Petr Viktorin wrote: > Why are these flags added? > They aren't free ? the space of available flags is not infinite. If > something (Cython?) needs eight of them, it would be nice to mention the > use case, at least as an example. > > What should Python do with a m_methods entry that has METH_CUSTOM set? > Again it would be nice to have an example or use case. They have no specific use case. I just added this because it made sense abstractly. I can remove this from my PEP to simplify it. 
From J.Demeyer at UGent.be Tue May 15 05:15:27 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 15 May 2018 11:15:27 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> Message-ID: <5AFAA52F.6070908@UGent.be> On 2018-05-14 19:56, Petr Viktorin wrote: > It does quite a lot of things, and the changes are all intertwined, > which will make it hard to get reviewed and accepted. The problem is that many things *are* already intertwined currently. You cannot deal with functions without involving methods for example. An important note is that it was never my goal to create a minimal PEP. I did not aim for changing as little as possible. I was thinking: we are changing functions, what would be the best way to implement them? The main goal was fixing introspection but a secondary goal was fixing many of the existing warts with functions. Probably this secondary goal will in the end be more important for the general Python community. I would argue that my PEP may look complicated, but I'm sure that the end result will be a simpler implementation than we have today. Instead of having four related classes implementing similar functionality (builtin_function_or_method, method, method_descriptor and function), we have just one (base_function). The existing classes like method still exist with my PEP but a lot of the core functionality is implemented in the common base_function. This is really one of the key points: while my PEP *could* be implemented without the base_function class, the resulting code would be far more complicated. > Are there parts that can be left to a subsequent PEP, to simplify the > document (and implementation)? It depends. The current PEP is more or less a finished product. You can of course pick parts of the PEP and implement those, but then those parts will be somewhat meaningless individually. But if PEP 575 is accepted "in principle" (you accept the new class hierarchy for functions), then the details could be spread over several PEPs. But those individual PEPs would only make sense in the light of PEP 575. A few small details could be left out, such as METH_BINDING. But that wouldn't yield a significant simplification. > It seems to me that the current complexity is (partly) due to the fact > that how functions are *called* is tied to how they are *introspected*. The *existing* situation is that introspection is totally tied to how functions are called. So I would argue that my PEP improves on that by removing some of those ties by moving __call__ to a common base class. > Maybe we can change `inspect` to use duck-typing instead of isinstance? That was rejected on https://bugs.python.org/issue30071 > Then, if built-in functions were subclassable, Cython functions could > need to provide appropriate __code__/__defaults__/__kwdefaults__ > attributes that inspect would pick up. Of course, that's possible. I don't think that it would be a *better* solution than my PEP though. Essentially, my PEP started from that idea. But then you realize that you'll need to handle not only built-in functions but also method descriptors (unbound methods of extension types). And you'll want to allow __get__ for the new subclasses. 
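A rough pure-Python analogue of that binding behaviour, purely as a sketch (the real proposal implements this in C on the shared base class; the class names here are invented):

    import types

    class cfunction_like:
        # Toy stand-in for a function-like object not created by 'def'.
        def __init__(self, raw):
            self.raw = raw

        def __call__(self, *args, **kwargs):
            return self.raw(*args, **kwargs)

        def __get__(self, obj, objtype=None):
            # Accessing the attribute on an instance yields a bound method,
            # exactly as for a plain Python function.
            if obj is None:
                return self
            return types.MethodType(self, obj)

    class Greeter:
        hello = cfunction_like(lambda self, name: "hello, " + name)

    print(Greeter().hello("world"))    # -> hello, world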
For efficiency, you really want to implement __get__ in the base classes (both builtin_function_or_method and method_descriptor) because of optimizations combining __get__ and __call__ (the LOAD_METHOD and CALL_METHOD opcodes). And then you realize that it makes no sense to duplicate all that functionality in both classes. So you add a new base class. You already end up with a major part of my PEP this way. That still leaves the issue of what inspect.isfunction() should do. Often, "isfunction" is used to check for "has introspection" so you certainly want to allow for custom built-in function classes to satisfy inspect.isfunction(). So you need to involve Python functions too in the class hierarchy. And that's more or less my PEP. Jeroen. From nad at python.org Tue May 15 07:51:52 2018 From: nad at python.org (Ned Deily) Date: Tue, 15 May 2018 07:51:52 -0400 Subject: [Python-Dev] FINAL WEEK FOR 3.7.0 CHANGES! Message-ID: This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your feature fixes, bug fixes, and documentation updates in before 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days from now. We will then tag and produce the 3.7.0 release candidate. Our goal continues been to be to have no changes between the release candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are no critical problems outstanding and that documentation for new features in 3.7 is complete (including NEWS and What's New items), and that 3.7 is getting exposure and tested with our various platorms and third-party distributions and applications. Those of us who are participating in the development sprints at PyCon US 2018 here in Cleveland can feel the excitement building as we work through the remaining issues, including completing the "What's New in 3.7" document and final feature documentation. (We wish you could all be here.) As noted before, the ABI for 3.7.0 was frozen as of 3.7.0b3. You should now be treating the 3.7 branch as if it were already released and in maintenance mode. That means you should only push the kinds of changes that are appropriate for a maintenance release: non-ABI-changing bug and feature fixes and documentation updates. If you find a problem that requires an ABI-altering or other significant user-facing change (for example, something likely to introduce an incompatibility with existing users' code or require rebuilding of user extension modules), please make sure to set the b.p.o issue to "release blocker" priority and describe there why you feel the change is necessary. If you are reviewing PRs for 3.7 (and please do!), be on the lookout for and flag potential incompatibilities (we've all made them). Thanks again for all of your hard work towards making 3.7.0 yet another great release - coming to a website near you on 06-15! Release Managerly Yours, --Ned https://www.python.org/dev/peps/pep-0537/ -- Ned Deily nad at python.org -- [] From nad at python.org Tue May 15 08:54:22 2018 From: nad at python.org (Ned Deily) Date: Tue, 15 May 2018 08:54:22 -0400 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues In-Reply-To: References: Message-ID: On May 15, 2018, at 01:58, Eitan Adler wrote: > On Monday, 14 May 2018, Victor Stinner wrote: > Hi Eitan, > > 2018-05-15 0:01 GMT-04:00 Eitan Adler : > > I am working on updating, fixing, or otherwise changing python's > > configure.ac. This work is complex, (...) > > Is your work public? 
Is there an open issue on bugs.python.org or an > open pull request? > > I'm opening bugs and PRs as I Go. Some examples are: > > https://github.com/python/cpython/commit/98929b545e86e7c7296c912d8f34e8e8d3fd6439 > https://github.com/python/cpython/pull/6845 > https://github.com/python/cpython/pull/6848 > https://github.com/python/cpython/pull/6849 > https://bugs.python.org/issue33485 > > And so on > > > If not, would you mind to at least describe the changes that you plan to do? > > > Please feel free to tag me in > > related PRs or bugs or emails over the next few weeks. > > Hopefully, we only rarely need to modify configure.ac > > I'm primarily worried about breaking arcane platforms I don't have direct access to. Hi, Eitan! As you recognize, it is always a bit dangerous to modify configure.ac and friends as we do support so many platforms and configuration and downstream users try combinations that we don't claim to test or support. So, we try to be conservative about making changes there and do so only with good reason. So far, it's somewhat difficult for me to understand what you are trying to accomplish with the changes you've noted so far other than various cosmetic cleanups. It is also difficult to properly review a bunch of small PRs that modify the same configuration files and especially without an overall tracking issue. For most of this to move forward, I think you should create or adapt at least one b.p.o issue to describe what changes you are suggesting and why and how they apply to our various platforms and then consolidate PRs under that PR. Don't be surprised if the PRs don't get much attention right away as we're busy at the moment trying to get 3.7.0 out the door. And it would be best to avoid including generated files (like configure vs configure.ac) in new PRs as that will only add to the likelihood of merge conflicts and review complexity. Thanks! - Ned Deily nad at python.org -- [] From J.Demeyer at UGent.be Tue May 15 09:57:14 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 15 May 2018 15:57:14 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <97d56787b9094d5cb88401ea6ff4c722@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <97d56787b9094d5cb88401ea6ff4c722@xmail101.UGent.be> Message-ID: <5AFAE73A.1010204@UGent.be> On 2018-05-14 22:38, Petr Viktorin wrote: > Why are these flags added? I made a minor edit to the PEP to remove those flags: https://github.com/python/peps/pull/649 From encukou at gmail.com Tue May 15 12:36:18 2018 From: encukou at gmail.com (Petr Viktorin) Date: Tue, 15 May 2018 12:36:18 -0400 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AFAA52F.6070908@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> Message-ID: <74624524-2ef9-7ef7-677a-72a65a81f911@gmail.com> On 05/15/18 05:15, Jeroen Demeyer wrote: > On 2018-05-14 19:56, Petr Viktorin wrote: >> It does quite a lot of things, and the changes are all intertwined, >> which will make it hard to get reviewed and accepted. > > The problem is that many things *are* already intertwined currently. You > cannot deal with functions without involving methods for example. > > An important note is that it was never my goal to create a minimal PEP. > I did not aim for changing as little as possible. I was thinking: we are > changing functions, what would be the best way to implement them? That might be a problem. 
For the change to be accepted, a core developer will need to commit to
maintaining the code, understand it, and accept responsibility for anything
that's broken. Naturally, large-scale changes have less of a chance there.
With such a "finished product" PEP, it's hard to see if some of the various
problems could be solved in a better way -- faster, more maintainable, or
less disruptive.
It's also harder from a psychological point of view: you obviously already
put in a lot of good work, and it's harder to waste that work if an even
better solution is found.
(I always tell Marcel to view large-scale changes as a hands-on learning
experiment -- more likely to be thrown away than accepted -- rather than as
creating a finished project.)

> The main goal was fixing introspection but a secondary goal was fixing
> many of the existing warts with functions. Probably this secondary goal
> will in the end be more important for the general Python community.
>
> I would argue that my PEP may look complicated, but I'm sure that the
> end result will be a simpler implementation than we have today. Instead
> of having four related classes implementing similar functionality
> (builtin_function_or_method, method, method_descriptor and function), we
> have just one (base_function). The existing classes like method still
> exist with my PEP but a lot of the core functionality is implemented in
> the common base_function.

Is a branching class hierarchy, with quite a few new flags for feature
selection, the kind of simplicity we want?
Would it be possible to first decouple things, reducing the complexity, and
then tackle the individual problems?

> This is really one of the key points: while my PEP *could* be
> implemented without the base_function class, the resulting code would be
> far more complicated.
>
>> Are there parts that can be left to a subsequent PEP, to simplify the
>> document (and implementation)?
>
> It depends. The current PEP is more or less a finished product. You can
> of course pick parts of the PEP and implement those, but then those
> parts will be somewhat meaningless individually.
>
> But if PEP 575 is accepted "in principle" (you accept the new class
> hierarchy for functions), then the details could be spread over several
> PEPs. But those individual PEPs would only make sense in the light of
> PEP 575.

Well, that's the thing I'm not sure about.
The class hierarchy still makes it hard to decouple the introspection side
(how functions look on the outside) from the calling mechanism (how the
calling works internally). I fear that it is replacing complexity with a
different kind of complexity.
So my main question now is, can this all be *simplified* rather than
*reorganized*? It's a genuine question -- I don't know, but I feel it
should be explored more.

> A few small details could be left out, such as METH_BINDING. But that
> wouldn't yield a significant simplification.
>
>> It seems to me that the current complexity is (partly) due to the fact
>> that how functions are *called* is tied to how they are *introspected*.
>
> The *existing* situation is that introspection is totally tied to how
> functions are called. So I would argue that my PEP improves on that by
> removing some of those ties by moving __call__ to a common base class.
>
>> Maybe we can change `inspect` to use duck-typing instead of isinstance?
> > That was rejected on https://bugs.python.org/issue30071 > >> Then, if built-in functions were subclassable, Cython functions could >> need to provide appropriate __code__/__defaults__/__kwdefaults__ >> attributes that inspect would pick up. > > Of course, that's possible. I don't think that it would be a *better* > solution than my PEP though. > > Essentially, my PEP started from that idea. But then you realize that > you'll need to handle not only built-in functions but also method > descriptors (unbound methods of extension types). And you'll want to > allow __get__ for the new subclasses. For efficiency, you really want to > implement __get__ in the base classes (both builtin_function_or_method > and method_descriptor) because of optimizations combining __get__ and > __call__ (the LOAD_METHOD and CALL_METHOD opcodes). And then you realize > that it makes no sense to duplicate all that functionality in both > classes. So you add a new base class. You already end up with a major > part of my PEP this way. Starting from an idea and ironing out the details it lets you (and, if since you published results, everyone else) figure out the tricky details. But ultimately it's exploring one path of doing things ? it doesn't necessarily lead to the best way of doing something. > That still leaves the issue of what inspect.isfunction() should do. > Often, "isfunction" is used to check for "has introspection" so you > certainly want to allow for custom built-in function classes to satisfy > inspect.isfunction(). So you need to involve Python functions too in the > class hierarchy. And that's more or less my PEP. That's a good question. Maybe inspect.isfunction() serves too many use cases to be useful. Cython functons should behave like "def" functions in some cases, and like built-in functions in others. *Can* a single boolean usefully distinguish between these? The current docs are unclear, and before we change how inspect.isfunction ultimately behaves, I'd like to make its purpose clearer (and try to check how that meshes with the current use cases). What is your ultimate use case? Is it documentation tools like pydoc? If we design for them, what other uses of isfunction() will be left out? I hope this doesn't read as too negative. I'm grateful for the work you put in, and it is useful (even in case we do end up settling on a different solution). From lists at eitanadler.com Tue May 15 14:03:10 2018 From: lists at eitanadler.com (Eitan Adler) Date: Tue, 15 May 2018 11:03:10 -0700 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues In-Reply-To: References: Message-ID: On 15 May 2018 at 05:54, Ned Deily wrote: > On May 15, 2018, at 01:58, Eitan Adler wrote: >> On Monday, 14 May 2018, Victor Stinner wrote: >> Hi Eitan, >> >> 2018-05-15 0:01 GMT-04:00 Eitan Adler : >> > I am working on updating, fixing, or otherwise changing python's >> > configure.ac. This work is complex, (...) >> >> Is your work public? Is there an open issue on bugs.python.org or an >> open pull request? >> >> I'm opening bugs and PRs as I Go. Some examples are: >> >> https://github.com/python/cpython/commit/98929b545e86e7c7296c912d8f34e8e8d3fd6439 >> https://github.com/python/cpython/pull/6845 >> https://github.com/python/cpython/pull/6848 >> https://github.com/python/cpython/pull/6849 >> https://bugs.python.org/issue33485 >> >> And so on >> >> >> If not, would you mind to at least describe the changes that you plan to do? 
>> >> > Please feel free to tag me in >> > related PRs or bugs or emails over the next few weeks. >> >> Hopefully, we only rarely need to modify configure.ac >> >> I'm primarily worried about breaking arcane platforms I don't have direct access to. > > > Hi, Eitan! > > As you recognize, it is always a bit dangerous to modify configure.ac and friends as we do support so many platforms and configuration and downstream users try combinations that we don't claim to test or support. So, we try to be conservative about making changes there and do so only with good reason. > > So far, it's somewhat difficult for me to understand what you are trying to accomplish with the changes you've noted so far other than various cosmetic cleanups. This all started when I found a bug in the C code of python. I wanted to submit a PR and test my change, but found that it was painful to compile Python on many platforms. In particular I needed to use "clang" but configure.ac was looking for a compiler called "gcc -pthread". As I started to fix this, I realized there is a lot of unneeded complexity in configure.ac and am now trying to clean that up. > It is also difficult to properly review a bunch of small PRs that modify the same configuration files and especially without an overall tracking issue. I wanted to keep the reviews small to be reviewable, revertable, and bisectable. Is there a nicer way of handling that? Maybe a single large review with a series of small commits? > For most of this to move forward, I think you should create or adapt at least one b.p.o issue to describe what changes you are suggesting and why and how they apply to our various platforms and then consolidate PRs under that PR. Don't be surprised if the PRs don't get much attention right away as we're busy at the moment trying to get 3.7.0 out the door. Understood. There are lot of PRs and a lot of work. I've been pretty happy with the traction so far. > And it would be best to avoid including generated files (like configure vs configure.ac) in new PRs as that will only add to the likelihood of merge conflicts and review complexity. I've gotten three different pieces of advice about this: (a) always include them, so its easier to review (b) never include, so its easier to review and and avoid merge conflicts (c) only include if your tool version matches what was used originally I don't care much but its clear there isn't agreement. -- Eitan Adler From vstinner at redhat.com Tue May 15 14:40:02 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 15 May 2018 14:40:02 -0400 Subject: [Python-Dev] Changes to configure.ac, auto-detection and related build issues In-Reply-To: References: Message-ID: I didn't look at your PRs yet, but PR commits are squashed into a single commit. So it's better to have multiple PRs for different changes. Victor From nataliemorrisonxm980xm at yahoo.com Tue May 15 16:43:10 2018 From: nataliemorrisonxm980xm at yahoo.com (nataliemorrisonxm980xm at yahoo.com) Date: Tue, 15 May 2018 16:43:10 -0400 (EDT) Subject: [Python-Dev] =?utf-8?q?=5BPython-checkins=5D_bpo-33038=3A_Fix_gz?= =?utf-8?q?ip=2EGzipFile_for_file_objects_with_a_non-string_name_attribute?= =?utf-8?b?LiAoR0gtNjA5NSk=?= Message-ID: <40lqJt01xkzFqRd@mail.python.org> An HTML attachment was scrubbed... 
URL: From nataliemorrisonxm980xm at yahoo.com Tue May 15 16:43:14 2018 From: nataliemorrisonxm980xm at yahoo.com (nataliemorrisonxm980xm at yahoo.com) Date: Tue, 15 May 2018 16:43:14 -0400 (EDT) Subject: [Python-Dev] =?utf-8?q?=5BPython-checkins=5D_bpo-33038=3A_Fix_gz?= =?utf-8?q?ip=2EGzipFile_for_file_objects_with_a_non-string_name_attribute?= =?utf-8?b?LiAoR0gtNjA5NSk=?= Message-ID: <40lqJy13jVzFqpT@mail.python.org> An HTML attachment was scrubbed... URL: From nataliemorrisonxm980xm at yahoo.com Tue May 15 16:43:18 2018 From: nataliemorrisonxm980xm at yahoo.com (nataliemorrisonxm980xm at yahoo.com) Date: Tue, 15 May 2018 16:43:18 -0400 (EDT) Subject: [Python-Dev] =?utf-8?q?=5BPython-checkins=5D_bpo-33038=3A_Fix_gz?= =?utf-8?q?ip=2EGzipFile_for_file_objects_with_a_non-string_name_attribute?= =?utf-8?b?LiAoR0gtNjA5NSk=?= Message-ID: <40lqK252vnzFqpT@mail.python.org> An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Tue May 15 16:44:36 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 15 May 2018 22:44:36 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> Message-ID: <5AFB46B4.4070603@UGent.be> On 2018-05-15 18:36, Petr Viktorin wrote: > What is your ultimate use case? (I'll just answer this one question now and reply to the more technical comments in another thread) My ultimate use case is being able to implement functions and methods which are (A) equally fast as the existing built-in function and methods (B) and behave from a user's point of view like Python functions. With objective (A) I want no compromises. CPython has many optimizations for built-in functions and all of them should work for my new functions. Objective (B) means more precisely: 1. Implementing __get__ to turn a function in a method. 2. Being recognized as "functions" by tools like Sphinx and IPython. 3. Introspection support such as inspect.signature() and inspect.getsource(). Jeroen. From tim.peters at gmail.com Tue May 15 16:53:06 2018 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 15 May 2018 15:53:06 -0500 Subject: [Python-Dev] [Python-checkins] bpo-33038: Fix gzip.GzipFile for file objects with a non-string name attribute. (GH-6095) In-Reply-To: <40lqJt01xkzFqRd@mail.python.org> References: <40lqJt01xkzFqRd@mail.python.org> Message-ID: Sorry about approving this message (I'm a python-dev list moderator)! There will be a few more like it. Looking closer, it appears to be another variation of pure-nuisance spam that's been flooding all sorts of python.org lists. You've been spared many hundreds of those here, but since this one appeared to contain actual Python-related content, I reflexively approved it. On Tue, May 15, 2018 at 3:43 PM, nataliemorrisonxm980xm--- via Python-Dev wrote: > > > ________________________________ > From: Serhiy Storchaka > To: python-checkins at python.org > Sent: Wednesday, 9 May 2018, 10:14 > Subject: [Python-checkins] bpo-33038: Fix gzip.GzipFile for file objects > with a non-string name attribute. 
(GH-6095) > ,,, From ivan_pozdeev at mail.ru Tue May 15 13:43:45 2018 From: ivan_pozdeev at mail.ru (Ivan Pozdeev) Date: Tue, 15 May 2018 20:43:45 +0300 Subject: [Python-Dev] Making Tcl/Tk more suitable for embedding (was: [issue33479] Document tkinter and threads) In-Reply-To: <1526397606.43.0.682650639539.issue33479@psf.upfronthosting.co.za> References: <1526397606.43.0.682650639539.issue33479@psf.upfronthosting.co.za> Message-ID: <8b14bed0-87d2-9118-cca8-1e353c6b4ecb@mail.ru> Subj is off topic for the ticket, so I guess this discussion is better continued here. On 15.05.2018 18:20, Mark Roseman wrote: > Mark Roseman added the comment: > > Hi Ivan, thanks for your detailed response. The approach you're suggesting ("Since the sole offender is their threading model, the way is to show them how it's defective and work towards improving it.") is in the end not something I think is workable. > > Some historical context. Ousterhout had some specific ideas about how Tcl/Tk should be used, and that was well-reflected in his early control of the code base. He was certainly outspoken against threads. The main argument is that they're complicated if you don't know what you're doing, which included the "non-professional programmers" he considered the core audience. Enumerating how threads were used at the time, most of the uses could be handled (more simply) in other ways, such as event-driven and non-blocking timers and I/O (so what people today would refer to as the "node.js event model"). Threads (or separate communicating processes) were for long-running computations, things he always envisioned happening in C code (written by more "professional programmers"), not Tcl. His idea of how Tcl and C development would be split didn't match reality given faster machines, more memory, etc. Very enlightening. Many thanks. > The second thing is that Tcl had multiple interpreters baked in pretty much from the beginning at the C level and exposed fairly early on (1996?) at the Tcl level, akin to PEP 554. Code isolation and resource management were the key motivators, but of course others followed. Creating and using Tcl interpreters was quick, lightweight (fast startup, low memory overhead, etc.) and easy. So in other words, the notion of multiple interpreters in Tcl vs. Python is completely different. I had one large application I built around that time that often ended up with hundreds of interpreters running. Not familiar with the concept so can't say atm if tkinter can make any use of this. All tkinter-using code I've seen so far only ever uses a single tkinter.Tk() -- thus a single interpreter. > Which brings me to threads and how they were added to the language. Your guess ("My guess for the decision is it was the easiest way to migrate the code base") is incorrect. The idea of "one thread/one interpreter" was just not seen as a restriction, and was a very natural extension of what had come before. It fit the use cases well (AOLserver was another good example) and was still very understandable from the user level. Contrast with Python's GIL, etc. I'm not actually suggesting any changes to Tcl as a language, only to its C interface (details follow). AFAIK Tcl also advertises itself as an embeddable language as its main selling point, having been developed primarity as an interface to Tk rather than a self-sufficient language (or, equivalently, this being its primary use case now). Having to do an elaborate setup with lots of custom logic to be able to embed it is a major roadblock. 
This can be the leverage. From C interface's standpoint, an interpreter is effectively a bunch of data that can be passed to APIs. Currently, all Tcl_* calls with a specific interpreter instance must be made from the same thread, and this fact enforces sequential access. I'm suggesting to wrap all these public APIs with an interpreter-specific lock -- so calls can be made from any OS thread and the lock enforces sequential access. For Tcl's execution model and existing code, nothing will change. The downside (that will definitely be brought up) is the overhead, of course. The question is thus whether the above-mentioned benefit outweighs it. > With that all said, there would be very little motivation to change the Tcl/Tk side to allow multiple threads to access one interpreter, because in terms of the API and programming model that Tcl/Tk advertises, it's simply not a problem. Keep in mind, the people working on the Tcl/Tk core are very smart programmers, know threads very well, etc., so it's not an issue of "they should know better" or "it's old." In other words, "show them how it's defective" is a non-starter. > > The other, more practical matter in pushing for changes in the Tcl/Tk core, is that there are a fairly small number of people working on it, very part-time. Almost all of them are most interested in the Tcl side, not Tk. Changes made in Tk most often amount to bug fixes because someone's running into something in their own work. Expecting large-scale changes to happen to Tk without some way to get dedicated new resources put into it is not realistic. > > A final matter on the practical side. As you've carefully noted, certain Tcl/Tk calls now happen to work when called from different threads. Consider those a side-effect of present implementation, not a guarantee. Future core changes could change what can be called from different threads, making the situation better or worse. From the Tcl/Tk perspective, this is not a problem, and would not be caught by any testing, etc. Even if it were, it likely wouldn't be fixed. It would be considered an "abuse" of their API (I think correctly). > My suggestion, given the philosophical and practical mismatch, is that Tkinter move towards operating as if the API Tk provides is inviolate. In other words, all calls into a Tcl interpreter happen from the same thread that created the Tcl interpreter. Tkinter acts as a bridge between Python and Tcl/Tk. It should present an API to Python programs compatible with the Python threading model. It's Tkinter's responsibility to map that onto Tcl/Tk's single threaded API through whatever internal mechanism is necessary (i.e. pass everything to main thread, block caller thread until get response, etc.) That's exactly what Tkinter currently does, see the letter attached to the ticket. Writing clean and correct code is Python's principal standpoint, it doesn't use unsupported functions. > I'd go so far as to suggest that all the Tkapp 'call' code (i.e. every place that Tkinter calls Tcl_Eval) check what thread it's in, and issue a warning or error (at least for testing purposes) if it's being called from the "wrong" thread. Having this available in the near future would help people who are debugging what are fairly inexplicable problems now. > > The approach of making Tkinter responsible also has the advantage of dealing with far more Tcl/Tk versions and builds. 
> > Given in practice that few people are really running into things, and that if they are, they know enough to be able to follow the instruction "all Tkinter calls from the same thread" for now, add the warnings/errors in via whatever "turn on debugging" mechanism makes sense. A future version of Python would include a fully thread-safe Tkinter that internally makes all Tcl/Tk calls from a single thread, as per above. > > Sorry this is so incredibly long-winded. I hope the context at least is useful information. > > ---------- > > _______________________________________ > Python tracker > > _______________________________________ -- -- Regards, Ivan From J.Demeyer at UGent.be Tue May 15 17:55:00 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 15 May 2018 23:55:00 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> Message-ID: <5AFB5734.7080908@UGent.be> On 2018-05-15 18:36, Petr Viktorin wrote: > Naturally, large-scale > changes have less of a chance there. Does it really matter that much how large the change is? I think you are focusing too much on the change instead of the end result. As I said in my previous post, I could certainly make less disruptive changes. But would that really be better? (If you think that the answer is "yes" here, I honestly want to know). I could make the code less different than today but at the cost of added complexity. Building on top of the existing code is like building on a bad foundation: the higher you build, the messier it gets. Instead, I propose a solid new foundation. Of course, that requires more work to build but once it is built, the finished building looks a lot better. > With such a "finished product" PEP, it's hard to see if some of the > various problems could be solved in a better way -- faster, more > maintainable, or less disruptive. With "faster", you mean runtime speed? I'm pretty confident that we won't lose anything there. As I argued above, my PEP might very well make things "more maintainable", but this is of course very subjective. And "less disruptive" was never a goal for this PEP. > It's also harder from a psychological point of view: you obviously > already put in a lot of good work, and it's harder to waste that work if > an even better solution is found. I hope that this won't be my psychology. As a developer, I prefer to focus on problems rather than on solutions: I don't want to push a particular solution, I want to fix a particular problem. If an even better solution is accepted, I will be a very happy man. What I would hate is that this PEP gets rejected because some people claim that the problem can be solved in a better way, but without actually suggesting such a better way. > Is a branching class hierarchy, with quite a few new of flags for > feature selection, the kind of simplicity we want? Maybe yes because it *concentrates* all complexity in one small place. Currently, we have several independent classes (builtin_function_or_method, method_descriptor, function, method) which all require various forms of special casing in the interpreter with some code duplication. With my PEP, this all goes away and instead we need to understand just one class, namely base_function. 
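To make the current fragmentation concrete, here is a small illustration (plain present-day CPython, nothing specific to my PEP; the names A, m and f are just examples):

    class A:
        def m(self):
            pass

    def f():
        pass

    print(type(len))          # <class 'builtin_function_or_method'>
    print(type(list.append))  # <class 'method_descriptor'>
    print(type(f))            # <class 'function'>
    print(type(A().m))        # <class 'method'>

Those are the four classes I mean above, each with its own special cases in the interpreter today.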
> Would it be possible to first decouple things, reducing the complexity, > and then tackle the individual problems? What do you mean with "decouple things"? Can you be more concrete? > The class hierarchy still makes it hard to decouple the introspection > side (how functions look on the outside) from the calling mechanism (how > the calling works internally). Any class who wants to profit from fast function calls can inherit from base_function. It can add whatever attributes it wants and it can choose to implement documentation and/or introspection in whatever way it wants. It can choose to not care about that at all. That looks very decoupled to me. > Starting from an idea and ironing out the details it lets you (and, if > since you published results, everyone else) figure out the tricky > details. But ultimately it's exploring one path of doing things ? it > doesn't necessarily lead to the best way of doing something. So far I haven't seen any other proposals... > That's a good question. Maybe inspect.isfunction() serves too many use > cases to be useful. Cython functons should behave like "def" functions > in some cases, and like built-in functions in others. From the outside, i.e. user's point of view, I want them to behave like Python functions. Whether it's implemented in C or Python should just be an implementation detail. Of course there are attributes like __code__ which dive into implementation details, so there you will see the difference. > before we change how inspect.isfunction ultimately behaves, > I'd like to make its purpose clearer (and try to check how that meshes > with the current use cases). The problem is that this is not easy to do. You could search CPython for occurrences of inspect.isfunction() and you could search your favorite Python projects. This will give you some indication, but I'm not sure whether that will be representative. From what I can tell, inspect.isfunction() is mainly used as guard for attribute access: it implies for example that a __globals__ attribute exists. And it's used by documentation tools to decide that it should be documented as Python function whose signature can be extracted using inspect.signature(). Jeroen. From tim.peters at gmail.com Tue May 15 18:30:22 2018 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 15 May 2018 17:30:22 -0500 Subject: [Python-Dev] Looking for examples: proof that a list comp is a function In-Reply-To: References: Message-ID: [ Tim, about the most version of the docs at https://docs.python.org/dev/reference/expressions.html#displays-for-lists-sets-and-dictionaries ] >> I say "pretty much" because, for whatever reason(s), it seems to be >> trying hard _not_ to use the word "function". But I can't guess what >> "then passed as an argument to the implicitly nested scope" could >> possibly mean otherwise (it doesn't make literal sense to "pass an >> argument" to "a scope"). [Nick Coghlan ] > I think my motivation was to avoid promising *exact* equivalence with a > regular nested function, since the define-and-call may allow us > opportunities for optimization that don't exist when those two are separated > (e.g. Guido's point in another thread that we actually avoid calling "iter" > twice even though the nominal expansion implies that we should). However, > you're right that just calling it a function may be clearer than relying on > the ill-defined phrase "implicitly nested scope". Plus that, as noted, what passing an argument "to a scope" means is mysterious. 
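For concreteness, here is a workalike sketch (the helper name is invented, and nothing here is meant to pin down exactly where, or how many times, iter() gets called):

    data = [1, 2, 3]                # stand-in iterable
    gen = (x * x for x in data)     # the genexp being modelled

    def _genexpr(iterable):         # invented name for the implicit function
        for x in iterable:
            yield x * x

    gen2 = _genexpr(iter(data))     # the outermost iterable is evaluated up
                                    # front, in the enclosing scope
    assert list(gen) == list(gen2) == [1, 4, 9]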
Language standard committees struggle for years with how to phrase things so that no more than is intended appears to be promised. It's hard! For example, if you were to show a workalike function and note that the exact placement - and number - of `iter()` calls is not guaranteed, someone else would point out that you need to explicitly say that by "iter" you mean the builtin function of that name, not one user code may have overridden it with in the current scope. Then someone else will note that it's tedious to say things like that whenever they're needed, and more-general text will be added elsewhere in the docs saying that the _rest_ of the docs always mean the language-supplied versions of such-&-such explicitly named functions/classes/modules/... I'd say "nested function" anyway ;-) And for another reason: not just someone from Mars is prone to misreading 'scope", but just about anyone on Earth coming from another language. The idea that the word "scope" all by itself implies "and in general any name bound to within the top-level code spanned by the scope is implicitly local to the scope unless explicitly declared `global` or `nonlocal` in the scope" may be unique to Python. > For Chris's actual question, this is part of why I think adding > "parentlocal" would actually make the scoping proposal easier to explain, as > it means the name binding semantics aren't a uniquely magical property of > binding expressions (however spelled), they're just a new form of target > scope declaration that the compiler understands, and the binding expression > form implies. Note: eas*ier*, not easy ;) Adding an explanation of `parentlocal` to the docs could be a useful pedagogical device, but I don't think I'd support adding that statement to the _language_. It's too weird, and seems to be at a wrong level for plausible future language developments. Let's step waaaaay back for a minute. In many languages with full-blown closures, first-class functions, and nested lexical scopes, it's pretty common to define the meaning of various language constructs in terms of calling derived lexically nested functions. In those languages, any "work variables" needed by the synthetic functions are declared as being local to those functions, and _that's the end of it_. They're done. All other names inside the expansions mean exactly the same as what they mean in whatever chunks of user-supplied code the construct interpolates into the synthesized functions. It doesn't matter one whit in which context(s) they appear. That's the only long-term sane way to go about defining constructs in terms of calling synthesized functions interpolating user-supplied pieces of code. Now _if_ Python had been able to do that, the meaning of genexps and listcomps would have been defined, from the start, in terms of synthesized functions that declared all & only the for-target names "local". And, in fact, the change I'm suggesting wouldn't have required changing the comprehension implementation _at all_ when assignment expressions were added. Instead the implementation would need to change to _add_ assignment expression targets to the things declared local if it was decided that those targets should be _local_ to the derived functions instead. That's why this all seems so bloody obvious to me ;-) It's how virtually every other language in the business of defining constructs in terms of nested synthesized functions works. 
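Purely for illustration (f and data below are stand-ins, and this is a sketch of one possible reading, not a settled rule), the "assignment expression targets are local to the synthesized function" reading of  [y for x in data if (y := f(x))]  would amount to roughly:

    def f(x):                  # stand-in predicate
        return x % 2

    data = range(10)

    def _listcomp(iterable):
        result = []
        for x in iterable:     # x: local to the helper, exactly as today
            y = f(x)           # y: also local under this reading
            if y:
                result.append(y)
        return result

    values = _listcomp(iter(data))
    # The competing reading would instead declare y nonlocal (or global) in
    # the helper, so the binding survives after the comprehension finishes.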
So if that's something we may ever do again - and possibly even if we don't expect to ever do it again - I suggest a more generally useful approach would be to add a new flavor of _function_ to Python. Namely one wherein the only locals are the formal arguments and those explicitly declared local. Whether or not a name is bound in the body would be irrelevant. To avoid a new keyword, `local` could be spelled `not nonlocal` ;-) Note that the only use for `parentlocal` so far is tediously emulating the _effects_ of what that hypothetical `deflocal` flavor of function would do all the time with names not declared local in it. If we ever do something like this again, it would be far easier and clearer to just say the synthetic functions are of the `deflocal` flavor, and here are the names declared local in this case: x, y, z, ... User-defined functions may well find that useful at times too. Although it would be a large conceptual addition to part of Python, adding `parentlocal` would be too, and all by itself the latter looks like an incoherent pile of bizarre tricks. The _meaning_ of `deflocal` would be immediately clear to people coming from any number of other modern-ish languages. > It also occurs to me that we could do something pretty neat for class > scopes: have parent local declarations in methods target the implicit > lexical scope where __class__ lives (to support zero-arg super), *not* the > class body. That would entail adding a "classlocal" declaration to target > that implied scope, though. > > That would give the following definition for "lexical scopes that parent > local scoping can target": > > - module globals (parentlocal -> global) > - function locals, including lambda expression locals (parentlocal -> > nonlocal) Except that for top-level functions, parentlocal -> global; and regardless of nesting level, also implies `global` if the name is declared `global` in the parent block. Unless I've wholly lost track of your intent, which is quite possible. === no new content below === > - implicit class closure, where __class__ lives (parentlocal -> nonlocal in > current scope, classlocal in class scope) > > Most notably, in the synthetic functions created for generator expressions > and comprehensions, a parentlocal declaration in a child scope would imply a > parentlocal declaration in the synthetic function as well, propagating back > up the chain of nested lexical scopes until it terminated in one of the > above three permitted targets. > > Using the explicit forms would then look like: > > from __future import parent_scopes # Enable the explicit declaration > forms > > class C: > classlocal _n # Declares _n as a cell akin to __class__ rather than > a class attribute > _n = [] > @staticmethod > def get_count(): > return len(_n) > > assert not hasattr(C, "_n") > assert C.get_count() == 0 > > def _writes_to_parent_scope(): > parentlocal outer_name > outer_name = 42 > > assert outer_name == 42 > > I'm still doubtful the complexity of actually doing that is warranted, but > I'm now satisfied the semantics can be well specified in a way that allows > us to retain the explanation of generator expressions and comprehensions in > terms of their statement level counterparts (with the added bonus of making > "__class__" a little less of a magically unique snowflake along the way). 
From tritium-list at sdamon.com Wed May 16 00:35:54 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Wed, 16 May 2018 00:35:54 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? Message-ID: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> In the spirit of learning why there is a fence across the road before I tear it down out of ignorance [1], I'd like to know the rationale behind source only releases of cpython. I have an opinion on their utility and perhaps an idea about changing them, but I'd like to know why they are done (as opposed to source+binary releases or no release at all) before I head over to python-ideas. Is this documented somewhere where my google-fu can't find it? [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence From ben+python at benfinney.id.au Wed May 16 01:06:12 2018 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 16 May 2018 15:06:12 +1000 Subject: [Python-Dev] What is the rationale behind source only releases? References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <85tvr8kx4r.fsf@benfinney.id.au> "Alex Walters" writes: > I'd like to know the rationale behind source only releases of cpython. Software freedom entails the freedom to modify and build the software. For that, one needs the source form of the software. Portable software should be feasible to build from source, on a platform where no builds (of that particular release) were done before. For that, one needs the source form of the software. > I have an opinion on their utility and perhaps an idea about changing > them, but I'd like to know why they are done The above rationales seem sufficient to me. Are you looking for additional ones? > (as opposed to source+binary releases or no release at all) I don't see a good justification for adding ?source+binary? releases to the existing ones. We already have a source release (once), anda separate binary (one per platform). Why bother *also* making a source+binary release ? presumably an additional one per platform? As for ?no release at all?, it seems that those who want that can download it very quickly now :-) > before I head over to python-ideas. Is this documented somewhere where > my google-fu can't find it? I am not clear on why this would need specific documentation for Python; these are not issues that are different from any other software where the recipients have software freedom in the work. I hope these answers are useful. -- \ ?My business is to teach my aspirations to conform themselves | `\ to fact, not to try and make facts harmonise with my | _o__) aspirations.? ?Thomas Henry Huxley, 1860-09-23 | Ben Finney From rosuav at gmail.com Wed May 16 01:18:55 2018 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 16 May 2018 15:18:55 +1000 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <85tvr8kx4r.fsf@benfinney.id.au> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <85tvr8kx4r.fsf@benfinney.id.au> Message-ID: On Wed, May 16, 2018 at 3:06 PM, Ben Finney wrote: > "Alex Walters" writes: > >> I'd like to know the rationale behind source only releases of cpython. > > Software freedom entails the freedom to modify and build the software. > For that, one needs the source form of the software. > > Portable software should be feasible to build from source, on a platform > where no builds (of that particular release) were done before. For that, > one needs the source form of the software. 
AIUI Alex is asking about the last release(s) of each branch, eg 3.4.8. There are no official Python.org binaries published for these releases, so anyone who wants to upgrade within the 3.4 branch has to build it themselves. ChrisA From donald at stufft.io Wed May 16 01:23:10 2018 From: donald at stufft.io (Donald Stufft) Date: Wed, 16 May 2018 01:23:10 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <85tvr8kx4r.fsf@benfinney.id.au> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <85tvr8kx4r.fsf@benfinney.id.au> Message-ID: <4E3C4E06-E86F-434B-A010-737278E8F6D5@stufft.io> > On May 16, 2018, at 1:06 AM, Ben Finney wrote: > >> >> I'd like to know the rationale behind source only releases of cpython. > > Software freedom entails the freedom to modify and build the software. > For that, one needs the source form of the software. > > Portable software should be feasible to build from source, on a platform > where no builds (of that particular release) were done before. For that, > one needs the source form of the software. I'm guessing the question isn't why is it useful to have a source release of CPython, but why does CPython transition from having both source releases and binary releases to only source releases. My assumption is the rationale is to reduce the maintenance burden as time goes on for older release channels. -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.jerdonek at gmail.com Wed May 16 01:55:07 2018 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Tue, 15 May 2018 22:55:07 -0700 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: What does "no release at all" mean? If it's not released, how would people use it? --Chris On Tue, May 15, 2018 at 9:36 PM Alex Walters wrote: > In the spirit of learning why there is a fence across the road before I > tear > it down out of ignorance [1], I'd like to know the rationale behind source > only releases of cpython. I have an opinion on their utility and perhaps > an > idea about changing them, but I'd like to know why they are done (as > opposed > to source+binary releases or no release at all) before I head over to > python-ideas. Is this documented somewhere where my google-fu can't find > it? > > > [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/chris.jerdonek%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Wed May 16 02:52:16 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 16 May 2018 16:52:16 +1000 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <20180516065215.GX12683@ando.pearwood.info> On Tue, May 15, 2018 at 10:55:07PM -0700, Chris Jerdonek wrote: > What does "no release at all" mean? If it's not released, how would people > use it? I've been using Python 1.7 for years now. It is the perfect Python, with exactly all the features I want, and none that I don't want, and so much faster than Python 2.7 or 3.7 it is ridiculous.
Unfortunately once I've woken up and tried to port my code to an actual computer, it doesn't work. *wink* In principle, we could continue adding fixes to a version in the source repository, but never cut a release with a new version. But I don't think we do that: once a version hits "no release", we stop adding fixes to the repo for that version:

- full source and binary releases
- source only releases
- accumulate fixes in the VCS but don't cut a new release
- stop making releases at all (the version is now unmaintained)

The third (second from the bottom) doesn't (as far as I am aware) occur. -- Steve From p.f.moore at gmail.com Wed May 16 04:06:43 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 16 May 2018 09:06:43 +0100 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: On 16 May 2018 at 05:35, Alex Walters wrote: > In the spirit of learning why there is a fence across the road before I tear > it down out of ignorance [1], I'd like to know the rationale behind source > only releases of cpython. I have an opinion on their utility and perhaps an > idea about changing them, but I'd like to know why they are done (as opposed > to source+binary releases or no release at all) before I head over to > python-ideas. Is this documented somewhere where my google-fu can't find > it? > > > [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence Assuming you're referring to the practice of no longer distributing binaries for patch releases of older versions of Python, the reason is basically as follows:

1. Producing binaries (to the quality we normally deliver - I'm not talking about auto-built binaries produced from a CI system) is a chunk of extra work for the release managers.
2. The releases in question are essentially end of life, and we're only accepting security fixes.
3. Not even releasing sources means that people still using those releases will no longer have access to security fixes, so we'd be reducing the length of time we offer that level of support.

So extra binaries = more work for the release managers, no source release = less support for our users. There's no reason we couldn't have a discussion on changing the policy, but any such discussion would probably need active support from the release managers if it were to stand any chance of going anywhere (as they are the people directly impacted by any such change). Paul From storchaka at gmail.com Wed May 16 04:18:51 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 16 May 2018 11:18:51 +0300 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: 16.05.18 07:35, Alex Walters wrote: > In the spirit of learning why there is a fence across the road before I tear > it down out of ignorance [1], I'd like to know the rationale behind source > only releases of cpython. I have an opinion on their utility and perhaps an > idea about changing them, but I'd like to know why they are done (as opposed > to source+binary releases or no release at all) before I head over to > python-ideas. Is this documented somewhere where my google-fu can't find > it? Taking a snapshot of sources at a random point in time is dangerous. You can get broken sources. Making a source only release means that sources are in a consistent state, most buildbots are green, and core developers made necessary changes and stopped merging risky changes for some period before the release. The difference with source+binary releases is that the latter adds an additional burden to release managers: building binaries and installers on different platforms and publishing results on the site. From storchaka at gmail.com Wed May 16 04:34:38 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 16 May 2018 11:34:38 +0300 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: 16.05.18 07:35, Alex Walters wrote: > [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence And I wish that every author who suggested the idea for Python was familiar with the Chesterton's fence principle. From p.f.moore at gmail.com Wed May 16 04:40:19 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 16 May 2018 09:40:19 +0100 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: On 16 May 2018 at 09:34, Serhiy Storchaka wrote: > 16.05.18 07:35, Alex Walters wrote: >> >> [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence > > > And I wish that every author who suggested the idea for Python was familiar > with the Chesterton's fence principle. Agreed - thanks Alex for taking the time to research the issue! Paul From eric at trueblade.com Wed May 16 04:48:55 2018 From: eric at trueblade.com (Eric V. Smith) Date: Wed, 16 May 2018 04:48:55 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <3b564ad2-3306-e3a9-99ab-19e873a6f53d@trueblade.com> On 5/16/18 4:34 AM, Serhiy Storchaka wrote: > 16.05.18 07:35, Alex Walters wrote: >> [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence > > And I wish that every author who suggested the idea for Python was > familiar with the Chesterton's fence principle. Indeed! It's refreshing. Thanks, Alex. Eric From nad at python.org Wed May 16 07:06:52 2018 From: nad at python.org (Ned Deily) Date: Wed, 16 May 2018 07:06:52 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: On May 16, 2018, at 00:35, Alex Walters wrote: > In the spirit of learning why there is a fence across the road before I tear > it down out of ignorance [1], I'd like to know the rationale behind source > only releases of cpython. I have an opinion on their utility and perhaps an > idea about changing them, but I'd like to know why they are done (as opposed > to source+binary releases or no release at all) before I head over to > python-ideas. Is this documented somewhere where my google-fu can't find > it?
The Python Developer's Guide has a discussion of the lifecycle of cPython releases here: https://devguide.python.org/#status-of-python-branches The ~short answer is that we produce source+binary (Windows and macOS binary installers) artifacts for release branches in "bugfix" (AKA "maintenance") mode (currently 3.6 and 2.7) as well as during the later stages of the in-development phase for future feature releases ("prerelease" mode) (currently 3.7); we produce only source releases for release branches in "security" mode. After the initial release of a new feature branch (for example, the upcoming 3.7.0 release), we will continue to support the previous release branch in bugfix mode for some overlapping period of time. So, for example, the current plan is to support both 3.7.x and 3.6.x (along with 2.7.x) in bugfix mode, releasing both source and binary artifacts for about six months after the 3.7.0 release. At that point, 3.6.x will transition to security-fix-only mode, where we will only produce releases on an as-needed basis and only in source form. Currently, 3.5 and 3.4 are also in security-fix-only mode. Eventually, usually five years after its initial release, a release branch will reach end-of-life: the branch will be frozen and no further issues for that release branch will be accepted nor will fixes be produced by Python Dev. 2.7 is a special case, with a greatly extended bugfix phase; it will proceed directly to end-of-life status as of 2020-01-01. There is more information later elsewhere in the devguide: https://devguide.python.org/devcycle/ and in the release PEPs linked in the Status of Python Branches section. Hope that helps! -- Ned Deily nad at python.org -- [] From barry at python.org Wed May 16 08:54:10 2018 From: barry at python.org (Barry Warsaw) Date: Wed, 16 May 2018 08:54:10 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <9AEC5E09-A586-46BD-8C62-5DA31AC3BA78@python.org> On May 16, 2018, at 00:35, Alex Walters wrote: > > In the spirit of learning why there is a fence across the road before I tear > it down out of ignorance [1], I'd like to know the rationale behind source > only releases of cpython. Historically, it was a matter of resources. Making binary releases incurs costs and delays on the release process and release managers, including the folks who actually have to produce the binaries. As a version winds down, we wanted to impose less work on those folks and less friction and delay in cutting a release. There is still value in spinning a tarball though, for downstream consumers who need a tagged and blessed release. Cheers, -Barry -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: Message signed with OpenPGP URL: From encukou at gmail.com Wed May 16 11:31:05 2018 From: encukou at gmail.com (Petr Viktorin) Date: Wed, 16 May 2018 11:31:05 -0400 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AFB5734.7080908@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> Message-ID: On 05/15/18 17:55, Jeroen Demeyer wrote: > On 2018-05-15 18:36, Petr Viktorin wrote: >> Naturally, large-scale >> changes have less of a chance there. > > Does it really matter that much how large the change is? I think you are > focusing too much on the change instead of the end result. > > As I said in my previous post, I could certainly make less disruptive > changes. But would that really be better? (If you think that the answer > is "yes" here, I honestly want to know). Yes, I believe it is better. The larger a change is, the harder it is to understand, meaning that less people can meaningfully join the conversation, think about how it interacts with their own use cases, and notice (and think through) any unpleasant details. Less disruptive changes tend to have a better backwards compatibility story. A less intertwined change makes it easier to revert just a single part, in case that becomes necessary. > I could make the code less different than today but at the cost of added > complexity. Building on top of the existing code is like building on a > bad foundation: the higher you build, the messier it gets. Instead, I > propose a solid new foundation. Of course, that requires more work to > build but once it is built, the finished building looks a lot better. To continue the analogy: the tenants have been customizing their apartments inside that building, possibly depending on structural details that we might think should be hidden from them. And they expect to continue living there while the foundation is being swapped under them :) >> With such a "finished product" PEP, it's hard to see if some of the >> various problems could be solved in a better way -- faster, more >> maintainable, or less disruptive. > > With "faster", you mean runtime speed? I'm pretty confident that we > won't lose anything there. > > As I argued above, my PEP might very well make things "more > maintainable", but this is of course very subjective. And "less > disruptive" was never a goal for this PEP. > >> It's also harder from a psychological point of view: you obviously >> already put in a lot of good work, and it's harder to waste that work if >> an even better solution is found. > > I hope that this won't be my psychology. As a developer, I prefer to > focus on problems rather than on solutions: I don't want to push a > particular solution, I want to fix a particular problem. If an even > better solution is accepted, I will be a very happy man. > > What I would hate is that this PEP gets rejected because some people > claim that the problem can be solved in a better way, but without > actually suggesting such a better way. Mark Shannon has an upcoming PEP with an alternative to some of the issues. (Not all of them ? but less intertwined is better, all else being equal.) >> Is a branching class hierarchy, with quite a few new of flags for >> feature selection, the kind of simplicity we want? 
> > Maybe yes because it *concentrates* all complexity in one small place. > Currently, we have several independent classes > (builtin_function_or_method, method_descriptor, function, method) which > all require various forms of special casing in the interpreter with some > code duplication. With my PEP, this all goes away and instead we need to > understand just one class, namely base_function. > >> Would it be possible to first decouple things, reducing the complexity, >> and then tackle the individual problems? > > What do you mean with "decouple things"? Can you be more concrete? Currently, the "outside" of a function (how it looks when introspected) is tied to the "inside" (what happens internally when it's called). That's what I'd like to see decoupled. Can we better enable pydoc/IPython developers to tackle introspection problems without wading deep in the internals and call optimizations? >> The class hierarchy still makes it hard to decouple the introspection >> side (how functions look on the outside) from the calling mechanism (how >> the calling works internally). > > Any class who wants to profit from fast function calls can inherit from > base_function. It can add whatever attributes it wants and it can choose > to implement documentation and/or introspection in whatever way it > wants. It can choose to not care about that at all. That looks very > decoupled to me. But, it still has to inherit from base_function to "look like a function". Can we remove that limitation in favor of duck typing? >> Starting from an idea and ironing out the details it lets you (and, if >> since you published results, everyone else) figure out the tricky >> details. But ultimately it's exploring one path of doing things ? it >> doesn't necessarily lead to the best way of doing something. > > So far I haven't seen any other proposals... > >> That's a good question. Maybe inspect.isfunction() serves too many use >> cases to be useful. Cython functons should behave like "def" functions >> in some cases, and like built-in functions in others. > > From the outside, i.e. user's point of view, I want them to behave like > Python functions. Whether it's implemented in C or Python should just be > an implementation detail. Of course there are attributes like __code__ > which dive into implementation details, so there you will see the > difference. > >> before we change how inspect.isfunction ultimately behaves, >> I'd like to make its purpose clearer (and try to check how that meshes >> with the current use cases). > > The problem is that this is not easy to do. You could search CPython for > occurrences of inspect.isfunction() and you could search your favorite > Python projects. This will give you some indication, but I'm not sure > whether that will be representative. > > From what I can tell, inspect.isfunction() is mainly used as guard for > attribute access: it implies for example that a __globals__ attribute > exists. That's unfortunate -- I don't see "is a function" and "has __globals__" as related. Can we provide better tools for people that currently need to rely on this? > And it's used by documentation tools to decide that it should be > documented as Python function whose signature can be extracted using > inspect.signature(). I think inspect.signature should never call "inspect.isfunction" for objects with a (lazily computed) __signature__ attribute. If that's not the case, let's fix that. 
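(For what it's worth, that protocol already works for arbitrary callables without any isfunction() check - the class below is made up purely for illustration:)

    import inspect

    class FastCallable:                       # hypothetical C-accelerated callable
        def __call__(self, x, y=1):
            return x + y

        @property
        def __signature__(self):              # computed lazily, on first access
            params = [
                inspect.Parameter("x", inspect.Parameter.POSITIONAL_OR_KEYWORD),
                inspect.Parameter("y", inspect.Parameter.POSITIONAL_OR_KEYWORD,
                                  default=1),
            ]
            return inspect.Signature(params)

    print(inspect.signature(FastCallable()))  # prints: (x, y=1)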
Special handling for functions is reasonable because the signature object and lazy attributes are much higher-level than function. But, IMO, anything that's not low-level (e.g. a real "def" function, methods) can and should use __signature__. From stefan_ml at behnel.de Wed May 16 15:31:13 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Wed, 16 May 2018 21:31:13 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <74624524-2ef9-7ef7-677a-72a65a81f911@gmail.com> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <74624524-2ef9-7ef7-677a-72a65a81f911@gmail.com> Message-ID: Petr Viktorin schrieb am 15.05.2018 um 18:36: > On 05/15/18 05:15, Jeroen Demeyer wrote: >> An important note is that it was never my goal to create a minimal PEP. I >> did not aim for changing as little as possible. I was thinking: we are >> changing functions, what would be the best way to implement them? > > That might be a problem. For the change to be accepted, a core developer > will need to commit to maintaining the code, understand it, and accept > responsibility for anything that's broken. Naturally, large-scale changes > have less of a chance there. Honestly, the current implementation involves such a clutter of special cases that the internal code simplification that this PEP allows should make every core developer who needs to make their hands dirty with the current code bow to Jeroen for coming up with this PEP and even implementing it. Just my personal opinion. Stefan From J.Demeyer at UGent.be Wed May 16 16:34:26 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 16 May 2018 22:34:26 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> Message-ID: <5AFC95D2.5070102@UGent.be> On 2018-05-16 17:31, Petr Viktorin wrote: > The larger a change is, the harder it is to understand I already disagree here... I'm afraid that you are still confusing the largeness of the *change* with the complexity of the *result* after the change was implemented. A change that *removes* complexity should be considered a good thing, even if it's a large change. That being said, if you want me to make smaller changes, I could do it. But I would do it for *you* personally because I'm afraid that other people might rightly complain that I'm making things too complicated. So I would certainly like some feedback from others on this point. > Less disruptive changes tend to have a better backwards compatibility story. Maybe in very general terms, yes. But I believe that the "disruptive" changes that I'm making will not contribute to backwards incompatibility. Adding new ml_flags flags shouldn't break anything and adding a base class shouldn't either (I doubt that there is code relying on the fact that type(len).__base__ is object). In my opinion, the one change that is most likely to cause backwards compatibility problems is changing the type of bound methods of extension types. And that change is even in the less disruptive PEP 576. > Mark Shannon has an upcoming PEP with an alternative to some of the > issues. I'm looking forward to a serious discussion about that. 
However, from a first reading, I'm not very optimistic about its
performance implications.

> Currently, the "outside" of a function (how it looks when introspected)
> is tied to the "inside" (what happens internally when it's called).
> Can we better enable pydoc/IPython developers to tackle introspection
> problems without wading deep in the internals and call optimizations?

I proposed complete decoupling in https://bugs.python.org/issue30071 and
that was rejected.

Anyway, decoupling of introspection is not the essence of this PEP. This
PEP is really about allowing custom built-in function subclasses. That's
the hard part where CPython internals come in. So I suggest that we leave
the discussion about introspection and focus on the function classes.

> But, it still has to inherit from base_function to "look like a
> function". Can we remove that limitation in favor of duck typing?

Duck typing is a Python thing, I don't know what "duck typing" would mean
on the C level. We could replace the existing isinstance(..., base_function)
check with a different fast check. For example, we (together with the
Cython devs) have been pondering a new "type" field, say tp_cfunctionoffset,
pointing to a certain C field in the object structure. That would work but
it would not be so fundamentally different from the current PEP.

*PS*: On Friday, I'm leaving for 2 weeks on holidays. So if I don't reply
to comments on PEP 575 or alternative proposals, don't take it as a lack of
interest.

Jeroen.

From anthony.flury at btinternet.com  Wed May 16 17:48:27 2018
From: anthony.flury at btinternet.com (Anthony Flury)
Date: Wed, 16 May 2018 22:48:27 +0100
Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets
Message-ID: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com>

This may be known but I wanted to ask this esteemed body first.

I understand that from Python 3.3 there was a security fix to ensure that
different python processes would generate different hash values for the
same input - to prevent denial of service based on crafted hash conflicts.

I opened two python REPLs on my Linux 64bit PC and did the following

Terminal 1:

>>> hash('Hello World')
-1010252950208276719

>>> hash( frozenset({1,9}) )
-7625378979602737914
>>> hash(frozenset({300,301}))
-8571255922896611313

>>> hash((1,9))
3713081631926832981
>>> hash((875,932))
3712694086932196356

Terminal 2:

>>> hash('Hello World')
-8267767374510285039

>>> hash( frozenset({1,9}) )
-7625378979602737914
>>> hash(frozenset({300,301}))
-8571255922896611313

>>> hash((1,9))
3713081631926832981
>>> hash((875,932))
3712694086932196356

As can be seen - taking a hash of a string does indeed create a different
value between the two processes (as expected).

However the frozen set hash is the same in both cases, as is the hash of
the tuples - suggesting that the vulnerability resolved in Python 3.3
wasn't resolved across all potentially hashable values. I even used
different large numbers to ensure that the integers weren't being interned.

I can imagine that frozensets aren't used frequently as hash keys - but I
would think that tuples are regularly used. Since their hashes are not
salted, does the vulnerability still exist in some form?
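For anyone who wants to reproduce this without juggling two REPLs, the same
comparison can be scripted - the snippet below is only an illustration of
the test above, not part of my original session:

    # hash_check.py - run it twice, in two separate processes, and compare.
    # The string hash should differ between the runs; the int-only tuple
    # and frozenset hashes should come out identical each time.
    print("str      :", hash("Hello World"))
    print("tuple    :", hash((1, 9)))
    print("frozenset:", hash(frozenset({300, 301})))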
--
--
Anthony Flury
email : *Anthony.flury at btinternet.com*
Twitter : *@TonyFlury *

From raymond.hettinger at gmail.com  Wed May 16 18:10:07 2018
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Wed, 16 May 2018 18:10:07 -0400
Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets
In-Reply-To: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com>
References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com>
Message-ID: <28183E82-3E9F-45FA-A19D-341AB772AC0F@gmail.com>

> On May 16, 2018, at 5:48 PM, Anthony Flury via Python-Dev wrote:
>
> However the frozen set hash, the same in both cases, as is the hash of the tuples - suggesting that the vulnerability resolved in Python 3.3 wasn't resolved across all potentially hashable values.

You are correct. The hash randomization only applies to strings. None of the other object hashes were altered. Whether this is a vulnerability or not depends greatly on what is exposed to users (generally strings) and how it is used.

For the most part, it is considered a feature that integers hash to themselves. That is very fast to compute :-) Also, it tends to prevent hash collisions for consecutive integers.

Raymond

From christian at python.org  Wed May 16 21:41:49 2018
From: christian at python.org (Christian Heimes)
Date: Wed, 16 May 2018 21:41:49 -0400
Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets
In-Reply-To: <28183E82-3E9F-45FA-A19D-341AB772AC0F@gmail.com>
References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com>
 <28183E82-3E9F-45FA-A19D-341AB772AC0F@gmail.com>
Message-ID: <5d3f7c9d-7108-3c97-6e9e-6c07be130ce0@python.org>

On 2018-05-16 18:10, Raymond Hettinger wrote:
>
>
>> On May 16, 2018, at 5:48 PM, Anthony Flury via Python-Dev wrote:
>>
>> However the frozen set hash, the same in both cases, as is the hash of the tuples - suggesting that the vulnerability resolved in Python 3.3 wasn't resolved across all potentially hashable values.
>
> You are correct. The hash randomization only applies to strings. None of the other object hashes were altered. Whether this is a vulnerability or not depends greatly on what is exposed to users (generally strings) and how it is used.
> > For the most part, it is considered a feature that integers hash to themselves. That is very fast to compute :-) Also, it tends to prevent hash collisions for consecutive integers. Raymond is 100% correct. Just one small nit pick: randomization applies to both string and bytes. Christian From tritium-list at sdamon.com Wed May 16 23:37:42 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Wed, 16 May 2018 23:37:42 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <014501d3ed90$6a640650$3f2c12f0$@sdamon.com> Thank you, that's exactly what I needed to read. > -----Original Message----- > From: Ned Deily > Sent: Wednesday, May 16, 2018 7:07 AM > To: Alex Walters > Cc: Python-Dev > Subject: Re: [Python-Dev] What is the rationale behind source only releases? > > On May 16, 2018, at 00:35, Alex Walters wrote: > > In the spirit of learning why there is a fence across the road before I tear > > it down out of ignorance [1], I'd like to know the rationale behind source > > only releases of cpython. I have an opinion on their utility and perhaps an > > idea about changing them, but I'd like to know why they are done (as > opposed > > to source+binary releases or no release at all) before I head over to > > python-ideas. Is this documented somewhere where my google-fu can't > find > > it? > > The Python Developer's Guide has a discussion of the lifecycle of cPython > releases here: > > https://devguide.python.org/#status-of-python-branches > > The ~short answer is that we produce source+binary (Windows and macOS > binary installers) artifacts for release branches in "bugfix" (AKA > "maintenance") mode (currently 3.6 and 2.7) as well as during the later > stages of the in-development phase for future feature releases > ("prerelease" mode) (currently 3.7); we produce only source releases for > release branches in "security" mode. > > After the initial release of a new feature branch (for example, the upcoming > 3.7.0 release), we will continue to support the previous release branch in > bugfix mode for some overlapping period of time. So, for example, the > current plan is to support both 3.7.x and 3.6.x (along with 2.7.x) in bugfix > mode, releasing both source and binary artifacts for about six months after > the 3.7.0 release. At that point, 3.6.x will transition to security-fix-only mode, > where we will only produce releases on an as-needed basis and only in > source form. Currently, 3.5 and 3.4 are also in security-fix-only mode. > Eventually, usually five years after its initial release, a release branch will > reach end-of-life: the branch will be frozen and no further issues for that > release branch will be accepted nor will fixes be produced by Python Dev. > 2.7 is a special case, with a greatly extended bugfix phase; it will proceed > directly to end-of-life status as of 2020-01-01. > > There is more information later elsewhere in the devguide: > > https://devguide.python.org/devcycle/ > > and in the release PEPs linked in the Status of Python Branches section. > > Hope that helps! > > -- > Ned Deily > nad at python.org -- [] From tritium-list at sdamon.com Wed May 16 23:39:39 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Wed, 16 May 2018 23:39:39 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? 
In-Reply-To: <4E3C4E06-E86F-434B-A010-737278E8F6D5@stufft.io> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <85tvr8kx4r.fsf@benfinney.id.au> <4E3C4E06-E86F-434B-A010-737278E8F6D5@stufft.io> Message-ID: <014601d3ed90$af7d3bd0$0e77b370$@sdamon.com> This is precisely what I meant. Before asking this question, I didn?t fully understand why, for example, 3.5.4 got a binary installer for windows and mac, but 3.5.5 did not. This thread has cleared that up for me. From: Python-Dev On Behalf Of Donald Stufft Sent: Wednesday, May 16, 2018 1:23 AM To: Ben Finney Cc: python-dev at python.org Subject: Re: [Python-Dev] What is the rationale behind source only releases? On May 16, 2018, at 1:06 AM, Ben Finney > wrote: I'd like to know the rationale behind source only releases of cpython. Software freedom entails the freedom to modify and build the software. For that, one needs the source form of the software. Portable software should be feasible to build from source, on a platform where no builds (of that particular release) were done before. For that, one needs the source form of the software. I?m guessing the question isn?t why is it useful to have a source release of CPython, but why does CPython transition from having both source releases and binary releases to only source releases. My assumption is the rationale is to reduce the maintenance burden as time goes on for older release channels. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tritium-list at sdamon.com Wed May 16 23:46:27 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Wed, 16 May 2018 23:46:27 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> Message-ID: <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> > -----Original Message----- > From: Paul Moore > Sent: Wednesday, May 16, 2018 4:07 AM > To: Alex Walters > Cc: Python Dev > Subject: Re: [Python-Dev] What is the rationale behind source only releases? > > On 16 May 2018 at 05:35, Alex Walters wrote: > > In the spirit of learning why there is a fence across the road before I tear > > it down out of ignorance [1], I'd like to know the rationale behind source > > only releases of cpython. I have an opinion on their utility and perhaps an > > idea about changing them, but I'd like to know why they are done (as > opposed > > to source+binary releases or no release at all) before I head over to > > python-ideas. Is this documented somewhere where my google-fu can't > find > > it? > > > > > > [1]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence > > Assuming you're referring to the practice of no longer distributing > binaries for patch releases of older versions of Python, the reason is > basically as follows: > > 1. Producing binaries (to the quality we normally deliver - I'm not > talking about auto-built binaries produced from a CI system) is a > chunk of extra work for the release managers. This is actually the heart of the reason I asked the question. CI tools are fairly good now. If the CI tools could be used in such a way to make the building of binary artifacts less of a burden on the release managers, would there be interest in doing that, and in the process, releasing binary artifact installers for all security update releases. My rationale for asking if its possible is... well.. 
security releases are important, and it's hard to ask Windows users to install Visual Studio and build python to use the most secure version of python that will run your python program. Yes there are better ideal solutions (porting your code to the latest and greatest feature release version), but that?s not a zero burden option either. If CI tools just aren't up to the task, then so be it, and this isn't something I would darken -ideas' door with. > 2. The releases in question are essentially end of life, and we're > only accepting security fixes. > 3. Not even releasing sources means that people still using those > releases will no longer have access to security fixes, so we'd be > reducing the length of time we offer that level of support. > > So extra binaries = more work for the release managers, no source > release = less support for our users. > > There's no reason we couldn't have a discussion on changing the > policy, but any such discussion would probably need active support > from the release managers if it were to stand any chance of going > anywhere (as they are the people directly impacted by any such > change). > > Paul From tjreedy at udel.edu Thu May 17 01:00:05 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Thu, 17 May 2018 01:00:05 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On 5/16/2018 11:46 PM, Alex Walters wrote: > This is actually the heart of the reason I asked the question. CI tools are fairly good now. If the CI tools could be used in such a way to make the building of binary artifacts less of a burden on the release managers, would there be interest in doing that, and in the process, releasing binary artifact installers for all security update releases. The CI tools are used to test whether the repository is ready for a release. The release manager and the two binary builders manually follow written scripts that include running various programs and scripts. I don't know whether they master scripts are stable enough to automate yet. The Windows binary production process was redone for 3.5. The MacOS process was redone for 3.7 (.0b1). > My rationale for asking if its possible is... well.. security releases are important, and it's hard to ask Windows users to install Visual Studio and build python to use the most secure version of python that will run your python program. I believe one rationale for not offering binaries is the the security patches are mostly of interest to server people, who *do* build Python themselves. If you think otherwise, you could offer to build an installer and see if a release manager would include it on python.org as an experiment. -- Terry Jan Reedy From vstinner at redhat.com Thu May 17 02:38:27 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 17 May 2018 02:38:27 -0400 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> Message-ID: Hi, String hash is randomized, but not the integer hash: $ python3.5 -c 'print(hash("abc"))' -8844814677999896014 $ python3.5 -c 'print(hash("abc"))' -7757160699952389646 $ python3.5 -c 'print(hash(1))' 1 $ python3.5 -c 'print(hash(1))' 1 frozenset hash is combined from values of the set. 
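For example (an illustrative check of my own - concrete values omitted
because the second one changes from run to run):

    $ python3.5 -c 'print(hash(frozenset({1, 9})))'      # same value on every run
    $ python3.5 -c 'print(hash(frozenset({"a", "b"})))'  # different value on each run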
So it's only randomized if values hashes are randomized. The denial of service is more likely to occur with strings as keys, than with integers. See the following link for more information: http://python-security.readthedocs.io/vuln/cve-2012-1150_hash_dos.html Victor 2018-05-16 17:48 GMT-04:00 Anthony Flury via Python-Dev : > This may be known but I wanted to ask this esteemed body first. > > I understand that from Python3.3 there was a security fix to ensure that > different python processes would generate different hash value for the same > input - to prevent denial of service based on crafted hash conflicts. > > I opened two python REPLs on my Linux 64bit PC and did the following > > Terminal 1: > > >>> hash('Hello World') > -1010252950208276719 > > >>> hash( frozenset({1,9}) ) > -7625378979602737914 > >>> hash(frozenset({300,301})) > -8571255922896611313 > > >>> hash((1,9)) > 3713081631926832981 > >>> hash((875,932)) > 3712694086932196356 > > > > Terminal 2: > > >>> hash('Hello World') > -8267767374510285039 > > >>> hash( frozenset({1,9}) ) > -7625378979602737914 > >>> hash(frozenset({300,301})) > -8571255922896611313 > > >>> hash((1,9)) > 3713081631926832981 > >>> hash((875,932)) > 3712694086932196356 > > As can be seen - taking a hash of a string does indeed create a different > value between the two processes (as expected). > > However the frozen set hash, the same in both cases, as is the hash of the > tuples - suggesting that the vulnerability resolved in Python 3.3 wasn't > resolved across all potentially hashable values. lI even used different > large numbers to ensure that the integers weren't being interned. > > I can imagine that frozensets aren't used frequently as hash keys - but I > would think that tuples are regularly used. Since that their hashes are not > salted does the vulnerability still exist in some form ?. > > -- > -- > Anthony Flury > email : *Anthony.flury at btinternet.com* > Twitter : *@TonyFlury * > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From anthony.flury at btinternet.com Thu May 17 03:21:03 2018 From: anthony.flury at btinternet.com (Anthony Flury) Date: Thu, 17 May 2018 08:21:03 +0100 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> Message-ID: Victor, Thanks for the link, but to be honest it will just confuse people - neither the link or the related bpo entries state that the fix is only limited to strings. They simply talk about hash randomization - which in my opinion implies ALL hash algorithms; which is why I asked the question. I am not sure how much should be exposed about the scope of security fixes but you can understand my (and other's) confusion. I am aware that applications shouldn't make assumptions about the value of any given hash value - apart from some simple assumptions based hash value equality (i.e. if two objects have different hash values they can't be the same value). /B//TW : // // //This question was prompted by a question on a social media platform about the whether hash values are transferable between across platforms. 
Everything I could find stated that after Python 3.3 ALL hash values were randomized - but that clearly isn't the case; and the original questioner identified that some hash values are randomized and other aren't.// // //I did suggest strongly to the original questioner that relying on the same hash value across different platforms wasn't a clever solution - their original plan was to store hash values in a cross system database to enable quick retrieval of data (!!!). I did remind the OP that a hash value wasn't guaranteed to be unique anyway - and they might come across two different values with the same hash - and no way to distinguish between them if all they have is the hash. Hopefully their revised design will store the key, not the hash./ On 17/05/18 07:38, Victor Stinner wrote: > Hi, > > String hash is randomized, but not the integer hash: > > $ python3.5 -c 'print(hash("abc"))' > -8844814677999896014 > $ python3.5 -c 'print(hash("abc"))' > -7757160699952389646 > > $ python3.5 -c 'print(hash(1))' > 1 > $ python3.5 -c 'print(hash(1))' > 1 > > frozenset hash is combined from values of the set. So it's only > randomized if values hashes are randomized. > > The denial of service is more likely to occur with strings as keys, > than with integers. > > See the following link for more information: > http://python-security.readthedocs.io/vuln/cve-2012-1150_hash_dos.html > > Victor > > 2018-05-16 17:48 GMT-04:00 Anthony Flury via Python-Dev : >> This may be known but I wanted to ask this esteemed body first. >> >> I understand that from Python3.3 there was a security fix to ensure that >> different python processes would generate different hash value for the same >> input - to prevent denial of service based on crafted hash conflicts. >> >> I opened two python REPLs on my Linux 64bit PC and did the following >> >> Terminal 1: >> >> >>> hash('Hello World') >> -1010252950208276719 >> >> >>> hash( frozenset({1,9}) ) >> -7625378979602737914 >> >>> hash(frozenset({300,301})) >> -8571255922896611313 >> >> >>> hash((1,9)) >> 3713081631926832981 >> >>> hash((875,932)) >> 3712694086932196356 >> >> >> >> Terminal 2: >> >> >>> hash('Hello World') >> -8267767374510285039 >> >> >>> hash( frozenset({1,9}) ) >> -7625378979602737914 >> >>> hash(frozenset({300,301})) >> -8571255922896611313 >> >> >>> hash((1,9)) >> 3713081631926832981 >> >>> hash((875,932)) >> 3712694086932196356 >> >> As can be seen - taking a hash of a string does indeed create a different >> value between the two processes (as expected). >> >> However the frozen set hash, the same in both cases, as is the hash of the >> tuples - suggesting that the vulnerability resolved in Python 3.3 wasn't >> resolved across all potentially hashable values. lI even used different >> large numbers to ensure that the integers weren't being interned. >> >> I can imagine that frozensets aren't used frequently as hash keys - but I >> would think that tuples are regularly used. Since that their hashes are not >> salted does the vulnerability still exist in some form ?. 
>> >> -- >> -- >> Anthony Flury >> email : *Anthony.flury at btinternet.com* >> Twitter : *@TonyFlury * >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com -- -- Anthony Flury email : *Anthony.flury at btinternet.com* Twitter : *@TonyFlury * From greg.ewing at canterbury.ac.nz Thu May 17 04:04:24 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 17 May 2018 20:04:24 +1200 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> Message-ID: <5AFD3788.3050707@canterbury.ac.nz> Anthony Flury via Python-Dev wrote: > //I did suggest strongly to the original questioner that relying on the > same hash value across different platforms wasn't a clever solution Even without randomisation, I wouldn't rely on hash values staying the same between different Python versions. Storing them in a database sounds like a really bad idea. -- Greg From rosuav at gmail.com Thu May 17 04:16:16 2018 From: rosuav at gmail.com (Chris Angelico) Date: Thu, 17 May 2018 18:16:16 +1000 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> Message-ID: On Thu, May 17, 2018 at 5:21 PM, Anthony Flury via Python-Dev wrote: > Victor, > Thanks for the link, but to be honest it will just confuse people - neither > the link or the related bpo entries state that the fix is only limited to > strings. They simply talk about hash randomization - which in my opinion > implies ALL hash algorithms; which is why I asked the question. > > I am not sure how much should be exposed about the scope of security fixes > but you can understand my (and other's) confusion. > > I am aware that applications shouldn't make assumptions about the value of > any given hash value - apart from some simple assumptions based hash value > equality (i.e. if two objects have different hash values they can't be the > same value). The hash values of Python objects are calculated by the __hash__ method, so arbitrary objects can do what they like, including degenerate algorithms such as: class X: def __hash__(self): return 7 So it's impossible to randomize ALL hashes at the language level. Only str and bytes hashes are randomized, because they're the ones most likely to be exploitable - for instance, a web server will receive a query like "http://spam.example/target?a=1&b=2&c=3" and provide a dictionary {"a":1, "b":2, "c":3}. Similarly, a JSON decoder is always going to create string keys in its dictionaries (JSON objects). Do you know of any situation in which an attacker can provide the keys for a dict/set as integers? > /B//TW : // > // > //This question was prompted by a question on a social media platform about > the whether hash values are transferable between across platforms. > Everything I could find stated that after Python 3.3 ALL hash values were > randomized - but that clearly isn't the case; and the original questioner > identified that some hash values are randomized and other aren't.// > / That's actually immaterial. 
Even if the hashes weren't actually randomized, you shouldn't be making assumptions about anything specific in the hash, save that *within one Python process*, two equal values will have equal hashes (and therefore two objects with unequal hashes will not be equal). > //I did suggest strongly to the original questioner that relying on the same > hash value across different platforms wasn't a clever solution - their > original plan was to store hash values in a cross system database to enable > quick retrieval of data (!!!). I did remind the OP that a hash value wasn't > guaranteed to be unique anyway - and they might come across two different > values with the same hash - and no way to distinguish between them if all > they have is the hash. Hopefully their revised design will store the key, > not the hash./ Uhh.... if you're using a database, let the database do the work of being a database. I don't know what this "cross system database" would be implemented in, but if it's a proper multi-user relational database engine like PostgreSQL, it's already going to have way better indexing than anything you'd do manually. I think there are WAY better solutions than worrying about Python's inbuilt hashing. If you MUST hash your data for sharing and storage, the easiest solution is to just use a cryptographic hash straight out of hashlib.py. ChrisA From p.f.moore at gmail.com Thu May 17 04:24:10 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 17 May 2018 09:24:10 +0100 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On 17 May 2018 at 04:46, Alex Walters wrote: >> 1. Producing binaries (to the quality we normally deliver - I'm not >> talking about auto-built binaries produced from a CI system) is a >> chunk of extra work for the release managers. > > This is actually the heart of the reason I asked the question. CI tools are fairly good now. If the CI tools could be used in such a way to make the building of binary artifacts less of a burden on the release managers, would there be interest in doing that, and in the process, releasing binary artifact installers for all security update releases. > > My rationale for asking if its possible is... well.. security releases are important, and it's hard to ask Windows users to install Visual Studio and build python to use the most secure version of python that will run your python program. Yes there are better ideal solutions (porting your code to the latest and greatest feature release version), but that?s not a zero burden option either. > > If CI tools just aren't up to the task, then so be it, and this isn't something I would darken -ideas' door with. I honestly don't know if we're at a point where an auto-built security release would be sufficient and/or useful. That's mostly a question for the release manager(s). One sticking point might be that I believe the Windows installers (at least) are signed, and only the release managers have the signing key. It's probably *not* OK to leave the security releases unsigned ;-) So there would be a key management issue to address there. Paul. From nad at python.org Thu May 17 05:08:10 2018 From: nad at python.org (Ned Deily) Date: Thu, 17 May 2018 05:08:10 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? 
In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On May 17, 2018, at 04:24, Paul Moore wrote: > On 17 May 2018 at 04:46, Alex Walters wrote: >>> 1. Producing binaries (to the quality we normally deliver - I'm not >>> talking about auto-built binaries produced from a CI system) is a >>> chunk of extra work for the release managers. >> >> This is actually the heart of the reason I asked the question. CI tools are fairly good now. If the CI tools could be used in such a way to make the building of binary artifacts less of a burden on the release managers, would there be interest in doing that, and in the process, releasing binary artifact installers for all security update releases. >> >> My rationale for asking if its possible is... well.. security releases are important, and it's hard to ask Windows users to install Visual Studio and build python to use the most secure version of python that will run your python program. Yes there are better ideal solutions (porting your code to the latest and greatest feature release version), but that?s not a zero burden option either. >> >> If CI tools just aren't up to the task, then so be it, and this isn't something I would darken -ideas' door with. > > I honestly don't know if we're at a point where an auto-built security > release would be sufficient and/or useful. That's mostly a question > for the release manager(s). One sticking point might be that I believe > the Windows installers (at least) are signed, and only the release > managers have the signing key. It's probably *not* OK to leave the > security releases unsigned ;-) So there would be a key management > issue to address there. IMO, the idea of having either the current CI system or a third party produce binary artifacts for Python releases to be downloadable from python.org is a non-starter for lots of reasons, primarily because of the security risks. The release team *could* produce those artifacts for releases in security mode and, while it would be some extra work, there are so few of them. The question is should we. Once a release moves from bugfix/maintenance mode to security mode, in some ways we are doing a disservice to our users to encourage them to not upgrade to a more recent maintained release. Release branches in security mode do not get any fixes other than, based on past experience, at most a small number of security issues that might arise. In particular, security mode release branches receive no platform-support fixes to support newer OS releases and/or newer hardware support and receive no buildbot testing. Security mode releases today are really for downstream distributors and DIYers who are comfortable building and maintaining their own versions of software. -- Ned Deily nad at python.org -- [] From J.Demeyer at UGent.be Thu May 17 07:27:43 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 17 May 2018 13:27:43 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> Message-ID: <5AFD672F.1000901@UGent.be> On 2018-05-16 17:31, Petr Viktorin wrote: > Less disruptive changes tend to have a better backwards compatibility story. 
> A less intertwined change makes it easier to revert just a single part, > in case that becomes necessary. I'll just repeat what I said in a different post on this thread: we can still *implement* the PEP in a less intertwined and more gradual way. The PEP deals with several classes and each class can be changed separately. However, there is not much point in starting this process if you don't intend to go all the way. The power of PEP 575 is really using this base_function class in many places. A PEP just adding the class base_function as base class of buitin_function_or_method without using it anywhere else would make no sense by itself. Still, that could be a first isolated step in the implementation. If PEP 575 is accepted, I would like to follow it up with PEPs to add more classes to the base_function hierarchy (candidates: staticmethod, classmethod, classmethod_descriptor, method-wrapper, slot wrapper, functools.lru_cache). Jeroen. From storchaka at gmail.com Thu May 17 08:18:50 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Thu, 17 May 2018 15:18:50 +0300 Subject: [Python-Dev] The history of PyXML Message-ID: Does anyone has the full copy of the PyXML repository, with the complete history? This library was included in Python 2.1 as the xml package and is not maintained as a separate project since 2004. It's home on SourceForge was removed. I have found sources of the last PyXML version (0.8.4), but without history. I'm trying to figure out some intentions and fix possible bugs in the xml package. The history of all commits could help. From brett at python.org Thu May 17 09:42:38 2018 From: brett at python.org (Brett Cannon) Date: Thu, 17 May 2018 09:42:38 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On Thu, 17 May 2018 at 04:25 Paul Moore wrote: > On 17 May 2018 at 04:46, Alex Walters wrote: > >> 1. Producing binaries (to the quality we normally deliver - I'm not > >> talking about auto-built binaries produced from a CI system) is a > >> chunk of extra work for the release managers. > > > > This is actually the heart of the reason I asked the question. CI tools > are fairly good now. If the CI tools could be used in such a way to make > the building of binary artifacts less of a burden on the release managers, > would there be interest in doing that, and in the process, releasing binary > artifact installers for all security update releases. > > > > My rationale for asking if its possible is... well.. security releases > are important, and it's hard to ask Windows users to install Visual Studio > and build python to use the most secure version of python that will run > your python program. Yes there are better ideal solutions (porting your > code to the latest and greatest feature release version), but that?s not a > zero burden option either. > > > > If CI tools just aren't up to the task, then so be it, and this isn't > something I would darken -ideas' door with. > > I honestly don't know if we're at a point where an auto-built security > release would be sufficient and/or useful. That's mostly a question > for the release manager(s). One sticking point might be that I believe > the Windows installers (at least) are signed, and only the release > managers have the signing key. It's probably *not* OK to leave the > security releases unsigned ;-) So there would be a key management > issue to address there. 
> If I understand things correctly, our planned migration to VSTS will include eventually automating the signing of the Windows releases so that part wont be an issue (which are currently signed manually be Steve). -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Thu May 17 09:57:17 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Thu, 17 May 2018 15:57:17 +0200 Subject: [Python-Dev] What is the rationale behind source only releases? References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: <20180517155717.5fe3afac@fsol> On Thu, 17 May 2018 09:42:38 -0400 Brett Cannon wrote: > > If I understand things correctly, our planned migration to VSTS will > include eventually automating the signing of the Windows releases so that > part wont be an issue (which are currently signed manually be Steve). What part is being "planned" to be migrated? I only heard about it on a bug tracker entry. Regards Antoine. From p.f.moore at gmail.com Thu May 17 09:57:49 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 17 May 2018 14:57:49 +0100 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On 17 May 2018 at 14:42, Brett Cannon wrote: > > If I understand things correctly, our planned migration to VSTS will include > eventually automating the signing of the Windows releases so that part wont > be an issue (which are currently signed manually be Steve). Somewhat off-topic for this discussion, but is there any background on the "planned migration to VSTS" that I can go and read up on? I've seen the comments on the committers mailing list and it looks cool, but I got the impression it was an addition, rather than a migration. (If it is just an additional CI service we'll be using, then no worries - more is always better!) Paul From brett at python.org Thu May 17 10:04:51 2018 From: brett at python.org (Brett Cannon) Date: Thu, 17 May 2018 10:04:51 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: On Thu, 17 May 2018 at 09:57 Paul Moore wrote: > On 17 May 2018 at 14:42, Brett Cannon wrote: > > > > If I understand things correctly, our planned migration to VSTS will > include > > eventually automating the signing of the Windows releases so that part > wont > > be an issue (which are currently signed manually be Steve). > > Somewhat off-topic for this discussion, but is there any background on > the "planned migration to VSTS" that I can go and read up on? I've > seen the comments on the committers mailing list and it looks cool, > but I got the impression it was an addition, rather than a migration. > (If it is just an additional CI service we'll be using, then no > worries - more is always better!) > To be a bit more specific, it's "planned assuming the testing of VSTS works out as we expect it to". :) IOW it isn't definite quite yet, but I am not expecting any blockers or people objecting, so in my head I'm optimistic that it's going to happen. (Any more discussion can be brought up on core-workflow.) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From anthony.flury at btinternet.com Thu May 17 10:15:59 2018 From: anthony.flury at btinternet.com (Anthony Flury) Date: Thu, 17 May 2018 15:15:59 +0100 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> Message-ID: <08d628fc-7d02-23ea-85be-6171d2044e2e@btinternet.com> Chris, I entirely agree. The same questioner also asked about the fastest data type to use as a key in a dictionary; and which data structure is fastest. I get the impression the person is very into micro-optimization, without profiling their application. It seems every choice is made based on the speed of that operation; without consideration of how often that operation is used. On 17/05/18 09:16, Chris Angelico wrote: > On Thu, May 17, 2018 at 5:21 PM, Anthony Flury via Python-Dev > wrote: >> Victor, >> Thanks for the link, but to be honest it will just confuse people - neither >> the link or the related bpo entries state that the fix is only limited to >> strings. They simply talk about hash randomization - which in my opinion >> implies ALL hash algorithms; which is why I asked the question. >> >> I am not sure how much should be exposed about the scope of security fixes >> but you can understand my (and other's) confusion. >> >> I am aware that applications shouldn't make assumptions about the value of >> any given hash value - apart from some simple assumptions based hash value >> equality (i.e. if two objects have different hash values they can't be the >> same value). > The hash values of Python objects are calculated by the __hash__ > method, so arbitrary objects can do what they like, including > degenerate algorithms such as: > > class X: > def __hash__(self): return 7 Agreed - I should have said the default hash algorithm. Hashes for custom object are entirely application dependent. > > So it's impossible to randomize ALL hashes at the language level. Only > str and bytes hashes are randomized, because they're the ones most > likely to be exploitable - for instance, a web server will receive a > query like "http://spam.example/target?a=1&b=2&c=3" and provide a > dictionary {"a":1, "b":2, "c":3}. Similarly, a JSON decoder is always > going to create string keys in its dictionaries (JSON objects). Do you > know of any situation in which an attacker can provide the keys for a > dict/set as integers? I was just asking the question - rather than critiquing the fault-fix. I am actually more concerned that the documentation relating to the fix doesn't make it clear that only strings have their hashes randomised. >> /B//TW : // >> // >> //This question was prompted by a question on a social media platform about >> the whether hash values are transferable between across platforms. >> Everything I could find stated that after Python 3.3 ALL hash values were >> randomized - but that clearly isn't the case; and the original questioner >> identified that some hash values are randomized and other aren't.// >> / > That's actually immaterial. Even if the hashes weren't actually > randomized, you shouldn't be making assumptions about anything > specific in the hash, save that *within one Python process*, two equal > values will have equal hashes (and therefore two objects with unequal > hashes will not be equal). Entirely agree - I was just trying to get to the bottom of the difference - especially considering that the documentation I could find implied that all hash algorithms had been randomized. 
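One extra data point I checked while trying to pin down where the boundary
is - a quick illustrative script, nothing authoritative: a tuple that
*contains* a string does pick up the randomization, because the container
hash is built from the element hashes.

    import subprocess, sys

    # Each child interpreter gets its own random hash seed (assuming
    # PYTHONHASHSEED is not pinned in the environment).
    cmd = [sys.executable, "-c", "print(hash(('user', 42)), hash((1, 9)))"]
    first = subprocess.check_output(cmd, universal_newlines=True).split()
    second = subprocess.check_output(cmd, universal_newlines=True).split()

    print("tuple containing a str differs:", first[0] != second[0])  # expected: True
    print("tuple of ints only differs:    ", first[1] != second[1])  # expected: False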
>> //I did suggest strongly to the original questioner that relying on the same >> hash value across different platforms wasn't a clever solution - their >> original plan was to store hash values in a cross system database to enable >> quick retrieval of data (!!!). I did remind the OP that a hash value wasn't >> guaranteed to be unique anyway - and they might come across two different >> values with the same hash - and no way to distinguish between them if all >> they have is the hash. Hopefully their revised design will store the key, >> not the hash./ > Uhh.... if you're using a database, let the database do the work of > being a database. I don't know what this "cross system database" would > be implemented in, but if it's a proper multi-user relational database > engine like PostgreSQL, it's already going to have way better indexing > than anything you'd do manually. I think there are WAY better > solutions than worrying about Python's inbuilt hashing. Agreed > If you MUST hash your data for sharing and storage, the easiest > solution is to just use a cryptographic hash straight out of > hashlib.py. As stated before - I think the original questioner was intent on micro optimizations - and they had hit on the idea that storing an integer would be quicker than storing as string - entirely ignoring both the practicality of trying to code all strings into a value (since hashes aren't guaranteed not to collide), and the issues of trying to reverse that translation once the stored key had been retrieved. > ChrisA > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/anthony.flury%40btinternet.com Thanks for your comments :-) -- -- Anthony Flury email : *Anthony.flury at btinternet.com* Twitter : *@TonyFlury * From steve.dower at python.org Thu May 17 10:21:43 2018 From: steve.dower at python.org (Steve Dower) Date: Thu, 17 May 2018 10:21:43 -0400 Subject: [Python-Dev] Visual Studio Team Services checks on pull requests Message-ID: <566c926f-108f-59b8-c792-e4f20ebe26ad@python.org> Hi python-dev Just drawing your attention to a change we're currently working through on github. There are more details on my post on python-committers at https://mail.python.org/pipermail/python-committers/2018-May/005404.html but this is the short version. Microsoft has donated a significant amount of macOS, Windows and Linux build time on Visual Studio Team Services for CPython that we can use for PR and commit builds on github. We've hooked these up already, so you will see new checks on github pull requests (e.g. https://github.com/python/cpython/pull/6937 ). These are currently not required, but apart from some asyncio tests they appear to be totally stable and considerably faster than our current ones. There are a few limitations still, which I'll be working with the VSTS team to resolve. Feel free to email me with any questions or suggestions. And for complete openness, Microsoft hopes that this will be good publicity for VSTS. If you have any examples of this working (e.g. you adopt it for other projects, start using it at work, etc.) then please pass those on to me as well. 
It will help convince then to keep giving us free resources :) Cheers, Steve From steve.dower at python.org Thu May 17 10:27:02 2018 From: steve.dower at python.org (Steve Dower) Date: Thu, 17 May 2018 10:27:02 -0400 Subject: [Python-Dev] What is the rationale behind source only releases? In-Reply-To: References: <000b01d3eccf$6111fab0$2335f010$@sdamon.com> <015101d3ed91$a32e7cd0$e98b7670$@sdamon.com> Message-ID: <7ca49795-358c-12bf-892c-abc842375a68@python.org> On 17May2018 1004, Brett Cannon wrote: > > > On Thu, 17 May 2018 at 09:57 Paul Moore > wrote: > > On 17 May 2018 at 14:42, Brett Cannon > wrote: > > > > If I understand things correctly, our planned migration to VSTS > will include > > eventually automating the signing of the Windows releases so that > part wont > > be an issue (which are currently signed manually be Steve). > > Somewhat off-topic for this discussion, but is there any background on > the "planned migration to VSTS" that I can go and read up on? I've > seen the comments on the committers mailing list and it looks cool, > but I got the impression it was an addition, rather than a migration. > (If it is just an additional CI service we'll be using, then no > worries - more is always better!) > > > To be a bit more specific, it's "planned assuming the testing of VSTS > works out as we expect it to". :) IOW it isn't definite quite yet, but I > am not expecting any blockers or people objecting, so in my head I'm > optimistic that it's going to happen. (Any more discussion can be > brought up on core-workflow.) I just posted another email, but it looks like it's working out :) The migration hasn't really been planned as such, which is why so few people have heard about it. I've just spent the PyCon US sprints proving that it's a viable option to migrate to, and it can certainly help relieve the burden on AppVeyor and Travis. As Brett says, it'll be up to core-workflow as to whether we switch completely and when. On doing release builds through it, that is somewhat orthogonal. Right now, the Windows build still requires using my secure VM, which doesn't really let just anyone do the release, but we could easily get to a point where specifically authorised people can produce a complete build. Similarly, the macOS build probably shouldn't be done on the provided (up-to-date) CI machine, as I believe that would impact our compatibility. So perhaps having a well-powered and flexible build service available will help, but no promises. Cheers, Steve From rosuav at gmail.com Thu May 17 10:34:50 2018 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 18 May 2018 00:34:50 +1000 Subject: [Python-Dev] Hashes in Python3.5 for tuples and frozensets In-Reply-To: <08d628fc-7d02-23ea-85be-6171d2044e2e@btinternet.com> References: <9b9dc1b0-6d46-dc46-2432-dfd8d4ad56b5@btinternet.com> <08d628fc-7d02-23ea-85be-6171d2044e2e@btinternet.com> Message-ID: On Fri, May 18, 2018 at 12:15 AM, Anthony Flury via Python-Dev wrote: > Chris, > I entirely agree. The same questioner also asked about the fastest data type > to use as a key in a dictionary; and which data structure is fastest. I get > the impression the person is very into micro-optimization, without profiling > their application. It seems every choice is made based on the speed of that > operation; without consideration of how often that operation is used. Sounds like we're on the same page here. 
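For the record, when I suggested reaching for hashlib earlier in the
thread, this is roughly the shape I meant - a sketch only, with SHA-256
chosen arbitrarily and the helper name made up:

    import hashlib

    def stable_key(text):
        # Unlike the builtin hash(), this is deterministic across processes,
        # platforms and Python versions, so it is safe to store or share.
        return hashlib.sha256(text.encode("utf-8")).hexdigest()

    print(stable_key("Hello World"))  # same output everywhere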
> On 17/05/18 09:16, Chris Angelico wrote: >> The hash values of Python objects are calculated by the __hash__ >> method, so arbitrary objects can do what they like, including >> degenerate algorithms such as: >> >> class X: >> def __hash__(self): return 7 > > Agreed - I should have said the default hash algorithm. Hashes for custom > object are entirely application dependent. There isn't a single "default hash algorithm"; in fact, I'm not sure that there's even a single algorithm used for all strings. Certainly the algorithm used for integers is completely different from the one(s) used for strings; we have a guarantee that ints and floats representing the same real number are going to have the same hash (even if that hash isn't equal to the number - hash(1e22)==hash(10**22)!=10**22 is True), since they compare equal. The algorithms used and the resulting hashes may change between Python versions, when you change interpreters (PyPy vs Jython vs CPython vs Brython...), or even when you change word sizes, I believe (32-bit vs 64-bit). So, this is (a) a premature optimization, (b) depending on something that's not guaranteed, and (c) is a great way to paint yourself into a corner. Perfect! :) ChrisA From ncoghlan at gmail.com Thu May 17 10:44:57 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 17 May 2018 10:44:57 -0400 Subject: [Python-Dev] (Looking for) A Retrospective on the Move to Python 3 In-Reply-To: References: <20180428175002.62163b10@fsol> Message-ID: On 14 May 2018 at 12:34, Chris Barker via Python-Dev wrote: > On Sat, May 12, 2018 at 8:14 AM, Skip Montanaro > wrote: > >> > I have found 2to3 conversion to be remarkably easy and painless. >> >> > And the whole Unicode thing is much easier. >> > > Another point here: > > between 3.0 and 3.6 (.5?) -- py3 grew a lot of minor features that made it > easier to write py2/py3 compatible code. u"string", b'bytes %i' % > something -- and when where the various __future__ imports made available? > > If these had been in place in 3.0, the whole process would have been > easier :-( > The __future__ imports were already there in 2.6/3.0. The other ones weren't there initially because we didn't know which things we were tempted to add back because they were actually useful, and which ones we just thought we wanted because we were used to the way the Python 2 text model worked (or failed to work, as the case may be). (The build time source code translation step was also far less effective than we hoped it was going to be, since we completely failed to account for the problem of mapping tracebacks for converted code back to the original pre-translation code) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at holdenweb.com Thu May 17 05:30:30 2018 From: steve at holdenweb.com (Steve Holden) Date: Thu, 17 May 2018 10:30:30 +0100 Subject: [Python-Dev] [Webmaster] Possible virus in Win32 build of python? In-Reply-To: References: Message-ID: On Thu, May 17, 2018 at 5:26 AM, Ryan Saunders wrote: > Hello webmaster, > > > > A little over a week ago, I got hit by a rather nasty virus?one of those > ?ransomware? viruses that encrypts everything on your disk and then demands > bitcoin payment in exchange for the decryption key. Yuck. > > > > One potential way in which this virus might have gotten onto my system is > via a version of Python I downloaded, as I was working on a script to > auto-download Python around that time. 
It?s a bit difficult to be sure, > since (a) my antivirus (Windows Defender) didn?t notice the virus at all > and (b) most files on my HDD are now hopelessly encrypted, including the > copies of Python I downloaded, which makes postmortem analysis?difficult. > > > > I plan to do some more investigation to try to determine exactly how I got > this bug, but I thought it prudent to bring this to your attention quickly, > just in case Python actually *was* the infection vector, so that you can > remove any infected files from your download site. > > > > If I recall correctly, the versions of Python that I was working with were > the following: > > - https://www.python.org/ftp/python/3.7.0/python-3.7.0b4-amd64.exe > - https://www.python.org/ftp/python/3.7.0/python-3.7.0b4- > embed-amd64.zip > - https://www.python.org/ftp/python/3.7.0/python-3.7.0b3-amd64.exe > - https://www.python.org/ftp/python/3.7.0/python-3.7.0b3- > embed-amd64.zip > - https://www.python.org/ftp/python/3.6.5/python-3.6.5-amd64.exe > - https://www.python.org/ftp/python/3.6.5/python-3.6.5-embed-amd64.zip > > > > The virus is the ?Arrow? virus, which most antivirus sites identify as a > variant of the ?dharma/crysys? family of malware. Unfortunately, Windows > Defender did not catch it, so I?m not sure what AV tools to recommend. But > I do suggest scanning the above files with whatever AV tools are at your > disposal, just to be on the safe side, so that no one else contracts this > thing. > > > > If I am later able to determine conclusively the source of my infection, I > will let you know. > > > > Ryan > > > > Sent from Mail for > Windows 10 > > > > _______________________________________________ > Webmaster mailing list > Webmaster at python.org > https://mail.python.org/mailman/listinfo/webmaster > > Hi Ryan, Thanks for your note, and I'm sorry to hear that you have fallen victim to malware. I suspect the probability of a virus in the official installer distributions is very low. I understand that the release process for Windows does involve anti-virus scans, and I am not personally aware of even any false positives on 3.6. Since 3.7.0 is a pre-release I am notifying the developers list as a precaution. You will hear from them if they require any further information. Good luck restoring your system. regards Steve -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Thu May 17 14:31:37 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Thu, 17 May 2018 21:31:37 +0300 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: References: Message-ID: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> 15.05.18 14:51, Ned Deily ????: > This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your > feature fixes, bug fixes, and documentation updates in before > 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days > from now. We will then tag and produce the 3.7.0 release candidate. > Our goal continues been to be to have no changes between the release > candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 > BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are > no critical problems outstanding and that documentation for new > features in 3.7 is complete (including NEWS and What's New items), and > that 3.7 is getting exposure and tested with our various platorms and > third-party distributions and applications. 
Those of us who are > participating in the development sprints at PyCon US 2018 here in > Cleveland can feel the excitement building as we work through the > remaining issues, including completing the "What's New in 3.7" > document and final feature documentation. (We wish you could all be > here.) The "What's New in 3.7" document is still not complete. Actually it is far from complete. In the previous releases somebody made a thoughtful review of the NEWS file and added all significant changes in What's New, and also removed insignificant entries, reorganized entries, fixed errors, improved wording and formatting. Many thanks to Martin Panter, Elvis Pranskevichus, Yury Selivanov, R. David Murray, Nick Coghlan, Antoine Pitrou, Victor Stinner and others for their great work! But it seems in 3.7 this document doesn't have an editor. From nad at python.org Thu May 17 14:39:35 2018 From: nad at python.org (Ned Deily) Date: Thu, 17 May 2018 14:39:35 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> Message-ID: <5B1F42D3-5BAD-47C5-8276-0CE4203834AD@python.org> Elvis has been working on the What's New doc at the sprints this week. He should be checking in his edits soon. Stay tuned! -- Ned Deily nad at python.org -- [] > On May 17, 2018, at 14:31, Serhiy Storchaka wrote: > > 15.05.18 14:51, Ned Deily wrote: >> This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your >> feature fixes, bug fixes, and documentation updates in before >> 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days >> from now. We will then tag and produce the 3.7.0 release candidate. >> Our goal continues to be to have no changes between the release >> candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 >> BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are >> no critical problems outstanding and that documentation for new >> features in 3.7 is complete (including NEWS and What's New items), and >> that 3.7 is getting exposure and tested with our various platforms and >> third-party distributions and applications. Those of us who are >> participating in the development sprints at PyCon US 2018 here in >> Cleveland can feel the excitement building as we work through the >> remaining issues, including completing the "What's New in 3.7" >> document and final feature documentation. (We wish you could all be >> here.) > > The "What's New in 3.7" document is still not complete. Actually it is far from complete. In the previous releases somebody made a thoughtful review of the NEWS file and added all significant changes in What's New, and also removed insignificant entries, reorganized entries, fixed errors, improved wording and formatting. Many thanks to Martin Panter, Elvis Pranskevichus, Yury Selivanov, R. David Murray, Nick Coghlan, Antoine Pitrou, Victor Stinner and others for their great work! But it seems in 3.7 this document doesn't have an editor. > From elprans at gmail.com Thu May 17 14:43:32 2018 From: elprans at gmail.com (Elvis Pranskevichus) Date: Thu, 17 May 2018 14:43:32 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> Message-ID: <15747132.OTsdByEcpE@hammer.magicstack.net> On Thursday, May 17, 2018 2:31:37 PM EDT Serhiy Storchaka wrote: > 15.05.18 14:51, Ned Deily wrote: > > This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your > > feature fixes, bug fixes, and documentation updates in before > > 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days > > from now. We will then tag and produce the 3.7.0 release candidate. > > Our goal continues to be to have no changes between the release > > candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE > > 3.7 BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that > > there are no critical problems outstanding and that documentation > > for new features in 3.7 is complete (including NEWS and What's New > > items), and that 3.7 is getting exposure and tested with our > > various platforms and third-party distributions and applications. > > Those of us who are participating in the development sprints at > > PyCon US 2018 here in Cleveland can feel the excitement building as > > we work through the remaining issues, including completing the > > "What's New in 3.7" document and final feature documentation. (We > > wish you could all be here.) > > The "What's New in 3.7" document is still not complete. Actually it is > far from complete. In the previous releases somebody made a thoughtful > review of the NEWS file and added all significant changes in What's > New, and also removed insignificant entries, reorganized entries, > fixed errors, improved wording and formatting. Many thanks to Martin > Panter, Elvis Pranskevichus, Yury Selivanov, R. David Murray, Nick > Coghlan, Antoine Pitrou, Victor Stinner and others for their great > work! But it seems in 3.7 this document doesn't have an editor. I'm working on the What's New document. Will start putting PRs in the next few days. Elvis From larry at hastings.org Thu May 17 15:01:05 2018 From: larry at hastings.org (Larry Hastings) Date: Thu, 17 May 2018 15:01:05 -0400 Subject: [Python-Dev] Why aren't escape sequences in literal strings handled by the tokenizer? Message-ID: <76e7da75-b06b-d20f-e047-913c9cddee71@hastings.org> I fed this into tokenize.tokenize(): b''' x = "\u1234" ''' I was a bit surprised to see \Uxxxx in the output. Particularly because the output (t.string) was a *string* and not *bytes*. It turns out, Python's tokenizer ignores escape sequences. All it does is ignore the next character so that \" does the proper thing. But it doesn't do any substitutions. The escape sequences are only handled when the AST node is created for the literal string! Maybe I'm making a parade of my ignorance, but I assumed that string literals were parsed by the parser--just like everything else is parsed by the parser, hey it seems like a good place for it--and in particular that the escape sequence substitutions would be done in the tokenizer. Having stared at it a little, I now detect a whiff of "this design solved a real problem". So... what was the problem, and how does this design solve it? BTW, my use case is that I hoped to use CPython's tokenizer to parse some Python-ish-looking text and handle double-quoted strings for me. *Especially* all the escape sequences--leveraging all CPython's support for funny things like \U{penguin}. The current behavior of the tokenizer makes me think it'd be easier to roll my own!
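A minimal sketch of the behavior described above, assuming CPython 3.6+ and only the documented tokenize/ast interfaces (the variable names are illustrative, not from the original message): the tokenizer hands back the literal's source text with its escape sequences intact, as a str rather than bytes, and the decoding only happens once that text is evaluated, for example via ast.literal_eval():

import ast
import io
import tokenize

source = b'x = "\\u1234"\n'   # raw backslash-u-1-2-3-4 in the source bytes
for tok in tokenize.tokenize(io.BytesIO(source).readline):
    if tok.type == tokenize.STRING:
        # The token text is the literal exactly as written, escapes and all.
        print(repr(tok.string))                    # prints '"\\u1234"'
        # Evaluating the literal text is what performs the escape substitution.
        print(repr(ast.literal_eval(tok.string)))  # prints the single character U+1234

Feeding the raw token text to ast.literal_eval() is one way for a tool built on the tokenizer to reuse CPython's escape handling instead of reimplementing it.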
//arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Thu May 17 15:14:11 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Thu, 17 May 2018 22:14:11 +0300 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <15747132.OTsdByEcpE@hammer.magicstack.net> References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> <15747132.OTsdByEcpE@hammer.magicstack.net> Message-ID: <6b2cbd3c-1db7-5ef4-da18-33d19c765229@gmail.com> 17.05.18 21:43, Elvis Pranskevichus wrote: > I'm working on the What's New document. Will start putting PRs in the > next few days. Great! From brett at python.org Thu May 17 14:39:14 2018 From: brett at python.org (Brett Cannon) Date: Thu, 17 May 2018 14:39:14 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com> Message-ID: On Thu, 17 May 2018 at 14:31 Serhiy Storchaka wrote: > 15.05.18 14:51, Ned Deily wrote: > > This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your > > feature fixes, bug fixes, and documentation updates in before > > 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days > > from now. We will then tag and produce the 3.7.0 release candidate. > > Our goal continues to be to have no changes between the release > > candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 > > BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are > > no critical problems outstanding and that documentation for new > > features in 3.7 is complete (including NEWS and What's New items), and > > that 3.7 is getting exposure and tested with our various platforms and > > third-party distributions and applications. Those of us who are > > participating in the development sprints at PyCon US 2018 here in > > Cleveland can feel the excitement building as we work through the > > remaining issues, including completing the "What's New in 3.7" > > document and final feature documentation. (We > > wish you could all be here.) > > The "What's New in 3.7" document is still not complete. Actually it is > far from complete. In the previous releases somebody made a thoughtful > review of the NEWS file and added all significant changes in What's New, > and also removed insignificant entries, reorganized entries, fixed > errors, improved wording and formatting. Many thanks to Martin Panter, > Elvis Pranskevichus, Yury Selivanov, R. David Murray, Nick Coghlan, > Antoine Pitrou, Victor Stinner and others for their great work! But > it seems in 3.7 this document doesn't have an editor. > Maybe we should start thinking about flagging PRs or issues as needing a What's New entry to help track when they need one, or always expect it in a PR and ignore that requirement when a 'skip whats new' label is applied. That would at least make it easier to keep track of what needs to be done. -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric at trueblade.com Thu May 17 18:38:59 2018 From: eric at trueblade.com (Eric V. Smith) Date: Thu, 17 May 2018 18:38:59 -0400 Subject: [Python-Dev] Why aren't escape sequences in literal strings handled by the tokenizer?
In-Reply-To: <76e7da75-b06b-d20f-e047-913c9cddee71@hastings.org> References: <76e7da75-b06b-d20f-e047-913c9cddee71@hastings.org> Message-ID: <5701cb51-d233-dbf3-cedd-8e3f271f366b@trueblade.com> On 5/17/2018 3:01 PM, Larry Hastings wrote: > > > I fed this into tokenize.tokenize(): > > b''' x = "\u1234" ''' > > I was a bit surprised to see \Uxxxx in the output.? Particularly because > the output (t.string) was a *string* and not *bytes*. For those (like me) who have no idea how to use tokenize.tokenize's wacky interface, the test code is: list(tokenize.tokenize(io.BytesIO(b''' x = "\u1234" ''').readline)) > Maybe I'm making a parade of my ignorance, but I assumed that string > literals were parsed by the parser--just like everything else is parsed > by the parser, hey it seems like a good place for it--and in particular > that the escape sequence substitutions would be done in the tokenizer. > Having stared at it a little, I now detect a whiff of "this design > solved a real problem".? So... what was the problem, and how does this > design solve it? I assume the intent is to not throw away any information in the lexer, and give the parser full access to the original string. But that's just a guess. > BTW, my use case is that I hoped to use CPython's tokenizer to parse > some Python-ish-looking text and handle double-quoted strings for me. > *Especially* all the escape sequences--leveraging all CPython's support > for funny things like \U{penguin}.? The current behavior of the > tokenizer makes me think it'd be easier to roll my own! Can you feed the token text to the ast? >>> ast.literal_eval('"\u1234"') '?' Eric From guido at python.org Thu May 17 21:51:56 2018 From: guido at python.org (Guido van Rossum) Date: Thu, 17 May 2018 18:51:56 -0700 Subject: [Python-Dev] Why aren't escape sequences in literal strings handled by the tokenizer? In-Reply-To: <5701cb51-d233-dbf3-cedd-8e3f271f366b@trueblade.com> References: <76e7da75-b06b-d20f-e047-913c9cddee71@hastings.org> <5701cb51-d233-dbf3-cedd-8e3f271f366b@trueblade.com> Message-ID: To answer Larry's question, there's an overwhelming number of different options -- bytes/unicode, raw/cooked, and (in Py2) `from __future__ import unicode_literals`. So it's easier to do the actual semantic conversion in a later stage -- then the lexer only has to worry about hopping over backslashes. On Thu, May 17, 2018 at 3:38 PM, Eric V. Smith wrote: > On 5/17/2018 3:01 PM, Larry Hastings wrote: > >> >> >> I fed this into tokenize.tokenize(): >> >> b''' x = "\u1234" ''' >> >> I was a bit surprised to see \Uxxxx in the output. Particularly because >> the output (t.string) was a *string* and not *bytes*. >> > > For those (like me) who have no idea how to use tokenize.tokenize's wacky > interface, the test code is: > > list(tokenize.tokenize(io.BytesIO(b''' x = "\u1234" ''').readline)) > > Maybe I'm making a parade of my ignorance, but I assumed that string >> literals were parsed by the parser--just like everything else is parsed by >> the parser, hey it seems like a good place for it--and in particular that >> the escape sequence substitutions would be done in the tokenizer. Having >> stared at it a little, I now detect a whiff of "this design solved a real >> problem". So... what was the problem, and how does this design solve it? >> > > I assume the intent is to not throw away any information in the lexer, and > give the parser full access to the original string. But that's just a guess. 
> > BTW, my use case is that I hoped to use CPython's tokenizer to parse some >> Python-ish-looking text and handle double-quoted strings for me. >> *Especially* all the escape sequences--leveraging all CPython's support for >> funny things like \U{penguin}. The current behavior of the tokenizer makes >> me think it'd be easier to roll my own! >> > > Can you feed the token text to the ast? > > >>> ast.literal_eval('"\u1234"') > '?' > > Eric > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/guido% > 40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg at krypto.org Fri May 18 00:29:11 2018 From: greg at krypto.org (Gregory P. Smith) Date: Thu, 17 May 2018 21:29:11 -0700 Subject: [Python-Dev] [Python-checkins] bpo-33522: Enable CI builds on Visual Studio Team Services (GH-6865) (GH-6925) In-Reply-To: <40mt4J0QJ1zFr2Z@mail.python.org> References: <40mt4J0QJ1zFr2Z@mail.python.org> Message-ID: Why did this commit modify .py files, unittests, and test.support? That is inappropriate for something claiming to merely enable a CI platform. -gps On Thu, May 17, 2018 at 6:50 AM Steve Dower wrote: > > https://github.com/python/cpython/commit/0d8f83f59c8f4cc7fe125434ca4ecdcac111810f > commit: 0d8f83f59c8f4cc7fe125434ca4ecdcac111810f > branch: 3.6 > author: Steve Dower > committer: GitHub > date: 2018-05-17T09:46:00-04:00 > summary: > > bpo-33522: Enable CI builds on Visual Studio Team Services (GH-6865) > (GH-6925) > > files: > A .vsts/docs-release.yml > A .vsts/docs.yml > A .vsts/linux-buildbot.yml > A .vsts/linux-coverage.yml > A .vsts/linux-deps.yml > A .vsts/linux-pr.yml > A .vsts/macos-buildbot.yml > A .vsts/macos-pr.yml > A .vsts/windows-buildbot.yml > A .vsts/windows-pr.yml > A Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst > A Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst > M Doc/make.bat > M Lib/tempfile.py > M Lib/test/support/__init__.py > M Lib/test/test_asyncio/test_base_events.py > M Lib/test/test_bdb.py > M Lib/test/test_pathlib.py > M Lib/test/test_poplib.py > M Lib/test/test_selectors.py > M PCbuild/rt.bat > M Tools/ssl/multissltests.py > > diff --git a/.vsts/docs-release.yml b/.vsts/docs-release.yml > new file mode 100644 > index 000000000000..e90428a42494 > --- /dev/null > +++ b/.vsts/docs-release.yml > @@ -0,0 +1,43 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted Linux Preview > + > +#variables: > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- script: sudo apt-get update && sudo apt-get install -qy --force-yes > texlive-full > + displayName: 'Install LaTeX' > + > +- task: UsePythonVersion at 0 > + displayName: 'Use Python 3.6 or later' > + inputs: > + versionSpec: '>=3.6' > + > +- script: python -m pip install sphinx blurb python-docs-theme > + displayName: 'Install build dependencies' > + > +- script: make dist PYTHON=python SPHINXBUILD='python -m sphinx' > BLURB='python -m blurb' > + workingDirectory: '$(build.sourcesDirectory)/Doc' > + displayName: 'Build documentation' > + > +- task: PublishBuildArtifacts at 1 > + displayName: 'Publish build' > + inputs: > + 
PathToPublish: '$(build.sourcesDirectory)/Doc/build' > + ArtifactName: build > + publishLocation: Container > + > +- task: PublishBuildArtifacts at 1 > + displayName: 'Publish dist' > + inputs: > + PathToPublish: '$(build.sourcesDirectory)/Doc/dist' > + ArtifactName: dist > + publishLocation: Container > diff --git a/.vsts/docs.yml b/.vsts/docs.yml > new file mode 100644 > index 000000000000..efa1e871656d > --- /dev/null > +++ b/.vsts/docs.yml > @@ -0,0 +1,43 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted Linux Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + include: > + - Doc/* > + > +#variables: > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- task: UsePythonVersion at 0 > + displayName: 'Use Python 3.6 or later' > + inputs: > + versionSpec: '>=3.6' > + > +- script: python -m pip install sphinx~=1.6.1 blurb python-docs-theme > + displayName: 'Install build dependencies' > + > +- script: make check suspicious html PYTHON=python > + workingDirectory: '$(build.sourcesDirectory)/Doc' > + displayName: 'Build documentation' > + > +- task: PublishBuildArtifacts at 1 > + displayName: 'Publish build' > + inputs: > + PathToPublish: '$(build.sourcesDirectory)/Doc/build' > + ArtifactName: build > + publishLocation: Container > diff --git a/.vsts/linux-buildbot.yml b/.vsts/linux-buildbot.yml > new file mode 100644 > index 000000000000..d75d7f57650e > --- /dev/null > +++ b/.vsts/linux-buildbot.yml > @@ -0,0 +1,71 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted Linux Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +variables: > + # Copy-pasted from linux-deps.yml until template support arrives > + OPENSSL: 1.1.0g > + OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" > + > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +#- template: linux-deps.yml > + > +# See > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md > +# For now, we copy/paste the steps > +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > > /etc/apt/sources.list.d/python.list && sudo apt-get update > + displayName: 'Update apt-get lists' > + > +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) > + displayName: 'Add $(OPENSSL_DIR) to PATH' > +- script: > > + sudo apt-get -yq install > + build-essential > + zlib1g-dev > + libbz2-dev > + liblzma-dev > + libncurses5-dev > + libreadline6-dev > + libsqlite3-dev > + libssl-dev > + libgdbm-dev > + tk-dev > + lzma > + lzma-dev > + liblzma-dev > + libffi-dev > + uuid-dev > + displayName: 'Install dependencies' > +- script: python3 Tools/ssl/multissltests.py --steps=library > --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) > --system Linux > + displayName: 'python multissltests.py' > + > +- script: ./configure --with-pydebug > + displayName: 'Configure CPython (debug)' > + > +- script: make -s -j4 > + displayName: 'Build CPython' > + > +- script: make pythoninfo > + displayName: 'Display build info' > + > +- 
script: make buildbottest TESTOPTS="-j4 -uall,-cpu" > + displayName: 'Tests' > diff --git a/.vsts/linux-coverage.yml b/.vsts/linux-coverage.yml > new file mode 100644 > index 000000000000..3657b1720ee2 > --- /dev/null > +++ b/.vsts/linux-coverage.yml > @@ -0,0 +1,77 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted Linux Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +variables: > + # Copy-pasted from linux-deps.yml until template support arrives > + OPENSSL: 1.1.0g > + OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +#- template: linux-deps.yml > + > +# See > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md > +# For now, we copy/paste the steps > +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > > /etc/apt/sources.list.d/python.list && sudo apt-get update > + displayName: 'Update apt-get lists' > + > +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) > + displayName: 'Add $(OPENSSL_DIR) to PATH' > +- script: > > + sudo apt-get -yq install > + build-essential > + zlib1g-dev > + libbz2-dev > + liblzma-dev > + libncurses5-dev > + libreadline6-dev > + libsqlite3-dev > + libssl-dev > + libgdbm-dev > + tk-dev > + lzma > + lzma-dev > + liblzma-dev > + libffi-dev > + uuid-dev > + displayName: 'Install dependencies' > +- script: python3 Tools/ssl/multissltests.py --steps=library > --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) > --system Linux > + displayName: 'python multissltests.py' > + > + > +- script: ./configure --with-pydebug > + displayName: 'Configure CPython (debug)' > + > +- script: make -s -j4 > + displayName: 'Build CPython' > + > +- script: ./python -m venv venv && ./venv/bin/python -m pip install -U > coverage > + displayName: 'Set up virtual environment' > + > +- script: ./venv/bin/python -m test.pythoninfo > + displayName: 'Display build info' > + > +- script: ./venv/bin/python -m coverage run --pylib -m test > --fail-env-changed -uall,-cpu -x test_multiprocessing_fork -x > test_multiprocessing_forkserver -x test_multiprocessing_spawn -x > test_concurrent_futures > + displayName: 'Tests with coverage' > + > +- script: source ./venv/bin/activate && bash <(curl -s > https://codecov.io/bash) > + displayName: 'Publish code coverage results' > diff --git a/.vsts/linux-deps.yml b/.vsts/linux-deps.yml > new file mode 100644 > index 000000000000..b6c8a3690ea1 > --- /dev/null > +++ b/.vsts/linux-deps.yml > @@ -0,0 +1,36 @@ > +# Note: this file is not currently used, but when template support comes > to VSTS it > +# will be referenced from the other scripts.. 
> + > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +parameters: > + OPENSSL: 1.1.0g > + OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" > + > +steps: > +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > > /etc/apt/sources.list.d/python.list && sudo apt-get update > + displayName: 'Update apt-get lists' > + > +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) > + displayName: 'Add $(OPENSSL_DIR) to PATH' > +- script: > > + sudo apt-get -yq install > + build-essential > + zlib1g-dev > + libbz2-dev > + liblzma-dev > + libncurses5-dev > + libreadline6-dev > + libsqlite3-dev > + libssl-dev > + libgdbm-dev > + tk-dev > + lzma > + lzma-dev > + liblzma-dev > + libffi-dev > + uuid-dev > + displayName: 'Install dependencies' > +- script: python3 Tools/ssl/multissltests.py --steps=library > --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) > --system Linux > + displayName: 'python multissltests.py' > diff --git a/.vsts/linux-pr.yml b/.vsts/linux-pr.yml > new file mode 100644 > index 000000000000..7f4d458f5a7c > --- /dev/null > +++ b/.vsts/linux-pr.yml > @@ -0,0 +1,75 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted Linux Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +variables: > + # Copy-pasted from linux-deps.yml until template support arrives > + OPENSSL: 1.1.0g > + OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +#- template: linux-deps.yml > + > +# See > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md > +# For now, we copy/paste the steps > +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > > /etc/apt/sources.list.d/python.list && sudo apt-get update > + displayName: 'Update apt-get lists' > + > +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) > + displayName: 'Add $(OPENSSL_DIR) to PATH' > +- script: > > + sudo apt-get -yq install > + build-essential > + zlib1g-dev > + libbz2-dev > + liblzma-dev > + libncurses5-dev > + libreadline6-dev > + libsqlite3-dev > + libssl-dev > + libgdbm-dev > + tk-dev > + lzma > + lzma-dev > + liblzma-dev > + libffi-dev > + uuid-dev > + displayName: 'Install dependencies' > +- script: python3 Tools/ssl/multissltests.py --steps=library > --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) > --system Linux > + displayName: 'python multissltests.py' > + > + > +- script: ./configure --with-pydebug > + displayName: 'Configure CPython (debug)' > + > +- script: make -s -j4 > + displayName: 'Build CPython' > + > +- script: make pythoninfo > + displayName: 'Display build info' > + > +# Run patchcheck and fail if anything is discovered > +- script: ./python Tools/scripts/patchcheck.py --travis true > + displayName: 'Run patchcheck.py' > + > +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" > + displayName: 'Tests' > diff --git a/.vsts/macos-buildbot.yml b/.vsts/macos-buildbot.yml > new file mode 100644 > index 000000000000..8a4f6ba8cb8b > --- /dev/null > +++ b/.vsts/macos-buildbot.yml > @@ -0,0 +1,37 @@ > +# Current docs for 
the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted macOS Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +#variables: > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- script: ./configure --with-pydebug --with-openssl=/usr/local/opt/openssl > + displayName: 'Configure CPython (debug)' > + > +- script: make -s -j4 > + displayName: 'Build CPython' > + > +- script: make pythoninfo > + displayName: 'Display build info' > + > +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" > + displayName: 'Tests' > diff --git a/.vsts/macos-pr.yml b/.vsts/macos-pr.yml > new file mode 100644 > index 000000000000..8a4f6ba8cb8b > --- /dev/null > +++ b/.vsts/macos-pr.yml > @@ -0,0 +1,37 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted macOS Preview > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +#variables: > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- script: ./configure --with-pydebug --with-openssl=/usr/local/opt/openssl > + displayName: 'Configure CPython (debug)' > + > +- script: make -s -j4 > + displayName: 'Build CPython' > + > +- script: make pythoninfo > + displayName: 'Display build info' > + > +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" > + displayName: 'Tests' > diff --git a/.vsts/windows-buildbot.yml b/.vsts/windows-buildbot.yml > new file mode 100644 > index 000000000000..5ec4522796ce > --- /dev/null > +++ b/.vsts/windows-buildbot.yml > @@ -0,0 +1,49 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted VS2017 > + parallel: 2 > + matrix: > + amd64: > + buildOpt: -p x64 > + outDirSuffix: amd64 > + win32: > + buildOpt: > + outDirSuffix: win32 > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +variables: > + # Relocate build outputs outside of source directory to make cleaning > faster > + Py_IntDir: $(Build.BinariesDirectory)\obj > + # UNDONE: Do not build to a different directory because of broken tests > + Py_OutDir: $(Build.SourcesDirectory)\PCbuild > + EXTERNAL_DIR: $(Build.BinariesDirectory)\externals > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- script: PCbuild\build.bat -e $(buildOpt) > + displayName: 'Build CPython' > + > +- script: python.bat -m test.pythoninfo > + displayName: 'Display build info' > + > +- script: PCbuild\rt.bat -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 > + displayName: 'Tests' > + env: > + PREFIX: $(Py_OutDir)\$(outDirSuffix) > diff --git a/.vsts/windows-pr.yml b/.vsts/windows-pr.yml > new file mode 100644 > index 000000000000..5ec4522796ce > --- /dev/null > +++ b/.vsts/windows-pr.yml > @@ -0,0 +1,49 @@ > +# Current docs for the syntax of this file are at: > +# > https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md > + > +name: 
$(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) > + > +queue: > + name: Hosted VS2017 > + parallel: 2 > + matrix: > + amd64: > + buildOpt: -p x64 > + outDirSuffix: amd64 > + win32: > + buildOpt: > + outDirSuffix: win32 > + > +trigger: > + branches: > + include: > + - master > + - 3.7 > + - 3.6 > + paths: > + exclude: > + - Doc/* > + - Tools/* > + > +variables: > + # Relocate build outputs outside of source directory to make cleaning > faster > + Py_IntDir: $(Build.BinariesDirectory)\obj > + # UNDONE: Do not build to a different directory because of broken tests > + Py_OutDir: $(Build.SourcesDirectory)\PCbuild > + EXTERNAL_DIR: $(Build.BinariesDirectory)\externals > + > +steps: > +- checkout: self > + clean: true > + fetchDepth: 5 > + > +- script: PCbuild\build.bat -e $(buildOpt) > + displayName: 'Build CPython' > + > +- script: python.bat -m test.pythoninfo > + displayName: 'Display build info' > + > +- script: PCbuild\rt.bat -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 > + displayName: 'Tests' > + env: > + PREFIX: $(Py_OutDir)\$(outDirSuffix) > diff --git a/Doc/make.bat b/Doc/make.bat > index 6cb315fda405..c69cfae31941 100644 > --- a/Doc/make.bat > +++ b/Doc/make.bat > @@ -5,18 +5,21 @@ pushd %~dp0 > > set this=%~n0 > > -call ..\PCBuild\find_python.bat %PYTHON% > -if not defined SPHINXBUILD if defined PYTHON ( > +call ..\PCbuild\find_python.bat %PYTHON% > + > +if not defined PYTHON set PYTHON=py > + > +if not defined SPHINXBUILD ( > %PYTHON% -c "import sphinx" > nul 2> nul > if errorlevel 1 ( > echo Installing sphinx with %PYTHON% > - %PYTHON% -m pip install sphinx > + %PYTHON% -m pip install sphinx python-docs-theme > if errorlevel 1 exit /B > ) > set SPHINXBUILD=%PYTHON% -c "import sphinx, sys; sys.argv[0] = > 'sphinx-build'; sphinx.main()" > ) > > -if not defined BLURB if defined PYTHON ( > +if not defined BLURB ( > %PYTHON% -c "import blurb" > nul 2> nul > if errorlevel 1 ( > echo Installing blurb with %PYTHON% > @@ -26,7 +29,6 @@ if not defined BLURB if defined PYTHON ( > set BLURB=%PYTHON% -m blurb > ) > > -if not defined PYTHON set PYTHON=py > if not defined SPHINXBUILD set SPHINXBUILD=sphinx-build > if not defined BLURB set BLURB=blurb > > diff --git a/Lib/tempfile.py b/Lib/tempfile.py > index 38738082b996..2cb5434ba7b5 100644 > --- a/Lib/tempfile.py > +++ b/Lib/tempfile.py > @@ -173,7 +173,9 @@ def _candidate_tempdir_list(): > > # Failing that, try OS-specific locations. > if _os.name == 'nt': > - dirlist.extend([ r'c:\temp', r'c:\tmp', r'\temp', r'\tmp' ]) > + dirlist.extend([ _os.path.expanduser(r'~\AppData\Local\Temp'), > + _os.path.expandvars(r'%SYSTEMROOT%\Temp'), > + r'c:\temp', r'c:\tmp', r'\temp', r'\tmp' ]) > else: > dirlist.extend([ '/tmp', '/var/tmp', '/usr/tmp' ]) > > diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py > index 867124b63e24..e46394e89d1f 100644 > --- a/Lib/test/support/__init__.py > +++ b/Lib/test/support/__init__.py > @@ -366,6 +366,20 @@ def _rmtree_inner(path): > _force_run(fullname, os.unlink, fullname) > _waitfor(_rmtree_inner, path, waitall=True) > _waitfor(lambda p: _force_run(p, os.rmdir, p), path) > + > + def _longpath(path): > + try: > + import ctypes > + except ImportError: > + # No ctypes means we can't expands paths. 
> + pass > + else: > + buffer = ctypes.create_unicode_buffer(len(path) * 2) > + length = ctypes.windll.kernel32.GetLongPathNameW(path, buffer, > + len(buffer)) > + if length: > + return buffer[:length] > + return path > else: > _unlink = os.unlink > _rmdir = os.rmdir > @@ -392,6 +406,9 @@ def _rmtree_inner(path): > _rmtree_inner(path) > os.rmdir(path) > > + def _longpath(path): > + return path > + > def unlink(filename): > try: > _unlink(filename) > @@ -2333,13 +2350,15 @@ def can_xattr(): > if not hasattr(os, "setxattr"): > can = False > else: > - tmp_fp, tmp_name = tempfile.mkstemp() > + tmp_dir = tempfile.mkdtemp() > + tmp_fp, tmp_name = tempfile.mkstemp(dir=tmp_dir) > try: > with open(TESTFN, "wb") as fp: > try: > # TESTFN & tempfile may use different file systems > with > # different capabilities > os.setxattr(tmp_fp, b"user.test", b"") > + os.setxattr(tmp_name, b"trusted.foo", b"42") > os.setxattr(fp.fileno(), b"user.test", b"") > # Kernels < 2.6.39 don't respect setxattr flags. > kernel_version = platform.release() > @@ -2350,6 +2369,7 @@ def can_xattr(): > finally: > unlink(TESTFN) > unlink(tmp_name) > + rmdir(tmp_dir) > _can_xattr = can > return can > > diff --git a/Lib/test/test_asyncio/test_base_events.py > b/Lib/test/test_asyncio/test_base_events.py > index 830f0d84a9d4..42c0707e8f21 100644 > --- a/Lib/test/test_asyncio/test_base_events.py > +++ b/Lib/test/test_asyncio/test_base_events.py > @@ -1750,5 +1750,6 @@ def runner(loop): > outer_loop.close() > > > + > if __name__ == '__main__': > unittest.main() > diff --git a/Lib/test/test_bdb.py b/Lib/test/test_bdb.py > index abefe6c4e57a..a36667869718 100644 > --- a/Lib/test/test_bdb.py > +++ b/Lib/test/test_bdb.py > @@ -417,15 +417,17 @@ def __init__(self, test_case, skip=None): > self.dry_run = test_case.dry_run > self.tracer = Tracer(test_case.expect_set, skip=skip, > dry_run=self.dry_run, test_case=test_case.id > ()) > + self._original_tracer = None > > def __enter__(self): > # test_pdb does not reset Breakpoint class attributes on exit :-( > reset_Breakpoint() > + self._original_tracer = sys.gettrace() > return self.tracer > > def __exit__(self, type_=None, value=None, traceback=None): > reset_Breakpoint() > - sys.settrace(None) > + sys.settrace(self._original_tracer) > > not_empty = '' > if self.tracer.set_list: > diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py > index db53a8f202dc..bf9467e96e09 100644 > --- a/Lib/test/test_pathlib.py > +++ b/Lib/test/test_pathlib.py > @@ -1531,7 +1531,7 @@ def test_resolve_common(self): > # resolves to 'dirB/..' first before resolving to parent of > dirB. 
> self._check_resolve_relative(p, P(BASE, 'foo', 'in', 'spam'), > False) > # Now create absolute symlinks > - d = tempfile.mkdtemp(suffix='-dirD') > + d = support._longpath(tempfile.mkdtemp(suffix='-dirD')) > self.addCleanup(support.rmtree, d) > os.symlink(os.path.join(d), join('dirA', 'linkX')) > os.symlink(join('dirB'), os.path.join(d, 'linkY')) > diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py > index ca9bc6217509..234c855545c2 100644 > --- a/Lib/test/test_poplib.py > +++ b/Lib/test/test_poplib.py > @@ -217,11 +217,12 @@ def start(self): > def run(self): > self.active = True > self.__flag.set() > - while self.active and asyncore.socket_map: > - self.active_lock.acquire() > - asyncore.loop(timeout=0.1, count=1) > - self.active_lock.release() > - asyncore.close_all(ignore_all=True) > + try: > + while self.active and asyncore.socket_map: > + with self.active_lock: > + asyncore.loop(timeout=0.1, count=1) > + finally: > + asyncore.close_all(ignore_all=True) > > def stop(self): > assert self.active > diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py > index 852b2feb45fd..14ce91f3768c 100644 > --- a/Lib/test/test_selectors.py > +++ b/Lib/test/test_selectors.py > @@ -450,7 +450,14 @@ def test_above_fd_setsize(self): > self.skipTest("FD limit reached") > raise > > - self.assertEqual(NUM_FDS // 2, len(s.select())) > + try: > + fds = s.select() > + except OSError as e: > + if e.errno == errno.EINVAL and sys.platform == 'darwin': > + # unexplainable errors on macOS don't need to fail the > test > + self.skipTest("Invalid argument error calling poll()") > + raise > + self.assertEqual(NUM_FDS // 2, len(fds)) > > > class DefaultSelectorTestCase(BaseSelectorTestCase): > diff --git > a/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst > b/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst > new file mode 100644 > index 000000000000..f44862f0c454 > --- /dev/null > +++ b/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst > @@ -0,0 +1,2 @@ > +Enable CI builds on Visual Studio Team Services at > +https://python.visualstudio.com/cpython > diff --git > a/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst > b/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst > new file mode 100644 > index 000000000000..65585c152987 > --- /dev/null > +++ b/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst > @@ -0,0 +1 @@ > +tempfile._candidate_tempdir_list should consider common TEMP locations > diff --git a/PCbuild/rt.bat b/PCbuild/rt.bat > index 808102f826d3..212befc95b06 100644 > --- a/PCbuild/rt.bat > +++ b/PCbuild/rt.bat > @@ -7,7 +7,7 @@ rem -q "quick" -- normally the tests are run twice, the > first time > rem after deleting all the .pyc files reachable from Lib/. > rem -q runs the tests just once, and without deleting .pyc files. > rem -x64 Run the 64-bit build of python (or python_d if -d was specified) > -rem from the 'amd64' dir instead of the 32-bit build in this dir. > +rem When omitted, uses %PREFIX% if set or the 32-bit build > rem All leading instances of these switches are shifted off, and > rem whatever remains (up to 9 arguments) is passed to regrtest.py. 
> rem For example, > @@ -28,28 +28,29 @@ rem rt -u "network,largefile" > setlocal > > set pcbuild=%~dp0 > -set prefix=%pcbuild%win32\ > set suffix= > set qmode= > set dashO= > set regrtestargs= > +set exe= > > :CheckOpts > if "%1"=="-O" (set dashO=-O) & shift & goto CheckOpts > if "%1"=="-q" (set qmode=yes) & shift & goto CheckOpts > if "%1"=="-d" (set suffix=_d) & shift & goto CheckOpts > -if "%1"=="-x64" (set prefix=%pcbuild%amd64\) & shift & goto CheckOpts > +if "%1"=="-x64" (set prefix=%pcbuild%amd64) & shift & goto CheckOpts > if NOT "%1"=="" (set regrtestargs=%regrtestargs% %1) & shift & goto > CheckOpts > > -set exe=%prefix%python%suffix%.exe > -set cmd="%exe%" %dashO% -Wd -E -bb -m test %regrtestargs% > +if not defined prefix set prefix=%pcbuild%win32 > +set exe=%prefix%\python%suffix%.exe > +set cmd="%exe%" %dashO% -u -Wd -E -bb -m test %regrtestargs% > if defined qmode goto Qmode > > echo Deleting .pyc files ... > "%exe%" "%pcbuild%rmpyc.py" > > echo Cleaning _pth files ... > -if exist %prefix%*._pth del %prefix%*._pth > +if exist %prefix%\*._pth del %prefix%\*._pth > > echo on > %cmd% > diff --git a/Tools/ssl/multissltests.py b/Tools/ssl/multissltests.py > index ba4529ae0611..f3241cd6071c 100755 > --- a/Tools/ssl/multissltests.py > +++ b/Tools/ssl/multissltests.py > @@ -123,6 +123,11 @@ > action='store_true', > help="Don't run tests, only compile _ssl.c and _hashopenssl.c." > ) > +parser.add_argument( > + '--system', > + default='', > + help="Override the automatic system type detection." > +) > > > class AbstractBuilder(object): > @@ -150,6 +155,7 @@ def __init__(self, version, compile_args=(), > # build directory (removed after install) > self.build_dir = os.path.join( > self.src_dir, self.build_template.format(version)) > + self.system = args.system > > def __str__(self): > return "<{0.__class__.__name__} for {0.version}>".format(self) > @@ -254,9 +260,13 @@ def _build_src(self): > cwd = self.build_dir > cmd = ["./config", "shared", > "--prefix={}".format(self.install_dir)] > cmd.extend(self.compile_args) > - self._subprocess_call(cmd, cwd=cwd) > + env = None > + if self.system: > + env = os.environ.copy() > + env['SYSTEM'] = self.system > + self._subprocess_call(cmd, cwd=cwd, env=env) > # Old OpenSSL versions do not support parallel builds. > - self._subprocess_call(["make", "-j1"], cwd=cwd) > + self._subprocess_call(["make", "-j1"], cwd=cwd, env=env) > > def _make_install(self, remove=True): > self._subprocess_call(["make", "-j1", "install"], > cwd=self.build_dir) > > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Fri May 18 02:04:55 2018 From: steve.dower at python.org (Steve Dower) Date: Thu, 17 May 2018 23:04:55 -0700 Subject: [Python-Dev] [Python-checkins] bpo-33522: Enable CI builds onVisual Studio Team Services (GH-6865) (GH-6925) In-Reply-To: References: <40mt4J0QJ1zFr2Z@mail.python.org> Message-ID: Fair point. It was because the new platforms are not identical to any of our existing ones and so exposed gaps in our tests (and one wart in tempfile, which I had explicitly reviewed by two others and gave its own bug and NEWS item). 
Pre-squashing, the change had nearly 100 attempts at resolving them all, and I saw no value in cluttering the main history with them for a not-yet-approved CI system, and also no possibility of it ever being approved if there were test failures. Happy to discuss any particular changes if you have specific concerns. Top-posted from my Windows phone From: Gregory P. Smith Sent: Thursday, May 17, 2018 21:32 To: python-dev at python.org Cc: python-checkins at python.org Subject: Re: [Python-Dev] [Python-checkins] bpo-33522: Enable CI builds onVisual Studio Team Services (GH-6865) (GH-6925) Why did this commit modify .py files, unittests, and test.support? That is inappropriate for something claiming to merely enable a CI platform. -gps On Thu, May 17, 2018 at 6:50 AM Steve Dower wrote: https://github.com/python/cpython/commit/0d8f83f59c8f4cc7fe125434ca4ecdcac111810f commit: 0d8f83f59c8f4cc7fe125434ca4ecdcac111810f branch: 3.6 author: Steve Dower committer: GitHub date: 2018-05-17T09:46:00-04:00 summary: bpo-33522: Enable CI builds on Visual Studio Team Services (GH-6865) (GH-6925) files: A .vsts/docs-release.yml A .vsts/docs.yml A .vsts/linux-buildbot.yml A .vsts/linux-coverage.yml A .vsts/linux-deps.yml A .vsts/linux-pr.yml A .vsts/macos-buildbot.yml A .vsts/macos-pr.yml A .vsts/windows-buildbot.yml A .vsts/windows-pr.yml A Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst A Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst M Doc/make.bat M Lib/tempfile.py M Lib/test/support/__init__.py M Lib/test/test_asyncio/test_base_events.py M Lib/test/test_bdb.py M Lib/test/test_pathlib.py M Lib/test/test_poplib.py M Lib/test/test_selectors.py M PCbuild/rt.bat M Tools/ssl/multissltests.py diff --git a/.vsts/docs-release.yml b/.vsts/docs-release.yml new file mode 100644 index 000000000000..e90428a42494 --- /dev/null +++ b/.vsts/docs-release.yml @@ -0,0 +1,43 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted Linux Preview + +#variables: + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- script: sudo apt-get update && sudo apt-get install -qy --force-yes texlive-full +? displayName: 'Install LaTeX' + +- task: UsePythonVersion at 0 +? displayName: 'Use Python 3.6 or later' +? inputs: +? ? versionSpec: '>=3.6' + +- script: python -m pip install sphinx blurb python-docs-theme +? displayName: 'Install build dependencies' + +- script: make dist PYTHON=python SPHINXBUILD='python -m sphinx' BLURB='python -m blurb' +? workingDirectory: '$(build.sourcesDirectory)/Doc' +? displayName: 'Build documentation' + +- task: PublishBuildArtifacts at 1 +? displayName: 'Publish build' +? inputs: +? ? PathToPublish: '$(build.sourcesDirectory)/Doc/build' +? ? ArtifactName: build +? ? publishLocation: Container + +- task: PublishBuildArtifacts at 1 +? displayName: 'Publish dist' +? inputs: +? ? PathToPublish: '$(build.sourcesDirectory)/Doc/dist' +? ? ArtifactName: dist +? ? publishLocation: Container diff --git a/.vsts/docs.yml b/.vsts/docs.yml new file mode 100644 index 000000000000..efa1e871656d --- /dev/null +++ b/.vsts/docs.yml @@ -0,0 +1,43 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted Linux Preview + +trigger: +? 
branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? include: +? ? - Doc/* + +#variables: + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- task: UsePythonVersion at 0 +? displayName: 'Use Python 3.6 or later' +? inputs: +? ? versionSpec: '>=3.6' + +- script: python -m pip install sphinx~=1.6.1 blurb python-docs-theme +? displayName: 'Install build dependencies' + +- script: make check suspicious html PYTHON=python +? workingDirectory: '$(build.sourcesDirectory)/Doc' +? displayName: 'Build documentation' + +- task: PublishBuildArtifacts at 1 +? displayName: 'Publish build' +? inputs: +? ? PathToPublish: '$(build.sourcesDirectory)/Doc/build' +? ? ArtifactName: build +? ? publishLocation: Container diff --git a/.vsts/linux-buildbot.yml b/.vsts/linux-buildbot.yml new file mode 100644 index 000000000000..d75d7f57650e --- /dev/null +++ b/.vsts/linux-buildbot.yml @@ -0,0 +1,71 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted Linux Preview + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +variables: +? # Copy-pasted from linux-deps.yml until template support arrives +? OPENSSL: 1.1.0g +? OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" + + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +#- template: linux-deps.yml + +# See https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md +# For now, we copy/paste the steps +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > /etc/apt/sources.list.d/python.list && sudo apt-get update +? displayName: 'Update apt-get lists' + +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) +? displayName: 'Add $(OPENSSL_DIR) to PATH' +- script: > +? ? sudo apt-get -yq install +? ? build-essential +? ? zlib1g-dev +? ? libbz2-dev +? ? liblzma-dev +? ? libncurses5-dev +? ? libreadline6-dev +? ? libsqlite3-dev +? ? libssl-dev +? ? libgdbm-dev +? ? tk-dev +? ? lzma +? ? lzma-dev +? ? liblzma-dev +? ? libffi-dev +? ? uuid-dev +? displayName: 'Install dependencies' +- script: python3 Tools/ssl/multissltests.py --steps=library --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) --system Linux +? displayName: 'python multissltests.py' + +- script: ./configure --with-pydebug +? displayName: 'Configure CPython (debug)' + +- script: make -s -j4 +? displayName: 'Build CPython' + +- script: make pythoninfo +? displayName: 'Display build info' + +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" +? displayName: 'Tests' diff --git a/.vsts/linux-coverage.yml b/.vsts/linux-coverage.yml new file mode 100644 index 000000000000..3657b1720ee2 --- /dev/null +++ b/.vsts/linux-coverage.yml @@ -0,0 +1,77 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted Linux Preview + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +variables: +? # Copy-pasted from linux-deps.yml until template support arrives +? OPENSSL: 1.1.0g +? OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" + +steps: +- checkout: self +? clean: true +? 
fetchDepth: 5 + +#- template: linux-deps.yml + +# See https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md +# For now, we copy/paste the steps +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > /etc/apt/sources.list.d/python.list && sudo apt-get update +? displayName: 'Update apt-get lists' + +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) +? displayName: 'Add $(OPENSSL_DIR) to PATH' +- script: > +? ? sudo apt-get -yq install +? ? build-essential +? ? zlib1g-dev +? ? libbz2-dev +? ? liblzma-dev +? ? libncurses5-dev +? ? libreadline6-dev +? ? libsqlite3-dev +? ? libssl-dev +? ? libgdbm-dev +? ? tk-dev +? ? lzma +? ? lzma-dev +? ? liblzma-dev +? ? libffi-dev +? ? uuid-dev +? displayName: 'Install dependencies' +- script: python3 Tools/ssl/multissltests.py --steps=library --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) --system Linux +? displayName: 'python multissltests.py' + + +- script: ./configure --with-pydebug +? displayName: 'Configure CPython (debug)' + +- script: make -s -j4 +? displayName: 'Build CPython' + +- script: ./python -m venv venv && ./venv/bin/python -m pip install -U coverage +? displayName: 'Set up virtual environment' + +- script: ./venv/bin/python -m test.pythoninfo +? displayName: 'Display build info' + +- script: ./venv/bin/python -m coverage run --pylib -m test --fail-env-changed -uall,-cpu -x test_multiprocessing_fork -x test_multiprocessing_forkserver -x test_multiprocessing_spawn -x test_concurrent_futures +? displayName: 'Tests with coverage' + +- script: source ./venv/bin/activate && bash <(curl -s https://codecov.io/bash) +? displayName: 'Publish code coverage results' diff --git a/.vsts/linux-deps.yml b/.vsts/linux-deps.yml new file mode 100644 index 000000000000..b6c8a3690ea1 --- /dev/null +++ b/.vsts/linux-deps.yml @@ -0,0 +1,36 @@ +# Note: this file is not currently used, but when template support comes to VSTS it +# will be referenced from the other scripts.. + +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +parameters: +? OPENSSL: 1.1.0g +? OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" + +steps: +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > /etc/apt/sources.list.d/python.list && sudo apt-get update +? displayName: 'Update apt-get lists' + +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) +? displayName: 'Add $(OPENSSL_DIR) to PATH' +- script: > +? ? sudo apt-get -yq install +? ? build-essential +? ? zlib1g-dev +? ? libbz2-dev +? ? liblzma-dev +? ? libncurses5-dev +? ? libreadline6-dev +? ? libsqlite3-dev +? ? libssl-dev +? ? libgdbm-dev +? ? tk-dev +? ? lzma +? ? lzma-dev +? ? liblzma-dev +? ? libffi-dev +? ? uuid-dev +? displayName: 'Install dependencies' +- script: python3 Tools/ssl/multissltests.py --steps=library --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) --system Linux +? displayName: 'python multissltests.py' diff --git a/.vsts/linux-pr.yml b/.vsts/linux-pr.yml new file mode 100644 index 000000000000..7f4d458f5a7c --- /dev/null +++ b/.vsts/linux-pr.yml @@ -0,0 +1,75 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted Linux Preview + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? 
? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +variables: +? # Copy-pasted from linux-deps.yml until template support arrives +? OPENSSL: 1.1.0g +? OPENSSL_DIR: "$(build.sourcesDirectory)/multissl/openssl/$(OPENSSL)" + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +#- template: linux-deps.yml + +# See https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted-templates.md +# For now, we copy/paste the steps +- script: echo "deb-src http://archive.ubuntu.com/ubuntu/ xenial main" > /etc/apt/sources.list.d/python.list && sudo apt-get update +? displayName: 'Update apt-get lists' + +- script: echo ##vso[task.prependpath]$(OPENSSL_DIR) +? displayName: 'Add $(OPENSSL_DIR) to PATH' +- script: > +? ? sudo apt-get -yq install +? ? build-essential +? ? zlib1g-dev +? ? libbz2-dev +? ? liblzma-dev +? ? libncurses5-dev +? ? libreadline6-dev +? ? libsqlite3-dev +? ? libssl-dev +? ? libgdbm-dev +? ? tk-dev +? ? lzma +? ? lzma-dev +? ? liblzma-dev +? ? libffi-dev +? ? uuid-dev +? displayName: 'Install dependencies' +- script: python3 Tools/ssl/multissltests.py --steps=library --base-directory $(build.sourcesDirectory)/multissl --openssl $(OPENSSL) --system Linux +? displayName: 'python multissltests.py' + + +- script: ./configure --with-pydebug +? displayName: 'Configure CPython (debug)' + +- script: make -s -j4 +? displayName: 'Build CPython' + +- script: make pythoninfo +? displayName: 'Display build info' + +# Run patchcheck and fail if anything is discovered +- script: ./python Tools/scripts/patchcheck.py --travis true +? displayName: 'Run patchcheck.py' + +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" +? displayName: 'Tests' diff --git a/.vsts/macos-buildbot.yml b/.vsts/macos-buildbot.yml new file mode 100644 index 000000000000..8a4f6ba8cb8b --- /dev/null +++ b/.vsts/macos-buildbot.yml @@ -0,0 +1,37 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted macOS Preview + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +#variables: + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- script: ./configure --with-pydebug --with-openssl=/usr/local/opt/openssl +? displayName: 'Configure CPython (debug)' + +- script: make -s -j4 +? displayName: 'Build CPython' + +- script: make pythoninfo +? displayName: 'Display build info' + +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" +? displayName: 'Tests' diff --git a/.vsts/macos-pr.yml b/.vsts/macos-pr.yml new file mode 100644 index 000000000000..8a4f6ba8cb8b --- /dev/null +++ b/.vsts/macos-pr.yml @@ -0,0 +1,37 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted macOS Preview + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +#variables: + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- script: ./configure --with-pydebug --with-openssl=/usr/local/opt/openssl +? displayName: 'Configure CPython (debug)' + +- script: make -s -j4 +? displayName: 'Build CPython' + +- script: make pythoninfo +? 
displayName: 'Display build info' + +- script: make buildbottest TESTOPTS="-j4 -uall,-cpu" +? displayName: 'Tests' diff --git a/.vsts/windows-buildbot.yml b/.vsts/windows-buildbot.yml new file mode 100644 index 000000000000..5ec4522796ce --- /dev/null +++ b/.vsts/windows-buildbot.yml @@ -0,0 +1,49 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted VS2017 +? parallel: 2 +? matrix: +? ? amd64: +? ? ? buildOpt: -p x64 +? ? ? outDirSuffix: amd64 +? ? win32: +? ? ? buildOpt: +? ? ? outDirSuffix: win32 + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +variables: +? # Relocate build outputs outside of source directory to make cleaning faster +? Py_IntDir: $(Build.BinariesDirectory)\obj +? # UNDONE: Do not build to a different directory because of broken tests +? Py_OutDir: $(Build.SourcesDirectory)\PCbuild +? EXTERNAL_DIR: $(Build.BinariesDirectory)\externals + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- script: PCbuild\build.bat -e $(buildOpt) +? displayName: 'Build CPython' + +- script: python.bat -m test.pythoninfo +? displayName: 'Display build info' + +- script: PCbuild\rt.bat -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 +? displayName: 'Tests' +? env: +? ? PREFIX: $(Py_OutDir)\$(outDirSuffix) diff --git a/.vsts/windows-pr.yml b/.vsts/windows-pr.yml new file mode 100644 index 000000000000..5ec4522796ce --- /dev/null +++ b/.vsts/windows-pr.yml @@ -0,0 +1,49 @@ +# Current docs for the syntax of this file are at: +#? https://github.com/Microsoft/vsts-agent/blob/master/docs/preview/yamlgettingstarted.md + +name: $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.rr) + +queue: +? name: Hosted VS2017 +? parallel: 2 +? matrix: +? ? amd64: +? ? ? buildOpt: -p x64 +? ? ? outDirSuffix: amd64 +? ? win32: +? ? ? buildOpt: +? ? ? outDirSuffix: win32 + +trigger: +? branches: +? ? include: +? ? - master +? ? - 3.7 +? ? - 3.6 +? paths: +? ? exclude: +? ? - Doc/* +? ? - Tools/* + +variables: +? # Relocate build outputs outside of source directory to make cleaning faster +? Py_IntDir: $(Build.BinariesDirectory)\obj +? # UNDONE: Do not build to a different directory because of broken tests +? Py_OutDir: $(Build.SourcesDirectory)\PCbuild +? EXTERNAL_DIR: $(Build.BinariesDirectory)\externals + +steps: +- checkout: self +? clean: true +? fetchDepth: 5 + +- script: PCbuild\build.bat -e $(buildOpt) +? displayName: 'Build CPython' + +- script: python.bat -m test.pythoninfo +? displayName: 'Display build info' + +- script: PCbuild\rt.bat -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 +? displayName: 'Tests' +? env: +? ? PREFIX: $(Py_OutDir)\$(outDirSuffix) diff --git a/Doc/make.bat b/Doc/make.bat index 6cb315fda405..c69cfae31941 100644 --- a/Doc/make.bat +++ b/Doc/make.bat @@ -5,18 +5,21 @@ pushd %~dp0 ?set this=%~n0 -call ..\PCBuild\find_python.bat %PYTHON% -if not defined SPHINXBUILD if defined PYTHON ( +call ..\PCbuild\find_python.bat %PYTHON% + +if not defined PYTHON set PYTHON=py + +if not defined SPHINXBUILD ( ? ? ?%PYTHON% -c "import sphinx" > nul 2> nul ? ? ?if errorlevel 1 ( ? ? ? ? ?echo Installing sphinx with %PYTHON% -? ? ? ? %PYTHON% -m pip install sphinx +? ? ? ? %PYTHON% -m pip install sphinx python-docs-theme ? ? ? ? ?if errorlevel 1 exit /B ? ? ?) ? ? 
?set SPHINXBUILD=%PYTHON% -c "import sphinx, sys; sys.argv[0] = 'sphinx-build'; sphinx.main()" ?) -if not defined BLURB if defined PYTHON ( +if not defined BLURB ( ? ? ?%PYTHON% -c "import blurb" > nul 2> nul ? ? ?if errorlevel 1 ( ? ? ? ? ?echo Installing blurb with %PYTHON% @@ -26,7 +29,6 @@ if not defined BLURB if defined PYTHON ( ? ? ?set BLURB=%PYTHON% -m blurb ?) -if not defined PYTHON set PYTHON=py ?if not defined SPHINXBUILD set SPHINXBUILD=sphinx-build ?if not defined BLURB set BLURB=blurb diff --git a/Lib/tempfile.py b/Lib/tempfile.py index 38738082b996..2cb5434ba7b5 100644 --- a/Lib/tempfile.py +++ b/Lib/tempfile.py @@ -173,7 +173,9 @@ def _candidate_tempdir_list(): ? ? ?# Failing that, try OS-specific locations. ? ? ?if _os.name == 'nt': -? ? ? ? dirlist.extend([ r'c:\temp', r'c:\tmp', r'\temp', r'\tmp' ]) +? ? ? ? dirlist.extend([ _os.path.expanduser(r'~\AppData\Local\Temp'), +? ? ? ? ? ? ? ? ? ? ? ? ?_os.path.expandvars(r'%SYSTEMROOT%\Temp'), +? ? ? ? ? ? ? ? ? ? ? ? ?r'c:\temp', r'c:\tmp', r'\temp', r'\tmp' ]) ? ? ?else: ? ? ? ? ?dirlist.extend([ '/tmp', '/var/tmp', '/usr/tmp' ]) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 867124b63e24..e46394e89d1f 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -366,6 +366,20 @@ def _rmtree_inner(path): ? ? ? ? ? ? ? ? ? ? ?_force_run(fullname, os.unlink, fullname) ? ? ? ? ?_waitfor(_rmtree_inner, path, waitall=True) ? ? ? ? ?_waitfor(lambda p: _force_run(p, os.rmdir, p), path) + +? ? def _longpath(path): +? ? ? ? try: +? ? ? ? ? ? import ctypes +? ? ? ? except ImportError: +? ? ? ? ? ? # No ctypes means we can't expands paths. +? ? ? ? ? ? pass +? ? ? ? else: +? ? ? ? ? ? buffer = ctypes.create_unicode_buffer(len(path) * 2) +? ? ? ? ? ? length = ctypes.windll.kernel32.GetLongPathNameW(path, buffer, +? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ?len(buffer)) +? ? ? ? ? ? if length: +? ? ? ? ? ? ? ? return buffer[:length] +? ? ? ? return path ?else: ? ? ?_unlink = os.unlink ? ? ?_rmdir = os.rmdir @@ -392,6 +406,9 @@ def _rmtree_inner(path): ? ? ? ? ?_rmtree_inner(path) ? ? ? ? ?os.rmdir(path) +? ? def _longpath(path): +? ? ? ? return path + ?def unlink(filename): ? ? ?try: ? ? ? ? ?_unlink(filename) @@ -2333,13 +2350,15 @@ def can_xattr(): ? ? ?if not hasattr(os, "setxattr"): ? ? ? ? ?can = False ? ? ?else: -? ? ? ? tmp_fp, tmp_name = tempfile.mkstemp() +? ? ? ? tmp_dir = tempfile.mkdtemp() +? ? ? ? tmp_fp, tmp_name = tempfile.mkstemp(dir=tmp_dir) ? ? ? ? ?try: ? ? ? ? ? ? ?with open(TESTFN, "wb") as fp: ? ? ? ? ? ? ? ? ?try: ? ? ? ? ? ? ? ? ? ? ?# TESTFN & tempfile may use different file systems with ? ? ? ? ? ? ? ? ? ? ?# different capabilities ? ? ? ? ? ? ? ? ? ? ?os.setxattr(tmp_fp, b"user.test", b"") +? ? ? ? ? ? ? ? ? ? os.setxattr(tmp_name, b"trusted.foo", b"42") ? ? ? ? ? ? ? ? ? ? ?os.setxattr(fp.fileno(), b"user.test", b"") ? ? ? ? ? ? ? ? ? ? ?# Kernels < 2.6.39 don't respect setxattr flags. ? ? ? ? ? ? ? ? ? ? ?kernel_version = platform.release() @@ -2350,6 +2369,7 @@ def can_xattr(): ? ? ? ? ?finally: ? ? ? ? ? ? ?unlink(TESTFN) ? ? ? ? ? ? ?unlink(tmp_name) +? ? ? ? ? ? rmdir(tmp_dir) ? ? ?_can_xattr = can ? ? ?return can diff --git a/Lib/test/test_asyncio/test_base_events.py b/Lib/test/test_asyncio/test_base_events.py index 830f0d84a9d4..42c0707e8f21 100644 --- a/Lib/test/test_asyncio/test_base_events.py +++ b/Lib/test/test_asyncio/test_base_events.py @@ -1750,5 +1750,6 @@ def runner(loop): ? ? ? ? ? ? 
?outer_loop.close() + ?if __name__ == '__main__': ? ? ?unittest.main() diff --git a/Lib/test/test_bdb.py b/Lib/test/test_bdb.py index abefe6c4e57a..a36667869718 100644 --- a/Lib/test/test_bdb.py +++ b/Lib/test/test_bdb.py @@ -417,15 +417,17 @@ def __init__(self, test_case, skip=None): ? ? ? ? ?self.dry_run = test_case.dry_run ? ? ? ? ?self.tracer = Tracer(test_case.expect_set, skip=skip, ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? dry_run=self.dry_run, test_case=test_case.id()) +? ? ? ? self._original_tracer = None ? ? ?def __enter__(self): ? ? ? ? ?# test_pdb does not reset Breakpoint class attributes on exit :-( ? ? ? ? ?reset_Breakpoint() +? ? ? ? self._original_tracer = sys.gettrace() ? ? ? ? ?return self.tracer ? ? ?def __exit__(self, type_=None, value=None, traceback=None): ? ? ? ? ?reset_Breakpoint() -? ? ? ? sys.settrace(None) +? ? ? ? sys.settrace(self._original_tracer) ? ? ? ? ?not_empty = '' ? ? ? ? ?if self.tracer.set_list: diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py index db53a8f202dc..bf9467e96e09 100644 --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -1531,7 +1531,7 @@ def test_resolve_common(self): ? ? ? ? ? ? ?# resolves to 'dirB/..' first before resolving to parent of dirB. ? ? ? ? ? ? ?self._check_resolve_relative(p, P(BASE, 'foo', 'in', 'spam'), False) ? ? ? ? ?# Now create absolute symlinks -? ? ? ? d = tempfile.mkdtemp(suffix='-dirD') +? ? ? ? d = support._longpath(tempfile.mkdtemp(suffix='-dirD')) ? ? ? ? ?self.addCleanup(support.rmtree, d) ? ? ? ? ?os.symlink(os.path.join(d), join('dirA', 'linkX')) ? ? ? ? ?os.symlink(join('dirB'), os.path.join(d, 'linkY')) diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py index ca9bc6217509..234c855545c2 100644 --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -217,11 +217,12 @@ def start(self): ? ? ?def run(self): ? ? ? ? ?self.active = True ? ? ? ? ?self.__flag.set() -? ? ? ? while self.active and asyncore.socket_map: -? ? ? ? ? ? self.active_lock.acquire() -? ? ? ? ? ? asyncore.loop(timeout=0.1, count=1) -? ? ? ? ? ? self.active_lock.release() -? ? ? ? asyncore.close_all(ignore_all=True) +? ? ? ? try: +? ? ? ? ? ? while self.active and asyncore.socket_map: +? ? ? ? ? ? ? ? with self.active_lock: +? ? ? ? ? ? ? ? ? ? asyncore.loop(timeout=0.1, count=1) +? ? ? ? finally: +? ? ? ? ? ? asyncore.close_all(ignore_all=True) ? ? ?def stop(self): ? ? ? ? ?assert self.active diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py index 852b2feb45fd..14ce91f3768c 100644 --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -450,7 +450,14 @@ def test_above_fd_setsize(self): ? ? ? ? ? ? ? ? ? ? ?self.skipTest("FD limit reached") ? ? ? ? ? ? ? ? ?raise -? ? ? ? self.assertEqual(NUM_FDS // 2, len(s.select())) +? ? ? ? try: +? ? ? ? ? ? fds = s.select() +? ? ? ? except OSError as e: +? ? ? ? ? ? if e.errno == errno.EINVAL and sys.platform == 'darwin': +? ? ? ? ? ? ? ? # unexplainable errors on macOS don't need to fail the test +? ? ? ? ? ? ? ? self.skipTest("Invalid argument error calling poll()") +? ? ? ? ? ? raise +? ? ? ? 
self.assertEqual(NUM_FDS // 2, len(fds)) ?class DefaultSelectorTestCase(BaseSelectorTestCase): diff --git a/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst b/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst new file mode 100644 index 000000000000..f44862f0c454 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2018-05-15-12-44-50.bpo-33522.mJoNcA.rst @@ -0,0 +1,2 @@ +Enable CI builds on Visual Studio Team Services at +https://python.visualstudio.com/cpython diff --git a/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst b/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst new file mode 100644 index 000000000000..65585c152987 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-05-16-17-05-48.bpo-33548.xWslmx.rst @@ -0,0 +1 @@ +tempfile._candidate_tempdir_list should consider common TEMP locations diff --git a/PCbuild/rt.bat b/PCbuild/rt.bat index 808102f826d3..212befc95b06 100644 --- a/PCbuild/rt.bat +++ b/PCbuild/rt.bat @@ -7,7 +7,7 @@ rem -q? ?"quick" -- normally the tests are run twice, the first time ?rem? ? ? after deleting all the .pyc files reachable from Lib/. ?rem? ? ? -q runs the tests just once, and without deleting .pyc files. ?rem -x64 Run the 64-bit build of python (or python_d if -d was specified) -rem? ? ? from the 'amd64' dir instead of the 32-bit build in this dir. +rem? ? ? When omitted, uses %PREFIX% if set or the 32-bit build ?rem All leading instances of these switches are shifted off, and ?rem whatever remains (up to 9 arguments) is passed to regrtest.py. ?rem For example, @@ -28,28 +28,29 @@ rem? ? ?rt -u "network,largefile" ?setlocal ?set pcbuild=%~dp0 -set prefix=%pcbuild%win32\ ?set suffix= ?set qmode= ?set dashO= ?set regrtestargs= +set exe= ?:CheckOpts ?if "%1"=="-O" (set dashO=-O)? ? ?& shift & goto CheckOpts ?if "%1"=="-q" (set qmode=yes)? ? & shift & goto CheckOpts ?if "%1"=="-d" (set suffix=_d)? ? & shift & goto CheckOpts -if "%1"=="-x64" (set prefix=%pcbuild%amd64\) & shift & goto CheckOpts +if "%1"=="-x64" (set prefix=%pcbuild%amd64) & shift & goto CheckOpts ?if NOT "%1"=="" (set regrtestargs=%regrtestargs% %1) & shift & goto CheckOpts -set exe=%prefix%python%suffix%.exe -set cmd="%exe%" %dashO% -Wd -E -bb -m test %regrtestargs% +if not defined prefix set prefix=%pcbuild%win32 +set exe=%prefix%\python%suffix%.exe +set cmd="%exe%" %dashO% -u -Wd -E -bb -m test %regrtestargs% ?if defined qmode goto Qmode ?echo Deleting .pyc files ... ?"%exe%" "%pcbuild%rmpyc.py" ?echo Cleaning _pth files ... -if exist %prefix%*._pth del %prefix%*._pth +if exist %prefix%\*._pth del %prefix%\*._pth ?echo on ?%cmd% diff --git a/Tools/ssl/multissltests.py b/Tools/ssl/multissltests.py index ba4529ae0611..f3241cd6071c 100755 --- a/Tools/ssl/multissltests.py +++ b/Tools/ssl/multissltests.py @@ -123,6 +123,11 @@ ? ? ?action='store_true', ? ? ?help="Don't run tests, only compile _ssl.c and _hashopenssl.c." ?) +parser.add_argument( +? ? '--system', +? ? default='', +? ? help="Override the automatic system type detection." +) ?class AbstractBuilder(object): @@ -150,6 +155,7 @@ def __init__(self, version, compile_args=(), ? ? ? ? ?# build directory (removed after install) ? ? ? ? ?self.build_dir = os.path.join( ? ? ? ? ? ? ?self.src_dir, self.build_template.format(version)) +? ? ? ? self.system = args.system ? ? ?def __str__(self): ? ? ? ? ?return "<{0.__class__.__name__} for {0.version}>".format(self) @@ -254,9 +260,13 @@ def _build_src(self): ? ? ? ? ?cwd = self.build_dir ? ? ? ? 
         cmd = ["./config", "shared", "--prefix={}".format(self.install_dir)]
         cmd.extend(self.compile_args)
-        self._subprocess_call(cmd, cwd=cwd)
+        env = None
+        if self.system:
+            env = os.environ.copy()
+            env['SYSTEM'] = self.system
+        self._subprocess_call(cmd, cwd=cwd, env=env)
         # Old OpenSSL versions do not support parallel builds.
-        self._subprocess_call(["make", "-j1"], cwd=cwd)
+        self._subprocess_call(["make", "-j1"], cwd=cwd, env=env)

     def _make_install(self, remove=True):
         self._subprocess_call(["make", "-j1", "install"], cwd=self.build_dir)
_______________________________________________
Python-checkins mailing list
Python-checkins at python.org
https://mail.python.org/mailman/listinfo/python-checkins
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From storchaka at gmail.com  Fri May 18 03:55:14 2018
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Fri, 18 May 2018 10:55:14 +0300
Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: 
References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com>
Message-ID: <9a0fdca2-b9a7-aa7d-4007-e30e677940be@gmail.com>

17.05.18 21:39, Brett Cannon wrote:
> Maybe we should start thinking about flagging PRs or issues as needing
> a What's New entry to help track when they need one, or always expect
> it in a PR and ignore that requirement when a 'skip whats new' label
> is applied. That would at least make it easier to keep track of what
> needs to be done.

The requirement of flagging PRs or issues as needing a What's New entry
doesn't differ in principle from the requirement of creating a What's New
entry for these changes. The latter is good, and I always try to create a
What's New entry for a significant enhancement or a potentially breaking
change. Even so, I am sometimes unsure and don't document some important
changes (like in issue30399); those need a look from another pair of eyes.

As for requiring a What's New entry by default and introducing a 'skip
whats new' label, I suppose this would add a lot of nuisance. Most PRs
(except docs and tests changes) need a news entry, but most PRs don't
need a What's New entry because they are bug fixes. Therefore a 'skip
whats new' label would be needed far more often than the 'skip news' or
'skip issue' labels.

A thing that can help is a tool that makes a structural diff between NEWS
files for different versions and between different branches. It would
filter out bugfix changes. A simple 'diff' is not well suited because
entries can be in a different order, news entries are now scattered
between several files, and entries for a previous version sometimes have
to be looked for in different files, and sometimes on a different branch.
The text of entries in different versions can also differ, because the
same issue can change the behavior on master and backport only part of
the changes as a bugfix.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
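(A rough sketch of the structural NEWS diff idea above, assuming the current
Misc/NEWS.d layout where each blurb entry lives in a file whose name contains
its bpo number, e.g. 2018-05-15-12-44-50.bpo-33522.mJoNcA.rst; the two
checkouts to compare are simply passed as directories, and entries are
matched by bpo number rather than by position:)

# structural_news_diff.py -- compare the Misc/NEWS.d trees of two checkouts.
# Sketch only; it matches entries by the bpo number parsed from the file name.
import re
import sys
from pathlib import Path

BPO_RE = re.compile(r"\.bpo-(\d+)\.")

def collect(news_d):
    """Map bpo number -> entry text for every blurb file under a NEWS.d tree."""
    entries = {}
    for path in Path(news_d).rglob("*.rst"):
        match = BPO_RE.search(path.name)
        if match:
            entries[int(match.group(1))] = path.read_text(encoding="utf-8")
    return entries

def main(old_tree, new_tree):
    old = collect(Path(old_tree) / "Misc" / "NEWS.d")
    new = collect(Path(new_tree) / "Misc" / "NEWS.d")
    for bpo in sorted(new.keys() - old.keys()):
        print("only in the new tree: bpo-%d" % bpo)
    for bpo in sorted(old.keys() & new.keys()):
        if old[bpo] != new[bpo]:
            print("entry text differs:   bpo-%d" % bpo)

if __name__ == "__main__":
    main(*sys.argv[1:3])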
From greg.ewing at canterbury.ac.nz  Fri May 18 04:09:56 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 18 May 2018 20:09:56 +1200
Subject: [Python-Dev] Why aren't escape sequences in literal strings
 handled by the tokenizer?
In-Reply-To: <5701cb51-d233-dbf3-cedd-8e3f271f366b@trueblade.com>
References: <76e7da75-b06b-d20f-e047-913c9cddee71@hastings.org>
 <5701cb51-d233-dbf3-cedd-8e3f271f366b@trueblade.com>
Message-ID: <5AFE8A54.304@canterbury.ac.nz>

Eric V. Smith wrote:
> I assume the intent is to not throw away any information in the lexer,
> and give the parser full access to the original string. But that's just
> a guess.

More likely it's because the lexer is fairly dumb and can basically just
recognise regular expressions.

-- 
Greg

From steve at pearwood.info  Fri May 18 07:46:11 2018
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 18 May 2018 21:46:11 +1000
Subject: [Python-Dev] Normalisation of unicode and keywords
Message-ID: <20180518114611.GR12683@ando.pearwood.info>

Stephan Houben noticed that Python apparently allows identifiers to be
keywords, if you use Unicode "mathematical bold" letters. His explanation
is that the identifier is normalised, but not until after keywords are
checked for. So this works:

class Spam:
    locals()['if'] = 1

Spam.𝐢𝐟   # U+1D422 U+1D41F
# returns 1

Of course Spam.if fails with SyntaxError.

Should this work? Is this a bug, a feature, or an accident of
implementation we can ignore?

-- 
Steve
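(For reference, the normalisation in question is the NFKC normalisation that
PEP 3131 applies to identifiers; a minimal sketch, using only the stdlib, of
the asymmetry Steven describes:)

import unicodedata
from keyword import iskeyword

bold_if = "\U0001d422\U0001d41f"             # U+1D422 U+1D41F, i.e. bold "if"
plain = unicodedata.normalize("NFKC", bold_if)

print(plain)                                 # -> if
print(iskeyword(bold_if), iskeyword(plain))  # -> False True

# The keyword check sees the unnormalised text, so Spam.𝐢𝐟 tokenises as an
# ordinary attribute access; the name is then normalised to 'if' and finds
# the value the class body stashed via locals()['if'].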
From vano at mail.mipt.ru  Fri May 18 09:20:19 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 18 May 2018 16:20:19 +0300
Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: <9a0fdca2-b9a7-aa7d-4007-e30e677940be@gmail.com>
References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com>
 <9a0fdca2-b9a7-aa7d-4007-e30e677940be@gmail.com>
Message-ID: <7f10ed78-c3d0-273c-628e-3a875131f50b@mail.mipt.ru>

On 18.05.2018 10:55, Serhiy Storchaka wrote:
> 17.05.18 21:39, Brett Cannon wrote:
>> Maybe we should start thinking about flagging PRs or issues as
>> needing a What's New entry to help track when they need one, or
>> always expect it in a PR and ignore that requirement when a 'skip
>> whats new' label is applied. That would at least make it easier to
>> keep track of what needs to be done.
>
> The requirement of flagging PRs or issues as needing a What's New
> entry doesn't differ in principle from the requirement of creating a
> What's New entry for these changes. The latter is good, and I always
> try to create a What's New entry for a significant enhancement or a
> potentially breaking change. Even so, I am sometimes unsure and don't
> document some important changes (like in issue30399); those need a
> look from another pair of eyes.
>
> As for requiring a What's New entry by default and introducing a 'skip
> whats new' label, I suppose this would add a lot of nuisance. Most PRs
> (except docs and tests changes) need a news entry, but most PRs don't
> need a What's New entry because they are bug fixes. Therefore a 'skip
> whats new' label would be needed far more often than the 'skip news'
> or 'skip issue' labels.
>
Since Python uses semantic versioning (https://semver.org), the criterion
for "what's new-worthy" changes is simple: they are _public interface
changes_ (which include visible changes to documented behavior).
(I maintain that changes to behavior that is not documented -- incl.
issue30399 -- are _not_ public interface changes, and whoever relies on
them does so at their own risk.)

Reading previous What's New documents, I see that they are structured
like this:

* Entries for major changes:
    * General design decisions
    * Changes that fall into a category
      (more recent What's New documents include about a dozen categories)
* "Other": the list of the rest

So, it makes sense to mark work items as "interface change" or something
similar, and optionally with a category if that category is established.
You can't make a mistake here, because a public interface change requires
an edit to the related documentation.

> A thing that can help is a tool that makes a structural diff between
> NEWS files for different versions and between different branches. It
> would filter out bugfix changes. A simple 'diff' is not well suited
> because entries can be in a different order, news entries are now
> scattered between several files, and entries for a previous version
> sometimes have to be looked for in different files, and sometimes on
> a different branch. The text of entries in different versions can
> also differ, because the same issue can change the behavior on master
> and backport only part of the changes as a bugfix.

Not all bugs apply to all, or multiple, branches, so that wouldn't filter
them out reliably.

>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru

-- 
Regards,
Ivan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From vano at mail.mipt.ru  Fri May 18 09:38:59 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 18 May 2018 16:38:59 +0300
Subject: [Python-Dev] Normalisation of unicode and keywords
In-Reply-To: <20180518114611.GR12683@ando.pearwood.info>
References: <20180518114611.GR12683@ando.pearwood.info>
Message-ID: <9b6287a6-db89-2a2f-8273-8c562a3f3813@mail.mipt.ru>

On 18.05.2018 14:46, Steven D'Aprano wrote:
> Stephan Houben noticed that Python apparently allows identifiers to be
> keywords, if you use Unicode "mathematical bold" letters. His
> explanation is that the identifier is normalised, but not until after
> keywords are checked for. So this works:
>
> class Spam:
>     locals()['if'] = 1
>
> Spam.𝐢𝐟   # U+1D422 U+1D41F
> # returns 1
>
> Of course Spam.if fails with SyntaxError.
>
> Should this work? Is this a bug, a feature, or an accident of
> implementation we can ignore?

Voting for bug:
Either those identifiers should be considered equal, or they shouldn't.
They can't be considered "partially" equal.

-- 
Regards,
Ivan

From Richard at Damon-Family.org  Fri May 18 09:55:33 2018
From: Richard at Damon-Family.org (Richard Damon)
Date: Fri, 18 May 2018 09:55:33 -0400
Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: <7f10ed78-c3d0-273c-628e-3a875131f50b@mail.mipt.ru>
References: <468a581b-2aa2-7fea-7a51-cd566f5c5a68@gmail.com>
 <9a0fdca2-b9a7-aa7d-4007-e30e677940be@gmail.com>
 <7f10ed78-c3d0-273c-628e-3a875131f50b@mail.mipt.ru>
Message-ID: 

On 5/18/18 9:20 AM, Ivan Pozdeev via Python-Dev wrote:
> Since Python uses semantic versioning (https://semver.org), the
> criterion for "what's new-worthy" changes is simple: they are _public
> interface changes_ (which include visible changes to documented
> behavior).
> (I maintain that changes to behavior that is not documented -- incl.
> issue30399 -- are _not_ public interface changes, and whoever relies
> on them does so at their own risk.)
>
Python does NOT use semantic versioning, as features are allowed to be
obsoleted and removed without a major version number change.
Also, the addition of a new keyword which breaks old code would not be
allowed under semantic versioning.

The basic rule of semantic versioning is that ANY program that uses
documented features of version a.b.c will work on any version a.d.e where
(d > b) or (d = b and e > c). If Python did use semantic versioning, there
would be no need to keep updating the older minor versions: once 3.7.0 was
out, there would be no need to keep the 3.6.x, 3.5.x, 3.4.x branches, etc.,
as any program written for those older versions would just work with the
newer version. The need for those branches is proof that Python does not
use semantic versioning.

If you wanted to map Python versions to a semantic versioning concept, the
first two numbers of the version would correspond to what semantic
versioning calls the 'major revision', which is what is allowed to break
backwards compatibility of the API, with the first digit being major
changes and the second being minor but not fully backwards-compatible
changes.

-- 
Richard Damon

From greg at krypto.org  Fri May 18 11:55:29 2018
From: greg at krypto.org (Gregory P. Smith)
Date: Fri, 18 May 2018 08:55:29 -0700
Subject: [Python-Dev] please help triage VSTS failures
Message-ID: 

These both look like VSTS infrastructure falling over on PRs:

https://python.visualstudio.com/cpython/_build?buildId=522
https://python.visualstudio.com/cpython/_build?buildId=523

I don't see anywhere that gives information about the failures. (*)

These CI failures on different platforms are both for the same
documentation-only change, on different branches. The same change passed
on the other branches and platforms.

-gps

(*) I refuse to "Download logs as a zip file". I'm in a web browser; if the
information I might need is potentially buried somewhere in a zip file of
logs, that is a waste of my time and I'm not going to do it. The web UI
*needs* to find and display the relevant failure info from any logs
directly.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From status at bugs.python.org  Fri May 18 12:10:04 2018
From: status at bugs.python.org (Python tracker)
Date: Fri, 18 May 2018 18:10:04 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20180518161004.A53AA11A861@psf.upfronthosting.co.za>


ACTIVITY SUMMARY (2018-05-11 - 2018-05-18)
Python tracker at https://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    6686 (+42)
  closed 38637 (+66)
  total  45323 (+108)

Open issues with patches: 2630

Issues opened (72)
==================

#33466: Distutils does not support the compilation of Objective-C++ (?
https://bugs.python.org/issue33466 opened by fish2000 #33467: Python 3.7: profile-opt build errors because a test seems to h https://bugs.python.org/issue33467 opened by Rahul Ravindran #33468: Add try-finally contextlib.contextmanager example https://bugs.python.org/issue33468 opened by ncoghlan #33469: RuntimeError after closing loop that used run_in_executor https://bugs.python.org/issue33469 opened by hniksic #33471: string format with 'n' failling with french locales https://bugs.python.org/issue33471 opened by David Vasseur #33473: build system incorrectly handles CC, CFLAGS, LDFLAGS, and rela https://bugs.python.org/issue33473 opened by eitan.adler #33474: Support immutability per-field in dataclasses https://bugs.python.org/issue33474 opened by Daniel Lindeman #33475: Fix converting AST expression to string and optimize parenthes https://bugs.python.org/issue33475 opened by serhiy.storchaka #33476: String index out of range in get_group(), email/_header_value_ https://bugs.python.org/issue33476 opened by Cacadril #33477: Document that compile(code, 'exec') has different behavior in https://bugs.python.org/issue33477 opened by mbussonn #33479: Document tkinter and threads https://bugs.python.org/issue33479 opened by terry.reedy #33481: configparser.write() does not save comments. https://bugs.python.org/issue33481 opened by pebaudhi #33482: codecs.StreamRecoder.writelines is broken https://bugs.python.org/issue33482 opened by Jelle Zijlstra #33483: build system requires explicit compiler, but should discover i https://bugs.python.org/issue33483 opened by eitan.adler #33484: build system runs when it may merely link https://bugs.python.org/issue33484 opened by eitan.adler #33485: autoconf target does not behave correctly https://bugs.python.org/issue33485 opened by eitan.adler #33486: regen autotools related files https://bugs.python.org/issue33486 opened by eitan.adler #33487: BZ2File(buffering=None) does not emit deprecation warning, dep https://bugs.python.org/issue33487 opened by mbussonn #33489: Newer externals for windows do not always trigger rebuild https://bugs.python.org/issue33489 opened by terry.reedy #33490: pthread auto-detection can use AX_PTHREAD https://bugs.python.org/issue33490 opened by eitan.adler #33491: mistype of method's name https://bugs.python.org/issue33491 opened by Ivan Gushchin #33492: Updating the Evaluation order section to cover *expression in https://bugs.python.org/issue33492 opened by mjpieters #33494: random.choices ought to check that cumulative weights are in a https://bugs.python.org/issue33494 opened by steven.daprano #33498: pathlib.Path wants an rmtree method https://bugs.python.org/issue33498 opened by Aaron Hall #33499: Environment variable to set alternate location for pycache tre https://bugs.python.org/issue33499 opened by carljm #33500: Python TKinter for Mac on latest 2.7.15 still extremely slow v https://bugs.python.org/issue33500 opened by Michael Romero #33501: split existing optimization levels into granular options https://bugs.python.org/issue33501 opened by carljm #33504: configparser should use dict instead of OrderedDict in 3.7+ https://bugs.python.org/issue33504 opened by jreese #33505: Optimize asyncio.ensure_future by reordering if conditions https://bugs.python.org/issue33505 opened by jimmylai #33507: Improving the html rendered by cgitb.html https://bugs.python.org/issue33507 opened by sblondon #33511: Update config.sub https://bugs.python.org/issue33511 opened by eitan.adler #33514: async and await as keywords not 
mentioned in What???s New In P https://bugs.python.org/issue33514 opened by hroncok #33515: subprocess.Popen on a Windows batch file always acts as if she https://bugs.python.org/issue33515 opened by abigail #33516: unittest.mock: Add __round__ to supported magicmock methods https://bugs.python.org/issue33516 opened by mjpieters #33518: Add PEP to glossary https://bugs.python.org/issue33518 opened by adelfino #33519: Should MutableSequence provide .copy()? https://bugs.python.org/issue33519 opened by Jelle Zijlstra #33521: Add 1.32x faster C implementation of asyncio.isfuture(). https://bugs.python.org/issue33521 opened by jimmylai #33523: loop.run_until_complete re-entrancy to support more complicate https://bugs.python.org/issue33523 opened by fried #33524: non-ascii characters in headers causes TypeError on email.poli https://bugs.python.org/issue33524 opened by radical164 #33525: os.spawnvpe() returns error code 127 instead of raising when e https://bugs.python.org/issue33525 opened by Mark.Shannon #33527: Invalid child function scope https://bugs.python.org/issue33527 opened by gasokiw #33528: os.getentropy support https://bugs.python.org/issue33528 opened by David Carlier #33529: Infinite loop on folding email if headers has no spaces https://bugs.python.org/issue33529 opened by radical164 #33530: Implement Happy Eyeball in asyncio https://bugs.python.org/issue33530 opened by twisteroid ambassador #33531: test_asyncio: test_subprocess test_stdin_broken_pipe() failure https://bugs.python.org/issue33531 opened by vstinner #33532: test_multiprocessing_forkserver: TestIgnoreEINTR.test_ignore() https://bugs.python.org/issue33532 opened by vstinner #33533: Provide an async-generator version of as_completed https://bugs.python.org/issue33533 opened by hniksic #33535: Consider quoting all values in Morsel objects https://bugs.python.org/issue33535 opened by berker.peksag #33537: Help on importlib.resources outputs the builtin open descripti https://bugs.python.org/issue33537 opened by serhiy.storchaka #33540: socketserver: Add an opt-in option to get Python 3.6 behaviour https://bugs.python.org/issue33540 opened by vstinner #33541: Remove private and apparently unused __pad function https://bugs.python.org/issue33541 opened by mariocj89 #33542: _ipconfig_getnode incorrectly selects a DUID as a MAC address https://bugs.python.org/issue33542 opened by zeffron #33544: Asyncio Event.wait() is a hold over from before awaitable, and https://bugs.python.org/issue33544 opened by fried #33545: Docs for uuid don't mention that uuid1 can repeat in some circ https://bugs.python.org/issue33545 opened by merelymoray #33546: asyncio.Condition should become awaitable in 3.9 https://bugs.python.org/issue33546 opened by fried #33550: Sigpipe handling issue should be documented https://bugs.python.org/issue33550 opened by splbio #33552: f-strings and string annotations https://bugs.python.org/issue33552 opened by serhiy.storchaka #33553: Documentation improvement proposal for multiprocessing https://bugs.python.org/issue33553 opened by Derek Kim #33556: leftover thread crumb in threading.ident docstring https://bugs.python.org/issue33556 opened by skip.montanaro #33557: Windows multiprocessing doesn't propagate tabcheck to children https://bugs.python.org/issue33557 opened by jwilk #33558: Python has no icon in taskbar and in start screen https://bugs.python.org/issue33558 opened by e_l_e_c_t_r_i_f_y #33561: Add .tostring() method to xml.etree.ElementTree.Element https://bugs.python.org/issue33561 opened by 
Stevoisiak #33562: Check that the global settings for asyncio are not changed by https://bugs.python.org/issue33562 opened by brett.cannon #33563: fileinput input's and Fileinput's bufsize=0 do not remit depre https://bugs.python.org/issue33563 opened by mbussonn #33565: strange tracemalloc results https://bugs.python.org/issue33565 opened by thehesiod #33566: re.findall() dead locked whent the expected ending char not oc https://bugs.python.org/issue33566 opened by mamamiaibm #33567: Explicitly mention bytes and other buffers in the documentatio https://bugs.python.org/issue33567 opened by mjpieters #33568: Inconsistent behavior of non-ascii handling in EmailPolicy.fol https://bugs.python.org/issue33568 opened by licht-t #33569: dataclasses InitVar does not maintain any type info https://bugs.python.org/issue33569 opened by reinhrst #33570: OpenSSL 1.1.1 / TLS 1.3 cipher suite changes https://bugs.python.org/issue33570 opened by christian.heimes #33571: Add triple quotes to list of delimiters that trigger '...' pro https://bugs.python.org/issue33571 opened by adelfino #33572: False/True as dictionary keys treated as integers https://bugs.python.org/issue33572 opened by Janusz Harkot Most recent 15 issues with no replies (15) ========================================== #33571: Add triple quotes to list of delimiters that trigger '...' pro https://bugs.python.org/issue33571 #33570: OpenSSL 1.1.1 / TLS 1.3 cipher suite changes https://bugs.python.org/issue33570 #33569: dataclasses InitVar does not maintain any type info https://bugs.python.org/issue33569 #33568: Inconsistent behavior of non-ascii handling in EmailPolicy.fol https://bugs.python.org/issue33568 #33567: Explicitly mention bytes and other buffers in the documentatio https://bugs.python.org/issue33567 #33563: fileinput input's and Fileinput's bufsize=0 do not remit depre https://bugs.python.org/issue33563 #33562: Check that the global settings for asyncio are not changed by https://bugs.python.org/issue33562 #33557: Windows multiprocessing doesn't propagate tabcheck to children https://bugs.python.org/issue33557 #33552: f-strings and string annotations https://bugs.python.org/issue33552 #33550: Sigpipe handling issue should be documented https://bugs.python.org/issue33550 #33546: asyncio.Condition should become awaitable in 3.9 https://bugs.python.org/issue33546 #33545: Docs for uuid don't mention that uuid1 can repeat in some circ https://bugs.python.org/issue33545 #33542: _ipconfig_getnode incorrectly selects a DUID as a MAC address https://bugs.python.org/issue33542 #33541: Remove private and apparently unused __pad function https://bugs.python.org/issue33541 #33535: Consider quoting all values in Morsel objects https://bugs.python.org/issue33535 Most recent 15 issues waiting for review (15) ============================================= #33571: Add triple quotes to list of delimiters that trigger '...' 
pro https://bugs.python.org/issue33571 #33563: fileinput input's and Fileinput's bufsize=0 do not remit depre https://bugs.python.org/issue33563 #33562: Check that the global settings for asyncio are not changed by https://bugs.python.org/issue33562 #33556: leftover thread crumb in threading.ident docstring https://bugs.python.org/issue33556 #33544: Asyncio Event.wait() is a hold over from before awaitable, and https://bugs.python.org/issue33544 #33542: _ipconfig_getnode incorrectly selects a DUID as a MAC address https://bugs.python.org/issue33542 #33541: Remove private and apparently unused __pad function https://bugs.python.org/issue33541 #33540: socketserver: Add an opt-in option to get Python 3.6 behaviour https://bugs.python.org/issue33540 #33537: Help on importlib.resources outputs the builtin open descripti https://bugs.python.org/issue33537 #33525: os.spawnvpe() returns error code 127 instead of raising when e https://bugs.python.org/issue33525 #33524: non-ascii characters in headers causes TypeError on email.poli https://bugs.python.org/issue33524 #33523: loop.run_until_complete re-entrancy to support more complicate https://bugs.python.org/issue33523 #33521: Add 1.32x faster C implementation of asyncio.isfuture(). https://bugs.python.org/issue33521 #33519: Should MutableSequence provide .copy()? https://bugs.python.org/issue33519 #33518: Add PEP to glossary https://bugs.python.org/issue33518 Top 10 most discussed issues (10) ================================= #32769: Add 'annotations' to the glossary https://bugs.python.org/issue32769 18 msgs #33499: Environment variable to set alternate location for pycache tre https://bugs.python.org/issue33499 11 msgs #20104: expose posix_spawn(p) https://bugs.python.org/issue20104 9 msgs #32414: PyCapsule_Import fails when name is in the form 'package.modul https://bugs.python.org/issue32414 9 msgs #33494: random.choices ought to check that cumulative weights are in a https://bugs.python.org/issue33494 8 msgs #12486: tokenize module should have a unicode API https://bugs.python.org/issue12486 7 msgs #33435: incorrect detection of information of some distributions https://bugs.python.org/issue33435 7 msgs #33479: Document tkinter and threads https://bugs.python.org/issue33479 7 msgs #19251: bitwise ops for bytes of equal length https://bugs.python.org/issue19251 6 msgs #25478: Consider adding a normalize() method to collections.Counter() https://bugs.python.org/issue25478 6 msgs Issues closed (65) ================== #4934: tp_del and tp_version_tag undocumented https://bugs.python.org/issue4934 closed by pitrou #5286: urrlib2 digest authentication problems https://bugs.python.org/issue5286 closed by petr.viktorin #13631: readline fails to parse some forms of .editrc under editline ( https://bugs.python.org/issue13631 closed by ned.deily #21475: Support the Sitemap extension in robotparser https://bugs.python.org/issue21475 closed by ned.deily #22069: TextIOWrapper(newline="\n", line_buffering=True) mistakenly tr https://bugs.python.org/issue22069 closed by Mariatta #22552: ctypes.CDLL returns singleton objects, resulting in usage conf https://bugs.python.org/issue22552 closed by Ivan.Pozdeev #24318: Better documentaiton of profile-opt (and release builds in gen https://bugs.python.org/issue24318 closed by gregory.p.smith #26264: keyword module missing async and await keywords https://bugs.python.org/issue26264 closed by terry.reedy #28167: remove platform.linux_distribution() https://bugs.python.org/issue28167 closed by petr.viktorin #29706: 
IDLE needs syntax highlighting for async and await https://bugs.python.org/issue29706 closed by terry.reedy #31607: Add listsize in pdb.py https://bugs.python.org/issue31607 closed by matrixise #31645: openssl build fails in win32 if .pl extension is not associate https://bugs.python.org/issue31645 closed by Ivan.Pozdeev #31947: names=None case is not handled by EnumMeta._create_ method https://bugs.python.org/issue31947 closed by ethan.furman #32216: Document PEP 557 Data Classes (dataclasses module) https://bugs.python.org/issue32216 closed by eric.smith #32374: Document that m_traverse for multi-phase initialized modules c https://bugs.python.org/issue32374 closed by petr.viktorin #32384: Generator tests is broken in non-CPython implementation https://bugs.python.org/issue32384 closed by berker.peksag #32463: problems with shutil.py and os.get_terminal_size https://bugs.python.org/issue32463 closed by berker.peksag #32551: Zipfile & directory execution in 3.5.4 also adds the parent di https://bugs.python.org/issue32551 closed by petr.viktorin #32601: PosixPathTest.test_expanduser fails in NixOS build sandbox https://bugs.python.org/issue32601 closed by serhiy.storchaka #32861: urllib.robotparser: incomplete __str__ methods https://bugs.python.org/issue32861 closed by serhiy.storchaka #33198: Build on Linux with --enable-optimizations fails https://bugs.python.org/issue33198 closed by fschulze #33314: Bad rendering in the documentation for the os module https://bugs.python.org/issue33314 closed by gregory.p.smith #33341: python3 fails to build if directory or sysroot contains "*icc* https://bugs.python.org/issue33341 closed by terry.reedy #33358: [EASY] x86 Ubuntu Shared 3.x: test_embed.test_pre_initializati https://bugs.python.org/issue33358 closed by cheryl.sabella #33427: Dead link in "The Python Standard Library" page https://bugs.python.org/issue33427 closed by ned.deily #33443: Typo in Python/import.c https://bugs.python.org/issue33443 closed by brett.cannon #33453: from __future__ import annotations breaks dataclasses ClassVar https://bugs.python.org/issue33453 closed by eric.smith #33464: AIX and "specialized downloads" links https://bugs.python.org/issue33464 closed by Michael.Felt #33465: test_from_import_missing_attr_has_name_and_so_path fails when https://bugs.python.org/issue33465 closed by barry #33470: Changes from GH-1638 (GH-3575, bpo-28411) are not documented i https://bugs.python.org/issue33470 closed by vstinner #33472: build system incorrectly handles CC, CFLAGS, LDFLAGS, and rela https://bugs.python.org/issue33472 closed by eadler #33478: PEP 8 CapWords reference wrong? 
https://bugs.python.org/issue33478 closed by berker.peksag #33480: Improvement suggestions for urllib.parse.urlparser https://bugs.python.org/issue33480 closed by r.david.murray #33488: github pull request template does not satisfy markdownlint https://bugs.python.org/issue33488 closed by benjamin.peterson #33493: dataclasses: allow keyword-only arguments https://bugs.python.org/issue33493 closed by eric.smith #33495: dataclasses: repr of Field objects should use repr of each mem https://bugs.python.org/issue33495 closed by eric.smith #33496: Accept Pathlib paths for sqlite file https://bugs.python.org/issue33496 closed by devala #33497: cgi.parse_multipart does not have an associated "errors" param https://bugs.python.org/issue33497 closed by ned.deily #33502: dataclasses: repr of _DataclassParams objects should use repr https://bugs.python.org/issue33502 closed by eric.smith #33503: use pypi.org instead of pypi.python.org https://bugs.python.org/issue33503 closed by ned.deily #33506: [logging] template for filename timestamps https://bugs.python.org/issue33506 closed by vinay.sajip #33508: [logging] allow %p code to put PID into log filename https://bugs.python.org/issue33508 closed by vinay.sajip #33509: warnings.warn_explicit with module_globals=True raises a Syste https://bugs.python.org/issue33509 closed by ned.deily #33510: [logging] add JSON log formatter https://bugs.python.org/issue33510 closed by vinay.sajip #33512: use more standard approach for detecting long double in config https://bugs.python.org/issue33512 closed by benjamin.peterson #33513: incorrect detection of information of some distributions pytho https://bugs.python.org/issue33513 closed by serhiy.storchaka #33517: dataclasses: Add the field type to Field repr https://bugs.python.org/issue33517 closed by eric.smith #33520: ast.Tuple has wrong col_offset https://bugs.python.org/issue33520 closed by r.david.murray #33522: Enable CI builds on Visual Studio Team Services https://bugs.python.org/issue33522 closed by gregory.p.smith #33526: hashlib leak on import https://bugs.python.org/issue33526 closed by thehesiod #33534: dataclasses: unneeded test in _is_classvar https://bugs.python.org/issue33534 closed by eric.smith #33536: dataclasses.make_dataclass does not validate fields for being https://bugs.python.org/issue33536 closed by eric.smith #33538: Remove useless check in subprocess https://bugs.python.org/issue33538 closed by serhiy.storchaka #33539: Remove `init` flag from @dataclass? 
https://bugs.python.org/issue33539 closed by eric.smith #33543: `make html` broken https://bugs.python.org/issue33543 closed by ned.deily #33547: Relative imports do not replace local variables https://bugs.python.org/issue33547 closed by ncoghlan #33548: tempfile._candidate_tempdir_list should consider common TEMP l https://bugs.python.org/issue33548 closed by steve.dower #33549: xmlbuilder's `_AsyncDeprecatedProperty` make no sens now that https://bugs.python.org/issue33549 closed by serhiy.storchaka #33551: The string prefixes u and f can't used together https://bugs.python.org/issue33551 closed by serhiy.storchaka #33554: Optimize PyDictObject https://bugs.python.org/issue33554 closed by serhiy.storchaka #33555: No SyntaxError raised for `return` with argument inside genera https://bugs.python.org/issue33555 closed by serhiy.storchaka #33559: Exception's repr change not documented https://bugs.python.org/issue33559 closed by serhiy.storchaka #33560: tuple.index() could return a more explicit error message https://bugs.python.org/issue33560 closed by serhiy.storchaka #33564: IDLE: add 'async' to codecontext block openers https://bugs.python.org/issue33564 closed by terry.reedy #991266: Cookie.py does not correctly quote Morsels https://bugs.python.org/issue991266 closed by berker.peksag From zachary.ware+pydev at gmail.com Fri May 18 12:12:03 2018 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Fri, 18 May 2018 11:12:03 -0500 Subject: [Python-Dev] please help triage VSTS failures In-Reply-To: References: Message-ID: On Fri, May 18, 2018 at 10:55 AM, Gregory P. Smith wrote: > These both look like VSTS infrastructure falling over on PRs: > > https://python.visualstudio.com/cpython/_build?buildId=522 > > https://python.visualstudio.com/cpython/_build?buildId=523 > > I don't see anywhere that gives information about the failures. (*) Somewhat non-intuitively, logs are available by clicking something in the tree in the left column (in particular, the inner-most thing with a red X). Do be sure to choose "Logs" in the right pane rather than "Tests"; "Tests" appears to be a separate feature that we don't use. > These CI failures on different platforms are both for the same change, on > different branches, for a documentation only change. That passed on the > other branches and platforms for the same change. The Windows failure appears to be the test_asyncio instability that has been popping up everywhere the past couple of days. The Linux failure looks like a mis-use of Tools/ssl/multissltests.py in the setup, but I thought I'd seen a checkin go by that removed that. -- Zach From steve at holdenweb.com Fri May 18 12:19:41 2018 From: steve at holdenweb.com (Steve Holden) Date: Fri, 18 May 2018 17:19:41 +0100 Subject: [Python-Dev] Normalisation of unicode and keywords In-Reply-To: <9b6287a6-db89-2a2f-8273-8c562a3f3813@mail.mipt.ru> References: <20180518114611.GR12683@ando.pearwood.info> <9b6287a6-db89-2a2f-8273-8c562a3f3813@mail.mipt.ru> Message-ID: It's a canonicalisation error. Steve Holden On Fri, May 18, 2018 at 2:38 PM, Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > On 18.05.2018 14:46, Steven D'Aprano wrote: > >> Stephan Houben noticed that Python apparently allows identifiers to be >> keywords, if you use Unicode "mathematical bold" letters. His >> explanation is that the identifier is normalised, but not until after >> keywords are checked for. So this works: >> >> class Spam: >> locals()['if'] = 1 >> >> >> Spam.?? 
# U+1D422 U+1D41F
>> # returns 1
>>
>> Of course Spam.if fails with SyntaxError.
>>
>> Should this work? Is this a bug, a feature, or an accident of
>> implementation we can ignore?
>>
> Voting for bug:
> Either those identifiers should be considered equal, or they shouldn't.
> They can't be considered "partially" equal.
>
> -- 
> Regards,
> Ivan
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve%40holdenweb.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From skip.montanaro at gmail.com  Fri May 18 12:50:48 2018
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Fri, 18 May 2018 11:50:48 -0500
Subject: [Python-Dev] [Python-checkins] bpo-33522: Enable CI builds on Visual
 Studio Team Services (GH-6865) (GH-6925)
In-Reply-To: 
References: <40mt4J0QJ1zFr2Z@mail.python.org>
Message-ID: 

On Thu, May 17, 2018 at 11:32 PM Gregory P. Smith wrote:
> Why did this commit modify .py files, unittests, and test.support?
> That is inappropriate for something claiming to merely enable a CI platform.

I think there is probably an argument to be made that some of the changes
will be improvements, but such changes belong in separate PRs, even if that
means you have to postpone the core bits of this change until they are
merged.

Skip

From steve.dower at python.org  Fri May 18 16:13:25 2018
From: steve.dower at python.org (Steve Dower)
Date: Fri, 18 May 2018 13:13:25 -0700
Subject: [Python-Dev] please help triage VSTS failures
In-Reply-To: 
References: 
Message-ID: 

Unfamiliar maybe, though I'm a big fan of separating build and test logs.
If anyone is motivated enough to make unittest/regrtest generate
JUnit-format XML, then we can get a nice breakdown by individual test too,
which would save scrolling through the log entirely.

The asyncio instability is apparently really hard to fix. There were 2-3
people looking into it yesterday on one of the other systems, but
apparently we haven't solved it yet (my guess is lingering state from a
previous test). The multissl script was my fault for not realising that we
don't use it on 3.6 builds, but that should be fixed already. Close/reopen
PR is the best way to trigger a rebuild right now.

According to the VSTS dev team, an easy "rerun this build" button and
filtering by changed paths are coming soon, which should clean things up.
We *could* add our own detection for doc-only changes and skip most steps
-- I'm happy to add that in if someone can help with the right git
incantation.
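(One possible shape for that incantation -- a sketch only, assuming
"doc-only" means nothing outside Doc/ and Misc/NEWS.d/ changed, and that the
PR's merge target is origin/master:)

# Decide whether a full build is needed, based on the files the PR touches.
BASE=$(git merge-base origin/master HEAD)
if git diff --name-only "$BASE" HEAD | grep -qvE '^(Doc/|Misc/NEWS\.d/)'; then
    echo "Non-doc changes detected; running the full build and tests."
else
    echo "Docs-only change; later build/test steps could be skipped."
fi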
Top-posted from my Windows phone

From: Zachary Ware
Sent: Friday, May 18, 2018 9:15
To: Python-Dev
Subject: Re: [Python-Dev] please help triage VSTS failures

On Fri, May 18, 2018 at 10:55 AM, Gregory P. Smith wrote:
> These both look like VSTS infrastructure falling over on PRs:
>
> https://python.visualstudio.com/cpython/_build?buildId=522
>
> https://python.visualstudio.com/cpython/_build?buildId=523
>
> I don't see anywhere that gives information about the failures. (*)

Somewhat non-intuitively, logs are available by clicking something in the
tree in the left column (in particular, the inner-most thing with a red X).
Do be sure to choose "Logs" in the right pane rather than "Tests"; "Tests"
appears to be a separate feature that we don't use.

> These CI failures on different platforms are both for the same
> documentation-only change, on different branches. The same change passed
> on the other branches and platforms.

The Windows failure appears to be the test_asyncio instability that has
been popping up everywhere the past couple of days. The Linux failure looks
like a mis-use of Tools/ssl/multissltests.py in the setup, but I thought
I'd seen a checkin go by that removed that.

-- 
Zach
_______________________________________________
Python-Dev mailing list
Python-Dev at python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From yselivanov.ml at gmail.com  Fri May 18 16:16:36 2018
From: yselivanov.ml at gmail.com (Yury Selivanov)
Date: Fri, 18 May 2018 16:16:36 -0400
Subject: [Python-Dev] please help triage VSTS failures
In-Reply-To: <40nfYL0lCQzFr1C@mail.python.org>
References: <40nfYL0lCQzFr1C@mail.python.org>
Message-ID: 

On Fri, May 18, 2018 at 4:15 PM Steve Dower wrote:
[..]
> The asyncio instability is apparently really hard to fix. There were 2-3
> people looking into it yesterday on one of the other systems, but
> apparently we haven't solved it yet (my guess is lingering state from a
> previous test). The multissl script was my fault for not realising that
> we don't use it on 3.6 builds, but that should be fixed already.
> Close/reopen PR is the best way to trigger a rebuild right now.

I asked Andrew Svetlov to help with asyncio CI triage. Hopefully we'll
resolve most of them early next week.

Yury

From tjreedy at udel.edu  Fri May 18 17:30:30 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 18 May 2018 17:30:30 -0400
Subject: [Python-Dev] please help triage VSTS failures
In-Reply-To: <40nfYt21jXzFr0m@mail.python.org>
References: <40nfYt21jXzFr0m@mail.python.org>
Message-ID: 

On 5/18/2018 4:13 PM, Steve Dower wrote:
> Close/reopen PR is the best way to trigger a rebuild right now.

It may be the way to retrigger VSTS, but if one wants to merge, and either
Travis or AppVeyor passes, tossing that success is a foolish thing to do.
Either may fail on a rebuild.

-- 
Terry Jan Reedy

From njs at pobox.com  Fri May 18 20:33:36 2018
From: njs at pobox.com (Nathaniel Smith)
Date: Fri, 18 May 2018 17:33:36 -0700
Subject: [Python-Dev] please help triage VSTS failures
In-Reply-To: <40nfYJ3wfjzFqxc@mail.python.org>
References: <40nfYJ3wfjzFqxc@mail.python.org>
Message-ID: 

On Fri, May 18, 2018 at 1:13 PM, Steve Dower wrote:
> According to the VSTS dev team, an easy "rerun this build" button and
> filtering by changed paths are coming soon, which should clean things up.

If you're talking to them, please ask them to make sure that the "rerun
this build" button doesn't erase the old log. (That's what it does on
Travis; AppVeyor is better.) The problem is that when you have a
flaky/intermittent failure, your todo list is always (a) rerun the build
so at least it's not blocking whatever this unrelated change is, and (b)
file some sort of bug, or comment on some existing bug, and link to the
log to help track down the intermittent failure. If you click the
"rebuild" button on Travis, it solves (a) while deleting the information
you need for (b) -- and for rare intermittent bugs you might not have much
information to go on besides that build log.

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
From greg at krypto.org  Fri May 18 20:49:14 2018
From: greg at krypto.org (Gregory P. Smith)
Date: Fri, 18 May 2018 17:49:14 -0700
Subject: [Python-Dev] please help triage VSTS failures
In-Reply-To: 
References: <40nfYJ3wfjzFqxc@mail.python.org>
Message-ID: 

Hah, yes, that was un-intuitive for me. I was looking for something
labelled "logs" to click on. Thanks. (I hope MS is taking notes on UX
issues here.) So these failures were in fact the known flakes and not
infrastructure. Good!

On the high-level view of VSTS output on a failure: "Issues: phase1 Cmd.exe
exited with code '2'." is not useful. Can we make that display the tail of
the failing phase's log? That'd avoid needing to find the innermost red X
and click on it, which is what I'll always need to do otherwise, as there
doesn't appear to be a direct hyperlink to that.

It looks like VSTS also has an API for surfacing individual test results
into their Tests / Test Results summary pane? Doing something to integrate
with that would likely be a nicer UI. We have something similar at work;
here's how we make unittest.TestCase emit the JUnit XML files of test
results for CI systems to display:
https://github.com/abseil/abseil-py/blob/master/absl/testing/xml_reporter.py

If there are industry-standard format(s) that make Python test suites play
nicer for reporting on CI systems such as that, they'd be worthy of stdlib
inclusion.

-gps

On Fri, May 18, 2018 at 5:35 PM Nathaniel Smith wrote:
> On Fri, May 18, 2018 at 1:13 PM, Steve Dower wrote:
> > According to the VSTS dev team, an easy "rerun this build" button and
> > filtering by changed paths are coming soon, which should clean things up.
>
> If you're talking to them, please ask them to make sure that the "rerun
> this build" button doesn't erase the old log. (That's what it does on
> Travis; AppVeyor is better.) The problem is that when you have a
> flaky/intermittent failure, your todo list is always (a) rerun the build
> so at least it's not blocking whatever this unrelated change is, and (b)
> file some sort of bug, or comment on some existing bug, and link to the
> log to help track down the intermittent failure. If you click the
> "rebuild" button on Travis, it solves (a) while deleting the information
> you need for (b) -- and for rare intermittent bugs you might not have much
> information to go on besides that build log.
>
> -n
>
> --
> Nathaniel J. Smith -- https://vorpus.org
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/greg%40krypto.org
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
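(For anyone who wants to experiment with that today: the third-party
unittest-xml-reporting package -- not the stdlib -- already emits
JUnit-style XML from ordinary unittest suites, which CI systems like VSTS
can pick up. A minimal sketch:)

# Emit JUnit-style XML from a plain unittest suite.
# Assumes `pip install unittest-xml-reporting` (imported as `xmlrunner`).
import unittest
import xmlrunner  # third-party package

class MathTest(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(2 + 2, 4)

if __name__ == "__main__":
    # Writes one XML report per test module into ./test-reports/
    unittest.main(testRunner=xmlrunner.XMLTestRunner(output="test-reports"))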
Sent from my Windows 10 phone From: Nathaniel Smith Sent: Friday, May 18, 2018 17:34 To: Steve Dower Cc: Zachary Ware; Python-Dev Subject: Re: [Python-Dev] please help triage VSTS failures On Fri, May 18, 2018 at 1:13 PM, Steve Dower wrote: > According to the VSTS dev team, an easy ?rerun this build? button and > filtering by changed paths are coming soon, which should clean things up. If you're talking to them, please ask them to make sure that the "rerun this build" button doesn't erase the old log. (That's what it does on Travis. Appveyor is better.) The problem is that when you have a flaky/intermittent failure, your todo list is always (a) rerun the build so at least it's not blocking whatever this unrelated change is, (b) file some sort of bug, or comment on some existing bug, and link to the log to help track down the intermittent failure. If you click the "rebuild" button on Travis, then it solves (a), while deleting the information you need for (b) ? and for rare intermittent bugs you might not have much information to go on besides that build log. -n -- Nathaniel J. Smith -- https://vorpus.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From christian at python.org Sat May 19 05:05:06 2018 From: christian at python.org (Christian Heimes) Date: Sat, 19 May 2018 11:05:06 +0200 Subject: [Python-Dev] Failing tests on master (asyncio, multiprocessing) Message-ID: Hi, several of my PRs as well as local tests have started failing recently. On my local Fedora 27 machine, four sendfile related tests of test_asyncio's BaseLoopSockSendfileTests suite are failing reproducible. For example Travis CI job https://travis-ci.org/python/cpython/jobs/380852981 fails in: * test_stdin_broken_pipe (test.test_asyncio.test_subprocess.SubprocessFastWatcherTests) * test_stdin_broken_pipe (test.test_asyncio.test_subprocess.SubprocessSafeWatcherTests) * test_ignore (test.test_multiprocessing_forkserver.TestIgnoreEINTR) Could somebody have a look, please? Christian From mark at hotpy.org Sat May 19 05:15:34 2018 From: mark at hotpy.org (mark) Date: Sat, 19 May 2018 05:15:34 -0400 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes Message-ID: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Hi, At the language summit this year, there was some discussion of PEP 575. I wanted to simplify the PEP, but rather than modify that PEP, Nick Coghlan encouraged me to write an alternative PEP instead. PEP 576 aims to fulfill the same goals as PEP 575, but with fewer changes and to be fully backwards compatible. The PEP can be viewed here: https://github.com/python/peps/blob/master/pep-0576.rst Cheers, Mark. P.S. I'm happy to have discussion of this PEP take place via GitHub, rather than the mailing list, but I thought I would follow the conventional route for now. From lists at eitanadler.com Sat May 19 05:29:49 2018 From: lists at eitanadler.com (Eitan Adler) Date: Sat, 19 May 2018 02:29:49 -0700 Subject: [Python-Dev] Failing tests on master (asyncio, multiprocessing) In-Reply-To: References: Message-ID: On 19 May 2018 at 02:05, Christian Heimes wrote: > Hi, > > several of my PRs as well as local tests have started failing recently. > On my local Fedora 27 machine, four sendfile related tests of > test_asyncio's BaseLoopSockSendfileTests suite are failing reproducible. > https://bugs.python.org/issue33531 ? 
-- Eitan Adler From christian at python.org Sat May 19 05:33:36 2018 From: christian at python.org (Christian Heimes) Date: Sat, 19 May 2018 11:33:36 +0200 Subject: [Python-Dev] Failing tests on master (asyncio, multiprocessing) In-Reply-To: References: Message-ID: <82e07a73-e18f-a17d-d6c1-0dea3a397ddb@python.org> On 2018-05-19 11:29, Eitan Adler wrote: > On 19 May 2018 at 02:05, Christian Heimes wrote: >> Hi, >> >> several of my PRs as well as local tests have started failing recently. >> On my local Fedora 27 machine, four sendfile related tests of >> test_asyncio's BaseLoopSockSendfileTests suite are failing reproducible. >> > > https://bugs.python.org/issue33531 ? Yeah, that's it. Thanks! Christian From stefan_ml at behnel.de Sat May 19 08:48:26 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sat, 19 May 2018 14:48:26 +0200 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: mark schrieb am 19.05.2018 um 11:15: > At the language summit this year, there was some discussion of PEP 575. > I wanted to simplify the PEP, but rather than modify that PEP, Nick Coghlan > encouraged me to write an alternative PEP instead. > > PEP 576 aims to fulfill the same goals as PEP 575, but with fewer changes > and to be fully backwards compatible. > > The PEP can be viewed here: > > https://github.com/python/peps/blob/master/pep-0576.rst Quick question, since the PEP doesn't say it explicitly. I assume that the builtin function type would be subclassable, right? That would suggest that it also needs a type flag bit for fast type checking. Basically, subclassing still seems necessary in order to support closure state, non-constant default arguments, etc. Stefan From ncoghlan at gmail.com Sat May 19 09:29:36 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 19 May 2018 23:29:36 +1000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5AFC95D2.5070102@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> Message-ID: On 17 May 2018 at 06:34, Jeroen Demeyer wrote: > On 2018-05-16 17:31, Petr Viktorin wrote: > >> The larger a change is, the harder it is to understand >> > > I already disagree here... > > I'm afraid that you are still confusing the largeness of the *change* with > the complexity of the *result* after the change was implemented. > That's not how code reviews work, as their complexity is governed by the number of lines changed (added/removed/modified), not just the number of lines that are left at the end. That said, "deletes more lines than it adds" is typically a point strongly in favour of a particular change. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip.montanaro at gmail.com Sat May 19 18:41:18 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Sat, 19 May 2018 17:41:18 -0500 Subject: [Python-Dev] "make test" routinely fails to terminate Message-ID: On the 3.7 branch, "make test" routinely fails to terminate. (Pretty vanilla Ubuntu 17.10 running on a Dell Laptop. 
Nothing esoteric at all) Lately, it's been one of the multiprocessing tests. After a long while (~2000 seconds), I kill it, then it complains many times about lack of a valid_signals attribute in the signal module: ====================================================================== ERROR: test_remove_signal_handler_error2 (test.test_asyncio.test_unix_events.SelectorEventLoopSignalTests) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/skip/src/python/cpython/Lib/unittest/mock.py", line 1191, in patched return func(*args, **keywargs) File "/home/skip/src/python/cpython/Lib/test/test_asyncio/test_unix_events.py", line 219, in test_remove_signal_handler_error2 m_signal.valid_signals = signal.valid_signals AttributeError: module 'signal' has no attribute 'valid_signals' ---------------------------------------------------------------------- Ran 1967 tests in 36.058s FAILED (errors=362, skipped=11) test test_asyncio failed /home/skip/src/python/cpython/Lib/asyncio/base_events.py:605: ResourceWarning: unclosed event loop <_UnixSelectorEventLoop running=False closed=False debug=False> source=self) Re-running test 'test_signal' in verbose mode then reruns test_signal in verbose mode. Earlier today, a run succeeded, so I'm guessing a race condition exists in the test system. I recall encountering a similar problem a few weeks ago and discovered this open ticket: https://bugs.python.org/issue33099 Should I expect this as the normal behavior? Skip From solipsis at pitrou.net Sun May 20 03:34:36 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sun, 20 May 2018 09:34:36 +0200 Subject: [Python-Dev] "make test" routinely fails to terminate References: Message-ID: <20180520093436.100493a4@fsol> On Sat, 19 May 2018 17:41:18 -0500 Skip Montanaro wrote: > On the 3.7 branch, "make test" routinely fails to terminate. (Pretty > vanilla Ubuntu 17.10 running on a Dell Laptop. Nothing esoteric at all) > Lately, it's been one of the multiprocessing tests. After a long while > (~2000 seconds), I kill it, then it complains many times about lack of a > valid_signals attribute in the signal module: Can you try to rebuild Python? Use "make distclean" if that helps. Regards Antoine. From J.Demeyer at UGent.be Sun May 20 15:26:00 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Sun, 20 May 2018 21:26:00 +0200 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: <25bb4d002cf0465fbb44390ed774b157@xmail101.UGent.be> References: <25bb4d002cf0465fbb44390ed774b157@xmail101.UGent.be> Message-ID: <5B01CBC8.2070606@UGent.be> On 2018-05-19 11:15, mark wrote: > PEP 576 aims to fulfill the same goals as PEP 575 (this is a copy of my comments on GitHub before this PEP was official) **Performance** Most importantly, changing bound methods of extension types from builtin_function_or_method to bound_method will yield a performance loss. It might be possible to mitigate this somewhat by adding specific optimizations for calling bound_method. However, that would add extra complexity and it will probably still be slower than the existing code. And I would also like to know whether it will be possible for custom built-in function subclasses to implement __get__ to change a function into a method (like Python functions) and whether/how the LOAD_METHOD opcode will work in that case. **Introspection** When I want "introspection support", that goes beyond the call signature. Also inspect.getfile should be supported. 
Currently, that simply raises an exception for built-in functions. I think it's important to specify the semantics of inspect.isfunction. Given that you don't mention it, I assume that inspect.isfunction will continue to return True only for Python functions. But that way, these new function classes won't behave like Python functions. > fully backwards compatible. I wonder why you think it is "fully backwards compatible". Just like PEP 575, you are changing the classes of certain objects. I think it's fairer to say that both PEP 575 and PEP 576 might cause minor backwards compatibility issues. I certainly don't think that PEP 576 is significantly more backwards compatible than PEP 575. PS: in your PEP, you write "bound_method" but I guess you mean "method". PEP 575 proposes to rename "method" to "bound_method". Jeroen. From J.Demeyer at UGent.be Sun May 20 16:15:08 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Sun, 20 May 2018 22:15:08 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> Message-ID: <5B01D74C.3000506@UGent.be> On 2018-05-19 15:29, Nick Coghlan wrote: > That's not how code reviews work, as their complexity is governed by the > number of lines changed (added/removed/modified), not just the number of > lines that are left at the end. Of course, you are right. I didn't mean literally that only the end result matters. But it should certainly be considered. If you only do small incremental changes, complexity tends to build up because choices which are locally optimal are not always globally optimal. Sometimes you need to do some refactoring to revisit some of that complexity. This is part of what PEP 575 does. > That said, "deletes more lines than it > adds" is typically a point strongly in favour of a particular change. This certainly won't be true for my patch, because there is a lot of code that I need to support for backwards compatibility (all the old code for method_descriptor in particular). Going back to the review of PEP 575, I see the following possible outcomes: (A) Accept it as is (possibly with minor changes). (B) Accept the general idea but split the details up in several PEPs which can still be discussed individually. (C) Accept a minimal variant of PEP 575, only changing existing classes but not changing the class hierarchy. (D) Accept some yet-to-be-written variant of PEP 575. (E) Don't fix the use case that PEP 575 wants to address. Petr Viktorin suggests (C). I am personally quite hesitant because that only adds complexity and it wouldn't be the best choice for the future maintainability of CPython. I also fear that this hypothetical PEP variant would be rejected because of that reason. Of course, if there is some general agreement that (C) is the way to go, then that is fine for me. If people feel that PEP 575 is currently too complex, I think that (B) is a very good compromise. The end result would be the same as what PEP 575 proposes. Instead of changing many things at once, we could handle each class in a separate PEP. But the motivation of those mini-PEPs will still be PEP 575. 
So, in order for this to make sense, the general idea of PEP 575 needs to be accepted: adding a base_function base class and making various existing classes subclasses of that. Jeroen. From nad at python.org Sun May 20 18:34:29 2018 From: nad at python.org (Ned Deily) Date: Sun, 20 May 2018 18:34:29 -0400 Subject: [Python-Dev] 3.7.0rc1 deadline extended two days to 2018-05-23 AOE [Re: FINAL WEEK FOR 3.7.0 CHANGES!] In-Reply-To: References: Message-ID: <916214BF-DA73-4380-9C3D-2479CCE3A537@python.org> We are going to extend for 48 hours the deadline for 3.7.0rc1, that is, until 2018-05-23 23:59 AOE. While we have made tremendous progress towards the release candidate over the past week especially with the huge efforts at the PyCon US Sprints, we still have some important issues to resolve. A stumbling block has been the increased instability in the test suite, primarily in test_asyncio, which has caused delays in merging PRs due to intermittent failures in CI test runs and which has caused widespread buildbot failures. Another factor is that this weekend and Monday are public holidays in many countries, something I did not take into account when drawing up the schedule. (Note that next weekend is a major public holiday in the USA.) So let's plan on using the extra two days to work through the remaining release blockers. Thanks again! --Ned On May 15, 2018, at 07:51, Ned Deily wrote: > This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your > feature fixes, bug fixes, and documentation updates in before > 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days > from now. We will then tag and produce the 3.7.0 release candidate. > Our goal continues been to be to have no changes between the release > candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 > BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are > no critical problems outstanding and that documentation for new > features in 3.7 is complete (including NEWS and What's New items), and > that 3.7 is getting exposure and tested with our various platorms and > third-party distributions and applications. Those of us who are > participating in the development sprints at PyCon US 2018 here in > Cleveland can feel the excitement building as we work through the > remaining issues, including completing the "What's New in 3.7" > document and final feature documentation. (We wish you could all be > here.) > > As noted before, the ABI for 3.7.0 was frozen as of 3.7.0b3. You > should now be treating the 3.7 branch as if it were already released > and in maintenance mode. That means you should only push the kinds of > changes that are appropriate for a maintenance release: > non-ABI-changing bug and feature fixes and documentation updates. If > you find a problem that requires an ABI-altering or other significant > user-facing change (for example, something likely to introduce an > incompatibility with existing users' code or require rebuilding of > user extension modules), please make sure to set the b.p.o issue to > "release blocker" priority and describe there why you feel the change > is necessary. If you are reviewing PRs for 3.7 (and please do!), be on > the lookout for and flag potential incompatibilities (we've all made > them). > > Thanks again for all of your hard work towards making 3.7.0 yet > another great release - coming to a website near you on 06-15! 
> > Release Managerly Yours, > --Ned > > https://www.python.org/dev/peps/pep-0537/ -- Ned Deily nad at python.org -- [] From storchaka at gmail.com Mon May 21 08:41:59 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 21 May 2018 15:41:59 +0300 Subject: [Python-Dev] Procedure for adding new public C API Message-ID: Please don't forgot to perform the following steps when add a new public C API: * Document it in Doc/c-api/. * Add an entry in the What's New document. * Add it in Doc/data/refcounts.dat. * Add it in PC/python3.def. If you want to include it in the limited API, wrap its declaration with: #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03080000 #endif (use the correct Python version of introducing a feature in the comparison) If you don't want to include it in the limited API, wrap its declaration with: #ifndef Py_LIMITED_API #endif You are free of adding private C API, but its name should start with _Py or _PY and its declaration should be wrapped with: #ifndef Py_LIMITED_API #endif From nad at python.org Mon May 21 09:20:15 2018 From: nad at python.org (Ned Deily) Date: Mon, 21 May 2018 09:20:15 -0400 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: On May 21, 2018, at 08:41, Serhiy Storchaka wrote: > Please don't forgot to perform the following steps when add a new public C API: > [...] Perhaps this should be added to the Python Developer's Guide? -- Ned Deily nad at python.org -- [] From p.f.moore at gmail.com Mon May 21 09:27:49 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Mon, 21 May 2018 14:27:49 +0100 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: On 21 May 2018 at 13:41, Serhiy Storchaka wrote: > Please don't forgot to perform the following steps when add a new public C > API: > > * Document it in Doc/c-api/. > > * Add an entry in the What's New document. > > * Add it in Doc/data/refcounts.dat. > > * Add it in PC/python3.def. I thought python3.def should only contain symbols in the limited ABI (it defines the API of python3.dll, doesn't it?) > If you want to include it in the limited API, wrap its declaration with: > > #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03080000 > #endif > > (use the correct Python version of introducing a feature in the comparison) > > If you don't want to include it in the limited API, wrap its declaration > with: > > #ifndef Py_LIMITED_API > #endif Is it even acceptable to add a symbol into the limited ABI? I thought the idea was that if I linked with python3.dll, my code would work with any version of Python 3? By introducing new symbols, code linked with the python3.dll shipped with (say) Python 3.8 would fail to run if executed with the python3.dll from Python 3.5. I have code that links with python3.dll, which is expected to run with any version of Python 3, so this isn't theoretical. (I'm not 100% sure whether, if I build with a Python 3.5 version of the headers, my code will link with a python3.dll with extra symbols - but even if that's the case, requiring cross-version binaries to be built with the oldest version of Python that they support seems restrictive at best. 
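In practice, "build with the oldest version you support" just means pinning that version in the build configuration. A minimal setuptools sketch of the pattern, where the module name "spam" and the 3.5 pin are placeholders, and whether py_limited_api is honoured for abi3 tagging depends on the setuptools/wheel versions in use:

from setuptools import Extension, setup

ext = Extension(
    "spam",                     # placeholder extension name
    sources=["spam.c"],
    # Restrict the C code to the limited API as of Python 3.5.
    define_macros=[("Py_LIMITED_API", "0x03050000")],
    # Ask newer setuptools/wheel to tag the built binary as abi3.
    py_limited_api=True,
)

setup(name="spam", version="0.1", ext_modules=[ext])

The define_macros line is the part that affects what the headers expose; the abi3 tag is only packaging metadata on top of that.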
> You are free of adding private C API, but its name should start with _Py or > _PY and its declaration should be wrapped with: > > #ifndef Py_LIMITED_API > #endif Paul From storchaka at gmail.com Mon May 21 09:42:51 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 21 May 2018 16:42:51 +0300 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: 21.05.18 16:27, Paul Moore wrote: > On 21 May 2018 at 13:41, Serhiy Storchaka wrote: >> * Add it in PC/python3.def. > > I thought python3.def should only contain symbols in the limited ABI > (it defines the API of python3.dll, doesn't it?) Thank you for the correction. Yes, and only for Windows. New API implemented as macros shouldn't be included here, but if it uses some new functions, they should be included. >> If you want to include it in the limited API, wrap its declaration with: >> >> #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03080000 >> #endif >> >> (use the correct Python version of introducing a feature in the comparison) >> >> If you don't want to include it in the limited API, wrap its declaration >> with: >> >> #ifndef Py_LIMITED_API >> #endif > > Is it even acceptable to add a symbol into the limited ABI? I thought > the idea was that if I linked with python3.dll, my code would work > with any version of Python 3? By introducing new symbols, code linked > with the python3.dll shipped with (say) Python 3.8 would fail to run > if executed with the python3.dll from Python 3.5. The limited API is versioned. If you use only Python 3.5 API (define Py_LIMITED_API to 0x03050000), the built code will be expected to work on 3.5 and later. In theory. From p.f.moore at gmail.com Mon May 21 10:08:36 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Mon, 21 May 2018 15:08:36 +0100 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: On 21 May 2018 at 14:42, Serhiy Storchaka wrote: >> Is it even acceptable to add a symbol into the limited ABI? I thought >> the idea was that if I linked with python3.dll, my code would work >> with any version of Python 3? By introducing new symbols, code linked >> with the python3.dll shipped with (say) Python 3.8 would fail to run >> if executed with the python3.dll from Python 3.5. > > The limited API is versioned. If you use only Python 3.5 API (define > Py_LIMITED_API to 0x03050000), the built code will be expected to work on > 3.5 and later. In theory. Thanks, I'd missed that point (I need to go and check my build process, in that case :-)). Paul From skip.montanaro at gmail.com Mon May 21 20:07:22 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 21 May 2018 19:07:22 -0500 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? Message-ID: My GitHub fork of the cpython repo was made a while ago, before a 3.7 branch was created. I have no remotes/origin/3.7. Is there some way to create it from remotes/upstream/3.7? I asked on GitHub's help forums. The only recommendation was to delete my fork and recreate it. That seemed kind of drastic, and I will do it if that's really the only way, but this seems like functionality Git and/or GitHub probably supports.
Thx, Skip From skip.montanaro at gmail.com Mon May 21 19:47:38 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 21 May 2018 18:47:38 -0500 Subject: [Python-Dev] "make test" routinely fails to terminate In-Reply-To: <20180520093436.100493a4@fsol> References: <20180520093436.100493a4@fsol> Message-ID: me> On the 3.7 branch, "make test" routinely fails to terminate. Antoine> Can you try to rebuild Python? Use "make distclean" if that helps. Thanks, Antoine. That solved the termination problem. I still have problems with test_asyncio failing, but I can live with that for now. If "make distclean" is required, I suspect there is a missing/incorrect/incomplete Make dependency somewhere. I suppose "make distclean" is cheap enough that I should do it whenever I switch branches. Skip From rosuav at gmail.com Mon May 21 20:19:57 2018 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 22 May 2018 10:19:57 +1000 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: On Tue, May 22, 2018 at 10:07 AM, Skip Montanaro wrote: > My GitHub fork of the cpython repo was made awhile ago, before a 3.7 branch > was created. I have no remotes/origin/3.7. Is there some way to create it > from remotes/upstream/3.7? I asked on GitHub's help forums. The only > recommendation was to to delete my fork and recreate it. That seemed kind > of drastic, and I will do it if that's really the only way, but this seems > like functionality Git and/or GitHub probably supports. > Create it from upstream? Yep! Try this: git checkout -b 3.7 upstream/3.7 git push -u origin 3.7 That'll probably have to chug-chug-chug to push all that content up to GitHub; AFAIK they don't have any optimization for "this commit already exists in another fork of this repository", so you'll have to upload everything as if it's completely new. But other than that, it should be quick and easy. ChrisA From skip.montanaro at gmail.com Mon May 21 20:45:48 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 21 May 2018 19:45:48 -0500 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: > Create it from upstream? Yep! Try this: > git checkout -b 3.7 upstream/3.7 > git push -u origin 3.7 Thanks, Chris! Didn't have to chug for too long either, just a few seconds. S From rosuav at gmail.com Mon May 21 20:49:32 2018 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 22 May 2018 10:49:32 +1000 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: On Tue, May 22, 2018 at 10:45 AM, Skip Montanaro wrote: >> Create it from upstream? Yep! Try this: > >> git checkout -b 3.7 upstream/3.7 >> git push -u origin 3.7 > > Thanks, Chris! Didn't have to chug for too long either, just a few seconds. > > S Perfect! I'm used to doing this sort of thing with long histories that need to be synchronized, so there can be minutes of uploading. But I guess here "3.7" is very close to "master", so there's not as much to sync. Even easier! :) ChrisA From ma3yuki.8mamo10 at gmail.com Mon May 21 21:27:29 2018 From: ma3yuki.8mamo10 at gmail.com (Masayuki YAMAMOTO) Date: Tue, 22 May 2018 10:27:29 +0900 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: Thanks Serhiy, I missed adding PyThread_tss_* to Doc/data/refcounts.dat. I opened a PR to fix it. 
https://github.com/python/cpython/pull/7038 Regards, Masayuki 2018-05-21 21:41 GMT+09:00 Serhiy Storchaka : > Please don't forgot to perform the following steps when add a new public > C API: > > * Document it in Doc/c-api/. > > * Add an entry in the What's New document. > > * Add it in Doc/data/refcounts.dat. > > * Add it in PC/python3.def. > > If you want to include it in the limited API, wrap its declaration with: > > #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03080000 > #endif > > (use the correct Python version of introducing a feature in the comparison) > > If you don't want to include it in the limited API, wrap its declaration > with: > > #ifndef Py_LIMITED_API > #endif > > You are free of adding private C API, but its name should start with _Py > or _PY and its declaration should be wrapped with: > > #ifndef Py_LIMITED_API > #endif > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ma3yuki. > 8mamo10%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Tue May 22 08:29:19 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 22 May 2018 14:29:19 +0200 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: 2018-05-19 11:15 GMT+02:00 mark : > The PEP can be viewed here: > https://github.com/python/peps/blob/master/pep-0576.rst > (...) > P.S. > I'm happy to have discussion of this PEP take place via GitHub, > rather than the mailing list, but I thought I would follow the conventional > route for now. Previously, we required to add the full text of the PEP inside the email to be able to reply inline by email. I know that Guido van Rossum started a discussion on python-commiters to propose to change how we discuss PEPs, but I don't think that any decision has been taken at this point, no? Victor From vstinner at redhat.com Tue May 22 08:26:06 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 22 May 2018 14:26:06 +0200 Subject: [Python-Dev] Reminder: Please elaborate commit messages Message-ID: Hi, In https://bugs.python.org/issue33531, Andrew Svetlov wrote "Fixed failed sendfile tests on Windows (at least I hope so)." without giving any bpo number or a commit number. So I looked at latest commits and I found: --- commit e2537521916c5bf88fcf54d4654ff1bcd332be4a Author: Andrew Svetlov Date: Mon May 21 12:03:45 2018 +0300 Fix asyncio flaky tests (#7023) --- Please try to write better error messages for people who will dig into the Git history in 1, 5 or 10 years: * Usually, it's better to open a bug. Here you could give the full error message, mention on which platform the test fails, etc. * Mention which tests are affected * Maybe even give an extract of the error message of the fixed test in the commit message I know that it's more effort and fixing flaky tests is annoying and may require multiple iterations, but again, please think to people who will have to read the Git history later... 
I was able able to rebuild the context of this commit from a comment of https://bugs.python.org/issue33531 Victor From andrew.svetlov at gmail.com Tue May 22 08:37:19 2018 From: andrew.svetlov at gmail.com (Andrew Svetlov) Date: Tue, 22 May 2018 15:37:19 +0300 Subject: [Python-Dev] Reminder: Please elaborate commit messages In-Reply-To: References: Message-ID: Sorry for that. I thought that the bpo issue can be skipped because it is tests-only change, no asyncio code was affected. Will be more accurate next time. On Tue, May 22, 2018 at 3:26 PM Victor Stinner wrote: > Hi, > > In https://bugs.python.org/issue33531, Andrew Svetlov wrote "Fixed > failed sendfile tests on Windows (at least I hope so)." without giving > any bpo number or a commit number. So I looked at latest commits and I > found: > > --- > commit e2537521916c5bf88fcf54d4654ff1bcd332be4a > Author: Andrew Svetlov > Date: Mon May 21 12:03:45 2018 +0300 > > Fix asyncio flaky tests (#7023) > --- > > Please try to write better error messages for people who will dig into > the Git history in 1, 5 or 10 years: > > * Usually, it's better to open a bug. Here you could give the full > error message, mention on which platform the test fails, etc. > * Mention which tests are affected > * Maybe even give an extract of the error message of the fixed test in > the commit message > > I know that it's more effort and fixing flaky tests is annoying and > may require multiple iterations, but again, please think to people who > will have to read the Git history later... > > I was able able to rebuild the context of this commit from a comment > of https://bugs.python.org/issue33531 > > Victor > -- Thanks, Andrew Svetlov -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Tue May 22 08:50:42 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 22 May 2018 14:50:42 +0200 Subject: [Python-Dev] Reminder: Please elaborate commit messages In-Reply-To: References: Message-ID: Usually, I don't open a new bug to fix or enhance a test. So I wouldn't say that it's mandatory. It's really on a case by case basis. It seems like test_asyncio failures are a hot topic these days :-) It's one of the reasons why Python 3.7rc1 has been delayed by 2 days, no? :-) Victor 2018-05-22 14:37 GMT+02:00 Andrew Svetlov : > Sorry for that. > I thought that the bpo issue can be skipped because it is tests-only change, > no asyncio code was affected. > Will be more accurate next time. > > On Tue, May 22, 2018 at 3:26 PM Victor Stinner wrote: >> >> Hi, >> >> In https://bugs.python.org/issue33531, Andrew Svetlov wrote "Fixed >> failed sendfile tests on Windows (at least I hope so)." without giving >> any bpo number or a commit number. So I looked at latest commits and I >> found: >> >> --- >> commit e2537521916c5bf88fcf54d4654ff1bcd332be4a >> Author: Andrew Svetlov >> Date: Mon May 21 12:03:45 2018 +0300 >> >> Fix asyncio flaky tests (#7023) >> --- >> >> Please try to write better error messages for people who will dig into >> the Git history in 1, 5 or 10 years: >> >> * Usually, it's better to open a bug. Here you could give the full >> error message, mention on which platform the test fails, etc. 
>> * Mention which tests are affected >> * Maybe even give an extract of the error message of the fixed test in >> the commit message >> >> I know that it's more effort and fixing flaky tests is annoying and >> may require multiple iterations, but again, please think to people who >> will have to read the Git history later... >> >> I was able able to rebuild the context of this commit from a comment >> of https://bugs.python.org/issue33531 >> >> Victor > > -- > Thanks, > Andrew Svetlov From guido at python.org Tue May 22 10:41:40 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 22 May 2018 07:41:40 -0700 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: On Tue, May 22, 2018 at 5:29 AM, Victor Stinner wrote: > 2018-05-19 11:15 GMT+02:00 mark : > > The PEP can be viewed here: > > https://github.com/python/peps/blob/master/pep-0576.rst > > (...) > > P.S. > > I'm happy to have discussion of this PEP take place via GitHub, > > rather than the mailing list, but I thought I would follow the > conventional > > route for now. > > Previously, we required to add the full text of the PEP inside the > email to be able to reply inline by email. > > I know that Guido van Rossum started a discussion on python-commiters > to propose to change how we discuss PEPs, but I don't think that any > decision has been taken at this point, no? > That doesn't stop us from experimenting with the new flow. If it works, we can adopt it officially. If it doesn't work, we can ask Mark to post a copy to python-ideas (or -dev) for discussion later. ISTR there are plenty of PEPs that never get posted to python-ideas because they are discussed on a separate list. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve.dower at python.org Tue May 22 13:00:31 2018 From: steve.dower at python.org (Steve Dower) Date: Tue, 22 May 2018 10:00:31 -0700 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: References: Message-ID: <6eadcaf1-75ec-be23-aef0-d7ba1f011f86@python.org> On 21May2018 0708, Paul Moore wrote: > On 21 May 2018 at 14:42, Serhiy Storchaka wrote: >>> Is it even acceptable to add a symbol into the limited ABI? I thought >>> the idea was that if I linked with python3.dll, my code would work >>> with any version of Python 3? By introducing new symbols, code linked >>> with the python3.dll shipped with (say) Python 3.8 would fail to run >>> if executed with the python3.dll from Python 3.5. >> >> The limited API is versioned. If you use only Python 3.5 API (define >> Py_LIMITED_API to 0x03050000), the built code will be expected to work on >> 3.5 and later. In theory. > > Thanks, I'd missed that point (I need to go and check my build > process, in that case :-)). The fact that the headers and python3.def claim different functions are in the limited API basically breaks any ability to use this. You really do need to build with the oldest possible version. Alternatively, we can try again to get everyone to agree that since their APIs shipped as stable in earlier versions that we need to actually make them stable (or take a breaking backwards incompatible change to make them non-stable). Last time Zach and I attempted this we got nowhere. (There's a bug on bpo somewhere with details and helper scripts, as well as unrecorded discussions from the sprints a couple years ago.) Sorry to be the bearer of bad news. 
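The helper side of that is not complicated; a rough sketch that pulls the forwarded names out of PC/python3.def, so they can be diffed against what the headers advertise under Py_LIMITED_API. The "Name=python3X.Name" lines under EXPORTS are an assumption about the file layout here, not a documented format:

import re
import sys

# Matches forwarding entries such as "  PyArg_Parse=python37.PyArg_Parse".
EXPORT_RE = re.compile(r"^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=\s*python3\d*\.")

def stable_abi_symbols(def_path):
    symbols = set()
    in_exports = False
    with open(def_path, encoding="utf-8") as f:
        for line in f:
            if line.strip().upper() == "EXPORTS":
                in_exports = True
                continue
            if in_exports:
                match = EXPORT_RE.match(line)
                if match:
                    symbols.add(match.group(1))
    return symbols

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "PC/python3.def"
    names = stable_abi_symbols(path)
    print(len(names), "symbols forwarded by", path)
    for name in sorted(names):
        print(" ", name)

Deciding what to do about the symbols that only show up on one side of that diff is the part that needs an actual policy decision.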
Cheers, Steve From steve.dower at python.org Tue May 22 13:07:32 2018 From: steve.dower at python.org (Steve Dower) Date: Tue, 22 May 2018 10:07:32 -0700 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: On 22May2018 0741, Guido van Rossum wrote: > ISTR there are > plenty of PEPs that never get posted to python-ideas because they are > discussed on a separate list. There are often better venues for the initial discussion (such as security-sig, distutils-sig or datetime-sig), but I think that's orthogonal from posting the full text of a PEP. That said, if the aim is to keep discussion in another place (such as github), you really don't want copies floating around any other mailing lists. Eventually I'd hope it comes through for final review though, as I'm sure a number of us are unlikely to click through to github unless we have a specific interest in the topic. Cheers, Steve From tjreedy at udel.edu Tue May 22 14:24:18 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 22 May 2018 14:24:18 -0400 Subject: [Python-Dev] Reminder: Please elaborate commit messages In-Reply-To: References: Message-ID: On 5/22/2018 8:37 AM, Andrew Svetlov wrote: > Sorry for that. > I thought that the bpo issue can be skipped because it is tests-only > change, no asyncio code was affected. > Will be more accurate next time. A new issue was not needed. Adding 'bpo-33531' would have been fine, automatically linking the issue and the PR in both directions. -- Terry Jan Reedy From yselivanov.ml at gmail.com Tue May 22 14:26:13 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 22 May 2018 14:26:13 -0400 Subject: [Python-Dev] Reminder: Please elaborate commit messages In-Reply-To: References: Message-ID: On Tue, May 22, 2018 at 8:52 AM Victor Stinner wrote: > Usually, I don't open a new bug to fix or enhance a test. So I > wouldn't say that it's mandatory. It's really on a case by case basis. > It seems like test_asyncio failures are a hot topic these days :-) > It's one of the reasons why Python 3.7rc1 has been delayed by 2 days, > no? :-) Yes, getting Windows tests stable wasn't easy. I think it's solved now (thanks to Andrew), but we always welcome any help from other core devs :) Yury From tjreedy at udel.edu Tue May 22 14:38:25 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 22 May 2018 14:38:25 -0400 Subject: [Python-Dev] Reminder: Please elaborate commit messages In-Reply-To: References: Message-ID: On 5/22/2018 2:26 PM, Yury Selivanov wrote: > On Tue, May 22, 2018 at 8:52 AM Victor Stinner wrote: > >> Usually, I don't open a new bug to fix or enhance a test. So I >> wouldn't say that it's mandatory. It's really on a case by case basis. > >> It seems like test_asyncio failures are a hot topic these days :-) >> It's one of the reasons why Python 3.7rc1 has been delayed by 2 days, >> no? :-) > > Yes, getting Windows tests stable wasn't easy. I think it's solved now > (thanks to Andrew), but we always welcome any help from other core devs :) The AppVeyor (Windows) failure have been gone for a day. The Travis failures are rarer. 
-- Terry Jan Reedy From p.f.moore at gmail.com Tue May 22 15:33:16 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Tue, 22 May 2018 20:33:16 +0100 Subject: [Python-Dev] Procedure for adding new public C API In-Reply-To: <6eadcaf1-75ec-be23-aef0-d7ba1f011f86@python.org> References: <6eadcaf1-75ec-be23-aef0-d7ba1f011f86@python.org> Message-ID: On 22 May 2018 at 18:00, Steve Dower wrote: > On 21May2018 0708, Paul Moore wrote: >> >> On 21 May 2018 at 14:42, Serhiy Storchaka wrote: >>>> >>>> Is it even acceptable to add a symbol into the limited ABI? I thought >>>> the idea was that if I linked with python3.dll, my code would work >>>> with any version of Python 3? By introducing new symbols, code linked >>>> with the python3.dll shipped with (say) Python 3.8 would fail to run >>>> if executed with the python3.dll from Python 3.5. >>> >>> >>> The limited API is versioned. If you use only Python 3.5 API (define >>> Py_LIMITED_API to 0x03050000), the built code will be expected to work on >>> 3.5 and later. In theory. >> >> >> Thanks, I'd missed that point (I need to go and check my build >> process, in that case :-)). > > > The fact that the headers and python3.def claim different functions are in > the limited API basically breaks any ability to use this. You really do need > to build with the oldest possible version. > > Alternatively, we can try again to get everyone to agree that since their > APIs shipped as stable in earlier versions that we need to actually make > them stable (or take a breaking backwards incompatible change to make them > non-stable). Last time Zach and I attempted this we got nowhere. (There's a > bug on bpo somewhere with details and helper scripts, as well as unrecorded > discussions from the sprints a couple years ago.) > > Sorry to be the bearer of bad news. I think I recall the earlier discussion. I agree that the current situation makes the stable ABI less useful than it would otherwise have been. But being able to build an embedding application or extension, and say "this will work with any version of Python newer than the version I built it with" is still very useful. Paul From guido at python.org Tue May 22 15:47:29 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 22 May 2018 12:47:29 -0700 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: On Tue, May 22, 2018 at 10:07 AM, Steve Dower wrote: > On 22May2018 0741, Guido van Rossum wrote: > >> ISTR there are plenty of PEPs that never get posted to python-ideas >> because they are discussed on a separate list. >> > > There are often better venues for the initial discussion (such as > security-sig, distutils-sig or datetime-sig), but I think that's orthogonal > from posting the full text of a PEP. > I don't think that the original rationale for posting the full text of a PEP to a mailing list still applies. The raw text is on GitHub in the python/peps repo, and the formatted text is on python.org. We're not some kind of bureaucratic org that pretends to still live in the world of paper and pencil. > That said, if the aim is to keep discussion in another place (such as > github), you really don't want copies floating around any other mailing > lists. Eventually I'd hope it comes through for final review though, as I'm > sure a number of us are unlikely to click through to github unless we have > a specific interest in the topic. 
> IMO if you can't be bothered to click through on GitHub you forfeit your right to comment. (Which isn't a right anyway, it's a privilege.) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Tue May 22 16:32:08 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 22 May 2018 22:32:08 +0200 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: <49c4243132a54cbf8dfafa35fa884730@xmail101.UGent.be> References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> <49c4243132a54cbf8dfafa35fa884730@xmail101.UGent.be> Message-ID: <5B047E48.80709@UGent.be> For the record: the only reason that I replied on GitHub was because the proposal was not yet posted (as far as I know) to any mailing list. Typically, a post is made to a mailing list more or less at the same time as creating the PEP. In this case, there was a delay of a few days, maybe also because of unrelated issues with the compilation of the PEPs. From vano at mail.mipt.ru Tue May 22 18:09:00 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 23 May 2018 01:09:00 +0300 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: On 22.05.2018 3:07, Skip Montanaro wrote: > My GitHub fork of the cpython repo was made awhile ago, before a 3.7 branch > was created. I have no remotes/origin/3.7. Is there some way to create it > from remotes/upstream/3.7? I asked on GitHub's help forums. The only > recommendation was to to delete my fork and recreate it. That seemed kind > of drastic, and I will do it if that's really the only way, but this seems > like functionality Git and/or GitHub probably supports. > > Thx, You don't really need copies of official branches on your Github fork if you're not a maintainer for these branches. (You'll have to keep master though AFAIK since Git needs some branch to be marked as "default".) It's sufficient to just have topic branches for PRs there: you take official branches from python/cpython and topic branches from your fork, do the edits and manipulations locally, then upload the changed topic branches to your fork. I found this easier than having everything in your fork 'cuz it saves you the hassle of keeping your copies up-to-date and having unexpected merge conflicts in your PRs if the copies get out of date. > Skip > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From skip.montanaro at gmail.com Tue May 22 18:51:13 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Tue, 22 May 2018 17:51:13 -0500 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: > You don't really need copies of official branches on your Github fork if you're not a maintainer for these branches. I explicitly wanted to run with 3.7 in the run-up to release. On that branch, the built ./python reports 3.7.0b4+ at startup. Master tells me 3.8.0a0 on startup. Since my local repo is a clone of my fork, it made sense to me to have a 3.7 branch on my fork which I could switch to. Am I only nutcase who thinks that might be mildly useful? 
(Or that if I want to test an application across multiple versions using tox that it makes sense to have pre-release visibility of point releases.) Skip From nad at python.org Tue May 22 19:29:48 2018 From: nad at python.org (Ned Deily) Date: Tue, 22 May 2018 19:29:48 -0400 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: On May 22, 2018, at 18:51, Skip Montanaro wrote: > [Ivan Pozdeev]: >> You don't really need copies of official branches on your Github fork >> if you're not a maintainer for these branches. > I explicitly wanted to run with 3.7 in the run-up to release. On that > branch, the built ./python reports 3.7.0b4+ at startup. Master tells me > 3.8.0a0 on startup. Since my local repo is a clone of my fork, it made > sense to me to have a 3.7 branch on my fork which I could switch to. Am I > only nutcase who thinks that might be mildly useful? (Or that if I want to > test an application across multiple versions using tox that it makes sense > to have pre-release visibility of point releases.) No, what you what you want to do makes perfect sense. It sounds like Ivan is used to projects with a somewhat different workflow than ours. We don't have "branch maintainers"; core-developers are responsible themselves for merging changes into all appropriate branches. While these days some of the backporting can be semi-automated, thanks to the backport bot, but it is still up to the core developer to decide whether a change can and should be backported, so having all active branches available in a local repo is a pretty much a necessity. As always, the Developer's Guide should be able to answer questions like this: https://devguide.python.org/devcycle/ -- Ned Deily nad at python.org -- [] From njs at pobox.com Tue May 22 20:02:39 2018 From: njs at pobox.com (Nathaniel Smith) Date: Tue, 22 May 2018 17:02:39 -0700 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: On Tue, May 22, 2018 at 3:51 PM, Skip Montanaro wrote: >> You don't really need copies of official branches on your Github fork if > you're not a maintainer for these branches. > > I explicitly wanted to run with 3.7 in the run-up to release. On that > branch, the built ./python reports 3.7.0b4+ at startup. Master tells me > 3.8.0a0 on startup. Since my local repo is a clone of my fork, it made > sense to me to have a 3.7 branch on my fork which I could switch to. Am I > only nutcase who thinks that might be mildly useful? (Or that if I want to > test an application across multiple versions using tox that it makes sense > to have pre-release visibility of point releases.) To run with 3.7 you need 3.7 in your local repo, but there's no particular reason that you need to push that branch back up to your personal fork on github. It's very unlikely that anyone looking for a 3.7 branch would go to your fork and expect to find it there. As far as git is concerned, the main repo on github, your fork on github, and your local repo are 3 independent repositories, equally valid. The relationships between them are purely a matter of convention. -n -- Nathaniel J. Smith -- https://vorpus.org From tim.peters at gmail.com Tue May 22 20:10:49 2018 From: tim.peters at gmail.com (Tim Peters) Date: Tue, 22 May 2018 19:10:49 -0500 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: [Nathaniel Smith ] > ... 
> As far as git is concerned, the main repo on github, your fork on > github, and your local repo are 3 independent repositories, equally > valid. The relationships between them are purely a matter of > convention. Thanks for that! It instantly cleared up several mysteries for me. I'm just starting to learn git & github, and am starkly reminded of an old truth: there is absolutely nothing "obvious" about source-control systems, or workflows, before you already know them ;-) From solipsis at pitrou.net Wed May 23 04:14:19 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 23 May 2018 10:14:19 +0200 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? References: Message-ID: <20180523101419.33c29f8c@fsol> On Tue, 22 May 2018 19:10:49 -0500 Tim Peters wrote: > [Nathaniel Smith ] > > ... > > As far as git is concerned, the main repo on github, your fork on > > github, and your local repo are 3 independent repositories, equally > > valid. The relationships between them are purely a matter of > > convention. > > Thanks for that! It instantly cleared up several mysteries for me. > I'm just starting to learn git & github, and am starkly reminded of an > old truth: there is absolutely nothing "obvious" about source-control > systems, or workflows, before you already know them ;-) I think you'll find out that git can be especially non-obvious :-) Regards Antoine. From turnbull.stephen.fw at u.tsukuba.ac.jp Wed May 23 05:01:37 2018 From: turnbull.stephen.fw at u.tsukuba.ac.jp (Stephen J. Turnbull) Date: Wed, 23 May 2018 18:01:37 +0900 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: Message-ID: <23301.11761.472591.205506@turnbull.sk.tsukuba.ac.jp> Tim Peters writes: > there is absolutely nothing "obvious" about source-control systems, > or workflows, before you already know them ;-) Obvious, adj.: More an expletive than a true adjective, shows a state of mind in which the speaker is comfortable that a statement fits her preconceptions. Conveys little, if any, information. Syn.: intuitive, natural. -- The *New* New Devil's Dictionary From p.f.moore at gmail.com Wed May 23 05:25:18 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 23 May 2018 10:25:18 +0100 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: <20180523101419.33c29f8c@fsol> References: <20180523101419.33c29f8c@fsol> Message-ID: On 23 May 2018 at 09:14, Antoine Pitrou wrote: > On Tue, 22 May 2018 19:10:49 -0500 > Tim Peters wrote: > >> Thanks for that! It instantly cleared up several mysteries for me. >> I'm just starting to learn git & github, and am starkly reminded of an >> old truth: there is absolutely nothing "obvious" about source-control >> systems, or workflows, before you already know them ;-) > > I think you'll find out that git can be especially non-obvious :-) My understanding is that git becomes more obvious when you understand that it's not actually a source control system at all but rather a data model for text and changes. (Or something like that, I haven't reached that level of enlightenment myself yet...) Paul From storchaka at gmail.com Wed May 23 07:45:23 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 23 May 2018 14:45:23 +0300 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: References: Message-ID: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> 15.05.18 14:51, Ned Deily ????: > This is it! We are down to THE FINAL WEEK for 3.7.0! 
Please get your > feature fixes, bug fixes, and documentation updates in before > 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days > from now. We will then tag and produce the 3.7.0 release candidate. > Our goal continues been to be to have no changes between the release > candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 > BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are > no critical problems outstanding and that documentation for new > features in 3.7 is complete (including NEWS and What's New items), and > that 3.7 is getting exposure and tested with our various platorms and > third-party distributions and applications. Those of us who are > participating in the development sprints at PyCon US 2018 here in > Cleveland can feel the excitement building as we work through the > remaining issues, including completing the "What's New in 3.7" > document and final feature documentation. (We wish you could all be > here.) Is it possible to add yet one beta instead? CI was broken for few latest days, tests are not passed on my computer still (and fail on some buildbots), updating What's New exposed new features which need additional testing (and maybe fixing or reverting), and I'm not comfortable about some changes which would be harder to fix after the release. From vstinner at redhat.com Wed May 23 08:21:38 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 23 May 2018 14:21:38 +0200 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> Message-ID: Ah, Python doesn't compile on Windows anymore :-) https://bugs.python.org/issue33614 Victor 2018-05-23 14:16 GMT+02:00 Victor Stinner : > 2018-05-23 13:45 GMT+02:00 Serhiy Storchaka : >> CI was broken for few latest days, tests are not passed on my computer still >> (and fail on some buildbots), (...) > > I looked at buildbots and I confirm that many of the 3.x buildbots are red: > > AMD64 FreeBSD 10.x Shared 3.x > AMD64 Windows8.1 Non-Debug 3.x > ARMv7 Ubuntu 3.x > PPC64 Fedora 3.x > s390x RHEL 3.x > x86 Gentoo Installed with X 3.x > x86 Gentoo Refleaks 3.x > AMD64 Windows8.1 Refleaks 3.x > > Victor From vstinner at redhat.com Wed May 23 08:16:14 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 23 May 2018 14:16:14 +0200 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> Message-ID: 2018-05-23 13:45 GMT+02:00 Serhiy Storchaka : > CI was broken for few latest days, tests are not passed on my computer still > (and fail on some buildbots), (...) I looked at buildbots and I confirm that many of the 3.x buildbots are red: AMD64 FreeBSD 10.x Shared 3.x AMD64 Windows8.1 Non-Debug 3.x ARMv7 Ubuntu 3.x PPC64 Fedora 3.x s390x RHEL 3.x x86 Gentoo Installed with X 3.x x86 Gentoo Refleaks 3.x AMD64 Windows8.1 Refleaks 3.x Victor From nad at python.org Wed May 23 09:13:53 2018 From: nad at python.org (Ned Deily) Date: Wed, 23 May 2018 09:13:53 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> Message-ID: On May 23, 2018, at 07:45, Serhiy Storchaka wrote: > 15.05.18 14:51, Ned Deily ????: >> This is it! We are down to THE FINAL WEEK for 3.7.0! 
Please get your >> feature fixes, bug fixes, and documentation updates in before >> 2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days >> from now. We will then tag and produce the 3.7.0 release candidate. >> Our goal continues been to be to have no changes between the release >> candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7 >> BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are >> no critical problems outstanding and that documentation for new >> features in 3.7 is complete (including NEWS and What's New items), and >> that 3.7 is getting exposure and tested with our various platorms and >> third-party distributions and applications. Those of us who are >> participating in the development sprints at PyCon US 2018 here in >> Cleveland can feel the excitement building as we work through the >> remaining issues, including completing the "What's New in 3.7" >> document and final feature documentation. (We wish you could all be >> here.) > Is it possible to add yet one beta instead? > > CI was broken for few latest days, tests are not passed on my computer still (and fail on some buildbots), updating What's New exposed new features which need additional testing (and maybe fixing or reverting), and I'm not comfortable about some changes which would be harder to fix after the release. it is possible but there's no point in doing either another beta or a release candidate until we understand and address the current blocking issues, like the major buildbot failures. We have another 24 hours until rc1 was planned to be tagged. Let's keep working on the known issues and we will make a decision then. -- Ned Deily nad at python.org -- [] From ncoghlan at gmail.com Wed May 23 09:59:13 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 23 May 2018 23:59:13 +1000 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: On 23 May 2018 at 05:47, Guido van Rossum wrote: > On Tue, May 22, 2018 at 10:07 AM, Steve Dower > wrote: > >> On 22May2018 0741, Guido van Rossum wrote: >> >>> ISTR there are plenty of PEPs that never get posted to python-ideas >>> because they are discussed on a separate list. >>> >> >> There are often better venues for the initial discussion (such as >> security-sig, distutils-sig or datetime-sig), but I think that's orthogonal >> from posting the full text of a PEP. >> > > I don't think that the original rationale for posting the full text of a > PEP to a mailing list still applies. The raw text is on GitHub in the > python/peps repo, and the formatted text is on python.org. We're not some > kind of bureaucratic org that pretends to still live in the world of paper > and pencil. > The raw text being on Github rather than hg.python.org makes the rationale for archiving full copies on mail.python.org stronger, not weaker. That said, if the aim is to keep discussion in another place (such as >> github), you really don't want copies floating around any other mailing >> lists. Eventually I'd hope it comes through for final review though, as I'm >> sure a number of us are unlikely to click through to github unless we have >> a specific interest in the topic. > > > > IMO if you can't be bothered to click through on GitHub you forfeit your > right to comment. (Which isn't a right anyway, it's a privilege.) 
> I would never consider it an acceptable process restriction to require people to sign up for an account with a proprietary American software company in order to comment on the future of the Python programming language. If folks get more feedback than they have the ability to process in a short amount of time, then "Deferred" is a perfectly reasonable state to put a PEP into until they *do* have time to go through and account for the feedback - it isn't like it's a major disaster if we put an idea back on the shelf for a couple of months (or years!), let folks mull it over for a while, and then reconsider it later with fresh eyes. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Wed May 23 10:06:36 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 24 May 2018 00:06:36 +1000 Subject: [Python-Dev] My fork lacks a 3.7 branch - can I create it somehow? In-Reply-To: References: <20180523101419.33c29f8c@fsol> Message-ID: On 23 May 2018 at 19:25, Paul Moore wrote: > On 23 May 2018 at 09:14, Antoine Pitrou wrote: > > On Tue, 22 May 2018 19:10:49 -0500 > > Tim Peters wrote: > > > >> Thanks for that! It instantly cleared up several mysteries for me. > >> I'm just starting to learn git & github, and am starkly reminded of an > >> old truth: there is absolutely nothing "obvious" about source-control > >> systems, or workflows, before you already know them ;-) > > > > I think you'll find out that git can be especially non-obvious :-) > > My understanding is that git becomes more obvious when you understand > that it's not actually a source control system at all but rather a > data model for text and changes. (Or something like that, I haven't > reached that level of enlightenment myself yet...) > For data structure wonks, http://eagain.net/articles/git-for-computer-scientists/ can be more informative than any number of git usage guides :) The mapping from command line incantations to their effect on the DAG can be a little (*cough*) obscure, but having the right mental model of what's going on at a data structure level can still help enormously. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Wed May 23 12:54:29 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 23 May 2018 09:54:29 -0700 Subject: [Python-Dev] PEP: 576 Title: Rationalize Built-in function classes In-Reply-To: References: <32fab306-d206-f6b7-ca4e-c3e0dfbed0eb@hotpy.org> Message-ID: We should take the discussion about how and where PEP discussions should be hosted off this thread and list. On Wed, May 23, 2018 at 6:59 AM, Nick Coghlan wrote: > On 23 May 2018 at 05:47, Guido van Rossum wrote: > >> On Tue, May 22, 2018 at 10:07 AM, Steve Dower >> wrote: >> >>> On 22May2018 0741, Guido van Rossum wrote: >>> >>>> ISTR there are plenty of PEPs that never get posted to python-ideas >>>> because they are discussed on a separate list. >>>> >>> >>> There are often better venues for the initial discussion (such as >>> security-sig, distutils-sig or datetime-sig), but I think that's orthogonal >>> from posting the full text of a PEP. >>> >> >> I don't think that the original rationale for posting the full text of a >> PEP to a mailing list still applies. The raw text is on GitHub in the >> python/peps repo, and the formatted text is on python.org. 
We're not >> some kind of bureaucratic org that pretends to still live in the world of >> paper and pencil. >> > > The raw text being on Github rather than hg.python.org makes the > rationale for archiving full copies on mail.python.org stronger, not > weaker. > > That said, if the aim is to keep discussion in another place (such as >>> github), you really don't want copies floating around any other mailing >>> lists. Eventually I'd hope it comes through for final review though, as I'm >>> sure a number of us are unlikely to click through to github unless we have >>> a specific interest in the topic. >> >> >> >> IMO if you can't be bothered to click through on GitHub you forfeit your >> right to comment. (Which isn't a right anyway, it's a privilege.) >> > > I would never consider it an acceptable process restriction to require > people to sign up for an account with a proprietary American software > company in order to comment on the future of the Python programming > language. > > If folks get more feedback than they have the ability to process in a > short amount of time, then "Deferred" is a perfectly reasonable state to > put a PEP into until they *do* have time to go through and account for the > feedback - it isn't like it's a major disaster if we put an idea back on > the shelf for a couple of months (or years!), let folks mull it over for a > while, and then reconsider it later with fresh eyes. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Thu May 24 03:23:59 2018 From: nad at python.org (Ned Deily) Date: Thu, 24 May 2018 03:23:59 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> Message-ID: <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> On May 23, 2018, at 09:13, Ned Deily wrote: > On May 23, 2018, at 07:45, Serhiy Storchaka wrote: >> Is it possible to add yet one beta instead? >> CI was broken for few latest days, tests are not passed on my computer still (and fail on some buildbots), updating What's New exposed new features which need additional testing (and maybe fixing or reverting), and I'm not comfortable about some changes which would be harder to fix after the release. > it is possible but there's no point in doing either another beta or a release candidate until we understand and address the current blocking issues, like the major buildbot failures. We have another 24 hours until rc1 was planned to be tagged. Let's keep working on the known issues and we will make a decision then. An update: thanks to a lot of effort over the past day by a number of people (including Victor, Serhiy, Christian, Zach, and others I'm sure I'm forgetting - my apologies), we have addressed all of the "release blocker" issues and all but one of the persistent failures on the 3.7 stable buildbots. We should have the couple of remaining "deferred blockers" including the remaining stable buildbots in green status by later today. At that point, we will be ready to tag 3.7.0rc1 and begin producing the release candidate artifacts. So this *is* really your last chance: if you know of any true releasing blocking issues for 3.7.0, you have about 12 more hours to log it in the bug tracker as a "release blocker". I'll send out an email once we start the release manufacturing. 
Any merges to the 3.7 branch after that will be released in 3.7.1 which we tentatively are planning to ship sometime before the end of July (< 2018-07-31). If you do find a critical problem in 3.7.0rc1 that you think needs to be fixed in 3.7.0, please merge a fix into 3.7 (and other appropriate branches), leave the issue open and marked as "release blocker", and add a note why you think the fix needs to be cherry-picked into 3.7.0. More later today! --Ned P.S. To address a few of the earlier comments on this thread: Antoine: > Also there's https://bugs.python.org/issue33612 which appears quite critical. Resolved Victor: > Can someone please have a look at my socketserver change? Reviewed and merged Victor: > I looked at buildbots and I confirm that many of the 3.x buildbots are red: Yes, but it's the 3.7 buildbots that are of interest now, not the 3.x ones :) And, as noted above, I believe we have cleaned up (or will shortly) the remaining 3.7 stable buildbot failures. Coincidentally, we've also fixed some of the 3.x (master -> 3.8) buildbots. Victor: > Ah, Python doesn't compile on Windows anymore :-) Stale files on one of the Windows buildbots -> cleaned up -- Ned Deily nad at python.org -- [] From vstinner at redhat.com Thu May 24 07:26:25 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 24 May 2018 13:26:25 +0200 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> Message-ID: 2018-05-24 9:23 GMT+02:00 Ned Deily : > Any merges to the 3.7 branch after > that will be released in 3.7.1 which we tentatively are planning to > ship sometime before the end of July (< 2018-07-31). I recall that Python 3.6.0 was full of bugs, some functions like os.waitpid() on Windows (if I recall correctly) were completely broken. We can do our best to test as much as possible, hope that more and more people use the "nightly" Python version to run their CI, but we always miss bugs. We always get the most testers when the final x.y.0 version is released. Why waiting two months to release bugfixes? Victor From nad at python.org Thu May 24 10:35:41 2018 From: nad at python.org (Ned Deily) Date: Thu, 24 May 2018 10:35:41 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> Message-ID: On May 24, 2018, at 07:26, Victor Stinner wrote: > 2018-05-24 9:23 GMT+02:00 Ned Deily : >> Any merges to the 3.7 branch after >> that will be released in 3.7.1 which we tentatively are planning to >> ship sometime before the end of July (< 2018-07-31). > I recall that Python 3.6.0 was full of bugs, some functions like > os.waitpid() on Windows (if I recall correctly) were completely > broken. > > We can do our best to test as much as possible, hope that more and > more people use the "nightly" Python version to run their CI, but we > always miss bugs. We always get the most testers when the final x.y.0 > version is released. > > Why waiting two months to release bugfixes? We're not planning on waiting two months. First, 3.7.0 final is not planned to release until 2018-06-15; if necessary, there could be one or more emergency bug fixes in it. Second, "before the end of July (< 2018-07-31)" does not mean we have to wait until the end of July. 
If necessary, it could be near the beginning of the month, so closer to two
weeks after the release. Right now, our focus should be on getting
high-quality 3.7.0rc1 and 3.7.0 final releases out there to our users and
then we can focus on what comes next.

Getting close!

--
  Ned Deily
  nad at python.org -- []

From storchaka at gmail.com  Thu May 24 11:35:44 2018
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Thu, 24 May 2018 18:35:44 +0300
Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org>
References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org>
Message-ID:

24.05.18 10:23, Ned Deily wrote:
> So this *is* really your last chance: if you know of any true releasing
> blocking issues for 3.7.0, you have about 12 more hours to log it in
> the bug tracker as a "release blocker". I'll send out an email once we
> start the release manufacturing. Any merges to the 3.7 branch after
> that will be released in 3.7.1 which we tentatively are planning to
> ship sometime before the end of July (< 2018-07-31). If you do find a
> critical problem in 3.7.0rc1 that you think needs to be fixed in 3.7.0,
> please merge a fix into 3.7 (and other appropriate branches), leave the
> issue open and marked as "release blocker", and add a note why you
> think the fix needs to be cherry-picked into 3.7.0.

I have doubts about two issues. I feel the responsibility for them because I had the opportunity to solve them before, but I lost it.

1. Changes in the AST. A few third-party projects were broken by it and are
already fixed. I suppose a few more projects will need changes after 3.7 is
released. It is interesting that IPython was broken in a different way than
other projects. It needed to reintroduce the docstring in the list of
statements, effectively reverting the 3.7 change. IPython allows entering
several statements at the prompt, and therefore it compiles them with the
'exec' mode instead of 'single' as the CPython REPL and IDLE shell do.
Currently CPython doesn't allow you to paste an arbitrary script like the
following:

if a:
    b
if c:
    d

You need to add an empty line between top-level complex statements. If
CPython one day adds support for pasting several statements without empty
lines between them, it might need to add the same hack as IPython. I am
afraid that we might need to change the AST again, in 3.7.1 or in 3.8.0.

2. Pickle support in typing is not perfect. I was going to fix it (I had
almost ready code), but lost the chance of doing this before. It can be
changed in 3.7.1, but this means that pickles of some derived typing types
created in 3.7.0 will not be compatible with future versions (maybe 3.7.1
will not break compatibility, but it will be broken in the future because
we will not specially support compatibility with 3.7.0).

There is a third issue, related to NetBSD, but it is less important.

I think two weeks will be enough for fixing these issues, but not at the
rc1 stage.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From nad at python.org  Thu May 24 12:02:48 2018
From: nad at python.org (Ned Deily)
Date: Thu, 24 May 2018 12:02:48 -0400
Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> Message-ID: <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> On May 24, 2018, at 11:35, Serhiy Storchaka wrote: > I have doubts about two issues. I feel the responsibility for them because I had the opportunity to solve them before, but I lost it. [...] Serhiy, what are the bugs.python.org issue numbers for these? Are they marked as "release blocker"? -- Ned Deily nad at python.org -- [] From levkivskyi at gmail.com Thu May 24 12:25:51 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Thu, 24 May 2018 12:25:51 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> Message-ID: > 2. Pickle support in typing is not perfect. I was going to fix it (I had almost ready code), but lost a chance of doing this before. It can be changed in 3.7.1, but this means that pickles of some derived typing types created in 3.7.0 will be not compatible with future versions (may be 3.7.1 will not break compatibility, but it will be broken in future because we will not specially supported compatibility with 3.7.0). I think I had fixed this one. At least the examples reported on typing tracker are now fixed. Do you have some other examples that still fail? -- Ivan On 24 May 2018 at 12:02, Ned Deily wrote: > On May 24, 2018, at 11:35, Serhiy Storchaka wrote: > > I have doubts about two issues. I feel the responsibility for them > because I had the opportunity to solve them before, but I lost it. > [...] > > Serhiy, what are the bugs.python.org issue numbers for these? Are they > marked as "release blocker"? > > -- > Ned Deily > nad at python.org -- [] > > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers > Code of Conduct: https://www.python.org/psf/codeofconduct/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Thu May 24 12:26:27 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Thu, 24 May 2018 19:26:27 +0300 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> Message-ID: <6506e386-1be7-db3e-626b-9bcac8d5b7dd@gmail.com> 24.05.18 19:02, Ned Deily ????: > On May 24, 2018, at 11:35, Serhiy Storchaka wrote: >> I have doubts about two issues. I feel the responsibility for them because I had the opportunity to solve them before, but I lost it. > [...] > > Serhiy, what are the bugs.python.org issue numbers for these? Are they marked as "release blocker"? For docstring in AST: https://bugs.python.org/issue32911 Inada's patch looked complex (actually it mostly restored the code before his previous change). We didn't know about IPython and we decided that it is not worth to change this code at this stage (after beta2). And definitely it will be later to do this after rc1. For pickling of typing types: https://bugs.python.org/issue32873 Ivan fixed cases supported before 3.7. 
They now are backward and forward compatible. But cases not supported before 3.7 (like List[int]) now produce fragile pickles. From levkivskyi at gmail.com Thu May 24 12:42:27 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Thu, 24 May 2018 12:42:27 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6506e386-1be7-db3e-626b-9bcac8d5b7dd@gmail.com> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> <6506e386-1be7-db3e-626b-9bcac8d5b7dd@gmail.com> Message-ID: > But cases not supported before 3.7 (like List[int]) now produce fragile pickles. List[int] pickled in 3.7 can't be un-pickled in 3.6, but I wouldn't worry too much about this because it never worked in 3.6. I remember you proposed using __getitem__ in __reduce__, but I am not sure it is a better way, although it will fix the above problem. I don't think this one is a blocker and we can move this discussion back to b.p.o., unless you have some particular concerns. The AST one however looks more serious. -- Ivan On 24 May 2018 at 12:26, Serhiy Storchaka wrote: > 24.05.18 19:02, Ned Deily ????: > >> On May 24, 2018, at 11:35, Serhiy Storchaka wrote: >> >>> I have doubts about two issues. I feel the responsibility for them >>> because I had the opportunity to solve them before, but I lost it. >>> >> [...] >> >> Serhiy, what are the bugs.python.org issue numbers for these? Are they >> marked as "release blocker"? >> > > For docstring in AST: https://bugs.python.org/issue32911 > > Inada's patch looked complex (actually it mostly restored the code before > his previous change). We didn't know about IPython and we decided that it > is not worth to change this code at this stage (after beta2). And > definitely it will be later to do this after rc1. > > For pickling of typing types: https://bugs.python.org/issue32873 > > Ivan fixed cases supported before 3.7. They now are backward and forward > compatible. But cases not supported before 3.7 (like List[int]) now produce > fragile pickles. > > > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers > Code of Conduct: https://www.python.org/psf/codeofconduct/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Thu May 24 13:08:19 2018 From: nad at python.org (Ned Deily) Date: Thu, 24 May 2018 13:08:19 -0400 Subject: [Python-Dev] [python-committers] FINAL WEEK FOR 3.7.0 CHANGES! In-Reply-To: <6506e386-1be7-db3e-626b-9bcac8d5b7dd@gmail.com> References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org> <6C79D66A-548E-45BF-BD23-F35A90FAE49A@python.org> <6506e386-1be7-db3e-626b-9bcac8d5b7dd@gmail.com> Message-ID: On May 24, 2018, at 12:26, Serhiy Storchaka wrote: > 24.05.18 19:02, Ned Deily ????: >> On May 24, 2018, at 11:35, Serhiy Storchaka wrote: >>> I have doubts about two issues. I feel the responsibility for them because I had the opportunity to solve them before, but I lost it. >> [...] >> >> Serhiy, what are the bugs.python.org issue numbers for these? Are they marked as "release blocker"? > For docstring in AST: https://bugs.python.org/issue32911 > > Inada's patch looked complex (actually it mostly restored the code before his previous change). 
> We didn't know about IPython and we decided that it is not worth to change
> this code at this stage (after beta2). And definitely it will be later to
> do this after rc1.

We have had many discussions about this issue earlier and, while there were
arguments made for more than one approach, I believe we reached agreement
that this was a deliberate incompatibility that we and our users could live
with. The issue has been closed since 2018-03-18. At some point, we need to
move on. However, if additional exposure downstream has identified
significant new problems, then the issue should be re-opened and a specific
proposal made. BTW, do we know what the IPython folks think about this?

But there still seem to be disagreements about whether anything needs to be
changed. As I commented yesterday, I *really* don't want to keep revisiting
this but I am not going to make a technical call. Without an open "release
blocker" issue, though, nothing is going to change for 3.7.0rc1. If you (or
anyone else) feels strongly enough about it, you should re-open the issue
now and mark it as a "release blocker" and we should discuss the
implications and possible plans of action in the issue.

> For pickling of typing types: https://bugs.python.org/issue32873
>
> Ivan fixed cases supported before 3.7. They now are backward and forward
> compatible. But cases not supported before 3.7 (like List[int]) now
> produce fragile pickles.

That issue was closed by Ivan and there have been no comments on it since
2018-04-04. I'll defer to his recent reply in this thread.

--
  Ned Deily
  nad at python.org -- []

From solipsis at pitrou.net  Thu May 24 13:57:18 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 24 May 2018 19:57:18 +0200
Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available
Message-ID: <20180524195718.4a030167@fsol>

Hi,

While PEP 574 (pickle protocol 5 with out-of-band data) is still in
draft status, I've made available an implementation in branch "pickle5"
in my GitHub fork of CPython:
https://github.com/pitrou/cpython/tree/pickle5

Also I've published an experimental backport on PyPI, for Python 3.6
and 3.7. This should help people play with the new API and features
without having to compile Python:
https://pypi.org/project/pickle5/

Any feedback is welcome.

Regards

Antoine.

From vstinner at redhat.com  Thu May 24 16:14:57 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Thu, 24 May 2018 22:14:57 +0200
Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available
In-Reply-To: <20180524195718.4a030167@fsol>
References: <20180524195718.4a030167@fsol>
Message-ID:

Link to the PEP: "PEP 574 -- Pickle protocol 5 with out-of-band data"
https://www.python.org/dev/peps/pep-0574/

Victor

2018-05-24 19:57 GMT+02:00 Antoine Pitrou :
>
> Hi,
>
> While PEP 574 (pickle protocol 5 with out-of-band data) is still in
> draft status, I've made available an implementation in branch "pickle5"
> in my GitHub fork of CPython:
> https://github.com/pitrou/cpython/tree/pickle5
>
> Also I've published an experimental backport on PyPI, for Python 3.6
> and 3.7. This should help people play with the new API and features
> without having to compile Python:
> https://pypi.org/project/pickle5/
>
> Any feedback is welcome.
>
> Regards
>
> Antoine.
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com
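For anyone who wants to try the backport, here is a minimal sketch of the
out-of-band round-trip described in the PEP, assuming the pickle5 package
exposes the same dumps()/loads()/PickleBuffer interface as the in-tree
implementation; the Blob class below is only an illustration, not something
taken from the PEP or the backport:

    import pickle5 as pickle   # "pip install pickle5" on Python 3.6 / 3.7

    class Blob:
        """Toy wrapper whose payload may travel out-of-band under protocol 5."""
        def __init__(self, payload):
            self.payload = bytearray(payload)

        def __reduce_ex__(self, protocol):
            if protocol >= 5:
                # Expose the payload as a PickleBuffer so the pickler can hand
                # it to buffer_callback instead of copying it into the stream.
                return Blob, (pickle.PickleBuffer(self.payload),)
            return Blob, (bytes(self.payload),)

    blob = Blob(b"x" * (16 * 1024 * 1024))
    buffers = []
    data = pickle.dumps(blob, protocol=5, buffer_callback=buffers.append)
    # "data" stays small; the 16 MiB payload is handed over via "buffers"
    # and can be transmitted or shared separately from the pickle stream.
    restored = pickle.loads(data, buffers=buffers)
    assert bytes(restored.payload) == bytes(blob.payload)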
From lists at janc.be  Thu May 24 22:09:34 2018
From: lists at janc.be (Jan Claeys)
Date: Fri, 25 May 2018 04:09:34 +0200
Subject: [Python-Dev] The history of PyXML
In-Reply-To:
References:
Message-ID: <3e83f0be4977de6006f4efdf9019075f7924b560.camel@janc.be>

On Thu, 2018-05-17 at 15:18 +0300, Serhiy Storchaka wrote:
> Does anyone has the full copy of the PyXML repository, with the
> complete history?
>
> This library was included in Python 2.1 as the xml package and is
> not maintained as a separate project since 2004. It's home on
> SourceForge was removed. I have found sources of the last PyXML
> version (0.8.4), but without history.

Did you try asking SourceForge if they still have a backup copy?

--
Jan Claeys

From nad at python.org  Fri May 25 01:33:43 2018
From: nad at python.org (Ned Deily)
Date: Fri, 25 May 2018 01:33:43 -0400
Subject: [Python-Dev] 3.7.0rc1 Delayed [was] FINAL WEEK FOR 3.7.0 CHANGES!
In-Reply-To: <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org>
References: <6304cea3-52fc-fd19-b920-016715f6f5d1@gmail.com> <33CAAEBB-8DE3-494D-9306-415D70D9A27B@python.org>
Message-ID: <45628031-F42C-4B3B-8B1C-F3BE3B49BBA6@python.org>

On May 24, 2018, at 03:23, Ned Deily wrote:
> On May 23, 2018, at 09:13, Ned Deily wrote:
>> On May 23, 2018, at 07:45, Serhiy Storchaka wrote:
>>> Is it possible to add yet one beta instead?
>>> CI was broken for few latest days, tests are not passed on my computer still (and fail on some buildbots), updating What's New exposed new features which need additional testing (and maybe fixing or reverting), and I'm not comfortable about some changes which would be harder to fix after the release.
>> it is possible but there's no point in doing either another beta or a release candidate until we understand and address the current blocking issues, like the major buildbot failures. We have another 24 hours until rc1 was planned to be tagged. Let's keep working on the known issues and we will make a decision then.
> An update: thanks to a lot of effort over the past day by a number of
> people (including Victor, Serhiy, Christian, Zach, and others I'm sure
> I'm forgetting - my apologies), we have addressed all of the "release
> blocker" issues and all but one of the persistent failures on the 3.7
> stable buildbots. We should have the couple of remaining "deferred
> blockers" including the remaining stable buildbots in green status by
> later today. At that point, we will be ready to tag 3.7.0rc1 and begin
> producing the release candidate artifacts.

Further update: some good news and some changes. The good news is that we
have resolutions for all of the previous release and deferred blockers.
Thanks to a number of people for continuing to help get the remaining
stable buildbot issues taken care of along with some lingering bugs. The
not-quite-as-good news is that we have had more discussions about some
unexpected incompatibilities that have shown up with downstream user
testing with the AST docstrings changes in place (see bpo-32911). We have
had some previous discussions about the expected user impact and, earlier
in the beta phase, I encouraged us to stay the course with the feature as
implemented.
But I am now persuaded that we owe it to our users to take one more look at
this to make sure we do not force them to make changes for 3.7 and then once
again for 3.8. More details are in the bug tracker issue; I strongly
encourage those of us who have been involved with this to "vote" there on
the proposal to either (A) proceed with the release of the current
implementation in 3.7.0 or (B) revert the feature in 3.7.0 and retarget for
3.8. Should the consensus be to revert (B), we will plan to have one more
fast-track beta release (b5) prior to the release candidate, in order to
allow downstream users to test their projects with the removal.

PLEASE, keep the discussion about this on the bug tracker (and not here!)
and keep it brief so we can move forward quickly. Because of the upcoming
3-day holiday weekend in some countries, I have set Tue 2018-05-29 18:00
UTC as a cutoff for "voting" but, if a clear consensus emerges earlier, we
will likely cut the discussion short. So chime in now on the bug tracker if
you have a stake in this issue.

https://bugs.python.org/issue32911

This does mean that yesterday's "last chance" has been extended a bit, at
most a few days. I will let you know as soon as we have made a decision
about the feature and will provide updated 3.7.0 schedule info at that time.

--Ned

--
  Ned Deily
  nad at python.org -- []

From skip.montanaro at gmail.com  Fri May 25 09:11:59 2018
From: skip.montanaro at gmail.com (Skip Montanaro)
Date: Fri, 25 May 2018 08:11:59 -0500
Subject: [Python-Dev] "make test" routinely fails to terminate
In-Reply-To:
References: <20180520093436.100493a4@fsol>
Message-ID:

> me> On the 3.7 branch, "make test" routinely fails to terminate.
> Antoine> Can you try to rebuild Python? Use "make distclean" if that helps.
> Thanks, Antoine. That solved the termination problem. I still have problems
> with test_asyncio failing, but I can live with that for now.

Final follow-up. I finally got myself a workable, updateable 3.7 branch in
my fork. It looks like the asyncio issues are also resolved on both 3.7 and
master.

Skip

From remi.lapeyre at vint.fr  Thu May 24 08:55:32 2018
From: remi.lapeyre at vint.fr (=?utf-8?Q?R=C3=A9mi_Lapeyre?=)
Date: Thu, 24 May 2018 14:55:32 +0200
Subject: [Python-Dev] Add __reversed__ methods for dict
Message-ID:

Hi,

since dict keys are sorted by their insertion order since Python 3.6 and that
it's part of Python specs since 3.7 a proposal has been made in bpo-33462 to
add the __reversed__ method to dict and dict views.

Concerns have been raised in the comments that this feature may add too much
bloat in the core interpreter and be harmful for other Python implementations.

Given the different issues this change creates, I see three possibilities:

1. Accept the proposal as it is for dict and dict views, this would add about
300 lines and three new types in dictobject.c

2. Accept the proposal only for dict, this would add about 80 lines and one
new type in dictobject.c while still being useful for some use cases

3. Drop the proposal as a whole; while having some use, reversed(dict(a=1, b=2))
may not be very common and could be done using OrderedDict instead.

What's your stance on the issue?

Best regards,
Rémi Lapeyre

From guido at python.org  Fri May 25 11:46:56 2018
From: guido at python.org (Guido van Rossum)
Date: Fri, 25 May 2018 08:46:56 -0700
Subject: [Python-Dev] Add __reversed__ methods for dict
In-Reply-To:
References:
Message-ID:

Please go find some real world code that would benefit from this.
Don't make up examples, just show some code in a repository (public if possible, but private is okay, as long as you can quote small amounts of code from it) where te existence of reverse iteration over a dict would have been helpful. On Thu, May 24, 2018 at 5:55 AM, R?mi Lapeyre wrote: > > Hi, > > since dict keys are sorted by their insertion order since Python 3.6 and > that > it?s part of Python specs since 3.7 a proposal has been made in bpo-33462 > to > add the __reversed__ method to dict and dict views. > > Concerns have been raised in the comments that this feature may add too > much > bloat in the core interpreter and be harmful for other Python > implementations. > > Given the different issues this change creates, I see three possibilities: > > 1. Accept the proposal has it is for dict and dict views, this would add > about > 300 lines and three new types in dictobject.c > > 2. Accept the proposal only for dict, this would add about 80 lines and one > new type in dictobject.c while still being useful for some use cases > > 3. Drop the proposal as the whole, while having some use, > reversed(dict(a=1, b=2)) > may not be very common and could be done using OrderedDict instead. > > What?s your stance on the issue ? > > Best regards, > R?mi Lapeyre > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Fri May 25 11:48:01 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 25 May 2018 08:48:01 -0700 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: (Also this probably belongs in python-ideas, unless there's already a bugs.python.org issue for it -- but you didn't mention that so I assume it's just an idea? How did you reach the line count estimates?) On Fri, May 25, 2018 at 8:46 AM, Guido van Rossum wrote: > Please go find some real world code that would benefit from this. Don't > make up examples, just show some code in a repository (public if possible, > but private is okay, as long as you can quote small amounts of code from > it) where te existence of reverse iteration over a dict would have been > helpful. > > On Thu, May 24, 2018 at 5:55 AM, R?mi Lapeyre > wrote: > >> >> Hi, >> >> since dict keys are sorted by their insertion order since Python 3.6 and >> that >> it?s part of Python specs since 3.7 a proposal has been made in bpo-33462 >> to >> add the __reversed__ method to dict and dict views. >> >> Concerns have been raised in the comments that this feature may add too >> much >> bloat in the core interpreter and be harmful for other Python >> implementations. >> >> Given the different issues this change creates, I see three possibilities: >> >> 1. Accept the proposal has it is for dict and dict views, this would add >> about >> 300 lines and three new types in dictobject.c >> >> 2. Accept the proposal only for dict, this would add about 80 lines and >> one >> new type in dictobject.c while still being useful for some use cases >> >> 3. Drop the proposal as the whole, while having some use, >> reversed(dict(a=1, b=2)) >> may not be very common and could be done using OrderedDict instead. >> >> What?s your stance on the issue ? 
>> >> Best regards, >> R?mi Lapeyre >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/guido% >> 40python.org >> > > > > -- > --Guido van Rossum (python.org/~guido) > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Fri May 25 11:50:29 2018 From: vstinner at redhat.com (Victor Stinner) Date: Fri, 25 May 2018 17:50:29 +0200 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: It looks like an optimization, since you can already do something like reversed(list(d)). Do you have benchmark numbers to see the benefit of your change? Even if reversed(list(d)) is slow, I'm not sure that it's worth it to optimize it, since it's a rare usecase. Victor 2018-05-24 14:55 GMT+02:00 R?mi Lapeyre : > > Hi, > > since dict keys are sorted by their insertion order since Python 3.6 and that > it?s part of Python specs since 3.7 a proposal has been made in bpo-33462 to > add the __reversed__ method to dict and dict views. > > Concerns have been raised in the comments that this feature may add too much > bloat in the core interpreter and be harmful for other Python implementations. > > Given the different issues this change creates, I see three possibilities: > > 1. Accept the proposal has it is for dict and dict views, this would add about > 300 lines and three new types in dictobject.c > > 2. Accept the proposal only for dict, this would add about 80 lines and one > new type in dictobject.c while still being useful for some use cases > > 3. Drop the proposal as the whole, while having some use, reversed(dict(a=1, b=2)) > may not be very common and could be done using OrderedDict instead. > > What?s your stance on the issue ? > > Best regards, > R?mi Lapeyre > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From vstinner at redhat.com Fri May 25 11:53:58 2018 From: vstinner at redhat.com (Victor Stinner) Date: Fri, 25 May 2018 17:53:58 +0200 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: INADA Naoki asked R?mi Lapeyre in https://bugs.python.org/issue33462 to start a discussion on python-dev. Victor 2018-05-25 17:48 GMT+02:00 Guido van Rossum : > (Also this probably belongs in python-ideas, unless there's already a > bugs.python.org issue for it -- but you didn't mention that so I assume it's > just an idea? How did you reach the line count estimates?) > > On Fri, May 25, 2018 at 8:46 AM, Guido van Rossum wrote: >> >> Please go find some real world code that would benefit from this. Don't >> make up examples, just show some code in a repository (public if possible, >> but private is okay, as long as you can quote small amounts of code from it) >> where te existence of reverse iteration over a dict would have been helpful. >> >> On Thu, May 24, 2018 at 5:55 AM, R?mi Lapeyre >> wrote: >>> >>> >>> Hi, >>> >>> since dict keys are sorted by their insertion order since Python 3.6 and >>> that >>> it?s part of Python specs since 3.7 a proposal has been made in bpo-33462 >>> to >>> add the __reversed__ method to dict and dict views. 
>>> >>> Concerns have been raised in the comments that this feature may add too >>> much >>> bloat in the core interpreter and be harmful for other Python >>> implementations. >>> >>> Given the different issues this change creates, I see three >>> possibilities: >>> >>> 1. Accept the proposal has it is for dict and dict views, this would add >>> about >>> 300 lines and three new types in dictobject.c >>> >>> 2. Accept the proposal only for dict, this would add about 80 lines and >>> one >>> new type in dictobject.c while still being useful for some use cases >>> >>> 3. Drop the proposal as the whole, while having some use, >>> reversed(dict(a=1, b=2)) >>> may not be very common and could be done using OrderedDict instead. >>> >>> What?s your stance on the issue ? >>> >>> Best regards, >>> R?mi Lapeyre >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> >> >> >> >> -- >> --Guido van Rossum (python.org/~guido) > > > > > -- > --Guido van Rossum (python.org/~guido) > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > From status at bugs.python.org Fri May 25 12:09:55 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 25 May 2018 18:09:55 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180525160955.38DC256554@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-05-18 - 2018-05-25) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 6699 (+13) closed 38700 (+63) total 45399 (+76) Open issues with patches: 2638 Issues opened (54) ================== #27485: urllib.splitport -- is it official or not? https://bugs.python.org/issue27485 reopened by serhiy.storchaka #27535: Ignored ResourceWarning warnings leak memory in warnings regis https://bugs.python.org/issue27535 reopened by vstinner #33330: Better error handling in PyImport_Cleanup() https://bugs.python.org/issue33330 reopened by vstinner #33573: statistics.median does not work with ordinal scale https://bugs.python.org/issue33573 opened by W deW #33575: Python relies on C undefined behavior float-cast-overflow https://bugs.python.org/issue33575 opened by gregory.p.smith #33576: Make exception wrapping less intrusive for __set_name__ calls https://bugs.python.org/issue33576 opened by ncoghlan #33578: cjkcodecs missing getstate and setstate implementations https://bugs.python.org/issue33578 opened by libcthorne #33579: calendar.timegm not always an inverse of time.gmtime https://bugs.python.org/issue33579 opened by eitan.adler #33581: Document "optional components that are commonly included in Py https://bugs.python.org/issue33581 opened by Antony.Lee #33582: formatargspec deprecated but does nto emit DeprecationWarning. 
https://bugs.python.org/issue33582 opened by mbussonn #33586: 2.7.15 missing release notes on download page https://bugs.python.org/issue33586 opened by ericvw #33587: inspect.getsource performs unnecessary filesystem stat call wh https://bugs.python.org/issue33587 opened by Pankaj Pandey #33590: sched.enter priority has no impact on execution https://bugs.python.org/issue33590 opened by sahilmn #33591: ctypes does not support fspath protocol https://bugs.python.org/issue33591 opened by mrh1997 #33592: Document contextvars C API https://bugs.python.org/issue33592 opened by Elvis.Pranskevichus #33594: add deprecation since 3.5 for a few methods of inspect. https://bugs.python.org/issue33594 opened by mbussonn #33595: FIx references to lambda "arguments" https://bugs.python.org/issue33595 opened by adelfino #33597: Compact PyGC_Head https://bugs.python.org/issue33597 opened by inada.naoki #33598: ActiveState Recipes links in docs, and the apparent closure of https://bugs.python.org/issue33598 opened by tritium #33600: [EASY DOC] Python 2: document that platform.linux_distribution https://bugs.python.org/issue33600 opened by vstinner #33601: [EASY DOC] Py_UTF8Mode is not documented https://bugs.python.org/issue33601 opened by serhiy.storchaka #33602: Remove set and queue references from Data Types https://bugs.python.org/issue33602 opened by adelfino #33603: Subprocess Thread handles grow with each call and aren't relea https://bugs.python.org/issue33603 opened by GranPrego #33604: HMAC default to MD5 marked as to be removed in 3.6 https://bugs.python.org/issue33604 opened by mbussonn #33605: Detect accessing event loop from a different thread outside of https://bugs.python.org/issue33605 opened by hniksic #33606: Improve logging performance when logger disabled https://bugs.python.org/issue33606 opened by vinay.sajip #33607: [subinterpreters] Explicitly track object ownership (and alloc https://bugs.python.org/issue33607 opened by eric.snow #33608: [subinterpreters] Add a cross-interpreter-safe mechanism to in https://bugs.python.org/issue33608 opened by eric.snow #33609: Document that dicts preserve insertion order https://bugs.python.org/issue33609 opened by yselivanov #33610: IDLE: Make multiple improvements to CodeContext https://bugs.python.org/issue33610 opened by terry.reedy #33613: test_multiprocessing_fork: test_semaphore_tracker_sigint() fai https://bugs.python.org/issue33613 opened by vstinner #33614: Compilation of Python fails on AMD64 Windows8.1 Refleaks 3.x https://bugs.python.org/issue33614 opened by vstinner #33615: test__xxsubinterpreters crashed on x86 Gentoo Refleaks 3.x https://bugs.python.org/issue33615 opened by vstinner #33616: typing.NoReturn is undocumented https://bugs.python.org/issue33616 opened by srittau #33617: subprocess.Popen etc do not accept os.PathLike in passed seque https://bugs.python.org/issue33617 opened by altendky #33618: Support TLS 1.3 https://bugs.python.org/issue33618 opened by christian.heimes #33623: Fix possible SIGSGV when asyncio.Future is created in __del__ https://bugs.python.org/issue33623 opened by yselivanov #33624: Implement subclass hooks for asyncio abstract classes https://bugs.python.org/issue33624 opened by asvetlov #33625: Release GIL for grp.getgr{nam,gid} and pwd.getpw{nam,uid} https://bugs.python.org/issue33625 opened by wg #33627: test-complex of test_numeric_tower.test_complex() crashes inte https://bugs.python.org/issue33627 opened by vstinner #33630: test_posix: TestPosixSpawn fails on PPC64 Fedora 3.x 
https://bugs.python.org/issue33630 opened by vstinner #33632: undefined behaviour: signed integer overflow in threadmodule.c https://bugs.python.org/issue33632 opened by pitrou #33635: OSError when using pathlib.Path.rglob() to list device files https://bugs.python.org/issue33635 opened by Victor Domingos #33637: pip cannot build extensions for debug Python https://bugs.python.org/issue33637 opened by Ivan.Pozdeev #33638: condition lock not re-acquired https://bugs.python.org/issue33638 opened by christof #33639: Use high-performance os.sendfile() in shutil.copy* https://bugs.python.org/issue33639 opened by giampaolo.rodola #33640: uuid: endian of the bytes argument is not documented https://bugs.python.org/issue33640 opened by vstinner #33641: Add links to RFCs https://bugs.python.org/issue33641 opened by serhiy.storchaka #33642: IDLE: Use variable number of lines in CodeContext https://bugs.python.org/issue33642 opened by cheryl.sabella #33643: Mock functions with autospec STILL don't support assert_called https://bugs.python.org/issue33643 opened by dybi #33644: Fix signatures of tp_finalize handlers in testing code. https://bugs.python.org/issue33644 opened by serhiy.storchaka #33645: Importing bs4 fails with -3 option in Python 2.7.15 https://bugs.python.org/issue33645 opened by fschulze #33647: Make string.replace accept a dict instead of two arguments https://bugs.python.org/issue33647 opened by paalped #33648: unused with_c_locale_warning option in configure should be rem https://bugs.python.org/issue33648 opened by eitan.adler Most recent 15 issues with no replies (15) ========================================== #33644: Fix signatures of tp_finalize handlers in testing code. https://bugs.python.org/issue33644 #33643: Mock functions with autospec STILL don't support assert_called https://bugs.python.org/issue33643 #33641: Add links to RFCs https://bugs.python.org/issue33641 #33635: OSError when using pathlib.Path.rglob() to list device files https://bugs.python.org/issue33635 #33624: Implement subclass hooks for asyncio abstract classes https://bugs.python.org/issue33624 #33616: typing.NoReturn is undocumented https://bugs.python.org/issue33616 #33606: Improve logging performance when logger disabled https://bugs.python.org/issue33606 #33602: Remove set and queue references from Data Types https://bugs.python.org/issue33602 #33600: [EASY DOC] Python 2: document that platform.linux_distribution https://bugs.python.org/issue33600 #33598: ActiveState Recipes links in docs, and the apparent closure of https://bugs.python.org/issue33598 #33595: FIx references to lambda "arguments" https://bugs.python.org/issue33595 #33594: add deprecation since 3.5 for a few methods of inspect. https://bugs.python.org/issue33594 #33591: ctypes does not support fspath protocol https://bugs.python.org/issue33591 #33586: 2.7.15 missing release notes on download page https://bugs.python.org/issue33586 #33582: formatargspec deprecated but does nto emit DeprecationWarning. https://bugs.python.org/issue33582 Most recent 15 issues waiting for review (15) ============================================= #33648: unused with_c_locale_warning option in configure should be rem https://bugs.python.org/issue33648 #33645: Importing bs4 fails with -3 option in Python 2.7.15 https://bugs.python.org/issue33645 #33644: Fix signatures of tp_finalize handlers in testing code. 
https://bugs.python.org/issue33644 #33642: IDLE: Use variable number of lines in CodeContext https://bugs.python.org/issue33642 #33641: Add links to RFCs https://bugs.python.org/issue33641 #33639: Use high-performance os.sendfile() in shutil.copy* https://bugs.python.org/issue33639 #33630: test_posix: TestPosixSpawn fails on PPC64 Fedora 3.x https://bugs.python.org/issue33630 #33625: Release GIL for grp.getgr{nam,gid} and pwd.getpw{nam,uid} https://bugs.python.org/issue33625 #33623: Fix possible SIGSGV when asyncio.Future is created in __del__ https://bugs.python.org/issue33623 #33618: Support TLS 1.3 https://bugs.python.org/issue33618 #33617: subprocess.Popen etc do not accept os.PathLike in passed seque https://bugs.python.org/issue33617 #33616: typing.NoReturn is undocumented https://bugs.python.org/issue33616 #33609: Document that dicts preserve insertion order https://bugs.python.org/issue33609 #33604: HMAC default to MD5 marked as to be removed in 3.6 https://bugs.python.org/issue33604 #33602: Remove set and queue references from Data Types https://bugs.python.org/issue33602 Top 10 most discussed issues (10) ================================= #33614: Compilation of Python fails on AMD64 Windows8.1 Refleaks 3.x https://bugs.python.org/issue33614 13 msgs #32911: Doc strings no longer stored in body of AST https://bugs.python.org/issue32911 10 msgs #33462: reversible dict https://bugs.python.org/issue33462 10 msgs #33597: Compact PyGC_Head https://bugs.python.org/issue33597 8 msgs #33615: test__xxsubinterpreters crashed on x86 Gentoo Refleaks 3.x https://bugs.python.org/issue33615 8 msgs #33618: Support TLS 1.3 https://bugs.python.org/issue33618 8 msgs #33355: Windows 10 buildbot: 15 min timeout on test_mmap.test_large_fi https://bugs.python.org/issue33355 7 msgs #33521: Add 1.32x faster C implementation of asyncio.isfuture(). https://bugs.python.org/issue33521 7 msgs #33579: calendar.timegm not always an inverse of time.gmtime https://bugs.python.org/issue33579 7 msgs #19251: bitwise ops for bytes of equal length https://bugs.python.org/issue19251 6 msgs Issues closed (61) ================== #19950: Document that unittest.TestCase.__init__ is called once per te https://bugs.python.org/issue19950 closed by gregory.p.smith #20941: pytime.c:184 and pytime.c:218: runtime error, outside the rang https://bugs.python.org/issue20941 closed by vstinner #26819: _ProactorReadPipeTransport pause_reading()/resume_reading() br https://bugs.python.org/issue26819 closed by asvetlov #28547: Python to use Windows Certificate Store https://bugs.python.org/issue28547 closed by Jean-Philippe Landry #29428: Doctest documentation unclear about multi-line fixtures https://bugs.python.org/issue29428 closed by willingc #30877: possibe typo in json/scanner.py https://bugs.python.org/issue30877 closed by serhiy.storchaka #30940: Documentation for round() is incorrect. 
https://bugs.python.org/issue30940 closed by serhiy.storchaka #31106: os.posix_fallocate() generate exception with errno 0 https://bugs.python.org/issue31106 closed by ned.deily #31493: IDLE cond context: fix code update and font update timers https://bugs.python.org/issue31493 closed by terry.reedy #31868: Null pointer dereference in ndb.ndbm get when used with a defa https://bugs.python.org/issue31868 closed by serhiy.storchaka #32708: test_sendfile() hangs on AMD64 FreeBSD 10.x Shared 3.x buildbo https://bugs.python.org/issue32708 closed by vstinner #32831: IDLE: Add docstrings and tests for codecontext https://bugs.python.org/issue32831 closed by terry.reedy #33037: Skip sending/receiving after SSL transport closing https://bugs.python.org/issue33037 closed by asvetlov #33109: argparse: make new 'required' argument to add_subparsers defau https://bugs.python.org/issue33109 closed by ned.deily #33263: Asyncio server enters an invalid state after a request with SO https://bugs.python.org/issue33263 closed by asvetlov #33321: Add a Linux clang ubsan undefined behavior sanitizer buildbot https://bugs.python.org/issue33321 closed by gregory.p.smith #33353: test_asyncio: test_sock_sendfile_mix_with_regular_send() hangs https://bugs.python.org/issue33353 closed by vstinner #33354: Python2: test_ssl fails on non-ASCII path https://bugs.python.org/issue33354 closed by vstinner #33421: Missing documentation for typing.AsyncContextManager https://bugs.python.org/issue33421 closed by levkivskyi #33430: Import secrets module in secrets examples https://bugs.python.org/issue33430 closed by steven.daprano #33441: Expose the sigset_t converter via private API https://bugs.python.org/issue33441 closed by serhiy.storchaka #33447: Asynchronous lambda syntax https://bugs.python.org/issue33447 closed by yselivanov #33454: Mismatched C function signature in _xxsubinterpreters.channel_ https://bugs.python.org/issue33454 closed by serhiy.storchaka #33475: Fix converting AST expression to string and optimize parenthes https://bugs.python.org/issue33475 closed by serhiy.storchaka #33514: async and await as keywords not mentioned in What???s New In P https://bugs.python.org/issue33514 closed by hroncok #33516: unittest.mock: Add __round__ to supported magicmock methods https://bugs.python.org/issue33516 closed by vstinner #33518: Add PEP to glossary https://bugs.python.org/issue33518 closed by vstinner #33528: os.getentropy support https://bugs.python.org/issue33528 closed by David Carlier #33537: Help on importlib.resources outputs the builtin open descripti https://bugs.python.org/issue33537 closed by serhiy.storchaka #33540: socketserver: Add an opt-in option to get Python 3.6 behaviour https://bugs.python.org/issue33540 closed by vstinner #33541: Remove private and apparently unused __pad function https://bugs.python.org/issue33541 closed by belopolsky #33542: _ipconfig_getnode incorrectly selects a DUID as a MAC address https://bugs.python.org/issue33542 closed by serhiy.storchaka #33544: Asyncio Event.wait() is a hold over from before awaitable, and https://bugs.python.org/issue33544 closed by asvetlov #33546: asyncio.Condition should become awaitable in 3.9 https://bugs.python.org/issue33546 closed by asvetlov #33556: leftover thread crumb in threading.ident docstring https://bugs.python.org/issue33556 closed by zach.ware #33565: strange tracemalloc results https://bugs.python.org/issue33565 closed by thehesiod #33574: Conversion of Number to String(str(number)) https://bugs.python.org/issue33574 closed 
by eric.smith
#33577: remove wrapping of __set_name__ exceptions in RuntimeError
     https://bugs.python.org/issue33577  closed by carljm
#33580: Make binary/text file glossary entries follow most common "see
     https://bugs.python.org/issue33580  closed by serhiy.storchaka
#33583: PyObject_GC_Resize() doesn't relink GCHead
     https://bugs.python.org/issue33583  closed by inada.naoki
#33584: Fix several minor bugs in asyncio
     https://bugs.python.org/issue33584  closed by serhiy.storchaka
#33585: re.sub calls repl function one time too many for catch-all reg
     https://bugs.python.org/issue33585  closed by serhiy.storchaka
#33588: Unicode function arguments aren't preserved
     https://bugs.python.org/issue33588  closed by steven.daprano
#33589: Remove dummy member in GCHead
     https://bugs.python.org/issue33589  closed by inada.naoki
#33593: Support heapq on typed arrays?
     https://bugs.python.org/issue33593  closed by rhettinger
#33596: fix memory leak in lib/json/scanner.py py_make_scanner
     https://bugs.python.org/issue33596  closed by serhiy.storchaka
#33599: Copying objects subclassed from SimpleNamespace doesn't work
     https://bugs.python.org/issue33599  closed by serhiy.storchaka
#33611: Fatal Python error: Py_Initialize: unable to load the file sys
     https://bugs.python.org/issue33611  closed by vstinner
#33612: Assertion failure in PyThreadState_Clear
     https://bugs.python.org/issue33612  closed by vstinner
#33619: libffi detection via pkg-config is broken
     https://bugs.python.org/issue33619  closed by benjamin.peterson
#33620: requests.Session doesn't properly handle closed keep-alive ses
     https://bugs.python.org/issue33620  closed by Jonathan Lynch
#33621: repr(threading._DummyThread) always fails.
     https://bugs.python.org/issue33621  closed by fabioz
#33622: Fix errors handling in the garbage collector
     https://bugs.python.org/issue33622  closed by serhiy.storchaka
#33626: test_sendfile_fallback_close_peer_in_middle_of_receiving() fai
     https://bugs.python.org/issue33626  closed by vstinner
#33628: IDLE: Code cleanup on codecontext
     https://bugs.python.org/issue33628  closed by terry.reedy
#33629: test_importlib creates a coredump on AMD64 FreeBSD 10.x Shared
     https://bugs.python.org/issue33629  closed by vstinner
#33631: [ValueError] _strptime.py can't handle 12-hr format strings us
     https://bugs.python.org/issue33631  closed by ammar2
#33633: smtplib msg['To] = appends instead of assigning
     https://bugs.python.org/issue33633  closed by r.david.murray
#33634: Buildbot configuration issue on Windows7 buildbots
     https://bugs.python.org/issue33634  closed by zach.ware
#33636: Unexpected behavior with * and arrays
     https://bugs.python.org/issue33636  closed by steven.daprano
#33646: os.fspath() bypasses __fspath__ for str subclasses
     https://bugs.python.org/issue33646  closed by serhiy.storchaka

From solipsis at pitrou.net Fri May 25 12:32:01 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Fri, 25 May 2018 18:32:01 +0200
Subject: [Python-Dev] Add __reversed__ methods for dict
References: 
Message-ID: <20180525183201.38a60491@fsol>

It's worth noting that OrderedDict already supports reversed().
The argument could go both ways:

1. dict is similar to OrderedDict nowadays, so it should support
   reversed() too;

2. you can use OrderedDict to signal explicitly that you care about
   ordering; no need to add anything to dict.

Regards

Antoine.


On Thu, 24 May 2018 14:55:32 +0200
Rémi Lapeyre wrote:
>
> Hi,
>
> since dict keys are sorted by their insertion order since Python 3.6 and that
> it's part of Python specs since 3.7 a proposal has been made in bpo-33462 to
> add the __reversed__ method to dict and dict views.
>
> Concerns have been raised in the comments that this feature may add too much
> bloat in the core interpreter and be harmful for other Python implementations.
>
> Given the different issues this change creates, I see three possibilities:
>
> 1. Accept the proposal has it is for dict and dict views, this would add about
> 300 lines and three new types in dictobject.c
>
> 2. Accept the proposal only for dict, this would add about 80 lines and one
> new type in dictobject.c while still being useful for some use cases
>
> 3. Drop the proposal as the whole, while having some use, reversed(dict(a=1, b=2))
> may not be very common and could be done using OrderedDict instead.
>
> What's your stance on the issue ?
>
> Best regards,
> Rémi Lapeyre
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/python-python-dev%40m.gmane.org

From olivier.grisel at ensta.org Fri May 25 12:57:08 2018
From: olivier.grisel at ensta.org (Olivier Grisel)
Date: Fri, 25 May 2018 18:57:08 +0200
Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available
In-Reply-To: <20180524195718.4a030167@fsol>
References: <20180524195718.4a030167@fsol>
Message-ID: 

I tried this implementation to add no-copy pickling for large numpy arrays
and it seems to work as expected (for a simple contiguous array). I took
some notes on the numpy tracker to advertise this PEP to the numpy
developers:

https://github.com/numpy/numpy/issues/11161

-- 
Olivier
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From raymond.hettinger at gmail.com Fri May 25 13:26:52 2018
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Fri, 25 May 2018 10:26:52 -0700
Subject: [Python-Dev] Add __reversed__ methods for dict
In-Reply-To: <20180525183201.38a60491@fsol>
References: <20180525183201.38a60491@fsol>
Message-ID: 

> On May 25, 2018, at 9:32 AM, Antoine Pitrou wrote:
>
> It's worth noting that OrderedDict already supports reversed().
> The argument could go both ways:
>
> 1. dict is similar to OrderedDict nowadays, so it should support
> reversed() too;
>
> 2. you can use OrderedDict to signal explicitly that you care about
> ordering; no need to add anything to dict.

Those are both valid sentiments :-)

My thought is that guaranteed insertion order for regular dicts is brand
new, so it will take a while for the notion to settle in and become part of
everyday thinking about dicts.  Once that happens, it is probably
inevitable that use cases will emerge and that __reversed__ will get added
at some point.  The implementation seems straightforward and it isn't much
of a conceptual leap to expect that a finite ordered collection would be
reversible.

Given that dicts now track insertion order, it seems reasonable to want
to know the most recent insertions (i.e. looping over the most recently
added tasks in a task dict).  Other possible use cases will likely
correspond to how we use the Unix tail command.

If those use cases arise, it would be nice for __reversed__ to already be
supported so that people won't be tempted to implement an ugly workaround
using popitem() calls followed by reinsertions.


Raymond
.
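[A minimal sketch of the use cases discussed in this thread, assuming the
proposed dict.__reversed__ lands in 3.8. reversed() already works on
OrderedDict today; the newest_keys() helper below is hypothetical and only
illustrates the popitem()-plus-reinsertion workaround mentioned above.]

    from collections import OrderedDict

    # Works today: OrderedDict implements __reversed__.
    tasks = OrderedDict(build=1, test=2, deploy=3)
    print(list(reversed(tasks)))            # ['deploy', 'test', 'build']

    # The proposal would allow the same spelling on a plain dict:
    #     for name in reversed({"build": 1, "test": 2, "deploy": 3}): ...

    # Without it, a tail-style view of a 3.7 dict needs the ugly workaround:
    # popitem() calls followed by reinsertions.
    def newest_keys(d, n):
        """Hypothetical helper: the n most recently inserted keys, newest first."""
        popped = [d.popitem() for _ in range(n)]   # popitem() pops in LIFO order
        for key, value in reversed(popped):        # reinsert to restore the dict
            d[key] = value
        return [key for key, _ in popped]

    print(newest_keys(dict(tasks), 2))      # ['deploy', 'test']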
From raymond.hettinger at gmail.com Fri May 25 13:36:08 2018
From: raymond.hettinger at gmail.com (Raymond Hettinger)
Date: Fri, 25 May 2018 10:36:08 -0700
Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available
In-Reply-To: <20180524195718.4a030167@fsol>
References: <20180524195718.4a030167@fsol>
Message-ID: <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com>

> On May 24, 2018, at 10:57 AM, Antoine Pitrou wrote:
>
> While PEP 574 (pickle protocol 5 with out-of-band data) is still in
> draft status, I've made available an implementation in branch "pickle5"
> in my GitHub fork of CPython:
> https://github.com/pitrou/cpython/tree/pickle5
>
> Also I've published an experimental backport on PyPI, for Python 3.6
> and 3.7.  This should help people play with the new API and features
> without having to compile Python:
> https://pypi.org/project/pickle5/
>
> Any feedback is welcome.

Thanks for doing this.

Hope it isn't too late, but I would like to suggest that protocol 5 support
fast compression by default.  We normally pickle objects so that they can be
transported (saved to a file or sent over a socket).  Transport costs (reading
and writing a file or socket) are generally proportional to size, so
compression is likely to be a net win (much as it was for header compression
in HTTP/2).

The PEP lists compression as a possible refinement only for large objects,
but I expect it will be a win for most pickles to compress them in their
entirety.

Raymond

From guido at python.org Fri May 25 13:48:09 2018
From: guido at python.org (Guido van Rossum)
Date: Fri, 25 May 2018 10:48:09 -0700
Subject: [Python-Dev] Add __reversed__ methods for dict
In-Reply-To: 
References: <20180525183201.38a60491@fsol>
Message-ID: 

OK, +1

On Fri, May 25, 2018 at 10:26 AM, Raymond Hettinger <
raymond.hettinger at gmail.com> wrote:

>
>
> > On May 25, 2018, at 9:32 AM, Antoine Pitrou wrote:
> >
> > It's worth noting that OrderedDict already supports reversed().
> > The argument could go both ways:
> >
> > 1. dict is similar to OrderedDict nowadays, so it should support
> > reversed() too;
> >
> > 2. you can use OrderedDict to signal explicitly that you care about
> > ordering; no need to add anything to dict.
>
> Those are both valid sentiments :-)
>
> My thought is that guaranteed insertion order for regular dicts is brand
> new, so it will take a while for the notion to settle in and become part of
> everyday thinking about dicts.  Once that happens, it is probably
> inevitable that use cases will emerge and that __reversed__ will get added
> at some point.  The implementation seems straightforward and it isn't much
> of a conceptual leap to expect that a finite ordered collection would be
> reversible.
>
> Given that dicts now track insertion order, it seems reasonable to want to
> know the most recent insertions (i.e. looping over the most recently added
> tasks in a task dict).  Other possible use cases will likely correspond to
> how we use the Unix tail command.
>
> If those use cases arise, it would be nice for __reversed__ to already be
> supported so that people won't be tempted to implement an ugly workaround
> using popitem() calls followed by reinsertions.
>
>
> Raymond
>
> .
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Fri May 25 13:49:28 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 25 May 2018 19:49:28 +0200 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> Message-ID: <20180525194928.2e8aed94@fsol> On Fri, 25 May 2018 10:36:08 -0700 Raymond Hettinger wrote: > > On May 24, 2018, at 10:57 AM, Antoine Pitrou wrote: > > > > While PEP 574 (pickle protocol 5 with out-of-band data) is still in > > draft status, I've made available an implementation in branch "pickle5" > > in my GitHub fork of CPython: > > https://github.com/pitrou/cpython/tree/pickle5 > > > > Also I've published an experimental backport on PyPI, for Python 3.6 > > and 3.7. This should help people play with the new API and features > > without having to compile Python: > > https://pypi.org/project/pickle5/ > > > > Any feedback is welcome. > > Thanks for doing this. > > Hope it isn't too late, but I would like to suggest that protocol 5 support fast compression by default. We normally pickle objects so that they can be transported (saved to a file or sent over a socket). Transport costs (reading and writing a file or socket) are generally proportional to size, so compression is likely to be a net win (much as it was for header compression in HTTP/2). > > The PEP lists compression as a possible a refinement only for large objects, but I expect is will be a win for most pickles to compress them in their entirety. It's not too late (the PEP is still a draft, and there's a lot of time before 3.8), but I wonder what would be the benefit of making it a part of the pickle specification, rather than compressing independently. Whether and how to compress is generally a compromise between transmission (or storage) speed and computation speed. Also, there are specialized compressors for higher efficiency (for example, Blosc has datatype-specific compression for Numpy arrays). Such knowledge can be embodied in domain-specific libraries such as Dask/distributed, but it cannot really be incorporated in pickle itself. Do you have something specific in mind? Regards Antoine. From vano at mail.mipt.ru Fri May 25 14:28:57 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Fri, 25 May 2018 21:28:57 +0300 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> Message-ID: On 25.05.2018 20:36, Raymond Hettinger wrote: > >> On May 24, 2018, at 10:57 AM, Antoine Pitrou wrote: >> >> While PEP 574 (pickle protocol 5 with out-of-band data) is still in >> draft status, I've made available an implementation in branch "pickle5" >> in my GitHub fork of CPython: >> https://github.com/pitrou/cpython/tree/pickle5 >> >> Also I've published an experimental backport on PyPI, for Python 3.6 >> and 3.7. 
This should help people play with the new API and features >> without having to compile Python: >> https://pypi.org/project/pickle5/ >> >> Any feedback is welcome. > Thanks for doing this. > > Hope it isn't too late, but I would like to suggest that protocol 5 support fast compression by default. We normally pickle objects so that they can be transported (saved to a file or sent over a socket). Transport costs (reading and writing a file or socket) are generally proportional to size, so compression is likely to be a net win (much as it was for header compression in HTTP/2). > > The PEP lists compression as a possible a refinement only for large objects, but I expect is will be a win for most pickles to compress them in their entirety. I would advise against that. Pickle format is unreadable as it is, compression will make it literally impossible to diagnose problems. Python supports transparent compression, e.g. with the 'zlib' codec. > > Raymond > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From nas-python at arctrix.com Fri May 25 16:50:57 2018 From: nas-python at arctrix.com (Neil Schemenauer) Date: Fri, 25 May 2018 14:50:57 -0600 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <20180525194928.2e8aed94@fsol> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> <20180525194928.2e8aed94@fsol> Message-ID: <20180525205057.ytxazu6zx57g2dbs@python.ca> On 2018-05-25, Antoine Pitrou wrote: > Do you have something specific in mind? I think compressed by default is a good idea. My quick proposal: - Use fast compression like lz4 or zlib with Z_BEST_SPEED - Add a 'compress' keyword argument with a default of None. For protocol 5, None means to compress. Providing 'compress' != None for older protocols will raise an error. The compression overhead will be small compared to the pickle/unpickle costs. If someone wants to apply their own (e.g. better) compression, they can set compress=False. An alternative idea is to have two different protocol formats. E.g. 5 and 6. One is "pickle 5" with compression, one without compression. I don't like that as much since it breaks the idea that higher protocol numbers are "better". Regards, Neil From solipsis at pitrou.net Fri May 25 17:11:04 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 25 May 2018 23:11:04 +0200 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> <20180525194928.2e8aed94@fsol> <20180525205057.ytxazu6zx57g2dbs@python.ca> Message-ID: <20180525231104.270ff681@fsol> On Fri, 25 May 2018 14:50:57 -0600 Neil Schemenauer wrote: > On 2018-05-25, Antoine Pitrou wrote: > > Do you have something specific in mind? > > I think compressed by default is a good idea. My quick proposal: > > - Use fast compression like lz4 or zlib with Z_BEST_SPEED > > - Add a 'compress' keyword argument with a default of None. For > protocol 5, None means to compress. Providing 'compress' != None > for older protocols will raise an error. The question is what purpose does it serve for pickle to do it rather than for the user to compress the pickle themselves. You're basically saving one line of code. Am I missing some other advantage? 
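[A sketch of the "compress it yourself" spelling being weighed here, using
only the stdlib; zlib.Z_BEST_SPEED is the fast setting Neil suggests, and the
file name is made up for the example.]

    import gzip
    import pickle
    import zlib

    data = {"answer": 42, "payload": list(range(1000))}

    # Compress after pickling (the "one line of code"):
    blob = zlib.compress(pickle.dumps(data, pickle.HIGHEST_PROTOCOL),
                         zlib.Z_BEST_SPEED)
    assert pickle.loads(zlib.decompress(blob)) == data

    # Or stream the pickle straight into a compressing file object:
    with gzip.open("data.pkl.gz", "wb", compresslevel=1) as f:
        pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
    with gzip.open("data.pkl.gz", "rb") as f:
        assert pickle.load(f) == data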
(also note that it requires us to ship the lz4 library with Python, or another modern compression library such as zstd; zlib's performance characteristics are outdated) Regards Antoine. From nas-python at arctrix.com Fri May 25 18:35:04 2018 From: nas-python at arctrix.com (Neil Schemenauer) Date: Fri, 25 May 2018 16:35:04 -0600 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <20180525231104.270ff681@fsol> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> <20180525194928.2e8aed94@fsol> <20180525205057.ytxazu6zx57g2dbs@python.ca> <20180525231104.270ff681@fsol> Message-ID: <20180525223504.pitglndoy4ptzjjy@python.ca> On 2018-05-25, Antoine Pitrou wrote: > The question is what purpose does it serve for pickle to do it rather > than for the user to compress the pickle themselves. You're basically > saving one line of code. It's one line of code everywhere pickling or unpicking happens. And you probably need to import a compression module, so at least two lines. Then maybe you need to figure out if the pickle is compressed and what kind of compression is used. So, add a few more lines. It seems logical to me that users of pickle want it to be fast and produce small pickles. Compressing by default seems the right choice, even though it complicates the implementation. Ivan brings up a valid point that compressed pickles are harder to debug. However, I think that's much less important than being small. > it requires us to ship the lz4 library with Python Yeah, that's not so great. I think zlib with Z_BEST_SPEED would be fine. However, some people might worry it is too slow or doesn't compress enough. Having lz4 as a battery included seems like a good idea anyhow. I understand that it is pretty well established as a useful compression method. Obviously requiring a new C library to be included expands the effort of implementation a lot. This discussion can easily lead into bikeshedding (e.g. relative merits of different compression schemes). Since I'm not volunteering to implement anything, I will stop responding at this point. ;-) Regards, Neil From ramseydsilva at gmail.com Fri May 25 19:55:34 2018 From: ramseydsilva at gmail.com (Ramsey D'silva) Date: Fri, 25 May 2018 19:55:34 -0400 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: <20180525183201.38a60491@fsol> Message-ID: I am also in agreement. On Fri, May 25, 2018, 13:49 Guido van Rossum wrote: > OK, +1 > > On Fri, May 25, 2018 at 10:26 AM, Raymond Hettinger < > raymond.hettinger at gmail.com> wrote: > >> >> >> > On May 25, 2018, at 9:32 AM, Antoine Pitrou >> wrote: >> > >> > It's worth nothing that OrderedDict already supports reversed(). >> > The argument could go both ways: >> > >> > 1. dict is similar to OrderedDict nowadays, so it should support >> > reversed() too; >> > >> > 2. you can use OrderedDict to signal explicitly that you care about >> > ordering; no need to add anything to dict. >> >> Those are both valid sentiments :-) >> >> My thought is that guaranteed insertion order for regular dicts is brand >> new, so it will take a while for the notion settle in and become part of >> everyday thinking about dicts. Once that happens, it is probably >> inevitable that use cases will emerge and that __reversed__ will get added >> at some point. The implementation seems straightforward and it isn't much >> of a conceptual leap to expect that a finite ordered collection would be >> reversible. 
>> >> Given that dicts now track insertion order, it seems reasonable to want >> to know the most recent insertions (i.e. looping over the most recently >> added tasks in a task dict). Other possible use cases will likely >> correspond to how we use the Unix tail command. >> >> If those use cases arise, it would be nice for __reversed__ to already be >> supported so that people won't be tempted to implement an ugly workaround >> using popitem() calls followed by reinsertions. >> >> >> Raymond >> >> . >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > > > -- > --Guido van Rossum (python.org/~guido) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/ramseydsilva%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Fri May 25 20:13:03 2018 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 25 May 2018 17:13:03 -0700 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <20180525223504.pitglndoy4ptzjjy@python.ca> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> <20180525194928.2e8aed94@fsol> <20180525205057.ytxazu6zx57g2dbs@python.ca> <20180525231104.270ff681@fsol> <20180525223504.pitglndoy4ptzjjy@python.ca> Message-ID: On Fri, May 25, 2018 at 3:35 PM, Neil Schemenauer wrote: > This discussion can easily lead into bikeshedding (e.g. relative > merits of different compression schemes). Since I'm not > volunteering to implement anything, I will stop responding at this > point. ;-) I think the bikeshedding -- or more to the point, the fact that there's a wide variety of options for compressing pickles, and none of them are appropriate in all circumstances -- means that this is something that should remain a separate layer. Even super-fast algorithms like lz4 are inefficient when you're transmitting pickles between two processes on the same system ? they still add extra memory copies. And that's a very common use case. -n -- Nathaniel J. Smith -- https://vorpus.org From stefan_ml at behnel.de Sat May 26 03:12:43 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sat, 26 May 2018 09:12:43 +0200 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: <20180525231104.270ff681@fsol> References: <20180524195718.4a030167@fsol> <1E3100CB-1310-4C14-94F8-02E6348275A4@gmail.com> <20180525194928.2e8aed94@fsol> <20180525205057.ytxazu6zx57g2dbs@python.ca> <20180525231104.270ff681@fsol> Message-ID: Antoine Pitrou schrieb am 25.05.2018 um 23:11: > On Fri, 25 May 2018 14:50:57 -0600 > Neil Schemenauer wrote: >> On 2018-05-25, Antoine Pitrou wrote: >>> Do you have something specific in mind? >> >> I think compressed by default is a good idea. My quick proposal: >> >> - Use fast compression like lz4 or zlib with Z_BEST_SPEED >> >> - Add a 'compress' keyword argument with a default of None. For >> protocol 5, None means to compress. Providing 'compress' != None >> for older protocols will raise an error. > > The question is what purpose does it serve for pickle to do it rather > than for the user to compress the pickle themselves. 
You're basically > saving one line of code. Am I missing some other advantage? Regarding the pickling side, if the pickle is large, then it can save memory to compress while pickling, rather than compressing after pickling. But that can also be done with file-like objects, so the advantage is small here. I think a major advantage is on the unpickling side rather than the pickling side. Sure, users can compress a pickle after the fact, but if there's a (set of) standard algorithms that unpickle can handle automatically, then it's enough to pass "something pickled" into unpickle, rather than having to know (or figure out) if and how that pickle was originally compressed, and build up the decompression pipeline for it to get everything uncompressed efficiently without accidentally wasting memory or processing time. Obviously, auto-decompression opens up a gate for compression bombs, but then, unpickling data from untrusted sources is discouraged anyway, so... Stefan From songofacandy at gmail.com Sat May 26 10:20:51 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Sat, 26 May 2018 23:20:51 +0900 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: > Concerns have been raised in the comments that this feature may add too much > bloat in the core interpreter and be harmful for other Python implementations. To clarify, my point is it prohibit hashmap + single linked list implementation in other Python implementation. Because doubly linked list is very memory inefficient, every implementation would be forced to implement dict like PyPy (and CPython) for efficiency. But I don't know much about current MicroPython and other Python implementation's plan to catch Python 3.6 up. > Given the different issues this change creates, I see three possibilities: > 1. Accept the proposal has it is for dict and dict views, this would add about > 300 lines and three new types in dictobject.c > 2. Accept the proposal only for dict, this would add about 80 lines and one > new type in dictobject.c while still being useful for some use cases > 3. Drop the proposal as the whole, while having some use, reversed(dict(a=1, b=2)) > may not be very common and could be done using OrderedDict instead. > What?s your stance on the issue ? I want to wait one version (3.8) for other implementations. "Keep insertion order" is requirement from 3.7 which is not released yet. I feel it's too early to add more stronger requirements to core type. Regards, --- INADA Naoki From mrocklin at gmail.com Sat May 26 12:07:35 2018 From: mrocklin at gmail.com (Matthew Rocklin) Date: Sat, 26 May 2018 12:07:35 -0400 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available Message-ID: Hi all, I agree that compression is often a good idea when moving serialized objects around on a network, but for what it's worth I as a library author would always set compress=False and then handle it myself as a separate step. There are a few reasons for this: 1. Bandwidth is often pretty good, especially intra-node, on high performance networks, or on decent modern discs (NVMe) 2. I often use different compression technologies in different situations. LZ4 is a great all-around default, but often snappy, blosc, or z-standrad are better suited. This depends strongly on the characteristics of the data. 3. Very often data often isn't compressible, or is already in some compressed form, such as in images, and so compressing only hurts you. 
In general, my thought is that compression is a complex topic with enough intricaces that setting a single sane default that works 70+% of the time probably isn't possible (at least not with the applications that I get exposed to). Instead of baking a particular method into pickle.dumps I would recommend trying to solve this problem through documentation, pointing users to the various compression libraries within the broader Python ecosystem, and perhaps pointing to one of the many blogposts that discuss their strengths and weaknesses. Best, -matt -------------- next part -------------- An HTML attachment was scrubbed... URL: From olivier.grisel at ensta.org Sat May 26 12:42:42 2018 From: olivier.grisel at ensta.org (Olivier Grisel) Date: Sat, 26 May 2018 18:42:42 +0200 Subject: [Python-Dev] PEP 574 (pickle 5) implementation and backport available In-Reply-To: References: Message-ID: +1 for not adding in-pickle compression as it is already very easy to handle compression externally (for instance by passing a compressing file object as an argument to the pickler). Furthermore, as PEP 574 makes it possible to stream the buffer bytes directly to the file-object without any temporary memory copy I don't see any benefit in including the compression into the pickle protocol. However adding lz4.LZ4File to the standard library in addition to gzip.GzipFile and lzma.LZMAFile is probably a good idea as LZ4 is really fast compared to zlib/gzip. But this is not related to PEP 574. -- Olivier ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sat May 26 13:11:13 2018 From: guido at python.org (Guido van Rossum) Date: Sat, 26 May 2018 10:11:13 -0700 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: Hm, I find Inada's argument compelling that this might not be easy for all implementations. So let's wait. On Sat, May 26, 2018 at 7:20 AM, INADA Naoki wrote: > > Concerns have been raised in the comments that this feature may add too > much > > bloat in the core interpreter and be harmful for other Python > implementations. > > > To clarify, my point is it prohibit hashmap + single linked list > implementation in > other Python implementation. > Because doubly linked list is very memory inefficient, every implementation > would be forced to implement dict like PyPy (and CPython) for efficiency. > > But I don't know much about current MicroPython and other Python > implementation's > plan to catch Python 3.6 up. > > > Given the different issues this change creates, I see three > possibilities: > > > 1. Accept the proposal has it is for dict and dict views, this would add > about > > 300 lines and three new types in dictobject.c > > > 2. Accept the proposal only for dict, this would add about 80 lines and > one > > new type in dictobject.c while still being useful for some use cases > > > 3. Drop the proposal as the whole, while having some use, > reversed(dict(a=1, b=2)) > > may not be very common and could be done using OrderedDict instead. > > > What?s your stance on the issue ? > > > I want to wait one version (3.8) for other implementations. > "Keep insertion order" is requirement from 3.7 which is not released yet. > I feel it's too early to add more stronger requirements to core type. 
> > Regards, > > --- > INADA Naoki > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Sat May 26 13:12:18 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Sat, 26 May 2018 19:12:18 +0200 Subject: [Python-Dev] lz4 compression References: Message-ID: <20180526191218.311ce648@fsol> On Sat, 26 May 2018 18:42:42 +0200 Olivier Grisel wrote: > > However adding lz4.LZ4File to the standard library in addition to > gzip.GzipFile and lzma.LZMAFile is probably a good idea as LZ4 is really > fast compared to zlib/gzip. But this is not related to PEP 574. If we go that way, we may probably want zstd as well :-). But, yes, most likely unrelated to PEP 574. Regards Antoine. From raymond.hettinger at gmail.com Sat May 26 23:43:28 2018 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Sat, 26 May 2018 20:43:28 -0700 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: > On May 26, 2018, at 7:20 AM, INADA Naoki wrote: > > Because doubly linked list is very memory inefficient, every implementation > would be forced to implement dict like PyPy (and CPython) for efficiency. > But I don't know much about current MicroPython and other Python > implementation's > plan to catch Python 3.6 up. FWIW, Python 3.7 is the first Python that where the language guarantees that regular dicts are order preserving. And the feature being discussed in this thread is for Python 3.8. What potential implementation obstacles do you foresee? Can you imagine any possible way that an implementation would have an order preserving dict but would be unable to trivially implement __reversed__? How could an implementation have a __setitem__ that appends at the end, and a popitem() that pops from that same end, but still not be able to easily iterate in reverse? It really doesn't matter whether an implementer uses a dense array of keys or a doubly-linked-list; either way, looping backward is as easy as going forward. Raymond P.S. It isn't going to be hard to update MicroPython to have a compact and ordered dict (based on my review of their existing dict implementation). This is something they are really going to want because of the improved memory efficiency. Also, they're also already going to need it just to comply with guaranteed keyword argument ordering and guaranteed ordering of class dictionaries. From songofacandy at gmail.com Sun May 27 03:12:27 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Sun, 27 May 2018 16:12:27 +0900 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: On Sun, May 27, 2018 at 12:43 PM Raymond Hettinger < raymond.hettinger at gmail.com> wrote: > > On May 26, 2018, at 7:20 AM, INADA Naoki wrote: > > > > Because doubly linked list is very memory inefficient, every implementation > > would be forced to implement dict like PyPy (and CPython) for efficiency. > > But I don't know much about current MicroPython and other Python > > implementation's > > plan to catch Python 3.6 up. > FWIW, Python 3.7 is the first Python that where the language guarantees that regular dicts are order preserving. And the feature being discussed in this thread is for Python 3.8. 
Oh, my mistake. > What potential implementation obstacles do you foresee? Can you imagine any possible way that an implementation would have an order preserving dict but would be unable to trivially implement __reversed__? How could an implementation have a __setitem__ that appends at the end, and a popitem() that pops from that same end, but still not be able to easily iterate in reverse? It really doesn't matter whether an implementer uses a dense array of keys or a doubly-linked-list; either way, looping backward is as easy as going forward. I thought `popitem()` removes the last item is still implementation detail. So I thought about hashmap + single linked list. When removing item, dummy entry will be kept in the list. The dummy entry in the list will be removed when iterating over the list, or rebuilding hashmap. FWIW, quick survey of other languages hashmap implementations and APIs are: # PHP PHP 5 used hashmap + doubly linked list. PHP 7 uses Python-like implementation. While PHP doesn't have reverse iterator, there are `end()` and `prev()` which can be used to iterate backwards. # Ruby From Ruby 1.9, Hash is ordered. At the time, implementation is hashmap + doubly linked list. From Ruby 2.4, Python-like implementation. There are `Enumereble.reverse_each` API. But the API is documented as "Builds a temporary array and traverses that array in reverse order." So Ruby seems allow other implementation which doesn't have zerocopy reverse iterator. (I don't know CRuby provides it or not.) http://ruby-doc.org/core-2.2.2/Enumerable.html#method-i-reverse_each # Java The LinkedHashMap document says " it maintains a doubly-linked list ". https://docs.oracle.com/javase/8/docs/api/java/util/LinkedHashMap.html On the other hand, there are no reverse iterator API. So if we require `__reverse__` for dict, Jython can't use LinkedHashMap as backend of dict. # C# (.Net) There are legacy (non generic) OrderedDict. It's `remove()` seems O(n) implementation. https://referencesource.microsoft.com/#System/compmod/system/collections/specialized/ordereddictionary.cs,bc8d8035ee2d2927 # Rust, Swift, and Go Builtin mapping is arbitrary ordered, and there is no ordered mapping in the standard library. --- It seems: * There are no single linked list based OrderedDict implementation, but * Only PHP exposes "zerocopy reverse iterate" API. I may be wrong because I'm not expert of these languages. Please point out if I am wrong. > Raymond > P.S. It isn't going to be hard to update MicroPython to have a compact and ordered dict (based on my review of their existing dict implementation). This is something they are really going to want because of the improved memory efficiency. Also, they're also already going to need it just to comply with guaranteed keyword argument ordering and guaranteed ordering of class dictionaries. Thanks. Sadly speaking, Jython and IronPython development seems very slow and "wait until 3.9" may be not long enough for they catch Python 3.7 up. When focusing to CPython, PyPy and MicroPython, no problem for adding __reverse__ in 3.8 seems OK. Regards, -- INADA Naoki From vstinner at redhat.com Mon May 28 05:05:15 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 28 May 2018 11:05:15 +0200 Subject: [Python-Dev] macOS: minimum supported version? Message-ID: Hi, Ned Deily closed old bugs reported on the macOS Tiger buildbot, since this buildbot has been retired 3 months ago (the builders are still visible online, but last builds were 3 months ago). 
It seems like the oldest macOS buildbot is now macOS El Capitan (macOS 10.11, 2015). Does it mean that the minimum officially supported macOS version is now macOS 10.11 El Capitain? For me, to get an official "full" support, we need a buildbot. Without buildbot, we can only provide a weaker "best-effort" support. Otherwise, the risk of regression is too high. I failed to find any official and obvious list of CPython supported platforms, so I wrote my own list: http://vstinner.readthedocs.io/cpython.html#supported-platforms My first motivation for this list was to get a simple list of supported Windows versions, because I'm unable to follow Windows lifecycle (the PEP 11 has a vague statement about Windows which requires to follow Windows end of life for each Windows release). Victor From nad at python.org Mon May 28 05:51:15 2018 From: nad at python.org (Ned Deily) Date: Mon, 28 May 2018 05:51:15 -0400 Subject: [Python-Dev] macOS: minimum supported version? In-Reply-To: References: Message-ID: <06797F58-FB3E-4D2D-A0D3-122425F0122E@python.org> On May 28, 2018, at 05:05, Victor Stinner wrote: > Ned Deily closed old bugs reported on the macOS Tiger buildbot, since > this buildbot has been retired 3 months ago (the builders are still > visible online, but last builds were 3 months ago). Perhaps Zach or someone else can remove them from the list. > It seems like the oldest macOS buildbot is now macOS El Capitan (macOS > 10.11, 2015). Does it mean that the minimum officially supported macOS > version is now macOS 10.11 El Capitain? One way to answer that question is to look at the macOS versions we support with python.org installers. If you look at the current download pages for current releases, you'll see we provide installers that support macOS versions from 10.6 (Snow Leopard) theough the current 10.13 (High Sierrs) so we definitely support those versions. Earlier versions are supported on a best-effort basis. -- Ned Deily nad at python.org -- [] From vstinner at redhat.com Mon May 28 05:57:28 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 28 May 2018 11:57:28 +0200 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch Message-ID: Hi, Since one or two weeks, I noticed that it's difficult to merge pull requests into the 2.7 branch. If a different commit is pushed in the meanwhile (if a different PR has been merged), the 2.7 branch diverges and the PR is immediately marked as "This branch is out-of-date with the base branch" and the "Squash and Merge" button is disabled (grey). For example my PR https://github.com/python/cpython/pull/7120 which changes Lib/test/regrtest.py cannot be merged because a commit touching the documentation (Doc/ directory) has been merged after I posted my PR and before the CI completed. I don't see the same behavior on the master branch. Is the 2.7 branch configured as more strict? Victor From storchaka at gmail.com Mon May 28 08:09:46 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 28 May 2018 15:09:46 +0300 Subject: [Python-Dev] The history of PyXML In-Reply-To: <3e83f0be4977de6006f4efdf9019075f7924b560.camel@janc.be> References: <3e83f0be4977de6006f4efdf9019075f7924b560.camel@janc.be> Message-ID: 25.05.18 05:09, Jan Claeys ????: > On Thu, 2018-05-17 at 15:18 +0300, Serhiy Storchaka wrote: >> Does anyone has the full copy of the PyXML repository, with the >> complete history? >> >> This library was included in Python 2.1 as the xml package and is >> not maintained as a separate project since 2004. 
It's home on >> SourceForge was removed. I have found sources of the last PyXML >> version (0.8.4), but without history. >> > > Did you try asking SourceForge if they still have a backup copy? No, I didn't. I first tried to ask whether any of active Python core developers were involved in developing of PyXML and kept some history. I even don't know whether PyXML used any VCS served by SourceForge, or just published tarballs. From storchaka at gmail.com Mon May 28 08:17:45 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 28 May 2018 15:17:45 +0300 Subject: [Python-Dev] When tp_clear returns non-zero? Message-ID: The tp_clear field of PyTypeObject has type inquiry. It is a pointer to a function that takes PyObject * and return int. typedef int (*inquiry)(PyObject *); I'm interesting what the result of this function means. In what cases it can return non-zero, and can it set an exception? Currently tp_clear() is called in a single place, and its result is ignored. All tp_clear implementations in the stdlib always return 0. From rymg19 at gmail.com Mon May 28 10:04:49 2018 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 28 May 2018 09:04:49 -0500 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: Message-ID: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> AFAIK there's no setting like this available, and I've done this many times on other repos with no trouble. Maybe it could be a GitHub bug? On May 28, 2018 4:59:03 AM Victor Stinner wrote: > Hi, > > Since one or two weeks, I noticed that it's difficult to merge pull > requests into the 2.7 branch. If a different commit is pushed in the > meanwhile (if a different PR has been merged), the 2.7 branch diverges > and the PR is immediately marked as "This branch is out-of-date with > the base branch" and the "Squash and Merge" button is disabled (grey). > > For example my PR https://github.com/python/cpython/pull/7120 which > changes Lib/test/regrtest.py cannot be merged because a commit > touching the documentation (Doc/ directory) has been merged after I > posted my PR and before the CI completed. > > I don't see the same behavior on the master branch. Is the 2.7 branch > configured as more strict? > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com From brett at python.org Mon May 28 12:09:00 2018 From: brett at python.org (Brett Cannon) Date: Mon, 28 May 2018 09:09:00 -0700 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: Ryan is right that there's no special setting in GitHub at least which would make merging more strict for certain branches as you're describing. On Mon, 28 May 2018 at 07:06 Ryan Gonzalez wrote: > AFAIK there's no setting like this available, and I've done this many > times > on other repos with no trouble. Maybe it could be a GitHub bug? > > > On May 28, 2018 4:59:03 AM Victor Stinner wrote: > > > Hi, > > > > Since one or two weeks, I noticed that it's difficult to merge pull > > requests into the 2.7 branch. 
If a different commit is pushed in the > > meanwhile (if a different PR has been merged), the 2.7 branch diverges > > and the PR is immediately marked as "This branch is out-of-date with > > the base branch" and the "Squash and Merge" button is disabled (grey). > > > > For example my PR https://github.com/python/cpython/pull/7120 which > > changes Lib/test/regrtest.py cannot be merged because a commit > > touching the documentation (Doc/ directory) has been merged after I > > posted my PR and before the CI completed. > > > > I don't see the same behavior on the master branch. Is the 2.7 branch > > configured as more strict? > > > > Victor > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nas-python at arctrix.com Mon May 28 12:34:23 2018 From: nas-python at arctrix.com (Neil Schemenauer) Date: Mon, 28 May 2018 10:34:23 -0600 Subject: [Python-Dev] When tp_clear returns non-zero? In-Reply-To: References: Message-ID: <20180528163423.utglxqwtc4ocmhxe@python.ca> On 2018-05-28, Serhiy Storchaka wrote: > I'm interesting what the result of this function means. In what > cases it can return non-zero, and can it set an exception? My memory is fuzzy (nearly 20 years since I wrote that code). My best guess is that I thought a return value might be useful somehow. As you have noticed, the return type probably should have been void. If you want to see one of the first implementations of the Python GC, I still have a patch: http://python.ca/nas/python/gc/gc-cycle-152.diff Regards, Neil From njs at pobox.com Mon May 28 13:19:31 2018 From: njs at pobox.com (Nathaniel Smith) Date: Mon, 28 May 2018 10:19:31 -0700 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: Isn't that what happens if someone enables the check box at Repository Settings -> Branches -> Branch protection rules -> [pick a branch] -> Require branches to be up to date before merging ? On Mon, May 28, 2018, 09:11 Brett Cannon wrote: > Ryan is right that there's no special setting in GitHub at least which > would make merging more strict for certain branches as you're describing. > > On Mon, 28 May 2018 at 07:06 Ryan Gonzalez wrote: > >> AFAIK there's no setting like this available, and I've done this many >> times >> on other repos with no trouble. Maybe it could be a GitHub bug? >> >> >> On May 28, 2018 4:59:03 AM Victor Stinner wrote: >> >> > Hi, >> > >> > Since one or two weeks, I noticed that it's difficult to merge pull >> > requests into the 2.7 branch. If a different commit is pushed in the >> > meanwhile (if a different PR has been merged), the 2.7 branch diverges >> > and the PR is immediately marked as "This branch is out-of-date with >> > the base branch" and the "Squash and Merge" button is disabled (grey). 
>> > >> > For example my PR https://github.com/python/cpython/pull/7120 which >> > changes Lib/test/regrtest.py cannot be merged because a commit >> > touching the documentation (Doc/ directory) has been merged after I >> > posted my PR and before the CI completed. >> > >> > I don't see the same behavior on the master branch. Is the 2.7 branch >> > configured as more strict? >> > >> > Victor >> > _______________________________________________ >> > Python-Dev mailing list >> > Python-Dev at python.org >> > https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> > https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Mon May 28 13:47:56 2018 From: nad at python.org (Ned Deily) Date: Mon, 28 May 2018 13:47:56 -0400 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> Message-ID: <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> On May 28, 2018, at 13:19, Nathaniel Smith wrote: > > Isn't that what happens if someone enables the check box at Repository Settings -> Branches -> Branch protection rules -> [pick a branch] -> Require branches to be up to date before merging ? Hmm, for some some reason, it appears that, at the moment, the 2.7, 3.4, and 3.5 branches have that option set, while 3.6, 3.7, and master don't. I'm not sure how we got to that state. Any other reasons to prefer on versus off? On Mon, May 28, 2018, 09:11 Brett Cannon wrote: > Ryan is right that there's no special setting in GitHub at least which would make merging more strict for certain branches as you're describing. > > On Mon, 28 May 2018 at 07:06 Ryan Gonzalez wrote: > AFAIK there's no setting like this available, and I've done this many times > on other repos with no trouble. Maybe it could be a GitHub bug? > > On May 28, 2018 4:59:03 AM Victor Stinner wrote: > > > Hi, > > > > Since one or two weeks, I noticed that it's difficult to merge pull > > requests into the 2.7 branch. If a different commit is pushed in the > > meanwhile (if a different PR has been merged), the 2.7 branch diverges > > and the PR is immediately marked as "This branch is out-of-date with > > the base branch" and the "Squash and Merge" button is disabled (grey). > > > > For example my PR https://github.com/python/cpython/pull/7120 which > > changes Lib/test/regrtest.py cannot be merged because a commit > > touching the documentation (Doc/ directory) has been merged after I > > posted my PR and before the CI completed. > > > > I don't see the same behavior on the master branch. Is the 2.7 branch > > configured as more strict? 
-- Ned Deily nad at python.org -- [] From phd at phdru.name Mon May 28 14:51:59 2018 From: phd at phdru.name (Oleg Broytman) Date: Mon, 28 May 2018 20:51:59 +0200 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: <3e83f0be4977de6006f4efdf9019075f7924b560.camel@janc.be> Message-ID: <20180528185159.7hpcv5yo5tnsxaez@phdru.name> On Mon, May 28, 2018 at 03:09:46PM +0300, Serhiy Storchaka wrote: > 25.05.18 05:09, Jan Claeys ????: > > On Thu, 2018-05-17 at 15:18 +0300, Serhiy Storchaka wrote: > > > Does anyone has the full copy of the PyXML repository, with the > > > complete history? > > > > > > This library was included in Python 2.1 as the xml package and is > > > not maintained as a separate project since 2004. It's home on > > > SourceForge was removed. I have found sources of the last PyXML > > > version (0.8.4), but without history. > > > > > > > Did you try asking SourceForge if they still have a backup copy? > > No, I didn't. I first tried to ask whether any of active Python core > developers were involved in developing of PyXML and kept some history. I > even don't know whether PyXML used any VCS served by SourceForge, or just > published tarballs. They had been using CVS: https://web.archive.org/web/20151113082010/http://sourceforge.net/p/pyxml/code/ CVS repo web viewer shows some subdirectories but it seems there is no sources. I also failed to rsync from their CVS pserver and I doubt they have a backup -- SF stopped supporting CVS long ago. Oleg. -- Oleg Broytman https://phdru.name/ phd at phdru.name Programmers don't die, they just GOSUB without RETURN. From jeremy.kloth at gmail.com Mon May 28 16:11:50 2018 From: jeremy.kloth at gmail.com (Jeremy Kloth) Date: Mon, 28 May 2018 14:11:50 -0600 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: On Thu, May 17, 2018 at 6:18 AM, Serhiy Storchaka wrote: > Does anyone has the full copy of the PyXML repository, with the complete > history? > > This library was included in Python 2.1 as the xml package and is not > maintained as a separate project since 2004. It's home on SourceForge was > removed. I have found sources of the last PyXML version (0.8.4), but without > history. > > I'm trying to figure out some intentions and fix possible bugs in the xml > package. The history of all commits could help. Here you go! https://github.com/jkloth/pyxml -- Jeremy Kloth From jeremy.kloth at gmail.com Mon May 28 16:14:32 2018 From: jeremy.kloth at gmail.com (Jeremy Kloth) Date: Mon, 28 May 2018 14:14:32 -0600 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: On Mon, May 28, 2018 at 2:11 PM, Jeremy Kloth wrote: > Here you go! > > https://github.com/jkloth/pyxml I did forget to mention that I was one of the prior maintainers on the PyXML project as well. -- Jeremy Kloth From vstinner at redhat.com Mon May 28 17:42:04 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 28 May 2018 23:42:04 +0200 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> Message-ID: 2018-05-28 19:47 GMT+02:00 Ned Deily : > Hmm, for some some reason, it appears that, at the moment, the 2.7, 3.4, and 3.5 branches have that option set, while 3.6, 3.7, and master don't. I'm not sure how we got to that state. Any other reasons to prefer on versus off? 
As I wrote, it became very difficult to merge any PR on 2.7 because of that. We all run a race to be the first one to merge a change into 2.7. The next one will get a "conflict" even if the merged commit is unrelated (as I described: two different unrelated directories). Please use the same configuration for 2.7, 3.6, 3.7 and master branches! Victor From nad at python.org Mon May 28 18:52:31 2018 From: nad at python.org (Ned Deily) Date: Mon, 28 May 2018 18:52:31 -0400 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> Message-ID: <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> On May 28, 2018, at 17:42, Victor Stinner wrote: > 2018-05-28 19:47 GMT+02:00 Ned Deily : >> Hmm, for some some reason, it appears that, at the moment, the 2.7, 3.4, and 3.5 branches have that option set, while 3.6, 3.7, and master don't. I'm not sure how we got to that state. Any other reasons to prefer on versus off? > > As I wrote, it became very difficult to merge any PR on 2.7 because of > that. We all run a race to be the first one to merge a change into > 2.7. The next one will get a "conflict" even if the merged commit is > unrelated (as I described: two different unrelated directories). > > Please use the same configuration for 2.7, 3.6, 3.7 and master branches! Sounds reasonable. I've updated the 2.7 configuration to match the others and not require the PR branch be up to date before merging, meaning the CI test might be against an older view of the branch. If it proves to be a problem, we can revisit it for all of the branches. I am not going to change the settings for 3.5 and 3.4 as they are in security-fix mode and only their release manager can merge changes for those. -- Ned Deily nad at python.org -- [] From guido at python.org Mon May 28 22:00:27 2018 From: guido at python.org (Guido van Rossum) Date: Mon, 28 May 2018 19:00:27 -0700 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: Wow. Thanks! On Mon, May 28, 2018 at 1:14 PM, Jeremy Kloth wrote: > On Mon, May 28, 2018 at 2:11 PM, Jeremy Kloth > wrote: > > Here you go! > > > > https://github.com/jkloth/pyxml > > I did forget to mention that I was one of the prior maintainers on the > PyXML project as well. > > -- > Jeremy Kloth > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From wes.turner at gmail.com Mon May 28 23:08:24 2018 From: wes.turner at gmail.com (Wes Turner) Date: Mon, 28 May 2018 23:08:24 -0400 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: On Thursday, May 17, 2018, Serhiy Storchaka wrote: > [...] > > I'm trying to figure out some intentions and fix possible bugs in the xml > package. 
defusedxml https://pypi.org/project/defusedxml/ > XML bomb protection for Python stdlib modules https://pypi.org/project/defusedxml/#how-to-avoid-xml-vulnerabilities """ Best practices - Don't allow DTDs - Don't expand entities - Don't resolve externals - Limit parse depth - Limit total input size - Limit parse time - Favor a SAX or iterparse-like parser for potentially large data - Validate and properly quote arguments to XSL transformations and XPath queries - Don't use XPath expressions from untrusted sources - Don't apply XSL transformations that come from untrusted sources """ https://github.com/tiran/defusedxml > The history of all commits could help. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/wes.turner%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Mon May 28 23:53:05 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 29 May 2018 06:53:05 +0300 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: 28.05.18 23:11, Jeremy Kloth wrote: > On Thu, May 17, 2018 at 6:18 AM, Serhiy Storchaka wrote: >> Does anyone have the full copy of the PyXML repository, with the complete >> history? >> >> This library was included in Python 2.1 as the xml package and has not been >> maintained as a separate project since 2004. Its home on SourceForge was >> removed. I have found sources of the last PyXML version (0.8.4), but without >> history. >> >> I'm trying to figure out some intentions and fix possible bugs in the xml >> package. The history of all commits could help. > > Here you go! > > https://github.com/jkloth/pyxml Great! Thank you, this is what I needed! From ncoghlan at gmail.com Tue May 29 08:25:47 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 29 May 2018 22:25:47 +1000 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> Message-ID: On 29 May 2018 at 08:52, Ned Deily wrote: > On May 28, 2018, at 17:42, Victor Stinner wrote: > > Please use the same configuration for 2.7, 3.6, 3.7 and master branches! > > Sounds reasonable. I've updated the 2.7 configuration to match the others > and not require the PR branch be up to date before merging, meaning the CI > test might be against an older view of the branch. If it proves to be a > problem, we can revisit it for all of the branches. I am not going to > change the settings for 3.5 and 3.4 as they are in security-fix mode and > only their release manager can merge changes for those. > Turning the setting on for security-fix branches sounds like a good idea to me, since it's basically a "stability-of-target-branch vs ease-of-merging" trade-off: by setting it, you ensure that your pre-merge CI checks reflect the state of the post-merge branch, whereas the default setting means your post-merge branch state represents a never-before-tested combination of software. The default state works pretty well for us since we don't kick off test runs on the BuildBot fleet for every PR, so there's always some additional testing that only happens post-merge anyway. Cheers, Nick.
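As a concrete illustration of the defusedxml recommendation Wes points to above, here is a minimal sketch (my own example, assuming defusedxml is installed from PyPI; the payload below is illustrative and not taken from the thread):

    import defusedxml.ElementTree as ET
    from defusedxml import EntitiesForbidden

    # A small entity-expansion ("billion laughs" style) payload.
    evil = """<?xml version="1.0"?>
    <!DOCTYPE bomb [
      <!ENTITY a "aaaaaaaaaa">
      <!ENTITY b "&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;">
    ]>
    <bomb>&b;</bomb>"""

    try:
        # Drop-in replacement for xml.etree.ElementTree.fromstring;
        # defusedxml refuses entity declarations by default.
        ET.fromstring(evil)
    except EntitiesForbidden:
        print("rejected: entity declarations are forbidden")

    # Harmless documents parse as usual.
    print(ET.fromstring("<ok>fine</ok>").text)

The same pattern applies to defusedxml's other drop-in modules (sax, minidom, pulldom).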
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From songofacandy at gmail.com Tue May 29 09:12:04 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 29 May 2018 22:12:04 +0900 Subject: [Python-Dev] Compact GC Header Message-ID: Hi, all. I hacked the GC module and managed to slim PyGC_Head down from 3 words to 2 words. It passes the test suite, though some comments and code cleanup are needed before merging. * https://bugs.python.org/issue33597 * https://github.com/python/cpython/pull/7043 I want to merge it after 3.7.0rc1, once the buildbots are stable, if Antoine or another GC expert accepts it. I estimate it reduces memory usage (RSS) by about 5%, with a negligible performance difference. If someone is interested in it, please test and benchmark it on a GC-heavy application. Regards, -- INADA Naoki From vstinner at redhat.com Tue May 29 09:41:25 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 29 May 2018 15:41:25 +0200 Subject: [Python-Dev] Compact GC Header In-Reply-To: References: Message-ID: > I hacked the GC module and managed to slim PyGC_Head down from 3 words to 2 > words. > It passes the test suite, though some comments and code cleanup are needed before > merging. Does this change break the stable ABI? Victor From steve.dower at python.org Tue May 29 10:15:35 2018 From: steve.dower at python.org (Steve Dower) Date: Tue, 29 May 2018 07:15:35 -0700 Subject: [Python-Dev] Compact GC Header In-Reply-To: References: Message-ID: Looks like it breaks the 3.7 ABI, which is certainly not allowed at this time. But it's not a limited API structure, so no problem for 3.8. Top-posted from my Windows 10 phone From: Victor Stinner Sent: Tuesday, May 29, 2018 6:44 To: INADA Naoki Cc: Python-Dev Subject: Re: [Python-Dev] Compact GC Header > I hacked the GC module and managed to slim PyGC_Head down from 3 words to 2 > words. > It passes the test suite, though some comments and code cleanup are needed before > merging. Does this change break the stable ABI? Victor _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From bussonniermatthias at gmail.com Tue May 29 11:25:02 2018 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Tue, 29 May 2018 08:25:02 -0700 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> Message-ID: > As I wrote, it became very difficult to merge any PR on 2.7 because of > that. We all run a race to be the first one to merge a change into > 2.7. The next one will get a "conflict" even if the merged commit is > unrelated (as I described: two different unrelated directories). Couldn't miss-islington be configured to automatically rebase those (when there is no actual conflict)? The "Allow edit from maintainer" option would most likely permit pushing to the PR's branch. -- M On Tue, 29 May 2018 at 05:26, Nick Coghlan wrote: > On 29 May 2018 at 08:52, Ned Deily wrote: > >> On May 28, 2018, at 17:42, Victor Stinner wrote: >> > Please use the same configuration for 2.7, 3.6, 3.7 and master branches! >> >> Sounds reasonable.
I've updated the 2.7 configuration to match the others >> and not require the PR branch be up to date before merging, meaning the CI >> test might be against an older view of the branch. If it proves to be a >> problem, we can revisit it for all of the branches. I am not going to >> change the settings for 3.5 and 3.4 as they are in security-fix mode and >> only their release manager can merge changes for those. >> > > Turning the setting on for security-fix branches sounds like a good idea > to me, since it's basically a "stability-of-target-branch vs > ease-of-merging" trade-off: by setting it, you ensure that your pre-merge > CI checks reflect the state of the post-merge branch, whereas the default > setting means your post-merge branch state represents a never-before-tested > combination of software. > > The default state works pretty well for us since we don't kick off test > runs on the BuildBot fleet for every PR, so there's always some additional > testing that only happens post-merge anyway. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From paul at ganssle.io Tue May 29 11:36:47 2018 From: paul at ganssle.io (Paul G) Date: Tue, 29 May 2018 11:36:47 -0400 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> Message-ID: <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> This doesn't seem like the best idea, since it would kick off dozens or hundreds of builds for every commit to the master branch. On 05/29/2018 11:25 AM, Matthias Bussonnier wrote: >> As I wrote, it became very difficult to merge any PR on 2.7 because of >> that. We all run a race to be the first one to merge a change into >> 2.7. The next one will get a "conflict" even if the merged commit is >> unrelated (as I described: two different unrelated directories). > > Couldn't miss-islington be configured to automatically rebase those (when > there is no actual conflict). The "Allow edit from maintainer" would most > likely permit to push on the PRs branch. > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/paul%40ganssle.io > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: OpenPGP digital signature URL: From nad at python.org Tue May 29 12:11:12 2018 From: nad at python.org (Ned Deily) Date: Tue, 29 May 2018 12:11:12 -0400 Subject: [Python-Dev] Python 3.7.0 updated schedule: beta 5 cutoff in 24 hours Message-ID: <333C2C43-459C-46C4-A3AE-637FA05D0057@python.org> Here's an update on the 3.7.0 endgame. As announced several days ago, we made the difficult decision to hold back on 3.7.0rc1 due primarily to some unexpected difficulties being seen downstream due to changes in how docstrings were handled in 3.7.0 (details below). 
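For context on the docstring change being referred to here (bpo-32911): the early 3.7 betas moved docstrings out of the AST node body, so tools that assume the docstring is the first statement in Module.body stopped working, while ast.get_docstring() works either way. A minimal sketch of the two patterns, using my own toy source rather than anything from the issue:

    import ast

    source = '"""Module docstring."""\n\nx = 1\n'
    tree = ast.parse(source)

    # Fragile: assumes the docstring is literally the first statement in the body.
    # This is the pattern that broke on the pre-revert 3.7 betas.
    first = tree.body[0]
    if isinstance(first, ast.Expr) and isinstance(first.value, ast.Str):
        print("body[0] docstring:", first.value.s)

    # Robust: works regardless of where the docstring is stored.
    print("ast.get_docstring():", ast.get_docstring(tree))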
After some discussions about various approaches, we agreed on a solution that should minimize downstream impact without losing all the benefits of the existing 3.7 changes. Thanks to a lot of work over the long weekend by a number of people, that solution is now merged in the 3.7 branch. In parallel with that, a number of people spent a lot of time looking at CI and buildbot test failures, mostly intermittent ones. As a result, a number of actual bugs were fixed and problems with a number of tests were fixed, which should make the test suite more robust. All this is good news. Primarily because of the important user-facing changes made with the AST docstring API, I feel we need to do one more beta release before we are ready for the release candidate. About 24 hours from now, approximately 2018-05-30 18:00 UTC, I plan to tag and start manufacturing 3.7.0b5. This will be a short beta cycle, aimed mainly at users of the AST API so they can recheck that their packages work with 3.7.0. Assuming all goes well, we will then plan to tag 3.7.0rc1 on 2018-06-11 and 3.7.0 final on 2018-06-27. I am also rescheduling 3.6.6rc1 and 3.6.6 final to match the new 3.7.0 dates. All fixes that have been merged into the 3.7 branch as of the cutoff tomorrow will be in 3.7.0b5, and fixes merged afterwards will be in 3.7.0rc1 up to its cutoff point. After the 3.7.0rc1 cutoff, 3.7 merges will appear in 3.7.1. Please continue to exercise diligence when deciding whether a change is appropriate for 3.7; as a rule of thumb, treat the 3.7 branch as if it were already released and in maintenance mode. Please also pay attention to CI test failures and buildbot test failures and see if you can help resolve them. I want to thank everyone who has been involved so far in helping us through this endgame and who has given up their personal time to work on making Python better. I, for one, am deeply grateful. 2018-05-30 3.7.0b5 2018-06-11 3.7.0rc1 & 3.6.6rc1 2018-06-27 3.7.0final & 3.6.6final --Ned On May 25, 2018, at 01:33, Ned Deily wrote: > On May 24, 2018, at 03:23, Ned Deily wrote: >> On May 23, 2018, at 09:13, Ned Deily wrote: >>> On May 23, 2018, at 07:45, Serhiy Storchaka wrote: >>>> Is it possible to add yet one more beta instead? >>>> CI was broken for the last few days, tests still do not pass on my computer (and fail on some buildbots), updating What's New exposed new features which need additional testing (and maybe fixing or reverting), and I'm not comfortable about some changes which would be harder to fix after the release. >>> it is possible but there's no point in doing either another beta or a release candidate until we understand and address the current blocking issues, like the major buildbot failures. We have another 24 hours until rc1 was planned to be tagged. Let's keep working on the known issues and we will make a decision then. >> An update: thanks to a lot of effort over the past day by a number of >> people (including Victor, Serhiy, Christian, Zach, and others I'm sure >> I'm forgetting - my apologies), we have addressed all of the "release >> blocker" issues and all but one of the persistent failures on the 3.7 >> stable buildbots. We should have the couple of remaining "deferred >> blockers" including the remaining stable buildbots in green status by >> later today. At that point, we will be ready to tag 3.7.0rc1 and begin >> producing the release candidate artifacts. > Further update: some good news and some changes.
> > The good news is that we have resolutions for all of the previous release and deferred blockers. Thanks to a number of people for continuing to help get the remaining stable buildbot issues taken care of along with some lingering bugs. > > The not-quite-as-good news is that we have had more discussions about some unexpected incompatibilities that have shown up with downstream user testing with the AST docstrings changes in place (see bpo-32911). We have had some previous discussions about the expected user impact and, earlier in the beta phase, I encouraged us to stay the course with the feature as implemented. But I am now persuaded that we owe it to our users to take one more look at this to make sure we do not force them to make changes for 3.7 and then once again for 3.8. More details are in the bug tracker issue; I strongly encourage those of us who have been involved with this to "vote" there on the proposal to either (A) proceed with the release of the current implementation in 3.7.0 or (B) revert the feature in 3.7.0 and retarget for 3.8. Should the consensus be to revert (B), we will plan to have one more fast-track beta release (b5) prior to the release candidate, in order to allow downstream users to tes > t their projects with the removal. PLEASE, keep the discussion about this on the bug tracker (and not here!) and keep it brief so we can move forward quickly. Because of the upcoming 3-day holiday weekend in some countries, I have set Tue 2018-05-29 18:00 UTC as a cutoff for "voting" but, if a clear consensus emerges earlier, we will likely cut the discussion short. So chime in now on the bug tracker if you have a stake in this issue. > > https://bugs.python.org/issue32911 > > This does mean that yesterday's "last chance" has been extended a bit, at most a few days. I will let you know as soon as we have made a decision about the feature and will provide updated 3.7.0 schedule info at that time. -- Ned Deily nad at python.org -- [] From bussonniermatthias at gmail.com Tue May 29 12:24:45 2018 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Tue, 29 May 2018 09:24:45 -0700 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> Message-ID: On Tue, 29 May 2018 at 08:43, Paul G wrote: > This doesn't seem like the best idea, since it would kick off dozens or > hundreds of builds for every commit to the master branch. > Sorry if I was unclear, I was not suggesting to do that for PRs against master, but do that only for 2.7, 3.4 and 3.5 only. (or Branch having "Require branches to be up to date before merging ") For these there is likely only a couple of open PRs, and it won't kick more jobs than if the author had to rebase and push by themselves. If you still think it's too much or triggered too often, you can also just have a tag "Miss-islington please rebase", which is often easier to apply manually, than having to do that by yourself on the Command line. -- M -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vstinner at redhat.com Tue May 29 16:01:16 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 29 May 2018 22:01:16 +0200 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> Message-ID: GitHub provides an [Update branch] button. It seems like the button does a rebase, no? Victor 2018-05-29 18:24 GMT+02:00 Matthias Bussonnier : > On Tue, 29 May 2018 at 08:43, Paul G wrote: >> >> This doesn't seem like the best idea, since it would kick off dozens or >> hundreds of builds for every commit to the master branch. > > > Sorry if I was unclear, I was not suggesting to do that for PRs against > master, but do that only for 2.7, 3.4 and 3.5 only. (or Branch having > "Require branches to be up to date before merging ") > For these there is likely only a couple of open PRs, and it won't kick more > jobs than if the author had to rebase and push by themselves. > > If you still think it's too much or triggered too often, you can also just > have a tag "Miss-islington please rebase", which is often easier to apply > manually, than having to do that by yourself on the Command line. > > -- > M > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > From mhroncok at redhat.com Tue May 29 16:12:29 2018 From: mhroncok at redhat.com (=?UTF-8?Q?Miro_Hron=c4=8dok?=) Date: Tue, 29 May 2018 22:12:29 +0200 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> Message-ID: On 29.5.2018 22:01, Victor Stinner wrote: > GitHub provides an [Update branch] button. It seems like the button > does a rebase, no? AFAIK it merges the target branch into the PR branch. No rebase. -- Miro Hrončok -- Phone: +420777974800 IRC: mhroncok From vstinner at redhat.com Tue May 29 16:35:31 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 29 May 2018 22:35:31 +0200 Subject: [Python-Dev] Troubles to merge changes in the 2.7 branch: PR "out-of-date" branch In-Reply-To: References: <163a70fca60.27a3.db5b03704c129196a4e9415e55413ce6@gmail.com> <821FDA20-E5F6-4650-8754-DD77A909DB5A@python.org> <343A5884-9396-45A7-9302-0271EF9A7C8C@python.org> <18ed35ef-a58d-7a4d-f2ed-be1f8ccb2bd1@ganssle.io> Message-ID: 2018-05-29 22:12 GMT+02:00 Miro Hrončok : > On 29.5.2018 22:01, Victor Stinner wrote: >> >> GitHub provides an [Update branch] button. It seems like the button >> does a rebase, no? > > AFAIK it merges the target branch into the PR branch. No rebase. Oh right! And it's the better option :-) Rebases cause confusion in reviews; merges are fine for reviews. Victor From vstinner at redhat.com Wed May 30 05:33:22 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 30 May 2018 11:33:22 +0200 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression Message-ID: Hi, I fixed a few tests which failed randomly.
There are still a few, but the most annoying ones have been fixed. I will continue to keep an eye on our CIs: Travis CI, AppVeyor and the buildbots. Sometimes, even when I report a regression, the author doesn't fix the bug. But when a test fails on Travis CI or AppVeyor, we cannot merge a pull request, so it blocks our whole workflow. I will restart the policy that I proposed last year: if a change introduces a regression and I'm unable to fix it quickly (say in less than 2 hours and the author isn't available), I will simply revert the change. Please don't take it personally; the purpose is just to unblock our workflow and be able to detect other regressions. It's just an opportunity for the change author to fix the change without the pressure of a broken CI. Buildbots only send email notifications to buildbot-status at python.org when the state changes from success (green) to failure (red). It's much simpler for me to spot a regression when most buildbots are green. By the way, while miss-islington is really an amazing tool (thanks Mariatta!!!), most core developers (including me!) forgot that 2 years ago, the policy was to check that a change doesn't break the buildbots before backporting it (well, 2 years ago we forward-ported changes, but it's the same idea ;-)). Please remember that we only run tests on Linux and Windows as pre-commit checks on GitHub pull requests, whereas it's very common to spot bugs on buildbots thanks to the diversity of platforms and architectures (and different performance). All buildbot builders can be watched at: http://buildbot.python.org/all/#/builders But I prefer to watch notifications on IRC (#python-dev on Freenode) and the buildbot-status mailing list. https://mail.python.org/mm3/mailman3/lists/buildbot-status.python.org/ -- Night gathers, AND NOW MY WATCH BEGINS. IT SHALL NOT END UNTIL MY DEATH. I shall take no wife, hold no lands, father no children. I shall wear no crowns and win no glory. I shall live and die at my post. I am the sword in the darkness. I am the watcher on the walls. I am the fire that burns against cold, the light that brings the dawn, the horn that wakes the sleepers, the shield that guards the realms of men. I pledge my life and honor to the Night's Watch, for this night and all the nights to come. Victor From vstinner at redhat.com Wed May 30 06:01:32 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 30 May 2018 12:01:32 +0200 Subject: [Python-Dev] How to watch buildbots? Message-ID: Hi, I would like to delegate the maintenance task "watch buildbots", since I'm already very busy with many other maintenance tasks. I'm looking for volunteers to handle incoming emails on buildbot-status. I already started to explain to Pablo Galindo Salgado how to do that, but it would be great to have at least two people doing this task. Otherwise, Pablo wouldn't be able to take a holiday or just take a break for any reason. Buildbots are evil beasts which require care every day. Otherwise, they quickly turn red and become less useful :-( It seems like the first blocker issue is that we have no explicit documentation on "how to deal with buildbots?" (the devguide documentation is incomplete; it doesn't explain what I'm explaining below). Let me start with a few notes on how I watch buildbots.
I'm getting buildbot notifications on IRC (#python-dev on Freenode) and on the buildbot-status mailing list: https://mail.python.org/mm3/mailman3/lists/buildbot-status.python.org/ When a buildbot fails, I look at the test logs and try to check whether an issue has already been reported. For example, search for the test method in the title (ex: "test_complex" for the test_complex() method). If there is no result, search using the test filename (ex: "test_os" for Lib/test/test_os.py). If there is still no result, repeat with full text searches ("All Text"). If you cannot find any open bug, create a new one: * The title should contain the test name, test method and the buildbot name. Example: "test_posix: TestPosixSpawn fails on PPC64 Fedora 3.x". * The description should contain the link to the buildbot failure. Try to identify useful parts of the test log and copy them into the description. * Fill the Python version field (ex: "3.8" for 3.x buildbots). * Select at least the "Tests" Component. You may select additional Components depending on the bug. If a bug was already open, you may add a comment to mention that there is a new failure: add at least a link to the buildbot name and a link to the failure. And that's all! Simple, isn't it? At this stage, there is no need to investigate the test failure. To finish, reply to the failure notification on the mailing list with a very short email: add a link to the existing or the freshly created issue, and maybe copy one line of the failure and/or the issue title. Recent bug example: https://bugs.python.org/issue33630 -- Later, you may want to analyze these failures, but I consider that a different job (a different "maintenance task"). If you don't feel able to analyze the bug, you may try to find someone who knows more than you about the failure. For better bug reports, you can look at the [Changes] tab of a build failure and try to identify which recent change introduced the regression. This task requires following recent commits, since sometimes the failure is old and the test just fails randomly depending on network issues, system load, or anything else. Sometimes, previous tests have side effects. Or the buildbot owner made a change on the system. There are many different explanations; it's hard to write a complete list. It's really on a case by case basis. Hopefully, it's now more common that a buildbot failure is obvious and caused by a very specific recent change which can be found in the [Changes] tab. -- If you are interested in helping me watch our CIs, please come to the python-buildbot at python.org mailing list! Introduce yourself and explain how you plan to help. I may offer to mentor you to assist you during the first weeks. As I wrote, maybe a first step would be to write down documentation on how to deal with buildbots and/or update and complete the existing documentation. https://devguide.python.org/buildbots/ Victor From storchaka at gmail.com Wed May 30 06:43:47 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 30 May 2018 13:43:47 +0300 Subject: [Python-Dev] Compact GC Header In-Reply-To: <40wG5m4Yl2zFr4F@mail.python.org> References: <40wG5m4Yl2zFr4F@mail.python.org> Message-ID: 29.05.18 17:15, Steve Dower wrote: > Looks like it breaks the 3.7 ABI, which is certainly not allowed at this > time. But it's not a limited API structure, so no problem for 3.8. Looks like it breaks only extensions that use the private macros _PyObject_GC_TRACK, _PyObject_GC_UNTRACK and _PyObject_GC_IS_TRACKED.
Those that use only public functions PyObject_GC_Track() and PyObject_GC_UnTrack() shouldn't be affected. From vano at mail.mipt.ru Wed May 30 08:30:39 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 30 May 2018 15:30:39 +0300 Subject: [Python-Dev] How to watch buildbots? In-Reply-To: References: Message-ID: On 30.05.2018 13:01, Victor Stinner wrote: > Hi, > > I would like to delegate the maintenance task "watch buildbots", since > I'm already very busy with many other maintenance tasks. I'm looking > for volunteers to handle incoming emails on buildbot-status. I already > started to explain to Pablo Galindo Salgado how to do that, but it > would be great to have at least two people doing this task. Otherwise, > Pablo wouldn't be able to take holiday or just make a break for any > reason. Buildbots are evil beast which require care every day. > Otherwise, they quickly turn red and become less useful :-( > > It seems like the first blocker issue is that we have no explicit > documentation "how to deal with buildbots?" (the devguide > documentation is incomplete, it doesn't explain what I'm explaining > below). Let me start with a few notes of how I watch buildbots. > > I'm getting buildbot notifications on IRC (#python-dev on Freenode) > and on the buildbot-status mailing list: > https://mail.python.org/mm3/mailman3/lists/buildbot-status.python.org/ > > When a buildbot fails, I look at tests logs and I try to check if an > issue has already been reported. For example, search for the test > method in title (ex: "test_complex" for test_complex() method). If no > result, search using the test filename (ex: "test_os" for > Lib/test/test_os.py). If there is no result, repeat with full text > searchs ("All Text"). If you cannot find any open bug, create a new > one: > > * The title should contain the test name, test method and the buildbot > name. Example: " test_posix: TestPosixSpawn fails on PPC64 Fedora > 3.x". > * The description should contain the link to the buildbot failure. Try > to identify useful parts of tests log and copy them in the > description. > * Fill the Python version field (ex: "3.8" for 3.x buildbots) > * Select at least the "Tests" Component. You may select additional > Components depending on the bug. > > If a bug was already open, you may add a comment to mention that there > is a new failure: add at least a link to buildbot name and a link to > the failure. > > And that's all! Simple, isn't it? At this stage, there is no need to > investigate the test failure. > > To finish, reply to the failure notification on the mailing list with > a very short email: add a link to the existing or the freshly created > issue, maybe copy one line of the failure and/or the issue title. > > Recent bug example: https://bugs.python.org/issue33630 > > -- > > Later, you may want to analyze these failures, but I consider that > it's a different job (different "maintenance task"). If you don't feel > able to analyze the bug, you may try to find someone who knows more > than you about the failure. > > For better bug reports, you can look at the [Changes] tab of a build > failure, and try to identify which recent change introduced the > regression. This task requires to follow recent commits, since > sometimes the failure is old, it's just that the test fails randomly > depending on network issues, system load, or anything else. Sometimes, > previous tests have side effects. Or the buildbot owner made a change > on the system. 
There are many different explanation, it's hard to > write a complete list. It's really on a case by case basis. > > Hopefully, it's now more common that a buildbot failure is obvious and > caused by a very specific recent changes which can be found in the > [Changes] tab. > > -- > > If you are interested to help me on watching our CIs: please come on > the python-buildbot at python.org mailing list! Introduce yourself and > explain how do you plan to help. I may propose to mentor you to assist > you the first weeks. > > As I wrote, maybe a first step would be to write down a documentation > how to deal with buildbots and/or update and complete existing > documentations. > > https://devguide.python.org/buildbots/ > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru What's the big idea of separate buildbots anyway? I thought the purpose of CI is to test everything _before_ it breaks the main codebase. Then it's the job of the contributor rather than maintainer to fix any breakages. So, maybe making them be driven by Github checks would be a better time investment. Especially since we've got VSTS checks just recently, so whoever was doing that still knows how to interface with this Github machinery. If the bots cancel a previous build if a new one for the same PR arrives, this will not lead to a significant load difference 'cuz the number of actively developed PRs is stable and roughly equal to the number of merges according to the open/closed tickets dynamics. -- Regards, Ivan From ncoghlan at gmail.com Wed May 30 09:16:52 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 30 May 2018 23:16:52 +1000 Subject: [Python-Dev] Compact GC Header In-Reply-To: References: <40wG5m4Yl2zFr4F@mail.python.org> Message-ID: On 30 May 2018 at 20:43, Serhiy Storchaka wrote: > 29.05.18 17:15, Steve Dower ????: > >> Looks like it breaks the 3.7 ABI, which is certainly not allowed at this >> time. But it?s not a limited API structure, so no problem for 3.8. >> > > Looks like it breaks only extensions that use private macros > _PyObject_GC_TRACK, _PyObject_GC_UNTRACK and _PyObject_GC_IS_TRACKED. Those > that use only public functions PyObject_GC_Track() and > PyObject_GC_UnTrack() shouldn't be affected. The ABI concern is with PyGC_Head changing size, as that's a public struct definition in a public header - even though the macros for working with it are marked as private, the struct itself isn't. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Wed May 30 09:36:44 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 30 May 2018 23:36:44 +1000 Subject: [Python-Dev] How to watch buildbots? In-Reply-To: References: Message-ID: On 30 May 2018 at 22:30, Ivan Pozdeev via Python-Dev wrote: > What's the big idea of separate buildbots anyway? I thought the purpose of > CI is to test everything _before_ > it breaks the main codebase. Then it's the job of the contributor rather > than maintainer to fix any breakages. > > So, maybe making them be driven by Github checks would be a better time > investment. > Especially since we've got VSTS checks just recently, so whoever was doing > that still knows how to interface with this Github machinery. 
> > If the bots cancel a previous build if a new one for the same PR arrives, > this will not lead to a significant load difference 'cuz the number of > actively developed PRs is stable and roughly equal to the number of merges > according to the open/closed tickets dynamics. > There are a few key details here: 1. We currently need to run post-merge CI anyway, as we're not doing linearised commits (where core devs just approve a change without merging it, and then a gating system like Zuul ensures that the tests are run against the latest combination of the target branch and the PR before merging the change) 2. Since the buildbots are running on donated dedicated machines (rather than throwaway instances from a dynamic CI provider), we need to review the code before we let it run on the contributed systems 3. The buildbot instances run *1* build at a time, which would lead to major PR merging bottlenecks during sprints if we made them a gating requirement 4. For the vast majority of PRs, the post-merge cross-platform testing is a formality, since the code being modified is using lower level cross-platform APIs elsewhere in the standard library, so if it works on Windows, Linux, and Mac OS X, it will work everywhere Python runs 5. We generally don't *want* to burden new contributors with the task of dealing with the less common (or harder to target) platforms outside the big 3 - when they do break, it often takes a non-trivial amount of platform knowledge to understand what's different about the platform in question Cheers, Nick. P.S. That said, if VSTS or Travis were to offer FreeBSD as an option for pre-merge CI, I'd suggest we enable it, at least in an advisory capacity - it's a better check against Linux-specific assumptions creeping into the code base than Mac OS X, since the latter is regularly different enough from other *nix systems that we need to give it dedicated code paths. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Wed May 30 10:06:39 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 30 May 2018 16:06:39 +0200 Subject: [Python-Dev] How to watch buildbots? In-Reply-To: References: Message-ID: 2018-05-30 14:30 GMT+02:00 Ivan Pozdeev via Python-Dev : > What's the big idea of separate buildbots anyway? I thought the purpose of > CI is to test everything _before_ > it breaks the main codebase. Then it's the job of the contributor rather > than maintainer to fix any breakages. I will answer more generally. Technically, buildbots support to send emails to author of changes which introduced a regression. But a build may test a single change or dozens of new changes. Moreover, our test suite is not perfect: they are at least 5 known tests which fail randomly. Even if we fix these unstable tests, it's also "common" that buildbots fail for "external" reasons: * network failure: fail to clone the GitHub repository * functional test using an external service and the service is down. I started to list external services used by "unit" tests: http://vstinner.readthedocs.io/cpython.html#services-used-by-unit-tests * vacuum cleaner: https://mail.python.org/pipermail/python-buildbots/2017-June/000122.html * many other random reasons... Since two years, I'm trying to fix all tests failing randomly, but as I just explained, it's really hard to get a failure rate of 0%. I'm not sure that we can "require" authors of pull requests to understand buildbot failures... 
So I prefer to keep the status quo: filter buildbot failures manually. Victor From vstinner at redhat.com Wed May 30 10:28:22 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 30 May 2018 16:28:22 +0200 Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7 Message-ID: Hi, tl;dr: I will withdraw PEP 546 in one week if nobody shows up to finish the implementation. Last year, I wrote PEP 546 with Cory Benfield: "Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7" https://www.python.org/dev/peps/pep-0546/ The plan was to get a Python 2.7 implementation of Cory's PEP 543: "A Unified TLS API for Python" https://www.python.org/dev/peps/pep-0543/ Sadly, it seems like Cory is no longer available to work on the project (PEP 543 is still a draft). PEP 546 is implemented: https://github.com/python/cpython/pull/2133 Well, I closed it, but you can still get it as a patch with: https://patch-diff.githubusercontent.com/raw/python/cpython/pull/2133.patch But tests fail on Travis CI, whereas I'm unable to reproduce the issue on my laptop (on Fedora). The failure seems to depend on the version of OpenSSL. Christian Heimes has a "multissl" tool which automates tests on multiple OpenSSL versions, but I failed to find time to try this tool. Time flies, and one year later the PEP 546 PR is still not merged and the tests are still failing. One month ago, when 2.7.15 was released, Benjamin Peterson, Python 2.7 release manager, simply proposed: "The lack of movement for a year makes me wonder if PEP 546 should be moved to Withdrawn status." Since, again, I failed to find time to look at the test_ssl failure, I plan to withdraw the PEP next week if nobody shows up :-( Sorry Python 2.7! Would anyone benefit from MemoryBIO in Python 2.7? Twisted, asyncio, trio, urllib3, anyone else? If yes, who is volunteering to finish the MemoryBIO backport (and maintain it)? Victor From njs at pobox.com Wed May 30 12:02:02 2018 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 30 May 2018 09:02:02 -0700 Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7 In-Reply-To: References: Message-ID: On Wed, May 30, 2018, 07:30 Victor Stinner wrote: > Would anyone benefit from MemoryBIO in Python 2.7? Twisted, > asyncio, trio, urllib3, anyone else? Asyncio and trio are strongly py3-only. Twisted's TLS functionality is built around pyopenssl, so the stdlib ssl module doesn't affect them. Urllib3 uses the socket-wrapping APIs, not MemoryBIO. So fwiw I don't think any of those projects would benefit. -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From python-list-bounces at python.org Wed May 30 12:16:36 2018 From: python-list-bounces at python.org (python-list-bounces at python.org) Date: Wed, 30 May 2018 12:16:36 -0400 Subject: [Python-Dev] Forward of moderated message Message-ID: An embedded message was scrubbed... From: "Oscar Ortiz Garcia (Axelerate LLC)" Subject: Windows Application Issue | Python Software Foundation | REF # 12927041 Date: Wed, 30 May 2018 16:03:51 +0000 Size: 71473 URL: From brett at python.org Wed May 30 12:58:47 2018 From: brett at python.org (Brett Cannon) Date: Wed, 30 May 2018 09:58:47 -0700 Subject: [Python-Dev] Forward of moderated message In-Reply-To: References: Message-ID: If you look you will see this is being executed from within Kodi, so this is probably an embedding situation where Kodi has a bug and they are triggering a crash in the interpreter.
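For reference on the API discussed in the PEP 546 thread above: in Python 3, ssl.MemoryBIO and ssl.SSLObject let you run TLS over any byte transport by pumping the handshake yourself instead of wrapping a socket. A minimal client-side sketch (my own illustration, not the PEP's reference implementation; it assumes network access to example.org):

    import socket
    import ssl

    hostname = "example.org"
    ctx = ssl.create_default_context()
    incoming = ssl.MemoryBIO()   # bytes received from the peer, fed into the SSLObject
    outgoing = ssl.MemoryBIO()   # bytes produced by the SSLObject, to be sent to the peer
    sslobj = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)

    with socket.create_connection((hostname, 443)) as sock:
        # Pump the handshake: retry while the SSLObject needs more data from the wire.
        while True:
            try:
                sslobj.do_handshake()
                break
            except ssl.SSLWantReadError:
                sock.sendall(outgoing.read())    # flush our handshake bytes to the peer
                incoming.write(sock.recv(4096))  # feed the peer's reply back in
        sock.sendall(outgoing.read())  # flush any final handshake bytes
        print("negotiated", sslobj.version(), "with", hostname)

The PEP's backport would have made this same pattern available on Python 2.7.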
On Wed, 30 May 2018 at 09:22 wrote: > > > > ---------- Forwarded message ---------- > From: "Oscar Ortiz Garcia (Axelerate LLC)" > To: "python-list at python.org" > Cc: Windows Developer Engagements - AppQuality > Bcc: > Date: Wed, 30 May 2018 16:03:51 +0000 > Subject: Windows Application Issue | Python Software Foundation | REF # > 12927041 > > Dear Windows developer, > > I?m a program manager in the Partner App Experience Team at Microsoft. > We?re reaching out to notify you of a potential issue in one of your > applications. This issue is automatically generated by Windows Error > Reporting, a system which generates an error report whenever a crash > happens on user?s machine. > > The issue details are below. Our goal is to work with you to address this > issue and to understand what your expected timeline to address this issue > might be. If you have any questions about the details below or have already > addressed this issue in a forthcoming update, please let me know. > > *Issue* > > *Appcrash in > FAIL_FAST_FATAL_APP_EXIT_DETOURS_c0000409_python27.dll!PyMapping_HasKey* > > *Product* > > Python > > *Module* > > python27.dll > > *Module Version* > > 2.7.12150.1013 > > *OS* > > Windows 10 Insider Preview, Windows 10 > > *OS Version* > > 17134, 16299, 15063, 14393 > > *OS Architecture* > > x64, x86 > > *Environment* > > Desktop > > *Occurrence* > > More than 0.1M hits in the last 14 days > > *Impacted Locations* > > US, Canada, Europe, Brazil > > *Triage Notes* > > PROCESS_NAME: kodi.exe > > ERROR_CODE: (NTSTATUS) 0xc0000409 - The system detected an overrun of a > stack-based buffer in this application. This overrun could potentially > allow a malicious user to gain control of this application. > > EXCEPTION_CODE: (NTSTATUS) 0xc0000409 - The system detected an overrun of > a stack-based buffer in this application. This overrun could potentially > allow a malicious user to gain control of this application. > > > > WATSON_BKT_PROCVER: 17.6.0.0 > > WATSON_BKT_MODULE: ucrtbase.dll > > WATSON_BKT_MODVER: 10.0.17134.1 > > > > PROBLEM_CLASSES: > > > > ID: [0n278] > > Type: [FAIL_FAST] > > Class: Primary > > Scope: DEFAULT_BUCKET_ID (Failure Bucket ID prefix) > > BUCKET_ID > > Name: Add > > Data: Omit > > PID: [Unspecified] > > TID: [Unspecified] > > Frame: [0] > > > > ID: [0n267] > > Type: [FATAL_APP_EXIT] > > Class: Addendum > > Scope: DEFAULT_BUCKET_ID (Failure Bucket ID prefix) > > BUCKET_ID > > Name: Add > > Data: Omit > > PID: [Unspecified] > > TID: [Unspecified] > > Frame: [0] > > > > ID: [0n138] > > Type: [DETOURS] > > Class: Addendum > > Scope: DEFAULT_BUCKET_ID (Failure Bucket ID prefix) > > BUCKET_ID > > Name: Add > > Data: Omit > > PID: [0x3308] > > TID: [0x2548] > > Frame: [0] : ucrtbase!abort > > > > BUGCHECK_STR: FAIL_FAST_FATAL_APP_EXIT_DETOURS > > > > STACK_TEXT: > > 0dd3db08 56e74b1b 00650063 00610076 003a006c ucrtbase!abort+0x4b > > WARNING: Stack unwind information not available. Following frames may be > wrong. 
> > 0dd3db54 56e3d1b0 56f6ae14 118a9190 00000000 python27+0x164b1b > > 0dd3dbd4 56e3cded 12db2468 00000000 118a9190 python27+0x12d1b0 > > 0dd3dc0c 56e3c70f 12db6338 12db25a0 12dbe810 python27+0x12cded > > 0dd3dc3c 56e5820c 12db6338 12dbe810 12dbe810 python27+0x12c70f > > 0dd3dc60 56e5be48 118a9190 12db6338 118a94c0 python27+0x14820c > > 0dd3dcc4 56e5b6ec 118a9190 118a94c0 0f288ce8 python27+0x14be48 > > 0dd3dce0 56e5b1b2 118a9190 0f288ce8 118a94c0 python27+0x14b6ec > > 0dd3dd18 56e5b972 56fbe178 118a9190 118a9190 python27+0x14b1b2 > > 0dd3dd48 56e5ae86 56fbe178 56fbe178 12da18d4 python27+0x14b972 > > 0dd3dd78 56e5886d 00000000 12da9270 12da9270 python27+0x14ae86 > > 0dd3dd98 56e3783a 12da18d4 12da9270 12da9270 python27+0x14886d > > 0dd3ddc8 56deccc7 00000000 12db8150 00000000 python27+0x12783a > > 0dd3dde8 56db1ac6 0d125f30 12db8150 00000000 python27+0xdccc7 > > 0dd3de04 56e3c6aa 0d125f30 12db8150 00000000 python27+0xa1ac6 > > 0dd3de20 56e3efcf 0d125f30 12db8150 00000000 python27+0x12c6aa > > 0dd3dea8 56e3cded 11b72800 00000000 118a8800 python27+0x12efcf > > 0dd3dee0 56e3c70f 12da8b18 11b72938 12da9270 python27+0x12cded > > 0dd3df10 56e5820c 12da8b18 12da9270 12da9270 python27+0x12c70f > > 0dd3df34 56e5be48 118a8800 12da8b18 118a86f0 python27+0x14820c > > 0dd3df98 56e5b6ec 118a8800 118a86f0 0f288ea8 python27+0x14be48 > > 0dd3dfb4 56e5b1b2 118a8800 0f288ea8 118a86f0 python27+0x14b6ec > > 0dd3dfec 56e5b972 12a6bd90 118a880d 118a8800 python27+0x14b1b2 > > 0dd3e01c 56e5ae86 12a6bd90 12a6bd90 12a6bdd4 python27+0x14b972 > > 0dd3e04c 56e5886d 00000000 12dbec00 12dbec00 python27+0x14ae86 > > 0dd3e06c 56e3783a 12a6bdd4 12dbec00 12dbec00 python27+0x14886d > > 0dd3e09c 56deccc7 00000000 12a44390 00000000 python27+0x12783a > > 0dd3e0bc 56db1ac6 0d125f30 12a44390 00000000 python27+0xdccc7 > > 0dd3e0d8 56e3c6aa 0d125f30 12a44390 00000000 python27+0xa1ac6 > > 0dd3e0f4 56e3efcf 0d125f30 12a44390 00000000 python27+0x12c6aa > > 0dd3e17c 56e3cded 12883370 00000000 118a8b30 python27+0x12efcf > > 0dd3e1b4 56e3c70f 12a58ec0 128834a8 12dbec00 python27+0x12cded > > 0dd3e1e4 56e5820c 12a58ec0 12dbec00 12dbec00 python27+0x12c70f > > 0dd3e208 56e5be48 118a8b30 12a58ec0 118a85e0 python27+0x14820c > > 0dd3e26c 56e5b6ec 118a8b30 118a85e0 0f289500 python27+0x14be48 > > 0dd3e288 56e5bc08 118a8b30 0f289500 118a85e0 python27+0x14b6ec > > 0dd3e2b8 56e5b724 118a8b30 118a8c40 12f14918 python27+0x14bc08 > > 0dd3e2d0 56e5b1b2 118a8b30 00000000 118a8c40 python27+0x14b724 > > 0dd3e308 56e5b972 128b1d90 118a8b38 118a8b30 python27+0x14b1b2 > > 0dd3e338 56e5ae86 128b1d90 128b1d90 12a6b9d4 python27+0x14b972 > > 0dd3e368 56e5886d 12a6b9d9 12d8e030 12d8e030 python27+0x14ae86 > > 0dd3e388 56e3783a 12a6b9d4 12d8e030 12d8e030 python27+0x14886d > > 0dd3e3b8 56deccc7 00000000 12a55f60 00000000 python27+0x12783a > > 0dd3e3d8 56db1ac6 0d125f30 12a55f60 00000000 python27+0xdccc7 > > 0dd3e3f4 56e3c6aa 0d125f30 12a55f60 00000000 python27+0xa1ac6 > > 0dd3e410 56e3efcf 0d125f30 12a55f60 00000000 python27+0x12c6aa > > 0dd3e498 56e3cded 12a70e40 00000000 118a6600 python27+0x12efcf > > 0dd3e4d0 56e3c70f 12a58e78 12a70f78 12d8e030 python27+0x12cded > > 0dd3e500 56e5820c 12a58e78 12d8e030 12d8e030 python27+0x12c70f > > 0dd3e524 56e5be48 118a6600 12a58e78 118a7f80 python27+0x14820c > > 0dd3e588 56e5b6ec 118a6600 118a7f80 0f2895e0 python27+0x14be48 > > 0dd3e5a4 56e5b1b2 118a6600 0f2895e0 118a7f80 python27+0x14b6ec > > 0dd3e5dc 56e5b972 128b1d90 118a6608 118a6600 python27+0x14b1b2 > > 0dd3e60c 56e5ae86 128b1d90 128b1d90 13148d74 python27+0x14b972 > > 
0dd3e63c 56e5886d 00000000 1132c9c0 1132c9c0 python27+0x14ae86 > > 0dd3e65c 56e3783a 13148d74 1132c9c0 1132c9c0 python27+0x14886d > > 0dd3e68c 56deccc7 00000000 12a55ed0 00000000 python27+0x12783a > > 0dd3e6ac 56db1ac6 0d125f30 12a55ed0 00000000 python27+0xdccc7 > > 0dd3e6c8 56e3c6aa 0d125f30 12a55ed0 00000000 python27+0xa1ac6 > > 0dd3e6e4 56e3efcf 0d125f30 12a55ed0 00000000 python27+0x12c6aa > > 0dd3e76c 56e3cded 12b4b590 00000000 118a5500 python27+0x12efcf > > 0dd3e7a4 56e3c70f 12a583c8 12b4b6c8 1132c9c0 python27+0x12cded > > 0dd3e7d4 56e5820c 12a583c8 1132c9c0 1132c9c0 python27+0x12c70f > > 0dd3e7f8 56e5be48 118a5500 12a583c8 118a72c0 python27+0x14820c > > 0dd3e85c 56e5b6ec 118a5500 118a72c0 0f2891b8 python27+0x14be48 > > 0dd3e878 56e5b1b2 118a5500 0f2891b8 118a72c0 python27+0x14b6ec > > 0dd3e8b0 56e5b972 128b1d90 118a5508 118a5500 python27+0x14b1b2 > > 0dd3e8e0 56e5ae86 128b1d90 128b1d90 1122e304 python27+0x14b972 > > 0dd3e910 56e5886d 00000000 128abf60 128abf60 python27+0x14ae86 > > 0dd3e930 56e3783a 1122e304 128abf60 128abf60 python27+0x14886d > > 0dd3e960 56deccc7 00000000 12a55960 00000000 python27+0x12783a > > 0dd3e980 56db1ac6 0d125f30 12a55960 00000000 python27+0xdccc7 > > 0dd3e99c 56e3c6aa 0d125f30 12a55960 00000000 python27+0xa1ac6 > > 0dd3e9b8 56e3efcf 0d125f30 12a55960 00000000 python27+0x12c6aa > > 0dd3ea40 56e3cded 128b81b8 00000000 118a60b9 python27+0x12efcf > > 0dd3ea78 56e3c70f 12a7d890 128b82f0 128abf60 python27+0x12cded > > 0dd3eaa8 56e5820c 12a7d890 128abf60 128abf60 python27+0x12c70f > > 0dd3eacc 56e5be48 118a60b9 12a7d890 118a52e0 python27+0x14820c > > 0dd3eb30 56e5b6ec 118a60b9 118a52e0 0f289a40 python27+0x14be48 > > 0dd3eb4c 56e5bc08 118a60b9 0f289a40 118a52e0 python27+0x14b6ec > > 0dd3eb7c 56e5b724 118a60b9 118a62d0 00000000 python27+0x14bc08 > > 0dd3eb94 56e5b1b2 118a60b9 00000000 118a62d0 python27+0x14b724 > > 0dd3ebcc 56e5b9a3 56fbe178 118a60b9 118a60b9 python27+0x14b1b2 > > 0dd3ebf0 56e5ae86 128b1d10 56fbe178 13a5f7b4 python27+0x14b9a3 > > 0dd3ec20 56e5886d 00000000 1132cdb0 1132cdb0 python27+0x14ae86 > > 0dd3ec40 56e3783a 13a5f7b4 1132cdb0 1132cdb0 python27+0x14886d > > 0dd3ec70 56deccc7 00000000 12a55840 00000000 python27+0x12783a > > 0dd3ec90 56db1ac6 0d125f30 12a55840 00000000 python27+0xdccc7 > > 0dd3ecac 56e3c6aa 0d125f30 12a55840 00000000 python27+0xa1ac6 > > 0dd3ecc8 56e3efcf 0d125f30 12a55840 00000000 python27+0x12c6aa > > 0dd3ed50 56e3cded 12a70738 00000000 118a3c98 python27+0x12efcf > > 0dd3ed88 56e3c70f 12a7d728 12a70870 1132cdb0 python27+0x12cded > > 0dd3edb8 56e5820c 12a7d728 1132cdb0 1132cdb0 python27+0x12c70f > > 0dd3eddc 56e5be48 118a3c98 12a7d728 118a30e0 python27+0x14820c > > 0dd3ee40 56e5b6ec 118a3c98 118a30e0 0f2887e0 python27+0x14be48 > > 0dd3ee5c 56e5bc08 118a3c98 0f2887e0 118a30e0 python27+0x14b6ec > > 0dd3ee8c 56e5b724 118a3c98 118a3da0 00000000 python27+0x14bc08 > > 0dd3eea4 56e5b1b2 118a3c98 00000000 118a3da0 python27+0x14b724 > > 0dd3eedc 56e5b9a3 56fbe178 118a3c98 118a3c98 python27+0x14b1b2 > > 0dd3ef00 56e5ae86 13a58bb0 56fbe178 132b31b4 python27+0x14b9a3 > > 0dd3ef30 56e5886d 00000000 1289ba50 1289ba50 python27+0x14ae86 > > 0dd3ef50 56e3783a 132b31b4 1289ba50 1289ba50 python27+0x14886d > > 0dd3ef80 56deccc7 00000000 12a555a0 00000000 python27+0x12783a > > 0dd3efa0 56db1ac6 0d125f30 12a555a0 00000000 python27+0xdccc7 > > 0dd3efbc 56e3c6aa 0d125f30 12a555a0 00000000 python27+0xa1ac6 > > 0dd3efd8 56e3efcf 0d125f30 12a555a0 00000000 python27+0x12c6aa > > 0dd3f060 56e3cded 13301900 00000000 118a40d0 python27+0x12efcf > > 0dd3f098 
56e3c70f 12a7d5c0 13301a38 1289ba50 python27+0x12cded > > 0dd3f0c8 56e5820c 12a7d5c0 1289ba50 1289ba50 python27+0x12c70f > > 0dd3f0ec 56e5be48 118a40d0 12a7d5c0 118a3850 python27+0x14820c > > 0dd3f150 56e5b6ec 118a40d0 118a3850 0f288738 python27+0x14be48 > > 0dd3f16c 56e5b1b2 118a40d0 0f288738 118a3850 python27+0x14b6ec > > 0dd3f1a4 56e5b972 13a58bb0 118a40d8 118a40d0 python27+0x14b1b2 > > 0dd3f1d4 56e5ae86 13a58bb0 56fbe178 12b57114 python27+0x14b972 > > 0dd3f204 56e5886d 00000000 1132c030 1132c030 python27+0x14ae86 > > 0dd3f224 56e3783a 12b57114 1132c030 1132c030 python27+0x14886d > > 0dd3f254 56deccc7 00000000 12a55270 00000000 python27+0x12783a > > 0dd3f274 56db1ac6 0d125f30 12a55270 00000000 python27+0xdccc7 > > 0dd3f290 56e3c6aa 0d125f30 12a55270 00000000 python27+0xa1ac6 > > 0dd3f2ac 56e3efcf 0d125f30 12a55270 00000000 python27+0x12c6aa > > 0dd3f334 56e3cded 12a71030 00000000 118a3300 python27+0x12efcf > > 0dd3f36c 56e3c70f 12a528d8 12a71168 1132c030 python27+0x12cded > > 0dd3f39c 56e5820c 12a528d8 1132c030 1132c030 python27+0x12c70f > > 0dd3f3c0 56e5be48 118a3300 12a528d8 118a3740 python27+0x14820c > > 0dd3f424 56e5b6ec 118a3300 118a3740 0f283c68 python27+0x14be48 > > 0dd3f440 56e5b1b2 118a3300 0f283c68 118a3740 python27+0x14b6ec > > 0dd3f478 56e5b972 13a58bb0 118a3308 118a3300 python27+0x14b1b2 > > 0dd3f4a8 56e5aec0 13a58bb0 13a58bb0 12a64cfc python27+0x14b972 > > 0dd3f4d8 56e5886d 00000000 12896f60 12896f60 python27+0x14aec0 > > 0dd3f4f8 56e3783a 12a64cf4 12896f60 12896f60 python27+0x14886d > > 0dd3f528 56deccc7 00000000 12a55180 00000000 python27+0x12783a > > 0dd3f548 56db1ac6 0d125f30 12a55180 00000000 python27+0xdccc7 > > 0dd3f564 56e3c6aa 0d125f30 12a55180 00000000 python27+0xa1ac6 > > 0dd3f580 56e3efcf 0d125f30 12a55180 00000000 python27+0x12c6aa > > 0dd3f608 56e3cded 11213188 00000000 118a363e python27+0x12efcf > > 0dd3f640 56e3c70f 12a52890 112132c0 12896f60 python27+0x12cded > > 0dd3f670 56e5820c 12a52890 12896f60 12896f60 python27+0x12c70f > > 0dd3f694 56e5be48 118a363e 12a52890 118a3520 python27+0x14820c > > 0dd3f6f8 56e5b6ec 118a363e 118a3520 0d0ba078 python27+0x14be48 > > 0dd3f714 56e5b1b2 118a363e 0d0ba078 118a3520 python27+0x14b6ec > > 0dd3f74c 56e5b9a3 56fbe178 118a363e 118a363e python27+0x14b1b2 > > 0dd3f770 56e5ae86 12b758b0 56fbe178 12b6869c python27+0x14b9a3 > > 0dd3f7a0 56e5886d 00000000 12c88780 12c88780 python27+0x14ae86 > > 0dd3f7c0 56e3783a 12b6869c 12c88780 12c88780 python27+0x14886d > > 0dd3f7f0 56deccc7 00000000 1107f300 00000000 python27+0x12783a > > 0dd3f810 56db1ac6 0d125f30 1107f300 00000000 python27+0xdccc7 > > 0dd3f82c 56e3c6aa 0d125f30 1107f300 00000000 python27+0xa1ac6 > > 0dd3f848 56e3efcf 0d125f30 1107f300 00000000 python27+0x12c6aa > > 0dd3f8d0 56e3cded 0f327da0 00000000 0d0783f0 python27+0x12efcf > > 0dd3f908 56e3c70f 12a83338 0f327ed8 12c88780 python27+0x12cded > > 0dd3f938 56e5820c 12a83338 12c88780 12c88780 python27+0x12c70f > > 0dd3f95c 56e5be48 0d0783f0 12a83338 0d078830 python27+0x14820c > > 0dd3f9c0 56e5b6ec 0d0783f0 0d078830 0d0b9f98 python27+0x14be48 > > 0dd3f9dc 56e5b1b2 0d0783f0 0d0b9f98 0d078830 python27+0x14b6ec > > 0dd3fa14 56e5b972 12b758b0 0d0783fe 0d0783f0 python27+0x14b1b2 > > 0dd3fa44 56e5aec0 12b758b0 12b758b0 12aa6f72 python27+0x14b972 > > 0dd3fa74 56e5886d 00000000 0f79b4b0 0f79b4b0 python27+0x14aec0 > > 0dd3fa94 56e3783a 12aa6f64 0f79b4b0 0f79b4b0 python27+0x14886d > > 0dd3fac4 56deccc7 00000000 128799f0 00000000 python27+0x12783a > > 0dd3fae4 56db1ac6 0d125f30 128799f0 00000000 python27+0xdccc7 > > 0dd3fb00 56e3c6aa 
0d125f30 128799f0 00000000 python27+0xa1ac6 > > 0dd3fb1c 56e3efcf 0d125f30 128799f0 00000000 python27+0x12c6aa > > 0dd3fba4 56e3cded 11733188 00000000 129f88a8 python27+0x12efcf > > 0dd3fbdc 56e3c70f 12b43ba8 117332c0 0f79b4b0 python27+0x12cded > > 0dd3fc0c 56e762ce 12b43ba8 0f79b4b0 0f79b4b0 python27+0x12c70f > > 0dd3fc28 56e740f8 1379db30 1307ce10 0f79b4b0 python27+0x1662ce > > 0dd3fc54 019a961b 0f2887e0 1307ce10 00000101 python27+0x1640f8 > > 0dd3fc78 019a8ed7 0f2887e0 0dd3fd84 11052190 kodi+0x6a961b > > 0dd3fdb0 0196cfa2 0c8f382c 0c8f3844 0c8f382c kodi+0x6a8ed7 > > 0dd3fdc4 019a8118 0c8f382c 0c8f3844 72f486d7 kodi+0x66cfa2 > > 0dd3fe0c 0185ff10 0c8f382c 0c8f3844 0c8f36c8 kodi+0x6a8118 > > 0dd3fe54 014809b1 72f4860f 0d09c618 01480890 kodi+0x55ff10 > > 0dd3fed4 76cbe16f 0c8f36c8 70c961a6 76cbe130 kodi+0x1809b1 > > 0dd3ff10 76868484 0d09c618 76868460 05a1e0fa > ucrtbase!thread_start+0x3f > > 0dd3ff24 77912ec0 0d09c618 c2cf1a0e 00000000 > kernel32!BaseThreadInitThunk+0x24 > > 0dd3ff6c 77912e90 ffffffff 7792dede 00000000 > ntdll!__RtlUserThreadStart+0x2f > > 0dd3ff7c 00000000 76cbe130 0d09c618 00000000 ntdll!_RtlUserThreadStart+0x1b > > > > > > MODULE_NAME: python27 > > BUCKET_ID_IMAGE_STR: python27.dll > > FAILURE_MODULE_NAME: python27 > > *Attachments* > > N/A > > *Reference #* > > 12927041 > > *Windows Insider Preview* > > If you would like to test your app in windows insider build , sign up to > the Windows Insider Program to get Windows 10 Insider Preview build. > https://insider.windows.com > > *Resource* > > For any questions on app development (or) submission on windows, contact > Windows Dev Center. https://developer.microsoft.com/en-us/windows/support > > > > Thank you for your continued support of Windows! > > Regards, > > [image: cid:iconId] > > AppEngage Team > Outreach and Engagement | Partner App Experience > EMAIL appengage at microsoft.com > > > > > > ---------- Forwarded message ---------- > From: python-list-bounces at python.org > To: python-dev at python.org > Cc: > Bcc: > Date: Wed, 30 May 2018 12:16:36 -0400 > Subject: [Python-Dev] Forward of moderated message > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.jpg Type: image/jpeg Size: 1584 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image002.jpg Type: image/jpeg Size: 1584 bytes Desc: not available URL: From steve.dower at python.org Wed May 30 14:46:04 2018 From: steve.dower at python.org (Steve Dower) Date: Wed, 30 May 2018 11:46:04 -0700 Subject: [Python-Dev] Forward of moderated message In-Reply-To: References: Message-ID: <19f1a920-2b2a-ae5d-b327-56f814b1d5f2@python.org> I doubt responding to python-list-bounces made it back, so I've added the emails from the original message. As Brett says, this is clearly someone else's build of Python (for starters, Python 2.7 should not be using ucrtbase.dll), so you would be best to track them down. Also, the best address for these issues is certainly python-dev at python.org and not python-list. I'm also happy to take them directly to at if it's easier to pass them on internally. 
Cheers,
Steve

On 30May2018 0958, Brett Cannon wrote:
> If you look you will see this is being executed from within Kodi, so
> this is probably an embedding situation where Kodi has a bug and they
> are triggering a crash in the interpreter.
>
> On Wed, 30 May 2018 at 09:22 wrote:
>
>     ---------- Forwarded message ----------
>     From: "Oscar Ortiz Garcia (Axelerate LLC)"
>     To: "python-list at python.org"
>     Cc: Windows Developer Engagements - AppQuality
>     Date: Wed, 30 May 2018 16:03:51 +0000
>     Subject: Windows Application Issue | Python Software Foundation |
>     REF # 12927041
>
>     [full crash report and stack trace snipped - identical to the
>     forwarded message above]

From vstinner at redhat.com  Wed May 30 17:21:23 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Wed, 30 May 2018 23:21:23 +0200
Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and
 ssl.SSLObject to Python 2.7
In-Reply-To: 
References: 
Message-ID: 

2018-05-30 18:02 GMT+02:00 Nathaniel Smith :
> On Wed, May 30, 2018, 07:30 Victor Stinner wrote:
>>
>> Does anyone would benefit of MemoryBIO in Python 2.7? Twisted,
>> asyncio, trio, urllib3, anyone else?
>
> Asyncio and trio are strongly py3-only.
> Twisted's TLS functionality is built around pyopenssl, so the stdlib
> ssl module doesn't affect them. Urllib3 uses the socket-wrapping APIs,
> not MemoryBIO. So fwiw I don't think any of those projects would
> benefit.

MemoryBIO was the key feature which allowed to implement TLS for the
ProactorEventLoop (IOCP) of asyncio.

I'm not sure that the Python 2.7 ssl module is a drop-in replacement
for pyopenssl.

Victor

From nad at python.org  Thu May 31 00:32:14 2018
From: nad at python.org (Ned Deily)
Date: Thu, 31 May 2018 00:32:14 -0400
Subject: [Python-Dev] [RELEASE] Python 3.7.0b5 bonus beta!
Message-ID: 

A 3.7 update:

Python 3.7.0b5 is now the final beta preview of Python 3.7, the next
feature release of Python.  3.7.0b4 was intended to be the final beta
but, due to some unexpected compatibility issues discovered during beta
testing of third-party packages, we decided to revert some changes in
how Python 3.7's Abstract Syntax Tree parser deals with docstrings;
3.7.0b5 now behaves like 3.6.x and previous releases (refer to the
3.7.0b5 changelog for more information).  **If your code makes use of
the ast module, you are strongly encouraged to test (or retest) that
code with 3.7.0b5, especially if you previously made changes to work
with earlier preview versions of 3.7.0.**  As always, please report
issues found to bugs.python.org as soon as possible.

Please keep in mind that this is a preview release and its use is not
recommended for production environments.

Attention macOS users: there is now a new installer variant for macOS
10.9+ that includes a built-in version of Tcl/Tk 8.6.  This variant is
expected to become the default version when 3.7.0 releases.  Check it
out!

The next (and final, we hope!) preview release will be the release
candidate, which is now planned for 2018-06-11, followed by the
official release of 3.7.0, now planned for 2018-06-27.

You can find Python 3.7.0b5 and more information here:
    https://www.python.org/downloads/release/python-370b5/

--
  Ned Deily
  nad at python.org -- []

From njs at pobox.com  Thu May 31 05:34:28 2018
From: njs at pobox.com (Nathaniel Smith)
Date: Thu, 31 May 2018 02:34:28 -0700
Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and
 ssl.SSLObject to Python 2.7
In-Reply-To: 
References: 
Message-ID: 

On Wed, May 30, 2018, 14:21 Victor Stinner wrote:
> 2018-05-30 18:02 GMT+02:00 Nathaniel Smith :
> > On Wed, May 30, 2018, 07:30 Victor Stinner wrote:
> >>
> >> Does anyone would benefit of MemoryBIO in Python 2.7? Twisted,
> >> asyncio, trio, urllib3, anyone else?
> >
> > Asyncio and trio are strongly py3-only. Twisted's TLS functionality
> > is built around pyopenssl, so the stdlib ssl module doesn't affect
> > them. Urllib3 uses the socket-wrapping APIs, not MemoryBIO. So fwiw
> > I don't think any of those projects would benefit.
>
> MemoryBIO was the key feature which allowed to implement TLS for the
> ProactorEventLoop (IOCP) of asyncio.

MemoryBIO is definitely super useful for async libraries -- trio uses
it, asyncio uses it, twisted uses it (via pyopenssl). But I don't know
of anyone who currently needs it but hasn't already found a way to get
it.

> I'm not sure that the Python 2.7 ssl module is a drop-in replacement
> for pyopenssl.

No, their APIs are totally different, for better or worse.

-n
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
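For readers who have not used the API under discussion: ssl.MemoryBIO
and ssl.SSLObject let you run a TLS connection over buffers you shuffle
yourself instead of over a socket, which is exactly what an IOCP-style
event loop needs. A rough client-side sketch follows; send_to_network
and recv_from_network are placeholders for whatever transport the event
loop provides, and "example.com" is only an example:

    import ssl

    def handshake_over_memory_bio(send_to_network, recv_from_network,
                                  hostname="example.com"):
        # In-memory buffers standing in for the network socket.
        incoming = ssl.MemoryBIO()   # ciphertext received from the peer
        outgoing = ssl.MemoryBIO()   # ciphertext waiting to be sent
        ctx = ssl.create_default_context()
        tls = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)

        while True:
            try:
                tls.do_handshake()
                break                      # handshake complete
            except ssl.SSLWantReadError:
                # Ship whatever OpenSSL queued up, then feed it more bytes.
                send_to_network(outgoing.read())
                incoming.write(recv_from_network())
        send_to_network(outgoing.read())   # flush the final handshake bytes
        # Application data then goes through tls.write()/tls.read(),
        # pumping the two BIOs the same way.
        return tls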
From ncoghlan at gmail.com  Thu May 31 10:22:38 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 1 Jun 2018 00:22:38 +1000
Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and
 ssl.SSLObject to Python 2.7
In-Reply-To: 
References: 
Message-ID: 

On 31 May 2018 at 19:34, Nathaniel Smith wrote:
> On Wed, May 30, 2018, 14:21 Victor Stinner wrote:
>> MemoryBIO was the key feature which allowed to implement TLS for the
>> ProactorEventLoop (IOCP) of asyncio.
>
> MemoryBIO is definitely super useful for async libraries -- trio uses it,
> asyncio uses it, twisted uses it (via pyopenssl). But I don't know of
> anyone who currently needs it but hasn't already found a way to get it.

I think one of the other key things that changed is pip gaining its own
native support for using the SecureTransport API on Mac OS X.

So yeah, unless someone from PyCA chimes in to say that the PEP still
offers benefits that we can't get another way, withdrawing PEP 546 as
"Overtaken by events" probably makes sense.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From jeremy.kloth at gmail.com  Thu May 31 11:23:31 2018
From: jeremy.kloth at gmail.com (Jeremy Kloth)
Date: Thu, 31 May 2018 09:23:31 -0600
Subject: [Python-Dev] The history of PyXML
In-Reply-To: 
References: 
Message-ID: 

On Mon, May 28, 2018 at 9:53 PM, Serhiy Storchaka wrote:
> 28.05.18 23:11, Jeremy Kloth wrote:
>>
>> On Thu, May 17, 2018 at 6:18 AM, Serhiy Storchaka wrote:
>>>
>>> Does anyone has the full copy of the PyXML repository, with the
>>> complete history?
>>>
>>
>> Here you go!
>>
>> https://github.com/jkloth/pyxml
>
> Great! Thank you, this is what I needed!

I had also contacted SourceForge prior to me uploading a Github repo.
They have just restored the project files for PyXML as well (CVS,
downloads, ...).

--
Jeremy Kloth

From jonathan_tsang2003 at yahoo.ca  Thu May 31 22:22:47 2018
From: jonathan_tsang2003 at yahoo.ca (Jonathan Tsang)
Date: Fri, 1 Jun 2018 02:22:47 +0000 (UTC)
Subject: [Python-Dev] What is the command to upgrade python 3.6.5 to 3.7.5?
References: <1986554737.911105.1527819767709.ref@mail.yahoo.com>
Message-ID: <1986554737.911105.1527819767709@mail.yahoo.com>

Hi Dev. Support,

Is there a command that can help me to upgrade python 3.6.5 to 3.7.5
without uninstall and reinstall please?

Thanks,
Jonathan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From vano at mail.mipt.ru  Thu May 31 22:28:02 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 1 Jun 2018 05:28:02 +0300
Subject: [Python-Dev] What is the command to upgrade python 3.6.5 to 3.7.5?
In-Reply-To: <1986554737.911105.1527819767709@mail.yahoo.com>
References: <1986554737.911105.1527819767709.ref@mail.yahoo.com>
 <1986554737.911105.1527819767709@mail.yahoo.com>
Message-ID: 

https://stackoverflow.com/questions/15102943/how-to-update-python/50616351#50616351

On 01.06.2018 5:22, Jonathan Tsang via Python-Dev wrote:
> Hi Dev. Support,
>
> Is there a command that can help me to upgrade python 3.6.5 to 3.7.5
> without uninstall and reinstall please?
>
> Thanks,
> Jonathan
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru

--
Regards,
Ivan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From donald at stufft.io  Thu May 31 22:49:43 2018
From: donald at stufft.io (Donald Stufft)
Date: Thu, 31 May 2018 22:49:43 -0400
Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and
 ssl.SSLObject to Python 2.7
In-Reply-To: 
References: 
Message-ID: <96831945-56DE-48F8-8B13-1EC92DC0A15C@stufft.io>

> On May 31, 2018, at 10:22 AM, Nick Coghlan wrote:
>
> On 31 May 2018 at 19:34, Nathaniel Smith wrote:
>> On Wed, May 30, 2018, 14:21 Victor Stinner wrote:
>>> MemoryBIO was the key feature which allowed to implement TLS for the
>>> ProactorEventLoop (IOCP) of asyncio.
>>
>> MemoryBIO is definitely super useful for async libraries -- trio uses
>> it, asyncio uses it, twisted uses it (via pyopenssl). But I don't know
>> of anyone who currently needs it but hasn't already found a way to
>> get it.
>
> I think one of the other key things that changed is pip gaining its
> own native support for using the SecureTransport API on Mac OS X.
>
> So yeah, unless someone from PyCA chimes in to say that the PEP still
> offers benefits that we can't get another way, withdrawing PEP 546 as
> "Overtaken by events" probably makes sense.

I think it still provides benefits. FWIW, pip's SecureTransport shim is
a slow-as-hell ctypes hack that is slow enough that we *only* use it
when the ``ssl`` library wouldn't be able to connect anyways, and that
doesn't help Windows. OTOH I don't think it's super useful without
PEP 543 also.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
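For context on "wouldn't be able to connect anyways" above: the case
being described is an ssl module linked against an OpenSSL build too
old to speak the TLS versions PyPI requires. A quick way to see what a
given interpreter can do (just an illustration, not pip's actual
logic):

    import ssl

    # Which OpenSSL is the ssl module linked against, and does it offer
    # TLS 1.2?  Old system OpenSSLs (e.g. 0.9.8 on older macOS) do not.
    print(ssl.OPENSSL_VERSION)
    print("TLS 1.2 available:", hasattr(ssl, "PROTOCOL_TLSv1_2"))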