From vano at mail.mipt.ru Fri Jun 1 02:13:56 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Fri, 1 Jun 2018 09:13:56 +0300 Subject: [Python-Dev] How to watch buildbots? In-Reply-To: References: Message-ID: <104c4a1a-c7e9-9668-adb7-3dd10d737718@mail.mipt.ru> On 30.05.2018 16:36, Nick Coghlan wrote: > On 30 May 2018 at 22:30, Ivan Pozdeev via Python-Dev > > wrote: > > What's the big idea of separate buildbots anyway? I thought the > purpose of CI is to test everything _before_ > it breaks the main codebase. Then it's the job of the contributor > rather than the maintainer to fix any breakages. > > > So, maybe making them be driven by Github checks would be a better > time investment. > Especially since we've got VSTS checks just recently, so whoever > was doing that still knows how to interface with this Github > machinery. > > If the bots cancel a previous build if a new one for the same PR > arrives, this will not lead to a significant load difference 'cuz > the number of > actively developed PRs is stable and roughly equal to the number > of merges according to the open/closed tickets dynamics. > > > There are a few key details here: > > 1. We currently need to run post-merge CI anyway, as we're not doing > linearised commits (where core devs just approve a change without > merging it, and then a gating system like Zuul ensures that the tests > are run against the latest combination of the target branch and the PR > before merging the change) This is the only point here that looks valid (others can be refuted). This technique limits the achievable commit rate to 1/testing_time. Our average rate probably fits into this, though it still means delays. > 2. Since the buildbots are running on donated dedicated machines > (rather than throwaway instances from a dynamic CI provider), we need > to review the code before we let it run on the contributed systems > 3. 
The buildbot instances run *1* build at a time, which would lead to > major PR merging bottlenecks during sprints if we made them a gating > requirement > 4. For the vast majority of PRs, the post-merge cross-platform testing > is a formality, since the code being modified is using lower level > cross-platform APIs elsewhere in the standard library, so if it works > on Windows, Linux, and Mac OS X, it will work everywhere Python runs > 5. We generally don't *want* to burden new contributors with the task > of dealing with the less common (or harder to target) platforms > outside the big 3 - when they do break, it often takes a non-trivial > amount of platform knowledge to understand what's different about the > platform in question > > Cheers, > Nick. > > P.S. That said, if VSTS or Travis were to offer FreeBSD as an option > for pre-merge CI, I'd suggest we enable it, at least in an advisory > capacity - it's a better check against Linux-specific assumptions > creeping into the code base than Mac OS X, since the latter is > regularly different enough from other *nix systems that we need to > give it dedicated code paths. > > -- > Nick Coghlan | ncoghlan at gmail.com | > Brisbane, Australia -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Fri Jun 1 04:47:21 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 1 Jun 2018 10:47:21 +0200 Subject: [Python-Dev] Withdraw PEP 546? Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7 References: Message-ID: <20180601104721.70796d8d@fsol> +1 for withdrawing it. It's much too late in the 2.7 maintenance schedule to start bothering with such a large and perilous backport. Regards Antoine. On Wed, 30 May 2018 16:28:22 +0200 Victor Stinner wrote: > Hi, > > tl; dr I will withdraw the PEP 546 in one week if nobody shows up to > finish the implementation. 
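For context, the ssl.MemoryBIO and ssl.SSLObject APIs that the PEP proposes to backport already exist on Python 3. A minimal sketch of how they behave (illustrative only, using only the documented ssl module API; this is not the backport code itself):

```python
import ssl

# A MemoryBIO is an in-memory byte buffer that stands in for a socket:
# TLS record data is pushed in and pulled out explicitly by the caller.
bio = ssl.MemoryBIO()
bio.write(b"hello")
print(bio.pending)   # 5 bytes buffered
print(bio.read())    # b'hello'

# An SSLObject runs the TLS state machine over two such buffers and does
# no I/O of its own -- the caller pumps bytes between the BIOs and a
# transport, which is what makes this usable from asyncio or Twisted.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
incoming = ssl.MemoryBIO()
outgoing = ssl.MemoryBIO()
sslobj = ctx.wrap_bio(incoming, outgoing)
try:
    sslobj.do_handshake()    # no peer data has arrived in `incoming`...
except ssl.SSLWantReadError:
    pass                     # ...so the handshake cannot complete yet
print(outgoing.pending > 0)  # the ClientHello is waiting to be sent
```

The caller is responsible for shipping `outgoing`'s bytes to the peer and feeding the reply into `incoming`, then retrying `do_handshake()` until it succeeds.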
> > > Last year, I wrote the PEP 546 with Cory Benfield: > "Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7" > https://www.python.org/dev/peps/pep-0546/ > > The plan was to get a Python 2.7 implementation of Cory's PEP 543: > "A Unified TLS API for Python" > https://www.python.org/dev/peps/pep-0543/ > > Sadly, it seems like Cory is no longer available to work on the project > (PEP 543 is still a draft). > > The PEP 546 is implemented: > https://github.com/python/cpython/pull/2133 > > Well, I closed it, but you can still get it as a patch with: > https://patch-diff.githubusercontent.com/raw/python/cpython/pull/2133.patch > > But tests fail on Travis CI whereas I'm unable to reproduce the issue > on my laptop (on Fedora). The failure seems to depend on the version > of OpenSSL. Christian Heimes has a "multissl" tool which automates > tests on multiple OpenSSL versions, but I failed to find time to try > this tool. > > Time flies and one year later, the PR of the PEP 546 is still not > merged, tests are still failing. > > One month ago, when 2.7.15 was released, Benjamin Peterson, > Python 2.7 release manager, simply proposed: > "The lack of movement for a year makes me wonder if PEP 546 should be > moved to Withdrawn status." > > Since again, I failed to find time to look at the test_ssl failure, I > plan to withdraw the PEP next week if nobody shows up :-( Sorry, Python > 2.7! > > Would anyone benefit from MemoryBIO in Python 2.7? Twisted, > asyncio, trio, urllib3, anyone else? If yes, who is volunteering to > finish the MemoryBIO backport (and maintain it)? > > Victor From tismer at stackless.com Fri Jun 1 09:10:30 2018 From: tismer at stackless.com (Christian Tismer) Date: Fri, 1 Jun 2018 15:10:30 +0200 Subject: [Python-Dev] PyIndex_Check conflicts with PEP 384 Message-ID: Hi friends, when implementing the limited API for PySide2, I recognized a little bug where a function was replaced by a macro. The file number.rst on python 3.6 says """ .. 
c:function:: int PyIndex_Check(PyObject *o) Returns ``1`` if *o* is an index integer (has the nb_index slot of the tp_as_number structure filled in), and ``0`` otherwise. """ Without notice, this function was replaced by a macro a while ago, which reads #define PyIndex_Check(obj) \ ((obj)->ob_type->tp_as_number != NULL && \ (obj)->ob_type->tp_as_number->nb_index != NULL) This contradicts PEP 384, because there is no way for non-heaptype types to access the nb_index field. If nobody objects, I would like to submit a patch that adds the function back when the limited API is active. I intend to fix that before Python 3.7 is out. Ciao -- Chris -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xFB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 From njs at pobox.com Fri Jun 1 11:18:15 2018 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 1 Jun 2018 08:18:15 -0700 Subject: [Python-Dev] PyIndex_Check conflicts with PEP 384 In-Reply-To: References: Message-ID: Indeed, that sounds like a pretty straightforward bug in the stable ABI. You should file an issue on bugs.python.org so it doesn't get lost (and if it's the main new stable ABI break in 3.7 then you should probably mark that bug as a release blocker so that Ned notices it). Unfortunately, very few people use the stable ABI currently, so it's easy for things like this to get missed. Hopefully, if Qt starts using it, that will help us shake these things out... But it also means we need your help to catch these kinds of issues :-). Thanks! 
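The slot test that the macro performs corresponds, at the Python level, to asking whether an object's *type* implements ``__index__``. A rough Python-level analogue of the check (illustrative only — the actual fix is the C function described above):

```python
import operator

def index_check(obj):
    # Mirror the C-level check: look on the type, not the instance,
    # because the nb_index slot lives on the type object.
    return hasattr(type(obj), "__index__")

class MyIndex:
    def __index__(self):
        return 42

print(index_check(3))             # True  -- int fills nb_index
print(index_check(3.5))           # False -- float does not
print(index_check(MyIndex()))     # True
print(operator.index(MyIndex()))  # 42 -- what the slot is used for
```

`operator.index()` is the Python-level consumer of the same slot: it raises TypeError for objects where this check is false.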
On Fri, Jun 1, 2018, 06:51 Christian Tismer wrote: > Hi friends, > > when implementing the limited API for PySide2, I recognized > a little bug where a function was replaced by a macro. > > The file number.rst on python 3.6 says > > """ > .. c:function:: int PyIndex_Check(PyObject *o) > > Returns ``1`` if *o* is an index integer (has the nb_index slot of the > tp_as_number structure filled in), and ``0`` otherwise. > """ > > Without notice, this function was replaced by a macro a while > ago, which reads > > #define PyIndex_Check(obj) \ > ((obj)->ob_type->tp_as_number != NULL && \ > (obj)->ob_type->tp_as_number->nb_index != NULL) > > This contradicts PEP 384, because there is no way for non-heaptype > types to access the nb_index field. > > If nobody objects, I would like to submit a patch that adds the > function back when the limited API is active. > I think to fix that before Python 3.7 is out. > > Ciao -- Chris > > -- > Christian Tismer-Sperling :^) tismer at stackless.com > Software Consulting : http://www.stackless.com/ > Karl-Liebknecht-Str. 121 : http://pyside.org > 14482 Potsdam : GPG key -> 0xFB7BEE0E > phone +49 173 24 18 776 fax +49 (30) 700143-0023 > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From status at bugs.python.org Fri Jun 1 12:09:53 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 1 Jun 2018 18:09:53 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180601160953.764E9560CF@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-05-25 - 2018-06-01) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 6684 (-15) closed 38802 (+102) total 45486 (+87) Open issues with patches: 2631 Issues opened (64) ================== #31731: [2.7] test_io hangs on x86 Gentoo Refleaks 2.7 https://bugs.python.org/issue31731 reopened by zach.ware #33532: test_multiprocessing_forkserver: TestIgnoreEINTR.test_ignore() https://bugs.python.org/issue33532 reopened by vstinner #33622: Fix and improve errors handling in the garbage collector https://bugs.python.org/issue33622 reopened by serhiy.storchaka #33649: asyncio docs overhaul https://bugs.python.org/issue33649 opened by yselivanov #33650: Prohibit adding a signal handler for SIGCHLD https://bugs.python.org/issue33650 opened by yselivanov #33656: IDLE: Turn on DPI awareness on Windows https://bugs.python.org/issue33656 opened by terry.reedy #33658: Introduce a method to concatenate regex patterns https://bugs.python.org/issue33658 opened by aleskva #33660: pathlib.Path.resolve() returns path with double slash when res https://bugs.python.org/issue33660 opened by QbLearningPython #33661: urllib may leak sensitive HTTP headers to a third-party web si https://bugs.python.org/issue33661 opened by artem.smotrakov #33663: Web.py wsgiserver3.py raises TypeError when CSS file is not fo https://bugs.python.org/issue33663 opened by jmlp #33664: IDLE: scroll text by lines, not pixels. 
https://bugs.python.org/issue33664 opened by terry.reedy #33665: tkinter.ttk.Scrollbar.fraction() is inaccurate, or at least in https://bugs.python.org/issue33665 opened by pez #33666: Document removal of os.errno https://bugs.python.org/issue33666 opened by hroncok #33668: Wrong behavior of help function on module https://bugs.python.org/issue33668 opened by Oleg.Oleinik #33669: str.format should raise exception when placeholder number does https://bugs.python.org/issue33669 opened by xiang.zhang #33671: Efficient zero-copy for shutil.copy* functions (Linux, OSX and https://bugs.python.org/issue33671 opened by giampaolo.rodola #33676: test_multiprocessing_fork: dangling threads warning https://bugs.python.org/issue33676 opened by vstinner #33678: selector_events.BaseSelectorEventLoop.sock_connect should pres https://bugs.python.org/issue33678 opened by sebastien.bourdeauducq #33679: IDLE: Configurable color on code context https://bugs.python.org/issue33679 opened by cheryl.sabella #33680: regrtest: re-run failed tests in a subprocess https://bugs.python.org/issue33680 opened by vstinner #33683: asyncio: sendfile tests ignore SO_SNDBUF on Windows https://bugs.python.org/issue33683 opened by vstinner #33684: parse failed for mutibytes characters, encode will show in \xx https://bugs.python.org/issue33684 opened by zhou.ronghua #33686: test_concurrent_futures: test_pending_calls_race() failed on x https://bugs.python.org/issue33686 opened by vstinner #33687: uu.py calls os.path.chmod which doesn't exist https://bugs.python.org/issue33687 opened by bsdphk #33688: asyncio child watchers aren't fork friendly https://bugs.python.org/issue33688 opened by yselivanov #33689: Blank lines in .pth file cause a duplicate sys.path entry https://bugs.python.org/issue33689 opened by Malcolm Smith #33692: Chinese characters issue with input() function https://bugs.python.org/issue33692 opened by Valentin Zhao #33694: test_asyncio: test_start_tls_server_1() fails on Python on x86 
https://bugs.python.org/issue33694 opened by vstinner #33695: Have shutil.copytree(), copy() and copystat() use cached scand https://bugs.python.org/issue33695 opened by giampaolo.rodola #33696: Install python-docs-theme even if SPHINXBUILD is defined https://bugs.python.org/issue33696 opened by adelfino #33697: test_zipfile.test_write_filtered_python_package() failed on Ap https://bugs.python.org/issue33697 opened by vstinner #33698: `._pth` does not allow to populate `sys.path` with empty entry https://bugs.python.org/issue33698 opened by excitoon #33699: Don't describe try's else clause in a footnote https://bugs.python.org/issue33699 opened by adelfino #33700: [doc] Old version picker don't understand language tags in URL https://bugs.python.org/issue33700 opened by mdk #33701: test_datetime crashed (SIGSEGV) on Travis CI https://bugs.python.org/issue33701 opened by vstinner #33702: Add some missings links in production lists and a little polis https://bugs.python.org/issue33702 opened by adelfino #33705: Unicode is normalised after keywords are checked for https://bugs.python.org/issue33705 opened by steven.daprano #33708: Doc: Asyncio's Event documentation typo. 
https://bugs.python.org/issue33708 opened by socketpair #33709: test.support.FS_NONASCII returns incorrect result in Windows w https://bugs.python.org/issue33709 opened by Ivan.Pozdeev #33710: Deprecate gettext.lgettext() https://bugs.python.org/issue33710 opened by serhiy.storchaka #33711: Could not find externals/db-* in msi.py on license generation https://bugs.python.org/issue33711 opened by Ivan.Pozdeev #33712: OrderedDict can set an exception in tp_clear https://bugs.python.org/issue33712 opened by serhiy.storchaka #33713: memoryview can set an exception in tp_clear https://bugs.python.org/issue33713 opened by serhiy.storchaka #33714: module can set an exception in tp_clear https://bugs.python.org/issue33714 opened by serhiy.storchaka #33715: test_multiprocessing_spawn.test_wait_result() failed on x86 Wi https://bugs.python.org/issue33715 opened by vstinner #33716: test_concurrent_futures.test_crash() failed on x86 Windows7 3. https://bugs.python.org/issue33716 opened by vstinner #33717: Enhance test.pythoninfo: meta-ticket for multiple changes https://bugs.python.org/issue33717 opened by vstinner #33718: Enhance regrtest: meta-ticket for multiple changes https://bugs.python.org/issue33718 opened by vstinner #33719: Test failures on Python 3.7 beta 5 and Windows 10 https://bugs.python.org/issue33719 opened by vstinner #33720: test_marshal: crash in Python 3.7b5 on Windows 10 https://bugs.python.org/issue33720 opened by vstinner #33721: os.path.exists() ought to return False if pathname contains NU https://bugs.python.org/issue33721 opened by pacujo #33722: Document builtins in mock_open https://bugs.python.org/issue33722 opened by jcrotts #33723: test_time.test_thread_time() failed on AMD64 Debian root 3.x https://bugs.python.org/issue33723 opened by vstinner #33724: test__xxsubinterpreters failed on ARMv7 Ubuntu 3.x https://bugs.python.org/issue33724 opened by vstinner #33725: High Sierra hang when using multi-processing https://bugs.python.org/issue33725 
opened by kapilt #33726: Add short descriptions to PEP references in seealso https://bugs.python.org/issue33726 opened by adelfino #33727: Server.wait_closed() doesn't always wait for its transports to https://bugs.python.org/issue33727 opened by yselivanov #33729: Hashlib/blake2* missing 'data' keyword argument https://bugs.python.org/issue33729 opened by Juuso Lehtivarjo #33731: string formatting that produces floats with preset precision w https://bugs.python.org/issue33731 opened by Jakub Szewczyk #33732: Python 2.7.15: xml.sax.parse() closes file objects passed to i https://bugs.python.org/issue33732 opened by gibfahn #33733: Add utilities to get/set pipe and socket buffer sizes? https://bugs.python.org/issue33733 opened by vstinner #33734: asyncio/ssl: Fix AttributeError, increase default handshake ti https://bugs.python.org/issue33734 opened by yselivanov #33735: test_multiprocessing_spawn leaked [1, 2, 1] memory blocks on A https://bugs.python.org/issue33735 opened by vstinner #1366311: SRE engine should release the GIL when/if possible https://bugs.python.org/issue1366311 reopened by barry Most recent 15 issues with no replies (15) ========================================== #33729: Hashlib/blake2* missing 'data' keyword argument https://bugs.python.org/issue33729 #33727: Server.wait_closed() doesn't always wait for its transports to https://bugs.python.org/issue33727 #33726: Add short descriptions to PEP references in seealso https://bugs.python.org/issue33726 #33723: test_time.test_thread_time() failed on AMD64 Debian root 3.x https://bugs.python.org/issue33723 #33722: Document builtins in mock_open https://bugs.python.org/issue33722 #33716: test_concurrent_futures.test_crash() failed on x86 Windows7 3. 
https://bugs.python.org/issue33716 #33715: test_multiprocessing_spawn.test_wait_result() failed on x86 Wi https://bugs.python.org/issue33715 #33714: module can set an exception in tp_clear https://bugs.python.org/issue33714 #33702: Add some missings links in production lists and a little polis https://bugs.python.org/issue33702 #33699: Don't describe try's else clause in a footnote https://bugs.python.org/issue33699 #33697: test_zipfile.test_write_filtered_python_package() failed on Ap https://bugs.python.org/issue33697 #33695: Have shutil.copytree(), copy() and copystat() use cached scand https://bugs.python.org/issue33695 #33688: asyncio child watchers aren't fork friendly https://bugs.python.org/issue33688 #33686: test_concurrent_futures: test_pending_calls_race() failed on x https://bugs.python.org/issue33686 #33684: parse failed for mutibytes characters, encode will show in \xx https://bugs.python.org/issue33684 Most recent 15 issues waiting for review (15) ============================================= #33734: asyncio/ssl: Fix AttributeError, increase default handshake ti https://bugs.python.org/issue33734 #33726: Add short descriptions to PEP references in seealso https://bugs.python.org/issue33726 #33718: Enhance regrtest: meta-ticket for multiple changes https://bugs.python.org/issue33718 #33717: Enhance test.pythoninfo: meta-ticket for multiple changes https://bugs.python.org/issue33717 #33709: test.support.FS_NONASCII returns incorrect result in Windows w https://bugs.python.org/issue33709 #33708: Doc: Asyncio's Event documentation typo. 
https://bugs.python.org/issue33708 #33702: Add some missings links in production lists and a little polis https://bugs.python.org/issue33702 #33699: Don't describe try's else clause in a footnote https://bugs.python.org/issue33699 #33696: Install python-docs-theme even if SPHINXBUILD is defined https://bugs.python.org/issue33696 #33695: Have shutil.copytree(), copy() and copystat() use cached scand https://bugs.python.org/issue33695 #33692: Chinese characters issue with input() function https://bugs.python.org/issue33692 #33687: uu.py calls os.path.chmod which doesn't exist https://bugs.python.org/issue33687 #33684: parse failed for mutibytes characters, encode will show in \xx https://bugs.python.org/issue33684 #33679: IDLE: Configurable color on code context https://bugs.python.org/issue33679 #33671: Efficient zero-copy for shutil.copy* functions (Linux, OSX and https://bugs.python.org/issue33671 Top 10 most discussed issues (10) ================================= #33532: test_multiprocessing_forkserver: TestIgnoreEINTR.test_ignore() https://bugs.python.org/issue33532 26 msgs #33597: Compact PyGC_Head https://bugs.python.org/issue33597 21 msgs #33666: Document removal of os.errno https://bugs.python.org/issue33666 15 msgs #33692: Chinese characters issue with input() function https://bugs.python.org/issue33692 15 msgs #33713: memoryview can set an exception in tp_clear https://bugs.python.org/issue33713 12 msgs #33627: test-complex of test_numeric_tower.test_complex() crashes inte https://bugs.python.org/issue33627 9 msgs #33012: Invalid function cast warnings with gcc 8 for METH_NOARGS https://bugs.python.org/issue33012 8 msgs #33701: test_datetime crashed (SIGSEGV) on Travis CI https://bugs.python.org/issue33701 8 msgs #32832: doctest should support custom ps1/ps2 prompts https://bugs.python.org/issue32832 7 msgs #33720: test_marshal: crash in Python 3.7b5 on Windows 10 https://bugs.python.org/issue33720 7 msgs Issues closed (94) ================== #18872: 
platform.linux_distribution() doesn't recognize Amazon Linux https://bugs.python.org/issue18872 closed by petr.viktorin #19213: platform.linux_distribution detects Oracle Linux as Red Hat En https://bugs.python.org/issue19213 closed by petr.viktorin #20454: platform.linux_distribution() returns empty value on Archlinux https://bugs.python.org/issue20454 closed by petr.viktorin #21327: socket.type value changes after using settimeout() https://bugs.python.org/issue21327 closed by yselivanov #25015: Idle: scroll Text faster with mouse wheel https://bugs.python.org/issue25015 closed by terry.reedy #26251: Use "Low-fragmentation Heap" memory allocator on Windows https://bugs.python.org/issue26251 closed by vstinner #29392: msvcrt.locking crashes python https://bugs.python.org/issue29392 closed by steve.dower #30145: Create a How to or Tutorial documentation for asyncio https://bugs.python.org/issue30145 closed by yselivanov #30391: test_threading_handled() and test_threading_not_handled() of t https://bugs.python.org/issue30391 closed by vstinner #30654: signal module always overwrites SIGINT on interpreter shutdown https://bugs.python.org/issue30654 closed by pitrou #30935: document the new behavior of get_event_loop() in Python 3.6 https://bugs.python.org/issue30935 closed by yselivanov #31151: socketserver.ForkingMixIn.server_close() leaks zombie processe https://bugs.python.org/issue31151 closed by vstinner #31180: test_multiprocessing_spawn hangs randomly https://bugs.python.org/issue31180 closed by vstinner #31233: socketserver.ThreadingMixIn leaks running threads after server https://bugs.python.org/issue31233 closed by vstinner #31246: [2.7] test_signal.test_setitimer_tiny() fails randomly on x86- https://bugs.python.org/issue31246 closed by vstinner #31368: Add os.preadv() and os.pwritev() https://bugs.python.org/issue31368 closed by vstinner #31424: test_socket hangs on x86 Gentoo Installed with X 3.x https://bugs.python.org/issue31424 closed by vstinner 
#31491: Add is_closing() to asyncio.StreamWriter. https://bugs.python.org/issue31491 closed by yselivanov #31611: Tests failures using -u largefile when the disk is full https://bugs.python.org/issue31611 closed by vstinner #31647: asyncio: StreamWriter write_eof() after close raises mysteriou https://bugs.python.org/issue31647 closed by yselivanov #31851: test_subprocess hangs randomly on Windows with Python 3.x https://bugs.python.org/issue31851 closed by vstinner #31931: test_concurrent_futures: ProcessPoolSpawnExecutorTest.test_shu https://bugs.python.org/issue31931 closed by ned.deily #31971: idle_test: failures on x86 Windows7 3.x https://bugs.python.org/issue31971 closed by vstinner #32038: Add API to intercept socket.close() https://bugs.python.org/issue32038 closed by yselivanov #32090: test_put() of test_multiprocessing queue tests has a race cond https://bugs.python.org/issue32090 closed by vstinner #32091: test_s_option() of test_site.HelperFunctionsTests failed on x8 https://bugs.python.org/issue32091 closed by vstinner #32104: add method throw() to asyncio.Task https://bugs.python.org/issue32104 closed by yselivanov #32131: Missing encoding parameter in urllib/parse.py https://bugs.python.org/issue32131 closed by cheryl.sabella #32156: Fix flake8 warning F401: ... imported but unused https://bugs.python.org/issue32156 closed by vstinner #32290: bolen-dmg-3.6: compilation failed with OSError: [Errno 23] Too https://bugs.python.org/issue32290 closed by ned.deily #32333: test_smtplib: dangling threads on x86 Gentoo Non-Debug with X https://bugs.python.org/issue32333 closed by vstinner #32334: test_configparser left @test_2876_tmp temporary file on x86 Wi https://bugs.python.org/issue32334 closed by vstinner #32380: functools.singledispatch interacts poorly with methods https://bugs.python.org/issue32380 closed by Ethan Smith #32457: Windows Python cannot handle an early PATH entry containing ". 
https://bugs.python.org/issue32457 closed by steve.dower #32458: test_asyncio: test_start_tls_server_1() fails randomly https://bugs.python.org/issue32458 closed by vstinner #32464: raise NotImplemented vs return NotImplemented https://bugs.python.org/issue32464 closed by yselivanov #32519: venv API docs - symlinks default incorrect https://bugs.python.org/issue32519 closed by vinay.sajip #32528: Change base class for futures.CancelledError https://bugs.python.org/issue32528 closed by yselivanov #32592: Drop support of Windows Vista in Python 3.7 https://bugs.python.org/issue32592 closed by vstinner #32610: asyncio.all_tasks() should return only non-finished tasks. https://bugs.python.org/issue32610 closed by yselivanov #32637: Android: set sys.platform to android https://bugs.python.org/issue32637 closed by vstinner #32646: test_asyncgen: race condition on test_async_gen_asyncio_gc_acl https://bugs.python.org/issue32646 closed by vstinner #32653: AttributeError: 'Task' object has no attribute '_callbacks' https://bugs.python.org/issue32653 closed by yselivanov #32654: Fixes Python for Android API 19 https://bugs.python.org/issue32654 closed by vstinner #32672: .then execution of actions following a future's completion https://bugs.python.org/issue32672 closed by yselivanov #32684: asyncio.gather(..., return_exceptions=True) swallows cancellat https://bugs.python.org/issue32684 closed by yselivanov #32751: wait_for(future, ...) 
should wait for the future (even if a ti https://bugs.python.org/issue32751 closed by yselivanov #32762: Choose protocol implementation on transport.set_protocol() https://bugs.python.org/issue32762 closed by yselivanov #32878: Document value of st_ino on Windows https://bugs.python.org/issue32878 closed by steve.dower #33001: Buffer overflow vulnerability in os.symlink on Windows (CVE-20 https://bugs.python.org/issue33001 closed by steve.dower #33238: AssertionError on await of Future returned by asyncio.wrap_fut https://bugs.python.org/issue33238 closed by asvetlov #33355: Windows 10 buildbot: 15 min timeout on test_mmap.test_large_fi https://bugs.python.org/issue33355 closed by vstinner #33356: Windows 10 buildbot: test__xxsubinterpreters.test_already_runn https://bugs.python.org/issue33356 closed by vstinner #33357: [EASY C] test_posix.test_posix_spawn_file_actions() leaks memo https://bugs.python.org/issue33357 closed by vstinner #33400: logging.Formatter does not default to ISO8601 date format https://bugs.python.org/issue33400 closed by vinay.sajip #33469: RuntimeError after closing loop that used run_in_executor https://bugs.python.org/issue33469 closed by yselivanov #33494: random.choices ought to check that cumulative weights are in a https://bugs.python.org/issue33494 closed by rhettinger #33505: Optimize asyncio.ensure_future by reordering if conditions https://bugs.python.org/issue33505 closed by yselivanov #33595: Fix references to lambda "arguments" https://bugs.python.org/issue33595 closed by terry.reedy #33598: ActiveState Recipes links in docs, and the apparent closure of https://bugs.python.org/issue33598 closed by rhettinger #33606: Improve logging performance when logger disabled https://bugs.python.org/issue33606 closed by vinay.sajip #33616: typing.NoReturn is undocumented https://bugs.python.org/issue33616 closed by levkivskyi #33623: Fix possible SIGSGV when asyncio.Future is created in __del__ https://bugs.python.org/issue33623 closed by 
yselivanov #33638: condition lock not re-acquired https://bugs.python.org/issue33638 closed by yselivanov #33639: Use high-performance os.sendfile() in shutil.copy* https://bugs.python.org/issue33639 closed by giampaolo.rodola #33641: Add links to RFCs https://bugs.python.org/issue33641 closed by serhiy.storchaka #33644: Fix signatures of tp_finalize handlers in testing code. https://bugs.python.org/issue33644 closed by serhiy.storchaka #33651: Add get() method to sqlite3.Row class https://bugs.python.org/issue33651 closed by rhettinger #33652: Improve pickling of typing types https://bugs.python.org/issue33652 closed by levkivskyi #33653: EnvironmentError does not set errno unless strerror is set https://bugs.python.org/issue33653 closed by giampaolo.rodola #33654: asyncio: transports don't support switching between Protocol a https://bugs.python.org/issue33654 closed by yselivanov #33655: test_posix_fallocate fails on FreeBSD buildbots with ZFS file https://bugs.python.org/issue33655 closed by ned.deily #33657: float addition rounding error https://bugs.python.org/issue33657 closed by rhettinger #33659: leak in pythonrun.c? 
https://bugs.python.org/issue33659 closed by lekma #33662: asyncio Stream Reader Blocks on read when data fetched is less https://bugs.python.org/issue33662 closed by yselivanov #33667: Mock calls on mutable objects https://bugs.python.org/issue33667 closed by r.david.murray #33670: Use errorlevel of Sphinx main() in Doc\make.bat https://bugs.python.org/issue33670 closed by steve.dower #33672: Fix Task.__repr__ crash when trying to format Cython's bogus c https://bugs.python.org/issue33672 closed by yselivanov #33673: Install python-docs-theme even if Sphinx is already installed https://bugs.python.org/issue33673 closed by steve.dower #33674: asyncio: race condition in SSLProtocol https://bugs.python.org/issue33674 closed by yselivanov #33675: Buildbot AMD64 Windows10 3.6 fails to compile: Cannot locate M https://bugs.python.org/issue33675 closed by steve.dower #33677: Fix signatures of tp_clear handlers https://bugs.python.org/issue33677 closed by serhiy.storchaka #33681: itertools.groupby() returned igroup is only callable once https://bugs.python.org/issue33681 closed by steven.daprano #33682: Optimize the bytecode for float(0) ? 
https://bugs.python.org/issue33682 closed by matrixise #33685: Instances bound methods with different memory addresses but sh https://bugs.python.org/issue33685 closed by steven.daprano #33690: urlib.parse.urlencode with empty list and doseq=True drops the https://bugs.python.org/issue33690 closed by maxking #33691: Refactor docstring handling code in the compiler https://bugs.python.org/issue33691 closed by serhiy.storchaka #33693: test test_webbrowser failed https://bugs.python.org/issue33693 closed by ncoghlan #33703: Object deletion and re-creation points to same attribute if at https://bugs.python.org/issue33703 closed by benjamin.peterson #33704: python 3.7 and python 3.8 stable release https://bugs.python.org/issue33704 closed by ned.deily #33706: Segfault in command line processing due to buffer over-read https://bugs.python.org/issue33706 closed by vstinner #33707: Doc: https://bugs.python.org/issue33707 closed by socketpair #33728: pandas.to_records can not be saved by numpy.savez https://bugs.python.org/issue33728 closed by steven.daprano #33730: string format 'n' produces numbers with incorrect precision https://bugs.python.org/issue33730 closed by eric.smith From ericsnowcurrently at gmail.com Fri Jun 1 13:03:48 2018 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Fri, 1 Jun 2018 11:03:48 -0600 Subject: [Python-Dev] How to watch buildbots? In-Reply-To: References: Message-ID: On Wed, May 30, 2018 at 7:36 AM, Nick Coghlan wrote: > There are a few key details here: > > 1. We currently need to run post-merge CI anyway, as we're not doing > linearised commits (where core devs just approve a change without merging > it, and then a gating system like Zuul ensures that the tests are run > against the latest combination of the target branch and the PR before > merging the change) This is more of a concern when non-conflicting PRs against the same (or related) code are active at the same time. 
For the CPython code base this isn't as much of a problem, right? Under normal circumstances [fresh] active PRs typically do not run afoul of each other. Furthermore, during peak-activity events (like sprints) folks tend to keep a closer eye on the buildbots. I suppose old-but-still-active PRs that previously passed CI could cause a problem. However, it would be unlikely for such a PR to sit for a long time without needing changes before merging, whether to address reviewer concerns or to resolve merge conflicts. So post-merge CI (or merge gating) doesn't seem like much of a factor for us. In that regard I'd consider the buildbots more than sufficient. > 2. Since the buildbots are running on donated dedicated machines (rather > than throwaway instances from a dynamic CI provider), we need to review the > code before we let it run on the contributed systems > 3. The buildbot instances run *1* build at a time, ...where each build incorporates potentially several merged PRs... > which would lead to major > PR merging bottlenecks during sprints if we made them a gating requirement Agreed. There's enough of a delay already when watching the buildbots post-merge (especially some of them). :) > 4. For the vast majority of PRs, the post-merge cross-platform testing is a > formality, since the code being modified is using lower level cross-platform > APIs elsewhere in the standard library, so if it works on Windows, Linux, > and Mac OS X, it will work everywhere Python runs This is especially true of changes proposed by non-core contributors. It is also very true for buildbots with the OS/hardware combos that match CI. That said, when working with the C-API you can end up breaking things on the less common OSes and hardware platforms. So *those* buildbots are invaluable. I'm dealing with that right now. > 5.
We generally don't *want* to burden new contributors with the task of > dealing with the less common (or harder to target) platforms outside the big > 3 - when they do break, it often takes a non-trivial amount of platform > knowledge to understand what's different about the platform in question As hinted above, I would not expect new contributors to provide patches very often (if ever) that would have the potential to cause buildbot failures but not fail under CI. So this point seems somewhat moot. :) > P.S. That said, if VSTS or Travis were to offer FreeBSD as an option for > pre-merge CI, I'd suggest we enable it, at least in an advisory capacity - > it's a better check against Linux-specific assumptions creeping into the > code base than Mac OS X, since the latter is regularly different enough from > other *nix systems that we need to give it dedicated code paths. +1 -eric From J.Demeyer at UGent.be Fri Jun 1 13:45:25 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Fri, 1 Jun 2018 19:45:25 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> Message-ID: <5B118635.1040006@UGent.be> On 2018-06-01 17:18, Nathaniel Smith wrote: > Unfortunately, very few people use the stable ABI currently, so it's > easy for things like this to get missed. So there are no tests for the stable ABI in Python? From tismer at stackless.com Fri Jun 1 17:37:22 2018 From: tismer at stackless.com (Christian Tismer) Date: Fri, 1 Jun 2018 23:37:22 +0200 Subject: [Python-Dev] PyIndex_Check conflicts with PEP 384 In-Reply-To: References: Message-ID: Hi Nate, I just did that, and I got some follow-up work, too, which is fine with me. Just as a note for you: not Qt itself, but PyQt5 already did that and ships a stable ABI for Python 3.5 and up.
I was challenged end of last December to try that, and I succeeded after a long struggle, because I also needed to convert all of PySide2 to using heaptypes, and that was really hard. Actually, I succeeded yesterday, after 5 months. And now I know all the subtle things that people need to know when converting existing types to heaptypes. Since QtC adopted PySide2 in 2016, including myself as a consultant, now it is really a Qt product, and the limited API is due to my work. It comes naturally that I also should fix the problems which showed up during that process. I also plan to submit a paper to python.org where I document all the subtle problems which occurred during the conversion process. It looks simple, but it really is not. All the best -- Chris On 01.06.18 17:18, Nathaniel Smith wrote: > Indeed, that sounds like a pretty straightforward bug in the stable ABI. > You should file an issue on bugs.python.org so > it doesn't get lost (and if it's the main new stable ABI break in 3.7 > then you should probably mark that bug as a release blocker so that Ned > notices it). > > Unfortunately, very few people use the stable ABI currently, so it's > easy for things like this to get missed. Hopefully if Qt starts using it > then that will help us shake these things out... But it also means we > need your help to catch these kinds of issues :-). Thanks! > > On Fri, Jun 1, 2018, 06:51 Christian Tismer > wrote: > > Hi friends, > > when implementing the limited API for PySide2, I recognized > a little bug where a function was replaced by a macro. > > The file number.rst on python 3.6 says > > """ > .. c:function:: int PyIndex_Check(PyObject *o) > > Returns ``1`` if *o* is an index integer (has the nb_index slot > of the > tp_as_number structure filled in), and ``0`` otherwise. > """ > > Without notice, this function was replaced by a macro a while > ago, which reads > > #define PyIndex_Check(obj) \ > ((obj)->ob_type->tp_as_number != NULL && \ >
(obj)->ob_type->tp_as_number->nb_index != NULL) > > This contradicts PEP 384, because there is no way for non-heaptype > types to access the nb_index field. > > If nobody objects, I would like to submit a patch that adds the > function back when the limited API is active. > I intend to fix that before Python 3.7 is out. > > Ciao -- Chris > > -- > Christian Tismer-Sperling :^) tismer at stackless.com > > Software Consulting : http://www.stackless.com/ > Karl-Liebknecht-Str. 121 : http://pyside.org > 14482 Potsdam : GPG key -> 0xFB7BEE0E > phone +49 173 24 18 776 fax +49 (30) 700143-0023 > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com > -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xFB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at janc.be Fri Jun 1 19:59:37 2018 From: lists at janc.be (Jan Claeys) Date: Sat, 02 Jun 2018 01:59:37 +0200 Subject: [Python-Dev] The history of PyXML In-Reply-To: References: Message-ID: On Thu, 2018-05-31 at 09:23 -0600, Jeremy Kloth wrote: > I had also contacted SourceForge prior to me uploading a Github repo. > They have just restored the project files for PyXML as well (CVS, > downloads, ...). So I was right thinking they might have a backup? :-) I have to say it seems like the new owners of SourceForge are doing their best to make it a good place for project hosting again.
(Not always as fast as one would wish, but I understand it's big, and the previous owners did a lot of damage to the brand...) -- Jan Claeys From ncoghlan at gmail.com Fri Jun 1 23:47:23 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 2 Jun 2018 13:47:23 +1000 Subject: [Python-Dev] Stable ABI In-Reply-To: <5B118635.1040006@UGent.be> References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> Message-ID: On 2 June 2018 at 03:45, Jeroen Demeyer wrote: > On 2018-06-01 17:18, Nathaniel Smith wrote: > >> Unfortunately, very few people use the stable ABI currently, so it's >> easy for things like this to get missed. >> > > So there are no tests for the stable ABI in Python? > Unfortunately not. https://bugs.python.org/issue21142 is an old issue suggesting automating those checks (so we don't inadvertently add or remove symbols for previously published stable ABI definitions), but it's not yet made it to the state of being sufficiently well automated that it can be a release checklist item in PEP 101. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Sat Jun 2 00:33:03 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 2 Jun 2018 07:33:03 +0300 Subject: [Python-Dev] PyIndex_Check conflicts with PEP 384 In-Reply-To: References: Message-ID: 02.06.18 00:37, Christian Tismer wrote: > I was challenged end of last December to try that, and I succeeded > after a long struggle, because I also needed to convert all of PySide2 > to using heaptypes, and that was really hard. Actually, I succeeded > yesterday, after 5 months. And now I know all the subtle things > that people need to know when converting existing types to heaptypes. Are you aware of the following pitfall, Christian?
https://bugs.python.org/issue26979 From armin.rigo at gmail.com Sat Jun 2 16:20:33 2018 From: armin.rigo at gmail.com (Armin Rigo) Date: Sat, 2 Jun 2018 22:20:33 +0200 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: Hi Inada, On 27 May 2018 at 09:12, INADA Naoki wrote: > When focusing to CPython, PyPy and MicroPython, no problem for adding > __reversed__ in 3.8 seems OK. Fwiw, the functionality that is present in OrderedDict but still absent from 'dict' is: ``__reversed__``, discussed above, and ``move_to_end(last=False)``. In PyPy3, OrderedDict is implemented not like in the CPython stdlib but as just a thin dict subclass without any extra data, using two custom built-ins from the ``__pypy__`` module for the two methods above (plus some pure Python code for other methods like __eq__(), where it is possible to do so with the correct complexity). A bientôt, Armin. From tismer at stackless.com Sun Jun 3 06:03:25 2018 From: tismer at stackless.com (Christian Tismer) Date: Sun, 3 Jun 2018 12:03:25 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> Message-ID: On 02.06.18 05:47, Nick Coghlan wrote: > On 2 June 2018 at 03:45, Jeroen Demeyer > > wrote: > > On 2018-06-01 17:18, Nathaniel Smith wrote: > > Unfortunately, very few people use the stable ABI currently, so it's > easy for things like this to get missed. > > > So there are no tests for the stable ABI in Python? > > > Unfortunately not. > > https://bugs.python.org/issue21142 is an old issue suggesting automating > those checks (so we don't inadvertently add or remove symbols for > previously published stable ABI definitions), but it's not yet made it > to the state of being sufficiently well automated that it can be a > release checklist item in PEP 101. > > Cheers, > Nick.
Actually, I think we don't need such a test any more, or we could use this one as a heuristic test: I have written a script that scans all relevant header files and analyses all sections which are reachable in the limited API context. All macros that don't begin with an underscore which contain a "->tp_" string are the locations which will break. I found exactly 7 locations where this is the case. My PR will contain the 7 fixes plus the analysis script to go into tools. Preparing that in the evening. cheers -- Chris -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xFB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 496 bytes Desc: OpenPGP digital signature URL: From ronaldoussoren at mac.com Sun Jun 3 07:18:15 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Sun, 03 Jun 2018 13:18:15 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> Message-ID: <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> > On 3 Jun 2018, at 12:03, Christian Tismer wrote: > > On 02.06.18 05:47, Nick Coghlan wrote: >> On 2 June 2018 at 03:45, Jeroen Demeyer > > wrote: >> >> On 2018-06-01 17:18, Nathaniel Smith wrote: >> >> Unfortunately, very few people use the stable ABI currently, so it's >> easy for things like this to get missed. >> >> >> So there are no tests for the stable ABI in Python? >> >> >> Unfortunately not.
>> >> https://bugs.python.org/issue21142 is an old issue suggesting automating >> those checks (so we don't inadvertently add or remove symbols for >> previously published stable ABI definitions), but it's not yet made it >> to the state of being sufficiently well automated that it can be a >> release checklist item in PEP 101. >> >> Cheers, >> Nick. > > Actually, I think we don't need such a test any more, or we > could use this one as a heuristic test: > > I have written a script that scans all relevant header files > and analyses all sections which are reachable in the limited API > context. > All macros that don't begin with an underscore which contain > a "->tp_" string are the locations which will break. > > I found exactly 7 locations where this is the case. > > My PR will contain the 7 fixes plus the analysis script > to go into tools. Preparing that in the evening. Having tests would still be nice to detect changes to the stable ABI when they are made. Writing those tests is quite some work though, especially if those at least smoke test the limited ABI by compiling snippets that use all symbols that should be exposed by the limited ABI. Writing those tests should be fairly simple for someone that knows how to write C extensions, but is some work. Writing tests that complain when the headers expose symbols that shouldn't be exposed is harder, due to the need to parse headers (either by hacking something together using regular expressions, or by using tools like gccxml or clang's C API). BTW. The problem with the tool in issue 21142 is that this looks at the symbols exposed by libpython on Linux, and that exposes more symbols than just the limited ABI.
Ronald From tismer at stackless.com Sun Jun 3 10:55:25 2018 From: tismer at stackless.com (Christian Tismer) Date: Sun, 3 Jun 2018 16:55:25 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> Message-ID: On 03.06.18 13:18, Ronald Oussoren wrote: > > >> On 3 Jun 2018, at 12:03, Christian Tismer wrote: ... >> >> I have written a script that scans all relevant header files >> and analyses all sections which are reachable in the limited API >> context. >> All macros that don't begin with an underscore which contain >> a "->tp_" string are the locations which will break. >> >> I found exactly 7 locations where this is the case. >> >> My PR will contain the 7 fixes plus the analysis script >> to go into tools. Preparing that in the evening. > > Having tests would still be nice to detect changes to the stable ABI when they are made. > > Writing those tests is quite some work though, especially if those at least smoke test the limited ABI by compiling snippets that use all symbols that should be exposed by the limited ABI. Writing those tests should be fairly simple for someone that knows how to write C extensions, but is some work. > > Writing tests that complain when the headers expose symbols that shouldn't be exposed is harder, due to the need to parse headers (either by hacking something together using regular expressions, or by using tools like gccxml or clang's C API). What do you mean? My script does that with all "tp_*" type fields. What else would you want to check? -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xFB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 522 bytes Desc: OpenPGP digital signature URL: From eric at trueblade.com Sun Jun 3 11:04:44 2018 From: eric at trueblade.com (Eric V. Smith) Date: Sun, 3 Jun 2018 11:04:44 -0400 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> Message-ID: <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> On 6/3/2018 10:55 AM, Christian Tismer wrote: > On 03.06.18 13:18, Ronald Oussoren wrote: >> >> >>> On 3 Jun 2018, at 12:03, Christian Tismer wrote: > ... > >>> >>> I have written a script that scans all relevant header files >>> and analyses all sections which are reachable in the limited API >>> context. >>> All macros that don't begin with an underscore which contain >>> a "->tp_" string are the locations which will break. >>> >>> I found exactly 7 locations where this is the case. >>> >>> My PR will contain the 7 fixes plus the analysis script >>> to go into tools. Preparing that in the evening. >> >> Having tests would still be nice to detect changes to the stable ABI when they are made. >> >> Writing those tests is quite some work though, especially if those at least smoke test the limited ABI by compiling snippets that use all symbols that should be exposed by the limited ABI. Writing those tests should be fairly simple for someone that knows how to write C extensions, but is some work. >> >> Writing tests that complain when the headers expose symbols that shouldn't be exposed is harder, due to the need to parse headers (either by hacking something together using regular expressions, or by using tools like gccxml or clang's C API). > What do you mean? > My script does that with all "tp_*" type fields. > What else would you want to check? I think Ronald is saying we're trying to answer a few questions: 1. Did we accidentally drop anything from the stable ABI? 2.
Did we add anything to the stable ABI that we didn't mean to? 3. (and one of mine): Does the stable ABI already contain things that we don't expect it to? Eric From ronaldoussoren at mac.com Mon Jun 4 03:17:24 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Mon, 04 Jun 2018 09:17:24 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> Message-ID: > On 4 Jun 2018, at 08:35, Ronald Oussoren wrote: > > > >> On 3 Jun 2018, at 17:04, Eric V. Smith > wrote: >> >> On 6/3/2018 10:55 AM, Christian Tismer wrote: >>> On 03.06.18 13:18, Ronald Oussoren wrote: >>>> >>>> >>>>> On 3 Jun 2018, at 12:03, Christian Tismer > wrote: >>> ... >>>>> >>>>> I have written a script that scans all relevant header files >>>>> and analyses all sections which are reachable in the limited API >>>>> context. >>>>> All macros that don't begin with an underscore which contain >>>>> a "->tp_" string are the locations which will break. >>>>> >>>>> I found exactly 7 locations where this is the case. >>>>> >>>>> My PR will contain the 7 fixes plus the analysis script >>>>> to go into tools. Preparing that in the evening. >>>> >>>> Having tests would still be nice to detect changes to the stable ABI when they are made. >>>> >>>> Writing those tests is quite some work though, especially if those at least smoke test the limited ABI by compiling snippets that use all symbols that should be exposed by the limited ABI. Writing those tests should be fairly simple for someone that knows how to write C extensions, but is some work. >>>> >>>> Writing tests that complain when the headers expose symbols that shouldn't be exposed is harder, due to the need to parse headers (either by hacking something together using regular expressions, or by using tools like gccxml or clang's C API). >>> What do you mean?
>>> My script does that with all "tp_*" type fields. >>> What else would you want to check? >> >> I think Ronald is saying we're trying to answer a few questions: >> >> 1. Did we accidentally drop anything from the stable ABI? >> >> 2. Did we add anything to the stable ABI that we didn't mean to? >> >> 3. (and one of mine): Does the stable ABI already contain things that we don't expect it to? > > That's correct. There have been instances of the second item over the years, and not all of them have been caught before releases. What doesn't help for all of these is that the stable ABI documentation says that every documented symbol is part of the stable ABI unless there's explicit documentation to the contrary. This makes researching if functions are intended to be part of the stable ABI harder. > > And also: > > 4. Does the stable ABI actually work? > > Christian's script finds cases where exposed names don't actually work when you try to use them. To reply to myself, the gist below is a very crude version of what I was trying to suggest: https://gist.github.com/ronaldoussoren/fe4f80351a7ee72c245025df7b2ef1ed#file-gistfile1-txt The gist is far from usable, but shows some tests that check that symbols in the stable ABI can be used, and tests that everything exported in the stable ABI is actually tested. Again, the code in the gist is a crude hack and I have currently no plans to turn this into something that could be added to the testsuite. Ronald -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ronaldoussoren at mac.com Mon Jun 4 02:35:31 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Mon, 04 Jun 2018 08:35:31 +0200 Subject: [Python-Dev] Stable ABI In-Reply-To: <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> Message-ID: > On 3 Jun 2018, at 17:04, Eric V. Smith wrote: > > On 6/3/2018 10:55 AM, Christian Tismer wrote: >> On 03.06.18 13:18, Ronald Oussoren wrote: >>> >>> >>>> On 3 Jun 2018, at 12:03, Christian Tismer wrote: >> ... >>>> >>>> I have written a script that scans all relevant header files >>>> and analyses all sections which are reachable in the limited API >>>> context. >>>> All macros that don't begin with an underscore which contain >>>> a "->tp_" string are the locations which will break. >>>> >>>> I found exactly 7 locations where this is the case. >>>> >>>> My PR will contain the 7 fixes plus the analysis script >>>> to go into tools. Preparing that in the evening. >>> >>> Having tests would still be nice to detect changes to the stable ABI when they are made. >>> >>> Writing those tests is quite some work though, especially if those at least smoke test the limited ABI by compiling snippets that use all symbols that should be exposed by the limited ABI. Writing those tests should be fairly simple for someone that knows how to write C extensions, but is some work. >>> >>> Writing tests that complain when the headers expose symbols that shouldn't be exposed is harder, due to the need to parse headers (either by hacking something together using regular expressions, or by using tools like gccxml or clang's C API). >> What do you mean? >> My script does that with all "tp_*" type fields. >> What else would you want to check? > > I think Ronald is saying we're trying to answer a few questions: > > 1.
Did we accidentally drop anything from the stable ABI? > > 2. Did we add anything to the stable ABI that we didn't mean to? > > 3. (and one of mine): Does the stable ABI already contain things that we don't expect it to? That's correct. There have been instances of the second item over the years, and not all of them have been caught before releases. What doesn't help for all of these is that the stable ABI documentation says that every documented symbol is part of the stable ABI unless there's explicit documentation to the contrary. This makes researching if functions are intended to be part of the stable ABI harder. And also: 4. Does the stable ABI actually work? Christian's script finds cases where exposed names don't actually work when you try to use them. Ronald -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericsnowcurrently at gmail.com Mon Jun 4 11:05:20 2018 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Mon, 4 Jun 2018 09:05:20 -0600 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> Message-ID: I've pointed out in bpo-21142 the similar script I added last year to track C globals: https://github.com/python/cpython/tree/master/Tools/c-globals -eric On Mon, Jun 4, 2018 at 1:17 AM, Ronald Oussoren wrote: > > > On 4 Jun 2018, at 08:35, Ronald Oussoren wrote: > > > > On 3 Jun 2018, at 17:04, Eric V. Smith wrote: > > On 6/3/2018 10:55 AM, Christian Tismer wrote: > > On 03.06.18 13:18, Ronald Oussoren wrote: > > > > On 3 Jun 2018, at 12:03, Christian Tismer wrote: > > ... > > > I have written a script that scans all relevant header files > and analyses all sections which are reachable in the limited API > context. > All macros that don't begin with an underscore which contain > a "->tp_" string are the locations which will break.
> > I found exactly 7 locations where this is the case. > > My PR will contain the 7 fixes plus the analysis script > to go into tools. Preparing that in the evening. > > > Having tests would still be nice to detect changes to the stable ABI when > they are made. > > Writing those tests is quite some work though, especially if those at least > smoke test the limited ABI by compiling snippets that use all symbols that > should be exposed by the limited ABI. Writing those tests should be fairly > simple for someone that knows how to write C extensions, but is some work. > > Writing tests that complain when the headers expose symbols that shouldn't > be exposed is harder, due to the need to parse headers (either by hacking > something together using regular expressions, or by using tools like gccxml > or clang's C API). > > What do you mean? > My script does that with all "tp_*" type fields. > What else would you want to check? > > > I think Ronald is saying we're trying to answer a few questions: > > 1. Did we accidentally drop anything from the stable ABI? > > 2. Did we add anything to the stable ABI that we didn't mean to? > > 3. (and one of mine): Does the stable ABI already contain things that we > don't expect it to? > > > That's correct. There have been instances of the second item over the years, > and not all of them have been caught before releases. What doesn't help for > all of these is that the stable ABI documentation says that every documented > symbol is part of the stable ABI unless there's explicit documentation to > the contrary. This makes researching if functions are intended to be part of > the stable ABI harder. > > And also: > > 4. Does the stable ABI actually work? > > Christian's script finds cases where exposed names don't actually work when > you try to use them.
> > To reply to myself, the gist below is a very crude version of what I was > trying to suggest: > > https://gist.github.com/ronaldoussoren/fe4f80351a7ee72c245025df7b2ef1ed#file-gistfile1-txt > > The gist is far from usable, but shows some tests that check that symbols in > the stable ABI can be used, and tests that everything exported in the stable > ABI is actually tested. > > Again, the code in the gist is a crude hack and I have currently no plans to > turn this into something that could be added to the testsuite. > > Ronald > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com > From vstinner at redhat.com Mon Jun 4 11:03:27 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 4 Jun 2018 17:03:27 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion Message-ID: Hi, It's now official: Microsoft is acquiring GitHub. https://blog.github.com/2018-06-04-github-microsoft/ https://news.microsoft.com/2018/06/04/microsoft-to-acquire-github-for-7-5-billion/ I copy the news here since Python relies more and more on GitHub. At this point, I have no opinion about the event :-) I just guess that it should make GitHub more sustainable since Microsoft is a big company with money and interest in GitHub. I'm also confident that nothing will change soon. IMHO there is no need to worry about anything.
Victor From solipsis at pitrou.net Mon Jun 4 11:40:34 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 4 Jun 2018 17:40:34 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion References: Message-ID: <20180604174034.58200616@fsol> On Mon, 4 Jun 2018 17:03:27 +0200 Victor Stinner wrote: > > At this point, I have no opinion about the event :-) I just guess that > it should make GitHub more sustainable since Microsoft is a big > company with money and interest in GitHub. I'm also confident that > nothing will change soon. IMHO there is no need to worry about > anything. It does spell uncertainty on the long term. While there is no need to worry for now, I think it gives a different colour to the debate about moving issues to Github. Regards Antoine. From guido at python.org Mon Jun 4 12:06:28 2018 From: guido at python.org (Guido van Rossum) Date: Mon, 4 Jun 2018 09:06:28 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: <20180604174034.58200616@fsol> References: <20180604174034.58200616@fsol> Message-ID: On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou wrote: > > On Mon, 4 Jun 2018 17:03:27 +0200 > Victor Stinner wrote: > > > > At this point, I have no opinion about the event :-) I just guess that > > it should make GitHub more sustainable since Microsoft is a big > > company with money and interest in GitHub. I'm also confident that > > nothing will change soon. IMHO there is no need to worry about > > anything. > > It does spell uncertainty on the long term. While there is no need to > worry for now, I think it gives a different colour to the debate about > moving issues to Github. > I don't see how this *increases* the uncertainty. Surely if GitHub had remained independent there would have been similar concerns about how it would make enough money to stay in business. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vstinner at redhat.com Mon Jun 4 12:11:55 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 4 Jun 2018 18:11:55 +0200 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? Message-ID: Hi, I just read a recent bugfix in asyncio: https://github.com/python/cpython/commit/9602643120a509858d0bee4215d7f150e6125468 + try: + await waiter + except Exception: + transport.close() + raise Why only catching "except Exception:"? Why not also catching KeyboardInterrupt or MemoryError? Is it a special rule for asyncio, or a general policy in Python stdlib? For me, it's fine to catch any exception using "except:" if the block contains "raise", the typical pattern to clean up a resource in case of error. Otherwise, there is a risk of leaking an open file or not flushing data on disk, for example. Victor From vstinner at redhat.com Mon Jun 4 12:31:12 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 4 Jun 2018 18:31:12 +0200 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: Message-ID: 2018-05-30 11:33 GMT+02:00 Victor Stinner : > I fixed a few tests which failed randomly. There are still a few, but > the most annoying have been fixed. Quick update a few days later. For an unknown reason, test_multiprocessing_forkserver.TestIgnoreEINTR.test_ignore() started to fail very frequently but only on Travis CI. I have no explanation why only Travis CI. I failed to reproduce the issue in an Ubuntu Trusty container or in an Ubuntu Trusty VM. After hours of debugging, I found the bug and wrote a fix. But the fix didn't work in all cases. A second fix and now it seems like the issue is gone! https://bugs.python.org/issue33532 if you are curious about the strange multiprocessing send() which must block but it doesn't :-) Except Windows 7 which has issues with test_asyncio and multiprocessing tests because this buildbot is slow, it seems like most CIs are now stable.
Known issues: * PPC64 Fedora 3.x, PPC64LE Fedora 3.x, s390x RHEL 3.x: https://bugs.python.org/issue33630 * AIX: always red * USBan: experimental buildbot * Alpine: platform not supported yet (musl issues) Victor From rosuav at gmail.com Mon Jun 4 12:42:09 2018 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 5 Jun 2018 02:42:09 +1000 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner wrote: > Hi, > > I just read a recent bugfix in asyncio: > > https://github.com/python/cpython/commit/9602643120a509858d0bee4215d7f150e6125468 > > + try: > + await waiter > + except Exception: > + transport.close() > + raise > > Why only catching "except Exception:"? Why not also catching > KeyboardInterrupt or MemoryError? Is it a special rule for asyncio, or > a general policy in Python stdlib? > > For me, it's fine to catch any exception using "except:" if the block > contains "raise", typical pattern to cleanup a resource in case of > error. Otherwise, there is a risk of leaking open file or not flushing > data on disk, for example. Pardon the dumb question, but why is try/finally unsuitable? ChrisA From guido at python.org Mon Jun 4 12:45:46 2018 From: guido at python.org (Guido van Rossum) Date: Mon, 4 Jun 2018 09:45:46 -0700 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: It is currently a general convention in asyncio to only catch Exception, not BaseException. I consider this a flaw and we should fix it, but it's unfortunately not so easy -- the tests will fail if you replace all occurrences of Exception with BaseException, and it is not always clear what's the right thing to do. E.g. catching KeyboardInterrupt may actually make it harder to stop a runaway asyncio app. We should move this discussion to the issue tracker. 
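The distinction Guido describes can be sketched with toy code (illustrative only — `DummyTransport` and the waiter callables below are invented for this sketch, not asyncio's real API): `except Exception` cleans up after ordinary errors but lets a `KeyboardInterrupt` fly straight past the cleanup, because `KeyboardInterrupt` derives from `BaseException`, not `Exception`.

```python
class DummyTransport:
    """Made-up stand-in for an asyncio transport."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def wait_and_cleanup(transport, waiter):
    # The pattern from the commit under discussion: close on error, re-raise.
    try:
        waiter()
    except Exception:
        transport.close()
        raise

def failing_waiter():
    raise OSError("connection lost")

def interrupted_waiter():
    raise KeyboardInterrupt

t1 = DummyTransport()
try:
    wait_and_cleanup(t1, failing_waiter)
except OSError:
    pass
print(t1.closed)  # True: an ordinary error triggers the cleanup

t2 = DummyTransport()
try:
    wait_and_cleanup(t2, interrupted_waiter)
except KeyboardInterrupt:
    pass
print(t2.closed)  # False: KeyboardInterrupt bypasses "except Exception"
```

Whether the second outcome is a bug or a feature is exactly what the thread is debating: widening the clause to `except BaseException:` would close the transport in both cases.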
On Mon, Jun 4, 2018 at 9:11 AM, Victor Stinner wrote: > Hi, > > I just read a recent bugfix in asyncio: > > https://github.com/python/cpython/commit/9602643120a509858d0bee4215d7f1 > 50e6125468 > > + try: > + await waiter > + except Exception: > + transport.close() > + raise > > Why only catching "except Exception:"? Why not also catching > KeyboardInterrupt or MemoryError? Is it a special rule for asyncio, or > a general policy in Python stdlib? > > For me, it's fine to catch any exception using "except:" if the block > contains "raise", typical pattern to cleanup a resource in case of > error. Otherwise, there is a risk of leaking open file or not flushing > data on disk, for example. > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Mon Jun 4 12:53:26 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Mon, 4 Jun 2018 12:53:26 -0400 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: > It is currently a general convention in asyncio to only catch Exception, not BaseException. I consider this a flaw and we should fix it, but it's unfortunately not so easy -- the tests will fail if you replace all occurrences of Exception with BaseException, and it is not always clear what's the right thing to do. E.g. catching KeyboardInterrupt may actually make it harder to stop a runaway asyncio app. Yes. Catching BaseExceptions or KeyboardInterrupts in start_tls() would be pointless. Currently asyncio's internal state isn't properly hardened to survive a BaseException in all other places it can occur. Fixing that is one of my goals for 3.8. 
Yury From yselivanov.ml at gmail.com Mon Jun 4 12:57:44 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Mon, 4 Jun 2018 12:57:44 -0400 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico wrote: > > On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner wrote: [..] > > For me, it's fine to catch any exception using "except:" if the block > > contains "raise", typical pattern to cleanup a resource in case of > > error. Otherwise, there is a risk of leaking open file or not flushing > > data on disk, for example. > > Pardon the dumb question, but why is try/finally unsuitable? Because try..finally isn't equivalent to try..except? Perhaps you should look at the actual code: https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123 Yury From rosuav at gmail.com Mon Jun 4 13:11:53 2018 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 5 Jun 2018 03:11:53 +1000 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On Tue, Jun 5, 2018 at 2:57 AM, Yury Selivanov wrote: > On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico wrote: >> >> On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner wrote: > [..] >> > For me, it's fine to catch any exception using "except:" if the block >> > contains "raise", typical pattern to cleanup a resource in case of >> > error. Otherwise, there is a risk of leaking open file or not flushing >> > data on disk, for example. >> >> Pardon the dumb question, but why is try/finally unsuitable? > > Because try..finally isn't equivalent to try..except? Perhaps you > should look at the actual code: > https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123 Oh. Duh. Yep, it was a dumb question. Sorry! The transport should ONLY be closed on error. 
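The point Chris is conceding can be sketched outside asyncio (illustrative code; `Transport` is a made-up stand-in, not the real transport class): `finally` runs its cleanup on success as well, while `except Exception: ... raise` closes the transport only when the wait failed.

```python
closed = []

class Transport:
    """Made-up stand-in; real asyncio transports have a richer API."""
    def close(self):
        closed.append(self)

def connect_with_finally(transport, waiter):
    try:
        waiter()
    finally:
        transport.close()  # runs on success too -- the connection never survives

def connect_with_except(transport, waiter):
    try:
        waiter()
    except Exception:
        transport.close()  # runs only if the waiter raised
        raise

ok = lambda: None

connect_with_finally(Transport(), ok)
print(len(closed))  # 1: the successful connection was closed anyway

closed.clear()
connect_with_except(Transport(), ok)
print(len(closed))  # 0: the transport stays open for use
```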
ChrisA From solipsis at pitrou.net Mon Jun 4 13:02:02 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 4 Jun 2018 19:02:02 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> Message-ID: <20180604190202.16939ef6@fsol> That's true, but Microsoft has a lot at stake in the ecosystem. For example, since it has its own CI service that it tries to promote (VSTS), is it in Microsoft's best interest to polish and improve integrations with other CI services? Regards Antoine. On Mon, 4 Jun 2018 09:06:28 -0700 Guido van Rossum wrote: > On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou wrote: > > > > > On Mon, 4 Jun 2018 17:03:27 +0200 > > Victor Stinner wrote: > > > > > > At this point, I have no opinion about the event :-) I just guess that > > > it should make GitHub more sustainable since Microsoft is a big > > > company with money and interest in GitHub. I'm also confident that > > > nothing will change soon. IMHO there is no need to worry about > > > anything. > > > > It does spell uncertainty in the long term. While there is no need to > > worry for now, I think it gives a different colour to the debate about > > moving issues to GitHub. > > > > I don't see how this *increases* the uncertainty. Surely if GitHub had > remained independent there would have been similar concerns about how it > would make enough money to stay in business. > From mariatta.wijaya at gmail.com Mon Jun 4 13:49:55 2018 From: mariatta.wijaya at gmail.com (Mariatta Wijaya) Date: Mon, 4 Jun 2018 10:49:55 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: <20180604190202.16939ef6@fsol> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> Message-ID: I think we shouldn't be speculating or making guesses. If people are concerned with how Microsoft will manage GitHub, please talk to a Microsoft or GitHub representative, and not gossip in python-dev.
If there is actual news or an announcement of how GitHub will change, and how it will affect our workflow, we'll discuss in core-workflow. Mariatta On Mon, Jun 4, 2018 at 10:02 AM, Antoine Pitrou wrote: > > That's true, but Microsoft has a lot at stake in the ecosystem. > For example, since it has its own CI service that it tries to promote > (VSTS), is it in Microsoft's best interest to polish and improve > integrations with other CI services? > > Regards > > Antoine -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoine at python.org Mon Jun 4 14:06:28 2018 From: antoine at python.org (Antoine Pitrou) Date: Mon, 4 Jun 2018 20:06:28 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> Message-ID: <9e162f7c-d1fc-6837-4430-70cd5ceca242@python.org> On 04/06/2018 19:49, Mariatta Wijaya wrote: > I think we shouldn't be speculating or making guesses. > If people are concerned with how Microsoft will manage GitHub, please > talk to a Microsoft or GitHub representative, and not gossip in python-dev. What would talking to a MS or GH representative achieve? Of course they won't tell you if their company intends to disturb the current service in a few years. 1) being mere employees, they are forbidden to make such statements that could hurt the company's bottom line; 2) they probably don't know themselves, since representatives aren't the ones making decisions, and there's no reason they would be notified months or years in advance. So, if you talk to a representative, you will obviously get some reassuring PR speak (which might be sincere btw). All we are left with is to speculate on where the company's interests lie. It's not unknown for products or services to undergo stark changes after their provider is bought by another entity. Microsoft themselves did it (see Skype), and of course others did as well. Regards Antoine.
From ethan at stoneleaf.us Mon Jun 4 14:46:27 2018 From: ethan at stoneleaf.us (Ethan Furman) Date: Mon, 04 Jun 2018 11:46:27 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> Message-ID: <5B158903.1000806@stoneleaf.us> On 06/04/2018 10:49 AM, Mariatta Wijaya wrote: > I think we shouldn't be speculating or making guesses. We should have contingency plans and be prepared. More than one bought-out competitor has simply disappeared, or been hamstrung in its effectiveness. -- ~Ethan~ From ben+python at benfinney.id.au Mon Jun 4 14:57:01 2018 From: ben+python at benfinney.id.au (Ben Finney) Date: Tue, 05 Jun 2018 04:57:01 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <5B158903.1000806@stoneleaf.us> Message-ID: <85po16idmq.fsf@benfinney.id.au> Ethan Furman writes: > On 06/04/2018 10:49 AM, Mariatta Wijaya wrote: > > > I think we shouldn't be speculating or making guesses. > > We should have contingency plans and be prepared. More than one > bought-out competitor has simply disappeared, or been hamstrung in its > effectiveness. Yes. So, because such contingency plans take quite some time to prepare, now *is* the time to be talking about it. -- \ "There's a certain part of the contented majority who love | `\ anybody who is worth a billion dollars." --John Kenneth | _o__) Galbraith, 1992-05-23 | Ben Finney From vstinner at redhat.com Mon Jun 4 15:36:02 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 4 Jun 2018 21:36:02 +0200 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: 2018-06-04 18:45 GMT+02:00 Guido van Rossum : > It is currently a general convention in asyncio to only catch Exception, not > BaseException.
I consider this a flaw and we should fix it, but it's > unfortunately not so easy -- the tests will fail if you replace all > occurrences of Exception with BaseException, and it is not always clear > what's the right thing to do. E.g. catching KeyboardInterrupt may actually > make it harder to stop a runaway asyncio app. I recall vaguely something about loop.run_until_complete() which didn't behave "as expected" when interrupted by CTRL+c, like the following call to loop.run_until_complete() didn't work as expected. But this issue has been sorted out, no? I mean that maybe asyncio uses "except Exception:" for "historical reasons"? Victor From vano at mail.mipt.ru Mon Jun 4 15:37:24 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Mon, 4 Jun 2018 22:37:24 +0300 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru> Message-ID: <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> No, replying only to you wasn't intended. https://docs.travis-ci.com/user/running-build-in-debug-mode/ is the official doc on how to debug a Travis CI build via ssh. On 04.06.2018 22:31, Victor Stinner wrote: > FYI you only replied to me in private. Is it on purpose? > > I'm interested if I can learn how to get a SSH access to Travis CI! > > 2018-06-04 21:05 GMT+02:00 Ivan Pozdeev : >> On 04.06.2018 19:31, Victor Stinner wrote: >>> 2018-05-30 11:33 GMT+02:00 Victor Stinner : >>>> I fixed a few tests which failed randomly. There are still a few, but >>>> the most annoying have been fixed. >>> Quick update a few days later. >>> >>> For an unknown reason, >>> test_multiprocessing_forkserver.TestIgnoreEINTR.test_ignore() started >>> to fail very frequently but only on Travis CI. I have no explanation >>> why only Travis CI. I failed to reproduce the issue on a Ubuntu Trusty >>> container or in a Ubuntu Trusty VM. After hours of debug, I found the >>> bug and wrote a fix. 
But the fix didn't work in all cases. A second >>> fix and now it seems like the issue is gone! >> >> FYI Travis claim they provide ssh access on request to debug particularly >> pesky issues. >> Last time I tried, got no response from them though. >> >>> https://bugs.python.org/issue33532 if you are curious about the >>> strange multiprocessing send() which must block but it doesn't :-) >>> >>> Except Windows 7 which has issues with test_asyncio and >>> multiprocessing tests because this buildbot is slow, it seems like >>> most CIs are now stable. >>> >>> Known issues: >>> >>> * PPC64 Fedora 3.x, PPC64LE Fedora 3.x, s390x RHEL 3.x: >>> https://bugs.python.org/issue33630 >>> * AIX: always red >>> * USBan: experimental buildbot >>> * Alpine: platform not supported yet (musl issues) >>> >>> Victor >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru >> >> -- >> Regards, >> Ivan >> -- Regards, Ivan From yselivanov.ml at gmail.com Mon Jun 4 15:40:31 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Mon, 4 Jun 2018 15:40:31 -0400 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On Mon, Jun 4, 2018 at 3:38 PM Victor Stinner wrote: > > 2018-06-04 18:45 GMT+02:00 Guido van Rossum : > > It is currently a general convention in asyncio to only catch Exception, not > > BaseException. I consider this a flaw and we should fix it, but it's > > unfortunately not so easy -- the tests will fail if you replace all > > occurrences of Exception with BaseException, and it is not always clear > > what's the right thing to do. E.g. catching KeyboardInterrupt may actually > > make it harder to stop a runaway asyncio app. 
> > I recall vaguely something about loop.run_until_complete() which > didn't behave "as expected" when interrupted by CTRL+c, like the > following call to loop.run_until_complete() didn't work as expected. > But this issue has been sorted out, no? No, the issue is still there. And it's not an easy fix. Yury From mcepl at cepl.eu Mon Jun 4 13:25:49 2018 From: mcepl at cepl.eu (=?UTF-8?Q?Mat=C4=9Bj?= Cepl) Date: Mon, 04 Jun 2018 19:25:49 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion References: <20180604174034.58200616@fsol> Message-ID: On 2018-06-04, 16:06 GMT, Guido van Rossum wrote: > I don't see how this *increases* the uncertainty. Surely if > GitHub had remained independent there would have been > similar concerns about how it would make enough money to stay > in business. Because Microsoft is known to keep a money-losing venture around forever? Best, Matěj -- https://matej.ceplovi.cz/blog/, Jabber: mcepl at ceplovi.cz GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8 To err is human, to purr feline. From vano at mail.mipt.ru Mon Jun 4 16:28:10 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Mon, 4 Jun 2018 23:28:10 +0300 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: <5B158903.1000806@stoneleaf.us> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <5B158903.1000806@stoneleaf.us> Message-ID: <3e4aac00-6213-85c3-64fd-52b39377c482@mail.mipt.ru> On 04.06.2018 21:46, Ethan Furman wrote: > On 06/04/2018 10:49 AM, Mariatta Wijaya wrote: > >> I think we shouldn't be speculating or making guesses. > > We should have contingency plans and be prepared. More than one > bought-out competitor has simply disappeared, or been hamstrung in its > effectiveness. > Actually, since M$ has closely integrated Python into VS, I'm expecting Guido to receive an acquisition offer next!
> -- > ~Ethan~ > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From vano at mail.mipt.ru Mon Jun 4 16:52:38 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Mon, 4 Jun 2018 23:52:38 +0300 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On 04.06.2018 20:11, Chris Angelico wrote: > On Tue, Jun 5, 2018 at 2:57 AM, Yury Selivanov wrote: >> On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico wrote: >>> On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner wrote: >> [..] >>>> For me, it's fine to catch any exception using "except:" if the block >>>> contains "raise", typical pattern to cleanup a resource in case of >>>> error. Otherwise, there is a risk of leaking open file or not flushing >>>> data on disk, for example. >>> Pardon the dumb question, but why is try/finally unsuitable? >> Because try..finally isn't equivalent to try..except? Perhaps you >> should look at the actual code: >> https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123 > Oh. Duh. Yep, it was a dumb question. Sorry! The transport should ONLY > be closed on error. I smell a big, big design violation here. The whole point of Exception vs BaseException is that anything not Exception is "not an error", has a completely different effect on the program than an error, and thus is to be dealt with completely differently. For example, warnings do not disrupt the control flow, and GeneratorExit is normally handled by the `for` loop machinery. That's the whole point why except: is strongly discouraged. 
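Ivan's GeneratorExit point is easy to demonstrate with a toy generator (illustrative code, not from the thread): `close()` throws GeneratorExit into the generator, and a bare `except:` that swallows it turns into a RuntimeError, while `except Exception:` lets it pass untouched because GeneratorExit subclasses BaseException only.

```python
def well_behaved():
    try:
        yield 1
    except Exception:   # GeneratorExit is not an Exception, so it is not caught
        raise

def swallows_everything():
    while True:
        try:
            yield 1
        except:         # bare except also swallows GeneratorExit
            pass

g = well_behaved()
next(g)
g.close()               # fine: GeneratorExit propagates, the generator finishes

h = swallows_everything()
next(h)
try:
    h.close()
except RuntimeError as exc:
    print(exc)          # the generator kept yielding after GeneratorExit
```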
Be _very_ careful because when a system has matured, the risk of making bad to disastrous design decisions skyrockets (because "the big picture" grows ever larger, and it's ever more difficult to account for all of it). The best solution I know of is an independent sanity-check against the project's core design principles: focus solely on them and say if the suggestion is in harmony with the existing big picture. This prevents the project from falling victim to https://en.wikipedia.org/wiki/Design_by_committee in the long run. This is easier to do for someone not intimately involved with the change and the affected area 'cuz they are less biased in favor of the change and less distracted by minute details. Someone may take up this role to "provide a unified vision" (to reduce the load on a single http://meatballwiki.org/wiki/BenevolentDictator , different projects have tried delegates (this can run afoul of https://en.wikipedia.org/wiki/Conway%27s_law though) and a round-robin approach (Apache)). The best way, however, would probably be for anyone dealing with a design change to remember to make this check. This is even easier in Python, 'cuz the core values are officially formulated as Python Zen, and any module has one or two governing principles at its core, tops, that can be extracted by skimming through its docs. > ChrisA > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From chris.barker at noaa.gov Mon Jun 4 17:45:13 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 4 Jun 2018 14:45:13 -0700 Subject: [Python-Dev] Docstrings on builtins Message-ID: Over on python-ideas, someone is/was proposing literals for timedeltas. I don't expect that will come to anything, but it did make me take a look at the docstring for datetime.timedelta. 
I use iPython's ? a lot for a quick overview of how to use a class/function. This is what I get: In [8]: timedelta? Init signature: timedelta(self, /, *args, **kwargs) Docstring: Difference between two datetime values. File: ~/miniconda2/envs/py3/lib/python3.6/datetime.py Type: type That is, well, not so useful. I'd like to see at least the signature: datetime.timedelta(days=0, seconds=0, microseconds=0, milliseconds=0, minutes=0, hours=0, weeks=0) And ideally much of the text in the docs. I've noticed similarly minimal docstrings on a number of builtin functions and methods. If I wanted to contribute a PR to enhance these docstrings, where would they go? I've seen mention of "argument clinic", but really don't know quite what that is, or how it works, but it appears to be related. Anyway -- more comprehensive docstrings on builtins could really help Python's usability for command line usage. Thanks, - Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Mon Jun 4 17:54:34 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 5 Jun 2018 00:54:34 +0300 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? In-Reply-To: References: Message-ID: On 04.06.2018 23:52, Ivan Pozdeev wrote: > On 04.06.2018 20:11, Chris Angelico wrote: >> On Tue, Jun 5, 2018 at 2:57 AM, Yury Selivanov >> wrote: >>> On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico >>> wrote: >>>> On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner >>>> wrote: >>> [..] >>>>> For me, it's fine to catch any exception using "except:" if the block >>>>> contains "raise", typical pattern to cleanup a resource in case of >>>>> error.
Otherwise, there is a risk of leaking open file or not >>>>> flushing >>>>> data on disk, for example. >>>> Pardon the dumb question, but why is try/finally unsuitable? >>> Because try..finally isn't equivalent to try..except?? Perhaps you >>> should look at the actual code: >>> https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123 >>> In this particular code, it looks like just KeyboardInterrupt needs to be handled in addition to Exception -- and even that's not certain 'cuz KeyboardInterrupt is an abnormal termination and specifically designed to not be messed with by the code ("The exception inherits from |BaseException| so as to not be accidentally caught by code that catches |Exception| and thus prevent the interpreter from exiting."). It only makes sense to catch it in REPL interfaces where the user clearly wants to terminale the current command rather than the entire program. If e.g. a warning is upgraded to exception, this means that some code is broken from user's POV, but not from Python team's POV, so we can't really be sure if we can handle this situation gracefully: our cleanup code can fail just as well! >> Oh. Duh. Yep, it was a dumb question. Sorry! The transport should ONLY >> be closed on error. > > I smell a big, big design violation here. > The whole point of Exception vs BaseException is that anything not > Exception is "not an error", has a completely different effect on the > program than an error, and thus is to be dealt with completely > differently. For example, warnings do not disrupt the control flow, > and GeneratorExit is normally handled by the `for` loop machinery. > That's the whole point why except: is strongly discouraged. > > Be _very_ careful because when a system has matured, the risk of > making bad to disastrous design decisions skyrockets (because "the big > picture" grows ever larger, and it's ever more difficult to account > for all of it). 
> > The best solution I know of is an independent sanity-check against the > project's core design principles: focus solely on them and say if the > suggestion is in harmony with the existing big picture. This prevents > the project from falling victim to > https://en.wikipedia.org/wiki/Design_by_committee in the long run. > This is easier to do for someone not intimately involved with the > change and the affected area 'cuz they are less biased in favor of the > change and less distracted by minute details. > > Someone may take up this role to "provide a unified vision" (to reduce > the load on a single http://meatballwiki.org/wiki/BenevolentDictator , > different projects have tried delegates (this can run afoul of > https://en.wikipedia.org/wiki/Conway%27s_law though) and a round-robin > approach (Apache)). > The best way, however, would probably be for anyone dealing with a > design change to remember to make this check. > > This is even easier in Python, 'cuz the core values are officially > formulated as Python Zen, and any module has one or two governing > principles at its core, tops, that can be extracted by skimming > through its docs. > >> ChrisA >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru > -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Mon Jun 4 18:03:03 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 5 Jun 2018 01:03:03 +0300 Subject: [Python-Dev] Why not using "except: (...) raise" to cleanup on error? 
In-Reply-To: References: Message-ID: <58967325-a539-cdce-53c1-7ddf967a05a6@mail.mipt.ru> On 05.06.2018 0:54, Ivan Pozdeev wrote: > On 04.06.2018 23:52, Ivan Pozdeev wrote: >> On 04.06.2018 20:11, Chris Angelico wrote: >>> On Tue, Jun 5, 2018 at 2:57 AM, Yury Selivanov >>> wrote: >>>> On Mon, Jun 4, 2018 at 12:50 PM Chris Angelico >>>> wrote: >>>>> On Tue, Jun 5, 2018 at 2:11 AM, Victor Stinner >>>>> wrote: >>>> [..] >>>>>> For me, it's fine to catch any exception using "except:" if the >>>>>> block >>>>>> contains "raise", typical pattern to cleanup a resource in case of >>>>>> error. Otherwise, there is a risk of leaking open file or not >>>>>> flushing >>>>>> data on disk, for example. >>>>> Pardon the dumb question, but why is try/finally unsuitable? >>>> Because try..finally isn't equivalent to try..except? Perhaps you >>>> should look at the actual code: >>>> https://github.com/python/cpython/blob/b609e687a076d77bdd687f5e4def85e29a044bfc/Lib/asyncio/base_events.py#L1117-L1123 >>>> > > In this particular code, it looks like just KeyboardInterrupt needs to > be handled in addition to Exception -- and even that's not certain > 'cuz KeyboardInterrupt is an abnormal termination and specifically > designed to not be messed with by the code ("The exception inherits > from |BaseException| > > so as to not be accidentally caught by code that catches |Exception| > > and thus prevent the interpreter from exiting."). > > It only makes sense to catch it in REPL interfaces where the user > clearly wants to terminale the current command rather than the entire > program. > Remembered `pip`, too -- there, it's justified by it working in transactions. > > If e.g. a warning is upgraded to exception, this means that some code > is broken from user's POV, but not from Python team's POV, so we can't > really be sure if we can handle this situation gracefully: our cleanup > code can fail just as well! > >>> Oh. Duh. Yep, it was a dumb question. Sorry! 
The transport should ONLY >>> be closed on error. >> >> I smell a big, big design violation here. >> The whole point of Exception vs BaseException is that anything not >> Exception is "not an error", has a completely different effect on the >> program than an error, and thus is to be dealt with completely >> differently. For example, warnings do not disrupt the control flow, >> and GeneratorExit is normally handled by the `for` loop machinery. >> That's the whole point why except: is strongly discouraged. >> >> Be _very_ careful because when a system has matured, the risk of >> making bad to disastrous design decisions skyrockets (because "the >> big picture" grows ever larger, and it's ever more difficult to >> account for all of it). >> >> The best solution I know of is an independent sanity-check against >> the project's core design principles: focus solely on them and say if >> the suggestion is in harmony with the existing big picture. This >> prevents the project from falling victim to >> https://en.wikipedia.org/wiki/Design_by_committee in the long run. >> This is easier to do for someone not intimately involved with the >> change and the affected area 'cuz they are less biased in favor of >> the change and less distracted by minute details. >> >> Someone may take up this role to "provide a unified vision" (to >> reduce the load on a single >> http://meatballwiki.org/wiki/BenevolentDictator , different projects >> have tried delegates (this can run afoul of >> https://en.wikipedia.org/wiki/Conway%27s_law though) and a >> round-robin approach (Apache)). >> The best way, however, would probably be for anyone dealing with a >> design change to remember to make this check. >> >> This is even easier in Python, 'cuz the core values are officially >> formulated as Python Zen, and any module has one or two governing >> principles at its core, tops, that can be extracted by skimming >> through its docs. 
>> >>> ChrisA >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru >> > > -- > Regards, > Ivan -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Mon Jun 4 18:27:37 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 5 Jun 2018 00:27:37 +0200 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: Message-ID: Hi, For Argument Clinic, have a look at https://docs.python.org/dev/howto/clinic.html You can also try to copy/paste code from other files using Argument Clinic and then run "make clinic" to regenerate the generated files. Victor 2018-06-04 23:45 GMT+02:00 Chris Barker via Python-Dev : > Over on python-ideas, someone is/was proposing literals for timedeltas. > > I don't expect that will come to anything, but it did make me take a look at > the docstring for datetime.timedelta. I use iPython's ? a lot for a quick > overview of how to use a class/function. > > This is what I get: > > In [8]: timedelta? > Init signature: timedelta(self, /, *args, **kwargs) > Docstring: Difference between two datetime values. > File: ~/miniconda2/envs/py3/lib/python3.6/datetime.py > Type: type > > > That is, well, not so useful. I'd like to see at least the signature: > > datetime.timedelta(days=0, seconds=0, microseconds=0, milliseconds=0, > minutes=0, hours=0, weeks=0 > > And ideally much of the text in the docs. > > I've noticed similarly minimal docstrings on a number of builtin functions > and methods. > > If I wanted to contribute a PR to enhance these docstrings, where would they > go? I've seen mention of "argument clinic", but really don't know quite > what that is, or how it works, but it appears to be related. 
> > Anyway -- more comprehensive docstrings on buildins could really help > Python's usability for command line usage. > > Thanks, > - Chris > > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > From chris.barker at noaa.gov Mon Jun 4 19:19:17 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Mon, 4 Jun 2018 16:19:17 -0700 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: Message-ID: On Mon, Jun 4, 2018 at 3:27 PM, Victor Stinner wrote: > For Argument Clinic, have a look at > https://docs.python.org/dev/howto/clinic.html Thanks Victor -- scanning that page, it is indeed where I needed to look. You can also try to copy/paste code from other files using Argument > Clinic and then run "make clinic" to regenerate the generated files. > good idea. Now to find some time to actually work on this... -CHB > Victor > > 2018-06-04 23:45 GMT+02:00 Chris Barker via Python-Dev < > python-dev at python.org>: > > Over on python-ideas, someone is/was proposing literals for timedeltas. > > > > I don't expect that will come to anything, but it did make me take a > look at > > the docstring for datetime.timedelta. I use iPython's ? a lot for a quick > > overview of how to use a class/function. > > > > This is what I get: > > > > In [8]: timedelta? > > Init signature: timedelta(self, /, *args, **kwargs) > > Docstring: Difference between two datetime values. > > File: ~/miniconda2/envs/py3/lib/python3.6/datetime.py > > Type: type > > > > > > That is, well, not so useful. 
I'd like to see at least the signature: > > > > datetime.timedelta(days=0, seconds=0, microseconds=0, milliseconds=0, > > minutes=0, hours=0, weeks=0 > > > > And ideally much of the text in the docs. > > > > I've noticed similarly minimal docstrings on a number of builtin > functions > > and methods. > > > > If I wanted to contribute a PR to enhance these docstrings, where would > they > > go? I've seen mention of "argument clinic", but really don't know quite > > what that is, or how it works, but it appears to be related. > > > > Anyway -- more comprehensive docstrings on buildins could really help > > Python's usability for command line usage. > > > > Thanks, > > - Chris > > > > > > > > > > -- > > > > Christopher Barker, Ph.D. > > Oceanographer > > > > Emergency Response Division > > NOAA/NOS/OR&R (206) 526-6959 voice > > 7600 Sand Point Way NE (206) 526-6329 fax > > Seattle, WA 98115 (206) 526-6317 main reception > > > > Chris.Barker at noaa.gov > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > > > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... 
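The signature information Chris is asking for is what Argument Clinic ultimately provides: converted builtins carry a C-level `__text_signature__` string that `inspect` parses into a real `Signature`. A quick check against a builtin that has already been converted — `divmod` is used here just as a convenient example — shows the machinery at work; the `timedelta` quoted above simply had not received that treatment in the interpreter version shown.

```python
import inspect

# divmod has been through Argument Clinic, so the C function carries
# signature metadata that inspect can parse:
print(divmod.__text_signature__)   # e.g. ($module, x, y, /)
print(inspect.signature(divmod))   # (x, y, /)

# help() and IPython's "?" both build on this: once a builtin grows a
# __text_signature__, the "(self, /, *args, **kwargs)" placeholder
# disappears.
params = list(inspect.signature(divmod).parameters)
print(params)                      # ['x', 'y']
```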
URL: From steve at pearwood.info Mon Jun 4 19:38:31 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Tue, 5 Jun 2018 09:38:31 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> Message-ID: <20180604233831.GS12683@ando.pearwood.info> On Mon, Jun 04, 2018 at 07:25:49PM +0200, Matěj Cepl wrote: > On 2018-06-04, 16:06 GMT, Guido van Rossum wrote: > > I don't see how this *increases* the uncertainty. Surely if > > GitHub had remained independent there would have been > > similar concerns about how it would make enough money to stay > > in business. > > Because Microsoft is known to keep a money-losing venture > around forever? No, but Guido is right: neither is anyone else. In that regard, Microsoft is probably *more* likely to keep pumping money into a failing business if it gives them a strategic advantage, compared to other investors with no long-term strategy other than "get acquired by Google/Microsoft/Oracle/Apple". But on the other hand, Microsoft (or at least the bad old Microsoft of Bill Gates' days) has a long history of "Embrace, Extend, Extinguish" as policy: https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguish (Not that Microsoft is the only big tech company that does/did this.) Anyway, this is just all speculation at this point. In the short term, nothing changes, and it is too early to tell how it changes the long term. -- Steve From bussonniermatthias at gmail.com Mon Jun 4 20:09:31 2018 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Mon, 4 Jun 2018 17:09:31 -0700 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: Message-ID: This may even be a bug/feature of IPython, I see that inspect.signature(timedelta) fails, so if timedelta? says Init signature: timedelta(self, /, *args, **kwargs) Then this may be some IPython internal logic. 
The timedelta class seems to use __new__ instead of __init__ (not sure why), and __new__ has a meaningful signature, so maybe we should fall back on that during signature inspection. Feel free to open an issue on the IPython repo. Btw IPython is uppercase I, and we don't want any trouble with the fruit giant. -- M On Mon, 4 Jun 2018 at 16:30, Chris Barker via Python-Dev < python-dev at python.org> wrote: > On Mon, Jun 4, 2018 at 3:27 PM, Victor Stinner > wrote: > >> For Argument Clinic, have a look at >> https://docs.python.org/dev/howto/clinic.html > > > Thanks Victor -- scanning that page, it is indeed where I needed to look. > > You can also try to copy/paste code from other files using Argument >> Clinic and then run "make clinic" to regenerate the generated files. >> > > good idea. > > Now to find some time to actually work on this... > > -CHB > > > > >> Victor >> >> 2018-06-04 23:45 GMT+02:00 Chris Barker via Python-Dev < >> python-dev at python.org>: >> > Over on python-ideas, someone is/was proposing literals for timedeltas. >> > >> > I don't expect that will come to anything, but it did make me take a >> look at >> > the docstring for datetime.timedelta. I use iPython's ? a lot for a >> quick >> > overview of how to use a class/function. >> > >> > This is what I get: >> > >> > In [8]: timedelta? >> > Init signature: timedelta(self, /, *args, **kwargs) >> > Docstring: Difference between two datetime values. >> > File: ~/miniconda2/envs/py3/lib/python3.6/datetime.py >> > Type: type >> > >> > >> > That is, well, not so useful. I'd like to see at least the signature: >> > >> > datetime.timedelta(days=0, seconds=0, microseconds=0, milliseconds=0, >> > minutes=0, hours=0, weeks=0 >> > >> > And ideally much of the text in the docs. >> > >> > I've noticed similarly minimal docstrings on a number of builtin >> functions >> > and methods. >> > >> > If I wanted to contribute a PR to enhance these docstrings, where would >> they >> > go? 
I've seen mention of "argument clinic", but really don't know quite >> > what that is, or how it works, but it appears to be related. >> > >> > Anyway -- more comprehensive docstrings on buildins could really help >> > Python's usability for command line usage. >> > >> > Thanks, >> > - Chris >> > >> > >> > >> > >> > -- >> > >> > Christopher Barker, Ph.D. >> > Oceanographer >> > >> > Emergency Response Division >> > NOAA/NOS/OR&R (206) 526-6959 voice >> > 7600 Sand Point Way NE (206) 526-6329 fax >> > Seattle, WA 98115 (206) 526-6317 main reception >> > >> > Chris.Barker at noaa.gov >> > >> > _______________________________________________ >> > Python-Dev mailing list >> > Python-Dev at python.org >> > https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> > >> https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com >> > >> > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Mon Jun 4 20:28:32 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 5 Jun 2018 03:28:32 +0300 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: Message-ID: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> On 05.06.2018 3:09, Matthias Bussonnier wrote: > This may even be a bug/feature of IPython, > > I see that inspect.signature(timedelta) fails, so if timedelta? says > Init signature: timedelta(self, /, *args, **kwargs) > Then this may be some IPython internal logic. 
The timedelta class seem > to use __new__ instead of __init__ (not sure why) Because it's an immutable type. > and __new__ have a meaningful signature, > So maybe we should fallback on that during signature inspection. > According to https://stackoverflow.com/questions/4374006/check-for-mutability-in-python , there are no reliable tests for mutability. > Feel free to open an issue on the IPython repo. > > Btw IPython is uppercase I, and we don't want any troupe with the > fruit giant. > -- > M > > On Mon, 4 Jun 2018 at 16:30, Chris Barker via Python-Dev > > wrote: > > On Mon, Jun 4, 2018 at 3:27 PM, Victor Stinner > > wrote: > > For Argument Clinic, have a look at > https://docs.python.org/dev/howto/clinic.html > > > Thanks Victor -- scanning that page, it is indeed where I needed > to look. > > You can also try to copy/paste code from other files using > Argument > Clinic and then run "make clinic" to regenerate the generated > files. > > > good idea. > > Now to find some time to actually work on this... > > -CHB > > > Victor > > 2018-06-04 23:45 GMT+02:00 Chris Barker via Python-Dev > >: > > Over on python-ideas, someone is/was proposing literals for > timedeltas. > > > > I don't expect that will come to anything, but it did make > me take a look at > > the docstring for datetime.timedelta. I use iPython's ? a > lot for a quick > > overview of how to use a class/function. > > > > This is what I get: > > > > In [8]: timedelta? > > Init signature: timedelta(self, /, *args, **kwargs) > > Docstring:? ? ? Difference between two datetime values. > > File: ?~/miniconda2/envs/py3/lib/python3.6/datetime.py > > Type:? ? ? ? ? ?type > > > > > > That is, well, not so useful. I'd like to see at least the > signature: > > > > datetime.timedelta(days=0, seconds=0, microseconds=0, > milliseconds=0, > > minutes=0, hours=0, weeks=0 > > > > And ideally much of the text in the docs. > > > > I've noticed similarly minimal docstrings on a number of > builtin functions > > and methods. 
> > > > If I wanted to contribute a PR to enhance these docstrings, > where would they > > go?? I've seen mention of "argument clinic", but really > don't know quite > > what that is, or how it works, but it appears to be related. > > > > Anyway -- more comprehensive docstrings on buildins could > really help > > Python's usability for command line usage. > > > > Thanks, > > -? Chris > > > > > > > > > > -- > > > > Christopher Barker, Ph.D. > > Oceanographer > > > > Emergency Response Division > > NOAA/NOS/OR&R? ? ? ? ? ? (206) 526-6959 ?voice > > 7600 Sand Point Way NE? ?(206) 526-6329? ?fax > > Seattle, WA? 98115? ? ? ?(206) 526-6317 ?main reception > > > > Chris.Barker at noaa.gov > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > > > > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R ? ? ? ? ? ?(206) 526-6959?? voice > 7600 Sand Point Way NE ??(206) 526-6329?? fax > Seattle, WA ?98115 ? ? ??(206) 526-6317?? main reception > > Chris.Barker at noaa.gov > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/bussonniermatthias%40gmail.com > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... 
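The fallback heuristic discussed in this subthread — there is no reliable mutability test, but you can check whether `__init__` is just the inherited `object.__init__` placeholder and, if so, read the signature off `__new__` — can be sketched as follows. `best_signature` is a hypothetical name for illustration, not IPython's actual API.

```python
import inspect

def best_signature(cls):
    """Sketch of the fallback discussed in this thread: if __init__ is
    just the inherited object.__init__ placeholder, its signature is
    useless, so look at __new__ instead."""
    if cls.__init__ is not object.__init__:
        return inspect.signature(cls.__init__)
    return inspect.signature(cls.__new__)

class Frozen:
    """Immutable-style class: customizes __new__, leaves __init__ alone."""
    def __new__(cls, days=0, seconds=0):
        self = super().__new__(cls)
        self._value = (days, seconds)
        return self

print(best_signature(Frozen))  # (cls, days=0, seconds=0)
```

Note the caveat for the case that started the thread: for a C-implemented class like `timedelta`, `inspect.signature(cls.__new__)` still fails until the C constructor gains a `__text_signature__` — which is the Argument Clinic work discussed elsewhere in the thread.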
URL: From chris.barker at noaa.gov Mon Jun 4 20:49:49 2018 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Mon, 4 Jun 2018 20:49:49 -0400 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: Message-ID: > > This may even be a bug/feature of IPython, Ahh, thanks! I'll look into that. -CHB From bussonniermatthias at gmail.com Mon Jun 4 21:21:02 2018 From: bussonniermatthias at gmail.com (Matthias Bussonnier) Date: Mon, 4 Jun 2018 18:21:02 -0700 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> References: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> Message-ID: On Mon, 4 Jun 2018 at 17:29, Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > On 05.06.2018 3:09, Matthias Bussonnier wrote: > > This may even be a bug/feature of IPython, > > I see that inspect.signature(timedelta) fails, so if timedelta? says > Init signature: timedelta(self, /, *args, **kwargs) > Then this may be some IPython internal logic. The timedelta class seems to > use __new__ instead of __init__ (not sure why) > > Because it's an immutable type. > Ah, yes, thanks. > and __new__ has a meaningful signature, > so maybe we should fall back on that during signature inspection. > > According to > https://stackoverflow.com/questions/4374006/check-for-mutability-in-python > , > there are no reliable tests for mutability. > Sure, but we can test if the signature of __init__ is (self, /, *args, **kwargs), and if it is, it is useless and we can attempt to get the signature from __new__ and show that instead. We do similar things for docstrings: if __init__ has no docstring, we look at the class-level docstring. -- M -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Tue Jun 5 07:33:37 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 5 Jun 2018 13:33:37 +0200 Subject: [Python-Dev] Withdraw PEP 546? 
Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7 In-Reply-To: References: Message-ID: So sorry for Python 2.7, I just rejected my PEP 546, no ssl.MemoryBIO for you! https://www.python.org/dev/peps/pep-0546/#rejection-notice The workaround is to use PyOpenSSL on Python 2.7. Victor 2018-05-30 16:28 GMT+02:00 Victor Stinner : > Hi, > > tl;dr I will withdraw the PEP 546 in one week if nobody shows up to > finish the implementation. > > > Last year, I wrote the PEP 546 with Cory Benfield: > "Backport ssl.MemoryBIO and ssl.SSLObject to Python 2.7" > https://www.python.org/dev/peps/pep-0546/ > > The plan was to get a Python 2.7 implementation of Cory's PEP 543: > "A Unified TLS API for Python" > https://www.python.org/dev/peps/pep-0543/ > > Sadly, it seems like Cory is no longer available to work on the project > (PEP 543 is still a draft). > > The PEP 546 is implemented: > https://github.com/python/cpython/pull/2133 > > Well, I closed it, but you can still get it as a patch with: > https://patch-diff.githubusercontent.com/raw/python/cpython/pull/2133.patch > > But tests fail on Travis CI, whereas I'm unable to reproduce the issue > on my laptop (on Fedora). The failure seems to depend on the version > of OpenSSL. Christian Heimes has a "multissl" tool which automates > tests on multiple OpenSSL versions, but I failed to find time to try > this tool. > > Time flies, and one year later the PR of the PEP 546 is still not > merged, tests are still failing. > > One month ago, when 2.7.15 was released, Benjamin Peterson, > Python 2.7 release manager, simply proposed: > "The lack of movement for a year makes me wonder if PEP 546 should be > moved to Withdrawn status." > > Since, again, I failed to find time to look at the test_ssl failure, I > plan to withdraw the PEP next week if nobody shows up :-( Sorry, Python > 2.7! > > Would anyone benefit from MemoryBIO in Python 2.7? Twisted, > asyncio, trio, urllib3, anyone else? 
If yes, who volunteers to > finish the MemoryBIO backport (and maintain it)? > > Victor From mal at egenix.com Tue Jun 5 07:54:01 2018 From: mal at egenix.com (M.-A. Lemburg) Date: Tue, 5 Jun 2018 13:54:01 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: <20180604190202.16939ef6@fsol> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> Message-ID: Something that may change is the way they treat Github accounts; after all, MS is very much a sales-driven company. But then there's always the possibility to move to Gitlab as an alternative (hosted or run on PSF VMs), so I wouldn't worry too much. Do note, however, that the value in Github is not so much in the products they have, but in the data. Their databases know more about IT developers than anyone else, and the fact that Github is put under the AI umbrella in MS should tell us something :-) On 04.06.2018 19:02, Antoine Pitrou wrote: > > That's true, but Microsoft has a lot of stakes in the ecosystem. > For example, since it has its own CI service that it tries to promote > (VSTS), is it in Microsoft's best interest to polish and improve > integrations with other CI services? > > Regards > > Antoine. > > > On Mon, 4 Jun 2018 09:06:28 -0700 > Guido van Rossum wrote: >> On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou wrote: >> >>> >>> On Mon, 4 Jun 2018 17:03:27 +0200 >>> Victor Stinner wrote: >>>> >>>> At this point, I have no opinion about the event :-) I just guess that >>>> it should make GitHub more sustainable since Microsoft is a big >>>> company with money and interest in GitHub. I'm also confident that >>>> nothing will change soon. IMHO there is no need to worry about >>>> anything. >>> >>> It does spell uncertainty on the long term. While there is no need to >>> worry for now, I think it gives a different colour to the debate about >>> moving issues to Github. >>> >> >> I don't see how this *increases* the uncertainty. 
Surely if GitHub had >> remained independent there would have been similar concerns about how it >> would make enough money to stay in business. >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jun 05 2018) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ From mcepl at cepl.eu Tue Jun 5 09:10:41 2018 From: mcepl at cepl.eu (=?UTF-8?Q?Mat=C4=9Bj?= Cepl) Date: Tue, 05 Jun 2018 15:10:41 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion References: <20180604174034.58200616@fsol> <20180604233831.GS12683@ando.pearwood.info> Message-ID: On 2018-06-04, 23:38 GMT, Steven D'Aprano wrote: > No, but Guido is right: neither is anyone else. > > In that regard, Microsoft is probably *more* likely to keep pumping > money into a failing business if it gives them a strategic advantage, > compared to other investors with no long-term strategy other than "get > acquired by Google/Microsoft/Oracle/Apple". Let me just say here that gitlab.com has export of repositories together with all metadata. Just saying. 
Matěj -- https://matej.ceplovi.cz/blog/, Jabber: mcepl at ceplovi.cz GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8 Oh, to be young, and to feel love's keen sting. -- Albus Dumbledore From mhroncok at redhat.com Tue Jun 5 10:11:41 2018 From: mhroncok at redhat.com (=?UTF-8?Q?Miro_Hron=c4=8dok?=) Date: Tue, 5 Jun 2018 16:11:41 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> <20180604233831.GS12683@ando.pearwood.info> Message-ID: On 5.6.2018 15:10, Matěj Cepl wrote: > On 2018-06-04, 23:38 GMT, Steven D'Aprano wrote: >> No, but Guido is right: neither is anyone else. >> >> In that regard, Microsoft is probably *more* likely to keep pumping >> money into a failing business if it gives them a strategic advantage, >> compared to other investors with no long-term strategy other than "get >> acquired by Google/Microsoft/Oracle/Apple". > > Let me just say here that gitlab.com has export of > repositories together with all metadata. Just saying. GitHub has: https://developer.github.com/changes/2018-05-24-user-migration-api/ but I'm not sure what exactly is there. -- Miro Hrončok -- Phone: +420777974800 IRC: mhroncok From chris.barker at noaa.gov Tue Jun 5 10:56:47 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Tue, 5 Jun 2018 07:56:47 -0700 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> Message-ID: OK, looking a bit deeper: In [69]: timedelta.__new__.__doc__ Out[69]: 'Create and return a new object. See help(type) for accurate signature.' In [70]: timedelta.__init__.__doc__ Out[70]: 'Initialize self. See help(type(self)) for accurate signature.' In [71]: timedelta.__doc__ Out[71]: 'Difference between two datetime values.' So none of the docstrings have the proper information. 
And: help(timedelta) returns: Help on class timedelta in module datetime: class timedelta(builtins.object) | Difference between two datetime values. | | Methods defined here: | | __abs__(self, /) | abs(self) | | __add__(self, value, /) | Return self+value. .... So no signature either. I'm guessing this is because Argument Clinic has not been properly applied -- so I have a PR to work on. But where does help() get its info, anyway? I always thought docstrings were supposed to be used for the basic, well, docs. 
> -- > M > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > chris.barker%40noaa.gov > > -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Tue Jun 5 11:01:03 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 5 Jun 2018 18:01:03 +0300 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> Message-ID: On 05.06.2018 17:56, Chris Barker wrote: > OK, > > looking a bit deeper: > > In [69]: timedelta.__new__.__doc__ > Out[69]: 'Create and return a new object.? See help(type) for accurate > signature.' > > In [70]: timedelta.__init__.__doc__ > Out[70]: 'Initialize self.? See help(type(self)) for accurate signature.' > > In [71]: timedelta.__doc__ > Out[71]: 'Difference between two datetime values.' > > So the none of the docstrings have the proper information. And: > > help(timedelta) returns: > > Help on class timedelta in module datetime: > > class timedelta(builtins.object) > ?|? Difference between two datetime values. > ?| > ?|? Methods defined here: > ?| > ?|? __abs__(self, /) > ?|????? abs(self) > ?| > ?|? __add__(self, value, /) > ?|????? Return self+value. > .... > > So no signature either. > > I'm guessing this is because argument clinic has not been properly > applied -- so Ihave a PR to work on. > > but where does help() get its info anyway? > > I always thought docstrings were supposed to be used for the basic, > well, docs. 
And between the class and __new__ and __init__, somewhere > in there you should learn how to initialize an instance, yes? > In [5]: print(str.__doc__) str(object='') -> str str(bytes_or_buffer[, encoding[, errors]]) -> str Create a new string object from the given object. If encoding or errors is specified <...> As you can see, the start of the type's docstring contains constructor signature(s). Timedelta's one should probably do the same. > -CHB > > > > > > On Mon, Jun 4, 2018 at 6:21 PM, Matthias Bussonnier > > > wrote: > > > > On Mon, 4 Jun 2018 at 17:29, Ivan Pozdeev via Python-Dev > > wrote: > > On 05.06.2018 3:09, Matthias Bussonnier wrote: >> This may even be a bug/feature of IPython, >> >> I see that inspect.signature(timedelta) fails, so if >> timedelta? says >> Init signature: timedelta(self, /, *args, **kwargs) >> Then this may be some IPython internal logic. The timedelta >> class seem to use __new__ instead of __init__ (not sure why) > > Because it's an immutable type. > > Ah, yes, thanks. > >> and __new__ have a meaningful signature, >> So maybe we should fallback on that during signature inspection. >> > According to > https://stackoverflow.com/questions/4374006/check-for-mutability-in-python > > , > there are no reliable tests for mutability. > > Sure, but we can test if the signature of __init__ is (self,/, > *args, **kwargs), and if it is,? it is useless we can attempt to > get the signature from __new__ and show that instead.? We do > similar things for docstrings, if __init__ have no docstring we > look at the class level docstring. > -- > M > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/chris.barker%40noaa.gov > > > > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R ? ? ? ? ? ?(206) 526-6959?? 
voice > 7600 Sand Point Way NE ??(206) 526-6329?? fax > Seattle, WA ?98115 ? ? ??(206) 526-6317?? main reception > > Chris.Barker at noaa.gov -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Tue Jun 5 11:03:14 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 6 Jun 2018 01:03:14 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion In-Reply-To: References: <20180604174034.58200616@fsol> <20180604233831.GS12683@ando.pearwood.info> Message-ID: On 5 June 2018 at 23:10, Mat?j Cepl wrote: > On 2018-06-04, 23:38 GMT, Steven D'Aprano wrote: > > No, but Guido is right: neither is anyone else. > > > > In that regard, Microsoft is probably *more* likely to keep pumping > > money into a failing business if it gives them a strategic advantage, > > compared to other investors with no long-term strategy other than "get > > aquired by Google/Microsoft/Oracle/Apple". > > Let me just to say here, that gitlab.com has export of > repository together with all metadata. Just saying. > I was actually looking into this recently to see if the repository import feature could be used to run a regularly updated repository mirror that included all issues and PR comments in addition to the code, and noticed that one of the things that GitLab really requires to create a high quality export is for folks to have linked their GitLab instance accounts with their GitHub ones - if you don't already have that account mapping in place, then the import currently loses the ability to link users correctly with their comments and commits (We have a partial mapping due to folks adding their account names to bugs.python.org for CLA signing purposes, but not in a form that GitLab could actually use, since they need the API access authorisation). 
That said, I personally don't think this changes much about our relationship with GitHub, except for the fact that we're now dealing with a large publicly traded multinational with relatively transparent financial reports rather than a smallish privately held company that was only financially accountable to the venture capitalists backing it. I'm more confident in my ability to predict Microsoft's business incentives based on the prevailing technology landscape than I am in my ability to predict the actions of a VC firm like Andreesen Horowitz :) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Tue Jun 5 14:27:45 2018 From: brett at python.org (Brett Cannon) Date: Tue, 5 Jun 2018 11:27:45 -0700 Subject: [Python-Dev] Stable ABI In-Reply-To: References: <75ea61b157f74839b6c756a0d301ebb7@xmail101.UGent.be> <5B118635.1040006@UGent.be> <8C620897-6A78-468C-AA0E-AF3209FB5B48@mac.com> <99bbe8b9-3fbf-3340-eff0-35abb541d6df@trueblade.com> Message-ID: I know Kushal set up ABI testing for Fedora and has brought up taking the work he did for that and bringing it over to CPython, but I also know he is offline for personal reasons ATM and won't be able to to reply for a little while. On Mon, 4 Jun 2018 at 08:06 Eric Snow wrote: > I've pointed out in bpo-21142 the similar script I added last year to > track C globals: > > https://github.com/python/cpython/tree/master/Tools/c-globals > > -eric > > On Mon, Jun 4, 2018 at 1:17 AM, Ronald Oussoren > wrote: > > > > > > On 4 Jun 2018, at 08:35, Ronald Oussoren wrote: > > > > > > > > On 3 Jun 2018, at 17:04, Eric V. Smith wrote: > > > > On 6/3/2018 10:55 AM, Christian Tismer wrote: > > > > On 03.06.18 13:18, Ronald Oussoren wrote: > > > > > > > > On 3 Jun 2018, at 12:03, Christian Tismer wrote: > > > > ... 
> > > > > > I have written a script that scans all relevant header files > > and analyses all sections which are reachable in the limited API > > context. > > All macros that don't begin with an underscore which contain > > a "->tp_" string are the locations which will break. > > > > I found exactly 7 locations where this is the case. > > > > My PR will contain the 7 fixes plus the analysis script > > to go into tools. Preparing that in the evening. > > > > > > Having tests would still be nice to detect changes to the stable ABI when > > they are made. > > > > Writing those tests is quite some work though, especially if those at least > > smoke test the limited ABI by compiling snippets that use all symbols that > > should be exposed by the limited ABI. Writing those tests should be fairly > > simple for someone that knows how to write C extensions, but is some work. > > > > Writing tests that complain when the headers expose symbols that > shouldn't > > be exposed is harder, due to the need to parse headers (either by hacking > > something together using regular expressions, or by using tools like > gccxml > > or clang's C API). > > > > What do you mean? > > My script does that with all "tp_*" type fields. > > What else would you want to check? > > > > > > I think Ronald is saying we're trying to answer a few questions: > > > > 1. Did we accidentally drop anything from the stable ABI? > > > > 2. Did we add anything to the stable ABI that we didn't mean to? > > > > 3. (and one of mine): Does the stable ABI already contain things that we > > don't expect it to? > > > > > > That's correct. There have been instances of the second item over the > years, > > and not all of them have been caught before releases. What doesn't help > for > > all of these is that the stable ABI documentation says that every > documented > > symbol is part of the stable ABI unless there's explicit documentation to > > the contrary.
This makes researching if functions are intended to be > part of > > the stable ABI harder. > > > > And also: > > > > 4. Does the stable ABI actually work? > > > > Christian's script finds cases where exposed names don't actually work > when > > you try to use them. > > > > > > To reply to myself, the gist below is a very crude version of what I was > > trying to suggest: > > > > > https://gist.github.com/ronaldoussoren/fe4f80351a7ee72c245025df7b2ef1ed#file-gistfile1-txt > > > > The gist is far from usable, but shows some tests that check that > symbols in > > the stable ABI can be used, and tests that everything exported in the > stable > > ABI is actually tested. > > > > Again, the code in the gist is a crude hack and I have currently no > plans to > > turn this into something that could be added to the testsuite. > > > > Ronald > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > > > https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From python at mrabarnett.plus.com Tue Jun 5 15:17:25 2018 From: python at mrabarnett.plus.com (MRAB) Date: Tue, 5 Jun 2018 20:17:25 +0100 Subject: [Python-Dev] Unicode 11.0.0 released Message-ID: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> Unicode 11.0.0 has been released. Will Python 3.7 be updated to it, or is it too late?
From mgainty at hotmail.com Tue Jun 5 10:28:24 2018 From: mgainty at hotmail.com (Martin Gainty) Date: Tue, 5 Jun 2018 14:28:24 +0000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol>, Message-ID: who owns the Data hosted on Github? Github Author? Microsoft? Martin ________________________________ From: Python-Dev on behalf of M.-A. Lemburg Sent: Tuesday, June 5, 2018 7:54 AM To: Antoine Pitrou; python-dev at python.org Subject: Re: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion Something that may change is the way they treat Github accounts, after all, MS is very much a sales driven company. But then there's always the possibility to move to Gitlab as alternative (hosted or run on PSF VMs), so I wouldn't worry too much. Do note, however, that the value in Github is not so much with the products they have, but with the data. Their databases know more about IT developers than anyone else, and the fact that Github is put under the AI umbrella in MS should tell us something :-) On 04.06.2018 19:02, Antoine Pitrou wrote: > > That's true, but Microsoft has a lot of stakes in the ecosystem.
> > For example, since it has its own CI service that it tries to promote > > (VSTS), is it in Microsoft's best interest to polish and improve > > integrations with other CI services? > > > > Regards > > > > Antoine. > > > > > > On Mon, 4 Jun 2018 09:06:28 -0700 > > Guido van Rossum wrote: >> On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou wrote: >> >>> >>> On Mon, 4 Jun 2018 17:03:27 +0200 >>> Victor Stinner wrote: >>>> >>>> At this point, I have no opinion about the event :-) I just guess that >>>> it should make GitHub more sustainable since Microsoft is a big >>>> company with money and interest in GitHub. I'm also confident that >>>> nothing will change soon. IMHO there is no need to worry about >>>> anything. >>> >>> It does spell uncertainty on the long term. While there is no need to >>> worry for now, I think it gives a different colour to the debate about >>> moving issues to Github. >>> >> >> I don't see how this *increases* the uncertainty. Surely if GitHub had >> remained independent there would have been similar concerns about how it >> would make enough money to stay in business. >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev Python-Dev Info Page mail.python.org Do not post general Python questions to this list. For help with Python please see the Python help page. On this list the key Python developers discuss the future of the language and its implementation. > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jun 05 2018) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/mgainty%40hotmail.com -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vano at mail.mipt.ru Tue Jun 5 22:01:09 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 6 Jun 2018 05:01:09 +0300 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> Message-ID: <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> On 05.06.2018 17:28, Martin Gainty wrote: > who owns the Data hosted on Github? > > Github Author? > Microsoft? > > > Martin https://help.github.com/articles/github-terms-of-service/#d-user-generated-content : "You own content you create, but you allow us certain rights to it, so that we can display and share the content you post. You still have control over your content, and responsibility for it, and the rights you grant us are limited to those we need to provide the service. We have the right to remove content or close Accounts if we need to." > > ------------------------------------------------------------------------ > *From:* Python-Dev > on behalf of M.-A. Lemburg > *Sent:* Tuesday, June 5, 2018 7:54 AM > *To:* Antoine Pitrou; python-dev at python.org > *Subject:* Re: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion > Something that may change is the way they treat Github > accounts, after all, MS is very much a sales driven company. > > But then there's always the possibility to move to Gitlab > as alternative (hosted or run on PSF VMs), so I wouldn't > worry too much. > > Do note, however, that the value in Github is not so much with > the products they have, but with the data. Their databases > know more about IT developers than anyone else, and the fact > that Github is put under the AI umbrella in MS should tell > us something :-) > > > On 04.06.2018 19:02, Antoine Pitrou wrote: > > > > That's true, but Microsoft has a lot of stakes in the ecosystem.
> > For example, since it has its own CI service that it tries to promote > > (VSTS), is it in Microsoft's best interest to polish and improve > > integrations with other CI services? > > > > Regards > > > > Antoine. > > > > > > On Mon, 4 Jun 2018 09:06:28 -0700 > > Guido van Rossum wrote: > >> On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou > wrote: > >> > >>> > >>> On Mon, 4 Jun 2018 17:03:27 +0200 > >>> Victor Stinner wrote: > >>>> > >>>> At this point, I have no opinion about the event :-) I just guess > that > >>>> it should make GitHub more sustainable since Microsoft is a big > >>>> company with money and interest in GitHub. I'm also confident that > >>>> nothing will change soon. IMHO there is no need to worry about > >>>> anything. > >>> > >>> It does spell uncertainty on the long term. While there is no need to > >>> worry for now, I think it gives a different colour to the debate about > >>> moving issues to Github. > >>> > >> > >> I don't see how this *increases* the uncertainty. Surely if GitHub had > >> remained independent there would have been similar concerns > about how it > >> would make enough money to stay in business. > >> > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Python-Dev Info Page > mail.python.org > Do not post general Python questions to this list. For help with > Python please see the Python help page. On this list the key Python > developers discuss the future of the language and its implementation. > > > > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > > > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Experts (#1, Jun 05 2018) > >>> Python Projects, Coaching and Consulting ... > http://www.egenix.com/ > >>> Python Database Interfaces ... http://products.egenix.com/ > > >>> Plone/Zope Database Interfaces ... 
http://zope.egenix.com/ > > ________________________________________________________________________ > > ::: We implement business ideas - efficiently in both time and costs ::: > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > > http://www.malemburg.com/ > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjamin at python.org Wed Jun 6 00:22:01 2018 From: benjamin at python.org (Benjamin Peterson) Date: Tue, 05 Jun 2018 21:22:01 -0700 Subject: [Python-Dev] Unicode 11.0.0 released In-Reply-To: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> References: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> Message-ID: <1528258921.2489494.1397963184.01FFBD07@webmail.messagingengine.com> On Tue, Jun 5, 2018, at 12:17, MRAB wrote: > Unicode 11.0.0 has been released. Will Python 3.7 be updated to it, or > is it too late? https://github.com/python/cpython/pull/7439 will update 3.8. Generally, we've considered updating the Unicode database to be a feature and not backported updates to bugfix branches. Thus, tradition would seem to exclude Unicode 11.0.0 from landing so late in 3.7's release cycle. That said, the update is highly unlikely to break anything. It's up to Ned.
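As a concrete check of which Unicode Character Database a given interpreter actually ships, the `unicodedata` module exposes the version directly (a quick illustrative sketch; the exact value you see depends on the build you run it under):

```python
import unicodedata

# Version of the Unicode Character Database bundled with this build,
# e.g. "9.0.0" on 3.6.x, "11.0.0" on builds that include the update.
print(unicodedata.unidata_version)

# A frozen snapshot of UCD 3.2.0 is also shipped, for protocols that
# require that exact revision (e.g. IDNA's stringprep tables).
print(unicodedata.ucd_3_2_0.unidata_version)
```

This is also why the update is "a feature": code comparing `unidata_version` or relying on character properties can observe the change within a bugfix series.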
From tritium-list at sdamon.com Wed Jun 6 01:28:55 2018 From: tritium-list at sdamon.com (Alex Walters) Date: Wed, 6 Jun 2018 01:28:55 -0400 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> Message-ID: > -----Original Message----- > From: Python-Dev list=sdamon.com at python.org> On Behalf Of Ivan Pozdeev via Python-Dev > Sent: Tuesday, June 5, 2018 10:01 PM > To: python-dev at python.org > Subject: Re: [Python-Dev] Microsoft to acquire GitHub for $7.5 b > > On 05.06.2018 17:28, Martin Gainty wrote: > who owns the Data hosted on Github? > > Github Author? > Microsoft? > > Martin > https://help.github.com/articles/github-terms-of-service/#d-user- > generated-content : > > "You own content you create, but you allow us certain rights to it, so that we > can display and share the content you post. You still have control over your > content, and responsibility for it, and the rights you grant us are limited to > those we need to provide the service. We have the right to remove content > or close Accounts if we need to." > > And this is not likely to change under Microsoft. CodePlex had similar language, as does BitBucket, GitLab, and pretty much any service that hosts creative content that isn't a social network (and even then, some of them do too.) > > ________________________________________ > From: Python-Dev bounces+mgainty=hotmail.com at python.org> on behalf of > M.-A. Lemburg > Sent: Tuesday, June 5, 2018 7:54 AM > To: Antoine Pitrou; python-dev at python.org > Subject: Re: [Python-Dev] Microsoft to acquire GitHub for > $7.5 billion > > Something that may change is the way they treat Github > accounts, after all, MS is very much a sales driven company.
> > But then there's always the possibility to move to Gitlab > as alternative (hosted or run on PSF VMs), so I wouldn't > worry too much. > > Do note, however, that the value in Github is not so much > with > the products they have, but with the data. Their databases > know more about IT developers than anyone else, and the fact > that Github is put under the AI umbrella in MS should tell > us something :-) > > > On 04.06.2018 19:02, Antoine Pitrou wrote: > > > > That's true, but Microsoft has a lot of stakes in the > ecosystem. > > For example, since it has its own CI service that it tries to > promote > > (VSTS), is it in Microsoft's best interest to polish and > improve > > integrations with other CI services? > > > > Regards > > > > Antoine. > > > > > > On Mon, 4 Jun 2018 09:06:28 -0700 > > Guido van Rossum wrote: > >> On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou > wrote: > >> > >>> > >>> On Mon, 4 Jun 2018 17:03:27 +0200 > >>> Victor Stinner wrote: > >>>> > >>>> At this point, I have no opinion about the event :-) I just > guess that > >>>> it should make GitHub more sustainable since Microsoft > is a big > >>>> company with money and interest in GitHub. I'm also > confident that > >>>> nothing will change soon. IMHO there is no need to > worry about > >>>> anything. > >>> > >>> It does spell uncertainty on the long term. While there is > no need to > >>> worry for now, I think it gives a different colour to the > debate about > >>> moving issues to Github. > >>> > >> > >> I don't see how this *increases* the uncertainty. Surely if > GitHub had > >> remained independent there would have been similar > concerns about how it > >> would make enough money to stay in business. > >> > > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > Python-Dev Info Page > mail.python.org > Do not post general Python questions to this list.
> For help with Python please see the Python help page. > On this list the key Python developers discuss the future of the language and its implementation. > > > > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > > > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Experts (#1, Jun 05 2018) > >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ > >>> Python Database Interfaces ... http://products.egenix.com/ > >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ > ________________________________________________________________________ > > ::: We implement business ideas - efficiently in both time and > costs ::: > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > http://www.malemburg.com/ > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mgainty%40hotmail.com > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru > > > -- > Regards, > Ivan From rob.cliffe at btinternet.com Wed Jun 6 01:31:35 2018 From: rob.cliffe at btinternet.com (Rob Cliffe) Date: Wed, 6 Jun 2018 06:31:35 +0100 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? Message-ID: Is this a bug or a feature? Consider the following program: # TestProgram.py def Test():
# global x x = 1 exec(compile('print([x+1,x+2])', 'MyTest', 'exec')) exec(compile('print([x+i for i in range(1,3)])', 'MyTest', 'exec')) Test() In Python 2.7.15 the output is [2, 3] [2, 3] In Python 3.6.5 the output is [2, 3] Traceback (most recent call last): File "TestProgram.py", line 7, in <module> Test() File "TestProgram.py", line 6, in Test exec(compile('print([x+i for i in range(1,3)])', 'MyTest', 'exec')) File "MyTest", line 1, in <module> File "MyTest", line 1, in <listcomp> NameError: name 'x' is not defined If the "global x" declaration is uncommented, this "fixes" the Python 3.6.5 behaviour, i.e. no error occurs and the output is the same as for Python 2.7.15. *In other words, it looks as if in Python 3.6.5, the compiled list comprehension* *can "see" a pre-existing global variable but not a local one.* I have used dis to examine the code objects returned by compile() (they are the same with or without the "global x"): Python 2.7.15 first code object from 'print([x+1,x+2])': 1 0 LOAD_NAME 0 (x) 3 LOAD_CONST 0 (1) 6 BINARY_ADD 7 LOAD_NAME 0 (x) 10 LOAD_CONST 1 (2) 13 BINARY_ADD 14 BUILD_LIST 2 17 PRINT_ITEM 18 PRINT_NEWLINE 19 LOAD_CONST 2 (None) 22 RETURN_VALUE Python 2.7.15 second code object from 'print([x+i for i in range(1,3)])': 1 0 BUILD_LIST 0 3 LOAD_NAME 0 (range) 6 LOAD_CONST 0 (1) 9 LOAD_CONST 1 (3) 12 CALL_FUNCTION 2 15 GET_ITER >> 16 FOR_ITER 16 (to 35) 19 STORE_NAME 1 (i) 22 LOAD_NAME 2 (x) 25 LOAD_NAME 1 (i) 28 BINARY_ADD
29 LIST_APPEND 2 32 JUMP_ABSOLUTE 16 >> 35 PRINT_ITEM 36 PRINT_NEWLINE 37 LOAD_CONST 2 (None) 40 RETURN_VALUE Python 3.6.5 first code object from 'print([x+1,x+2])': 1 0 LOAD_NAME 0 (print) 2 LOAD_NAME 1 (x) 4 LOAD_CONST 0 (1) 6 BINARY_ADD 8 LOAD_NAME 1 (x) 10 LOAD_CONST 1 (2) 12 BINARY_ADD 14 BUILD_LIST 2 16 CALL_FUNCTION 1 18 POP_TOP 20 LOAD_CONST 2 (None) 22 RETURN_VALUE Python 3.6.5 second code object from 'print([x+i for i in range(1,3)])': 1 0 LOAD_NAME 0 (print) 2 LOAD_CONST 0 (<code object <listcomp> at 0x00000000029F79C0, file "MyTest", line 1>) 4 LOAD_CONST 1 ('<listcomp>') 6 MAKE_FUNCTION 0 8 LOAD_NAME 1 (range) 10 LOAD_CONST 2 (1) 12 LOAD_CONST 3 (3) 14 CALL_FUNCTION 2 16 GET_ITER 18 CALL_FUNCTION 1 20 CALL_FUNCTION 1 22 POP_TOP 24 LOAD_CONST 4 (None) 26 RETURN_VALUE You will see that in Python 3.6.5 the dis output for the second code object does not show the internals of the listcomp, and in particular whether, and how, it refers to the variable 'x'. I don't know how to investigate further. Best wishes Rob Cliffe -------------- next part -------------- An HTML attachment was scrubbed...
URL: From mcepl at cepl.eu Wed Jun 6 04:55:22 2018 From: mcepl at cepl.eu (=?UTF-8?Q?Mat=C4=9Bj?= Cepl) Date: Wed, 06 Jun 2018 10:55:22 +0200 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion References: <20180604174034.58200616@fsol> <20180604233831.GS12683@ando.pearwood.info> Message-ID: On 2018-06-05, 15:03 GMT, Nick Coghlan wrote: > I was actually looking into this recently to see if the > repository import feature could be used to run a regularly > updated repository mirror that included all issues and PR > comments in addition to the code, Good, thank you very much. I didn't, so I just worked out of the PR materials and documentation on their side. I am glad somebody did. > I'm more confident in my ability to predict Microsoft's > business incentives based on the prevailing technology > landscape than I am in my ability to predict the actions of > a VC firm like Andreessen Horowitz :) Perhaps. I still would be more comfortable if we were thinking from time to time about alternatives in case Microsoft (or somebody else) returns to The Bad Old Ways. I hope it won't happen, but it might. Best, Matěj -- https://matej.ceplovi.cz/blog/, Jabber: mcepl at ceplovi.cz GPG Finger: 3C76 A027 CA45 AD70 98B5 BC1D 7920 5802 880B C9D8 May integrity and uprightness preserve me, for I wait for you. -- Psalm 25:21 ESV From ncoghlan at gmail.com Wed Jun 6 09:51:55 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 6 Jun 2018 23:51:55 +1000 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: Message-ID: On 6 June 2018 at 15:31, Rob Cliffe via Python-Dev wrote: > Is this a bug or a feature?
> Consider the following program: > > # TestProgram.py > def Test(): > # global x > x = 1 > exec(compile('print([x+1,x+2])', 'MyTest', 'exec')) > exec(compile('print([x+i for i in range(1,3)])', 'MyTest', 'exec')) > Test() > > In Python 2.7.15 the output is > > [2, 3] > [2, 3] > > In Python 3.6.5 the output is > [2, 3] > Traceback (most recent call last): > File "TestProgram.py", line 7, in > Test() > File "TestProgram.py", line 6, in Test > exec(compile('print([x+i for i in range(1,3)])', 'MyTest', 'exec')) > File "MyTest", line 1, in > File "MyTest", line 1, in > NameError: name 'x' is not defined > > If the "global x" declaration is uncommented, this "fixes" the Python > 3.6.5 behaviour, > i.e. no error occurs and the output is the same as for Python 2.7.15. > > *In other words, it looks as if in Python 3.6.5, the compiled list > comprehension* > *can "see" a pre-existing global variable but not a local one.* > Yes, this is expected behaviour - the two-namespace form of exec (which is what you get implicitly when you use it inside a function body) is similar to a class body, and hence nested functions (including the implicit ones created for comprehensions) can't see the top level local variables. You can override that and force the use of the single-namespace form by passing the locals() namespace into exec() explicitly: def explicit_local_namespace(): x = 1 exec(compile('print([x+i for i in range(1,3)])', 'MyTest', 'exec'), locals()) explicit_local_namespace() (Note: you'll then need to use collections.ChainMap instead of separate locals and globals namespaces if you want the exec'ed code to still be able to see the module globals in addition to the function locals) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
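To make the namespace distinction above concrete, here is a small runnable sketch (the function names are mine, not from the thread): handing exec() a single mapping, or a dict flattened from a ChainMap over the function's locals and the module globals (one way of realising the ChainMap suggestion), lets the comprehension's implicit function resolve `x`:

```python
from collections import ChainMap

CODE = compile('result = [x + i for i in range(1, 3)]', 'MyTest', 'exec')

MODULE_LEVEL = 40  # stands in for a module global the exec'd code may need

def single_namespace():
    # One mapping serves as both globals and locals, so the listcomp's
    # hidden function finds x via an ordinary global lookup.
    x = 1
    ns = dict(locals())
    exec(CODE, ns)
    return ns['result']

def merged_namespace():
    # Flatten function locals over module globals into the single dict
    # that exec() requires for its globals argument; locals shadow globals.
    x = 2
    ns = dict(ChainMap(dict(locals()), globals()))
    exec(CODE, ns)
    return ns['result'], ns['MODULE_LEVEL']

print(single_namespace())   # [2, 3]
print(merged_namespace())   # ([3, 4], 40)
```

Note that exec() insists on a real dict for its globals argument, which is why the ChainMap is flattened with dict() rather than passed in directly.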
URL: From vstinner at redhat.com Wed Jun 6 11:10:40 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 6 Jun 2018 17:10:40 +0200 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru> <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> Message-ID: 2018-06-04 21:37 GMT+02:00 Ivan Pozdeev : > https://docs.travis-ci.com/user/running-build-in-debug-mode/ is the official > doc on how to debug a Travis CI build via ssh. Did you already try it? The doc mentions a "[Debug]" button, but I cannot see it whereas I'm logged in in the Python organization. I also tried the curl API call but it fails with: { "@type": "error", "error_type": "wrong_credentials", "error_message": "access denied" } curl -s -X POST \ -H "Content-Type: application/json" \ -H "Accept: application/json" \ -H "Travis-API-Version: 3" \ -H "Authorization: token XXXXX" \ -d "{\"quiet\": true}" \ https://api.travis-ci.org/job/388706591/debug where XXXXX is my hidden token ;-) If I use an invalid token ID, I get a different error: just the string "access denied", instead of a JSON dictionary. First I was also confused between travis-ci.com and travis-ci.org ... The documentation shows an example with .com, but Python organization uses .org. 
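For reference, the same debug-job request can be built from Python instead of curl; a hedged sketch mirroring the invocation above (the job id and the XXXXX token are the placeholders from the message, and the request is only constructed here, not sent):

```python
import json
import urllib.request

API_HOST = 'https://api.travis-ci.org'  # .com for projects on the newer service

def debug_request(job_id, token):
    """Build the same POST the curl invocation above issues."""
    return urllib.request.Request(
        f'{API_HOST}/job/{job_id}/debug',
        data=json.dumps({'quiet': True}).encode(),
        headers={
            'Content-Type': 'application/json',
            'Accept': 'application/json',
            'Travis-API-Version': '3',
            'Authorization': 'token ' + token,  # personal API token
        },
        method='POST')

req = debug_request(388706591, 'XXXXX')
print(req.full_url)  # https://api.travis-ci.org/job/388706591/debug
```

Sending it is then a single `urllib.request.urlopen(req)` call, which will fail with the same "access denied" body unless the account has the debug feature enabled.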
Victor From vano at mail.mipt.ru Wed Jun 6 11:53:02 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 6 Jun 2018 18:53:02 +0300 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru> <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> Message-ID: <8b5fa4d9-a184-5778-7f34-afb6d6f703af@mail.mipt.ru> On 06.06.2018 18:10, Victor Stinner wrote: > 2018-06-04 21:37 GMT+02:00 Ivan Pozdeev : >> https://docs.travis-ci.com/user/running-build-in-debug-mode/ is the official >> doc on how to debug a Travis CI build via ssh. > Did you already try it? The doc mentions a "[Debug]" button, but I > cannot see it whereas I'm logged in in the Python organization. Last I checked, they wrote it's only available for paid accounts (on travis-ci.com) by default and only enabled for others on a case-by-case basis, but I cannot find this info now. So suggest you make a support ticket at https://github.com/travis-ci/travis-ci . > I also tried the curl API call but it fails with: > > { > "@type": "error", > "error_type": "wrong_credentials", > "error_message": "access denied" > } > > curl -s -X POST \ > -H "Content-Type: application/json" \ > -H "Accept: application/json" \ > -H "Travis-API-Version: 3" \ > -H "Authorization: token XXXXX" \ > -d "{\"quiet\": true}" \ > https://api.travis-ci.org/job/388706591/debug > > where XXXXX is my hidden token ;-) > > If I use an invalid token ID, I get a different error: just the string > "access denied", instead of a JSON dictionary. First I was also > confused between travis-ci.com and travis-ci.org ... The documentation > shows an example with .com, but Python organization uses .org. 
> > Victor -- Regards, Ivan From songofacandy at gmail.com Wed Jun 6 12:20:27 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 7 Jun 2018 01:20:27 +0900 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru> <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> Message-ID: > > First I was also > confused between travis-ci.com and travis-ci.org ... The documentation > shows an example with .com, but Python organization uses .org. > > Victor > .org is legacy. Open source projects can migrate to new .com. Maybe, ssh is .com only feature. https://blog.travis-ci.com/2018-05-02-open-source-projects-on-travis-ci-com-with-github-apps https://docs.travis-ci.com/user/open-source-on-travis-ci-com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed Jun 6 13:44:12 2018 From: brett at python.org (Brett Cannon) Date: Wed, 6 Jun 2018 10:44:12 -0700 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru> <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru> Message-ID: On Wed, 6 Jun 2018 at 09:27 INADA Naoki wrote: > First I was also >> confused between travis-ci.com and travis-ci.org ... The documentation >> shows an example with .com, but Python organization uses .org. >> >> Victor >> > > .org is legacy. > > Open source projects can migrate to new .com. > ... eventually: "existing user accounts and repositories will be migrated over time." I have not seen any announcements or anything regarding how when or how to migrate ourselves. -Brett > > Maybe, ssh is .com only feature. 
> > > https://blog.travis-ci.com/2018-05-02-open-source-projects-on-travis-ci-com-with-github-apps > > https://docs.travis-ci.com/user/open-source-on-travis-ci-com/ > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Wed Jun 6 13:56:33 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 6 Jun 2018 10:56:33 -0700 Subject: [Python-Dev] Docstrings on builtins In-Reply-To: References: <6ea65b34-158c-587e-525c-35d4a2ed4777@mail.mipt.ru> Message-ID: On Tue, Jun 5, 2018 at 8:01 AM, Ivan Pozdeev wrote: > In [5]: print(str.__doc__) > str(object='') -> str > str(bytes_or_buffer[, encoding[, errors]]) -> str > > Create a new string object from the given object. If encoding or > errors is specified <...> > As you can see, the start of the type's docstring contains constructor > signature(s). > > And iPython does the "right thing" here, too: In [7]: str? Init signature: str(self, /, *args, **kwargs) Docstring: str(object='') -> str str(bytes_or_buffer[, encoding[, errors]]) -> str Create a new string object from the given object. If encoding or errors is specified, then the object must expose a data buffer that will be decoded using the given encoding and error handler. Otherwise, returns the result of object.__str__() (if defined) or repr(object). encoding defaults to sys.getdefaultencoding(). errors defaults to 'strict'. Type: type > > Timedelta's one should probably do the same. > OK, I've found the docstring in the source and will submit a PR. 
-CHB > -CHB > > > > > > On Mon, Jun 4, 2018 at 6:21 PM, Matthias Bussonnier < > bussonniermatthias at gmail.com> wrote: > >> >> >> On Mon, 4 Jun 2018 at 17:29, Ivan Pozdeev via Python-Dev < >> python-dev at python.org> wrote: >> >>> On 05.06.2018 3:09, Matthias Bussonnier wrote: >>> >>> This may even be a bug/feature of IPython, >>> >>> I see that inspect.signature(timedelta) fails, so if timedelta? says >>> Init signature: timedelta(self, /, *args, **kwargs) >>> Then this may be some IPython internal logic. The timedelta class seem >>> to use __new__ instead of __init__ (not sure why) >>> >>> Because it's an immutable type. >>> >> Ah, yes, thanks. >> >> >>> and __new__ have a meaningful signature, >>> So maybe we should fallback on that during signature inspection. >>> >>> According to https://stackoverflow.com/ques >>> tions/4374006/check-for-mutability-in-python , >>> there are no reliable tests for mutability. >>> >> Sure, but we can test if the signature of __init__ is (self,/, *args, >> **kwargs), and if it is, it is useless we can attempt to get the signature >> from __new__ and show that instead. We do similar things for docstrings, >> if __init__ have no docstring we look at the class level docstring. >> -- >> M >> >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/chris. >> barker%40noaa.gov >> >> > > > -- > > Christopher Barker, Ph.D. > Oceanographer > > Emergency Response Division > NOAA/NOS/OR&R (206) 526-6959 voice > 7600 Sand Point Way NE > > (206) 526-6329 fax > Seattle, WA 98115 (206) 526-6317 main reception > > Chris.Barker at noaa.gov > > > -- > Regards, > Ivan > > -- Christopher Barker, Ph.D. 
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE

(206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception

Chris.Barker at noaa.gov
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From brett at python.org  Wed Jun  6 14:24:35 2018
From: brett at python.org (Brett Cannon)
Date: Wed, 6 Jun 2018 11:24:35 -0700
Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion
In-Reply-To:
References: <20180604174034.58200616@fsol>
 <20180604233831.GS12683@ando.pearwood.info>
Message-ID:

On Wed, 6 Jun 2018 at 02:09 Matěj Cepl wrote:

> On 2018-06-05, 15:03 GMT, Nick Coghlan wrote:
> > I was actually looking into this recently to see if the
> > repository import feature could be used to run a regularly
> > updated repository mirror that included all issues and PR
> > comments in addition to the code,
>
> Good, thank you very much. I didn't, so I just worked out of the
> PR materials and documentation on their side. I am glad somebody
> did.
>
> > I'm more confident in my ability to predict Microsoft's
> > business incentives based on the prevailing technology
> > landscape than I am in my ability to predict the actions of
> > a VC firm like Andreessen Horowitz :)
>
> Perhaps. I still would be more comfortable if we were thinking
> from time to time about alternatives in case Microsoft (or
> somebody else) returns to The Bad Old Ways. I hope it won't
> happen, but it might.
>

Backing up the git repo is not terribly troublesome because you just need
to do it regularly by cron job. Plus since git is distributed we have
copies of that repo all over the place and you can verify integrity
through the commit hashes, so even without an official backup we have a
ton of unofficial backups. :)

As for the PR data, that could be done by recording every webhook event
from our repo if someone wanted to. Then you could reconstruct things by
essentially replaying the log of events.
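The webhook-recording idea could be prototyped in a handful of lines. The log file name and both helpers below are illustrative assumptions, not an existing tool: each GitHub webhook delivery is appended to a JSON Lines file, and reconstruction is just a matter of replaying the records in order:

```python
import json
from pathlib import Path

# Hypothetical append-only event log, one JSON record per line.
LOG = Path("webhook-events.jsonl")

def record_event(event_type: str, payload: dict) -> None:
    """Append one webhook delivery (e.g. 'issues', 'pull_request') to the log."""
    line = json.dumps({"event": event_type, "payload": payload}, sort_keys=True)
    with LOG.open("a", encoding="utf-8") as f:
        f.write(line + "\n")

def replay(handler) -> int:
    """Re-apply every recorded event in order; returns the number replayed."""
    count = 0
    with LOG.open(encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            handler(rec["event"], rec["payload"])
            count += 1
    return count
```

An append-only log like this is cheap to keep and trivially replayable, which is the property that matters for a migration-of-last-resort.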
There are also probably more structured ways to do it as well if people cared. But the key thing is someone has to take the time and effort to set something up. I'm personally not planning to put in the effort since I think there's a massively bigger chance we will switch hosts again rather than GitHub deciding to shut off data access with no data export feature or lead time such that we have to craft our own solution and we can't do it fast enough to prevent the loss of data (and I don't think doing this ahead of time for an eventual migration is worth it either as any platform we move to might have its own migration support, etc.). IOW I ain't worried, but I think everyone assumed that for me. :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From jake at lwn.net Wed Jun 6 17:56:53 2018 From: jake at lwn.net (Jake Edge) Date: Wed, 6 Jun 2018 15:56:53 -0600 Subject: [Python-Dev] 2018 Python Language Summit coverage Message-ID: <20180606155653.264c9566@gallinule> Hola python-dev, I have been remiss in posting about my coverage from this year's Python Language Summit -- not to mention remiss in getting it all written up. But I am about half-way done with the sessions from this year. I am posting SubscriberLinks for articles that are still behind the paywall. LWN subscribers can always see our content right away; one week after they are published in a weekly edition, they become freely available for everyone. SubscriberLinks are a way around the paywall. Please feel free to share the SubscriberLinks I am posting here. The starting point is here: https://lwn.net/Articles/754152/ That is an overview article with links to the articles. It will be updated as I add more articles. 
Here is what we have so far:

- Subinterpreter support for Python https://lwn.net/Articles/754162/

- Modifying the Python object model https://lwn.net/Articles/754163/

- A Gilectomy update https://lwn.net/Articles/754577/

- Using GitHub Issues for Python https://lwn.net/Articles/754779/

- Shortening the Python release schedule
https://lwn.net/Articles/755224/

- Unplugging old batteries
https://lwn.net/SubscriberLink/755229/df78cf17181dbdca/

Hopefully I captured things reasonably well -- if you have corrections
or clarifications (or just comments :) , I would recommend posting them
as comments on the article.

I will post an update soon with the next round (with luck, all of the
rest of them).

enjoy!

jake

--
Jake Edge - LWN - jake at lwn.net - http://lwn.net

From songofacandy at gmail.com  Wed Jun  6 18:56:34 2018
From: songofacandy at gmail.com (INADA Naoki)
Date: Thu, 7 Jun 2018 07:56:34 +0900
Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression
In-Reply-To:
References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru>
 <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru>
Message-ID:

On Thu, 7 Jun 2018 at 2:44, Brett Cannon wrote:

>
> On Wed, 6 Jun 2018 at 09:27 INADA Naoki wrote:
>
>> First I was also
>>> confused between travis-ci.com and travis-ci.org ... The documentation
>>> shows an example with .com, but Python organization uses .org.
>>>
>>> Victor
>>>
>>
>> .org is legacy.
>>
>> Open source projects can migrate to new .com.
>>
>
> ... eventually: "existing user accounts and repositories will be migrated
> over time." I have not seen any announcements or anything regarding when
> or how to migrate ourselves.
>
> -Brett
>

Before waiting for notice from Travis-CI, we need to activate the repository
on the new site.
https://docs.travis-ci.com/user/open-source-on-travis-ci-com/#Existing-Open-Source-Repositories-on-travis-ci.org

> However, open source repositories will be migrated to travis-ci.com
gradually, beginning at the end of Q2 2018. You will receive an email when
the migration for a repository is complete. This is an opt-in process: to
have a repository migrated over, it must first be activated on
travis-ci.com.

Could someone who is a python org admin/owner try activating from here?
https://travis-ci.com/profile/python
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ericsnowcurrently at gmail.com  Wed Jun  6 20:47:12 2018
From: ericsnowcurrently at gmail.com (Eric Snow)
Date: Wed, 6 Jun 2018 18:47:12 -0600
Subject: [Python-Dev] 2018 Python Language Summit coverage
In-Reply-To: <20180606155653.264c9566@gallinule>
References: <20180606155653.264c9566@gallinule>
Message-ID:

Thanks for doing this Jake.

-eric

On Wed, Jun 6, 2018 at 3:56 PM, Jake Edge wrote:
>
> Hola python-dev,
>
> I have been remiss in posting about my coverage from this year's Python
> Language Summit -- not to mention remiss in getting it all written up.
> But I am about half-way done with the sessions from this year.
>
> I am posting SubscriberLinks for articles that are still behind the
> paywall. LWN subscribers can always see our content right away; one
> week after they are published in a weekly edition, they become freely
> available for everyone. SubscriberLinks are a way around the paywall.
> Please feel free to share the SubscriberLinks I am posting here.
>
> The starting point is here: https://lwn.net/Articles/754152/ That is
> an overview article with links to the articles. It will be updated as
> I add more articles.
Here is what we have so far: > > - Subinterpreter support for Python https://lwn.net/Articles/754162/ > > - Modifying the Python object model https://lwn.net/Articles/754163/ > > - A Gilectomy update https://lwn.net/Articles/754577/ > > - Using GitHub Issues for Python https://lwn.net/Articles/754779/ > > - Shortening the Python release schedule > https://lwn.net/Articles/755224/ > > - Unplugging old batteries > https://lwn.net/SubscriberLink/755229/df78cf17181dbdca/ > > Hopefully I captured things reasonably well -- if you have corrections > or clarifications (or just comments :) , I would recommend posting them > as comments on the article. > > I will post an update soon with the next round (with luck, all of the > rest of them). > > enjoy! > > jake > > -- > Jake Edge - LWN - jake at lwn.net - http://lwn.net > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ericsnowcurrently%40gmail.com From chris.jerdonek at gmail.com Wed Jun 6 20:59:17 2018 From: chris.jerdonek at gmail.com (Chris Jerdonek) Date: Wed, 6 Jun 2018 17:59:17 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> Message-ID: On Tue, Jun 5, 2018 at 7:03 PM Ivan Pozdeev via Python-Dev < python-dev at python.org> wrote: > On 05.06.2018 17:28, Martin Gainty wrote: > > who owns the Data hosted on Github? > > Github Author? > Microsoft? > > > Martin > > > https://help.github.com/articles/github-terms-of-service/#d-user-generated-content > : > > "*You own content you create, but you allow us certain rights to it, so > that we can display and share the content you post. 
You still have control
> over your content, and responsibility for it, and the rights you grant us
> are limited to those we need to provide the service.*
>

Is the "service" they provide (and what it needs) allowed to change over
time, so that the rights granted can expand? The definition of "service"
in their document is:

   1. The "Service" refers to the applications, software, products, and
   services provided by GitHub.

--Chris

* We have the right to remove content or close Accounts if we need to."*
>
>
> ------------------------------
> *From:* Python-Dev
> on behalf of M.-A.
> Lemburg
> *Sent:* Tuesday, June 5, 2018 7:54 AM
> *To:* Antoine Pitrou; python-dev at python.org
> *Subject:* Re: [Python-Dev] Microsoft to acquire GitHub for $7.5 billion
>
> Something that may change is the way they treat Github
> accounts; after all, MS is very much a sales-driven company.
>
> But then there's always the possibility to move to Gitlab
> as an alternative (hosted or run on PSF VMs), so I wouldn't
> worry too much.
>
> Do note, however, that the value in Github is not so much in
> the products they have, but in the data. Their databases
> know more about IT developers than anyone else, and the fact
> that Github is put under the AI umbrella at MS should tell
> us something :-)
>
>
> On 04.06.2018 19:02, Antoine Pitrou wrote:
> >
> > That's true, but Microsoft has a lot of stakes in the ecosystem.
> > For example, since it has its own CI service that it tries to promote
> > (VSTS), is it in Microsoft's best interest to polish and improve
> > integrations with other CI services?
> >
> > Regards
> >
> > Antoine.
> > > > > > On Mon, 4 Jun 2018 09:06:28 -0700 > > Guido van Rossum wrote: > >> On Mon, Jun 4, 2018 at 8:40 AM, Antoine Pitrou > wrote: > >> > >>> > >>> On Mon, 4 Jun 2018 17:03:27 +0200 > >>> Victor Stinner wrote: > >>>> > >>>> At this point, I have no opinion about the event :-) I just guess that > >>>> it should make GitHub more sustainable since Microsoft is a big > >>>> company with money and interest in GitHub. I'm also confident that > >>>> nothing will change soon. IMHO there is no need to worry about > >>>> anything. > >>> > >>> It does spell uncertainty on the long term. While there is no need to > >>> worry for now, I think it gives a different colour to the debate about > >>> moving issues to Github. > >>> > >> > >> I don't see how this *increases* the uncertainty. Surely if GitHub had > >> remained independent there would have been be similar concerns about > how it > >> would make enough money to stay in business. > >> > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > Python-Dev Info Page > mail.python.org > Do not post general Python questions to this list. For help with Python > please see the Python help page.. On this list the key Python developers > discuss the future of the language and its implementation. > > > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > > > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Experts (#1, Jun 05 2018) > >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ > >>> Python Database Interfaces ... http://products.egenix.com/ > >>> Plone/Zope Database Interfaces ... 
http://zope.egenix.com/
> ________________________________________________________________________
>
> ::: We implement business ideas - efficiently in both time and costs :::
>
> eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
> D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
> Registered at Amtsgericht Duesseldorf: HRB 46611
> http://www.egenix.com/company/contact/
> http://www.malemburg.com/
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/mgainty%40hotmail.com
>
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
>
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru
>
>
> --
> Regards,
> Ivan
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/chris.jerdonek%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mariatta.wijaya at gmail.com  Wed Jun  6 22:25:52 2018
From: mariatta.wijaya at gmail.com (Mariatta Wijaya)
Date: Wed, 6 Jun 2018 19:25:52 -0700
Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression
In-Reply-To:
References: <8a009070-96fe-058d-10fb-f913423c4627@mail.mipt.ru>
 <5431b12f-1c40-d9d9-ba78-cf14b4e7c186@mail.mipt.ru>
Message-ID:

Activating the Travis CI GitHub app is being tracked in
https://github.com/python/core-workflow/issues/255

Let's not press the button until after the 3.7 release.

On Wed, Jun 6, 2018, 3:57 PM INADA Naoki wrote:

> On Thu, 7 Jun 2018 at 2:44, Brett Cannon wrote:
>
>> On Wed, 6 Jun 2018 at 09:27 INADA Naoki wrote:
>>
>>> First I was also
>>>> confused between travis-ci.com and travis-ci.org ... The documentation
>>>> shows an example with .com, but Python organization uses .org.
>>>>
>>>> Victor
>>>>
>>>
>>> .org is legacy.
>>>
>>> Open source projects can migrate to new .com.
>>>
>>
>> ... eventually: "existing user accounts and repositories will be migrated
>> over time." I have not seen any announcements or anything regarding when
>> or how to migrate ourselves.
>>
>> -Brett
>>
>
> Before waiting for notice from Travis-CI, we need to activate the repository
> on the new site.
>
>
> https://docs.travis-ci.com/user/open-source-on-travis-ci-com/#Existing-Open-Source-Repositories-on-travis-ci.org
>
> However, open source repositories will be migrated to travis-ci.com gradually,
> beginning at the end of Q2 2018. You will receive an email when the
> migration for a repository is complete. This is an opt-in process: to have
> a repository migrated over, it must first be activated on travis-ci.com.
>
> Could someone who is a python org admin/owner try activating from here?
> https://travis-ci.com/profile/python
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/mariatta.wijaya%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From mariatta.wijaya at gmail.com  Wed Jun  6 22:45:35 2018
From: mariatta.wijaya at gmail.com (Mariatta Wijaya)
Date: Wed, 6 Jun 2018 19:45:35 -0700
Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression
In-Reply-To:
References:
Message-ID:

Are there APIs we can use to check the status of buildbots?
Maybe we can have our bots check for the buildbot status in backport PRs.
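For what it's worth, newer Buildbot masters (0.9+) expose a JSON REST API that the new web UI is built on, which could answer this. A rough sketch of a summarizer - the `/api/v2/builds` endpoint and the `complete`/`results` field names follow the Buildbot REST API docs, but `summarize_builds`, the python.org URL, and the idea of wiring this into backport PRs are hypothetical:

```python
import json
from urllib.request import urlopen

# Buildbot result codes (per the Buildbot REST API documentation)
SUCCESS, WARNINGS, FAILURE = 0, 1, 2

def summarize_builds(payload):
    """Return (builderid, number, results) for completed, non-green builds.

    `payload` is the decoded JSON of a response such as
    GET https://buildbot.python.org/all/api/v2/builds?limit=20
    (URL illustrative; check the master's actual base path).
    """
    bad = []
    for build in payload.get("builds", []):
        if build.get("complete") and build.get("results", SUCCESS) > WARNINGS:
            bad.append((build["builderid"], build["number"], build["results"]))
    return bad

def fetch_and_summarize(url):
    # Network variant (untested sketch): poll the master and summarize.
    with urlopen(url) as resp:
        return summarize_builds(json.load(resp))
```

A bot could run this on a schedule and comment on the relevant PRs, though as noted elsewhere in the thread, mapping a red build back to a single PR is the hard part.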
On Wed, May 30, 2018, 2:33 AM Victor Stinner wrote: > > Buildbots only send email notifications to buildbot-status at python.org > when the state changes from success (green) to failure (red). It's > much simpler for me to spot a regression when most buildbots are > green. > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Thu Jun 7 00:41:27 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Thu, 7 Jun 2018 14:41:27 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> Message-ID: <20180607044127.GB12683@ando.pearwood.info> On Wed, Jun 06, 2018 at 05:59:17PM -0700, Chris Jerdonek wrote: > Is the ?service? they provide (and what it needs) allowed to change over > time, so that the rights granted can expand? Of course it can change. And they might not even need to give us notice. But that's no different from any other service provider, including your ISP, your phone provider, your electricity provider, etc. If we're to be concerned about changes to terms and conditions, we should be equally concerned about Google, Apple, Amazon, Red Hat, Oracle etc. We shouldn't be uniquely or especially concerned just because Microsoft has purchased Github. Nothing has changed. Github (the old Github, before being sold) were not "the Good Guys", and Microsoft is not "the Bad Guys". Github were a commercial entity, run by venture capitalists only in it for the money, with a brogrammer culture that was (allegedly) highly toxic to women. If Github didn't try to make a grab for their users' content, it was because they made a commercial decision that stealing the IP for a thousand versions of "leftpad" for Node.js was not worth the harm they would do to their business, not because they're nice guys who wouldn't do that. 
https://medium.com/@caspervonb/the-internet-is-at-the-mercy-of-a-handful-of-people-73fac4bc5068 I know that suspicion and fear of Microsoft's bona fides is a long running tradition in FOSS circles, but Microsoft is subject to the same sorts of commercial realities as any other corporation: there is a limit to how evil they can be for the LOLs and still stay in business. They are no more likely to grab users' content than Github were, and for the same reasons. Actually, probably LESS likely. The sort of companies who are Microsoft's important customers, the ones with deep pockets willing to pay for services like Github's, are if anything even more cognisant of the value and importance of so-called Intellectual Property than the average FOSS user, and far more likely to be defensive over some hosting company trying to claim rights to their IP. (Personally, I'm more concerned about MS trying to become another Google, profiling us -- all the better to sell our personal data -- by matching up our Github identies with everything we do on the internet. But again, that's not unique to Microsoft. Every second website, these days, wants to follow your every click and watch everything you do. But that's a rant for another day.) -- Steve From nad at python.org Thu Jun 7 03:40:02 2018 From: nad at python.org (Ned Deily) Date: Thu, 7 Jun 2018 03:40:02 -0400 Subject: [Python-Dev] Unicode 11.0.0 released In-Reply-To: <1528258921.2489494.1397963184.01FFBD07@webmail.messagingengine.com> References: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> <1528258921.2489494.1397963184.01FFBD07@webmail.messagingengine.com> Message-ID: <2B58236E-D849-4E81-9373-F25DE22984C6@python.org> On Jun 6, 2018, at 00:22, Benjamin Peterson wrote: > On Tue, Jun 5, 2018, at 12:17, MRAB wrote: >> Unicode 11.0.0 has been released. Will Python 3.7 be updated to it, or >> is it too late? > > https://github.com/python/cpython/pull/7439 will update 3.8. 
Generally, we've considered updating the Unicode database to be a feature and not backported updates to bugfix branches. Thus, tradition would seem to exclude Unicode 11.0.0 from landing so late in 3.7's release cycle. That said, the update is highly unlikely to break anything. It's up to Ned. I'd hate for 3.7 to fall behind in the emoji race :) Seriously, there are some important additions that will, no doubt, appear in platforms over the lifetime of 3.7 and the risk is very low. Thanks for pinging about it. https://github.com/python/cpython/pull/7470 -- Ned Deily nad at python.org -- [] From zachary.ware+pydev at gmail.com Thu Jun 7 10:15:51 2018 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Thu, 7 Jun 2018 09:15:51 -0500 Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression In-Reply-To: References: Message-ID: On Wed, Jun 6, 2018 at 9:45 PM, Mariatta Wijaya wrote: > Are there APIs we can use to check the status of builbots? > Maybe we can have the our bots check for the buildbot status in backport > PRs. There is a REST API for buildbot; I have no idea how usable/useful it is though (but I think the new UI interacts with the master mostly via REST calls). I am planning to eventually get buildbot integration with GitHub set up, possibly in September. I think it should be possible to make only stable bots show up as status checks, or even just a select subset of the stable bots. 
-- Zach From python at mrabarnett.plus.com Thu Jun 7 13:02:40 2018 From: python at mrabarnett.plus.com (MRAB) Date: Thu, 7 Jun 2018 18:02:40 +0100 Subject: [Python-Dev] Unicode 11.0.0 released In-Reply-To: <2B58236E-D849-4E81-9373-F25DE22984C6@python.org> References: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> <1528258921.2489494.1397963184.01FFBD07@webmail.messagingengine.com> <2B58236E-D849-4E81-9373-F25DE22984C6@python.org> Message-ID: <9500c77f-6aa6-8b8f-c11e-01a599f3bf8d@mrabarnett.plus.com> On 2018-06-07 08:40, Ned Deily wrote: > On Jun 6, 2018, at 00:22, Benjamin Peterson wrote: > > On Tue, Jun 5, 2018, at 12:17, MRAB wrote: > >> Unicode 11.0.0 has been released. Will Python 3.7 be updated to it, or > >> is it too late? > > > > https://github.com/python/cpython/pull/7439 will update 3.8. Generally, we've considered updating the Unicode database to be a feature and not backported updates to bugfix branches. Thus, tradition would seem to exclude Unicode 11.0.0 from landing so late in 3.7's release cycle. That said, the update is highly unlikely to break anything. It's up to Ned. > > I'd hate for 3.7 to fall behind in the emoji race :) The Python community _is_ meant to be inclusive, and we should support the addition of ginger emoijs. :-) > Seriously, there are some important additions that will, no doubt, appear in platforms over the lifetime of 3.7 and the risk is very low. Thanks for pinging about it. 
> > https://github.com/python/cpython/pull/7470 > From chris.barker at noaa.gov Thu Jun 7 13:33:45 2018 From: chris.barker at noaa.gov (Chris Barker - NOAA Federal) Date: Thu, 7 Jun 2018 10:33:45 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: <20180607044127.GB12683@ando.pearwood.info> References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> <20180607044127.GB12683@ando.pearwood.info> Message-ID: > We shouldn't be uniquely or especially concerned just because > Microsoft has purchased Github. Nothing has changed. Exactly ? but this change HAS made people think about an issue that we should have already been thinking about. At the end of the day, anyone, or any project, would be well served by having a plan for potential loss of valuable data. And the no brainer on that is : don?t have only one copy in one place. That?s why my family photos are on my hard drive, and a backup drive, and an online backup service. My house could burn down, so I don?t want everything there. The backup service could change its conditions, go out of business, or simply have a technical failure ? so I don?t want everything there, either. Any service could change or fail. Period. So we shouldn?t want valuable information about Python development only in gitHub. I don?t know how hard it is to backup / mirror an entire repo ? but it sure seems like a good idea. -CHB From rosuav at gmail.com Thu Jun 7 14:24:26 2018 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 8 Jun 2018 04:24:26 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> <20180607044127.GB12683@ando.pearwood.info> Message-ID: On Fri, Jun 8, 2018 at 3:33 AM, Chris Barker - NOAA Federal via Python-Dev wrote: > Any service could change or fail. Period. 
>
> So we shouldn't want valuable information about Python development
> only in GitHub.
>
> I don't know how hard it is to backup / mirror an entire repo -- but it
> sure seems like a good idea.

There are two separate concerns here:

1) How do we get a full copy of all of CPython and its change history?

2) How do we get all the non-code content - issues, pull requests,
comments?

The first one is trivially easy. *Everyone* who has a clone of the
repository [1] has a full copy of the code and all history, updated every
time 'git pull' is run.

The second one depends on GitHub's exporting facilities; but it also
depends on a definition of what's important. Maybe the PSF doesn't care if
people's comments at the bottoms of commits are lost (not to be confused
with commit messages themselves, which are part of the repo proper), so it
wouldn't matter if they're lost. Or maybe it's important to have the
contents of such comments, but it's okay to credit them to an email
address rather than linking to an actual username. Or whatever. Unlike
with the code/history repo, an imperfect export is still of partial value.

ChrisA

[1] Barring shallow clones, but most people don't do those

From vstinner at redhat.com  Fri Jun  8 03:48:03 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Fri, 8 Jun 2018 09:48:03 +0200
Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev)
Message-ID:

Hi,

Yury Selivanov pushed his implementation of PEP 567 -- Context
Variables on January 23, 2018. Yesterday, 4 months after the commit
and only 3 weeks before the 3.7.0 final release, a crash was found in
the implementation:
https://bugs.python.org/issue33803
(it's now fixed, don't worry Ned!)

The bug is a "common" mistake in an object constructor implemented in
C: the object was tracked by the garbage collector before it was fully
initialized, and a GC collection caused a crash somewhere in "object
traversing".
By "common", I mean that I saw this exact bug between 5 and 10 times
over the last 5 years.

In the bpo issue, I asked why we only spotted the bug yesterday. It
seems like changing the threshold of GC generation 0 from 700 to 5
triggers the bug immediately in test_context (the tests of PEP 567). I
wrote a proof-of-concept patch to change the threshold when using -X
dev.

Question: Do you think that bugs spotted by a GC collection are common
enough to change the GC thresholds in development mode (the new -X dev
flag of Python 3.7)?

GC collections detect various kinds of bugs. Another "common" bug is
when an object remains somehow alive in the GC whereas its memory has
been freed: using PYTHONMALLOC=debug (a debug feature already enabled
by -X dev), a GC collection will always crash in such a case.

I'm not sure about the exact thresholds that would be used in
development mode. The general question is more whether it would be
useful. Then the side question is whether reducing the threshold would
kill performance or not.

About performance, -X dev enables debug features which have an
"acceptable" cost in terms of performance and memory, but the enabled
features are chosen on a case-by-case basis. For example, I chose to
*not* enable tracemalloc with -X dev because the cost in terms of CPU
*and* memory is too high (usually 2x slower and 2x the memory).

Victor

From vstinner at redhat.com  Fri Jun  8 03:53:05 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Fri, 8 Jun 2018 09:53:05 +0200
Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression
In-Reply-To:
References:
Message-ID:

2018-06-07 4:45 GMT+02:00 Mariatta Wijaya :
> Are there APIs we can use to check the status of buildbots?

Buildbots offer different ways to send notifications: emails and an
IRC bot, for example. If you want to *poll* for recent builds, I don't
know; I would suggest using notifications (push) rather than polling.
> Maybe we can have our bots check for the buildbot status in backport
> PRs.

Right now, I'm not comfortable with this idea because there is a risk
of scaring newcomers with false alarms and real bugs which are not
coming from their changes, but may be known or are new but still
unrelated to their changes.

Moreover, even when a buildbot fails because of a real regression, a
build may include multiple changes (I saw builds with more than 25
changes on some buildbots). Buildbot is different from a pre-merge CI:
slow buildbots can batch multiple new changes into a single test run.
So again, there is a high risk of false alarms.

Maybe I'm too conservative and we can try something with good
documentation and proper warnings to explain such hypothetical
notifications on pull requests properly.

See also my other email which explains this differently:
https://mail.python.org/pipermail/python-dev/2018-May/153759.html

Victor

From vstinner at redhat.com  Fri Jun  8 04:17:04 2018
From: vstinner at redhat.com (Victor Stinner)
Date: Fri, 8 Jun 2018 10:17:04 +0200
Subject: [Python-Dev] Keeping an eye on Travis CI, AppVeyor and buildbots: revert on regression
In-Reply-To:
References:
Message-ID:

2018-06-04 18:31 GMT+02:00 Victor Stinner :
> Quick update a few days later.
> (...)
> Except Windows 7 which has issues with test_asyncio and
> multiprocessing tests because this buildbot is slow, it seems like
> most CIs are now stable.

The bug wasn't specific to this buildbot; it was a very old race
condition in the Windows ProactorEventLoop which only recently started
to trigger test_asyncio failures. See
https://bugs.python.org/issue33694 for the details.
> Known issues: > > * PPC64 Fedora 3.x, PPC64LE Fedora 3.x, s390x RHEL 3.x: > https://bugs.python.org/issue33630 > * AIX: always red > * UBSan: experimental buildbot > * Alpine: platform not supported yet (musl issues) Apart from these few CIs, the main issue was test__xxsubinterpreters, which is known to crash: https://bugs.python.org/issue33615 To fix the CIs and give Eric Snow more time to look at this crash, I decided to skip test__xxsubinterpreters. You may want to offer Eric help in looking into these tricky issues. So again, except for the few known issues listed above, all other CIs (Travis CI, AppVeyor, all other buildbots) should now pass. Victor From storchaka at gmail.com Fri Jun 8 04:17:20 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 8 Jun 2018 11:17:20 +0300 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: 08.06.18 10:48, Victor Stinner wrote: > Yury Selivanov pushed his implementation of PEP 567 -- Context > Variables on January 23, 2018. Yesterday, 4 months after the commit > and only 3 weeks before the 3.7.0 final release, a crash has been found in > the implementation: > https://bugs.python.org/issue33803 > (it's now fixed, don't worry Ned!) > > The bug is a "common" mistake in an object constructor implemented in > C: the object was tracked by the garbage collector before it was fully > initialized, and a GC collection caused a crash somewhere in "object > traversing". By "common", I mean that I saw this exact bug between 5 > and 10 times over the last 5 years. > > In the bpo issue, I asked why we only spotted the bug yesterday. It > seems like changing the threshold of the GC generation 0 from 700 to 5 > triggers the bug immediately in test_context (tests of PEP 567). I > wrote a proof-of-concept patch to change the threshold when using -X > dev.
> > Question: Do you think that bugs spotted by a GC collection are common > enough to change the GC thresholds in development mode (new -X dev > flag of Python 3.7)? > > GC collections detect various kinds of bugs. Another "common" bug is > when an object remains somehow alive in the GC whereas its memory has > been freed: using PYTHONMALLOC=debug (a debug feature already enabled by > -X dev), a GC collection will always crash in such a case. > > I'm not sure about the exact thresholds that would be used in > development mode. The general question is more whether it would be useful. > Then the side question is whether reducing the threshold would kill > performance or not. > > About performance, -X dev allows enabling debug features which have > an "acceptable" cost in terms of performance and memory, but the enabled > features are chosen on a case by case basis. For example, I chose to > *not* enable tracemalloc using -X dev because the cost in terms of CPU > *and* memory is too high (usually 2x slower and 2x the memory). Reducing the GC threshold can hide other bugs that will be reproducible only in release mode (because resources are released earlier, or objects are destroyed in a different order). What is the cost of traversing all objects? Would it be too high if we just traversed all objects every time a garbage collection could potentially happen, without modifying any data, only checking the consistency of the GC headers? It may also be worth writing suggestions for testing extensions (including setting a low GC threshold) and including them in the Devguide (for core developers) and the "Extending and Embedding" section of the documentation (for authors of third-party extensions).
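The generation-0 experiment discussed in this thread can be approximated today from pure Python with the documented `gc` module API. This is only a sketch of what an `-X dev` mode might do: the value 5 mirrors the proof-of-concept patch, and the sample workload is invented for illustration.

```python
import gc

# CPython's default GC thresholds are (700, 10, 10).  The
# proof-of-concept lowered generation 0 from 700 to 5, so that a
# collection (and therefore a tp_traverse pass over tracked objects)
# runs almost immediately after new objects are created.
default = gc.get_threshold()
gc.set_threshold(5, default[1], default[2])

# Hypothetical workload: with the low threshold, these fresh objects
# are traversed far more often, which is the effect that exposed the
# tracked-before-fully-initialized bug in the contextvars code.
data = [{"n": n} for n in range(1000)]

gc.set_threshold(*default)  # restore the defaults afterwards
```

In a real `-X dev` implementation the threshold change would happen during interpreter startup rather than in user code; the sketch only shows that the knob itself is already exposed.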
From vstinner at redhat.com Fri Jun 8 04:31:51 2018 From: vstinner at redhat.com (Victor Stinner) Date: Fri, 8 Jun 2018 10:31:51 +0200 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: 2018-06-08 10:17 GMT+02:00 Serhiy Storchaka : > Reducing the GC threshold can hide other bugs that will be reproducible only > in release mode (because resources are released earlier, or objects are > destroyed in a different order). > > What is the cost of traversing all objects? Would it be too high if we just > traversed all objects every time a garbage collection could potentially > happen, without modifying any data, only checking the consistency of the GC > headers? Do you suggest triggering a fake "GC collection" which would just visit all objects with a no-op visit callback? I like the idea! Yeah, that would help to detect objects in an inconsistent state and reuse the visit methods already implemented by all types. Would you be interested in trying to implement this new debug feature? Victor From storchaka at gmail.com Fri Jun 8 06:36:24 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 8 Jun 2018 13:36:24 +0300 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: 08.06.18 11:31, Victor Stinner wrote: > Do you suggest triggering a fake "GC collection" which would just > visit all objects with a no-op visit callback? I like the idea! > > Yeah, that would help to detect objects in an inconsistent state and > reuse the visit methods already implemented by all types. > > Would you be interested in trying to implement this new debug feature? It is simple:

#ifdef Py_DEBUG
void
_PyGC_CheckConsistency(void)
{
    int i;
    if (_PyRuntime.gc.collecting) {
        return;
    }
    _PyRuntime.gc.collecting = 1;
    for (i = 0; i < NUM_GENERATIONS; ++i) {
        update_refs(GEN_HEAD(i));
    }
    for (i = 0; i < NUM_GENERATIONS; ++i) {
        subtract_refs(GEN_HEAD(i));
    }
    for (i = 0; i < NUM_GENERATIONS; ++i) {
        revive_garbage(GEN_HEAD(i));
    }
    _PyRuntime.gc.collecting = 0;
}
#endif

From steve at pearwood.info Fri Jun 8 08:42:45 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Fri, 8 Jun 2018 22:42:45 +1000 Subject: [Python-Dev] Unicode 11.0.0 released In-Reply-To: <9500c77f-6aa6-8b8f-c11e-01a599f3bf8d@mrabarnett.plus.com> References: <1d108d13-9c67-9446-48bc-935cdbefb879@mrabarnett.plus.com> <1528258921.2489494.1397963184.01FFBD07@webmail.messagingengine.com> <2B58236E-D849-4E81-9373-F25DE22984C6@python.org> <9500c77f-6aa6-8b8f-c11e-01a599f3bf8d@mrabarnett.plus.com> Message-ID: <20180608124244.GI12683@ando.pearwood.info> On Thu, Jun 07, 2018 at 06:02:40PM +0100, MRAB wrote: > The Python community _is_ meant to be inclusive, and we should support > the addition of ginger emojis. :-) https://www.youtube.com/watch?v=KVN_0qvuhhw -- Steve From ronaldoussoren at mac.com Fri Jun 8 09:22:01 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Fri, 08 Jun 2018 15:22:01 +0200 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: > On 8 Jun 2018, at 12:36, Serhiy Storchaka wrote: > > 08.06.18 11:31, Victor Stinner wrote: >> Do you suggest triggering a fake "GC collection" which would just >> visit all objects with a no-op visit callback? I like the idea! >> >> Yeah, that would help to detect objects in an inconsistent state and >> reuse the visit methods already implemented by all types. >> >> Would you be interested in trying to implement this new debug feature?
>
> It is simple:
>
> #ifdef Py_DEBUG
> void
> _PyGC_CheckConsistency(void)
> {
>     int i;
>     if (_PyRuntime.gc.collecting) {
>         return;
>     }
>     _PyRuntime.gc.collecting = 1;
>     for (i = 0; i < NUM_GENERATIONS; ++i) {
>         update_refs(GEN_HEAD(i));
>     }
>     for (i = 0; i < NUM_GENERATIONS; ++i) {
>         subtract_refs(GEN_HEAD(i));
>     }
>     for (i = 0; i < NUM_GENERATIONS; ++i) {
>         revive_garbage(GEN_HEAD(i));
>     }
>     _PyRuntime.gc.collecting = 0;
> }
> #endif

Wouldn't it be enough to visit just the newly tracked object in PyObject_GC_Track with a visitor function that does something minimal to verify that the object value is sane, for example by checking PyType_Ready(Py_TYPE(op)). That would find issues where objects are tracked before they are initialised far enough to be safe to visit, without changing GC behavior. I have no idea what the performance impact of this is though. Ronald -------------- next part -------------- An HTML attachment was scrubbed... URL: From status at bugs.python.org Fri Jun 8 12:09:46 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 8 Jun 2018 18:09:46 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180608160946.49DC85620E@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-06-01 - 2018-06-08) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message.
Issues counts and deltas: open 6691 ( +8) closed 38869 (+66) total 45560 (+74) Open issues with patches: 2640 Issues opened (39) ================== #23835: configparser does not convert defaults to strings https://bugs.python.org/issue23835 reopened by barry #24622: tokenize.py: missing EXACT_TOKEN_TYPES https://bugs.python.org/issue24622 reopened by skrah #33736: Improve the documentation of asyncio stream API https://bugs.python.org/issue33736 opened by Elvis.Pranskevichus #33738: PyIndex_Check conflicts with PEP 384 https://bugs.python.org/issue33738 opened by Christian.Tismer #33740: PyByteArray_AsString C-API description lacks the assurance, th https://bugs.python.org/issue33740 opened by realead #33741: UnicodeEncodeError onsmtplib.login(MAIL_USER, MAIL_PASSWORD) https://bugs.python.org/issue33741 opened by JustAnother1 #33742: Unsafe memory access in PyStructSequence_InitType https://bugs.python.org/issue33742 opened by Pasha Stetsenko #33745: 3.7.0b5 changes the line number of empty functions with docstr https://bugs.python.org/issue33745 opened by nedbat #33746: testRegisterResult in test_unittest fails in verbose mode https://bugs.python.org/issue33746 opened by serhiy.storchaka #33747: Failed separate test_patch_propogrates_exc_on_exit in test_uni https://bugs.python.org/issue33747 opened by serhiy.storchaka #33748: test_discovery_failed_discovery in test_unittest modifies sys. 
https://bugs.python.org/issue33748 opened by serhiy.storchaka #33751: Failed separate testTruncateOnWindows in test_file https://bugs.python.org/issue33751 opened by serhiy.storchaka #33754: f-strings should be part of the Grammar https://bugs.python.org/issue33754 opened by davidhalter #33757: Failed separate test_pdb_next_command_in_generator_for_loop in https://bugs.python.org/issue33757 opened by serhiy.storchaka #33758: Unexpected success of test_get_type_hints_modules_forwardref https://bugs.python.org/issue33758 opened by serhiy.storchaka #33762: temp file isn't IOBase https://bugs.python.org/issue33762 opened by Dutcho #33766: Grammar Incongruence https://bugs.python.org/issue33766 opened by Isaac Elliott #33770: base64 throws 'incorrect padding' exception when the issue is https://bugs.python.org/issue33770 opened by dniq #33771: Module: timeit. According to documentation default_repeat shou https://bugs.python.org/issue33771 opened by svyatoslav #33772: Fix few dead code paths https://bugs.python.org/issue33772 opened by David Carlier #33774: Document that @lru_cache caches based on exactly how the funct https://bugs.python.org/issue33774 opened by solstag #33775: argparse: the word 'default' (in help) is not marked as transl https://bugs.python.org/issue33775 opened by woutgg #33777: dummy_threading: .is_alive method returns True after execution https://bugs.python.org/issue33777 opened by njatkinson #33779: Error while installing python 3.6.5 on windows 10 https://bugs.python.org/issue33779 opened by sid1987 #33780: [subprocess] Better Unicode support for shell=True on Windows https://bugs.python.org/issue33780 opened by Yoni Rozenshein #33782: VSTS Windows-PR: internal error https://bugs.python.org/issue33782 opened by vstinner #33783: Use proper class markup for random.Random docs https://bugs.python.org/issue33783 opened by ncoghlan #33787: Argument clinic and Windows line endings https://bugs.python.org/issue33787 opened by giampaolo.rodola #33788: 
Argument clinic: use path_t in _winapi.c https://bugs.python.org/issue33788 opened by giampaolo.rodola #33793: asyncio: _ProactorReadPipeTransport reads by chunk of 32 KiB: https://bugs.python.org/issue33793 opened by vstinner #33797: json int encoding incorrect for dbus.Byte https://bugs.python.org/issue33797 opened by radsquirrel #33799: Remove non-ordered dicts comments from FAQ https://bugs.python.org/issue33799 opened by adelfino #33800: Fix default argument for parameter dict_type of ConfigParser/R https://bugs.python.org/issue33800 opened by adelfino #33801: Remove non-ordered dict comment from plistlib https://bugs.python.org/issue33801 opened by adelfino #33802: Regression in logging configuration https://bugs.python.org/issue33802 opened by barry #33804: Document the default value of the size parameter of io.TextIOB https://bugs.python.org/issue33804 opened by adelfino #33805: dataclasses: replace() give poor error message if using InitVa https://bugs.python.org/issue33805 opened by eric.smith #33808: ssl.get_server_certificate fails with openssl 1.1.0 but works https://bugs.python.org/issue33808 opened by dsanghan #33809: Expose `capture_locals` parameter in `traceback` convenience f https://bugs.python.org/issue33809 opened by ulope Most recent 15 issues with no replies (15) ========================================== #33809: Expose `capture_locals` parameter in `traceback` convenience f https://bugs.python.org/issue33809 #33808: ssl.get_server_certificate fails with openssl 1.1.0 but works https://bugs.python.org/issue33808 #33805: dataclasses: replace() give poor error message if using InitVa https://bugs.python.org/issue33805 #33804: Document the default value of the size parameter of io.TextIOB https://bugs.python.org/issue33804 #33801: Remove non-ordered dict comment from plistlib https://bugs.python.org/issue33801 #33800: Fix default argument for parameter dict_type of ConfigParser/R https://bugs.python.org/issue33800 #33797: json int encoding 
incorrect for dbus.Byte https://bugs.python.org/issue33797 #33793: asyncio: _ProactorReadPipeTransport reads by chunk of 32 KiB: https://bugs.python.org/issue33793 #33788: Argument clinic: use path_t in _winapi.c https://bugs.python.org/issue33788 #33772: Fix few dead code paths https://bugs.python.org/issue33772 #33762: temp file isn't IOBase https://bugs.python.org/issue33762 #33757: Failed separate test_pdb_next_command_in_generator_for_loop in https://bugs.python.org/issue33757 #33747: Failed separate test_patch_propogrates_exc_on_exit in test_uni https://bugs.python.org/issue33747 #33746: testRegisterResult in test_unittest fails in verbose mode https://bugs.python.org/issue33746 #33741: UnicodeEncodeError onsmtplib.login(MAIL_USER, MAIL_PASSWORD) https://bugs.python.org/issue33741 Most recent 15 issues waiting for review (15) ============================================= #33804: Document the default value of the size parameter of io.TextIOB https://bugs.python.org/issue33804 #33802: Regression in logging configuration https://bugs.python.org/issue33802 #33801: Remove non-ordered dict comment from plistlib https://bugs.python.org/issue33801 #33800: Fix default argument for parameter dict_type of ConfigParser/R https://bugs.python.org/issue33800 #33799: Remove non-ordered dicts comments from FAQ https://bugs.python.org/issue33799 #33787: Argument clinic and Windows line endings https://bugs.python.org/issue33787 #33770: base64 throws 'incorrect padding' exception when the issue is https://bugs.python.org/issue33770 #33766: Grammar Incongruence https://bugs.python.org/issue33766 #33751: Failed separate testTruncateOnWindows in test_file https://bugs.python.org/issue33751 #33748: test_discovery_failed_discovery in test_unittest modifies sys. 
https://bugs.python.org/issue33748 #33746: testRegisterResult in test_unittest fails in verbose mode https://bugs.python.org/issue33746 #33738: PyIndex_Check conflicts with PEP 384 https://bugs.python.org/issue33738 #33736: Improve the documentation of asyncio stream API https://bugs.python.org/issue33736 #33726: Add short descriptions to PEP references in seealso https://bugs.python.org/issue33726 #33722: Document builtins in mock_open https://bugs.python.org/issue33722 Top 10 most discussed issues (10) ================================= #33694: test_asyncio: test_start_tls_server_1() fails on Python on x86 https://bugs.python.org/issue33694 17 msgs #33738: PyIndex_Check conflicts with PEP 384 https://bugs.python.org/issue33738 14 msgs #33770: base64 throws 'incorrect padding' exception when the issue is https://bugs.python.org/issue33770 13 msgs #33720: test_marshal: crash in Python 3.7b5 on Windows 10 https://bugs.python.org/issue33720 10 msgs #33766: Grammar Incongruence https://bugs.python.org/issue33766 10 msgs #33642: IDLE: Display up to maxlines non-blank lines for Code Context https://bugs.python.org/issue33642 9 msgs #33779: Error while installing python 3.6.5 on windows 10 https://bugs.python.org/issue33779 8 msgs #33615: test__xxsubinterpreters crashed on x86 Gentoo Refleaks 3.x https://bugs.python.org/issue33615 7 msgs #31731: [2.7] test_io hangs on x86 Gentoo Refleaks 2.7 https://bugs.python.org/issue31731 6 msgs #33609: Document that dicts preserve insertion order https://bugs.python.org/issue33609 6 msgs Issues closed (68) ================== #4896: Faster why variable manipulation in ceval.c https://bugs.python.org/issue4896 closed by serhiy.storchaka #5755: "-Wstrict-prototypes" is valid for Ada/C/ObjC but not for C++" https://bugs.python.org/issue5755 closed by inada.naoki #5945: PyMapping_Check returns 1 for lists https://bugs.python.org/issue5945 closed by levkivskyi #9141: Allow objects to decide if they can be collected by GC 
https://bugs.python.org/issue9141 closed by kristjan.jonsson #12486: tokenize module should have a unicode API https://bugs.python.org/issue12486 closed by willingc #16778: Logger.findCaller needs to be smarter https://bugs.python.org/issue16778 closed by vinay.sajip #18533: Avoid error from repr() of recursive dictview https://bugs.python.org/issue18533 closed by serhiy.storchaka #23495: The writer.writerows method should be documented as accepting https://bugs.python.org/issue23495 closed by Mariatta #27902: pstats.Stats: strip_dirs() method cannot handle file paths fro https://bugs.python.org/issue27902 closed by inada.naoki #28962: Crash when throwing an exception with a malicious __hash__ ove https://bugs.python.org/issue28962 closed by serhiy.storchaka #31849: Python/pyhash.c warning: comparison of integers of different s https://bugs.python.org/issue31849 closed by inada.naoki #32392: subprocess.run documentation does not have **kwargs https://bugs.python.org/issue32392 closed by berker.peksag #32479: inconsistent ImportError message executing same import stateme https://bugs.python.org/issue32479 closed by xiang.zhang #32676: test_asyncio emits many warnings when run in debug mode https://bugs.python.org/issue32676 closed by vstinner #33057: logging.Manager.logRecordFactory is never used https://bugs.python.org/issue33057 closed by vinay.sajip #33165: Add stacklevel parameter to logging APIs https://bugs.python.org/issue33165 closed by vinay.sajip #33197: Confusing error message when constructing invalid inspect.Para https://bugs.python.org/issue33197 closed by yselivanov #33274: minidom removeAttributeNode returns None https://bugs.python.org/issue33274 closed by fdrake #33423: [logging] Improve consistency of logger mechanism. 
https://bugs.python.org/issue33423 closed by vinay.sajip #33477: Document that compile(code, 'exec') has different behavior in https://bugs.python.org/issue33477 closed by mbussonn #33504: configparser should use dict instead of OrderedDict in 3.7+ https://bugs.python.org/issue33504 closed by lukasz.langa #33527: Invalid child function scope https://bugs.python.org/issue33527 closed by r.david.murray #33562: Check that the global settings for asyncio are not changed by https://bugs.python.org/issue33562 closed by brett.cannon #33600: [EASY DOC] Python 2: document that platform.linux_distribution https://bugs.python.org/issue33600 closed by vstinner #33640: [EASY DOC] uuid: endian of the bytes argument is not documente https://bugs.python.org/issue33640 closed by vstinner #33664: IDLE: scroll text by lines, not pixels. https://bugs.python.org/issue33664 closed by terry.reedy #33669: str.format should raise exception when placeholder number does https://bugs.python.org/issue33669 closed by xiang.zhang #33679: IDLE: Enable theme-specific color configuration for code conte https://bugs.python.org/issue33679 closed by terry.reedy #33696: Install python-docs-theme even if SPHINXBUILD is defined https://bugs.python.org/issue33696 closed by adelfino #33724: test__xxsubinterpreters failed on ARMv7 Ubuntu 3.x https://bugs.python.org/issue33724 closed by eric.snow #33734: asyncio/ssl: Fix AttributeError, increase default handshake ti https://bugs.python.org/issue33734 closed by ned.deily #33737: Multiprocessing not working https://bugs.python.org/issue33737 closed by ned.deily #33739: pathlib: Allow ellipsis to appear after "/" to navigate to par https://bugs.python.org/issue33739 closed by yselivanov #33743: test_asyncio raises a deprecation warning https://bugs.python.org/issue33743 closed by vstinner #33744: Fix and improve tests for the uu module https://bugs.python.org/issue33744 closed by serhiy.storchaka #33749: pdb.Pdb constructor stdout override required to disable 
use_ra https://bugs.python.org/issue33749 closed by r.david.murray #33750: Failed separate test_from_tuple in test_decimal https://bugs.python.org/issue33750 closed by skrah #33752: Leaked file in test_anydbm_creation_n_file_exists_with_invalid https://bugs.python.org/issue33752 closed by serhiy.storchaka #33753: Leaked file in test_nextfile_oserror_deleting_backup in test_f https://bugs.python.org/issue33753 closed by serhiy.storchaka #33755: Failed separate tests in test_importlib https://bugs.python.org/issue33755 closed by barry #33756: Python 3.7.0b5 build error https://bugs.python.org/issue33756 closed by ned.deily #33759: Failed separate ServerProxyTestCase tests in test_xmlrpc https://bugs.python.org/issue33759 closed by serhiy.storchaka #33760: Leaked files in test_io https://bugs.python.org/issue33760 closed by serhiy.storchaka #33761: Leaked file in test_iterparse in test_xml_etree https://bugs.python.org/issue33761 closed by serhiy.storchaka #33763: IDLE: Use text widget for code context instead of label widget https://bugs.python.org/issue33763 closed by terry.reedy #33764: AppVeyor builds interrupted before tests complete https://bugs.python.org/issue33764 closed by vstinner #33765: AppVeyor didn't start on my PR 7365 https://bugs.python.org/issue33765 closed by vstinner #33767: Improper use of SystemError in the mmap module https://bugs.python.org/issue33767 closed by serhiy.storchaka #33768: IDLE: click on context line should jump to line, at top of win https://bugs.python.org/issue33768 closed by terry.reedy #33769: Cleanup start_tls() implementation https://bugs.python.org/issue33769 closed by yselivanov #33773: test.support.fd_count(): off-by-one error when listing /proc/s https://bugs.python.org/issue33773 closed by vstinner #33776: Segfault when passing invalid argument to asyncio.ensure_futur https://bugs.python.org/issue33776 closed by yselivanov #33778: update Unicode database to 11.0 https://bugs.python.org/issue33778 closed by 
benjamin.peterson #33781: audioop.c: fbound() casts double to int for its return value https://bugs.python.org/issue33781 closed by vstinner #33784: hash collision in instances of ipaddress.ip_network https://bugs.python.org/issue33784 closed by mark.dickinson #33785: Crash caused by pasting ???????? into IDLE on Windows https://bugs.python.org/issue33785 closed by terry.reedy #33786: @asynccontextmanager doesn't work well with async generators https://bugs.python.org/issue33786 closed by yselivanov #33789: test_asyncio emits ResourceWarning warnings https://bugs.python.org/issue33789 closed by vstinner #33790: Decorated (inner/wrapped) function kwarg defaults dont pass th https://bugs.python.org/issue33790 closed by steven.daprano #33791: Update README.rst to mention third-party OpenSSL needed for ma https://bugs.python.org/issue33791 closed by ned.deily #33792: asyncio: how to set a "Proactor event loop" policy? Issue with https://bugs.python.org/issue33792 closed by yselivanov #33794: Python.framework build is missing 'Current' symlink https://bugs.python.org/issue33794 closed by ned.deily #33795: Memory leak in X509StoreContext class. 
https://bugs.python.org/issue33795 closed by berker.peksag #33796: dataclasses.replace broken if a class has any ClassVars https://bugs.python.org/issue33796 closed by eric.smith #33798: Fix csv module comment regarding dict insertion order https://bugs.python.org/issue33798 closed by inada.naoki #33803: contextvars: hamt_alloc() must initialize h_root and h_count f https://bugs.python.org/issue33803 closed by yselivanov #33806: Cannot re-open an existing telnet session https://bugs.python.org/issue33806 closed by r.david.murray #33807: CONTRIBUTING.rst: 'Stable buildbots' links with 404 errors https://bugs.python.org/issue33807 closed by zach.ware From yselivanov.ml at gmail.com Fri Jun 8 12:12:57 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 8 Jun 2018 12:12:57 -0400 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: On Fri, Jun 8, 2018 at 9:24 AM Ronald Oussoren wrote: [..] > Wouldn't it be enough to visit just the newly tracked object in PyObject_GC_Track with a visitor function that does something minimal to verify that the object value is sane, for example by checking PyType_Ready(Py_TYPE(op)). +1. Yury From xdegaye at gmail.com Fri Jun 8 12:28:36 2018 From: xdegaye at gmail.com (Xavier de Gaye) Date: Fri, 8 Jun 2018 18:28:36 +0200 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: Message-ID: On 06/06/2018 03:51 PM, Nick Coghlan wrote: > On 6 June 2018 at 15:31, Rob Cliffe via Python-Dev > wrote: > ...
> *In other words, it looks as if in Python 3.6.5, the compiled list comprehension > can "see" a pre-existing global variable but not a local one.* > > Yes, this is expected behaviour - the two-namespace form of exec (which is what you get implicitly when you use it inside a function body) is similar to a class body, and hence nested functions > (including the implicit ones created for comprehensions) can't see the top level local variables. In issue 13557 [1] Amaury gives the following explanation by quoting the documentation [2], "Free variables are not resolved in the nearest enclosing namespace, but in the global namespace", and hints at the same solution as the one proposed by Nick. FWIW, in issue 21161 [3] folks have been bitten by this when trying to run a list comprehension in pdb. [1] https://bugs.python.org/issue13557 [2] http://docs.python.org/py3k/reference/executionmodel.html#interaction-with-dynamic-features [3] https://bugs.python.org/issue21161 Xavier From mike at selik.org Fri Jun 8 14:05:08 2018 From: mike at selik.org (Michael Selik) Date: Fri, 8 Jun 2018 11:05:08 -0700 Subject: [Python-Dev] Add __reversed__ methods for dict Message-ID: Am I correct in saying that the consensus is +1 for inclusion in v3.8? The last point in the thread was INADA Naoki researching various implementations and deciding that it's OK to include this feature in 3.8. As I understand it, Guido was in agreement with INADA's advice to wait for MicroPython's implementation of v3.7. Since INADA has changed his mind, I'm guessing it's all in favor? -------------- next part -------------- An HTML attachment was scrubbed... URL: From skip.montanaro at gmail.com Fri Jun 8 14:27:19 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Fri, 8 Jun 2018 13:27:19 -0500 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: Message-ID: > Is this a bug or a feature?
The bug was me being so excited about the new construct (I pushed in someone else's work, can't recall who now, maybe Fredrik Lundh?) that I didn't consider that leaking the loop variable out of the list comprehension was a bad idea. Think of the Py3 behavior as one of those "corrections" to things which were "got wrong" in Python 1 or 2. :-) Skip From guido at python.org Fri Jun 8 14:38:16 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 8 Jun 2018 11:38:16 -0700 Subject: [Python-Dev] Add __reversed__ methods for dict In-Reply-To: References: Message-ID: That sounds right to me. We will then have had two versions where this was the case:
- 3.6, where order preserving was implemented in CPython but not in the language spec
- 3.7, where it was also added to the language spec
On Fri, Jun 8, 2018 at 11:05 AM, Michael Selik wrote: > Am I correct in saying that the consensus is +1 for inclusion in v3.8? > > The last point in the thread was INADA Naoki researching various > implementations and deciding that it's OK to include this feature in 3.8. > As I understand it, Guido was in agreement with INADA's advice to wait for > MicroPython's implementation of v3.7. Since INADA has changed his mind, I'm > guessing it's all in favor? > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Fri Jun 8 16:35:06 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 8 Jun 2018 13:35:06 -0700 Subject: [Python-Dev] 2018 Python Language Summit coverage In-Reply-To: References: <20180606155653.264c9566@gallinule> Message-ID: +1 -- thanks so much for acting as our scribe again!
On Wed, Jun 6, 2018 at 5:47 PM, Eric Snow wrote: > Thanks for doing this Jake. > > -eric > > On Wed, Jun 6, 2018 at 3:56 PM, Jake Edge wrote: > > > > Hola python-dev, > > > > I have been remiss in posting about my coverage from this year's Python > > Language Summit -- not to mention remiss in getting it all written up. > > But I am about half-way done with the sessions from this year. > > > > I am posting SubscriberLinks for articles that are still behind the > > paywall. LWN subscribers can always see our content right away; one > > week after they are published in a weekly edition, they become freely > > available for everyone. SubscriberLinks are a way around the paywall. > > Please feel free to share the SubscriberLinks I am posting here. > > > > The starting point is here: https://lwn.net/Articles/754152/ That is > > an overview article with links to the articles. It will be updated as > > I add more articles. Here is what we have so far: > > > > - Subinterpreter support for Python https://lwn.net/Articles/754162/ > > > > - Modifying the Python object model https://lwn.net/Articles/754163/ > > > > - A Gilectomy update https://lwn.net/Articles/754577/ > > > > - Using GitHub Issues for Python https://lwn.net/Articles/754779/ > > > > - Shortening the Python release schedule > > https://lwn.net/Articles/755224/ > > > > - Unplugging old batteries > > https://lwn.net/SubscriberLink/755229/df78cf17181dbdca/ > > > > Hopefully I captured things reasonably well -- if you have corrections > > or clarifications (or just comments :) , I would recommend posting them > > as comments on the article. > > > > I will post an update soon with the next round (with luck, all of the > > rest of them). > > > > enjoy! 
> > > > jake > > > > -- > > Jake Edge - LWN - jake at lwn.net - http://lwn.net > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > ericsnowcurrently%40gmail.com > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Mon Jun 11 06:23:14 2018 From: nad at python.org (Ned Deily) Date: Mon, 11 Jun 2018 06:23:14 -0400 Subject: [Python-Dev] 3.7.0rc1 and 3.6.6rc happening later today! Message-ID: <6AFB2CAC-BFDA-45C5-BEF2-B2954E63DB36@python.org> Short and sweet: thanks to a *lot* of work by a lot of people, we appear to be about ready to finally tag and manufacture the 3.7.0 release candidate! At the moment, we have no "release blocker" or "deferred blocker" issues open for 3.7 - a first! We also now have 21 out of 22 3.7 "production" buildbots consistently green or occasionally pink (meaning successful test retry) - also quite an accomplishment. (Only the 3.7 AIX PPC64 buildbot remains red but, since we really only support AIX on a "best effort" basis, we are not going to further delay 3.7.0 for it.) We have also had to make some tough decisions and defer some features to 3.8 and a few more complex bug resolutions to 3.7.1 or later. And releasing the "bonus beta", 3.7.0b5, resulted in some good feedback and squashing a few more issues. As you may recall, the most recently updated schedule calls for both 3.7.0rc1 and 3.6.6rc1 to be produced today, 2018-06-11, with the finals coming about two weeks later on 2018-06-27.
I plan to start on 3.6.6rc1 in about 12 hours (around 22:00 UTC) with 3.7.0rc1 to follow soon thereafter. Feel free to use the remaining time to merge any last-minute documentation updates or minor bug fixes - but please do not break anything! When in doubt, ask. (I will be off-line for the next 8 hours or so.) After 3.7.0rc1 cutoff, new 3.7 merges will appear in 3.7.1, which should appear sometime next month (by the end of 2018-07). Likewise, new 3.6 merges will next appear in 3.6.7rc1, by the end of 2018-09. Please continue to exercise diligence when deciding whether a change is appropriate for 3.7; as a rule of thumb, treat the 3.7 branch as if it were already released and in maintenance mode. Please also pay attention to CI test failures and buildbot test failures and see if you can help resolve them. As always, if you think you may have found a critical problem at any time in either release candidate, please open (or reuse) an issue on bugs.python.org and mark it as "release blocker" priority. 3.7.0: here we come, thanks to you! --Ned -- Ned Deily nad at python.org -- [] From rob.cliffe at btinternet.com Mon Jun 11 18:06:36 2018 From: rob.cliffe at btinternet.com (Rob Cliffe) Date: Mon, 11 Jun 2018 23:06:36 +0100 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: Message-ID: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> Skip, I think you have misunderstood the point I was making. It was not whether the loop variable should leak out of a list comprehension. Rather, it was whether a local variable should, so to speak, "leak into" a list comprehension. And the answer is: it depends on whether the code is executed normally, or via exec/eval. Example:

def Test():
    x = 1
    print([x+i for i in range(1,3)])         # Prints [2,3]
    exec('print([x+i for i in range(1,3)])') # Raises NameError (x)
Test()

I (at least at first) found the difference in behaviour surprising.
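For what it's worth, the NameError above can be avoided by giving exec() an explicit globals mapping that contains x, since the comprehension runs in an implicit function scope and looks free names up as globals inside the exec'd code. A minimal sketch (merging globals() and locals() is just one workaround, not the only one):

```python
def test():
    x = 1
    # Merge the function's locals into the mapping used as exec's
    # globals, so the comprehension's global lookup of x succeeds:
    ns = {**globals(), **locals()}
    exec('result = [x + i for i in range(1, 3)]', ns)
    return ns['result']

print(test())  # prints [2, 3]
```

Passing locals() as exec's *locals* argument instead would not help here, because the comprehension's nested scope never consults the outer exec locals.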
Regards Rob Cliffe On 08/06/2018 19:27, Skip Montanaro wrote: >> Is this a bug or a feature? > The bug was me being so excited about the new construct (I pushed in > someone else's work, can't recall who now, maybe Fredrik Lundh?) that > I didn't consider that leaking the loop variable out of the list > comprehension was a bad idea. Think of the Py3 behavior as one of > those "corrections" to things which were "got wrong" in Python 1 or 2. > :-) > > Skip > > From skip.montanaro at gmail.com Mon Jun 11 18:29:37 2018 From: skip.montanaro at gmail.com (Skip Montanaro) Date: Mon, 11 Jun 2018 17:29:37 -0500 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> References: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> Message-ID: > Skip, I think you have misunderstood the point I was making. It was > not whether the loop variable should leak out of a list comprehension. > Rather, it was whether a local variable should, so to speak, "leak into" > a list comprehension. And the answer is: it depends on whether the code > is executed normally, or via exec/eval. > Got it. Yes, you'll have to pass in locals to exec. (Can't verify, as I'm on the train, on my phone.) Builtins like range are global to everything, so no problem there. Your clarification also makes it more of a Python programming question, I think. Skip > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericfahlgren at gmail.com Mon Jun 11 18:31:03 2018 From: ericfahlgren at gmail.com (Eric Fahlgren) Date: Mon, 11 Jun 2018 15:31:03 -0700 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature?
In-Reply-To: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> References: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> Message-ID: On Mon, Jun 11, 2018 at 3:10 PM Rob Cliffe via Python-Dev < python-dev at python.org> wrote: > Skip, I think you have misunderstood the point I was making. It was > not whether the loop variable should leak out of a list comprehension. > Rather, it was whether a local variable should, so to speak, "leak into" > a list comprehension. And the answer is: it depends on whether the code > is executed normally, or via exec/eval. Example: > > def Test(): > x = 1 > print([x+i for i in range(1,3)]) # Prints [2,3] > exec('print([x+i for i in range(1,3)])') # Raises NameError (x) > Test() > > I (at least at first) found the difference in behaviour surprising. > Change 'def' to 'class' and run it again. You'll be even more surprised. -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Mon Jun 11 19:08:50 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 12 Jun 2018 11:08:50 +1200 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> Message-ID: <5B1F0102.6070106@canterbury.ac.nz> Skip Montanaro wrote: > Yes, you'll have to pass in locals to exec. Exec changed between python 2 and 3. It used to be treated specially by the compiler so that it could see and modify the locals where it was used. But now it's just an ordinary function, so you can't expect it to magically know about anything that's not passed into it. -- Greg From nad at python.org Tue Jun 12 04:10:38 2018 From: nad at python.org (Ned Deily) Date: Tue, 12 Jun 2018 04:10:38 -0400 Subject: [Python-Dev] 3.7.0rc1 and 3.6.6rc1 now tagged - on to final!
Message-ID: <92F63555-A1EC-4AD6-9AC5-4A1A0A3EBC7E@python.org> An update: 3.7.0rc1 and 3.6.6rc1 are now tagged and we now move on to the final stages for 3.7.0 and for 3.6.6. The source code has been shipped to our factory (in a tariff-free zone!) where the elves will produce the final bits for the release. They promise to be done soon so stay tuned for the release announcements later today. In the meantime, the 3.7 and 3.6 branches in the cpython repo are open for merges. As of the rc1 cutoffs (tags v3.7.0rc1 and v3.6.6rc1 in the cpython repo), expect merges to 3.7 to appear in 3.7.1 and merges to 3.6 to appear in 3.6.7. Please continue to treat the 3.7 branch as if it were already released and in maintenance mode. Please continue to pay attention to CI test failures and buildbot test failures and see if you can help resolve them. If you find something that may affect either final release, please make sure to open a new issue on bugs.python.org, or update an existing issue, and set the priority to "release blocker". As always, improving the documentation never ceases so keep those updates coming in. Prior to 3.7.0 final and 3.6.6 final, I will review doc changes that have been merged and consider cherry-picking them into the release materials. By the way, don't be fooled: if you build Python from the 3.7 branch at the moment, the version will be "3.7.0rc1+" but changes merged will be in 3.7.1; similarly for 3.6. The clock is now ticking: 15 days until the final releases. Please do what you can to encourage exposure and testing by ourselves and our downstream users. Once again, I want to thank everyone who has been involved so far in helping us through the 3.7 endgame and who have given up their personal time to work on making Python better. I remain deeply grateful. --Ned Upcoming dates: - 2018-06-27 3.7.0 final !!! and 3.6.6 final !! 
- 2018-07-xx 3.7.1 - 2018-09-xx 3.6.7 -- Ned Deily nad at python.org -- [] From rob.cliffe at btinternet.com Tue Jun 12 04:38:47 2018 From: rob.cliffe at btinternet.com (Rob Cliffe) Date: Tue, 12 Jun 2018 09:38:47 +0100 Subject: [Python-Dev] Python3 compiled listcomp can't see local var - bug or feature? In-Reply-To: References: <1921c20f-8177-4f03-ae41-b99d82470a0c@btinternet.com> Message-ID: <11b337f2-f9ad-3146-2089-7c5ccc0ca0aa@btinternet.com> Ah yes, I see what you mean:

class Test():
    x = 1
    print(x)                          # Prints 1
    print([x+i for i in range(1,3)])  # NameError (x)

Anyway, I apologise for posting to Python-Dev on what was a known issue, and turned out to be more me asking for help with development with Python, rather than development of Python. (My original use case was a scripting language that could contain embedded Python code). Thanks to Nick for his original answer. Rob Cliffe On 11/06/2018 23:31, Eric Fahlgren wrote: > On Mon, Jun 11, 2018 at 3:10 PM Rob Cliffe via Python-Dev > > wrote: > > Skip, I think you have misunderstood the point I was making. It was > not whether the loop variable should leak out of a list > comprehension. > Rather, it was whether a local variable should, so to speak, "leak > into" > a list comprehension. And the answer is: it depends on whether > the code > is executed normally, or via exec/eval. Example: > > def Test(): > x = 1 > print([x+i for i in range(1,3)]) # Prints [2,3] > exec('print([x+i for i in range(1,3)])') # Raises NameError (x) > Test() > > I (at least at first) found the difference in behaviour surprising. > > > Change 'def' to 'class' and run it again. You'll be even more > surprised. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From ncoghlan at gmail.com Tue Jun 12 08:38:38 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 12 Jun 2018 22:38:38 +1000 Subject: [Python-Dev] Some data points for the "annual release cadence" concept Message-ID: Hi folks, Given the discussion of adopting an annual release cadence at the language summit this year [1], this recent Jakarta (nee Java) EE announcement caught my attention: https://www.eclipse.org/ee4j/news/?date=2018-06-08#release-cadence JEE are switching to an annual update cadence for the base platform, with quarterly updates for individual components. Since we last seriously discussed potential release cadence changes back in 2012 (with PEPs 407 and 413 [2,3]), that means JEE joins GCC switching to an annual release cadence from GCC 5 back in 2015 (see [4]), while clang/LLVM have been doing twice-annual feature releases for several years now [5]. Most directly relevant to Python would be the Node.js maintenance lifecycle initially developed in 2013, and evolved since then: https://github.com/nodejs/Release That's quite a fast lifecycle (even faster than we're considering - they do a release every 6 months, with every 2nd release being supported for 3 1/2 years), but one of the keys to making it work in practice is https://github.com/nodejs/nan The gist of the "Native Abstractions for Node.js" project is that it aims to provide a stable API & ABI for third party libraries to link against, but *without* having to keep those interfaces stable in V8/Node.js itself. In the Python world, the closest current equivalent would be SIP for PyQt projects [6], which provides a shim layer that allows version independent extension modules to target CPython's native stable ABI with the aid of a single version specific dependency (so only the "sip" wheel itself needs to be rebuilt for each new Python version, not every extension module that depends on it).
So I expect a release cadence change would be a lot more viable now than it would have been 6 years ago, but I also suspect actually getting there will require a non-trivial amount of effort invested in generalising the SIP model such that the stable ABI gets a *lot* easier for projects to realistically target (including for cffi and Cython generated extension modules). Cheers, Nick. [1] https://lwn.net/Articles/755224/ [2] https://www.python.org/dev/peps/pep-0407/ [3] https://www.python.org/dev/peps/pep-0413/ [4] https://gcc.gnu.org/releases.html [5] https://releases.llvm.org/ (Note: LLVM switched to twice-yearly X.0.0 releases in 2017, but were doing twice yearly X.Y releases for some time before that) [6] https://pypi.org/project/SIP/ -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Tue Jun 12 12:23:23 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 12 Jun 2018 09:23:23 -0700 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: Message-ID: So, to summarize, we need something like six for C? On Tue, Jun 12, 2018 at 5:38 AM, Nick Coghlan wrote: > Hi folks, > > Given the discussion of adopting an annual release cadence at the language > summit this year [1], this recent Jakarta (nee Java) EE announcement caught > my attention: https://www.eclipse.org/ee4j/news/?date=2018-06-08#release- > cadence > > JEE are switching to an annual update cadence for the base platform, with > quarterly updates for individual components. > > Since we last seriously discussed potential release cadence changes back > in 2012 (with PEPs 407 and 413 [2,3]), that means JEE joins GCC switching > to an annual release cadence from GCC 5 back in 2015 (see [4]), while > clang/LLVM have been doing twice-annual feature releases for several years > now [5]. 
> > Most directly relevant to Python would be the Node.js maintenance > lifecycle initially developed in 2013, and evolved since then: > https://github.com/nodejs/Release > > That's quite a fast lifecycle (even faster than we're considering - they > do a release every 6 months, with every 2nd release being supported for 3 > 1/2 years), but one of the keys to making it work in practice is > https://github.com/nodejs/nan > > The gist of the "Native Abstractions for Node.js" project is that it aims > to provide a stable API & ABI for third party libraries to link against, > but *without* having to keep those interfaces stable in V8/Node.js itself. > > In the Python world, the closest current equivalent would by SIP for PyQt > projects [6], which provides a shim layer that allows version independent > extension modules to target CPython's native stable ABI with the aid of a > single version specific dependency (so only the "sip" wheel itself needs to > be rebuilt for each new Python version, not every extension module that > depends on it). > > So I expect a release cadence change would be a lot more viable now than > it would have been 6 years ago, but I also suspect actually getting there > will require a non-trivial amount of effort invested in generalising the > SIP model such that the stable ABI gets a *lot* easier for projects to > realistically target (including for cffi and Cython generated extension > modules). > > Cheers, > Nick. 
> > [1] https://lwn.net/Articles/755224/ > [2] https://www.python.org/dev/peps/pep-0407/ > [3] https://www.python.org/dev/peps/pep-0413/ > [4] https://gcc.gnu.org/releases.html > [5] https://releases.llvm.org/ (Note: LLVM switched to twice-yearly X.0.0 > releases in 2017, but were doing twice yearly X.Y releases for some time > before that) > [6] https://pypi.org/project/SIP/ > > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From mariatta.wijaya at gmail.com Tue Jun 12 14:47:13 2018 From: mariatta.wijaya at gmail.com (Mariatta Wijaya) Date: Tue, 12 Jun 2018 11:47:13 -0700 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> <20180607044127.GB12683@ando.pearwood.info> Message-ID: Backing up GitHub data has been brought up since the time we migrated to GitHub, and being tracked here: https://github.com/pytho n/core-workflow/issues/20 TL;DR We'll be using GitHub's new Migrations API to download archived GitHub data of CPython. Ernest is helping us get set up with daily backups of CPython repo to be stored within The PSF's infrastructure. Mariatta On Thu, Jun 7, 2018 at 11:24 AM, Chris Angelico wrote: > On Fri, Jun 8, 2018 at 3:33 AM, Chris Barker - NOAA Federal via > Python-Dev wrote: > > Any service could change or fail. Period. > > > > So we shouldn?t want valuable information about Python development > > only in gitHub. > > > > I don?t know how hard it is to backup / mirror an entire repo ? but it > > sure seems like a good idea. 
> > There are two separate concerns here: > > 1) How do we get a full copy of all of CPython and its change history? > > 2) How do we get all the non-code content - issues, pull requests, > comments? > > The first one is trivially easy. *Everyone* who has a clone of the > repository [1] has a full copy of the code and all history, updated > every time 'git pull' is run. > > The second one depends on GitHub's exporting facilities; but it also > depends on a definition of what's important. Maybe the PSF doesn't > care if people's comments at the bottoms of commits are lost (not to > be confused with commit messages themselves, which are part of the > repo proper), so it wouldn't matter if they're lost. Or maybe it's > important to have the contents of such commits, but it's okay to > credit them to an email address rather than linking to an actual > username. Or whatever. Unlike with the code/history repo, an imperfect > export is still of partial value. > > ChrisA > > [1] Barring shallow clones, but most people don't do those > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > mariatta.wijaya%40gmail.com > ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Tue Jun 12 16:29:38 2018 From: nad at python.org (Ned Deily) Date: Tue, 12 Jun 2018 16:29:38 -0400 Subject: [Python-Dev] [RELEASE] Python 3.7.0rc1 and 3.6.6rc1 are now available Message-ID: <18BD5CA7-EB42-477F-9D14-7EE78CBEB5C2@python.org> Python 3.7.0rc1 and 3.6.6rc1 are now available. 3.7.0rc1 is the final planned release preview of Python 3.7, the next feature release of Python. 3.6.6rc1 is the the release preview of the next maintenance release of Python 3.6, the current release of Python. 
Assuming no critical problems are found prior to *2018-06-27*, the scheduled release dates for 3.7.0 and 3.6.6, no code changes are planned between these release candidates and the final releases. These release candidates are intended to give you the opportunity to test the new features and bug fixes in 3.7.0 and 3.6.6 and to prepare your projects to support them. We strongly encourage you to test your projects and report issues found to bugs.python.org as soon as possible. Please keep in mind that these are preview releases and, thus, their use is not recommended for production environments. Attention macOS users: there is now a new installer variant for macOS 10.9+ that includes a built-in version of Tcl/Tk 8.6. This variant will become the default version when 3.7.0 releases. Check it out! You can find these releases and more information here: https://www.python.org/downloads/release/python-370rc1/ https://www.python.org/downloads/release/python-366rc1/ -- Ned Deily nad at python.org -- [] From doko at ubuntu.com Wed Jun 13 04:25:56 2018 From: doko at ubuntu.com (Matthias Klose) Date: Wed, 13 Jun 2018 10:25:56 +0200 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: Message-ID: <36fc3445-4b61-7c81-3d0c-0d3667347f68@ubuntu.com> On 12.06.2018 18:23, Guido van Rossum wrote: > So, to summarize, we need something like six for C? there is https://github.com/encukou/py3c From doko at ubuntu.com Wed Jun 13 04:27:32 2018 From: doko at ubuntu.com (Matthias Klose) Date: Wed, 13 Jun 2018 10:27:32 +0200 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: Message-ID: On 12.06.2018 14:38, Nick Coghlan wrote: > Since we last seriously discussed potential release cadence changes back in > 2012 (with PEPs 407 and 413 [2,3]), that means JEE joins GCC switching to > an annual release cadence from GCC 5 back in 2015 (see [4]), no, GCC has been doing yearly releases since 2001.
In 2015 they changed the versioning schema, but it's still one major release per year. From ncoghlan at gmail.com Wed Jun 13 09:31:15 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 13 Jun 2018 23:31:15 +1000 Subject: [Python-Dev] Microsoft to acquire GitHub for $7.5 b In-Reply-To: References: <20180604174034.58200616@fsol> <20180604190202.16939ef6@fsol> <87cd38df-c2fe-ab0b-660f-7d1740db9a71@mail.mipt.ru> <20180607044127.GB12683@ando.pearwood.info> Message-ID: On 13 June 2018 at 04:47, Mariatta Wijaya wrote: > Backing up GitHub data has been brought up since the time we migrated to > GitHub, and being tracked here: https://github.com/pytho > n/core-workflow/issues/20 > > TL;DR We'll be using GitHub's new Migrations API > to download archived > GitHub data of CPython. Ernest is helping us get set up with daily backups > of CPython repo to be stored within The PSF's infrastructure. > Nice! Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Wed Jun 13 09:42:42 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 13 Jun 2018 23:42:42 +1000 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: Message-ID: On 13 June 2018 at 02:23, Guido van Rossum wrote: > So, to summarize, we need something like six for C? > Yeah, pretty much - once we can get to the point where it's routine for folks to be building "abiX" or "abiXY" wheels (with the latter not actually being a defined compatibility tag yet, but having the meaning of "targets the stable ABI as first defined in CPython X.Y"), rather than feature release specific "cpXYm" ones, then a *lot* of the extension module maintenance pain otherwise arising from more frequent CPython releases should be avoided. 
There'd still be a lot of other details to work out to turn the proposed release cadence change into a practical reality, but this is the key piece that I think is a primarily technical hurdle: simplifying the current "wheel-per-python-version-per-target-platform" community project build matrices to instead be "wheel-per-target-platform". Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... URL: From ronaldoussoren at mac.com Wed Jun 13 16:30:18 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Wed, 13 Jun 2018 22:30:18 +0200 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: Message-ID: <0004060B-F404-48F7-BF54-0E9DA0EC3952@mac.com> > On 13 Jun 2018, at 15:42, Nick Coghlan wrote: > > On 13 June 2018 at 02:23, Guido van Rossum > wrote: > So, to summarize, we need something like six for C? > > Yeah, pretty much - once we can get to the point where it's routine for folks to be building "abiX" or "abiXY" wheels (with the latter not actually being a defined compatibility tag yet, but having the meaning of "targets the stable ABI as first defined in CPython X.Y"), rather than feature release specific "cpXYm" ones, then a *lot* of the extension module maintenance pain otherwise arising from more frequent CPython releases should be avoided. > > There'd still be a lot of other details to work out to turn the proposed release cadence change into a practical reality, but this is the key piece that I think is a primarily technical hurdle: simplifying the current "wheel-per-python-version-per-target-platform" community project build matrices to instead be "wheel-per-target-platform". This requires getting people to mostly stop using the non-stable ABI, and that could be a lot of work for projects that have existing C extensions that don't use the stable ABI or cython/cffi/…
That said, the CPython API tends to be fairly stable over releases and even without using the stable ABI supporting faster CPython feature releases shouldn't be too onerous, especially for projects with some kind of automation for creating release artefacts (such as a CI system). Ronald -------------- next part -------------- An HTML attachment was scrubbed... URL: From desmoulinmichel at gmail.com Wed Jun 13 16:45:22 2018 From: desmoulinmichel at gmail.com (Michel Desmoulin) Date: Wed, 13 Jun 2018 22:45:22 +0200 Subject: [Python-Dev] A more flexible task creation Message-ID: I was working on a concurrency limiting code for asyncio, so the user may submit as many tasks as one wants, but only a max number of tasks will be submitted to the event loop at the same time. However, I wanted that passing an awaitable would always return a task, no matter if the task was currently scheduled or not. The goal is that you could add done callbacks to it, decide to force schedule it, etc. I dug in the asyncio.Task code, and encountered:

def __init__(self, coro, *, loop=None):
    ...
    self._loop.call_soon(self._step)
    self.__class__._all_tasks.add(self)

I was surprised to see that instantiating a Task class has any side effect at all, let alone 2, and one of them being to be immediately scheduled for execution. I couldn't find a clean way to do what I wanted: either you loop.create_task() and you get a task but it runs, or you don't run anything, but you don't get a nice task object to hold on to. I tried several alternatives, like returning a future, and binding the future awaiting to the submission of a task, but that was complicated code that duplicated a lot of things. I tried creating a custom task, but it was even harder, setting a custom event policy, to provide a custom event loop with my own create_task() accepting parameters. That's a lot to do just to provide a parameter to Task, especially if you already use a custom event loop (e.g: uvloop).
I was expecting to have to create a task factory only, but task factories can't get any additional parameters from create_task(). Additionally I can't use ensure_future(), as it doesn't allow passing any parameters to the underlying Task, so if I want to accept any awaitable in my signature, I need to provide my own custom ensure_future(). All those implementations access a lot of _private_api, and do other shady things that linters hate; plus they are fragile at best. What's more, Task being rewritten in C prevents things like setting self._coro, so we can only inherit from the pure Python slow version. In the end, I can't even await the lazy task, because it blocks the entire program. Hence I have 2 distinct, but independent albeit related, proposals:

- Allow Task to be created but not scheduled for execution, and add a parameter to ensure_future() and create_task() to control this. Awaiting such a task would just do like asyncio.sleep(0) until it is scheduled for execution.
- Add a parameter to ensure_future() and create_task() named "kwargs" that accepts a mapping and will be passed as **kwargs to the underlying created Task.

I insist on the fact that the 2 proposals are independent, so please don't reject both if you don't like one or the other. Passing a parameter to the underlying custom Task is still of value even without the unscheduled instantiation, and vice versa. Also, if somebody has any idea on how to make a LazyTask that we can await on without blocking everything, I'll take it.
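For the record, something close to the lazily-scheduled task described above can be approximated today with only public APIs, by deferring the ensure_future() call until the wrapper is first awaited (or explicitly started). A rough sketch, where LazyTask is an illustrative name rather than a proposed API:

```python
import asyncio

class LazyTask:
    """Wrap a coroutine without scheduling it until first awaited."""

    def __init__(self, coro):
        self._coro = coro
        self._task = None
        self._callbacks = []

    def start(self):
        # Idempotent: schedules the coroutine on the running loop
        # and flushes any callbacks attached before scheduling.
        if self._task is None:
            self._task = asyncio.ensure_future(self._coro)
            for cb in self._callbacks:
                self._task.add_done_callback(cb)
        return self._task

    def add_done_callback(self, cb):
        # Callbacks can be attached before anything is scheduled.
        if self._task is None:
            self._callbacks.append(cb)
        else:
            self._task.add_done_callback(cb)

    def __await__(self):
        return self.start().__await__()
```

Awaiting it (or calling start()) is what hands the coroutine to the event loop; until then the loop never sees it. This sidesteps rather than answers the proposal, since Task itself stays eager.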
From yselivanov.ml at gmail.com Wed Jun 13 17:26:56 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 13 Jun 2018 17:26:56 -0400 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: On Wed, Jun 13, 2018 at 4:47 PM Michel Desmoulin wrote: > > I was working on a concurrency limiting code for asyncio, so the user > may submit as many tasks as one wants, but only a max number of tasks > will be submitted to the event loop at the same time. What does that "concurrency limiting code" do? What problem does it solve? > > However, I wanted that passing an awaitable would always return a task, > no matter if the task was currently scheduled or not. The goal is that > you could add done callbacks to it, decide to force schedule it, etc The obvious advice is to create a new class "DelayedTask" with a Future-like API. You can then schedule the real awaitable that it wraps with `loop.create_task` at any point. Providing "add_done_callback"-like API is trivial. DelayedTask can itself be an awaitable, scheduling itself on a first __await__ call. As a benefit, your implementation will support any Task-like objects that alternative asyncio loops can implement. No need to mess with policies either. > > I dug in the asyncio.Task code, and encountered: > > def __init__(self, coro, *, loop=None): > ... > self._loop.call_soon(self._step) > self.__class__._all_tasks.add(self) > > I was surprised to see that instantiating a Task class has any side > effect at all, let alone 2, and one of them being to be immediately > scheduled for execution. To be fair, implicitly scheduling a task for execution is what all async frameworks (twisted, curio, trio) do when you wrap a coroutine into a task. I don't recall them having a keyword argument to control when the task is scheduled. 
> > I couldn't find a clean way to do what I wanted: either you > loop.create_task() and you get a task but it runs, or you don't run > anything, but you don't get a nice task object to hold on to. A clean way is to create a new layer of abstraction (e.g. DelayedTask I suggested above). [..] > I tried creating a custom task, but it was even harder, setting a custom > event policy, to provide a custom event loop with my own create_task() > accepting parameters. That's a lot to do just to provide a parameter to > Task, especially if you already use a custom event loop (e.g: uvloop). I > was expecting to have to create a task factory only, but task factories > can't get any additional parameters from create_task()). I don't think creating a new Task implementation is needed here, a simple wrapper should work just fine. [..] > Hence I have 2 distinct, but independent albeit related, proposals: > > - Allow Task to be created but not scheduled for execution, and add a > parameter to ensure_future() and create_task() to control this. Awaiting > such a task would just do like asyncio.sleep(O) until it is scheduled > for execution. > > - Add an parameter to ensure_future() and create_task() named "kwargs" > that accept a mapping and will be passed as **kwargs to the underlying > created Task. > > I insist on the fact that the 2 proposals are independent, so please > don't reject both if you don't like one or the other. Passing a > parameter to the underlying custom Task is still of value even without > the unscheduled instantiation, and vice versa. Well, to add a 'kwargs' parameter to ensure_future() we need kwargs in Task.__init__. So far we only have 'loop' and it's not something that ensure_future() should allow you to override. So unless we implement the first proposal, we don't need the second. 
Yury From njs at pobox.com Wed Jun 13 22:09:39 2018 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 13 Jun 2018 19:09:39 -0700 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: How about:

async def wait_to_run(async_fn, *args):
    await wait_for_something()
    return await async_fn(*args)

task = loop.create_task(wait_to_run(myfunc, ...))

----- Whatever strategy you use, you should also think about what semantics you want if one of these delayed tasks is cancelled before it starts. For regular, non-delayed tasks, Trio makes sure that even if it gets cancelled before it starts, then it still gets scheduled and runs until the first cancellation point. This is necessary for correct resource hand-off between tasks:

async def some_task(handle):
    with handle:
        await ...

If we skipped running this task entirely, then the handle wouldn't be closed properly; scheduling it once allows the with block to run, and then get cleaned up by the cancellation exception. I'm not sure but I think asyncio handles pre-cancellation in a similar way. (Yury, do you know?) Now, in the delayed task case, there's a similar issue. If you want to keep the same solution, then you might want to instead write:

# asyncio
async def wait_to_run(async_fn, *args):
    try:
        await wait_for_something()
    except asyncio.CancelledError:
        # have to create a subtask to make it cancellable
        subtask = loop.create_task(async_fn(*args))
        # then cancel it immediately
        subtask.cancel()
        # and wait for the cancellation to be processed
        return await subtask
    else:
        return await async_fn(*args)

In trio, this could be simplified to

# trio
async def wait_to_run(async_fn, *args):
    try:
        await wait_for_something()
    except trio.Cancelled:
        pass
    return await async_fn(*args)

(This works because of trio's "stateful cancellation" - if the whole thing is cancelled, then as soon as async_fn hits a cancellation point the exception will be re-delivered.)
-n On Wed, Jun 13, 2018, 13:47 Michel Desmoulin wrote: > I was working on a concurrency limiting code for asyncio, so the user > may submit as many tasks as one wants, but only a max number of tasks > will be submitted to the event loop at the same time. > > However, I wanted that passing an awaitable would always return a task, > no matter if the task was currently scheduled or not. The goal is that > you could add done callbacks to it, decide to force schedule it, etc > > I dug in the asyncio.Task code, and encountered: > > def __init__(self, coro, *, loop=None): > ... > self._loop.call_soon(self._step) > self.__class__._all_tasks.add(self) > > I was surprised to see that instantiating a Task class has any side > effect at all, let alone 2, and one of them being to be immediately > scheduled for execution. > > I couldn't find a clean way to do what I wanted: either you > loop.create_task() and you get a task but it runs, or you don't run > anything, but you don't get a nice task object to hold on to. > > I tried several alternatives, like returning a future, and binding the > future awaiting to the submission of a task, but that was complicated > code that duplicated a lot of things. > > I tried creating a custom task, but it was even harder, setting a custom > event policy, to provide a custom event loop with my own create_task() > accepting parameters. That's a lot to do just to provide a parameter to > Task, especially if you already use a custom event loop (e.g: uvloop). I > was expecting to have to create a task factory only, but task factories > can't get any additional parameters from create_task()). > > Additionally I can't use ensure_future(), as it doesn't allow to pass > any parameter to the underlying Task, so if I want to accept any > awaitable in my signature, I need to provide my own custom ensure_future(). > > All those implementations access a lot of _private_api, and do other > shady things that linters hate; plus they are fragile at best. 
What's > more, Task being rewritten in C prevents things like setting self._coro, > so we can only inherit from the pure Python slow version. > > In the end, I can't even await the lazy task, because it blocks the > entire program. > > Hence I have 2 distinct, but independent albeit related, proposals: > > - Allow Task to be created but not scheduled for execution, and add a > parameter to ensure_future() and create_task() to control this. Awaiting > such a task would just do like asyncio.sleep(O) until it is scheduled > for execution. > > - Add an parameter to ensure_future() and create_task() named "kwargs" > that accept a mapping and will be passed as **kwargs to the underlying > created Task. > > I insist on the fact that the 2 proposals are independent, so please > don't reject both if you don't like one or the other. Passing a > parameter to the underlying custom Task is still of value even without > the unscheduled instantiation, and vice versa. > > Also, if somebody has any idea on how to make a LazyTask that we can > await on without blocking everything, I'll take it. > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/njs%40pobox.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From tinchester at gmail.com Thu Jun 14 12:38:49 2018 From: tinchester at gmail.com (=?UTF-8?Q?Tin_Tvrtkovi=C4=87?=) Date: Thu, 14 Jun 2018 18:38:49 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: Hi, I've been using asyncio a lot lately and have encountered this problem several times. Imagine you want to do a lot of queries against a database, spawning 10000 tasks in parallel will probably cause a lot of them to fail. What you need in a task pool of sorts, to limit concurrency and do only 20 requests in parallel. 
If we were doing this synchronously, we wouldn't spawn 10000 threads using 10000 connections, we would use a thread pool with a limited number of threads and submit the jobs into its queue. To me, tasks are (somewhat) logically analogous to threads. The solution that first comes to mind is to create an AsyncioTaskExecutor with a submit(coro, *args, **kwargs) method. Put a reference to the coroutine and its arguments into an asyncio queue. Spawn n tasks pulling from this queue and awaiting the coroutines. It'd probably be useful to have this in the stdlib at some point. Date: Wed, 13 Jun 2018 22:45:22 +0200 > From: Michel Desmoulin > To: python-dev at python.org > Subject: [Python-Dev] A more flexible task creation > Message-ID: > Content-Type: text/plain; charset=utf-8 > > I was working on a concurrency limiting code for asyncio, so the user > may submit as many tasks as one wants, but only a max number of tasks > will be submitted to the event loop at the same time. > > However, I wanted that passing an awaitable would always return a task, > no matter if the task was currently scheduled or not. The goal is that > you could add done callbacks to it, decide to force schedule it, etc > > I dug in the asyncio.Task code, and encountered: > > def __init__(self, coro, *, loop=None): > ... > self._loop.call_soon(self._step) > self.__class__._all_tasks.add(self) > > I was surprised to see that instantiating a Task class has any side > effect at all, let alone 2, and one of them being to be immediately > scheduled for execution. > > I couldn't find a clean way to do what I wanted: either you > loop.create_task() and you get a task but it runs, or you don't run > anything, but you don't get a nice task object to hold on to. > > I tried several alternatives, like returning a future, and binding the > future awaiting to the submission of a task, but that was complicated > code that duplicated a lot of things. 
> > I tried creating a custom task, but it was even harder, setting a custom > event policy, to provide a custom event loop with my own create_task() > accepting parameters. That's a lot to do just to provide a parameter to > Task, especially if you already use a custom event loop (e.g: uvloop). I > was expecting to have to create a task factory only, but task factories > can't get any additional parameters from create_task()). > > Additionally I can't use ensure_future(), as it doesn't allow to pass > any parameter to the underlying Task, so if I want to accept any > awaitable in my signature, I need to provide my own custom ensure_future(). > > All those implementations access a lot of _private_api, and do other > shady things that linters hate; plus they are fragile at best. What's > more, Task being rewritten in C prevents things like setting self._coro, > so we can only inherit from the pure Python slow version. > > In the end, I can't even await the lazy task, because it blocks the > entire program. > > Hence I have 2 distinct, but independent albeit related, proposals: > > - Allow Task to be created but not scheduled for execution, and add a > parameter to ensure_future() and create_task() to control this. Awaiting > such a task would just do like asyncio.sleep(O) until it is scheduled > for execution. > > - Add an parameter to ensure_future() and create_task() named "kwargs" > that accept a mapping and will be passed as **kwargs to the underlying > created Task. > > I insist on the fact that the 2 proposals are independent, so please > don't reject both if you don't like one or the other. Passing a > parameter to the underlying custom Task is still of value even without > the unscheduled instantiation, and vice versa. > > Also, if somebody has any idea on how to make a LazyTask that we can > await on without blocking everything, I'll take it. > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yselivanov.ml at gmail.com Thu Jun 14 13:08:15 2018 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 14 Jun 2018 13:08:15 -0400 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtkovi? wrote: > > Hi, > > I've been using asyncio a lot lately and have encountered this problem several times. Imagine you want to do a lot of queries against a database, spawning 10000 tasks in parallel will probably cause a lot of them to fail. What you need in a task pool of sorts, to limit concurrency and do only 20 requests in parallel. > > If we were doing this synchronously, we wouldn't spawn 10000 threads using 10000 connections, we would use a thread pool with a limited number of threads and submit the jobs into its queue. > > To me, tasks are (somewhat) logically analogous to threads. The solution that first comes to mind is to create an AsyncioTaskExecutor with a submit(coro, *args, **kwargs) method. Put a reference to the coroutine and its arguments into an asyncio queue. Spawn n tasks pulling from this queue and awaiting the coroutines. > > It'd probably be useful to have this in the stdlib at some point. Sounds like a good idea! Feel free to open an issue to prototype the API. Yury From gjcarneiro at gmail.com Thu Jun 14 13:33:44 2018 From: gjcarneiro at gmail.com (Gustavo Carneiro) Date: Thu, 14 Jun 2018 18:33:44 +0100 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: On Thu, 14 Jun 2018 at 17:40, Tin Tvrtkovi? wrote: > Hi, > > I've been using asyncio a lot lately and have encountered this problem > several times. Imagine you want to do a lot of queries against a database, > spawning 10000 tasks in parallel will probably cause a lot of them to fail. > What you need in a task pool of sorts, to limit concurrency and do only 20 > requests in parallel. 
> > If we were doing this synchronously, we wouldn't spawn 10000 threads using > 10000 connections, we would use a thread pool with a limited number of > threads and submit the jobs into its queue. > > To me, tasks are (somewhat) logically analogous to threads. The solution > that first comes to mind is to create an AsyncioTaskExecutor with a > submit(coro, *args, **kwargs) method. Put a reference to the coroutine and > its arguments into an asyncio queue. Spawn n tasks pulling from this queue > and awaiting the coroutines. > > It'd probably be useful to have this in the stdlib at some point. > Probably a good idea, yes, because it seems a rather common use case. OTOH, I did something similar but for a different use case. In my case, I have a Watchdog class, that takes a list of (coro, *args, **kwargs). What it does is ensure there is always a task for each of the co-routines running, and watches the tasks, if they crash they are automatically restarted (with logging). Then there is a stop() method to cancel the watchdog-managed tasks and await them. My use case is because I tend to write a lot of singleton-style objects, which need book keeping tasks, or redis pubsub listening tasks, and my primary concern is not starting lots of tasks, it is that the few tasks I have must be restarted if they crash, forever. This is why I think it's not that hard to write "sugar" APIs on top of asyncio, and everyone's needs will be different. The strict API compatibility requirements of core Python stdlib, coupled with the very long feature release life-cycles of Python, make me think this sort of thing perhaps is better built in an utility library on top of asyncio, rather than inside asyncio itself? 18 months is a long long time to iterate on these features. I can't wait for Python 3.8... 
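The Watchdog pattern Gustavo describes can be sketched in a few lines. The class name, restart policy, and API below are guesses for illustration; his actual code may differ, and a real version would add logging and restart backoff.

```python
import asyncio

class Watchdog:
    """Keep one task per coroutine function alive; restart on crash (sketch)."""

    def __init__(self, specs):
        # specs: list of (async_fn, args, kwargs) that must stay running
        self._specs = specs
        self._tasks = []
        self._stopping = False

    async def _supervise(self, fn, args, kwargs):
        while not self._stopping:
            try:
                await fn(*args, **kwargs)
                return                 # clean exit: don't restart
            except asyncio.CancelledError:
                raise                  # stop() wins
            except Exception as exc:
                # real code would log and back off instead of looping hot
                print(f"task {fn.__name__} crashed ({exc!r}), restarting")

    def start(self):
        self._tasks = [
            asyncio.ensure_future(self._supervise(fn, args, kwargs))
            for fn, args, kwargs in self._specs
        ]

    async def stop(self):
        self._stopping = True
        for t in self._tasks:
            t.cancel()
        await asyncio.gather(*self._tasks, return_exceptions=True)

crashes = []

async def flaky():
    if len(crashes) < 2:
        crashes.append(1)
        raise RuntimeError("boom")
    await asyncio.sleep(3600)          # then run "forever"

async def main():
    dog = Watchdog([(flaky, (), {})])
    dog.start()
    await asyncio.sleep(0.05)          # let it crash and restart twice
    await dog.stop()

asyncio.run(main())
print(len(crashes))  # 2
```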
> > Date: Wed, 13 Jun 2018 22:45:22 +0200 >> From: Michel Desmoulin >> To: python-dev at python.org >> Subject: [Python-Dev] A more flexible task creation >> Message-ID: >> Content-Type: text/plain; charset=utf-8 >> >> I was working on a concurrency limiting code for asyncio, so the user >> may submit as many tasks as one wants, but only a max number of tasks >> will be submitted to the event loop at the same time. >> >> However, I wanted that passing an awaitable would always return a task, >> no matter if the task was currently scheduled or not. The goal is that >> you could add done callbacks to it, decide to force schedule it, etc >> >> I dug in the asyncio.Task code, and encountered: >> >> def __init__(self, coro, *, loop=None): >> ... >> self._loop.call_soon(self._step) >> self.__class__._all_tasks.add(self) >> >> I was surprised to see that instantiating a Task class has any side >> effect at all, let alone 2, and one of them being to be immediately >> scheduled for execution. >> >> I couldn't find a clean way to do what I wanted: either you >> loop.create_task() and you get a task but it runs, or you don't run >> anything, but you don't get a nice task object to hold on to. >> >> I tried several alternatives, like returning a future, and binding the >> future awaiting to the submission of a task, but that was complicated >> code that duplicated a lot of things. >> >> I tried creating a custom task, but it was even harder, setting a custom >> event policy, to provide a custom event loop with my own create_task() >> accepting parameters. That's a lot to do just to provide a parameter to >> Task, especially if you already use a custom event loop (e.g: uvloop). I >> was expecting to have to create a task factory only, but task factories >> can't get any additional parameters from create_task()). 
>> >> Additionally I can't use ensure_future(), as it doesn't allow to pass >> any parameter to the underlying Task, so if I want to accept any >> awaitable in my signature, I need to provide my own custom >> ensure_future(). >> >> All those implementations access a lot of _private_api, and do other >> shady things that linters hate; plus they are fragile at best. What's >> more, Task being rewritten in C prevents things like setting self._coro, >> so we can only inherit from the pure Python slow version. >> >> In the end, I can't even await the lazy task, because it blocks the >> entire program. >> >> Hence I have 2 distinct, but independent albeit related, proposals: >> >> - Allow Task to be created but not scheduled for execution, and add a >> parameter to ensure_future() and create_task() to control this. Awaiting >> such a task would just do like asyncio.sleep(O) until it is scheduled >> for execution. >> >> - Add an parameter to ensure_future() and create_task() named "kwargs" >> that accept a mapping and will be passed as **kwargs to the underlying >> created Task. >> >> I insist on the fact that the 2 proposals are independent, so please >> don't reject both if you don't like one or the other. Passing a >> parameter to the underlying custom Task is still of value even without >> the unscheduled instantiation, and vice versa. >> >> Also, if somebody has any idea on how to make a LazyTask that we can >> await on without blocking everything, I'll take it. >> >> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/gjcarneiro%40gmail.com > -- Gustavo J. A. M. Carneiro Gambit Research "The universe is always one step beyond logic." -- Frank Herbert -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From solipsis at pitrou.net Thu Jun 14 15:04:13 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Thu, 14 Jun 2018 21:04:13 +0200 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) References: Message-ID: <20180614210413.68c84824@fsol> On Fri, 8 Jun 2018 09:48:03 +0200 Victor Stinner wrote: > > Question: Do you think that bugs spotted by a GC collection are common > enough to change the GC thresholds in development mode (new -X dev > flag of Python 3.7)? I don't think replacing a more-or-less arbitrary value with another more-or-less arbitrary value is a very useful change. Regards Antoine. From chris.barker at noaa.gov Thu Jun 14 15:14:18 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Thu, 14 Jun 2018 12:14:18 -0700 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to understand the problem here. But if I have this right: I've been using asyncio a lot lately and have encountered this problem > several times. Imagine you want to do a lot of queries against a database, > spawning 10000 tasks in parallel will probably cause a lot of them to fail. > async is not parallel -- all the tasks will be run in the same thread (Unless you explicitly spawn another thread), and only one task is running at once, and the task switching happens when the task specifically releases itself. If it matters in what order the tasks are performed, then you should not be using async. So why do queries fail with 10000 tasks? or ANY number? If the async DB access code is written right, a given query should not "await" unless it is in a safe state to do so. So what am I missing here??? What you need in a task pool of sorts, to limit concurrency and do only 20 > requests in parallel. > still wrapping my head around the vocabulary, but async is not concurrent. 
If we were doing this synchronously, we wouldn't spawn 10000 threads using > 10000 connections, > and threads aren't synchronous -- but they are concurrent. > we would use a thread pool with a limited number of threads and submit the > jobs into its queue. > because threads ARE concurrent, and there is no advantage to having more threads than can actually run at once, and having many more does cause thread-switching performance issues. To me, tasks are (somewhat) logically analogous to threads. > kinda -- in the sense that they are run (and completed) in arbitrary order, But they are different, and that difference is key to this issue. As Yury expressed interest in this idea, there must be something I'm missing. What is it? -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From j.orponen at 4teamwork.ch Thu Jun 14 15:39:21 2018 From: j.orponen at 4teamwork.ch (Joni Orponen) Date: Thu, 14 Jun 2018 21:39:21 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: On Thu, Jun 14, 2018 at 9:17 PM Chris Barker via Python-Dev < python-dev at python.org> wrote: > Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to > understand the problem here. > Vocabulary-wise 'queue depth' might be a suitable mental aid for what people actually want to limit. The practical issue is most likely something to do with hitting timeouts when trying to queue too much work onto a service. -- Joni Orponen -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve.dower at python.org Thu Jun 14 16:03:12 2018 From: steve.dower at python.org (Steve Dower) Date: Thu, 14 Jun 2018 13:03:12 -0700 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: <481c9ae0-9656-b839-c327-6a107e880f5c@python.org> On 14Jun2018 1214, Chris Barker via Python-Dev wrote: > Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying > to understand the problem here. > > But if I have this right: > > I've been using asyncio a lot lately and have encountered this > problem several times. Imagine you want to do a lot of queries > against a database, spawning 10000 tasks in parallel will probably > cause a lot of them to fail. > > > async is not parallel -- all the tasks will be run in the same thread > (Unless you explicitly spawn another thread), and only one task is > running at once, and the task switching happens when the task > specifically releases itself. If the task isn't actually doing the work, but merely waiting for it to finish, then you can end up overloading the thing that *is* doing the task (e.g. the network interface, database server, other thread/process, file system, etc.). Single-threaded async is actually all about *waiting* - it provides a convenient model to do other tasks while you are waiting for the first (as well as a convenient model to indicate what should be done after it completes - there are two conveniences here). If the underlying thing you're doing *can* run in parallel, but becomes less efficient the more times you do it (for example, most file system operations fall into this category), you will want to limit how many tasks you *start*, not just how many you are waiting for. 
I often use semaphores for this when I need it, and it looks like asyncio.Semaphore() is sufficient for this: import asyncio task_limiter = asyncio.Semaphore(4) async def my_task(): await task_limiter.acquire() try: await do_db_request() finally: task_limiter.release() Cheers, Steve From tinchester at gmail.com Thu Jun 14 18:21:41 2018 From: tinchester at gmail.com (=?UTF-8?Q?Tin_Tvrtkovi=C4=87?=) Date: Fri, 15 Jun 2018 00:21:41 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: Other folks have already chimed in, so I'll be to the point. Try writing a simple asyncio web scraper (using maybe the aiohttp library) and create 5000 tasks for scraping different sites. My prediction is a whole lot of them will time out due to various reasons. Other responses inline. On Thu, Jun 14, 2018 at 9:15 PM Chris Barker wrote: > async is not parallel -- all the tasks will be run in the same thread > (Unless you explicitly spawn another thread), and only one task is running > at once, and the task switching happens when the task specifically releases > itself. asyncio is mostly used for IO-heavy workloads (note the name). If you're doing IO in asyncio, it is most definitely parallel. The point of it is having a large number of open network connections at the same time. > So why do queries fail with 10000 tasks? or ANY number? If the async DB > access code is written right, a given query should not "await" unless it is > in a safe state to do so. > Imagine you have a batch job you need to do. You need to fetch a million records from your database, and you can't use a query to get them all - you need a million individual "get" requests. Even if Python was infinitely fast, and your bandwidth was infinite, can your database handle opening a million new connections in parallel, in a very short time? Mine sure can't, even a few hundred extra connections would be a potential problem. 
So you want to do the work in chunks, but still not one by one. > and threads aren't synchronous -- but they are concurrent. > Using threads implies coupling threads with IO. Doing requests one at a time in a given thread. Generally called 'synchronous IO', as opposed to asynchronous IO/asyncio. > because threads ARE concurrent, and there is no advantage to having more > threads than can actually run at once, and having many more does cause > thread-switching performance issues. > Weeell technically threads in CPython aren't really concurrent (when running Python bytecode), but for doing IO they are in practice. When doing IO, there absolutely is an advantage to using more threads than can run at once (in CPython only one thread running Python can run at once). You can test it out yourself by writing a synchronous web scraper (using maybe the requests library) and trying to scrape using a threadpool vs using a single thread. You'll find the threadpool version is much faster. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tinchester at gmail.com Thu Jun 14 18:31:12 2018 From: tinchester at gmail.com (=?UTF-8?Q?Tin_Tvrtkovi=C4=87?=) Date: Fri, 15 Jun 2018 00:31:12 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: <481c9ae0-9656-b839-c327-6a107e880f5c@python.org> References: <481c9ae0-9656-b839-c327-6a107e880f5c@python.org> Message-ID: On Thu, Jun 14, 2018 at 10:03 PM Steve Dower wrote: > I often use > semaphores for this when I need it, and it looks like > asyncio.Semaphore() is sufficient for this: > > > import asyncio > task_limiter = asyncio.Semaphore(4) > > async def my_task(): > await task_limiter.acquire() > try: > await do_db_request() > finally: > task_limiter.release() Yeah, a semaphore logically fits exactly but * I feel this API is somewhat clunky, even if you use an 'async with'. 
* my gut feeling is spawning a thousand tasks and having them all fighting over the same semaphore and scheduling is going to be much less efficient than a small number of tasks draining a queue. -------------- next part -------------- An HTML attachment was scrubbed... URL: From casanova906 at hotmail.com Thu Jun 14 18:43:22 2018 From: casanova906 at hotmail.com (casanova yassine) Date: Thu, 14 Jun 2018 22:43:22 +0000 Subject: [Python-Dev] Python-Dev Digest, Vol 179, Issue 21 Message-ID: The Jseries acknowledgement by using Jetty containers can get you a best resolution to Python wheel asynchronism bugs Sent from an Android smartphone with GMX Mail. On 14/06/2018, 4:00 PM python-dev-request at python.org wrote: On 13 Jun 2018, at 15:42, Nick Coghlan > wrote: On 13 June 2018 at 02:23, Guido van Rossum > wrote: So, to summarize, we need something like six for C? Yeah, pretty much - once we can get to the point where it's routine for folks to be building "abiX" or "abiXY" wheels (with the latter not actually being a defined compatibility tag yet, but having the meaning of "targets the stable ABI as first defined in CPython X.Y"), rather than feature release specific "cpXYm" ones, then a *lot* of the extension module maintenance pain otherwise arising from more frequent CPython releases should be avoided. There'd still be a lot of other details to work out to turn the proposed release cadence change into a practical reality, but this is the key piece that I think is a primarily technical hurdle: simplifying the current "wheel-per-python-version-per-target-platform" community project build matrices to instead be "wheel-per-target-platform". This requires getting people to mostly stop using the non-stable ABI, and that could be a lot of work for projects that have existing C extensions that don't use the stable ABI or cython/cffi/...
That said, the CPython API tends to be fairly stable over releases and even without using the stable ABI supporting faster CPython feature releases shouldn?t be too onerous, especially for projects with some kind of automation for creating release artefacts (such as a CI system). Ronald -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Jun 14 21:46:18 2018 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 14 Jun 2018 18:46:18 -0700 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: <481c9ae0-9656-b839-c327-6a107e880f5c@python.org> Message-ID: On Thu, Jun 14, 2018 at 3:31 PM, Tin Tvrtkovi? wrote: > * my gut feeling is spawning a thousand tasks and having them all fighting > over the same semaphore and scheduling is going to be much less efficient > than a small number of tasks draining a queue. Fundamentally, a Semaphore is a queue: https://github.com/python/cpython/blob/9e7c92193cc98fd3c2d4751c87851460a33b9118/Lib/asyncio/locks.py#L437 ...so the two approaches are more analogous than it might appear at first. The big difference is what objects are in the queue. For a web scraper, the options might be either a queue where each entry is a URL represented as a str, versus a queue where each entry is (effectively) a Task object with attached coroutine object. So I think the main differences you'll see in practice are: - a Task + coroutine aren't terribly big -- maybe a few kilobytes -- but definitely larger than a str; so the Semaphore approach will take more RAM. Modern machines have lots of RAM, so for many use cases this is still probably fine (50,000 tasks is really not that many). But there will certainly be some situations where the str queue fits in RAM but the Task queue doesn't. - If you create all those Task objects up front, then that front-loads a chunk of work (i.e., allocating all those objects!) that otherwise would be spread throughout the queue processing. 
So you'll see a noticeable pause up front before the code starts working. -n -- Nathaniel J. Smith -- https://vorpus.org From steve at holdenweb.com Fri Jun 15 03:56:49 2018 From: steve at holdenweb.com (Steve Holden) Date: Fri, 15 Jun 2018 08:56:49 +0100 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: On Thu, Jun 14, 2018 at 8:14 PM, Chris Barker via Python-Dev < python-dev at python.org> wrote: > Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to > understand the problem here. > > So why do queries fail with 10000 tasks? or ANY number? If the async DB > access code is written right, a given query should not "await" unless it is > in a safe state to do so. > > So what am I missing here??? > > because threads ARE concurrent, and there is no advantage to having more >> threads than can actually run at once, and having many more does cause >> thread-switching performance issues. >> > > To me, tasks are (somewhat) logically analogous to threads. >> > > kinda -- in the sense that they are run (and completed) in arbitrary > order, But they are different, and that difference is key to this issue. > > As Yury expressed interest in this idea, there must be something I'm > missing. > > What is it? > All tasks need resources, and bookkeeping for such tasks is likely to slow things down. More importantly, with an uncontrolled number of tasks you can require an uncontrolled use of resources, decreasing efficiency to levels well below that attainable with sensible conservation of resources. Imagine, if you will, a task that starts by allocating 1GB of memory. Would you want 10,000 of those? -------------- next part -------------- An HTML attachment was scrubbed... 
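The "small number of tasks draining a queue" approach being weighed against the semaphore above can be sketched in a few lines. run_limited is an invented name (not a stdlib or proposed API), and a production version would need per-item error handling so one failure doesn't kill a worker.

```python
import asyncio

async def run_limited(coros, max_workers=20):
    """Run the given coroutines with at most max_workers active at once."""
    coros = list(coros)
    queue = asyncio.Queue()
    for item in enumerate(coros):
        queue.put_nowait(item)       # (index, coroutine) pairs
    results = [None] * len(coros)

    async def worker():
        while True:
            try:
                i, coro = queue.get_nowait()
            except asyncio.QueueEmpty:
                return               # queue drained, worker exits
            results[i] = await coro

    # Only max_workers Task objects exist, no matter how long the queue is.
    await asyncio.gather(*(worker() for _ in range(max_workers)))
    return results

async def main():
    async def fetch(n):              # stand-in for a real DB/HTTP request
        await asyncio.sleep(0)
        return n * n

    return await run_limited([fetch(n) for n in range(10)], max_workers=3)

out = asyncio.run(main())
print(out)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

This is the trade-off Nathaniel describes: the queue holds plain coroutine objects (or, cheaper still, raw arguments), and only a handful of Task objects are ever allocated.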
URL: From desmoulinmichel at gmail.com Fri Jun 15 04:08:21 2018 From: desmoulinmichel at gmail.com (Michel Desmoulin) Date: Fri, 15 Jun 2018 10:08:21 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: > > The strict API compatibility requirements of core Python stdlib, coupled > with the very long feature release life-cycles of Python, make me think > this sort of thing perhaps is better built in an utility library on top > of asyncio, rather than inside asyncio itself? 18 months is a long long > time to iterate on these features. I can't wait for Python 3.8... > A lot of my late requests come from my attempt to group some of that in a lib: https://github.com/Tygs/ayo Most of it works, although I got rid of context() recently, but the lazy task part really fails. Indeed, the API allows you to do: async with ayo.scope() as run: task_list = run.all(foo(), foo(), foo()) run.asap(bar()) await task_list.gather() run.asap(baz()) scope() returns a nursery-like object, and this works perfectly, with the usual guarantees of Trio's nursery, but working in asyncio right now. However, I tried to add to the mix: async with ayo.scope(max_concurrency=2) as run: task_list = run.all(foo(), foo(), foo()) run.asap(bar()) await task_list.gather() run.asap(baz()) And I can't get it to work: task_list will right now contain a list of tasks and None, because some tasks are not scheduled immediately. That's why I wanted lazy tasks. I tried to create my own lazy tasks, but it never really worked. I'm going to try to go down the road of wrapping the unscheduled coro in a future-like object as suggested by Yury. But having that built-in seems logical, elegant, and just good design in general: __init__ should not have side effects.
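One way around the "list of tasks and None" problem that doesn't need lazy tasks at all: move the max_concurrency limit into a semaphore guard wrapped around each coroutine *before* task creation, so every submission yields a real Task immediately while only the allowed number of bodies actually run. A sketch (not ayo's actual implementation; the names are made up):

```python
import asyncio

def scoped_create_task(coro, sem):
    """Return a real Task at submission time; the semaphore delays the body."""
    async def guarded():
        async with sem:
            return await coro
    return asyncio.ensure_future(guarded())

async def main():
    sem = asyncio.Semaphore(2)     # max_concurrency=2
    running = 0
    peak = 0

    async def job():
        nonlocal running, peak
        running += 1
        peak = max(peak, running)
        await asyncio.sleep(0.01)
        running -= 1

    tasks = [scoped_create_task(job(), sem) for _ in range(6)]
    # Every submission produced a real Task handle, no Nones:
    assert all(isinstance(t, asyncio.Task) for t in tasks)
    await asyncio.gather(*tasks)
    return peak

peak = asyncio.run(main())
print(peak)  # 2
```

The handles support done callbacks and cancellation from the moment of submission, which is the property task_list was missing.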
From desmoulinmichel at gmail.com Fri Jun 15 04:21:07 2018 From: desmoulinmichel at gmail.com (Michel Desmoulin) Date: Fri, 15 Jun 2018 10:21:07 +0200 Subject: [Python-Dev] A more flexible task creation In-Reply-To: References: Message-ID: <2ba32fea-3291-ef76-28b5-f78ce222383b@gmail.com> On 14/06/2018 at 04:09, Nathaniel Smith wrote: > How about: > > async def wait_to_run(async_fn, *args): > await wait_for_something() > return await async_fn(*args) > > task = loop.create_task(wait_to_run(myfunc, ...)) > It's quite elegant, although figuring out the wait_for_something() is going to be tricky. > ----- > > Whatever strategy you use, you should also think about what semantics > you want if one of these delayed tasks is cancelled before it starts. > > For regular, non-delayed tasks, Trio makes sure that even if it gets > cancelled before it starts, then it still gets scheduled and runs until > the first cancellation point. This is necessary for correct resource > hand-off between tasks: > > async def some_task(handle): > with handle: > await ... > > If we skipped running this task entirely, then the handle wouldn't be > closed properly; scheduling it once allows the with block to run, and > then get cleaned up by the cancellation exception. I'm not sure but I > think asyncio handles pre-cancellation in a similar way. (Yury, do you > know?) > > Now, in delayed task case, there's a similar issue. If you want to keep > the same solution, then you might want to instead write: > > # asyncio > async def wait_to_run(async_fn, *args): > try: > await wait_for_something() > except asyncio.CancelledError: > # have to create a subtask to make it cancellable > subtask = loop.create_task(async_fn(*args)) > # then cancel it immediately > subtask.cancel() > # and wait for the cancellation to be processed > return await subtask > else: > 
return await async_fn(*args)
>
> In trio, this could be simplified to
>
> # trio
> async def wait_to_run(async_fn, *args):
>     try:
>         await wait_for_something()
>     except trio.Cancelled:
>         pass
>     return await async_fn(*args)
>
> (This works because of trio's "stateful cancellation" - if the whole
> thing is cancelled, then as soon as async_fn hits a cancellation point
> the exception will be re-delivered.)

Thanks for the tip. It schedules the task in all cases, but I don't know
what asyncio does with it. I'll add a unit test for that.

From ncoghlan at gmail.com Fri Jun 15 07:00:46 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 15 Jun 2018 21:00:46 +1000
Subject: [Python-Dev] Some data points for the "annual release cadence" concept
In-Reply-To: <0004060B-F404-48F7-BF54-0E9DA0EC3952@mac.com>
References: <0004060B-F404-48F7-BF54-0E9DA0EC3952@mac.com>
Message-ID:

On 14 June 2018 at 06:30, Ronald Oussoren wrote:
> On 13 Jun 2018, at 15:42, Nick Coghlan wrote:
>
> Yeah, pretty much - once we can get to the point where it's routine for
> folks to be building "abiX" or "abiXY" wheels (with the latter not actually
> being a defined compatibility tag yet, but having the meaning of "targets
> the stable ABI as first defined in CPython X.Y"), rather than feature
> release specific "cpXYm" ones, then a *lot* of the extension module
> maintenance pain otherwise arising from more frequent CPython releases
> should be avoided.
>
> There'd still be a lot of other details to work out to turn the proposed
> release cadence change into a practical reality, but this is the key piece
> that I think is a primarily technical hurdle: simplifying the current
> "wheel-per-python-version-per-target-platform" community project build
> matrices to instead be "wheel-per-target-platform".
> This requires getting people to mostly stop using the non-stable ABI, and
> that could be a lot of work for projects that have existing C extensions
> that don't use the stable ABI or cython/cffi/...
>
> That said, the CPython API tends to be fairly stable over releases and
> even without using the stable ABI supporting faster CPython feature
> releases shouldn't be too onerous, especially for projects with some kind
> of automation for creating release artefacts (such as a CI system).

Right, there would still be a non-zero impact on projects that ship
binary artifacts.

Having a viable stable ABI as a target just allows third party projects
to make the trade-off between the upfront cost of migrating to the
stable ABI (but then only needing to rebuild binaries when their own
code changes), and the ongoing cost of maintaining an extra few sets of
binary wheel archives. I think asking folks to make that trade-off on a
case by case basis is reasonable, whereas back in the previous
discussion I considered *only* offering the second option to be
unreasonable.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gjcarneiro at gmail.com Fri Jun 15 08:21:00 2018
From: gjcarneiro at gmail.com (Gustavo Carneiro)
Date: Fri, 15 Jun 2018 13:21:00 +0100
Subject: [Python-Dev] A more flexible task creation
In-Reply-To:
References:
Message-ID:

On Fri, 15 Jun 2018 at 09:18, Michel Desmoulin wrote:

> > The strict API compatibility requirements of core Python stdlib, coupled
> > with the very long feature release life-cycles of Python, make me think
> > this sort of thing perhaps is better built in a utility library on top
> > of asyncio, rather than inside asyncio itself? 18 months is a long long
> > time to iterate on these features. I can't wait for Python 3.8...
> A lot of my late requests come from my attempt to group some of that in
> a lib: https://github.com/Tygs/ayo

Ah, good idea.

> Most of it works, although I got rid of context() recently, but the
> lazy task part really fails.
>
> Indeed, the API allows you to do:
>
> async with ayo.scope() as run:
>     task_list = run.all(foo(), foo(), foo())
>     run.asap(bar())
>     await task_list.gather()
>     run.asap(baz())
>
> scope() returns a nursery-like object, and this works perfectly, with the
> usual guarantees of Trio's nursery, but working in asyncio right now.

To be honest, I see "async with" being abused everywhere in asyncio,
lately. I like to have objects with start() and stop() methods, but
everywhere I see async context managers.

Fine, add nursery or whatever, but please also have a simple start() /
stop() public API.

"async with" is only good for functional programming. If you want to go
more of an object-oriented style, you tend to have start() and stop()
methods in your classes, which will call start() & stop() (or close())
methods recursively on nested resources. Some of the libraries (aiopg,
I'm looking at you) don't support start/stop or open/close well.

> However, I tried to add to the mix:
>
> async with ayo.scope(max_concurrency=2) as run:
>     task_list = run.all(foo(), foo(), foo())
>     run.asap(bar())
>     await task_list.gather()
>     run.asap(baz())
>
> And I can't get it to work: task_list will right now contain a list of
> tasks and None, because some tasks are not scheduled immediately. That's
> why I wanted lazy tasks. I tried to create my own lazy tasks, but it
> never really worked. I'm going to try to go down the road of wrapping
> the unscheduled coro in a future-like object as suggested by Yury. But
> having that built-in seems logical, elegant, and just good design in
> general: __init__ should not have side effects.
I tend to slightly agree, but OTOH if asyncio had been designed to not
schedule tasks automatically on __init__, I bet there would have been
other users complaining "why didn't task XX run?" or "why do tasks need
a start() method, that is clunky!". You can't please everyone...

Also, in

    task_list = run.all(foo(), foo(), foo())

as soon as you call foo(), you are instantiating a coroutine, which
consumes memory, while the task may not even be scheduled for a long
time (if you have 5000 potential tasks but only execute 10 at a time,
for example).

But if you do as Yury suggested, you instead accept a function
reference, foo, which is a singleton: you can have many references to
the function, but coroutine objects are only created when the task is
actually about to be scheduled, so it's more efficient in terms of
memory.

-- 
Gustavo J. A. M. Carneiro
Gambit Research
"The universe is always one step beyond logic." -- Frank Herbert

From status at bugs.python.org Fri Jun 15 12:10:01 2018
From: status at bugs.python.org (Python tracker)
Date: Fri, 15 Jun 2018 18:10:01 +0200 (CEST)
Subject: [Python-Dev] Summary of Python tracker Issues
Message-ID: <20180615161001.7AFF4561C7@psf.upfronthosting.co.za>

ACTIVITY SUMMARY (2018-06-08 - 2018-06-15)
Python tracker at https://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    6691 ( -1)
  closed 38930 (+62)
  total  45621 (+61)

Open issues with patches: 2644

Issues opened (42)
==================

#33656: IDLE: Turn on DPI awareness on Windows
https://bugs.python.org/issue33656 reopened by terry.reedy

#33811: asyncio accepting connection limit
https://bugs.python.org/issue33811 opened by lguo

#33813: Update overdue 'Deprecated ...
removed in 3.x' messages
https://bugs.python.org/issue33813 opened by terry.reedy

#33816: New metaclass example for Data Model topic
https://bugs.python.org/issue33816 opened by adelfino

#33817: PyString_FromFormatV() fails to build empty strings
https://bugs.python.org/issue33817 opened by Tey

#33820: IDLE subsection of What's New 3.6
https://bugs.python.org/issue33820 opened by terry.reedy

#33821: IDLE subsection of What's New 3.7
https://bugs.python.org/issue33821 opened by terry.reedy

#33822: IDLE subsection of What's New 3.8
https://bugs.python.org/issue33822 opened by terry.reedy

#33823: A BUG in concurrent/asyncio
https://bugs.python.org/issue33823 opened by Python++

#33824: Settign LANG=C modifies the --version behavior
https://bugs.python.org/issue33824 opened by hroncok

#33826: enable discovery of class source code in IPython interactively
https://bugs.python.org/issue33826 opened by t-vi

#33829: C API: provide new object protocol helper
https://bugs.python.org/issue33829 opened by Bartosz Gołaszewski

#33830: Error in the output of one example in the httplib docs
https://bugs.python.org/issue33830 opened by Aifu LIU

#33832: Make "magic methods" a little more discoverable in the docs
https://bugs.python.org/issue33832 opened by adelfino

#33833: ProactorEventLoop raises AssertionError
https://bugs.python.org/issue33833 opened by twisteroid ambassador

#33834: Test for ProactorEventLoop logs InvalidStateError
https://bugs.python.org/issue33834 opened by twisteroid ambassador

#33836: [Good first-time issue] Recommend keyword-only param for memoi
https://bugs.python.org/issue33836 opened by zach.ware

#33837: Closing asyncio.Server on asyncio.ProactorEventLoop causes all
https://bugs.python.org/issue33837 opened by mliska

#33838: Very slow upload with http.client on Windows when setting time
https://bugs.python.org/issue33838 opened by ivknv

#33839: IDLE tooltips.py: refactor and add docstrings and tests
https://bugs.python.org/issue33839 opened by terry.reedy
#33840: connection limit on listening socket in asyncio https://bugs.python.org/issue33840 opened by lguo #33841: lock not released in threading.Condition https://bugs.python.org/issue33841 opened by lev.maximov #33842: Remove tarfile.filemode https://bugs.python.org/issue33842 opened by inada.naoki #33843: Remove deprecated stuff in cgi module https://bugs.python.org/issue33843 opened by inada.naoki #33846: Misleading error message in urllib.parse.unquote https://bugs.python.org/issue33846 opened by thet #33847: doc: Add '@' operator entry to index https://bugs.python.org/issue33847 opened by adelfino #33851: 3.7 regression: ast.get_docstring() for a node that lacks a do https://bugs.python.org/issue33851 opened by mgedmin #33852: doc Remove parentheses from sequence subscription description https://bugs.python.org/issue33852 opened by adelfino #33854: doc Add PEP title in seealso of Built-in Types https://bugs.python.org/issue33854 opened by adelfino #33855: IDLE: Minimally test every non-startup module. https://bugs.python.org/issue33855 opened by terry.reedy #33856: Type "help" is not present on win32 https://bugs.python.org/issue33856 opened by matrixise #33857: python exception on Solaris : code for hash blake2b was not fo https://bugs.python.org/issue33857 opened by goron #33858: A typo in multiprocessing documentation https://bugs.python.org/issue33858 opened by aruseni #33859: Spelling mistakes found using aspell https://bugs.python.org/issue33859 opened by xtreak #33861: Minor improvements of tests for os.path. 
https://bugs.python.org/issue33861 opened by serhiy.storchaka #33864: collections.abc.ByteString does not register memoryview https://bugs.python.org/issue33864 opened by fried #33865: [EASY] Missing code page aliases: "unknown encoding: 874" https://bugs.python.org/issue33865 opened by winvinc #33866: Stop using OrderedDict in enum https://bugs.python.org/issue33866 opened by inada.naoki #33867: Module dicts are wiped on module garbage collection https://bugs.python.org/issue33867 opened by natedogith1 #33868: test__xxsubinterpreters: test_subinterpreter() fails randomly https://bugs.python.org/issue33868 opened by vstinner #33869: doc Add set, frozen set, and tuple entries to Glossary https://bugs.python.org/issue33869 opened by adelfino #33870: pdb continue + breakpoint https://bugs.python.org/issue33870 opened by philiprowlands Most recent 15 issues with no replies (15) ========================================== #33870: pdb continue + breakpoint https://bugs.python.org/issue33870 #33869: doc Add set, frozen set, and tuple entries to Glossary https://bugs.python.org/issue33869 #33867: Module dicts are wiped on module garbage collection https://bugs.python.org/issue33867 #33866: Stop using OrderedDict in enum https://bugs.python.org/issue33866 #33861: Minor improvements of tests for os.path. 
https://bugs.python.org/issue33861 #33859: Spelling mistakes found using aspell https://bugs.python.org/issue33859 #33857: python exception on Solaris : code for hash blake2b was not fo https://bugs.python.org/issue33857 #33856: Type "help" is not present on win32 https://bugs.python.org/issue33856 #33854: doc Add PEP title in seealso of Built-in Types https://bugs.python.org/issue33854 #33847: doc: Add '@' operator entry to index https://bugs.python.org/issue33847 #33843: Remove deprecated stuff in cgi module https://bugs.python.org/issue33843 #33842: Remove tarfile.filemode https://bugs.python.org/issue33842 #33841: lock not released in threading.Condition https://bugs.python.org/issue33841 #33838: Very slow upload with http.client on Windows when setting time https://bugs.python.org/issue33838 #33834: Test for ProactorEventLoop logs InvalidStateError https://bugs.python.org/issue33834 Most recent 15 issues waiting for review (15) ============================================= #33869: doc Add set, frozen set, and tuple entries to Glossary https://bugs.python.org/issue33869 #33866: Stop using OrderedDict in enum https://bugs.python.org/issue33866 #33865: [EASY] Missing code page aliases: "unknown encoding: 874" https://bugs.python.org/issue33865 #33859: Spelling mistakes found using aspell https://bugs.python.org/issue33859 #33855: IDLE: Minimally test every non-startup module. 
https://bugs.python.org/issue33855 #33854: doc Add PEP title in seealso of Built-in Types https://bugs.python.org/issue33854 #33851: 3.7 regression: ast.get_docstring() for a node that lacks a do https://bugs.python.org/issue33851 #33847: doc: Add '@' operator entry to index https://bugs.python.org/issue33847 #33843: Remove deprecated stuff in cgi module https://bugs.python.org/issue33843 #33842: Remove tarfile.filemode https://bugs.python.org/issue33842 #33839: IDLE tooltips.py: refactor and add docstrings and tests https://bugs.python.org/issue33839 #33836: [Good first-time issue] Recommend keyword-only param for memoi https://bugs.python.org/issue33836 #33833: ProactorEventLoop raises AssertionError https://bugs.python.org/issue33833 #33832: Make "magic methods" a little more discoverable in the docs https://bugs.python.org/issue33832 #33829: C API: provide new object protocol helper https://bugs.python.org/issue33829 Top 10 most discussed issues (10) ================================= #33656: IDLE: Turn on DPI awareness on Windows https://bugs.python.org/issue33656 26 msgs #33630: test_posix: TestPosixSpawn fails on PPC64 Fedora 3.x https://bugs.python.org/issue33630 22 msgs #1529353: Squeezer - squeeze large output in the interpreter https://bugs.python.org/issue1529353 14 msgs #14102: argparse: add ability to create a man page https://bugs.python.org/issue14102 12 msgs #33462: reversible dict https://bugs.python.org/issue33462 10 msgs #33738: PyIndex_Check conflicts with PEP 384 https://bugs.python.org/issue33738 9 msgs #29750: smtplib doesn't handle unicode passwords https://bugs.python.org/issue29750 7 msgs #33865: [EASY] Missing code page aliases: "unknown encoding: 874" https://bugs.python.org/issue33865 7 msgs #27777: cgi.FieldStorage can't parse simple body with Content-Length a https://bugs.python.org/issue27777 6 msgs #32962: test_gdb fails in debug build with `-mcet -fcf-protection -O0` https://bugs.python.org/issue32962 6 msgs Issues closed (56) 
================== #11874: argparse assertion failure with brackets in metavars https://bugs.python.org/issue11874 closed by ncoghlan #17045: Improve C-API doc for PyTypeObject https://bugs.python.org/issue17045 closed by eric.snow #17286: Make subprocess handling text output with universal_newlines m https://bugs.python.org/issue17286 closed by cheryl.sabella #19382: tabnanny unit tests https://bugs.python.org/issue19382 closed by vstinner #22264: Add wsgiref.util.dump_wsgistr & load_wsgistr https://bugs.python.org/issue22264 closed by ncoghlan #22555: Tracking issue for adjustments to binary/text boundary handlin https://bugs.python.org/issue22555 closed by ncoghlan #23869: Initialization is being done in PyType_GenericAlloc https://bugs.python.org/issue23869 closed by eric.snow #24384: difflib.SequenceMatcher faster quick_ratio with lower bound sp https://bugs.python.org/issue24384 closed by floyd #26698: Tk DPI awareness https://bugs.python.org/issue26698 closed by terry.reedy #27397: email.message.Message.get_payload(decode=True) raises Assertio https://bugs.python.org/issue27397 closed by taleinat #29456: bugs in unicodedata.normalize: u1176, u11a7 and u11c3 https://bugs.python.org/issue29456 closed by xiang.zhang #30805: asyncio: race condition with debug and subprocess https://bugs.python.org/issue30805 closed by yselivanov #30820: email.contentmanager.raw_data_manager fails to create multipar https://bugs.python.org/issue30820 closed by r.david.murray #31102: deheader: double #incude of the same file https://bugs.python.org/issue31102 closed by Mariatta #31120: [2.7] Python 64 bit _ssl compile fails due missing buildinf_am https://bugs.python.org/issue31120 closed by zach.ware #31181: Segfault in gcmodule.c:360 visit_decref (PyObject_IS_GC(op)) https://bugs.python.org/issue31181 closed by vstinner #31215: Add version changed notes for OpenSSL 1.1.0 compatibility https://bugs.python.org/issue31215 closed by ncoghlan #31378: Missing documentation for 
sqlite3.OperationalError https://bugs.python.org/issue31378 closed by berker.peksag #32400: inspect.isdatadescriptor false negative https://bugs.python.org/issue32400 closed by berker.peksag #33375: warnings: get filename from frame.f_code.co_filename https://bugs.python.org/issue33375 closed by brett.cannon #33409: Clarify the interaction between locale coercion & UTF-8 mode https://bugs.python.org/issue33409 closed by ncoghlan #33582: formatargspec deprecated but does not emit DeprecationWarning. https://bugs.python.org/issue33582 closed by ned.deily #33642: IDLE: Display up to maxlines non-blank lines for Code Context https://bugs.python.org/issue33642 closed by terry.reedy #33671: Efficient zero-copy for shutil.copy* functions (Linux, OSX and https://bugs.python.org/issue33671 closed by giampaolo.rodola #33694: test_asyncio: test_start_tls_server_1() fails on Python on x86 https://bugs.python.org/issue33694 closed by vstinner #33741: UnicodeEncodeError onsmtplib.login(MAIL_USER, MAIL_PASSWORD) https://bugs.python.org/issue33741 closed by r.david.murray #33745: 3.7.0b5 changes the line number of empty functions with docstr https://bugs.python.org/issue33745 closed by ned.deily #33748: test_discovery_failed_discovery in test_unittest modifies sys. https://bugs.python.org/issue33748 closed by ned.deily #33751: Failed separate testTruncateOnWindows in test_file https://bugs.python.org/issue33751 closed by serhiy.storchaka #33766: Grammar Incongruence https://bugs.python.org/issue33766 closed by terry.reedy #33770: base64 throws 'incorrect padding' exception when the issue is https://bugs.python.org/issue33770 closed by ned.deily #33771: Module: timeit. 
According to documentation default_repeat shou https://bugs.python.org/issue33771 closed by terry.reedy #33799: Remove non-ordered dicts comments from FAQ https://bugs.python.org/issue33799 closed by adelfino #33800: Fix default argument for parameter dict_type of ConfigParser/R https://bugs.python.org/issue33800 closed by ned.deily #33801: Remove non-ordered dict comment from plistlib https://bugs.python.org/issue33801 closed by ned.deily #33802: Regression in logging configuration https://bugs.python.org/issue33802 closed by barry #33810: Remove unused code in datetime module https://bugs.python.org/issue33810 closed by belopolsky #33812: Different behavior between datetime.py and its C accelerator https://bugs.python.org/issue33812 closed by belopolsky #33814: exec() maybe has a memory leak https://bugs.python.org/issue33814 closed by tim.peters #33815: List a = []*19 doesn't create a list with index length of 19 https://bugs.python.org/issue33815 closed by zach.ware #33818: Make PyExceptionClass_Name returning a const string https://bugs.python.org/issue33818 closed by serhiy.storchaka #33819: Mention "ordered mapping" instead of "ordered dictionary" in e https://bugs.python.org/issue33819 closed by r.david.murray #33825: Change mentions of "magic" attributes to "special" https://bugs.python.org/issue33825 closed by adelfino #33827: Generators with lru_cache can be non-intuituve https://bugs.python.org/issue33827 closed by rhettinger #33828: Add versionchanged notes for string.Formatter https://bugs.python.org/issue33828 closed by xiang.zhang #33831: Make htmlview work in make.bat https://bugs.python.org/issue33831 closed by adelfino #33835: Too strong side effect? 
https://bugs.python.org/issue33835 closed by serhiy.storchaka #33844: Writing capital letters with csvwriter.writerow changes the cs https://bugs.python.org/issue33844 closed by serhiy.storchaka #33845: Update Doc\make.bat on 2.7 to bring it on par to master versio https://bugs.python.org/issue33845 closed by steve.dower #33848: Incomplete format string syntax for Exceptions https://bugs.python.org/issue33848 closed by eric.smith #33849: Caught infinite recursion resets the trace function https://bugs.python.org/issue33849 closed by belopolsky #33850: Json.dump() bug when using generator https://bugs.python.org/issue33850 closed by eric.smith #33853: test_multiprocessing_spawn is leaking memory https://bugs.python.org/issue33853 closed by serhiy.storchaka #33860: doc Avoid "ordered dictionary" references now that dictionarie https://bugs.python.org/issue33860 closed by ethan.furman #33862: doc Fix Enum __members__ type https://bugs.python.org/issue33862 closed by ethan.furman #33863: Enum doc correction relating to __members__ https://bugs.python.org/issue33863 closed by ethan.furman From robertwb at gmail.com Sun Jun 17 02:02:05 2018 From: robertwb at gmail.com (Robert Bradshaw) Date: Sat, 16 Jun 2018 23:02:05 -0700 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B01D74C.3000506@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> Message-ID: Having had some time to let this settle for a bit, I hope it doesn't get abandoned just because it was to complicated to come to a conclusion. I'd like to attempt to summarize the main ideas as follows. 
1) Currently the "fast call" optimization is implemented by checking
explicitly for a set of types (builtin functions, methods, method
descriptors, and functions). This is ugly, as it requires listing
several cases, and it also locks any other types out of the protocol.
This PEP proposes elevating it to a contract that other types can
participate in.

2) inspect and friends use hard-coded checks on these non-extendable
types, again making it difficult for other types to be truly
first-class citizens, and breaking attempts at duck typing.

3) The current hierarchy of builtin_function_or_method vs. function vs.
instancemethod could use some cleanup for consistency and extensibility.

PEP 575 solves all of these by introducing a common base class, but they
are somewhat separable.

As for complexity, there are two metrics: the complexity of the delta
(e.g. more lines of code in trickier places = worse, paid once) and of
the final result (less code, less special casing = better, paid as long
as the code is in use). I tend to think it's a good tradeoff to pay the
former to improve the latter.

Jeroen, is this a fair summary? Are they fully separable? Others, are
these three valuable goals? At what cost (e.g. (3) may have backwards
compatibility concerns if taken as far as possible)?

- Robert

On Sun, May 20, 2018 at 1:15 PM, Jeroen Demeyer wrote:
> On 2018-05-19 15:29, Nick Coghlan wrote:
>>
>> That's not how code reviews work, as their complexity is governed by the
>> number of lines changed (added/removed/modified), not just the number of
>> lines that are left at the end.
>
> Of course, you are right. I didn't mean literally that only the end result
> matters. But it should certainly be considered.
>
> If you only do small incremental changes, complexity tends to build up
> because choices which are locally optimal are not always globally optimal.
> Sometimes you need to do some refactoring to revisit some of that
> complexity.
> This is part of what PEP 575 does.
>
>> That said, "deletes more lines than it
>> adds" is typically a point strongly in favour of a particular change.
>
> This certainly won't be true for my patch, because there is a lot of code
> that I need to support for backwards compatibility (all the old code for
> method_descriptor in particular).
>
> Going back to the review of PEP 575, I see the following possible outcomes:
>
> (A) Accept it as is (possibly with minor changes).
>
> (B) Accept the general idea but split the details up in several PEPs which
> can still be discussed individually.
>
> (C) Accept a minimal variant of PEP 575, only changing existing classes but
> not changing the class hierarchy.
>
> (D) Accept some yet-to-be-written variant of PEP 575.
>
> (E) Don't fix the use case that PEP 575 wants to address.
>
> Petr Viktorin suggests (C). I am personally quite hesitant because that only
> adds complexity and it wouldn't be the best choice for the future
> maintainability of CPython. I also fear that this hypothetical PEP variant
> would be rejected because of that reason. Of course, if there is some
> general agreement that (C) is the way to go, then that is fine for me.
>
> If people feel that PEP 575 is currently too complex, I think that (B) is a
> very good compromise. The end result would be the same as what PEP 575
> proposes. Instead of changing many things at once, we could handle each
> class in a separate PEP. But the motivation of those mini-PEPs will still be
> PEP 575. So, in order for this to make sense, the general idea of PEP 575
> needs to be accepted: adding a base_function base class and making various
> existing classes subclasses of that.
>
> Jeroen.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/robertwb%40gmail.com

From J.Demeyer at UGent.be Sun Jun 17 05:00:10 2018
From: J.Demeyer at UGent.be (Jeroen Demeyer)
Date: Sun, 17 Jun 2018 11:00:10 +0200
Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update
In-Reply-To: <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be>
References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be>
Message-ID: <5B26231A.70308@UGent.be>

Hello,

I have been working on a slightly different PEP to use a new type slot
tp_ccalloffset instead of the base_function base class. You can see the
work in progress here:

https://github.com/jdemeyer/PEP-ccall

By creating a new protocol that each class can implement, there is a
full decoupling between the features of a class and the class hierarchy
(such coupling was complained about during the PEP 575 discussion). So
I got convinced that this is a better approach.

It also has the advantage that changes can be made more gradually: this
PEP changes nothing at all on the Python side, it only changes the
CPython implementation. I still think that it would be a good idea to
refactor the class hierarchy, but that's now an independent issue.

Another advantage is that it's more general and easier for existing
classes to use the protocol (PEP 575 on the other hand requires
subclassing from base_function, which may not be compatible with an
existing class hierarchy).
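[For readers following along, the type-dispatch and duck-typing problem that motivates both PEP 575 and this protocol variant is easy to observe from plain Python — built-in callables, method descriptors, and Python functions are unrelated concrete types, and inspect's predicates are hard-coded per type:]

```python
import inspect

# Three kinds of callables with no useful common base class:
kinds = {
    "len": type(len).__name__,                  # builtin_function_or_method
    "list.append": type(list.append).__name__,  # method_descriptor
    "lambda": type(lambda: 0).__name__,         # function
}
print(kinds)

# inspect's checks are per concrete type, so a callable implemented in C
# and one implemented in Python answer differently even though both
# simply support calls:
is_func = {name: inspect.isfunction(obj)
           for name, obj in [("len", len), ("lambda", lambda: 0)]}
print(is_func)  # {'len': False, 'lambda': True}
```

Any third-party callable type — a Cython function, for instance — falls outside these checks entirely, which is the "locked out of the protocol" problem in a nutshell.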
From ncoghlan at gmail.com Sun Jun 17 07:55:33 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 Jun 2018 21:55:33 +1000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B26231A.70308@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> Message-ID: On 17 June 2018 at 19:00, Jeroen Demeyer wrote: > Hello, > > I have been working on a slightly different PEP to use a new type slot > tp_ccalloffset instead the base_function base class. You can see the work in > progress here: > > https://github.com/jdemeyer/PEP-ccall > > By creating a new protocol that each class can implement, there is a full > decoupling between the features of a class and between the class hierarchy > (such coupling was complained about during the PEP 575 discussion). So I got > convinced that this is a better approach. > > It also has the advantage that changes can be made more gradually: this PEP > changes nothing at all on the Python side, it only changes the CPython > implementation. I still think that it would be a good idea to refactor the > class hierarchy, but that's now an independent issue. > > Another advantage is that it's more general and easier for existing classes > to use the protocol (PEP 575 on the other hand requires subclassing from > base_function which may not be compatible with an existing class hierarchy). Ah, this looks *very* nice :) Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ronaldoussoren at mac.com Sun Jun 17 08:50:04 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Sun, 17 Jun 2018 14:50:04 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B26231A.70308@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> Message-ID: <0D79E946-F973-45AB-8A62-51C1A3E329EB@mac.com> > On 17 Jun 2018, at 11:00, Jeroen Demeyer wrote: > > Hello, > > I have been working on a slightly different PEP to use a new type slot tp_ccalloffset instead the base_function base class. You can see the work in progress here: > > https://github.com/jdemeyer/PEP-ccall > > By creating a new protocol that each class can implement, there is a full decoupling between the features of a class and between the class hierarchy (such coupling was complained about during the PEP 575 discussion). So I got convinced that this is a better approach. > > It also has the advantage that changes can be made more gradually: this PEP changes nothing at all on the Python side, it only changes the CPython implementation. I still think that it would be a good idea to refactor the class hierarchy, but that's now an independent issue. > > Another advantage is that it's more general and easier for existing classes to use the protocol (PEP 575 on the other hand requires subclassing from base_function which may not be compatible with an existing class hierarchy). This looks interesting. 
Why did you add a tp_ccalloffset slot to the type with the actual information in instances instead of storing the information in a slot? Ronald From ronaldoussoren at mac.com Sun Jun 17 08:36:53 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Sun, 17 Jun 2018 14:36:53 +0200 Subject: [Python-Dev] Some data points for the "annual release cadence" concept In-Reply-To: References: <0004060B-F404-48F7-BF54-0E9DA0EC3952@mac.com> Message-ID: > On 15 Jun 2018, at 13:00, Nick Coghlan wrote: > > On 14 June 2018 at 06:30, Ronald Oussoren > wrote: >> On 13 Jun 2018, at 15:42, Nick Coghlan > wrote: >> >> Yeah, pretty much - once we can get to the point where it's routine for folks to be building "abiX" or "abiXY" wheels (with the latter not actually being a defined compatibility tag yet, but having the meaning of "targets the stable ABI as first defined in CPython X.Y"), rather than feature release specific "cpXYm" ones, then a *lot* of the extension module maintenance pain otherwise arising from more frequent CPython releases should be avoided. >> >> There'd still be a lot of other details to work out to turn the proposed release cadence change into a practical reality, but this is the key piece that I think is a primarily technical hurdle: simplifying the current "wheel-per-python-version-per-target-platform" community project build matrices to instead be "wheel-per-target-platform". > > This requires getting people to mostly stop using the non-stable ABI, and that could be a lot of work for projects that have existing C extensions that don't use the stable ABI or cython/cffi/... > > That said, the CPython API tends to be fairly stable over releases and even without using the stable ABI supporting faster CPython feature releases shouldn't be too onerous, especially for projects with some kind of automation for creating release artefacts (such as a CI system). > > Right, there would still be a non-zero impact on projects that ship binary artifacts. 
> > Having a viable stable ABI as a target just allows third party projects to make the trade-off between the upfront cost of migrating to the stable ABI (but then only needing to rebuild binaries when their own code changes), and the ongoing cost of maintaining an extra few sets of binary wheel archives. I think asking folks to make that trade-off on a case by case basis is reasonable, whereas back in the previous discussion I considered *only* offering the second option to be unreasonable. I agree. I haven't seriously looked at the stable ABI yet, so I don't know if there are reasons for not migrating to it beyond Py2 support and the effort required. For my own projects (both public and not) I have some that could possibly migrate to the stable ABI, and some that cannot because they access information that isn't public in the stable ABI. I generally still use the non-stable C API when I write extensions, basically because I already know how to do so. Ronald -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stefan_ml at behnel.de Sun Jun 17 10:31:36 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sun, 17 Jun 2018 16:31:36 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <0D79E946-F973-45AB-8A62-51C1A3E329EB@mac.com> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <0D79E946-F973-45AB-8A62-51C1A3E329EB@mac.com> Message-ID: Ronald Oussoren schrieb am 17.06.2018 um 14:50: > Why did you add a tp_ccalloffset slot to the type with the actual information in instances instead of storing the information in a slot? If the configuration of the callable was in the type, you would need a separate type for each kind of callable. That would quickly explode. Think of this as a generalised PyCFunction interface to arbitrary callables. There is a function pointer and some meta data, and both are specific to an instance. Also, there are usually only a limited number of callables around, so memory doesn't matter. (And memory usage would be a striking reason to have something in a type rather than an instance.) 
Stefan From J.Demeyer at UGent.be Sun Jun 17 13:07:34 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Sun, 17 Jun 2018 19:07:34 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <3c4ea4cd2ed44e2abfdfc836d0f11347@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <3c4ea4cd2ed44e2abfdfc836d0f11347@xmail101.UGent.be> Message-ID: <5B269556.2000100@UGent.be> On 2018-06-17 14:50, Ronald Oussoren wrote: > This looks interesting. Why did you add a tp_ccalloffset slot to the type with the actual information in instances instead of storing the information in a slot? Think of built-in functions. Every built-in function is a different callable and calls a different C function. So it must be stored in the instances. However, the case where all instances share a PyCCallDef is also possible: all instances would then simply have the same PyCCallDef pointer. Jeroen. 
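[Editor's note: Jeroen's point — that every built-in function instance wraps a *different* underlying C function, so the call information must live on the instance, while sharing one definition across instances remains possible — can be illustrated with a toy Python model. All names below (`CCallDef`, `BuiltinFunction`) are illustrative only, not the actual CPython API; in C the caller would locate the per-instance record via the type's `tp_ccalloffset`.]

```python
# Toy Python model of the proposed C-call protocol (illustrative names,
# not the real CPython API).

class CCallDef:
    """Bundles the 'C function pointer' with its metadata."""
    def __init__(self, func, name):
        self.func = func   # the underlying function to dispatch to
        self.name = name   # metadata (e.g. for introspection)

class BuiltinFunction:
    """Like PyCFunction: each *instance* wraps a different underlying
    function, so the CCallDef reference must live on the instance.
    In C, a caller would find it through the type's tp_ccalloffset."""
    def __init__(self, ccalldef):
        self._ccall = ccalldef

    def __call__(self, *args):
        return self._ccall.func(*args)

# Two instances of the *same* type dispatching to different functions:
py_len = BuiltinFunction(CCallDef(len, "len"))
py_abs = BuiltinFunction(CCallDef(abs, "abs"))

# The shared case Jeroen mentions: many instances, one CCallDef.
shared = CCallDef(str.upper, "upper")
upper_a = BuiltinFunction(shared)
upper_b = BuiltinFunction(shared)
```

The per-instance storage is what lets one type cover every built-in function, while the shared-pointer arrangement keeps the door open for types whose instances all call the same code.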
From ronaldoussoren at mac.com Sun Jun 17 13:52:06 2018 From: ronaldoussoren at mac.com (Ronald Oussoren) Date: Sun, 17 Jun 2018 19:52:06 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <0D79E946-F973-45AB-8A62-51C1A3E329EB@mac.com> Message-ID: <124A5106-D66F-497F-BCFB-9B8531CF171B@mac.com> > On 17 Jun 2018, at 16:31, Stefan Behnel wrote: > > Ronald Oussoren schrieb am 17.06.2018 um 14:50: >> Why did you add a tp_ccalloffset slot to the type with the actual information in instances instead of storing the information in a slot? > > If the configuration of the callable was in the type, you would need a > separate type for each kind of callable. That would quickly explode. Think > of this as a generalised PyCFunction interface to arbitrary callables. > There is a function pointer and some meta data, and both are specific to an > instance. That's true for PyCFunction, but not necessarily as a general replacement for the tp_call slot. In my code I'd basically use the same function pointer and metadata for all instances (that is, more like PyFunction than PyCFunction). > > Also, there are usually only a limited number of callables around, so > memory doesn't matter. (And memory usage would be a striking reason to have > something in a type rather than an instance.) I was mostly surprised that something that seems to be a replacement for tp_call stores the interesting information in instances instead of the type itself. 
Ronald From songofacandy at gmail.com Sun Jun 17 21:34:16 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Mon, 18 Jun 2018 10:34:16 +0900 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B26231A.70308@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> Message-ID: Hi Jeroen. It's interesting, but I think we need a reference implementation to compare its benefit with its complexity. Victor had tried to add a `tp_fastcall` slot, but he suspended his effort because its benefit was not enough for its complexity. https://bugs.python.org/issue29259 I think if your idea can reduce complexity of current special cases without any performance loss, it's nice. On the other hand, if your idea increases complexity, I doubt its benefit. Increasing performance of all Python-defined methods + most builtin methods affects total application performance because it covers most calls. But calls to callable objects other than those are relatively rare. It may not affect real world performance of most applications. So, until I can compare its complexity and benefits, I can say only "it's interesting." Regards, -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From J.Demeyer at UGent.be Mon Jun 18 01:55:13 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Mon, 18 Jun 2018 07:55:13 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> Message-ID: <5B274941.4080903@UGent.be> On 2018-06-18 03:34, INADA Naoki wrote: > Victor had tried to add `tp_fastcall` slot, but he suspended his effort > because > it's benefit is not enough for it's complexity. > https://bugs.python.org/issue29259 I had a quick look at that patch and it's really orthogonal to what I'm proposing. I'm proposing to use the slot *instead* of existing fastcall optimizations. Victor's patch was about adding fastcall support to classes that didn't support it before. Jeroen. 
From songofacandy at gmail.com Mon Jun 18 02:25:36 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Mon, 18 Jun 2018 15:25:36 +0900 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B274941.4080903@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: I didn't mean comparing tp_fastcall and your PEP. I just meant we need to compare complexity and benefit (performance), and we need a reference implementation for comparing. On Mon, Jun 18, 2018 at 3:03 PM Jeroen Demeyer wrote: > On 2018-06-18 03:34, INADA Naoki wrote: >> Victor had tried to add `tp_fastcall` slot, but he suspended his effort >> because >> it's benefit is not enough for it's complexity. >> https://bugs.python.org/issue29259 > > I has a quick look at that patch and it's really orthogonal to what I'm > proposing. I'm proposing to use the slot *instead* of existing fastcall > optimizations. Victor's patch was about adding fastcall support to > classes that didn't support it before. > > > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com > -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ethan at stoneleaf.us Mon Jun 18 09:13:15 2018 From: ethan at stoneleaf.us (Ethan Furman) Date: Mon, 18 Jun 2018 06:13:15 -0700 Subject: [Python-Dev] the new(-ish) dict ordering vs hash randomization Message-ID: <5B27AFEB.8050000@stoneleaf.us> I'm sure we've already had this conversation, but my google-fu is failing me. Can someone provide a link to a discussion explaining why the new ordering of dictionaries does not defeat the hash-randomization non-ordering we added a few versions ago? -- ~Ethan~ From vstinner at redhat.com Mon Jun 18 09:09:52 2018 From: vstinner at redhat.com (Victor Stinner) Date: Mon, 18 Jun 2018 15:09:52 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B274941.4080903@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: Hi, I tried two options to add support for FASTCALL on calling an object: add a flag in tp_flags and reuse tp_call, or add a new tp_fastcall slot. I failed to implement either of these two options correctly. There are multiple issues with tp_fastcall: * ABI issue: it's possible to load a C extension using the old ABI, without tp_fastcall: it's not possible to write type->tp_fastcall on such a type. This limitation causes different issues. * If tp_call is modified, tp_fastcall may be outdated. Same if tp_fastcall is modified. What happens on "del obj.__call__" or "del type.__call__"? 
* Many public functions of the C API still require the tuple and dict to pass positional and keyword arguments, so a compatibility layer is required for types that only want to implement FASTCALL. Related issue: what if something calls tp_call with (args: tuple, kwargs: dict)? Crash or call a compatibility layer converting arguments to the FASTCALL calling convention? Reusing tp_call for FASTCALL causes similar or worse issues. I abandoned my idea for two reasons: 1) in the worst case, my changes caused a crash which is not accepted for an optimization. My first intent was to remove the property_descr_get() hack because its implementation is fragile and caused crashes. 2) we implemented a lot of other optimizations which made calls faster without having to touch tp_call nor tp_fastcall. The benefit of FASTCALL for tp_call/tp_fastcall was not really significant. Victor 2018-06-18 7:55 GMT+02:00 Jeroen Demeyer : > On 2018-06-18 03:34, INADA Naoki wrote: >> >> Victor had tried to add `tp_fastcall` slot, but he suspended his effort >> because >> it's benefit is not enough for it's complexity. >> https://bugs.python.org/issue29259 > > > I has a quick look at that patch and it's really orthogonal to what I'm > proposing. I'm proposing to use the slot *instead* of existing fastcall > optimizations. Victor's patch was about adding fastcall support to classes > that didn't support it before. > > > > Jeroen. 
> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From solipsis at pitrou.net Mon Jun 18 09:21:40 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Mon, 18 Jun 2018 15:21:40 +0200 Subject: [Python-Dev] the new(-ish) dict ordering vs hash randomization References: <5B27AFEB.8050000@stoneleaf.us> Message-ID: <20180618152140.0af1ff63@fsol> On Mon, 18 Jun 2018 06:13:15 -0700 Ethan Furman wrote: > I'm sure we've already had this conversation, but my google-fu is failing me. > > Can someone provide a link to a discussion explaining why the new ordering of dictionaries does not defeat the > hash-randomization non-ordering we added a few versions ago? Because the aim of hash randomization was not to make iteration order unpredictable, it was to make hash collisions unpredictable. The solution used to make hash collisions unpredictable was to make hash values themselves unpredictable, and that had the side effect of also making iteration order unpredictable. But the new dict implementation is able to provide a deterministic iteration order even with non-deterministic hash values. Regards Antoine. From encukou at gmail.com Mon Jun 18 09:27:34 2018 From: encukou at gmail.com (Petr Viktorin) Date: Mon, 18 Jun 2018 15:27:34 +0200 Subject: [Python-Dev] the new(-ish) dict ordering vs hash randomization In-Reply-To: <5B27AFEB.8050000@stoneleaf.us> References: <5B27AFEB.8050000@stoneleaf.us> Message-ID: On 06/18/18 15:13, Ethan Furman wrote: > I'm sure we've already had this conversation, but my google-fu is > failing me. > > Can someone provide a link to a discussion explaining why the new > ordering of dictionaries does not defeat the hash-randomization > non-ordering we added a few versions ago? 
Hi, Modern dicts have an array of contents (which is used for iterating the dict, and thus iteration doesn't touch hashes at all), and a separate hash table of indexes (which still enjoys the benefits of hash randomization). See Raymond Hettinger's initial post from 2012: https://mail.python.org/pipermail/python-dev/2012-December/123028.html A technical overview of the idea is on the PyPy blog: https://morepypy.blogspot.com/2015/01/faster-more-memory-efficient-and-more.html From J.Demeyer at UGent.be Mon Jun 18 10:30:52 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Mon, 18 Jun 2018 16:30:52 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: <5B27C21C.8000608@UGent.be> On 2018-06-18 15:09, Victor Stinner wrote: > 2) we implemented a lot of other optimizations which made calls faster > without having to touch tp_call nor tp_fastcall. And that's a problem because these optimizations typically only work for specific classes. My PEP wants to replace those by something more structural. 
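[Editor's note: the split layout Petr describes — a dense entries array that iteration walks, plus a sparse hash-indexed table used only for lookups — can be sketched in pure Python. This is a deliberately simplified model (fixed size, no deletion or resizing, plain linear probing instead of CPython's perturbed probing):]

```python
class CompactDict:
    """Minimal model of CPython 3.6+ dicts: iteration walks the dense
    `entries` list, so order is insertion order regardless of hash
    values; lookups go through the sparse `indices` hash table."""

    def __init__(self, size=8):
        self.indices = [None] * size  # sparse: slot -> index into entries
        self.entries = []             # dense: (hash, key, value), insertion order

    def _probe(self, key):
        # Find the slot for `key`: either its occupied slot or the
        # first empty one (linear probing; CPython perturbs instead).
        mask = len(self.indices) - 1
        i = hash(key) & mask
        while True:
            idx = self.indices[i]
            if idx is None or self.entries[idx][1] == key:
                return i
            i = (i + 1) & mask

    def __setitem__(self, key, value):
        slot = self._probe(key)
        if self.indices[slot] is None:
            self.indices[slot] = len(self.entries)
            self.entries.append((hash(key), key, value))
        else:
            idx = self.indices[slot]
            self.entries[idx] = (hash(key), key, value)

    def __getitem__(self, key):
        idx = self.indices[self._probe(key)]
        if idx is None:
            raise KeyError(key)
        return self.entries[idx][2]

    def __iter__(self):
        return (key for _h, key, _v in self.entries)
```

Even with randomized hashes the index table scatters differently on every run, but `entries` — and therefore iteration order — stays in insertion order, which is exactly why the new layout does not undo hash randomization.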
From songofacandy at gmail.com Mon Jun 18 10:55:34 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Mon, 18 Jun 2018 23:55:34 +0900 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B27C21C.8000608@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> Message-ID: On Mon, Jun 18, 2018 at 11:33 PM Jeroen Demeyer wrote: > On 2018-06-18 15:09, Victor Stinner wrote: > > 2) we implemented a lot of other optimizations which made calls faster > > without having to touch tp_call nor tp_fastcall. > > And that's a problem because these optimizations typically only work for > specific classes. My PEP wants to replace those by something more > structural. > ?And we need data how much it speedup some applications, not only microbenchmarks. Speeding up most python function and some bultin functions was very significant. But I doubt making some 3rd party call 20% faster can make real applications significant faster. -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gvanrossum at gmail.com Mon Jun 18 10:57:50 2018 From: gvanrossum at gmail.com (Guido van Rossum) Date: Mon, 18 Jun 2018 07:57:50 -0700 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B27C21C.8000608@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> Message-ID: Like Inada-san, I would like to see the implementation first. On Mon, Jun 18, 2018, 07:33 Jeroen Demeyer wrote: > On 2018-06-18 15:09, Victor Stinner wrote: > > 2) we implemented a lot of other optimizations which made calls faster > > without having to touch tp_call nor tp_fastcall. > > And that's a problem because these optimizations typically only work for > specific classes. My PEP wants to replace those by something more > structural. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jake at lwn.net Mon Jun 18 13:22:51 2018 From: jake at lwn.net (Jake Edge) Date: Mon, 18 Jun 2018 11:22:51 -0600 Subject: [Python-Dev] 2018 Python Language Summit coverage, part 2 Message-ID: <20180618112251.5f936617@gallinule> Hola python-dev, Here is some more coverage from the Python Language Summit. As usual, I am posting SubscriberLinks for articles that are still behind the paywall. 
LWN subscribers can always see our content right away; one week after they are published in a weekly edition, they become freely available for everyone. SubscriberLinks are a way around the paywall. Please feel free to share the SubscriberLinks I am posting here. The starting point is here: https://lwn.net/Articles/754152/ That is an overview article with links to the individual articles. It will be updated as I add more articles. Only 3 more to go after this. Here is what was added since my previous post (which is here: https://lwn.net/ml/python-dev/20180606155653.264c9566%40gallinule/ ) Linux distributions and Python 2 - https://lwn.net/SubscriberLink/756628/7a85f7b28ae3f690/ A Python static typing update - https://lwn.net/SubscriberLink/757218/6f47fe0675cbaf01/ Python virtual environments - https://lwn.net/SubscriberLink/757354/fd82c236dff2de13/ I will post again with the last three, which should be later this week ... enjoy! jake -- Jake Edge - LWN - jake at lwn.net - http://lwn.net From stefan_ml at behnel.de Mon Jun 18 13:49:28 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Mon, 18 Jun 2018 19:49:28 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: Victor Stinner schrieb am 18.06.2018 um 15:09: > I tried two options to add support for FASTCALL on calling an object: > add a flag in tp_flags and reuse tp_call, or add a new tp_fastcall > slot. 
I failed to implement correctly any of these two options. > > There are multiple issues with tp_fastcall: > > * ABI issue: it's possible to load a C extension using the old ABI, > without tp_fastcall: it's not possible to write type->tp_fastcall on > such type. This limitation causes different issues. Not a problem if we rededicate the unused (since Py3.0) "tp_print" slot for it. Even better, since the slot exists already in Py3.0+, tools like Cython, NumPy (with its ufuncs etc.) or generic function dispatchers, basically anything that benefits from fast calls, can enable support for it in all CPython 3.x versions and benefit from faster calls among each other, independent of the support in CPython. The explicit type flag opt-in that the PEP proposes makes this completely safe. > * If tp_call is modified, tp_fastcall may be outdated. Same if > tp_fastcall is modified. Slots are fixed at type creation and should never be modified afterwards. > What happens on "del obj.__call__" or "del type.__call__"? $ python3.7 -c 'del len.__call__' Traceback (most recent call last): File "<string>", line 1, in <module> AttributeError: 'builtin_function_or_method' object attribute '__call__' is read-only $ python3.7 -c 'del type.__call__' Traceback (most recent call last): File "<string>", line 1, in <module> TypeError: can't set attributes of built-in/extension type 'type' And a really lovely one: $ python3.7 -c 'del (lambda:0).__call__' Traceback (most recent call last): File "<string>", line 1, in <module> AttributeError: __call__ > * Many public functions of the C API still requires the tuple and dict > to pass positional and keyword arguments, so a compatibility layer is > required to types who only want to implement FASTCALL. Well, yes. It would require a trivial piece of code to map between the two. Fine with me. > Related issue: > what is something calls tp_call with (args: tuple, kwargs: dict)? > Crash or call a compatibility layer converting arguments to FASTCALL > calling convention? The latter, obviously. 
Also easy to implement, with the usual undefined dict order caveat (although that's probably solved when running in Py3.6+). > I abandoned my idea for two reasons: > > 1) in the worst case, my changes caused a crash which is not accepted > for an optimization. This isn't really an optimisation. It's a generalisation of the call protocol. > My first intent was to removed the > property_descr_get() hack because its implementation is fragile and > caused crashes. Not sure which hack you mean. > 2) we implemented a lot of other optimizations which made calls faster > without having to touch tp_call nor tp_fastcall. The benefit of > FASTCALL for tp_call/tp_fastcall was not really significant. What Jeroen said. Cleaning up the implementation and generalising the call protocol is going to open up a wonderfully bright future for CPython. :) Stefan From larry at hastings.org Mon Jun 18 16:35:19 2018 From: larry at hastings.org (Larry Hastings) Date: Mon, 18 Jun 2018 13:35:19 -0700 Subject: [Python-Dev] Idea: reduce GC threshold in development mode (-X dev) In-Reply-To: References: Message-ID: <17c40c74-a49f-84ae-6133-1feba4cfeb5f@hastings.org> On 06/08/2018 12:48 AM, Victor Stinner wrote: > Question: Do you think that bugs spotted by a GC collection are common > enough to change the GC thresholds in development mode (new -X dev > flag of Python 3.7)? I'd prefer that the development / debug environment be as much like production use as possible, so that surprises crop up during development rather than after deployment. Additional monitoring is fine, but I think changing behavior is a no-no. Cheers, //arry/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From J.Demeyer at UGent.be Tue Jun 19 01:53:25 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 19 Jun 2018 07:53:25 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> Message-ID: <5B289A55.4060004@UGent.be> On 2018-06-18 16:55, INADA Naoki wrote: > Speeding up most python function and some bultin functions was very > significant. > But I doubt making some 3rd party call 20% faster can make real > applications significant faster. These two sentences are almost contradictory. I find it strange to claim that a given optimization was "very significant" in specific cases while saying that the same optimization won't matter in other cases. People *have* done benchmarks for actual code and this is causing actual slow-downs of around 20% in actual applications. That is the main reason why I am trying to push this PEP (or PEP 575 which solves the same problem in a different way). Jeroen. 
From songofacandy at gmail.com Tue Jun 19 02:12:31 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 19 Jun 2018 15:12:31 +0900 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B289A55.4060004@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> <5B289A55.4060004@UGent.be> Message-ID: On Tue, Jun 19, 2018 at 2:56 PM Jeroen Demeyer wrote: > On 2018-06-18 16:55, INADA Naoki wrote: > > Speeding up most python function and some bultin functions was very > > significant. > > But I doubt making some 3rd party call 20% faster can make real > > applications significant faster. > > These two sentences are almost contradictory. I find it strange to claim > that a given optimization was "very significant" in specific cases while > saying that the same optimization won't matter in other cases. > It's not contradictory because there is a basis: In most real world Python applications, the number of calls to Python methods or builtin functions is much higher than the number of other calls. For example, optimization of builtin `tp_init` or `tp_new` by FASTCALL was rejected because its implementation is complex and its performance gain is not significant enough on macro benchmarks. And I doubt the number of 3rd party calls is much higher than the number of calls to builtin tp_init or tp_new. Of course, the current benchmark suite [1] doesn't cover all types of real world Python applications. 
You can create a pull request which adds a benchmark for a real world application that depends on massive numbers of 3rd party calls. [1] https://github.com/python/performance Regards, -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From solipsis at pitrou.net Tue Jun 19 04:47:00 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 19 Jun 2018 10:47:00 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: <20180619104700.432a74b6@fsol> On Mon, 18 Jun 2018 19:49:28 +0200 Stefan Behnel wrote: > Victor Stinner schrieb am 18.06.2018 um 15:09: > > I tried two options to add support for FASTCALL on calling an object: > > add a flag in tp_flags and reuse tp_call, or add a new tp_fastcall > > slot. I failed to implement correctly any of these two options. > > > > There are multiple issues with tp_fastcall: > > > > * ABI issue: it's possible to load a C extension using the old ABI, > > without tp_fastcall: it's not possible to write type->tp_fastcall on > > such type. This limitation causes different issues. > > Not a problem if we rededicate the unused (since Py3.0) "tp_print" slot for it. On the topic of the so-called old ABI (which doesn't really exist), I would like to merge https://github.com/python/cpython/pull/4944 Regards Antoine. 
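[Editor's note: the application-level data INADA asks for needs real benchmarks, but the per-call overhead being debated here is easy to probe directly with the stdlib `timeit` module. A rough sketch; absolute timings vary by machine and Python version, so no numbers are claimed:]

```python
import timeit

def py_func(x):
    return x

class CallableInstance:
    # Calls to an instance like this go through the generic tp_call
    # slot -- the case the proposed protocol aims to speed up.
    def __call__(self, x):
        return x

ns = {"s": "x" * 10, "py_func": py_func, "obj": CallableInstance()}
cases = [
    ("builtin (len)", "len(s)"),
    ("Python function", "py_func(s)"),
    ("callable instance", "obj(s)"),
]

for label, stmt in cases:
    t = timeit.timeit(stmt, globals=ns, number=1_000_000)
    print(f"{label:18s} {t:.3f}s per 1e6 calls")
```

Comparing the three rows on a given machine shows how much of the gap is pure call-dispatch overhead, which is the quantity the 20% figure in this thread refers to.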
From J.Demeyer at UGent.be Tue Jun 19 07:58:51 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 19 Jun 2018 13:58:51 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> Message-ID: <5B28EFFB.8000007@UGent.be> On 2018-06-18 15:09, Victor Stinner wrote: > There are multiple issues with tp_fastcall: Personally, I think that you are exaggerating these issues. Below, I'm writing the word FASTCALL to refer to tp_fastcall in your patch as well as my C call protocol in the PEP-in-progress. > * ABI issue: it's possible to load a C extension using the old ABI, > without tp_fastcall: it's not possible to write type->tp_fastcall on > such type. This limitation causes different issues. It's not hard to check for FASTCALL support and have a case distinction between using tp_call and FASTCALL. > * If tp_call is modified, tp_fastcall may be outdated. I plan to support FASTCALL only for extension types. Those cannot be changed from Python. If it turns out that FASTCALL might give significant benefits also for heap types, we can deal with those modifications: we already need to deal with such modifications anyway for existing slots like __call__. > * Many public functions of the C API still requires the tuple and dict > to pass positional and keyword arguments, so a compatibility layer is > required to types who only want to implement FASTCALL. 
Related issue: > what is something calls tp_call with (args: tuple, kwargs: dict)? > Crash or call a compatibility layer converting arguments to FASTCALL > calling convention? You make it sound as if such a "compatibility layer" is a big issue. You just need one C API function to put in the tp_call slot which calls the object instead using FASTCALL. From ncoghlan at gmail.com Tue Jun 19 08:02:35 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 19 Jun 2018 22:02:35 +1000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> <5B289A55.4060004@UGent.be> Message-ID: On 19 June 2018 at 16:12, INADA Naoki wrote: > > On Tue, Jun 19, 2018 at 2:56 PM Jeroen Demeyer wrote: >> >> On 2018-06-18 16:55, INADA Naoki wrote: >> > Speeding up most python function and some bultin functions was very >> > significant. >> > But I doubt making some 3rd party call 20% faster can make real >> > applications significant faster. >> >> These two sentences are almost contradictory. I find it strange to claim >> that a given optimization was "very significant" in specific cases while >> saying that the same optimization won't matter in other cases. > > > It's not contradictory because there is basis: > > In most real world Python application, number of calling Python methods or > bulitin functions are much more than other calls. 
> > For example, optimization for bulitin `tp_init` or `tp_new` by FASTCALL was > rejected because it's implementation is complex and it's performance gain is > not significant enough on macro benchmarks. > > And I doubt number of 3rd party calls are much more than calling builtin > tp_init or tp_new. I don't think this assumption is correct, as scientific Python software spends a lot of time calling other components in the scientific Python stack, and bypassing the core language runtime entirely. However, they're using the CPython C API's function calling abstractions to do it, and those are currently expensive (frustratingly so, when the caller, the callee, *and* the interpreter implementation defining the call abstraction layer are all implemented in C). Hence Jeroen's PEPs to make the FASTCALL API a generally available one. That's quite different from the situation with object constructors, where a whole lot of applications will get to the point of having a relatively stable working set of objects, and then see the rate of object creation slow down markedly. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From levkivskyi at gmail.com Tue Jun 19 09:22:50 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Tue, 19 Jun 2018 14:22:50 +0100 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> <5B289A55.4060004@UGent.be> Message-ID: On 19 June 2018 at 13:02, Nick Coghlan wrote: > On 19 June 2018 at 16:12, INADA Naoki wrote: > > > > On Tue, Jun 19, 2018 at 2:56 PM Jeroen Demeyer > wrote: > >> > >> On 2018-06-18 16:55, INADA Naoki wrote: > >> > Speeding up most python function and some bultin functions was very > >> > significant. > >> > But I doubt making some 3rd party call 20% faster can make real > >> > applications significant faster. > >> > >> These two sentences are almost contradictory. I find it strange to claim > >> that a given optimization was "very significant" in specific cases while > >> saying that the same optimization won't matter in other cases. > > > > > > It's not contradictory because there is basis: > > > > In most real world Python application, number of calling Python > methods or > > bulitin functions are much more than other calls. > > > > For example, optimization for bulitin `tp_init` or `tp_new` by FASTCALL > was > > rejected because it's implementation is complex and it's performance > gain is > > not significant enough on macro benchmarks. 
> > > > And I doubt number of 3rd party calls are much more than calling builtin > > tp_init or tp_new. > > I don't think this assumption is correct, as scientific Python > software spends a lot of time calling other components in the > scientific Python stack, and bypassing the core language runtime > entirely. > > A recent Python survey by PSF/JetBrains shows that almost half of current Python users are using it for data science/ML/etc. For all these people most of the time is spent on calling C functions in extensions. -- Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From songofacandy at gmail.com Tue Jun 19 09:36:34 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 19 Jun 2018 22:36:34 +0900 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> <5B289A55.4060004@UGent.be> Message-ID: That's why I suggested to add new benchmark. 2018?6?19?(?) 22:22 Ivan Levkivskyi : > On 19 June 2018 at 13:02, Nick Coghlan wrote: > >> On 19 June 2018 at 16:12, INADA Naoki wrote: >> > >> > On Tue, Jun 19, 2018 at 2:56 PM Jeroen Demeyer >> wrote: >> >> >> >> On 2018-06-18 16:55, INADA Naoki wrote: >> >> > Speeding up most python function and some bultin functions was very >> >> > significant. >> >> > But I doubt making some 3rd party call 20% faster can make real >> >> > applications significant faster. 
>> >> >> >> These two sentences are almost contradictory. I find it strange to >> claim >> >> that a given optimization was "very significant" in specific cases >> while >> >> saying that the same optimization won't matter in other cases. >> > >> > >> > It's not contradictory because there is basis: >> > >> > In most real world Python application, number of calling Python >> methods or >> > bulitin functions are much more than other calls. >> > >> > For example, optimization for bulitin `tp_init` or `tp_new` by FASTCALL >> was >> > rejected because it's implementation is complex and it's performance >> gain is >> > not significant enough on macro benchmarks. >> > >> > And I doubt number of 3rd party calls are much more than calling builtin >> > tp_init or tp_new. >> >> I don't think this assumption is correct, as scientific Python >> software spends a lot of time calling other components in the >> scientific Python stack, and bypassing the core language runtime >> entirely. >> >> > A recent Python survey by PSF/JetBrains shows that almost half of current > Python > users are using it for data science/ML/etc. For all these people most of > the time is spent > on calling C functions in extensions. > > -- > Ivan > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vstinner at redhat.com Tue Jun 19 10:59:27 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 19 Jun 2018 16:59:27 +0200 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: <5B28EFFB.8000007@UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B28EFFB.8000007@UGent.be> Message-ID: 2018-06-19 13:58 GMT+02:00 Jeroen Demeyer : > Personally, I think that you are exaggerating these issues. I'm not trying to convince you to abandon the idea. I would be happy to be able to use FASTCALL in more cases! I just tried to explain why I chose to abandon my idea. 
FASTCALL is cute on tiny microbenchmarks, but I'm not sure that having spent almost one year on it was worth it :-) Victor From stefan_ml at behnel.de Wed Jun 20 02:00:48 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Wed, 20 Jun 2018 08:00:48 +0200 Subject: [Python-Dev] C-level calling (was: PEP 575 (Unifying function/method classes) update) In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B28EFFB.8000007@UGent.be> Message-ID: Victor Stinner schrieb am 19.06.2018 um 16:59: > 2018-06-19 13:58 GMT+02:00 Jeroen Demeyer : >> Personally, I think that you are exaggerating these issues. > > I'm not trying to convince you to abandon the idea. I would be happy > to be able to use FASTCALL in more cases! I just tried to explain why > I chose to abandon my idea. > > FASTCALL is cute on tiny microbenchmarks, but I'm not sure that having > spent almost one year on it was worth it :-) Fastcall is actually nice, also because it has a potential to *simplify* several things with regard to calling Python objects from C. Thanks for implementing it, Victor. Just to add another bit of background on top of the current discussion, there is an idea around, especially in the scipy/big-data community, (and I'm not giving any guarantees here that it will lead to a PEP + implementation, as it depends on people's workload) to design a dedicated C level calling interface for Python. 
Think of it as similar to the buffer interface, but for calling arbitrary C functions by bypassing the Python call interface entirely. Objects that wrap some kind of C function (and there are tons of them in the CPython world) would gain C signature metadata, maybe even for overloaded signatures, and C code that wants to call them could validate that metadata and call them as native C calls. But that is a rather big project to undertake, and I consider Jeroen's new PEP also a first step in that direction. Stefan From J.Demeyer at UGent.be Wed Jun 20 04:53:18 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 20 Jun 2018 10:53:18 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods Message-ID: <5B2A15FE.4000608@UGent.be> Hello, Let me present PEP 579 and PEP 580. PEP 579 is an informational meta-PEP, listing some of the issues with functions/methods implemented in C. The idea is to create several PEPs, each fixing some part of the issues mentioned in PEP 579. PEP 580 is a standards-track PEP to introduce a new "C call" protocol, which is an important part of PEP 579. In the reference implementation (which is work in progress), this protocol will be used by built-in functions and methods. However, it should be used by more classes in the future. You can find the texts at https://www.python.org/dev/peps/pep-0579 https://www.python.org/dev/peps/pep-0580 Jeroen.
From J.Demeyer at UGent.be Wed Jun 20 04:53:24 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 20 Jun 2018 10:53:24 +0200 Subject: [Python-Dev] C-level calling In-Reply-To: <163d4bb9ca4544a0889a36d068b4e16a@xmail101.UGent.be> References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B28EFFB.8000007@UGent.be> <163d4bb9ca4544a0889a36d068b4e16a@xmail101.UGent.be> Message-ID: <5B2A1604.3070800@UGent.be> On 2018-06-20 08:00, Stefan Behnel wrote: > Just to add another bit of background on top of the current discussion, > there is an idea around, especially in the scipy/big-data community, (and > I'm not giving any guarantees here that it will lead to a PEP + > implementation, as it depends on people's workload) to design a dedicated C > level calling interface for Python. Think of it as similar to the buffer > interface, but for calling arbitrary C functions by bypassing the Python > call interface entirely. Objects that wrap some kind of C function (and > there are tons of them in the CPython world) would gain C signature meta > data, maybe even for overloaded signatures, and C code that wants to call > them could validate that meta data and call them as native C calls. See also https://www.python.org/dev/peps/pep-0579/#allowing-native-c-arguments I specifically designed PEP 580 to be extendable such that it would be possible to add features later. 
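Stefan's "buffer interface for calls" idea, signature metadata that a caller validates before making a native call, can be sketched in pure Python. All names below (CSignature, c_signature, native_call) are invented for illustration; in reality this would be C-level metadata and an unboxed function-pointer call:

```python
# Pure-Python stand-in for the hypothetical C-level signature metadata.
class CSignature:
    def __init__(self, argtypes, restype):
        self.argtypes = argtypes
        self.restype = restype

class CWrappedFunction:
    """Stand-in for an object that wraps a C function."""
    def __init__(self, func, signature):
        self._func = func
        self.c_signature = signature  # the hypothetical metadata slot

def native_call(obj, *args):
    """A caller validates the metadata, then calls 'natively'."""
    sig = getattr(obj, "c_signature", None)
    if sig is None or len(args) != len(sig.argtypes):
        raise TypeError("must fall back to the Python call protocol")
    for a, t in zip(args, sig.argtypes):
        if not isinstance(a, t):
            raise TypeError(f"expected {t.__name__}")
    # In C this would be a direct function-pointer call with unboxed
    # arguments: no argument tuple, no PyObject boxing.
    return sig.restype(obj._func(*args))

hypot = CWrappedFunction(lambda x, y: (x * x + y * y) ** 0.5,
                         CSignature((float, float), float))
print(native_call(hypot, 3.0, 4.0))  # 5.0
```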
From ncoghlan at gmail.com Wed Jun 20 07:43:20 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 20 Jun 2018 21:43:20 +1000 Subject: [Python-Dev] PEP 575 (Unifying function/method classes) update In-Reply-To: References: <5AED7166.1010008@UGent.be> <55bbe4828392480e87da5ca73741199e@xmail101.UGent.be> <5AFAA52F.6070908@UGent.be> <4974359864374ff8b2181b39fea97459@xmail101.UGent.be> <5AFB5734.7080908@UGent.be> <7856825233cc431c92040fcf48c8315e@xmail101.UGent.be> <5AFC95D2.5070102@UGent.be> <2115689ffb51476caff516fc5af1410f@xmail101.UGent.be> <5B01D74C.3000506@UGent.be> <2abee391fee347a184f751766b9fcd84@xmail101.UGent.be> <5B26231A.70308@UGent.be> <235ac1e011d54f2998747de53f1e1d4c@xmail101.UGent.be> <5B274941.4080903@UGent.be> <5B27C21C.8000608@UGent.be> <5B289A55.4060004@UGent.be> Message-ID: On 19 June 2018 at 16:12, INADA Naoki wrote: > > On Tue, Jun 19, 2018 at 2:56 PM Jeroen Demeyer wrote: >> >> On 2018-06-18 16:55, INADA Naoki wrote: >> > Speeding up most python function and some bultin functions was very >> > significant. >> > But I doubt making some 3rd party call 20% faster can make real >> > applications significant faster. >> >> These two sentences are almost contradictory. I find it strange to claim >> that a given optimization was "very significant" in specific cases while >> saying that the same optimization won't matter in other cases. > > It's not contradictory because there is basis: > > In most real world Python application, number of calling Python methods or > bulitin functions are much more than other calls. > > For example, optimization for bulitin `tp_init` or `tp_new` by FASTCALL was > rejected because it's implementation is complex and it's performance gain is > not significant enough on macro benchmarks. > > And I doubt number of 3rd party calls are much more than calling builtin > tp_init or tp_new. 
I was going to ask a question here about JSON parsing micro-benchmarks, but then I went back and re-read https://blog.sentry.io/2016/10/19/fixing-python-performance-with-rust.html and realised that the main problem discussed in that article was the *memory* overhead of creating full Python object instances, not the runtime cost of instantiating those objects. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From solipsis at pitrou.net Wed Jun 20 10:09:46 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 20 Jun 2018 16:09:46 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods References: <5B2A15FE.4000608@UGent.be> Message-ID: <20180620160946.4fa05130@fsol> Hi Jeroen, On Wed, 20 Jun 2018 10:53:18 +0200 Jeroen Demeyer wrote: > > PEP 579 is an informational meta-PEP, listing some of the issues with > functions/methods implemented in C. The idea is to create several PEPs > each fix some part of the issues mentioned in PEP 579. > > PEP 580 is a standards track PEP to introduce a new "C call" protocol, > which is an important part of PEP 579. In the reference implementation > (which is work in progress), this protocol will be used by built-in > functions and methods. However, it should be used by more classes in the > future. This is very detailed and analytic. Thanks. I dislike that the protocol is complicated, with many special cases. Ideally there would be two axes for parametrization of C calls: - the signature of the C callee (either fast call or normal call) - whether the callable is called as a function ("foo(...)") or as a method ("some_obj.foo(...)"). But there seems to be some complication on top of that: - PyCCall_FastCall() accepts several types for the keywords, even a dict; does it get forwarded as-is to the `cc_func` or is it first transformed? 
- there's CCALL_OBJCLASS and CCALL_SLICE_SELF which have, well, non-obvious behaviour (especially the latter), especially as it is conditioned on the value of other fields or flags I wonder if there's a way to push some of the specificities out of the protocol and into the C API that mediates between the protocol and actual callers? Regards Antoine. From J.Demeyer at UGent.be Wed Jun 20 10:32:09 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 20 Jun 2018 16:32:09 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods In-Reply-To: References: <5B2A15FE.4000608@UGent.be> Message-ID: <5B2A6569.3090006@UGent.be> On 2018-06-20 16:09, Antoine Pitrou wrote: > But there seems to be some complication on top of that: > > - PyCCall_FastCall() accepts several types for the keywords, even a > dict; That is actually a *simplification* instead of a *complication*. Currently, there is a huge amount of code duplication between _PyMethodDef_RawFastCallKeywords and _PyMethodDef_RawFastCallDict. Folding both of these in one function actually makes things simpler. > does it get forwarded as-is to the `cc_func` or is it first > transformed? Transformed (obviously, otherwise it would be a huge backwards incompatibility problem). > - there's CCALL_OBJCLASS and CCALL_SLICE_SELF which have, well, > non-obvious behaviour (especially the latter), especially as it is > conditioned on the value of other fields or flags It's actually quite obvious when you think of it: both are needed to support existing use cases. Perhaps it's just not explained well enough in the PEP. > I wonder if there's a way to push some of the specificities out of the > protocol and into the C API that mediates between the protocol and > actual callers? Sorry, I have no idea what you mean here. Actually, those flags are handled by the C API. The actual C functions don't need to care about those flags. 
From solipsis at pitrou.net Wed Jun 20 10:42:32 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 20 Jun 2018 16:42:32 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods References: <5B2A15FE.4000608@UGent.be> <5B2A6569.3090006@UGent.be> Message-ID: <20180620164232.0058d4d4@fsol> On Wed, 20 Jun 2018 16:32:09 +0200 Jeroen Demeyer wrote: > > > - there's CCALL_OBJCLASS and CCALL_SLICE_SELF which have, well, > > non-obvious behaviour (especially the latter), especially as it is > > conditioned on the value of other fields or flags > > It's actually quite obvious when you think of it: both are needed to > support existing use cases. Perhaps it's just not explained well enough > in the PEP. Yes, it's explained in PEP 579. But just because the motivation is easy to understand doesn't mean the mechanism is easy to follow. I'm wondering what amount of code and debugging is needed for, say, Cython or Numba to implement that protocol as a caller, without going through the C API's indirections (for performance). Regards Antoine. From J.Demeyer at UGent.be Wed Jun 20 10:49:38 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 20 Jun 2018 16:49:38 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods In-Reply-To: References: <5B2A15FE.4000608@UGent.be> <5B2A6569.3090006@UGent.be> Message-ID: <5B2A6982.80404@UGent.be> On 2018-06-20 16:42, Antoine Pitrou wrote: > I'm wondering what amount of code and debugging is needed for, say, > Cython or Numba to implement that protocol as a caller, without going > through the C API's indirections (for performance). The goal is to have a really fast C API without a lot of indirections. If Cython or Numba can implement the protocol faster than CPython, we should just change the CPython implementation to be equally fast. 
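To make the flag discussion concrete, here is a toy Python model of a mediating layer that interprets CCALL_OBJCLASS and CCALL_SLICE_SELF so the wrapped function does not have to. The flag names come from the PEP 580 draft under discussion; the behaviour shown is a simplified reading of it, not the actual C implementation:

```python
# Toy Python model of a PEP 580-style dispatch.
CCALL_OBJCLASS = 1 << 0    # pass the defining class to the C function
CCALL_SLICE_SELF = 1 << 1  # peel args[0] off the stack and pass it as self

class CCallDef:
    def __init__(self, func, flags, objclass=None):
        self.cc_func = func       # the underlying "C" function
        self.cc_flags = flags
        self.cc_parent = objclass

def ccall(defn, args):
    """Mediating layer: interprets the flags so cc_func doesn't have to."""
    self_arg = None
    if defn.cc_flags & CCALL_SLICE_SELF:
        self_arg, args = args[0], args[1:]
    if defn.cc_flags & CCALL_OBJCLASS:
        return defn.cc_func(defn.cc_parent, self_arg, args)
    return defn.cc_func(self_arg, args)

# An unbound-method-like callable: self is taken from the first argument.
def upper_impl(self_arg, args):
    return self_arg.upper()

d = CCallDef(upper_impl, CCALL_SLICE_SELF)
print(ccall(d, ("hello",)))   # HELLO

# Same, but the function also receives its defining class.
def describe_impl(cls, self_arg, args):
    return f"<{cls.__name__}: {self_arg}>"

d2 = CCallDef(describe_impl, CCALL_SLICE_SELF | CCALL_OBJCLASS, objclass=str)
print(ccall(d2, ("hello",)))  # <str: hello>
```

This also illustrates Jeroen's point that the flags are handled by the mediating layer: `upper_impl` and `describe_impl` never inspect them.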
From songofacandy at gmail.com Wed Jun 20 11:42:07 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 21 Jun 2018 00:42:07 +0900 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 Message-ID: Hi, All. First of all, thank you Jeroen for writing nice PEPs. When I read PEP 579, I think "6. METH_FASTCALL is private and undocumented" should be solved first. I don't have any ideas for changing METH_FASTCALL further. If Victor and Serhiy think so, and PyPy maintainers like it too, I want to make it public as soon as possible. _PyObject_FastCall* APIs are private in Python 3.7. But METH_FASTCALL is not completely private (it starts without an underscore, but is not documented). Can we declare it public and stable by documenting it, if Ned allows? It's used widely in Python internals already. I suppose that making it public wouldn't make Python 3.7 much less stable. If we can't at Python 3.7, I think we should do it at 3.8. Regards, -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Wed Jun 20 12:09:00 2018 From: vstinner at redhat.com (Victor Stinner) Date: Wed, 20 Jun 2018 18:09:00 +0200 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: References: Message-ID: Hi, I chose to make it private because I wasn't sure about the API. I was right: Serhiy removed keyword arguments from METH_FASTCALL; you now have to use METH_FASTCALL | METH_KEYWORDS to also pass keyword arguments ;-) I don't recall if this change was done in 3.7 or also in 3.6. FASTCALL was introduced in 3.6, if I recall correctly. I didn't write much documentation about FASTCALL, only 2 articles. The most interesting one: https://vstinner.github.io/fastcall-microbenchmarks.html METH_FASTCALL is already used by Cython, so I'm not sure that it's fully private :-) > _PyObject_FastCall* APIs are private in Python 3.7.
I'm not sure that it's worth it to make these functions public, they are already used internally, when using PyObject_CallFunction() for example. And it may be painful for PyPy (and others) to have to explain all these new functions. > If we can't at Python 3.7, I think we should do it at 3.8. What's the rationale to make it public in 3.7? Can't it wait for 3.8? The new PEPs target 3.8 anyway, no? IMHO it's too late for 3.7. Victor 2018-06-20 17:42 GMT+02:00 INADA Naoki : > Hi, All. > > First of all, thank you Jeroen for writing nice PEPs. > > When I read PEP 579, I think "6. METH_FASTCALL is private and undocumented" > should be solved first. > > I don't have any idea about changing METH_FASTCALL more. > If Victor and Serhiy think so, and PyPy maintainers like it too, I want to > make it public > as soon as possible. > > _PyObject_FastCall* APIs are private in Python 3.7. > But METH_FASTCALL is not completely private (start without underscore, > but not documented) > Can we call it as public, stable by adding document, if Ned allows? > > It's used widely in Python internals already. I suppose that making it > public > doesn't make Python 3.7 unstable much. > > If we can't at Python 3.7, I think we should do it at 3.8. > > Regards, > -- > INADA Naoki From solipsis at pitrou.net Wed Jun 20 12:16:42 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 20 Jun 2018 18:16:42 +0200 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 References: Message-ID: <20180620181642.31775c88@fsol> On Wed, 20 Jun 2018 18:09:00 +0200 Victor Stinner wrote: > > > If we can't at Python 3.7, I think we should do it at 3.8. > > What's the rationale to make it public in 3.7? Can't it wait for 3.8? > The new PEPs target 3.8 anyway, no? > > IMHO it's too late for 3.7. Agreed with Victor. Also Jeroen's work might lead us to change the protocol for better flexibility or performance. Let's not make it a public API too early. Regards Antoine. 
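For reference, the calling conventions being debated pass arguments as a plain C array instead of a tuple. With METH_FASTCALL | METH_KEYWORDS, the keyword values are appended after the positional arguments in that array, and the keyword names arrive separately as a kwnames tuple. A small Python adapter mimicking that packing (illustrative only; in C the function receives `self`, `args`, `nargs`, and `kwnames` directly):

```python
# Model of how a METH_FASTCALL | METH_KEYWORDS caller packs arguments:
# args holds the positional values first, then the keyword values;
# kwnames is a tuple of the keyword names, in the same order.
def fastcall_with_keywords(func, self, args, nargs, kwnames):
    positional = args[:nargs]
    kwargs = dict(zip(kwnames, args[nargs:]))
    return func(self, *positional, **kwargs)

def greet(self, name, punct="!"):
    return f"{self}{name}{punct}"

# Stack layout: ["Bob", "?"] with nargs=1 and kwnames=("punct",)
# corresponds to the call greet("Hi ", "Bob", punct="?").
print(fastcall_with_keywords(greet, "Hi ", ["Bob", "?"], 1, ("punct",)))
# -> Hi Bob?
```

The asymmetry between the positional-only case (just an array and a count) and the keyword case (an extra tuple to match against parameter names) is part of why Serhiy is comfortable making only the positional protocol public.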
From songofacandy at gmail.com Wed Jun 20 12:48:23 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 21 Jun 2018 01:48:23 +0900 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: <20180620181642.31775c88@fsol> References: <20180620181642.31775c88@fsol> Message-ID: On Thu, 21 Jun 2018 at 1:17, Antoine Pitrou wrote: > On Wed, 20 Jun 2018 18:09:00 +0200 > Victor Stinner wrote: > > > > > If we can't at Python 3.7, I think we should do it at 3.8. > > > > What's the rationale to make it public in 3.7? Can't it wait for 3.8? > > The new PEPs target 3.8 anyway, no? > > > > IMHO it's too late for 3.7. > > Agreed with Victor. Also Jeroen's work might lead us to change the > protocol for better flexibility or performance. Unless libraries are written with METH_FASTCALL (or using Cython), tp_ccall can't bring any gain for 3rd party functions written in C. In other words, if many libraries start supporting FASTCALL, tp_ccall will have more gain by the time Python 3.8 is released. > Let's not make it a public API too early. Ok. Even though it's private in 3.7, extension authors can start using it at their own risk if we decide METH_FASTCALL is public in 3.8 without any change from 3.7. -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Wed Jun 20 12:56:40 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Wed, 20 Jun 2018 19:56:40 +0300 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: References: Message-ID: On 20.06.18 18:42, INADA Naoki wrote: > First of all, thank you Jeroen for writing nice PEPs. > > When I read PEP 579, I think "6. METH_FASTCALL is private and undocumented" > should be solved first. > > I don't have any idea about changing METH_FASTCALL more. > If Victor and Serhiy think so, and PyPy maintainers like it too, I want > to make it public > as soon as possible.
I don't have objections against making the METH_FASTCALL method calling convention public, but only for positional-only parameters; the protocol for keyword parameters is more complex and can still change. We should also provide APIs for calling functions using this protocol (_PyObject_FastCall) and for parsing arguments (_PyArg_ParseStack). We may want to bikeshed names and the order of arguments for them. > It's used widely in Python internals already. I suppose that making it > public > doesn't make Python 3.7 unstable much. > > If we can't at Python 3.7, I think we should do it at 3.8. It is too late for 3.7 in any case. From songofacandy at gmail.com Wed Jun 20 13:19:19 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 21 Jun 2018 02:19:19 +0900 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: References: Message-ID: On Thu, 21 Jun 2018 at 1:59, Serhiy Storchaka wrote: > On 20.06.18 18:42, INADA Naoki wrote: > > First of all, thank you Jeroen for writing nice PEPs. > > > > When I read PEP 579, I think "6. METH_FASTCALL is private and > undocumented" > > should be solved first. > > > > I don't have any idea about changing METH_FASTCALL more. > > If Victor and Serhiy think so, and PyPy maintainers like it too, I want > > to make it public > > as soon as possible. > > I don't have objections against making the METH_FASTCALL method calling > convention public. But only for positional-only parameters, the protocol > for keyword parameters is more complex and still can be changed. > > We should also provide APIs for calling functions using this protocol > (_PyObject_FastCall) and for parsing arguments (_PyArg_ParseStack). We > may want to bikeshed names and the order of arguments for them. > The calling API can be determined later. Even without the API, methods can be called faster from Python core. But for the parsing API, you're right. It should be public with METH_FASTCALL.
Only positional arguments can be received without it. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan_ml at behnel.de Wed Jun 20 13:45:16 2018 From: stefan_ml at behnel.de (Stefan Behnel) Date: Wed, 20 Jun 2018 19:45:16 +0200 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: References: Message-ID: Serhiy Storchaka schrieb am 20.06.2018 um 18:56: > 20.06.18 18:42, INADA Naoki ????: >> I don't have any idea about changing METH_FASTCALL more. >> If Victor and?Serhiy think so, and PyPy maintainers like it too, I want >> to make it public as soon as possible. > > I don't have objections against making the METH_FASTCALL method calling > convention public. But only for positional-only parameters, the protocol > for keyword parameters is more complex and still can be changed. That's also the level that Cython currently uses/supports, exactly because keyword arguments are a) quite a bit more complex, b) a lot less often used and c) pretty much never used in performance critical code. Cython also currently limits the usage to Py3.6+, although I'm considering to generally enable it for everything since Py2.6 as soon as Cython starts using the calling convention for its own functions, just in case it ends up calling itself without prior notice. :) Stefan From brett at python.org Wed Jun 20 14:35:53 2018 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jun 2018 11:35:53 -0700 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: References: <20180620181642.31775c88@fsol> Message-ID: On Wed, 20 Jun 2018 at 09:49 INADA Naoki wrote: > > 2018?6?21?(?) 1:17 Antoine Pitrou : > >> On Wed, 20 Jun 2018 18:09:00 +0200 >> Victor Stinner wrote: >> > >> > > If we can't at Python 3.7, I think we should do it at 3.8. >> > >> > What's the rationale to make it public in 3.7? Can't it wait for 3.8? >> > The new PEPs target 3.8 anyway, no? 
>> > >> > IMHO it's too late for 3.7. >> >> Agreed with Victor. Also Jeroen's work might lead us to change the >> protocol for better flexibility or performance. > > > Unless libraries are written with METH_FASTCALL (or using Cython), > tp_ccall can't have any gain for 3rd party functions written in C. > > In other words, if many libraries start supporting FASTCALL, tp_ccall will > have more gain at the time when Python 3.8 is released. > > Let's not make it a >> public API too early. >> > > Ok. > > Even though it's private at 3.7, extension authors can start using it at > their risk if we decide METH_FASTCALL is public in 3.8 without any change > from 3.7. > People can still wait for 3.8. Waiting 1.5 years for a feature is nothing when the software you're talking about is already 28 years. :) It's simply not worth the risk. Or you can push for Lukasz's idea of doing annual releases and speed it up a little. ;) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Wed Jun 20 17:06:16 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 20 Jun 2018 14:06:16 -0700 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information Message-ID: I have reviewed PEP 561 and I intend to accept it some time next week, unless significant discussion happens between now and then. The latest version of the PEP can be found at https://www.python.org/dev/peps/pep-0561/ PEP 561 solves a big problem for users of static type checkers like mypy and Pyre (as well as pytype and PyCharm): how to scale the creation of stubs (files with just type annotations, with a .pyi extension). IMO Ethan Smith has done a great job both coming up with and revising the design, and crafting an implementation -- most of PEP 561 is already supported by mypy. 
It's been a while since a copy of the PEP was posted to python-dev (
https://mail.python.org/pipermail/python-dev/2018-April/152694.html), but
only a few things changed since then, so I'll just link to the change
history: https://github.com/python/peps/commits/master/pep-0561.rst. Only
the last two commits are new since the last posting: support for partial
packages and a bunch of small textual tweaks I found today while reviewing.

There wasn't a lot of feedback then so I don't expect a flamewar today,
but better late than never. ;-)

-- 
--Guido van Rossum (python.org/~guido)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From songofacandy at gmail.com  Wed Jun 20 21:01:19 2018
From: songofacandy at gmail.com (INADA Naoki)
Date: Thu, 21 Jun 2018 10:01:19 +0900
Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579
In-Reply-To: References: <20180620181642.31775c88@fsol>
Message-ID:

>
>> Even though it's private at 3.7, extension authors can start using it at
>> their risk if we decide METH_FASTCALL is public in 3.8 without any change
>> from 3.7.
>>
>
> People can still wait for 3.8. Waiting 1.5 years for a feature is nothing
> when the software you're talking about is already 28 years. :) It's simply
> not worth the risk.
>

Of course. My idea is providing information to "early adopters" who write
C extensions manually.

PEP 580 is trying to expand METH_FASTCALL to custom function types in
third-party libraries written with tools like Cython. But METH_FASTCALL
cannot be used widely even for normal function types in third-party
libraries yet.

Without making METH_FASTCALL public, PEP 580 is useful only for libraries
using private APIs. That's unhealthy.

So I think we should discuss making METH_FASTCALL public before evaluating
PEP 580. That's my main point, and the "from 3.7" part is just a bonus,
sorry.

-- INADA Naoki
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From J.Demeyer at UGent.be Thu Jun 21 01:57:49 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 21 Jun 2018 07:57:49 +0200 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: <2c917f5c615843c0bc7cce26e026fb2f@xmail101.UGent.be> References: <2c917f5c615843c0bc7cce26e026fb2f@xmail101.UGent.be> Message-ID: <5B2B3E5D.20009@UGent.be> On 2018-06-20 17:42, INADA Naoki wrote: > I don't have any idea about changing METH_FASTCALL more. > If Victor and Serhiy think so, and PyPy maintainers like it too, I want > to make it public > as soon as possible. There are two different things here: The first is documenting METH_FASTCALL such that everybody can create built-in functions using the METH_FASTCALL signature. I think that the API for METH_FASTCALL (without or with METH_KEYWORDS) is fine, so I support making it public. This is really just a documentation issue, so I see no reason why it couldn't be added to 3.7.0 if we're fast. The API for calling functions using the FASTCALL convention is more of a mess though. There are functions taking keyword arguments as dict and functions taking them as tuple. As I mentioned in PEP 580, I'd like to merge these and simply allow either a dict or a tuple. Since this would require an API change, this won't be for 3.7.0. Jeroen. From songofacandy at gmail.com Thu Jun 21 04:25:40 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 21 Jun 2018 17:25:40 +0900 Subject: [Python-Dev] Can we make METH_FASTCALL public, from Python 3.7? (ref: PEP 579 In-Reply-To: <5B2B3E5D.20009@UGent.be> References: <2c917f5c615843c0bc7cce26e026fb2f@xmail101.UGent.be> <5B2B3E5D.20009@UGent.be> Message-ID: On Thu, Jun 21, 2018 at 2:57 PM Jeroen Demeyer wrote: > On 2018-06-20 17:42, INADA Naoki wrote: > > I don't have any idea about changing METH_FASTCALL more. > > If Victor and Serhiy think so, and PyPy maintainers like it too, I want > > to make it public > > as soon as possible. 
>
> There are two different things here:
>
> The first is documenting METH_FASTCALL such that everybody can create
> built-in functions using the METH_FASTCALL signature. I think that the
> API for METH_FASTCALL (without or with METH_KEYWORDS) is fine, so I
> support making it public. This is really just a documentation issue, so
> I see no reason why it couldn't be added to 3.7.0 if we're fast.
>

As Serhiy noted, the argument parsing API (_PyArg_ParseStack) is not
public either. So METH_FASTCALL is incomplete for pure C extension
authors even if it's documented.

So I don't have a strong opinion on documenting it for 3.7. Consensus
about not changing it (without METH_KEYWORDS) in 3.8 seems enough to me
(and Cython).

Then, the _PyArg_ParseStack API should be considered first for making it
public in Python 3.8.

(bikeshedding: The name *Stack* doesn't feel right. It implies the Python
VM stack, but this API can be used with more than the VM stack.)

> The API for calling functions using the FASTCALL convention is more of a
> mess though. There are functions taking keyword arguments as dict and
> functions taking them as tuple. As I mentioned in PEP 580, I'd like to
> merge these and simply allow either a dict or a tuple. Since this would
> require an API change, this won't be for 3.7.0.
>

I like the proposed API too. But I think we should focus on METH_FASTCALL
without METH_KEYWORDS first. Making _PyObject_FastCall() public is a
significant step for 3.8.

Regards,
-- INADA Naoki
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From vstinner at redhat.com Thu Jun 21 05:22:38 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 21 Jun 2018 11:22:38 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods In-Reply-To: <5B2A15FE.4000608@UGent.be> References: <5B2A15FE.4000608@UGent.be> Message-ID: https://www.python.org/dev/peps/pep-0580/#the-c-call-protocol CCALL_VARARGS: cc_func(PyObject *self, PyObject *args) If we add a new calling convention, I would prefer to reuse the FASTCALL calling convention to pass arguments: pass a PyObject **args array with a size (Py_ssize_t nargs) rather than requiring to create a temporary tuple object to pass positional arguments. Victor 2018-06-20 10:53 GMT+02:00 Jeroen Demeyer : > Hello, > > Let me present PEP 579 and PEP 580. > > PEP 579 is an informational meta-PEP, listing some of the issues with > functions/methods implemented in C. The idea is to create several PEPs each > fix some part of the issues mentioned in PEP 579. > > PEP 580 is a standards track PEP to introduce a new "C call" protocol, which > is an important part of PEP 579. In the reference implementation (which is > work in progress), this protocol will be used by built-in functions and > methods. However, it should be used by more classes in the future. > > You find the texts at > https://www.python.org/dev/peps/pep-0579 > https://www.python.org/dev/peps/pep-0580 > > > Jeroen. 
> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From J.Demeyer at UGent.be Thu Jun 21 05:32:38 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 21 Jun 2018 11:32:38 +0200 Subject: [Python-Dev] PEP 579 and PEP 580: refactoring C functions and methods In-Reply-To: <346097a6425d47cab19e47857d2f01a9@xmail101.UGent.be> References: <5B2A15FE.4000608@UGent.be> <346097a6425d47cab19e47857d2f01a9@xmail101.UGent.be> Message-ID: <5B2B70B6.5010902@UGent.be> On 2018-06-21 11:22, Victor Stinner wrote: > https://www.python.org/dev/peps/pep-0580/#the-c-call-protocol > > CCALL_VARARGS: cc_func(PyObject *self, PyObject *args) > > If we add a new calling convention This is not a *new* calling convention, it's the *existing* calling convention for METH_VARARGS. Obviously, we need to continue to support that. From J.Demeyer at UGent.be Thu Jun 21 07:25:19 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 21 Jun 2018 13:25:19 +0200 Subject: [Python-Dev] About [].append == [].append Message-ID: <5B2B8B1F.5070605@UGent.be> Currently, we have: >>> [].append == [].append False However, with a Python class: >>> class List(list): ... def append(self, x): super().append(x) >>> List().append == List().append True In the former case, __self__ is compared using "is" and in the latter case, it is compared using "==". I think that comparing using "==" is the right thing to do because "is" is really an implementation detail. Consider >>> (10000).bit_length == (10000).bit_length True >>> (10000).bit_length == (10000+0).bit_length False I guess that's also the reason why CPython internally rarely uses "is" for comparisons. See also: - https://bugs.python.org/issue1617161 - https://bugs.python.org/issue33925 Any opinions? Jeroen. 
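[The behaviour Jeroen describes is easy to poke at interactively. The sketch below is an illustration added for this archive, not part of the original thread; it asserts only the facts that are stable across CPython releases, because the cross-instance comparison for Python-defined classes is exactly the contested behaviour.]

```python
# Probing the bound-method comparison semantics discussed above.
# Only version-stable facts are asserted.

class List(list):
    def append(self, x):
        super().append(x)

x = []
y = List()

# Each attribute access creates a *new* bound-method object...
assert x.append is not x.append
assert y.append is not y.append

# ...yet methods bound to the same object still compare equal.
assert x.append == x.append
assert y.append == y.append

# Builtin methods bound to two distinct (equal-valued) lists are unequal.
assert [].append != [].append
```

Whether `List().append == List().append` (two *different* but equal instances) should be True or False is the open question of the thread, so it is deliberately left out of the assertions.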
From solipsis at pitrou.net  Thu Jun 21 07:30:02 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 21 Jun 2018 13:30:02 +0200
Subject: [Python-Dev] About [].append == [].append
References: <5B2B8B1F.5070605@UGent.be>
Message-ID: <20180621133002.4ff1be31@fsol>

On Thu, 21 Jun 2018 13:25:19 +0200
Jeroen Demeyer wrote:
> Currently, we have:
>
> >>> [].append == [].append
> False
>
> However, with a Python class:
>
> >>> class List(list):
> ...     def append(self, x): super().append(x)
> >>> List().append == List().append
> True
>
> In the former case, __self__ is compared using "is" and in the latter
> case, it is compared using "==".
>
> I think that comparing using "==" is the right thing to do because "is"
> is really an implementation detail.

Probably... though comparing bound methods doesn't sound terribly
useful, so I'm not sure how much of an issue this is in practice.

Regards

Antoine.

From vano at mail.mipt.ru  Thu Jun 21 07:33:27 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Thu, 21 Jun 2018 14:33:27 +0300
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <5B2B8B1F.5070605@UGent.be>
References: <5B2B8B1F.5070605@UGent.be>
Message-ID: <8358f59d-85ac-f819-6915-66315068998a@mail.mipt.ru>

First, tell us what problem you're solving.

Strictly speaking, bound methods don't have an unambiguous notion of
equality: are they equal if they do the same thing, or if they do the
same thing _on the same object_?

The result that you're seeing is a consequence of that same dichotomy in
the minds of the .__eq__ designers, and Python Zen advises "In the face
of ambiguity, refuse the temptation to guess." -- which is what you're
suggesting.

On 21.06.2018 14:25, Jeroen Demeyer wrote:
> Currently, we have:
>
> >>> [].append == [].append
> False
>
> However, with a Python class:
>
> >>> class List(list):
> ...     
def append(self, x): super().append(x) > >>> List().append == List().append > True > > In the former case, __self__ is compared using "is" and in the latter > case, it is compared using "==". > > I think that comparing using "==" is the right thing to do because > "is" is really an implementation detail. Consider > > >>> (10000).bit_length == (10000).bit_length > True > >>> (10000).bit_length == (10000+0).bit_length > False > > I guess that's also the reason why CPython internally rarely uses "is" > for comparisons. > > See also: > - https://bugs.python.org/issue1617161 > - https://bugs.python.org/issue33925 > > Any opinions? > > > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From J.Demeyer at UGent.be Thu Jun 21 07:53:53 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Thu, 21 Jun 2018 13:53:53 +0200 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> Message-ID: <5B2B91D1.1040105@UGent.be> On 2018-06-21 13:33, Ivan Pozdeev via Python-Dev wrote: > First, tell us what problem you're solving. There is no specific problem I want to solve here. I just noticed an inconsistency and I wondered if it would be OK to change the implementation of comparisons of builtin_function_or_method instances. It's a valid question to ask even if it doesn't solve an actual problem. 
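[Even without a concrete problem behind the question, method equality is load-bearing in at least one common pattern: callback registries, where `list.remove()` relies on `==` to find a previously registered bound method even though every `obj.method` access yields a fresh method object. A minimal sketch, added here for illustration — the `Emitter`/`Collector` names are invented, not taken from the thread:]

```python
class Emitter:
    """Invented example: a tiny signal/slot-style callback registry."""

    def __init__(self):
        self._callbacks = []

    def connect(self, cb):
        self._callbacks.append(cb)

    def disconnect(self, cb):
        # list.remove() locates the callback via __eq__, so this works
        # even though obj.method builds a fresh bound-method object
        # each time it is accessed.
        self._callbacks.remove(cb)

    def fire(self, value):
        for cb in list(self._callbacks):
            cb(value)


class Collector:
    def __init__(self):
        self.seen = []

    def on_event(self, value):
        self.seen.append(value)


emitter = Emitter()
collector = Collector()
emitter.connect(collector.on_event)     # one bound-method object...
emitter.fire(1)
emitter.disconnect(collector.on_event)  # ...found again via ==
emitter.fire(2)
assert collector.seen == [1]
```

The deregistration only succeeds because two method objects bound to the same instance compare equal; making methods incomparable would silently break this style of API.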
From solipsis at pitrou.net  Thu Jun 21 07:58:50 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 21 Jun 2018 13:58:50 +0200
Subject: [Python-Dev] About [].append == [].append
References: <5B2B8B1F.5070605@UGent.be> <5B2B91D1.1040105@UGent.be>
Message-ID: <20180621135850.0192f809@fsol>

On Thu, 21 Jun 2018 13:53:53 +0200
Jeroen Demeyer wrote:
> On 2018-06-21 13:33, Ivan Pozdeev via Python-Dev wrote:
> > First, tell us what problem you're solving.
>
> There is no specific problem I want to solve here. I just noticed an
> inconsistency and I wondered if it would be OK to change the
> implementation of comparisons of builtin_function_or_method instances.

I think it's ok.

Regards

Antoine.

From steve at pearwood.info  Thu Jun 21 09:39:56 2018
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 21 Jun 2018 23:39:56 +1000
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <8358f59d-85ac-f819-6915-66315068998a@mail.mipt.ru>
References: <5B2B8B1F.5070605@UGent.be>
 <8358f59d-85ac-f819-6915-66315068998a@mail.mipt.ru>
Message-ID: <20180621133956.GC14437@ando.pearwood.info>

On Thu, Jun 21, 2018 at 02:33:27PM +0300, Ivan Pozdeev via Python-Dev wrote:

> First, tell us what problem you're solving.

You might not be aware of the context of Jeroen's question. He is the
author of PEP 579 and 580, so I expect he's looking into implementation
details of the CPython builtin functions and methods.

> Strictly speaking, bound methods don't have an unambiguous notion of
> equality:
>
> are they equal if they do the same thing, or if they do the same thing
> _on the same object_?

That's a red herring, because CPython already defines an unambiguous
notion of method equality. The problem is that the notion depends on
whether the method is written in Python or not, and that seems like a
needless difference.
> The result that you're seeing is a consequence of that same dichotomy in
> the minds of the .__eq__ designers, and Python Zen advises "In the face
> of ambiguity, refuse the temptation to guess." -- which is what you're
> suggesting.

How do you come to that conclusion? If "refuse the temptation to guess"
applied here, we couldn't do this:

py> "".upper == "".upper
True

(by your reasoning, it should raise an exception).

Note the contrast in treatment of strings with:

py> [].append == [].append
False

(The reason is that "" is cached and reused, and the empty list is not.)

> On 21.06.2018 14:25, Jeroen Demeyer wrote:
[...]

> >I think that comparing using "==" is the right thing to do because
> >"is" is really an implementation detail.

+1

-- 
Steve

From songofacandy at gmail.com  Thu Jun 21 10:16:26 2018
From: songofacandy at gmail.com (INADA Naoki)
Date: Thu, 21 Jun 2018 23:16:26 +0900
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <5B2B8B1F.5070605@UGent.be>
References: <5B2B8B1F.5070605@UGent.be>
Message-ID:

2018年6月21日(木) 20:27 Jeroen Demeyer :

> Currently, we have:
>
> >>> [].append == [].append
> False
>
> However, with a Python class:
>
> >>> class List(list):
> ... def append(self, x): super().append(x)
> >>> List().append == List().append
> True
>
> In the former case, __self__ is compared using "is" and in the latter
> case, it is compared using "==".
>
> I think that comparing using "==" is the right thing to do because "is"
> is really an implementation detail.

I think "is" is correct because "bound to which object" is essential for
bound (instance) methods.

Consider
>
> >>> (10000).bit_length == (10000).bit_length
> True
> >>> (10000).bit_length == (10000+0).bit_length
> False
>

I'm OK with this difference. This comparison is what people shouldn't do,
like 'id(10000) == id(10000+0)'.

> I guess that's also the reason why CPython internally rarely uses "is"
> > See also: > - https://bugs.python.org/issue1617161 > - https://bugs.python.org/issue33925 > > Any opinions? > I think changing this may break some tricky code. Is it really worth enough to change? > > > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jake at lwn.net Thu Jun 21 11:18:30 2018 From: jake at lwn.net (Jake Edge) Date: Thu, 21 Jun 2018 09:18:30 -0600 Subject: [Python-Dev] 2018 Python Language Summit coverage, last part Message-ID: <20180621091830.7900f7fd@redtail> Hola python-dev, Here are the last three sessions from the Python Language Summit that I wrote up. Unfortunately, I did not write up the Lightning Talks, which were numerous and interesting, mostly because of time pressure. As usual, I am posting SubscriberLinks for articles that are still behind the paywall. LWN subscribers can always see our content right away; one week after they are published in a weekly edition, they become freely available for everyone. SubscriberLinks are a way around the paywall. Please feel free to share the SubscriberLinks I am posting here. The starting point is here: https://lwn.net/Articles/754152/ That is an overview article with links to the individual articles. Here are the articles since my post on Monday (which is here: https://lwn.net/ml/python-dev/20180618112251.5f936617%40gallinule/ ) PEP 572 and decision-making in Python https://lwn.net/SubscriberLink/757713/2118c7722d957926/ Getting along in the Python community https://lwn.net/SubscriberLink/757714/8dadb362ff7d5673/ Mentoring and diversity for Python https://lwn.net/SubscriberLink/757715/d6cf8c9f72e4bdd8/ enjoy! 
jake

-- 
Jake Edge - LWN - jake at lwn.net - https://lwn.net

From tismer at stackless.com  Thu Jun 21 11:06:07 2018
From: tismer at stackless.com (Christian Tismer)
Date: Thu, 21 Jun 2018 17:06:07 +0200
Subject: [Python-Dev] PySequence_Check but no __len__
Message-ID: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>

Hi friends,

there is a case in the Python API where I am not sure what to do:

If an object defines __getitem__() only but no __len__(),
then PySequence_Check() already is true and does not care.

So if I define no __len__, it simply fails. Is this intended?

I was misled and thought this was the unlimited case, but
it seems still to be true that sequences are always finite.

Can someone please enlighten me?
-- 
Christian Tismer-Sperling    :^)   tismer at stackless.com
Software Consulting          :     http://www.stackless.com/
Karl-Liebknecht-Str. 121     :     http://pyside.org
14482 Potsdam                :     GPG key -> 0xE7301150FB7BEE0E
phone +49 173 24 18 776  fax +49 (30) 700143-0023
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 522 bytes
Desc: OpenPGP digital signature
URL: 

From vano at mail.mipt.ru  Thu Jun 21 11:58:05 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Thu, 21 Jun 2018 18:58:05 +0300
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <20180621133956.GC14437@ando.pearwood.info>
References: <5B2B8B1F.5070605@UGent.be>
 <8358f59d-85ac-f819-6915-66315068998a@mail.mipt.ru>
 <20180621133956.GC14437@ando.pearwood.info>
Message-ID: <152b175d-4abd-4597-7f05-fd39320880c0@mail.mipt.ru>

On 21.06.2018 16:39, Steven D'Aprano wrote:
> On Thu, Jun 21, 2018 at 02:33:27PM +0300, Ivan Pozdeev via Python-Dev wrote:
>
>> First, tell us what problem you're solving.
> You might not be aware of the context of Jeroen's question. He is the
> author of PEP 579 and 580, so I expect he's looking into implementation
> details of the CPython builtin functions and methods.
I see.
`Objects/classobject.c:method_richcompare' compares .im_func and .im_self.
Bound builtin methods should do the same, obviously -- preferably, even
use the same code.

>
> https://www.python.org/dev/peps/pep-0579/
>
> https://www.python.org/dev/peps/pep-0580/
>
>
>> Strictly speaking, bound methods don't have an unambiguous notion of
>> equality:
>>
>> are they equal if they do the same thing, or if they do the same thing
>> _on the same object_?
> That's a red herring, because CPython already defines an unambiguous
> notion of method equality. The problem is that the notion depends on
> whether the method is written in Python or not, and that seems like a
> needless difference.
>
>> The result that you're seeing is a consequence of that same dichotomy in
>> the minds of the .__eq__ designers, and Python Zen advises "In the face
>> of ambiguity, refuse the temptation to guess." -- which is what you're
>> suggesting.
> How do you come to that conclusion? If "refuse the temptation to guess"
> applied here, we couldn't do this:
>
> py> "".upper == "".upper
> True
>
> (by your reasoning, it should raise an exception).
>
> Note the contrast in treatment of strings with:
>
> py> [].append == [].append
> False
>
> (The reason is that "" is cached and reused, and the empty list is
> not.)
>
>> On 21.06.2018 14:25, Jeroen Demeyer wrote:
> [...]
>
>>> I think that comparing using "==" is the right thing to do because
>>> "is" is really an implementation detail.
> +1
>
> -- Steve

From solipsis at pitrou.net  Thu Jun 21 12:26:17 2018
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 21 Jun 2018 18:26:17 +0200
Subject: [Python-Dev] 2018 Python Language Summit coverage, last part
References: <20180621091830.7900f7fd@redtail>
Message-ID: <20180621182617.45f987c4@fsol>

Hi Jake,

On Thu, 21 Jun 2018 09:18:30 -0600
Jake Edge wrote:
>
> The starting point is here: https://lwn.net/Articles/754152/ That is
> an overview article with links to the individual articles.
Here are > the articles since my post on Monday (which is here: > https://lwn.net/ml/python-dev/20180618112251.5f936617%40gallinule/ ) > > PEP 572 and decision-making in Python > https://lwn.net/SubscriberLink/757713/2118c7722d957926/ This is an excellent writeup, thank you. The discussion seems to have been constructive, too. I would just like to approve the following points: > Part of the problem is that there is no real way to measure the > effectiveness of new language features. Indeed. But, for a syntax addition such as PEP 572, I think it would be a good idea to ask their opinion to teaching/education specialists. As far as I'm concerned, if teachers and/or education specialists were to say PEP 572 is not a problem, my position would shift from negative towards neutral. > Van Rossum wondered if the opposition only got heated up after > he got involved; people may not have taken it seriously earlier because > they thought he would not go for it. That's very true. For me, at the start, the assignment operator proposal was one of those many python-ideas threads about new syntax that never get any wide approval, so I didn't bother about it. Regards Antoine. From brett at python.org Thu Jun 21 12:29:52 2018 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jun 2018 09:29:52 -0700 Subject: [Python-Dev] PySequence_Check but no __len__ In-Reply-To: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com> References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com> Message-ID: Sorry, I don't quite follow. On Thu, 21 Jun 2018 at 08:50 Christian Tismer wrote: > Hi friends, > > there is a case in the Python API where I am not sure what to do: > > If an object defines __getitem__() only but no __len__(), > then PySequence_Check() already is true and does not care. > Which matches https://docs.python.org/3/c-api/sequence.html#c.PySequence_Check . 
From Objects/abstract.c:

int
PySequence_Check(PyObject *s)
{
    if (PyDict_Check(s))
        return 0;
    return s != NULL && s->ob_type->tp_as_sequence &&
        s->ob_type->tp_as_sequence->sq_item != NULL;
}

>
> So if I define no __len__, it simply fails. Is this intended?
>

What is "it" in this case that is failing? It isn't PySequence_Check()
so I'm not sure what the issue is.

-Brett

>
> I was misled and thought this was the unlimited case, but
> it seems still to be true that sequences are always finite.
>
> Can someone please enlighten me?
> -- 
> Christian Tismer-Sperling    :^)   tismer at stackless.com
> Software Consulting          :     http://www.stackless.com/
> Karl-Liebknecht-Str. 121     :     http://pyside.org
> 14482 Potsdam                :     GPG key -> 0xE7301150FB7BEE0E
> phone +49 173 24 18 776  fax +49 (30) 700143-0023
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/brett%40python.org
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From storchaka at gmail.com  Thu Jun 21 15:53:02 2018
From: storchaka at gmail.com (Serhiy Storchaka)
Date: Thu, 21 Jun 2018 22:53:02 +0300
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <5B2B8B1F.5070605@UGent.be>
References: <5B2B8B1F.5070605@UGent.be>
Message-ID:

21.06.18 14:25, Jeroen Demeyer пише:
> Currently, we have:
>
> >>> [].append == [].append
> False
>
> However, with a Python class:
>
> >>> class List(list):
> ....     def append(self, x): super().append(x)
> >>> List().append == List().append
> True

I think this is a bug. These bound methods can't be equal because they
have different side effects.

The use case for using "is" for __self__ is described by the OP of
issue1617161. I don't know use cases for using "==".

There is a related problem of hashing.
Currently bound methods are not hashable if __self__ is not hashable. This makes impossible using them as dict keys. From gvanrossum at gmail.com Thu Jun 21 16:40:27 2018 From: gvanrossum at gmail.com (Guido van Rossum) Date: Thu, 21 Jun 2018 13:40:27 -0700 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> Message-ID: I'm with Serhiy here, for mutable values I don't think the methods should compare equal, even when the values do. For immutables I don't care either way, it's an implementation detail. On Thu, Jun 21, 2018, 12:55 Serhiy Storchaka wrote: > 21.06.18 14:25, Jeroen Demeyer ????: > > Currently, we have: > > > > >>> [].append == [].append > > False > > > > However, with a Python class: > > > > >>> class List(list): > > .... def append(self, x): super().append(x) > > >>> List().append == List().append > > True > > I think this is a bug. These bound methods can't be equal because they > have different side effect. > > The use case for using "is" for __self__ is described by the OP of > issue1617161. I don't know use cases for using "==". > > There is a related problem of hashing. Currently > bound methods are not hashable if __self__ is not hashable. This makes > impossible using them as dict keys. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vano at mail.mipt.ru Thu Jun 21 17:04:55 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Fri, 22 Jun 2018 00:04:55 +0300 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> Message-ID: <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> On 21.06.2018 23:40, Guido van Rossum wrote: > I'm with Serhiy here, for mutable values I don't think the methods > should compare equal, even when the values do. For immutables I don't > care either way, it's an implementation detail. > In this light, methods rather shouldn't have a rich comparison logic at all -- at the very least, until we have a realistic use case and can flesh out the requirements for it. In my previous message, I meant that if they do have that logic, the right way is what `method_richcompare' does. And that was apparently what the method's author (that you might be familiar with) was thinking in https://github.com/python/cpython/commit/47b9ff6ba11fab4c90556357c437cb4feec1e853 -- and even then and there, they were hesitant about the feature's usefulness. But Serhiy has just disproven that that is the right way which looks like the final nail into its coffin. > On Thu, Jun 21, 2018, 12:55 Serhiy Storchaka > wrote: > > 21.06.18 14:25, Jeroen Demeyer ????: > > Currently, we have: > > > >? >>> [].append == [].append > > False > > > > However, with a Python class: > > > >? >>> class List(list): > > ....???? def append(self, x): super().append(x) > >? >>> List().append == List().append > > True > > I think this is a bug. These bound methods can't be equal because > they > have different side effect. > > The use case for using "is" for __self__ is described by the OP of > issue1617161. I don't know use cases for using "==". > > There is a related problem of hashing. Currently > bound methods are not hashable if __self__ is not hashable. This > makes > impossible using them as dict keys. 
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Thu Jun 21 17:59:19 2018 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 21 Jun 2018 14:59:19 -0700 Subject: [Python-Dev] About [].append == [].append In-Reply-To: <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> Message-ID: On Thu, Jun 21, 2018 at 2:04 PM, Ivan Pozdeev via Python-Dev wrote: > On 21.06.2018 23:40, Guido van Rossum wrote: > > > I'm with Serhiy here, for mutable values I don't think the methods should > > compare equal, even when the values do. For immutables I don't care either > > way, it's an implementation detail. > > In this light, methods rather shouldn't have a rich comparison logic at all > -- at the very least, until we have a realistic use case and can flesh out > the requirements for it. == and hashing for methods can be quite useful for things like cache keys. (E.g., a dict mapping callable+args -> result.) I'm sure people are using it. So I think simply removing it would be pretty disruptive. I can't think of any cases where == and hashing are useful for methods on *mutable objects*, which is that case where it matters whether the 'self' comparison uses == or 'is', but the method object doesn't know whether 'self' is mutable, so it has to either work in general or not work in general. -n -- Nathaniel J. 
Smith -- https://vorpus.org From storchaka at gmail.com Fri Jun 22 01:00:13 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 22 Jun 2018 08:00:13 +0300 Subject: [Python-Dev] About [].append == [].append In-Reply-To: <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> Message-ID: 22.06.18 00:04, Ivan Pozdeev via Python-Dev wrote: > On 21.06.2018 23:40, Guido van Rossum wrote: >> I'm with Serhiy here, for mutable values I don't think the methods >> should compare equal, even when the values do. For immutables I don't >> care either way, it's an implementation detail. >> > In this light, methods rather shouldn't have a rich comparison logic at > all This would be so if you got the same method object when resolving an attribute. But a.f is not a.f. Every time, a new method object is created. If we want a.f == a.f, we need to implement a rich comparison logic. > -- at the very least, until we have a realistic use case and can > flesh out the requirements for it. There are realistic use cases. You will get a number of failures in the Python tests if you make method objects incomparable. From tismer at stackless.com Fri Jun 22 07:17:11 2018 From: tismer at stackless.com (Christian Tismer) Date: Fri, 22 Jun 2018 13:17:11 +0200 Subject: [Python-Dev] PySequence_Check but no __len__ In-Reply-To: References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com> Message-ID: <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com> Hi Brett, because you did not understand me, I must have had a fundamental misunderstanding. So I started a self-analysis and came to the conclusion that this has been my error for maybe a decade: When iterators and generators came into existence, I somehow fell into the trap of thinking that there are now sequences with undetermined or infinite length. They would be exactly those sequences which have no __len__ attribute.
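A minimal sketch of the case under discussion (the class name is made up for illustration): an object that defines only __getitem__ supports the legacy sequence iteration protocol, so iter() and list() work, yet len() fails.

```python
# Illustrative class: only __getitem__, no __len__.  iter() falls back
# to the legacy sequence protocol, calling __getitem__ with 0, 1, 2, ...
# until IndexError is raised.
class Squares:
    def __getitem__(self, index):
        if index >= 5:
            raise IndexError(index)
        return index * index

s = Squares()
print(list(s))  # [0, 1, 4, 9, 16] -- iteration works without __len__

try:
    len(s)  # there is no __len__ anywhere, so len() has nothing to call
except TypeError as exc:
    print("len() failed:", exc)
```

At the C level, such an instance also passes PySequence_Check(), since defining __getitem__ fills the type's sq_item slot.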
I understand now that sequences are always of fixed length and adjusted myself. ----------------------------------------- My problem is to find out how to deal with a class which has __getitem__ but no __len__. The documentation suggests that the length of a sequence can always be obtained by len(). https://docs.python.org/3/reference/datamodel.html But the existence of __len__ is not guaranteed or enforced. And if you look at the definition of PySequence_Fast(), you find that a sequence can be turned into a list with iteration only and no __len__. So, is a sequence valid without __len__, if iteration is supported instead? There is a whole chapter about the sequence protocol https://docs.python.org/3/c-api/sequence.html?highlight=sequence but I cannot find an exact definition of what makes up a sequence. Sorry if I'm again the only one who misunderstands the obvious :) Best -- Chris On 21.06.18 18:29, Brett Cannon wrote: > Sorry, I don't quite follow. > > On Thu, 21 Jun 2018 at 08:50 Christian Tismer > wrote: > > Hi friends, > > there is a case in the Python API where I am not sure what to do: > > If an object defines __getitem__() only but no __len__(), > then PySequence_Check() already is true and does not care. > > > Which matches > https://docs.python.org/3/c-api/sequence.html#c.PySequence_Check . > > From Objects/abstract.c: > > int > PySequence_Check(PyObject *s) > { >     if (PyDict_Check(s)) >         return 0; >     return s != NULL && s->ob_type->tp_as_sequence && >         s->ob_type->tp_as_sequence->sq_item != NULL; > } > > > > So if I define no __len__, it simply fails. Is this intended? > > > What is "it" in this case that is failing? It isn't PySequence_Check() > so I'm not sure what the issue is. > > -Brett > > > > I was misled and thought this was the unlimited case, but > it seems still to be true that sequences are always finite. > > Can someone please enlighten me? > -- > Christian Tismer-Sperling :^)
tismer at stackless.com > > Software Consulting : http://www.stackless.com/ > Karl-Liebknecht-Str. 121 : http://pyside.org > 14482 Potsdam : GPG key -> 0xE7301150FB7BEE0E > phone +49 173 24 18 776 fax +49 (30) > 700143-0023 > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tismer%40stackless.com > -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xE7301150FB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 522 bytes Desc: OpenPGP digital signature URL: From tismer at stackless.com Fri Jun 22 07:45:11 2018 From: tismer at stackless.com (Christian Tismer) Date: Fri, 22 Jun 2018 13:45:11 +0200 Subject: [Python-Dev] PySequence_Check but no __len__ In-Reply-To: <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com> References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com> <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com> Message-ID: <9c5a2395-6405-a604-6e93-731e4432d5ce@stackless.com> Answering myself: PySequence_Check determines a sequence. See the docs. len() can but does not have to exist. The size is always limited. After evicting my initial fault, this is now obvious. Sorry about the noise. On 22.06.18 13:17, Christian Tismer wrote: > Hi Brett, > > because you did not understand me, I must have had a fundamental > misunderstanding.
So I started a self-analysis and came to the > conclusion that this has been my error for maybe a decade: > > When iterators and generators came into existence, I somehow > fell into the trap of thinking that there are now sequences with > undetermined or infinite length. They would be exactly those sequences > which have no __len__ attribute. > > I understand now that sequences are always of fixed length > and adjusted myself. > > ----------------------------------------- > > My problem is to find out how to deal with a class which has > __getitem__ but no __len__. > > The documentation suggests that the length of a sequence can always > be obtained by len(). > https://docs.python.org/3/reference/datamodel.html > > But the existence of __len__ is not guaranteed or enforced. > And if you look at the definition of PySequence_Fast(), you find that > a sequence can be turned into a list with iteration only and no __len__. > > So, is a sequence valid without __len__, if iteration is supported > instead? > > There is a whole chapter about the sequence protocol > https://docs.python.org/3/c-api/sequence.html?highlight=sequence > > but I cannot find an exact definition of what makes up a sequence. > > Sorry if I'm again the only one who misunderstands the obvious :) > > Best -- Chris > > > On 21.06.18 18:29, Brett Cannon wrote: >> Sorry, I don't quite follow. >> >> On Thu, 21 Jun 2018 at 08:50 Christian Tismer > > wrote: >> >> Hi friends, >> >> there is a case in the Python API where I am not sure what to do: >> >> If an object defines __getitem__() only but no __len__(), >> then PySequence_Check() already is true and does not care. >> >> >> Which matches >> https://docs.python.org/3/c-api/sequence.html#c.PySequence_Check . >> >> From Objects/abstract.c: >> >> int >> PySequence_Check(PyObject *s) >> { >>     if (PyDict_Check(s)) >>         return 0; >>     return s != NULL && s->ob_type->tp_as_sequence && >>         s->ob_type->tp_as_sequence->sq_item != NULL; >> } >> >>
>> >> >> So if I define no __len__, it simply fails. Is this intended? >> >> >> What is "it" in this case that is failing? It isn't PySequence_Check() >> so I'm not sure what the issue is. >> >> -Brett >> >> >> >> I was misled and thought this was the unlimited case, but >> it seems still to be true that sequences are always finite. >> >> Can someone please enlighten me? >> -- >> Christian Tismer-Sperling :^) tismer at stackless.com >> >> Software Consulting : http://www.stackless.com/ >> Karl-Liebknecht-Str. 121 : http://pyside.org >> 14482 Potsdam : GPG key -> 0xE7301150FB7BEE0E >> phone +49 173 24 18 776 fax +49 (30) >> 700143-0023 >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> >> >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/tismer%40stackless.com >> > > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tismer%40stackless.com > -- Christian Tismer-Sperling :^) tismer at stackless.com Software Consulting : http://www.stackless.com/ Karl-Liebknecht-Str. 121 : http://pyside.org 14482 Potsdam : GPG key -> 0xE7301150FB7BEE0E phone +49 173 24 18 776 fax +49 (30) 700143-0023 -------------- next part -------------- A non-text attachment was scrubbed...
Name: signature.asc Type: application/pgp-signature Size: 522 bytes Desc: OpenPGP digital signature URL: From ncoghlan at gmail.com Fri Jun 22 08:53:47 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 22 Jun 2018 22:53:47 +1000 Subject: [Python-Dev] PySequence_Check but no __len__ In-Reply-To: <9c5a2395-6405-a604-6e93-731e4432d5ce@stackless.com> References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com> <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com> <9c5a2395-6405-a604-6e93-731e4432d5ce@stackless.com> Message-ID: On 22 June 2018 at 21:45, Christian Tismer wrote: > Answering myself: > > PySequence_Check determines a sequence. See the docs. > > len() can but does not have to exist. > The size is always limited. Just to throw a couple of extra wrinkles on this: Due to a C API implementation detail in CPython, not only can len() throw TypeError for non-finite sequences (which implement other parts of the sequence API, but not that), but sufficiently large finite sequences may also throw OverflowError: >>> data = range(-2**64, 2**64) >>> format((data.stop - data.start) // data.step, "e") '3.689349e+19' >>> format(sys.maxsize, "e") '9.223372e+18' >>> len(data) Traceback (most recent call last): File "<stdin>", line 1, in <module> OverflowError: Python int too large to convert to C ssize_t >>> data.__len__() Traceback (most recent call last): File "<stdin>", line 1, in <module> OverflowError: Python int too large to convert to C ssize_t Infinite sequences that want to prevent infinite loops or unbounded memory consumption in consumers may also choose to implement a __length_hint__ that throws TypeError (see https://bugs.python.org/issue33939 for a proposal to do that in itertools). Cheers, Nick.
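The __length_hint__ behaviour mentioned above can be sketched as follows (both classes are made-up illustrations, not the actual itertools proposal): a TypeError raised from __length_hint__ is treated by operator.length_hint() as "no hint available", while any other exception propagates to consumers that pre-size storage, which is the idea behind raising OverflowError in bpo-33939.

```python
import operator

class Boring:
    """Illustrative infinite iterator whose __length_hint__ raises TypeError."""
    def __iter__(self):
        return self
    def __next__(self):
        return 0
    def __length_hint__(self):
        # TypeError is the "no usable hint" signal: operator.length_hint()
        # swallows it and falls back to the supplied default.
        raise TypeError("infinite iterator")

print(operator.length_hint(Boring(), 8))  # prints 8, the default

class Endless:
    """Illustrative infinite iterator that makes consumers fail fast."""
    def __iter__(self):
        return self
    def __next__(self):
        return 0
    def __length_hint__(self):
        # Exceptions other than TypeError propagate, so a consumer that
        # asks for a length hint (such as list()) raises immediately
        # instead of looping forever.
        raise OverflowError("cannot build a list from an infinite iterator")

try:
    list(Endless())
except OverflowError as exc:
    print("list() refused:", exc)
```

The first class matches the existing PEP 424 semantics; the second shows why the issue linked above proposes OverflowError rather than TypeError for consistently infinite iterators.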
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Fri Jun 22 09:10:23 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 22 Jun 2018 23:10:23 +1000 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: On 21 June 2018 at 07:06, Guido van Rossum wrote: > Only the last two commits are new since the last posting: support for > partial packages and a bunch of small textual tweaks I found today while > reviewing. There wasn't a lot of feedback then so I don't expect a flamewar > today, but better late than never. ;-) Something that was raised indirectly in https://github.com/pypa/warehouse/issues/4164 was the terminology collision between type hinting stub files, and API testing stub interfaces. I don't think that's actually a problem, since type hinting stubs will only contain interface files, and not regular source files. This means that a type hinting stub could later be expanded in scope to also become an API emulating testing stub, and the two use cases wouldn't conflict (I'm not commenting on whether or not that would actually be a good idea - I'm just noting that PEP 561 claiming the "-stubs" naming convention on PyPI doesn't close out that option). Beyond that, I think the other points I raised in the Warehouse tracker issues can be considered derived requirements arising from the PEP acceptance - if anyone tries to use the window between PEP 561 being accepted, and the related permissions being enforced in PyPI to squat on stubs-related project names, then PEP 541 provides a mechanism for addressing that. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Fri Jun 22 10:22:33 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 23 Jun 2018 00:22:33 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) Message-ID: On 22 June 2018 at 02:26, Antoine Pitrou wrote: > Indeed. But, for a syntax addition such as PEP 572, I think it would be > a good idea to ask their opinion to teaching/education specialists. > > As far as I'm concerned, if teachers and/or education specialists were > to say PEP 572 is not a problem, my position would shift from negative > towards neutral. I asked a handful of folks at the Education Summit the next day about it: * for the basic notion of allowing expression level name binding using the "NAME := EXPR" notation, the reactions ranged from mildly negative (I read it as only a "-0" rather than a "-1") to outright positive. * for the reactions to my description of the currently proposed parent local scoping behaviour in comprehensions, I'd use the word "horrified", and feel I wasn't overstating the response :) While I try to account for the fact that I implemented the current comprehension semantics for the 3.x series, and am hence biased towards considering them the now obvious interpretation, it's also the case that generator expressions have worked like nested functions since they were introduced in Python 2.4 (more than 13 years ago now), and comprehensions have worked the same way as generator expressions since Python 3.0 (which has its 10th birthday coming up in December this year). 
This means that I take any claims that the legacy Python 2.x interpretation of comprehension behaviour is intuitively obvious with an enormous grain of salt - for the better part of a decade now, every tool at a Python 3 user's disposal (the fact that the iteration variable is hidden from the current scope, reading the language reference [1], printing out locals(), using the dis module, stepping through code in a debugger, writing their own tracing function, and even observing the quirky interaction with class scopes) will have nudged them towards the "it's a hidden nested function" interpretation of expected comprehension behaviour. Acquiring the old mental model for the way comprehensions work pretty much requires a developer to have started with Python 2.x themselves (perhaps even before comprehensions and lexical closures were part of the language), or else have been taught the Python 2 comprehension model by someone else - there's nothing in Python 3's behaviour to encourage that point of view, and plenty of functional-language-inspired documentation to instead encourage folks to view comprehensions as tightly encapsulated declarative container construction syntax. I'm currently working on a concept proposal at https://github.com/ncoghlan/peps/pull/2 that's much closer to PEP 572 than any of my previous `given` based suggestions: for already declared locals, it devolves to being the same as PEP 572 (except that expressions are allowed as top level statements), but for any names that haven't been previously introduced, it prohibits assigning to a name that doesn't already have a defined scope, and instead relies on a new `given` clause on various constructs that allows new target declarations to be introduced into the current scope (such that "if x:= f():" implies "x" is already defined as a target somewhere else in the current scope, while "if x := f() given x:" potentially introduces "x" as a new local target the same way a regular assignment statement does). 
One of the nicer features of the draft proposal is that if all you want to do is export the iteration variable from a comprehension, you don't need to use an assignment expression at all: you can just append "... given global x" or "... given nonlocal x" and export the iteration variable directly to the desired outer scope, the same way you can in the fully spelled out nested function equivalent. Cheers, Nick. [1] From https://docs.python.org/3.0/reference/expressions.html#displays-for-lists-sets-and-dictionaries: 'Note that the comprehension is executed in a separate scope, so names assigned to in the target list don't "leak" in the enclosing scope.' -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From solipsis at pitrou.net Fri Jun 22 10:47:10 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Fri, 22 Jun 2018 16:47:10 +0200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: <20180622164710.4906f9b0@fsol> On Sat, 23 Jun 2018 00:22:33 +1000 Nick Coghlan wrote: > On 22 June 2018 at 02:26, Antoine Pitrou wrote: > > Indeed. But, for a syntax addition such as PEP 572, I think it would be > > a good idea to ask their opinion to teaching/education specialists. > > > > As far as I'm concerned, if teachers and/or education specialists were > > to say PEP 572 is not a problem, my position would shift from negative > > towards neutral. > > I asked a handful of folks at the Education Summit the next day about it: > > * for the basic notion of allowing expression level name binding using > the "NAME := EXPR" notation, the reactions ranged from mildly negative > (I read it as only a "-0" rather than a "-1") to outright positive. Thank you. Personally, I'd like to see feedback from educators/teachers after they take the time to read the PEP and take some time to think about its consequences.
My main concern is we're introducing a second different way of doing something which is really fundamental. > * for the reactions to my description of the currently proposed parent > local scoping behaviour in comprehensions, I'd use the word > "horrified", and feel I wasn't overstating the response :) [...] Hmm... I don't think conflating the assignment expression proposal with comprehension semantics issues is helping the discussion. Regards Antoine. From guido at python.org Fri Jun 22 11:13:44 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 22 Jun 2018 08:13:44 -0700 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> Message-ID: Honestly it looks to me like the status quo is perfect. >>> a = [] >>> a.append is a.append False >>> a.append == a.append True >>> b = [] >>> a.append == b.append False >>> On Thu, Jun 21, 2018 at 10:02 PM Serhiy Storchaka wrote: > 22.06.18 00:04, Ivan Pozdeev via Python-Dev wrote: > > On 21.06.2018 23:40, Guido van Rossum wrote: > >> I'm with Serhiy here, for mutable values I don't think the methods > >> should compare equal, even when the values do. For immutables I don't > >> care either way, it's an implementation detail. > >> > > In this light, methods rather shouldn't have a rich comparison logic at > > all > > This would be so if you got the same method object when resolving an > attribute. But a.f is not a.f. Every time, a new method object is > created. If we want a.f == a.f, we need to implement a rich comparison > logic. > > > -- at the very least, until we have a realistic use case and can > > flesh out the requirements for it. > > There are realistic use cases. You will get a number of failures in the > Python tests if you make method objects incomparable.
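Guido's interpreter session can be restated as assertions, together with the cache-key use case raised earlier in the thread (the example is illustrative; a tuple is used for the dict key because bound methods of mutable objects may not be hashable in the versions under discussion):

```python
a, b = [], []

assert a.append is not a.append  # each attribute access builds a new method object
assert a.append == a.append      # ...but methods of the *same* list compare equal
assert a.append != b.append      # methods of different lists do not

# The cache-key use case: bound methods of a hashable object can key a
# dict.  A tuple is used here because list methods may not be hashable
# (their __self__ is mutable).
t = (1, 2, 3)
cache = {t.index: "cached result"}
assert cache[t.index] == "cached result"
print("all assertions passed")
```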
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Fri Jun 22 11:16:35 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 22 Jun 2018 08:16:35 -0700 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: That sounds like you're supporting PEP 561 as is, right? Excuse my ignorance, but where are API testing stub interfaces described or used? --Guido On Fri, Jun 22, 2018 at 6:10 AM Nick Coghlan wrote: > On 21 June 2018 at 07:06, Guido van Rossum wrote: > > Only the last two commits are new since the last posting: support for > > partial packages and a bunch of small textual tweaks I found today while > > reviewing. There wasn't a lot of feedback then so I don't expect a > flamewar > > today, but better late than never. ;-) > > Something that was raised indirectly in > https://github.com/pypa/warehouse/issues/4164 was the terminology > collision between type hinting stub files, and API testing stub > interfaces. > > I don't think that's actually a problem, since type hinting stubs will > only contain interface files, and not regular source files. This means > that a type hinting stub could later be expanded in scope to also > become an API emulating testing stub, and the two use cases wouldn't > conflict (I'm not commenting on whether or not that would actually be > a good idea - I'm just noting that PEP 561 claiming the > "-stubs" naming convention on PyPI doesn't close out that > option). 
> > Beyond that, I think the other points I raised in the Warehouse > tracker issues can be considered derived requirements arising from the > PEP acceptance - if anyone tries to use the window between PEP 561 > being accepted, and the related permissions being enforced in PyPI to > squat on stubs-related project names, then PEP 541 provides a > mechanism for addressing that. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Fri Jun 22 11:16:38 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Fri, 22 Jun 2018 17:16:38 +0200 Subject: [Python-Dev] PEP 580 (C call protocol) draft implementation Message-ID: <5B2D12D6.8010305@UGent.be> Hello all, I have a first draft implementation of PEP 580 (introducing the C call protocol): https://github.com/jdemeyer/cpython/tree/pep580 Almost all tests pass, only test_gdb and test_pydoc fail for me. I still have to fix those. Jeroen. From ncoghlan at gmail.com Fri Jun 22 11:37:24 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 23 Jun 2018 01:37:24 +1000 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: On 23 June 2018 at 01:16, Guido van Rossum wrote: > That sounds like you're supporting PEP 561 as is, right? Aye, I'm personally fine with it - we do need to do something about automatically reserving the derived names on PyPI, but I don't think that's a blocker for the initial PEP acceptance (instead, it will go the other way: PEP acceptance will drive Warehouse getting updated to handle the convention already being adopted by the client tools). > Excuse my > ignorance, but where are API testing stub interfaces described or used? 
They're not - it's just the context for Donald referring to "stubs" as being a general technical term with other meanings beyond the "type hinting stub file" one. As such, there's three parts to explaining why we're not worried about the terminology clash: - Ethan searched for projects called "*-stubs" or "*_stubs" and didn't find any, so the practical impact of any terminology clash will be low - there isn't an established need to automatically find testing stub libraries based on an existing project name the way there is for type hints - even if such a need did arise in the future, the "py.typed" marker file and the different file extension for stub files within a package still gives us an enormous amount of design flexibility Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From status at bugs.python.org Fri Jun 22 12:10:08 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 22 Jun 2018 18:10:08 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180622161008.AF4C256A94@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-06-15 - 2018-06-22) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 6700 ( +9) closed 38994 (+64) total 45694 (+73) Open issues with patches: 2651 Issues opened (49) ================== #23660: Turtle left/right inverted when using different coordinates or https://bugs.python.org/issue23660 reopened by willingc #32500: PySequence_Length() raises TypeError on dict type https://bugs.python.org/issue32500 reopened by serhiy.storchaka #33871: Possible integer overflow in iov_setup() https://bugs.python.org/issue33871 opened by serhiy.storchaka #33873: False positives when running leak tests with -R 1:1 https://bugs.python.org/issue33873 opened by pablogsal #33875: Allow dynamic password evaluation in pypirc configuration file https://bugs.python.org/issue33875 opened by jperras #33877: doc Mention Windows along UNIX for script running instructions https://bugs.python.org/issue33877 opened by adelfino #33878: Doc: Assignment statement to tuple or list: case missing. https://bugs.python.org/issue33878 opened by mdk #33880: namedtuple should use NFKD to find duplicate members https://bugs.python.org/issue33880 opened by John Cooke #33881: dataclasses should use NFKC to find duplicate members https://bugs.python.org/issue33881 opened by eric.smith #33882: doc Mention breakpoint() in debugger-related FAQ https://bugs.python.org/issue33882 opened by adelfino #33883: doc Mention mypy, pytype and PyAnnotate in FAQ https://bugs.python.org/issue33883 opened by adelfino #33884: [multiprocessing] Multiprocessing in spawn mode doesn't work w https://bugs.python.org/issue33884 opened by Yoni Rozenshein #33885: doc Replace "hook function" with "callable" in urllib.request. 
https://bugs.python.org/issue33885 opened by adelfino #33886: SSL on aiomysql hangs on reconnection https://bugs.python.org/issue33886 opened by andr04 #33887: doc Add TOC in Design and History FAQ https://bugs.python.org/issue33887 opened by adelfino #33888: Use CPython instead of Python when talking about implementatio https://bugs.python.org/issue33888 opened by adelfino #33894: tempfile.tempdir cannot be unset https://bugs.python.org/issue33894 opened by philiprowlands #33895: LoadLibraryExW called with GIL held can cause deadlock https://bugs.python.org/issue33895 opened by Tony Roberts #33896: Document what components make up the filecmp.cmp os.stat signa https://bugs.python.org/issue33896 opened by Dean Morin #33897: Add a restart option to logging.basicConfig() https://bugs.python.org/issue33897 opened by rhettinger #33898: pathlib issues with Windows device paths https://bugs.python.org/issue33898 opened by eryksun #33899: Tokenize module does not mirror "end-of-input" is newline beha https://bugs.python.org/issue33899 opened by ammar2 #33909: PyObject_CallFinalizerFromDealloc is not referenced in any doc https://bugs.python.org/issue33909 opened by Eric.Wieser #33911: [EASY] test_docxmlrpc fails when run with -Werror https://bugs.python.org/issue33911 opened by vstinner #33913: test_multiprocessing_spawn random failures on x86 Windows7 3.6 https://bugs.python.org/issue33913 opened by vstinner #33914: test_gdb fails for Python 2.7.15 https://bugs.python.org/issue33914 opened by vibhutisawant #33916: test_lzma: test_refleaks_in_decompressor___init__() leaks 100 https://bugs.python.org/issue33916 opened by vstinner #33918: Hooking into pause/resume of iterators/coroutines https://bugs.python.org/issue33918 opened by Liran Nuna #33919: Expose _PyCoreConfig structure to Python https://bugs.python.org/issue33919 opened by barry #33920: test_asyncio: test_run_coroutine_threadsafe_with_timeout() fai https://bugs.python.org/issue33920 opened by vstinner #33921: 
Explain that '' can be used to bind to all interfaces for the https://bugs.python.org/issue33921 opened by John Hagen #33923: py.ini cannot set 32/64bits for specific version https://bugs.python.org/issue33923 opened by mrh1997 #33926: test_gdb is skipped in builds since gdb is not installed as pa https://bugs.python.org/issue33926 opened by xtreak #33927: Allow json.tool to have identical infile and outfile https://bugs.python.org/issue33927 opened by kuhlmann #33929: test_multiprocessing_spawn: WithProcessesTestProcess.test_many https://bugs.python.org/issue33929 opened by vstinner #33930: Segfault with deep recursion into object().__dir__ https://bugs.python.org/issue33930 opened by a-j-buxton #33931: Building 2.7 on Windows with PC\VS9.0 is broken https://bugs.python.org/issue33931 opened by anselm.kruis #33932: Calling Py_Initialize() twice now triggers a fatal error (Pyth https://bugs.python.org/issue33932 opened by vstinner #33933: Error message says dict has no len https://bugs.python.org/issue33933 opened by veky #33934: locale.getlocale() seems wrong when the locale is yet unset (p https://bugs.python.org/issue33934 opened by zezollo #33935: shutil.copyfile throws incorrect SameFileError on Google Drive https://bugs.python.org/issue33935 opened by Deniz Bozyigit #33936: OPENSSL_VERSION_1_1 never defined in _hashopenssl.c https://bugs.python.org/issue33936 opened by laomaiweng #33937: [3.6] test_socket: testSendmsgTimeout() failed on Travis CI https://bugs.python.org/issue33937 opened by vstinner #33938: Cross compilation fail for ARM https://bugs.python.org/issue33938 opened by n0s69z #33939: Raise OverflowError in __length_hint__ for consistently infini https://bugs.python.org/issue33939 opened by ncoghlan #33940: datetime.strptime have no directive to convert date values wit https://bugs.python.org/issue33940 opened by Raghunath Lingutla #33941: datetime.strptime not able to recognize invalid date formats https://bugs.python.org/issue33941 opened by 
Raghunath Lingutla #33942: IDLE crash when typing opening bracket https://bugs.python.org/issue33942 opened by Japsert #33943: doc Add references to logging.basicConfig https://bugs.python.org/issue33943 opened by adelfino Most recent 15 issues with no replies (15) ========================================== #33943: doc Add references to logging.basicConfig https://bugs.python.org/issue33943 #33942: IDLE crash when typing opening bracket https://bugs.python.org/issue33942 #33941: datetime.strptime not able to recognize invalid date formats https://bugs.python.org/issue33941 #33940: datetime.strptime have no directive to convert date values wit https://bugs.python.org/issue33940 #33937: [3.6] test_socket: testSendmsgTimeout() failed on Travis CI https://bugs.python.org/issue33937 #33936: OPENSSL_VERSION_1_1 never defined in _hashopenssl.c https://bugs.python.org/issue33936 #33934: locale.getlocale() seems wrong when the locale is yet unset (p https://bugs.python.org/issue33934 #33933: Error message says dict has no len https://bugs.python.org/issue33933 #33930: Segfault with deep recursion into object().__dir__ https://bugs.python.org/issue33930 #33929: test_multiprocessing_spawn: WithProcessesTestProcess.test_many https://bugs.python.org/issue33929 #33923: py.ini cannot set 32/64bits for specific version https://bugs.python.org/issue33923 #33920: test_asyncio: test_run_coroutine_threadsafe_with_timeout() fai https://bugs.python.org/issue33920 #33913: test_multiprocessing_spawn random failures on x86 Windows7 3.6 https://bugs.python.org/issue33913 #33899: Tokenize module does not mirror "end-of-input" is newline beha https://bugs.python.org/issue33899 #33898: pathlib issues with Windows device paths https://bugs.python.org/issue33898 Most recent 15 issues waiting for review (15) ============================================= #33943: doc Add references to logging.basicConfig https://bugs.python.org/issue33943 #33936: OPENSSL_VERSION_1_1 never defined in _hashopenssl.c 
       https://bugs.python.org/issue33936
#33932: Calling Py_Initialize() twice now triggers a fatal error (Pyth
       https://bugs.python.org/issue33932
#33931: Building 2.7 on Windows with PC\VS9.0 is broken
       https://bugs.python.org/issue33931
#33916: test_lzma: test_refleaks_in_decompressor___init__() leaks 100
       https://bugs.python.org/issue33916
#33911: [EASY] test_docxmlrpc fails when run with -Werror
       https://bugs.python.org/issue33911
#33895: LoadLibraryExW called with GIL held can cause deadlock
       https://bugs.python.org/issue33895
#33894: tempfile.tempdir cannot be unset
       https://bugs.python.org/issue33894
#33888: Use CPython instead of Python when talking about implementatio
       https://bugs.python.org/issue33888
#33887: doc Add TOC in Design and History FAQ
       https://bugs.python.org/issue33887
#33885: doc Replace "hook function" with "callable" in urllib.request.
       https://bugs.python.org/issue33885
#33883: doc Mention mypy, pytype and PyAnnotate in FAQ
       https://bugs.python.org/issue33883
#33882: doc Mention breakpoint() in debugger-related FAQ
       https://bugs.python.org/issue33882
#33878: Doc: Assignment statement to tuple or list: case missing.
       https://bugs.python.org/issue33878
#33877: doc Mention Windows along UNIX for script running instructions
       https://bugs.python.org/issue33877


Top 10 most discussed issues (10)
=================================

#32962: test_gdb fails in debug build with `-mcet -fcf-protection -O0`
       https://bugs.python.org/issue32962  14 msgs
#33865: [EASY] Missing code page aliases: "unknown encoding: 874"
       https://bugs.python.org/issue33865  14 msgs
#33832: doc Add "magic method" entry to Glossary
       https://bugs.python.org/issue33832  10 msgs
#33839: IDLE tooltips.py: refactor and add docstrings and tests
       https://bugs.python.org/issue33839  10 msgs
#33932: Calling Py_Initialize() twice now triggers a fatal error (Pyth
       https://bugs.python.org/issue33932   9 msgs
#29750: smtplib doesn't handle unicode passwords
       https://bugs.python.org/issue29750   8 msgs
#33873: False positives when running leak tests with -R 1:1
       https://bugs.python.org/issue33873   8 msgs
#33895: LoadLibraryExW called with GIL held can cause deadlock
       https://bugs.python.org/issue33895   7 msgs
#33826: enable discovery of class source code in IPython interactively
       https://bugs.python.org/issue33826   6 msgs
#33939: Raise OverflowError in __length_hint__ for consistently infini
       https://bugs.python.org/issue33939   6 msgs


Issues closed (61)
==================

#10652: test___all_ + test_tcl fails (Windows installed binary)
       https://bugs.python.org/issue10652  closed by zach.ware
#11697: Unsigned type in mmap_move_method
       https://bugs.python.org/issue11697  closed by berker.peksag
#21941: Clean up turtle TPen class
       https://bugs.python.org/issue21941  closed by willingc
#22571: Remove import * recommendations and examples in doc?
       https://bugs.python.org/issue22571  closed by terry.reedy
#23922: Refactor icon setting to a separate function for Turtle
       https://bugs.python.org/issue23922  closed by terry.reedy
#24978: Contributing to Documentation. Translation to Russian.
       https://bugs.python.org/issue24978  closed by terry.reedy
#24990: Foreign language support in turtle module
       https://bugs.python.org/issue24990  closed by terry.reedy
#26917: unicodedata.normalize(): bug in Hangul Composition
       https://bugs.python.org/issue26917  closed by benjamin.peterson
#28710: Sphinx incompatible markup in the standard library docstrings
       https://bugs.python.org/issue28710  closed by rhettinger
#30345: test_gdb fails on Python 3.6 when built with LTO+PGO
       https://bugs.python.org/issue30345  closed by vstinner
#31725: Turtle/tkinter: NameError crashes ipython with "Tcl_AsyncDelet
       https://bugs.python.org/issue31725  closed by willingc
#31966: [EASY C][Windows] print('hello\n', end='', flush=True) raises
       https://bugs.python.org/issue31966  closed by serhiy.storchaka
#32555: Encoding issues with the locale encoding
       https://bugs.python.org/issue32555  closed by vstinner
#33174: compiler error building the _sha3 module with Intel 2018 compi
       https://bugs.python.org/issue33174  closed by ned.deily
#33365: http/client.py does not print correct headers in debug
       https://bugs.python.org/issue33365  closed by serhiy.storchaka
#33499: Environment variable to set alternate location for pycache tre
       https://bugs.python.org/issue33499  closed by ncoghlan
#33571: Add triple quotes to list of delimiters that trigger '...' pro
       https://bugs.python.org/issue33571  closed by Mariatta
#33630: test_posix: TestPosixSpawn fails on PPC64 Fedora 3.x
       https://bugs.python.org/issue33630  closed by vstinner
#33663: Web.py wsgiserver3.py raises TypeError when CSS file is not fo
       https://bugs.python.org/issue33663  closed by steve.dower
#33733: Add utilities to get/set pipe and socket buffer sizes?
       https://bugs.python.org/issue33733  closed by vstinner
#33742: Unsafe memory access in PyStructSequence_InitType
       https://bugs.python.org/issue33742  closed by xiang.zhang
#33746: testRegisterResult in test_unittest fails in verbose mode
       https://bugs.python.org/issue33746  closed by vstinner
#33824: Settign LANG=C modifies the --version behavior
       https://bugs.python.org/issue33824  closed by vstinner
#33836: [Good first-time issue] Recommend keyword-only param for memoi
       https://bugs.python.org/issue33836  closed by Mariatta
#33841: lock not released in threading.Condition
       https://bugs.python.org/issue33841  closed by pitrou
#33843: Remove deprecated stuff in cgi module
       https://bugs.python.org/issue33843  closed by inada.naoki
#33846: Misleading error message in urllib.parse.unquote
       https://bugs.python.org/issue33846  closed by r.david.murray
#33847: doc: Add '@' operator entry to index
       https://bugs.python.org/issue33847  closed by terry.reedy
#33852: doc Remove parentheses from sequence subscription description
       https://bugs.python.org/issue33852  closed by terry.reedy
#33854: doc Add PEP title in seealso of Built-in Types
       https://bugs.python.org/issue33854  closed by terry.reedy
#33855: IDLE: Minimally test every non-startup module.
       https://bugs.python.org/issue33855  closed by terry.reedy
#33856: IDLE: "help" is missing from the sign-on message
       https://bugs.python.org/issue33856  closed by zach.ware
#33858: A typo in multiprocessing documentation
       https://bugs.python.org/issue33858  closed by pitrou
#33861: Minor improvements of tests for os.path.
       https://bugs.python.org/issue33861  closed by serhiy.storchaka
#33866: Stop using OrderedDict in enum
       https://bugs.python.org/issue33866  closed by inada.naoki
#33872: doc Add list access time to list definition
       https://bugs.python.org/issue33872  closed by rhettinger
#33874: dictviews set operations do not follow pattern of set or froze
       https://bugs.python.org/issue33874  closed by rhettinger
#33876: doc Mention relevant pythonesque implementations
       https://bugs.python.org/issue33876  closed by Mariatta
#33879: Item assignment in tuple mutates list despite throwing error
       https://bugs.python.org/issue33879  closed by r.david.murray
#33889: doc Mention Python on-line shell in Tutorial
       https://bugs.python.org/issue33889  closed by Mariatta
#33890: Pathlib does not compare Windows and MinGW paths as equal
       https://bugs.python.org/issue33890  closed by r.david.murray
#33891: ntpath join doesnt check whether path variable None or not
       https://bugs.python.org/issue33891  closed by xiang.zhang
#33892: doc Use gender neutral words
       https://bugs.python.org/issue33892  closed by inada.naoki
#33893: Linux terminal shortcuts support in python shell
       https://bugs.python.org/issue33893  closed by terry.reedy
#33900: collections.counter update method without argument gives misle
       https://bugs.python.org/issue33900  closed by xtreak
#33901: test_dbm_gnu.test_reorganize() failed on x86-64 High Sierra 3.
       https://bugs.python.org/issue33901  closed by vstinner
#33902: entry_points/console_scripts is too slow
       https://bugs.python.org/issue33902  closed by barry
#33903: Can't use lib2to3 with embeddable zip file
       https://bugs.python.org/issue33903  closed by steve.dower
#33904: IDLE: In rstrip, rename class RstripExtension as Rstrip
       https://bugs.python.org/issue33904  closed by terry.reedy
#33905: IDLE: Test stackbrowser.Stackbrowser
       https://bugs.python.org/issue33905  closed by terry.reedy
#33906: IDLE: rename windows.py as window.py
       https://bugs.python.org/issue33906  closed by terry.reedy
#33907: IDLE: Rename calltips and CallTips as calltip and Calltip.
       https://bugs.python.org/issue33907  closed by terry.reedy
#33908: remove unecessary variable assignments
       https://bugs.python.org/issue33908  closed by xiang.zhang
#33910: update random.Random 's parameter to have a proper name.
       https://bugs.python.org/issue33910  closed by rhettinger
#33912: [EASY] test_warnings: test_exec_filename() fails when run with
       https://bugs.python.org/issue33912  closed by vstinner
#33915: VSTS: Initialize phase: failed
       https://bugs.python.org/issue33915  closed by steve.dower
#33917: Fix and document idlelib/idle_test/template.py
       https://bugs.python.org/issue33917  closed by terry.reedy
#33922: Enforce 64bit Python by Launcher
       https://bugs.python.org/issue33922  closed by eryksun
#33924: In IDLE menudefs, change 'windows' to 'window'
       https://bugs.python.org/issue33924  closed by terry.reedy
#33925: builtin_function_or_method compares __self__ by identity inste
       https://bugs.python.org/issue33925  closed by serhiy.storchaka
#33928: _Py_DecodeUTF8Ex() creates surrogate pairs on Windows
       https://bugs.python.org/issue33928  closed by vstinner

From steve at pearwood.info  Fri Jun 22 12:41:28 2018
From: steve at pearwood.info (Steven D'Aprano)
Date: Sat, 23 Jun 2018 02:41:28 +1000
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: 
References: <5B2B8B1F.5070605@UGent.be>
 <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru>
Message-ID: <20180622164128.GD14437@ando.pearwood.info>

On Fri, Jun 22, 2018 at 08:13:44AM -0700, Guido van Rossum wrote:

> Honestly it looks to me like the status quo is perfect.

Does this example work for you?

py> (17.1).hex == (17.1).hex
True

But:

py> a = 17.1
py> b = 17.1
py> a.hex == b.hex
False

I know why it happens -- at the REPL, the interpreter uses the same
object for both 17.1 instances when they're part of the same statement,
but not when they're on separate lines. I just don't know whether this
is desirable or not.

-- 
Steve

From vano at mail.mipt.ru  Fri Jun 22 12:48:45 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 22 Jun 2018 19:48:45 +0300
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <20180622164128.GD14437@ando.pearwood.info>
References: <5B2B8B1F.5070605@UGent.be>
 <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru>
 <20180622164128.GD14437@ando.pearwood.info>
Message-ID: 

On 22.06.2018 19:41, Steven D'Aprano wrote:
> On Fri, Jun 22, 2018 at 08:13:44AM -0700, Guido van Rossum wrote:
>
>> Honestly it looks to me like the status quo is perfect.
> Does this example work for you?
>
> py> (17.1).hex == (17.1).hex
> True
>
> But:
>
> py> a = 17.1
> py> b = 17.1
> py> a.hex == b.hex
> False
>
> I know why it happens -- at the REPL, the interpreter uses the same
> object for both 17.1 instances when they're part of the same statement,
> but not when they're on separate lines. I just don't know whether this
> is desirable or not.
>
Strictly speaking, I can't see anything in the docs about method
equality semantics. If that's true, it's an implementation detail, and
users shouldn't rely on it. Consequently, anything is "desirable" that
is sufficient for the Python codebase.
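The REPL behaviour Steven describes can also be reproduced with compile(); the following is a sketch of the CPython detail (an implementation detail, as noted above, not a language guarantee):

```python
# Within one compiled statement, CPython merges equal constants, so both
# 17.1 literals refer to a single float object in the code's constant pool.
code = compile("(17.1).hex == (17.1).hex", "<demo>", "eval")
assert code.co_consts.count(17.1) == 1

# Separately compiled statements (like two REPL lines) each get their own
# float object: equal in value, distinct in identity.
a = eval(compile("17.1", "<line1>", "eval"))
b = eval(compile("17.1", "<line2>", "eval"))
assert a == b
assert a is not b
```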
-- 
Regards,
Ivan

From mike at selik.org  Fri Jun 22 13:02:45 2018
From: mike at selik.org (Michael Selik)
Date: Fri, 22 Jun 2018 10:02:45 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: <20180622164710.4906f9b0@fsol>
References: <20180622164710.4906f9b0@fsol>
Message-ID: 

On Fri, Jun 22, 2018 at 8:09 AM Antoine Pitrou wrote:

> Thank you. Personally, I'd like to see feedback from
> educators/teachers after they take the time to read the PEP and take
> some time to think about its consequences.

I've started testing the proposed syntax when I teach. I don't have a
large sample yet, but most students either dislike it or don't
appreciate the benefits. They state a clear preference for shorter,
simpler lines at the consequence of more lines of code. This may
partially be caused by the smaller screen real estate on a projector
or large TV than a desktop monitor.

My intuition is that one strength of Python for beginners is the
relative lack of punctuation and operators compared with most other
languages. This proposal encourages denser lines with more punctuation.
Because of the order of operations, many uses of ``:=`` will also
require parentheses. Even relatively simple uses, like
``if (match := pattern.search(data)) is not None:`` require doubled
parentheses on one side or the other. Beginners are especially prone
to typographical errors with mismatched parentheses and missing colons
and get easily frustrated by the associated syntax errors.

Given the following options:

A.
    if (row := cursor.fetchone()) is None:
        raise NotFound
    return row

B.
    row = cursor.fetchone()
    if row is None:
        raise NotFound
    return row

C.
    if (row := cursor.fetchone()) is not None:
        return row
    raise NotFound

D.
    row = cursor.fetchone()
    if row is not None:
        return row
    raise NotFound

The majority of students preferred option B. I also tested some regex
match examples. Results were similar.
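The A/B pair above can be exercised side by side in a self-contained sketch (this needs Python 3.8+, where ``:=`` eventually landed; ``NotFound`` and ``FakeCursor`` are stand-ins invented for the demo, not part of any real API):

```python
class NotFound(Exception):
    pass

class FakeCursor:
    """Minimal stand-in for a DB-API cursor."""
    def __init__(self, row=None):
        self._row = row
    def fetchone(self):
        return self._row

def get_row_a(cursor):
    # Option A: assignment expression, two lines shorter.
    if (row := cursor.fetchone()) is None:
        raise NotFound
    return row

def get_row_b(cursor):
    # Option B: plain assignment, the form most students preferred.
    row = cursor.fetchone()
    if row is None:
        raise NotFound
    return row

assert get_row_a(FakeCursor(("x",))) == get_row_b(FakeCursor(("x",))) == ("x",)
```

Both functions behave identically; the disagreement is purely about which reads better.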
> My main concern is we're introducing a second different way of doing
> something which is really fundamental.

The few students who like the proposal ask why it requires creating a
new operator instead of repurposing the ``=`` operator. I'll reserve my
personal opinions for a different thread.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From mike at selik.org  Fri Jun 22 13:09:49 2018
From: mike at selik.org (Michael Selik)
Date: Fri, 22 Jun 2018 10:09:49 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: 
References: <20180622164710.4906f9b0@fsol>
Message-ID: 

On Fri, Jun 22, 2018 at 10:02 AM Michael Selik wrote:

> On Fri, Jun 22, 2018 at 8:09 AM Antoine Pitrou wrote:
>
>> Thank you. Personally, I'd like to see feedback from
>> educators/teachers after they take the time to read the PEP and take
>> some time to think about its consequences.

I forgot to add that I don't anticipate changing my lesson plans if
this proposal is accepted. There's already not enough time to teach
everything I'd like. Including a new assignment operator would distract
from the learning objectives.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From rosuav at gmail.com  Fri Jun 22 13:09:09 2018
From: rosuav at gmail.com (Chris Angelico)
Date: Sat, 23 Jun 2018 03:09:09 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: 
References: <20180622164710.4906f9b0@fsol>
Message-ID: 

On Sat, Jun 23, 2018 at 3:02 AM, Michael Selik wrote:
> On Fri, Jun 22, 2018 at 8:09 AM Antoine Pitrou wrote:
>>
>> Thank you. Personally, I'd like to see feedback from
>> educators/teachers after they take the time to read the PEP and take
>> some time to think about its consequences.
>
>
> I've started testing the proposed syntax when I teach.
I don't have a large
> sample yet, but most students either dislike it or don't appreciate the
> benefits. They state a clear preference for shorter, simpler lines at the
> consequence of more lines of code.

This is partly because students, lacking the experience to instantly
recognize larger constructs, prefer a more concrete approach to coding.
"Good code" is code where the concrete behaviour is more easily
understood. As a programmer gains experience, s/he learns to grok more
complex expressions, and is then better able to make use of the more
expressive constructs such as list comprehensions.

ChrisA

From mike at selik.org  Fri Jun 22 13:59:43 2018
From: mike at selik.org (Michael Selik)
Date: Fri, 22 Jun 2018 10:59:43 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: 
References: <20180622164710.4906f9b0@fsol>
Message-ID: 

On Fri, Jun 22, 2018 at 10:19 AM Chris Angelico wrote:

> On Sat, Jun 23, 2018 at 3:02 AM, Michael Selik wrote:
> > On Fri, Jun 22, 2018 at 8:09 AM Antoine Pitrou wrote:
> >>
> >> Thank you. Personally, I'd like to see feedback from
> >> educators/teachers after they take the time to read the PEP and take
> >> some time to think about its consequences.
> >
> >
> > I've started testing the proposed syntax when I teach. I don't have a large
> > sample yet, but most students either dislike it or don't appreciate the
> > benefits. They state a clear preference for shorter, simpler lines at the
> > consequence of more lines of code.
>
> This is partly because students, lacking the experience to instantly
> recognize larger constructs, prefer a more concrete approach to
> coding. "Good code" is code where the concrete behaviour is more
> easily understood. As a programmer gains experience, s/he learns to
> grok more complex expressions, and is then better able to make use of
> the more expressive constructs such as list comprehensions.
>

I don't think that's the only dynamic going on here. List
comprehensions are more expressive, but also more declarative and in
Python they have nice parallels with SQL and speech patterns in
natural language. The concept of a comprehension is separate from its
particular expression in Python. For example, Mozilla's array
comprehensions in Javascript are/were ugly [0].

Students who are completely new to programming can see the similarity
of list comprehensions to spoken language. They also appreciate the
revision of certain 3-line and 4-line for-loops to comprehensions. I
didn't get the same sense of "Oh! That looks better!" from my students
when revising code with an assignment expression.

Despite my best efforts to cheerlead, some students initially dislike
list comprehensions. However, they come around to the idea that
there's a tradeoff between line density and code block density.
Comprehensions have a 3-to-1 or 4-to-1 ratio of code line shrinkage.
They're also often used in sequence, like piping data through a series
of transforms. Even if students dislike a single comprehension, they
agree that turning 15 lines into 5 lines improves the readability.

In contrast, an assignment expression only has a 2-to-1 code line
compression ratio. It might save a level of indentation, but I think
there are usually alternatives. Also, the assignment expression is
less likely to be used several times in the same block.
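The loop-to-comprehension shrinkage described above can be illustrated with a generic pair (the example data here is invented purely for the sketch):

```python
words = ["alpha", "beta", "gamma", "echo"]

# Statement form: four lines.
lengths = []
for w in words:
    if "a" in w:
        lengths.append(len(w))

# Comprehension form: one line, same result.
lengths2 = [len(w) for w in words if "a" in w]

assert lengths == lengths2 == [5, 4, 5]
```

An assignment expression, by contrast, typically collapses only two lines into one, which is the weaker compression ratio being discussed.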
A good pitch for an assignment expression is refactoring a cascade of
regular expressions:

    for line in f:
        mo = foo_re.search(line)
        if mo is not None:
            foo(mo.groups())
            continue
        mo = bar_re.search(line)
        if mo is not None:
            bar(mo.groups())
            continue
        mo = baz_re.search(line)
        if mo is not None:
            baz(mo.groups())
            continue

Here the assignment operator makes a clear improvement:

    for line in f:
        if (mo := foo_re.search(line)) is not None:
            foo(mo.groups())
        elif (mo := bar_re.search(line)) is not None:
            bar(mo.groups())
        elif (mo := baz_re.search(line)) is not None:
            baz(mo.groups())

However, I think this example is cheating a bit. While I've written
similar code many times, it's almost never just a function call in each
if-block. It's nearly always a handful of lines of logic which I
wouldn't want to cut out into a separate function. The refactor is
misleading, because I'd nearly always make a visual separation with a
newline and the code would still look similar to the initial example.

[0] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Array_comprehensions

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From chris.barker at noaa.gov  Fri Jun 22 14:28:45 2018
From: chris.barker at noaa.gov (Chris Barker)
Date: Fri, 22 Jun 2018 11:28:45 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: 
References: <20180622164710.4906f9b0@fsol>
Message-ID: 

On Fri, Jun 22, 2018 at 10:09 AM, Michael Selik wrote:

> I forgot to add that I don't anticipate changing my lesson plans if this
> proposal is accepted. There's already not enough time to teach everything
> I'd like. Including a new assignment operator would distract from the
> learning objectives.

nor would I. For a while, anyway.... But once it becomes a more common
idiom, students will see it in the wild pretty early in their path to
learning python.
So we'll need to start introducing it earlier than later.

I think this reflects that the "smaller" a language is, the easier it
is to learn. Python has already grown a fair bit since 1.5 (when I
started using it :-) ). Some things, like generators, are special
purpose enough that I can wait pretty far into the program before
teaching them. But others, like comprehensions (and lambda) are common
enough that I have to introduce them pretty early on.

Adding := is not a HUGE change, but it IS an expansion of the language,
and one that we WILL have to introduce in an introductory course once
it starts seeing common use. I really have no idea how much harder
that's going to make the language to teach, but it will make it a bit
harder -- I see enough confusion with "is" vs == already...

-CHB

-- 
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From guido at python.org  Fri Jun 22 14:44:07 2018
From: guido at python.org (Guido van Rossum)
Date: Fri, 22 Jun 2018 11:44:07 -0700
Subject: [Python-Dev] About [].append == [].append
In-Reply-To: <20180622164128.GD14437@ando.pearwood.info>
References: <5B2B8B1F.5070605@UGent.be>
 <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru>
 <20180622164128.GD14437@ando.pearwood.info>
Message-ID: 

On Fri, Jun 22, 2018 at 9:43 AM Steven D'Aprano wrote:

> On Fri, Jun 22, 2018 at 08:13:44AM -0700, Guido van Rossum wrote:
>
> > Honestly it looks to me like the status quo is perfect.
>
> Does this example work for you?
>
> py> (17.1).hex == (17.1).hex
> True
>
> But:
>
> py> a = 17.1
> py> b = 17.1
> py> a.hex == b.hex
> False
>
> I know why it happens -- at the REPL, the interpreter uses the same
> object for both 17.1 instances when they're part of the same statement,
> but not when they're on separate lines. I just don't know whether this
> is desirable or not.

But there's nothing new about that example. It's just the same as the
issue that sometimes `1 is 1` and sometimes it isn't.

-- 
--Guido van Rossum (python.org/~guido)

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tjreedy at udel.edu  Fri Jun 22 15:07:32 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 22 Jun 2018 15:07:32 -0400
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
Message-ID: 

On 6/22/2018 7:17 AM, Christian Tismer wrote:
>
> My problem is to find out how to deal with a class which has
> __getitem__ but no __len__.
>
> The documentation suggests that the length of a sequence can always
> be obtained by len().
> https://docs.python.org/3/reference/datamodel.html

It says that plainly: "The built-in function len() returns the number
of items of a sequence."

https://docs.python.org/3/library/collections.abc.html#collections-abstract-base-classes
says that a Sequence has both __getitem__ and __len__.

I am surprised that a C-API function calls something a 'sequence'
without it having __len__.
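The kind of class Christian describes is easy to reproduce in pure Python; a minimal sketch (the Squares class is invented for illustration):

```python
class Squares:
    """Indexable and iterable, but deliberately has no __len__."""
    def __getitem__(self, i):
        if i >= 5:
            raise IndexError(i)
        return i * i

s = Squares()
# iter() falls back to the legacy __getitem__ protocol: it asks for
# index 0, 1, 2, ... until IndexError is raised.
assert list(s) == [0, 1, 4, 9, 16]

# len() has no fallback; without __len__ it simply fails.
try:
    len(s)
except TypeError:
    pass
```

So an object can behave like a sequence for indexing and iteration while len() raises TypeError, which is exactly the mismatch being debated.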
-- 
Terry Jan Reedy

From vano at mail.mipt.ru  Fri Jun 22 15:17:19 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 22 Jun 2018 22:17:19 +0300
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: 
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
Message-ID: <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>

On 22.06.2018 22:07, Terry Reedy wrote:
> On 6/22/2018 7:17 AM, Christian Tismer wrote:
>
>>
>> My problem is to find out how to deal with a class which has
>> __getitem__ but no __len__.
>>
>> The documentation suggests that the length of a sequence can always
>> be obtained by len().
>> https://docs.python.org/3/reference/datamodel.html
>
> It says that plainly: "The built-in function len() returns the number
> of items of a sequence."
>
> https://docs.python.org/3/library/collections.abc.html#collections-abstract-base-classes
> says that a Sequence has both __getitem__ and __len__.
>
> I am surprised that a C-API function calls something a 'sequence'
> without it having __len__.
>
A practical sequence check is checking for __iter__ . An iterator
doesn't necessarily have a defined length -- e.g. a stream or a
generator.

-- 
Regards,
Ivan

From vano at mail.mipt.ru  Fri Jun 22 15:30:20 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 22 Jun 2018 22:30:20 +0300
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
 <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>
Message-ID: 

On 22.06.2018 22:17, Ivan Pozdeev wrote:
> On 22.06.2018 22:07, Terry Reedy wrote:
>> On 6/22/2018 7:17 AM, Christian Tismer wrote:
>>
>>>
>>> My problem is to find out how to deal with a class which has
>>> __getitem__ but no __len__.
>>>
>>> The documentation suggests that the length of a sequence can always
>>> be obtained by len().
>>> https://docs.python.org/3/reference/datamodel.html
>>
>> It says that plainly: "The built-in function len() returns the number
>> of items of a sequence."
>>
>> https://docs.python.org/3/library/collections.abc.html#collections-abstract-base-classes
>> says that a Sequence has both __getitem__ and __len__.
>>
>> I am surprised that a C-API function calls something a 'sequence'
>> without it having __len__.
>>
> A practical sequence check is checking for __iter__ . An iterator
> doesn't necessarily have a defined length -- e.g. a stream or a
> generator.
>
Now, I know this isn't what
https://docs.python.org/3/glossary.html#term-sequence says. But
practically, the documentation seems to use "sequence" in the sense
"finite iterable". Functions that need to know the length of input in
advance seem to be the minority.

-- 
Regards,
Ivan

From p.f.moore at gmail.com  Fri Jun 22 15:35:37 2018
From: p.f.moore at gmail.com (Paul Moore)
Date: Fri, 22 Jun 2018 20:35:37 +0100
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
 <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>
Message-ID: 

On 22 June 2018 at 20:17, Ivan Pozdeev via Python-Dev wrote:
> On 22.06.2018 22:07, Terry Reedy wrote:
>> https://docs.python.org/3/library/collections.abc.html#collections-abstract-base-classes
>>
>> says that a Sequence has both __getitem__ and __len__.
>>
>> I am surprised that a C-API function calls something a 'sequence' without
>> it having __len__.
>>
> A practical sequence check is checking for __iter__ . An iterator doesn't
> necessarily have a defined length -- e.g. a stream or a generator.
There's a difference between the ABC "Sequence" and the informally
named sequence concept used in the C API. It's basically just that the
C API term predates the ABC significantly, and there's no way that
we'd change the C API naming because it would break too much code, but
IMO it's just one of those "historical reasons" type of things that
can't really be adequately explained, but just needs to be accepted...

An ABC Sequence has __getitem__ and __len__. In terms of ABCs,
something with __iter__ is an Iterable. Informal terminology is a
different matter...

Paul

From greg.ewing at canterbury.ac.nz  Fri Jun 22 19:06:15 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 23 Jun 2018 11:06:15 +1200
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To: 
References: 
Message-ID: <5B2D80E7.5010501@canterbury.ac.nz>

Nick Coghlan wrote:
> x:= f():" implies "x" is already defined as a target somewhere else in
> the current scope, while "if x := f() given x:" potentially introduces
> "x" as a new local target

Noooo..... this is just taking a bad idea and making it worse, IMO.

I'm -1 on any contortions designed to allow comprehensions to assign
to things in outer scopes. All the proposed use cases I've seen for
this have not improved readability over writing a function that does
things the usual way.

Can we please leave comprehensions as declarative constructs? The best
tools do just one thing and do it well. These proposals seem to be
trying to turn comprehensions into swiss army knives.

-- 
Greg

From brett at python.org  Fri Jun 22 18:21:25 2018
From: brett at python.org (Brett Cannon)
Date: Fri, 22 Jun 2018 15:21:25 -0700
Subject: [Python-Dev] We now have C code coverage!
Message-ID: 

Thanks to a PR from Ammar Askar we now run Python under lcov as part of
the code coverage build.
And thanks to codecov.io automatically merging code coverage reports we
get a complete report of our coverage (the first results of which can
now be seen at https://codecov.io/gh/python/cpython).

And funny enough the coverage average changed less than 1%. :)

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From greg.ewing at canterbury.ac.nz  Fri Jun 22 19:57:15 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 23 Jun 2018 11:57:15 +1200
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: 
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
Message-ID: <5B2D8CDB.7090900@canterbury.ac.nz>

Terry Reedy wrote:
> I am surprised that a C-API function calls something a 'sequence'
> without it having __len__.

It's a bit strange that PySequence_Check exists at all. The principle
of duck typing would suggest that one should be checking for the
specific methods one needs.

I suspect it's a holdover from very early Python, where the notion of
a "sequence type" and a "mapping type" were more of a concrete thing.
This is reflected in the existence of the tp_as_sequence and
tp_as_mapping substructures. It was expected that a given type would
either implement all the methods in one of those substructures or none
of them, so shortcuts such as checking for just one method and
assuming the others would exist made sense.

But user-defined classes messed all that up, because it became
possible to create a type that has __getitem__ but not __len__, etc.
It also made it impossible to distinguish reliably between a sequence
and a mapping.

So it seems to me that PySequence_Check and related functions are not
very useful any more, since it's not possible for them to really do
what they claim to do.
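The point about sequences and mappings being indistinguishable by method presence alone can be shown from pure Python; a sketch with two classes invented for the demo:

```python
from collections.abc import Mapping, Sequence

class ByKey:
    """Mapping-like: __getitem__ takes keys."""
    def __getitem__(self, key):
        return {"a": 1}[key]

class ByIndex:
    """Sequence-like: __getitem__ takes integer indexes."""
    def __getitem__(self, i):
        return [10, 20, 30][i]

# Both expose __getitem__, so a method-presence check cannot tell them
# apart -- and neither satisfies the ABCs, which also require __len__
# (and, for Mapping, keys()/items()/etc.) or explicit registration.
for cls in (ByKey, ByIndex):
    assert hasattr(cls, "__getitem__")
    assert not issubclass(cls, Sequence)
    assert not issubclass(cls, Mapping)
```

This is the Python-level face of the same ambiguity: a subscript check can establish "subscriptable", but not which protocol the object intends.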
-- 
Greg

From greg.ewing at canterbury.ac.nz  Fri Jun 22 20:08:24 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 23 Jun 2018 12:08:24 +1200
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: 
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
 <1b9ccd6b-a4e5-3761-6c4b-09151a967ede@mail.mipt.ru>
Message-ID: <5B2D8F78.4070108@canterbury.ac.nz>

Ivan Pozdeev via Python-Dev wrote:
> the documentation seems to use "sequence" in the sense
> "finite iterable". Functions that need to know the length of input in
> advance seem to be the minority.

The official classifications we have are:

   Sequence: __iter__, __getitem__, __len__
   Iterable: __iter__

There isn't any official term for a sequential thing that has __iter__
and __getitem__ but not __len__. That's probably because the need for
such a thing doesn't seem to come up very much. One usually processes
a potentially infinite sequence by iterating over it, not picking
things out at arbitrary positions. And usually its items are generated
by an algorithm that works sequentially, so random access would be
difficult to implement.

-- 
Greg

From tjreedy at udel.edu  Fri Jun 22 20:30:25 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 22 Jun 2018 20:30:25 -0400
Subject: [Python-Dev] PySequence_Check but no __len__
In-Reply-To: <5B2D8CDB.7090900@canterbury.ac.nz>
References: <8f1c59b3-9206-7dfc-a538-545859391066@stackless.com>
 <9ca8588c-3b5c-5b35-0722-91f653a8b483@stackless.com>
 <5B2D8CDB.7090900@canterbury.ac.nz>
Message-ID: 

On 6/22/2018 7:57 PM, Greg Ewing wrote:
> Terry Reedy wrote:
>> I am surprised that a C-API function calls something a 'sequence'
>> without it having __len__.
>
> It's a bit strange that PySequence_Check exists at all.
> The principle of duck typing would suggest that one
> should be checking for the specific methods one needs.
> > I suspect it's a holdover from very early Python, where > the notion of a "sequence type" and a "mapping type" > were more of a concrete thing. This is reflected in > the existence of the tp_as_sequence and tp_as_mapping > substructures. It was expected that a given type would > either implement all the methods in one of those > substructures or none of them, so shorcuts such as > checking for just one method and assuming the others > would exist made sense. > > But user-defined classes messed all that up, because > it became possible to create a type that has __getitem__ > but not __len__, etc. It also made it impossible to > distinguish reliably between a sequence and a mapping. > > So it seems to me that PySequence_Check and related > functions are not very useful any more, since it's not > possible for them to really do what they claim to do. So one should not take them as defining what they appear to define. In a sense, 'PySequence_Check' should be 'PySubscriptable_Check'. -- Terry Jan Reedy From tjreedy at udel.edu Fri Jun 22 20:43:25 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 22 Jun 2018 20:43:25 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/22/2018 6:21 PM, Brett Cannon wrote: > Thanks to a PR from Ammar Askar we now run Python under lcov as part of > the code coverage build. And thanks to codecov.io > automatically merging code coverage reports we get a complete report of > our coverage (the first results of which can now be seen at > https://codecov.io/gh/python/cpython). > > And funny enough the coverage average changed less than 1%. :) Questions: 1. Is it possible, given that we are not paying for those reports, to customize the 'exclude_lines' definitions? Without such, the idlelib measures are biased downward. 2. What do the colors of test files mean? Every line of nearly all the idlelib test files are executed, but over half are red. 
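[Editor's note: for context on Terry's question 1, exclude_lines is a coverage.py report option read from .coveragerc; a minimal sketch of the kind of section that would be customized (entries are regexes; "pragma: no cover" is coverage.py's default marker):]

```ini
[report]
exclude_lines =
    pragma: no cover
    if __name__ == .__main__.:
```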
The Learn More page does not say anything about either. -- Terry Jan Reedy From tjreedy at udel.edu Fri Jun 22 21:16:00 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 22 Jun 2018 21:16:00 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/22/2018 8:43 PM, Terry Reedy wrote: > On 6/22/2018 6:21 PM, Brett Cannon wrote: >> Thanks to a PR from Ammar Askar we now run Python under lcov as part >> of the code coverage build. And thanks to codecov.io >> automatically merging code coverage reports we get >> a complete report of our coverage (the first results of which can now >> be seen at https://codecov.io/gh/python/cpython). >> >> And funny enough the coverage average changed less than 1%. :) > > Questions: > > 1. Is it possible, given that we are not paying for those reports, to > customize the .coveragerc exclude_lines definitions? Without such, the idlelib > measures are biased downward. > > 2. What do the colors of test files mean? Every line of nearly all the > idlelib test files are executed, but over half are red. > > The Learn More page does not say anything about either. I discovered the answer to 2. by shift-clicking on a test_x file to see their coverage report for the file. The colors actually do reflect the test lines executed. codecov.io excludes gui tests*, so the reported coverage for tkinter, idlelib, and turtle is deceptive and bogus, and under-reports the total cpython coverage by a percent or two. It would be better to exclude these modules. * I assume that codecov.io uses linux servers. I have read that there are programs that simulate X-Windows so that gui code will execute without actual terminals. -- Terry Jan Reedy From njs at pobox.com Fri Jun 22 21:40:23 2018 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 22 Jun 2018 18:40:23 -0700 Subject: [Python-Dev] We now have C code coverage!
In-Reply-To: References: Message-ID: On Fri, Jun 22, 2018 at 6:16 PM, Terry Reedy wrote: > On 6/22/2018 8:43 PM, Terry Reedy wrote: >> >> On 6/22/2018 6:21 PM, Brett Cannon wrote: >>> >>> Thanks to a PR from Ammar Askar we now run Python under lcov as part of >>> the code coverage build. And thanks to codecov.io >>> automatically merging code coverage reports we get a complete report of our >>> coverage (the first results of which can now be seen at >>> https://codecov.io/gh/python/cpython). >>> >>> And funny enough the coverage average changed less than 1%. :) >> >> >> Questions: >> >> 1. Is it possible, given that we are not paying for those reports, to >> customize the .coveragerc exclude_lines definitions? Without such, the >> idlelib measures are biased downward. >> >> 2. What do the colors of test files mean? Every line of nearly all the >> idlelib test files are executed, but over half are red. >> >> The Learn More page does not say anything about either. > > > I discovered the answer to 2. by shift-clicking on a text_x file to see > their coverage report for the file. The colors actually do reflect the test > lines executed. codecov.io excludes gui tests*, so the reported coverage > for tkinter, idlelib, and turtle is deceptive and bogus, and under-reports > the total cpython coverage by a percent or two. It would be better to > exclude these modules. > > * I assume that codecov.io uses linux servers. I have read that there are > programs that simulate X-Windows so that gui code will execute without > actual terminals. Codecov.io doesn't run any tests itself; it's just a service for aggregation and reporting. The coverage information is being gathered while running CPython's regular CI tests, and then uploaded to codecov.io to view. So if you want to run the gui tests -- which seems like a good idea if possible! -- then the way to do that would be to make them run as part of the regular Travis/Appveyor/VSTS checks. -n -- Nathaniel J. 
Smith -- https://vorpus.org From steve at pearwood.info Fri Jun 22 21:50:39 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 23 Jun 2018 11:50:39 +1000 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> Message-ID: <20180623015038.GF14437@ando.pearwood.info> On Fri, Jun 22, 2018 at 11:44:07AM -0700, Guido van Rossum wrote: [...] > > I know why it happens -- at the REPL, the interpreter uses the same > > object for both 17.1 instances when they're part of the same statement, > > but not when they're on separate lines. I just don't know whether this > > is desirable or not. > > > > But there's nothing new about that example. It's just the same as the issue > that sometimes `1 is 1` and sometimes it isn't. Sure, but this is closer to "sometimes 1 == 1 and sometimes it isn't". But if you're okay with it, I don't have a counter-argument. I think it is more important that builtin methods and Python methods behave the same. Should Python methods be changed to compare self with "is" or are we too late to make that change? -- Steve From guido at python.org Fri Jun 22 22:21:27 2018 From: guido at python.org (Guido van Rossum) Date: Fri, 22 Jun 2018 19:21:27 -0700 Subject: [Python-Dev] About [].append == [].append In-Reply-To: <20180623015038.GF14437@ando.pearwood.info> References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <20180623015038.GF14437@ando.pearwood.info> Message-ID: On Fri, Jun 22, 2018 at 6:52 PM Steven D'Aprano wrote: > On Fri, Jun 22, 2018 at 11:44:07AM -0700, Guido van Rossum wrote: > [...] > > > I know why it happens -- at the REPL, the interpreter uses the same > > > object for both 17.1 instances when they're part of the same statement, > > > but not when they're on separate lines. 
I just don't know whether this > > > is desirable or not. > > > > > > > But there's nothing new about that example. It's just the same as the > issue > > that sometimes `1 is 1` and sometimes it isn't. > > Sure, but this is closer to "sometimes 1 == 1 and sometimes it isn't". > But if you're okay with it, I don't have a counter-argument. > A bound method is a fairly complicated object, and for builtin bound methods, the == comparison has the following definition: - if the `__self__` objects are not the same object, return False - otherwise, return True iff it's the same method (i.e. the same name / the same underlying C function) > I think it is more important that builtin methods and Python methods > behave the same. Should Python methods be changed to compare self with > "is" or are we too late to make that change? > I am not sure. It's surprising, but I fear it may be too late to change. Are there tests in the stdlib for this behavior? -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Fri Jun 22 22:23:18 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 23 Jun 2018 12:23:18 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180622164710.4906f9b0@fsol> Message-ID: <20180623022317.GG14437@ando.pearwood.info> On Fri, Jun 22, 2018 at 11:28:45AM -0700, Chris Barker via Python-Dev wrote: > On Fri, Jun 22, 2018 at 10:09 AM, Michael Selik wrote: > > > I forgot to add that I don't anticipate changing my lesson plans if this > > proposal is accepted. There's already not enough time to teach everything > > I'd like. Including a new assignment operator would distract from the > > learning objectives. > > > > nor would I. For a while, anyway.... 
> > But once it becomes a more common idiom, students will see it in the wild > pretty early in their path to learning python. So we'll need to start > introducing it earlier than later. Students see many features early in their path. I've had people still struggling with writing functions ask about metaclasses. People will see async code everywhere. We don't have to teach *everything* at once. The *subtleties* of assignment expressions might have some funny corner cases, but the high-level overview is simple. It is like ordinary assignment, but it is an expression that returns the value being assigned. So if you absolutely need to teach it to a beginner, it shouldn't be difficult once they understand the difference between an expression and a statement. [...] > I really have no idea how much harder that's going to make the language to > teach, but it will make it a bit harder -- I see enough confusion with "is" > vs == already... I think that the biggest source of confusion with "is" is that it *sometimes* seems to do what is wanted, i.e. test equality, but other times doesn't. It is that inconsistency that bites. Whereas with assignment expressions, there's no such inconsistency: - regular assignment using = only works as a statement, always; - assignment expression can go anywhere an expression can go, always; - regular assignment never returns a value; - assignment expression always returns a value; - regular assignments have lots of complex forms, such as sequence unpacking, and complex targets like spam[eggs](arg).attr; - assignment expressions only take a plain name, always. Although there is some overlap in behaviour between the two, unlike "is", there's no inconsistent behaviour to lead people astray. A better syntax error for things like this: py> if mo = regex.match(string): File "<stdin>", line 1 if mo = regex.match(string): ^ SyntaxError: invalid syntax will also help, although of course some users won't read error messages for love or money.
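[Editor's note: the contrast Steven describes can be written out under the proposed := spelling (a sketch of the PEP 572 semantics being discussed; it requires an interpreter that implements them):]

```python
import re

pattern = re.compile(r"\d+")
text = "order 66"

# Proposed: assignment is an expression, so it can live in the condition.
if (mo := pattern.search(text)):
    print(mo.group())        # 66

# Plain '=' stays statement-only; this remains a SyntaxError:
#   if mo = pattern.search(text): ...
```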
-- Steve From steve at pearwood.info Fri Jun 22 22:46:17 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 23 Jun 2018 12:46:17 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180622164710.4906f9b0@fsol> Message-ID: <20180623024616.GH14437@ando.pearwood.info> On Fri, Jun 22, 2018 at 10:59:43AM -0700, Michael Selik wrote: > > > I've started testing the proposed syntax when I teach. I don't have a > > > large > > > sample yet, but most students either dislike it or don't appreciate the > > > benefits. They state a clear preference for shorter, simpler lines at the > > > consequence of more lines of code. Of course they do -- they're less fluent at reading code. They don't have the experience to judge good code from bad. The question we should be asking is, do we only add features to Python if they are easy for beginners? It's not that I especially want to add features which *aren't* easy for beginners, but Python isn't Scratch and "easy for beginners" should only be a peripheral concern. > > This is partly because students, lacking the experience to instantly > > recognize larger constructs, prefer a more concrete approach to > > coding. "Good code" is code where the concrete behaviour is more > > easily understood. As a programmer gains experience, s/he learns to > > grok more complex expressions, and is then better able to make use of > > the more expressive constructs such as list comprehensions. > > > > I don't think that's the only dynamic going on here. List comprehensions > are more expressive, but also more declarative and in Python they have nice > parallels with SQL and speech patterns in natural language. The concept of > a comprehension is separate from its particular expression in Python. For > example, Mozilla's array comprehensions in Javascript are/were ugly [0]. 
Mozilla's array comprehensions are almost identical to Python's, aside from a couple of trivial differences: evens = [for (i of numbers) if (i % 2 === 0) i]; compared to: evens = [i for i in numbers if (i % 2 == 0)] - the inexplicable (to me) decision to say "for x of array" instead of "for x in array"; - moving the expression to the end, instead of the beginning. The second one is (arguably, though not by me) an improvement, since it preserves a perfect left-to-right execution order within the comprehension. >> Students who are completely new to programming can see the similarity of > list comprehensions to spoken language. o_O I've been using comprehensions for something like a decade, and I can't :-) The closest analogy to comprehensions I know of is set builder notation in mathematics, which is hardly a surprise. That's where Haskell got the inspiration from, and their syntax is essentially an ASCIIfied version of set builder notation: Haskell: [(i,j) | i <- [1,2], j <- [1..4]] Maths: {(i,j) : i ∈ {1, 2}, j ∈ {1...4}} I teach secondary school children maths, and if there's a plain English natural language equivalent to list builder notation, neither I nor any of my students, nor any of the text books I've read, have noticed it. -- Steve From steve at pearwood.info Fri Jun 22 23:48:47 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 23 Jun 2018 13:48:47 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: <20180623034847.GI14437@ando.pearwood.info> On Sat, Jun 23, 2018 at 12:22:33AM +1000, Nick Coghlan wrote: [...] > * for the reactions to my description of the currently proposed parent > local scoping behaviour in comprehensions, I'd use the word > "horrified", and feel I wasn't overstating the response :) Without knowing how you worded the question, and the reasons for this horrified reaction, I'm afraid that isn't really helpful.
It is nothing more than an appeal to emotion: https://en.wikipedia.org/wiki/Wisdom_of_repugnance Such strong emotions as "horrified" are typically a sign of an immediate, emotional gut reaction, not careful thought. We often see those sorts of reactions attached to the most objectively trivial matters. Immediate gut reactions are rarely a good guide because they tend to over-value the status quo, exaggerate the difficulty and costs of change, and under-estimate the benefits. Speaking personally, I've learned to question my immediately gut reaction. (And I remember to do so at least half the time.) PEP 572 is an example: when the issue was first raised back in February, my gut reaction was "Not in MY Python!!!" but by taking it seriously and running through some examples over the course of the discussion, I realised that, actually, I cautiously favour the idea. Of course, matters of *personal taste* cannot be anything but gut reaction, but in those matters, what one person holds strongly another can legitimately reject strongly. We ought to try to look beyond personal taste, and try (even if only imperfectly) to consider rational reasons for and against a proposal. If we do, reactions like "horrified" are rarely justified. It's just a minor feature in a programming language, the world will go on one way or the other, and Python already has trickier gotchas. > While I try to account for the fact that I implemented the current > comprehension semantics for the 3.x series, and am hence biased > towards considering them the now obvious interpretation, While we certainly don't want to make "non-obvious" a virtue for its own sake, obviousness (obvious to who?) ought to take a distant second place to *useful*. Otherwise we'd have to give up an awful lot of existing Python, starting with the fundamental execution model. (Oh, the number and length of arguments about whether Python uses call by value or call by reference, why mutable defaults and [[]]*3 are "broken"... 
if you think Python's execution model is "obvious" you've been using Python too long ;-) But as Tim Peters has said on a number of occasions, nobody is suggesting changing the interpretation of current comprehension semantics. Comprehension loop variables will continue to remain isolated to the comprehension. (And for the record, that makes *comprehensions* a weird special case, not assignment expressions. All other expressions run in the current lexical scope. Comprehensions introduce an implicit, invisible, sub-local scope that doesn't match up with a change in indentation as class and def statements do.) The behaviour in question is a matter of *assignment expression* semantics, not comprehensions. And honestly, I don't see why the proposed behaviour is "horrifying". Here's the high-level overview: - at the top level of a module, assignment expressions assign in the global scope; - inside a class, assignment expressions assign in the class scope; - inside a function, assignment expressions assign in the function local scope (unless declared global or nonlocal); - inside a comprehension, assignment expressions assign in the surrounding lexical scope (the surrounding function, class or module). The first three are the same as ordinary statement assignment. The last one is what you would expect if you treat comprehensions as any other expression which run in the current lexical scope. (The current function or class or module.) Even if we treat it as a "weird special case" (I don't think it is, but for the sake of the argument let's say it is) its not hard to explain. As I discuss below, you can get a very long way indeed working with comprehensions without once thinking about the scope they run in. By the time you need to think about comprehension scope, it shouldn't be hard to deal with the rule: - loop variables are hidden in a comprehension private scope; - explicit assignment expression variables are not. This is not async, or metaclasses, or even Unicode. 
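[Editor's note: the two bullet points above can be demonstrated directly with the := spelling from PEP 572 (a sketch of the proposed semantics; runnable only on an interpreter that implements them):]

```python
values = [2, 3, 4]

# An assignment expression inside the comprehension escapes to this scope...
squares = [last := v * v for v in values]
print(squares)   # [4, 9, 16]
print(last)      # 16

# ...while the loop variable stays private to the comprehension:
try:
    v
except NameError:
    print("v is not defined out here")
```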
[...] > plenty of > functional-language-inspired documentation to instead encourage folks > to view comprehensions as tightly encapsulated declarative container > construction syntax. I can't say I've done a broad survey, but the third-party documentation I've read on comprehensions typically glosses over the scoping issues without mentioning them. To the extent that scoping is even hinted at, comprehensions are treated as expressions which are exactly equivalent to re-writing them as a for-loop in the current scope. This is a typical example, found as the top result on googling for "python comprehensions": https://www.google.com/search?q=python+comprehensions http://www.pythonforbeginners.com/basics/list-comprehensions-in-python Nothing is mentioned about scope, and it repeats the inaccurate but simple equivalency: for item in list: if conditional: expression But perhaps that tutorial is too old. Okay this recent one is only a little more than a year old: https://hackernoon.com/list-comprehension-in-python-8895a785550b Again, no mention of scoping issues, comprehensions are simply expressions which presumably run in the same scope as any other expression. I think you over-estimate how many newcomers to Python are even aware that the scope of comprehensions is something to consider. > I'm currently working on a concept proposal at > https://github.com/ncoghlan/peps/pull/2 that's much closer to PEP 572 > than any of my previous `given` based suggestions: [...] 
I look forward to reading it, and I promise I won't go by my gut reaction :-) -- Steve From rosuav at gmail.com Sat Jun 23 00:06:21 2018 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 23 Jun 2018 14:06:21 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180623034847.GI14437@ando.pearwood.info> References: <20180623034847.GI14437@ando.pearwood.info> Message-ID: On Sat, Jun 23, 2018 at 1:48 PM, Steven D'Aprano wrote: > I can't say I've done a broad survey, but the third-party documentation > I've read on comprehensions typically glosses over the scoping issues > without mentioning them. To the extent that scoping is even hinted at, > comprehensions are treated as expressions which are exactly equivalent > to re-writing them as a for-loop in the current scope. Even first-party documentation elides that distinction. The same inaccurate-but-simple equivalency - even using the word "equivalent" - comes up here: https://docs.python.org/3/howto/functional.html?highlight=equivalent#generator-expressions-and-list-comprehensions So I'm very sympathetic to the desire to have assignment expressions inside comprehensions behave like assignment expressions outside comprehensions. The trouble is that they are then _not_ the same as other names inside comprehensions. One way or another, there's a confusing distinction, especially at class scope. Unless this comes with an actual semantic change that affects existing code, there is going to be a bizarre disconnect *somewhere*, and it's just a matter of where. 
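[Editor's note: the class-scope disconnect Chris mentions is already visible today, without assignment expressions; a small sketch:]

```python
class C:
    x = 2
    try:
        ys = [x * i for i in range(3)]   # looks fine, but...
    except NameError:
        # ...the comprehension body runs in its own scope, which skips
        # the class namespace, so 'x' is not found there.
        ys = "class names invisible inside the comprehension"
    # The outermost iterable IS evaluated in class scope, so this works:
    zs = [i for i in range(x)]

print(C.ys)
print(C.zs)   # [0, 1]
```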
ChrisA From chris.barker at noaa.gov Sat Jun 23 00:08:37 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Fri, 22 Jun 2018 21:08:37 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180623022317.GG14437@ando.pearwood.info> References: <20180622164710.4906f9b0@fsol> <20180623022317.GG14437@ando.pearwood.info> Message-ID: On Fri, Jun 22, 2018 at 7:23 PM, Steven D'Aprano wrote: > > But once it becomes a more common idiom, students will see it in the wild > > pretty early in their path to learning python. So we'll need to start > > introducing it earlier than later. > > Students see many features early in their path. I've had people still > struggling with writing functions ask about metaclasses. People > will see async code everywhere. We don't have to teach *everything* at > once. > These are not similar at all -- if you want similar examples, I'd say comprehensions, and lambda, both of which I DO introduce fairly early. While newbies will *ask* about metaclasses, it's probably because they read about them somewhere, not because someone actually used a metaclass in a simple script or answer to a common question on SO. As for async, you are either doing async or not -- you can't even run an async def function without an event loop -- so again, it won't show up in real code newbies need to understand (at least until async becomes common practice with python...) -CHB > So if you absolutely need to teach it to a beginner, it > shouldn't be difficult once they understand the difference between an > expression and a statement. > probably not, though that's a distinction that's mostly academic in the early stages of learning, it may become more critical now... again, not a huge deal, just a little bit more complexity -CHB -- Christopher Barker, Ph.D.
Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sat Jun 23 01:52:07 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 23 Jun 2018 15:52:07 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180622164710.4906f9b0@fsol> <20180623022317.GG14437@ando.pearwood.info> Message-ID: <20180623055205.GJ14437@ando.pearwood.info> On Fri, Jun 22, 2018 at 09:08:37PM -0700, Chris Barker wrote: > > So if you absolutely need to teach it to a beginner, it > > shouldn't be difficult once they understand the difference between an > > expression and a statement. > > > > probably not, though that's a distinction that's mostly academic in the > early stages of learning, I don't think so. People do try to use assignment in expressions, even if only by mistake writing = when they meant == and need to distinguish between them. In Python 2, the most common clash between statements and expressions was print, but at least that's gone. https://www.quora.com/Whats-the-difference-between-a-statement-and-an-expression-in-Python-Why-is-print-%E2%80%98hi%E2%80%99-a-statement-while-other-functions-are-expressions https://stackoverflow.com/questions/4728073/what-is-the-difference-between-an-expression-and-a-statement-in-python https://stackoverflow.com/questions/43435850/what-is-the-difference-between-a-statement-and-a-function-in-python Even without assignment expressions, people still need to know why they can't write "if mo = re.match(pattern, text)". > again, not a huge deal, just a little bit more complexity Every new feature is added complexity. 
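[Editor's note: the statement/expression distinction those links discuss can be shown mechanically: eval() accepts only expressions, so feeding it an assignment statement fails at compile time. A minimal sketch:]

```python
# An expression has a value, so eval() accepts it:
print(eval("1 + 1"))              # 2

# An assignment statement does not, so eval() rejects it before running it:
try:
    eval("x = 1")
except SyntaxError:
    print("assignment is a statement, not an expression")
```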
-- Steve From storchaka at gmail.com Sat Jun 23 02:11:06 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 23 Jun 2018 09:11:06 +0300 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <20180623015038.GF14437@ando.pearwood.info> Message-ID: 23.06.18 05:21, Guido van Rossum wrote: > A bound method is a fairly complicated object, and for builtin bound > methods, the == comparison has the following definition: > - if the `__self__` objects are not the same object, return False > - otherwise, return True iff it's the same method (i.e. the same name / > the same underlying C function) > > I think it is more important that builtin methods and Python methods > behave the same. Should Python methods be changed to compare self with > "is" or are we too late to make that change? > > > I am not sure. It's surprising, but I fear it may be too late to change. > Are there tests in the stdlib for this behavior? Two tests fail if we change the current behavior. Both were added by fd01d7933bc3e9fd64d81961fbb7eabddcc82bc3 in issue1350060 together with changing comparisons for methods. https://github.com/python/cpython/commit/fd01d7933bc3e9fd64d81961fbb7eabddcc82bc3 https://bugs.python.org/issue1350060 It changed the behavior "for consistency". But unfortunately it made it less consistent (and broke your definition). There are different kinds of methods, and currently they have different behavior. >>> [].append == [].append False >>> [].__iadd__ == [].__iadd__ True >>> UserList().append == UserList().append True Seems the last two examples returned False before issue1350060. I think changes made in issue1350060 should be reverted. See https://github.com/python/cpython/pull/7848 . But perhaps only in the master branch for minimizing possible breakage.
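[Editor's note: Guido's two-step definition for builtin bound methods is easy to check directly; this sketch sticks to the list.append behaviour that is stable across the versions discussed here, leaving out the slot-wrapper and pure-Python cases whose results the thread shows to be in flux:]

```python
a = []
b = []

# Step 1: different __self__ objects -> never equal,
# even though both are list.append.
print(a.append == b.append)   # False

# Step 2: same __self__ and same underlying method -> equal.
print(a.append == a.append)   # True

# Same __self__ but a different method -> not equal.
print(a.append == a.extend)   # False
```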
From J.Demeyer at UGent.be Sat Jun 23 03:27:24 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Sat, 23 Jun 2018 09:27:24 +0200 Subject: [Python-Dev] About [].append == [].append In-Reply-To: <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> Message-ID: <5B2DF65C.8090908@UGent.be> On 2018-06-23 03:50, Steven D'Aprano wrote: > I think it is more important that builtin methods and Python methods > behave the same. +1 This inconsistency is the *real* problem here. It's one little extra complication to merging those classes (which was proposed in PEP 575, 576 and 579). From storchaka at gmail.com Sat Jun 23 04:54:47 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 23 Jun 2018 11:54:47 +0300 Subject: [Python-Dev] About [].append == [].append In-Reply-To: <5B2DF65C.8090908@UGent.be> References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> <5B2DF65C.8090908@UGent.be> Message-ID: 23.06.18 10:27, Jeroen Demeyer wrote: > On 2018-06-23 03:50, Steven D'Aprano wrote: >> I think it is more important that builtin methods and Python methods >> behave the same. > > +1 > > This inconsistency is the *real* problem here. It's one little extra > complication to merging those classes (which was proposed in PEP 575, > 576 and 579). +1 too. But I think the right solution should be opposite: reverting issue1350060 changes and making all methods equality be based on the identity of __self__.
From armin.rigo at gmail.com Sat Jun 23 06:05:25 2018 From: armin.rigo at gmail.com (Armin Rigo) Date: Sat, 23 Jun 2018 12:05:25 +0200 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> <5B2DF65C.8090908@UGent.be> Message-ID: Hi, On 23 June 2018 at 10:54, Serhiy Storchaka wrote: > +1 too. But I think the right solution should be opposite: reverting > issue1350060 changes and making all methods equality be based on the > identity of __self__. The arguments in this thread are the kind of discussion I was looking for when I asked for one in https://bugs.python.org/issue1350060 :-) Better 12 years late than never. Fwiw I also tend to think nowadays that the right solution should be the opposite, like you say. A bientôt, Armin. From vano at mail.mipt.ru Sat Jun 23 06:23:35 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Sat, 23 Jun 2018 13:23:35 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180623024616.GH14437@ando.pearwood.info> References: <20180622164710.4906f9b0@fsol> <20180623024616.GH14437@ando.pearwood.info> Message-ID: <613a5a92-aa0b-38e5-3359-a0bee23aca46@mail.mipt.ru> On 23.06.2018 5:46, Steven D'Aprano wrote: > On Fri, Jun 22, 2018 at 10:59:43AM -0700, Michael Selik wrote: > >>>> I've started testing the proposed syntax when I teach. I don't have a >>>> large >>>> sample yet, but most students either dislike it or don't appreciate the >>>> benefits. They state a clear preference for shorter, simpler lines at the >>>> consequence of more lines of code. > Of course they do -- they're less fluent at reading code. They don't
> > The question we should be asking is, do we only add features to Python > if they are easy for beginners? It's not that I especially want to add > features which *aren't* easy for beginners, but Python isn't Scratch and > "easy for beginners" should only be a peripheral concern. Python's design principles are expressed in the Zen. They rather focus on being no more complex than absolutely necessary, without prioritizing either beginners or old-timers ("simple is better than complex", "complex is better than complicated"). >>> This is partly because students, lacking the experience to instantly >>> recognize larger constructs, prefer a more concrete approach to >>> coding. "Good code" is code where the concrete behaviour is more >>> easily understood. As a programmer gains experience, s/he learns to >>> grok more complex expressions, and is then better able to make use of >>> the more expressive constructs such as list comprehensions. >>> >> I don't think that's the only dynamic going on here. List comprehensions >> are more expressive, but also more declarative and in Python they have nice >> parallels with SQL and speech patterns in natural language. The concept of >> a comprehension is separate from its particular expression in Python. For >> example, Mozilla's array comprehensions in Javascript are/were ugly [0]. > Mozilla's array comprehensions are almost identical to Python's, aside > from a couple of trivial differences: > > evens = [for (i of numbers) if (i % 2 === 0) i]; > > compared to: > > evens = [i for i in numbers if (i % 2 == 0)] > > - the inexplicable (to me) decision to say "for x of array" instead of > "for x in array"; > > - moving the expression to the end, instead of the beginning. > > The second one is (arguably, though not by me) an improvement, since it > preserves a perfect left-to-right execution order within the > comprehension. 
> > >> Students who are completely new to programming can see the similarity of >> list comprehensions to spoken language. > o_O > > I've been using comprehensions for something like a decade, and I can't > :-) > > The closest analogy to comprehensions I know of is set builder notation > in mathematics, which is hardly a surprise. That's where Haskell got the > inspiration from, and their syntax is essentially an ASCIIfied version > of set builder notation: > > Haskell: [(i,j) | i <- [1,2], j <- [1..4]] > > Maths: {(i,j) : i ∈ {1, 2}, j ∈ {1...4}} > > I teach secondary school children maths, and if there's a plain English > natural language equivalent to list builder notation, neither I nor any > of my students, nor any of the text books I've read, have noticed it. > > > > -- Regards, Ivan From p.f.moore at gmail.com Sat Jun 23 06:52:17 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Sat, 23 Jun 2018 11:52:17 +0100 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 22 June 2018 at 23:21, Brett Cannon wrote: > Thanks to a PR from Ammar Askar we now run Python under lcov as part of the > code coverage build. And thanks to codecov.io automatically merging code > coverage reports we get a complete report of our coverage (the first results > of which can now be seen at https://codecov.io/gh/python/cpython). > > And funny enough the coverage average changed less than 1%. :) Nice! One thing I noticed, code that's Windows-specific isn't covered. I assume that's because the coverage reports are based on runs of the test suite on Linux. Is it possible to merge in data from the Windows test runs? If not, what's the best way to address this? Should we be mocking things to attempt to test Windows-specific code even on Linux, or should we simply accept that we're not going to achieve 100% coverage and not worry about it?
Paul From tjreedy at udel.edu Sat Jun 23 11:51:18 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 23 Jun 2018 11:51:18 -0400 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> <5B2DF65C.8090908@UGent.be> Message-ID: On 6/23/2018 4:54 AM, Serhiy Storchaka wrote: > 23.06.18 10:27, Jeroen Demeyer wrote: >> On 2018-06-23 03:50, Steven D'Aprano wrote: >>> I think it is more important that builtin methods and Python methods >>> behave the same. >> >> +1 >> >> This inconsistency is the *real* problem here. It's one little extra >> complication to merging those classes (which was proposed in PEP 575, >> 576 and 579). > > +1 too. But I think the right solution should be opposite: reverting > issue1350060 changes and making all methods equality be based on the > identity of __self__. We 3, and Armin Rigo, it appears, agree that equivalent C and Python coded functions should act the same in comparisons. Abstractly, in English, one might say that a.m equals b.m is true if they perform the same action and have the same effect. The problem, even after adding a stipulation of same type, is that the nature of 'action and effect' is different for pure function methods versus mutation methods. Since pure functions ignore the identity of a and b, so should a.pure_func == b.pure_func. a == b is the right test. For example, a.bit_length == b.bit_length should be true, not false, if a == b. On the other hand, identity is essential for mutation methods, so 'a is b' is the right test, as for .append. But without functions having an 'I mutate' flag, '==' cannot know which comparison of a and b to use, so we have to choose one, and the choice should not depend on the coding language. 'a == b' leads to false positives; 'a is b' leads to false negatives.
I don't know how method comparison is actually used, if ever, but false negatives seem safer to me. So I currently agree with Serhiy's choice. -- Terry Jan Reedy From tjreedy at udel.edu Sat Jun 23 12:31:16 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 23 Jun 2018 12:31:16 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/22/2018 9:40 PM, Nathaniel Smith wrote: > On Fri, Jun 22, 2018 at 6:16 PM, Terry Reedy wrote: >> I discovered the answer to 2. by shift-clicking on a text_x file to see >> their coverage report for the file. The colors actually do reflect the test >> lines executed. codecov.io excludes gui tests*, so the reported coverage >> for tkinter, idlelib, and turtle is deceptive and bogus, and under-reports >> the total cpython coverage by a percent or two. It would be better to >> exclude these modules. >> * I assume that codecov.io uses linux servers. I have read that there are >> programs that simulate X-Windows so that gui code will execute without >> actual terminals. > > Codecov.io doesn't run any tests itself; it's just a service for > aggregation and reporting. The coverage information is being gathered > while running CPython's regular CI tests, and then uploaded to > codecov.io to view. Thank you for the information. > So if you want to run the gui tests -- which seems like a good idea if > possible! -- then the way to do that would be to make them run as part > of the regular Travis/Appveyor/VSTS checks. I have suggested that, and before that, the same for buildbots. The reality is that tkinter, IDLE, or turtle could be disabled on *nix by regressions and the official testing would not notice. -- Terry Jan Reedy From zachary.ware+pydev at gmail.com Sat Jun 23 13:09:01 2018 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Sat, 23 Jun 2018 12:09:01 -0500 Subject: [Python-Dev] We now have C code coverage! 
In-Reply-To: References: Message-ID: On Sat, Jun 23, 2018 at 11:31 AM, Terry Reedy wrote: > I have suggested that, and before that, the same for buildbots. The reality > is that tkinter, IDLE, or turtle could be disabled on *nix by regressions > and the official testing would not notice. I'm looking into enabling the GUI tests on some of the CI hosts today, not sure how far I'll make it with that. GUI tests are currently enabled on my Gentoo [1] and Windows [2] builders, and have been for a couple of years at this point; I'm not sure if any other builders are running GUI tests. [1] http://buildbot.python.org/all/#/workers/6 [2] http://buildbot.python.org/all/#/workers/11 -- Zach From vano at mail.mipt.ru Sat Jun 23 14:18:05 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Sat, 23 Jun 2018 21:18:05 +0300 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: <847118eb-7dd6-958b-a360-32a0615171eb@mail.mipt.ru> On 23.06.2018 13:52, Paul Moore wrote: > On 22 June 2018 at 23:21, Brett Cannon wrote: >> Thanks to a PR from Ammar Askar we now run Python under lcov as part of the >> code coverage build. And thanks to codecov.io automatically merging code >> coverage reports we get a complete report of our coverage (the first results >> of which can now be seen at https://codecov.io/gh/python/cpython). >> >> And funny enough the coverage average changed less than 1%. :) > Nice! > > One thing I noticed, code that's Windows-specific isn't covered. I > assume that's because the coverage reports are based on runs of the > test suite on Linux. Is it possible to merge in data from the Windows > test runs? If not, what's the best way to address this? Should we be > mocking things to attempt to test Windows-specific code even on Linux, > or should we simply accept that we're not going to achieve 100% > coverage and not worry about it? AFAICS lcov is based on gcov which is GCC-specific. 
> Paul > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From tjreedy at udel.edu Sat Jun 23 15:20:33 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 23 Jun 2018 15:20:33 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/23/2018 1:09 PM, Zachary Ware wrote: > On Sat, Jun 23, 2018 at 11:31 AM, Terry Reedy wrote: >> I have suggested that, and before that, the same for buildbots. The reality >> is that tkinter, IDLE, or turtle could be disabled on *nix by regressions >> and the official testing would not notice. > > I'm looking into enabling the GUI tests on some of the CI hosts today, > not sure how far I'll make it with that. GUI tests are currently > enabled on my Gentoo [1] and Windows [2] builders, and have been for a > couple of years at this point; I'm not sure if any other builders are > running GUI tests. I noticed your Gentoo builders running gui tests some years ago. When I re-checked perhaps a year ago, they either were not running or seem to have skipped test_idle, maybe the former. 
> [1] http://buildbot.python.org/all/#/workers/6 > [2] http://buildbot.python.org/all/#/workers/11 Rechecking now, on Gentoo test_idle appears and passed on these 3.6 and 3.7 pages http://buildbot.python.org/all/#/builders/82/builds/414/steps/5/logs/stdio Neither Firefox nor Edge can find 'test_idle' on these 3.x pages http://buildbot.python.org/all/#/builders/103/builds/1149/steps/5/logs/stdio http://buildbot.python.org/all/#/builders/99/builds/1130/steps/4/logs/stdio test_tk appears on 1130 but not 1149 For your 8.1 bot: test_idle passed for 3.7 http://buildbot.python.org/all/#/builders/133/builds/339/steps/3/logs/stdio but is not found on this 3.x page (neither is 'test_tk') http://buildbot.python.org/all/#/builders/12/builds/991/steps/3/logs/stdio Both Appveyor and my machine run test_idle when running the full 3.x test suite. -- Terry Jan Reedy From gvanrossum at gmail.com Sat Jun 23 17:08:16 2018 From: gvanrossum at gmail.com (Guido van Rossum) Date: Sat, 23 Jun 2018 14:08:16 -0700 Subject: [Python-Dev] About [].append == [].append In-Reply-To: References: <5B2B8B1F.5070605@UGent.be> <790dea68-5a36-c21f-63e1-2d9bdcf2f333@mail.mipt.ru> <20180622164128.GD14437@ando.pearwood.info> <4346b866f6d84a538336412f6018f86a@xmail101.UGent.be> <5B2DF65C.8090908@UGent.be> Message-ID: Let's make it so. On Sat, Jun 23, 2018, 08:53 Terry Reedy wrote: > On 6/23/2018 4:54 AM, Serhiy Storchaka wrote: > > 23.06.18 10:27, Jeroen Demeyer wrote: > >> On 2018-06-23 03:50, Steven D'Aprano wrote: > >>> I think it is more important that builtin methods and Python methods > >>> behave the same. > >> > >> +1 > >> > >> This inconsistency is the *real* problem here. It's one little extra > >> complication to merging those classes (which was proposed in PEP 575, > >> 576 and 579). > > > > +1 too. But I think the right solution should be opposite: reverting > > issue1350060 changes and making all methods equality be based on the > > identity of __self__.
> > We 3, and Armin Rigo, it appears, agree that equivalent C and Python > coded functions should act the same in comparisons. Abstractly, in > English, one might say that a.m equals b.m is true if they perform the > same action and have the same effect. > > The problem, even after adding a stipulation of same type, is that the > nature of 'action and effect' are different for pure function methods > versus mutation methods. Since pure functions ignore the identify of a > and b, so should a.pure_func == b.pure_func. a == b is the right test. > For example, a.bit_length == b.bit_length should be true, not false, if > a == b. On the other hand, identify is essential for mutation methods, > so 'a is b' is the right test, as for .append. > > But without functions having an 'I mutate' flag, '==' cannot know which > comparison of a and b to use, so we have to choose one, and the choice > should not depend on the coding language. 'a == b' leads to false > positives; 'a is b' leads to false negatives. I don't know how method > comparison is actually used, if ever, but false negatives seem safer to > me. So I currently agree with Serhiy's choice. > > -- > Terry Jan Reedy > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From zachary.ware+pydev at gmail.com Sat Jun 23 17:48:04 2018 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Sat, 23 Jun 2018 16:48:04 -0500 Subject: [Python-Dev] We now have C code coverage! 
In-Reply-To: References: Message-ID: On Sat, Jun 23, 2018 at 2:20 PM, Terry Reedy wrote: > Rechecking now, on Gentoo > > test_idle appears and passed on these 3.6 and 3.7 pages > http://buildbot.python.org/all/#/builders/82/builds/414/steps/5/logs/stdio > > Neither Firefox nor Edge can find 'test_idle' on these 3.x pages > http://buildbot.python.org/all/#/builders/103/builds/1149/steps/5/logs/stdio > http://buildbot.python.org/all/#/builders/99/builds/1130/steps/4/logs/stdio > > test_tk appears on 1130 but not 1149 > > For your 8.1 bot: test_idle passed for 3.7 > http://buildbot.python.org/all/#/builders/133/builds/339/steps/3/logs/stdio > > but does is not found on this 3.x page (neither is 'test_tk') > http://buildbot.python.org/all/#/builders/12/builds/991/steps/3/logs/stdio Click the magnifying glass icon ("load all data for use with the browser search tool") at the upper right of the console pane and try again; each of the above is present and passed. This does unfortunately seem to be another case of non-intuitive UI from buildbot. > Both Appveyor and my machine run test_idle when running the full 3.x test > suite. I am pleasantly surprised that AppVeyor does appear to be running the GUI tests :) -- Zach From rokkerruslan at yandex.com Sat Jun 23 17:55:30 2018 From: rokkerruslan at yandex.com (Rokker Ruslan) Date: Sun, 24 Jun 2018 00:55:30 +0300 Subject: [Python-Dev] Python3.7 backwards incompatible Message-ID: <2207231529790930@web53o.yandex.ru> An HTML attachment was scrubbed... URL: From guido at python.org Sat Jun 23 18:10:44 2018 From: guido at python.org (Guido van Rossum) Date: Sat, 23 Jun 2018 15:10:44 -0700 Subject: [Python-Dev] Python3.7 backwards incompatible In-Reply-To: <2207231529790930@web53o.yandex.ru> References: <2207231529790930@web53o.yandex.ru> Message-ID: First, the typing module is still provisional, so there is no strict backwards compatibility guarantee. 
With that out of the way, I can reproduce your problem, and I assume it's caused by the implementation of PEP 560, which is meant to speed up the typing module (among other goals). I'm wondering about your claim that this breaks many "libraries that use annotations". Most code using annotations would not need to use issubclass() in this way. Where exactly did you encounter this? I'm CC'ing Ivan Levkivskyi, who knows the relevant code better and will be able to explain whether this is an oversight or an intentional regression. On Sat, Jun 23, 2018 at 2:59 PM Rokker Ruslan wrote: > Python 3.7 in the status of RC, but I do not see information about the > fact that python3.7 is backwards incompatible with python3.5. > > $ ~ python3.5 > Python 3.5.2 (default, Nov 23 2017, 16:37:01) > [GCC 5.4.0 20160609] on linux > Type "help", "copyright", "credits" or "license" for more information. > >>> from typing import List > >>> class MyList(list): > ... pass > ... > >>> issubclass(MyList, List) > True > >>> issubclass(List, MyList) > False > >>> > > $ ~ python3.7 > Python 3.7.0rc1+ (heads/3.7:3747dd16d5, Jun 22 2018, 22:53:42) > [GCC 5.4.0 20160609] on linux > Type "help", "copyright", "credits" or "license" for more information. > >>> from typing import List > >>> class MyList(list): > ... pass > ... > >>> issubclass(MyList, List) > True > >>> issubclass(List, MyList) > Traceback (most recent call last): > File "", line 1, in > TypeError: issubclass() arg 1 must be a class > >>> > > And so with all types of "typing" module. > This breaks down many libraries that use annotations as part of the > functionality if we now can't use typing.* into issubclass function. 
> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Sat Jun 23 20:25:37 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 23 Jun 2018 20:25:37 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/23/2018 5:48 PM, Zachary Ware wrote: > On Sat, Jun 23, 2018 at 2:20 PM, Terry Reedy wrote: >> Rechecking now, on Gentoo >> >> test_idle appears and passed on these 3.6 and 3.7 pages >> http://buildbot.python.org/all/#/builders/82/builds/414/steps/5/logs/stdio >> >> Neither Firefox nor Edge can find 'test_idle' on these 3.x pages >> http://buildbot.python.org/all/#/builders/103/builds/1149/steps/5/logs/stdio >> http://buildbot.python.org/all/#/builders/99/builds/1130/steps/4/logs/stdio >> >> test_tk appears on 1130 but not 1149 >> >> For your 8.1 bot: test_idle passed for 3.7 >> http://buildbot.python.org/all/#/builders/133/builds/339/steps/3/logs/stdio >> >> but does is not found on this 3.x page (neither is 'test_tk') >> http://buildbot.python.org/all/#/builders/12/builds/991/steps/3/logs/stdio > > Click the magnifying glass icon ("load all data for use with the > browser search tool") at the upper right of the console pane and try > again; each of the above is present and passed. This does > unfortunately seem to be another case of non-intuitive UI from > buildbot. Presenting data on the screen that cannot be found is terrible. With Edge, I had to erase and retype the search word also. With Firefox, that sometimes worked without pressing the magnifier. I thought my copy of Firefox was broken until I tried Edge also. 
>> Both Appveyor and my machine run test_idle when running the full 3.x test >> suite. > > I am pleasantly surprised that AppVeyor does appear to be running the > GUI tests :) > -- Terry Jan Reedy From levkivskyi at gmail.com Sat Jun 23 21:14:55 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Sun, 24 Jun 2018 02:14:55 +0100 Subject: [Python-Dev] Python3.7 backwards incompatible In-Reply-To: References: <2207231529790930@web53o.yandex.ru> Message-ID: This particular breakage is explicitly listed in PEP 560, see an example with List and List[int] in https://www.python.org/dev/peps/pep-0560/#backwards-compatibility-and-impact-on-users-who-don-t-use-typing In general, isinstance() with typing types should be avoided when possible (Mark Shannon who is the BDFL delegate for PEP 484 wanted to prohibit it completely, but in the end we kept only the bare minimum, like your first example). When designing/implementing PEP 560 I realised it will be impossible to keep 100% backwards compatibility. I tried to preserve 99% of public APIs, but since isinstance() is already discouraged, it fell into remaining 1%. A possible workaround is to use `typing_inspect` library on PyPI (disclaimer: I originally wrote it). You can use `get_origin()` function to extract the runtime class that corresponds to a given type. It works with both 3.6 and 3.7 and tries its best to return the relevant runtime class (with few exceptions, see docstring), for every version, for example on Python 3.7 `get_origin(List[int])` return `list`. The return of get_origin() should be usable in all runtime context including `isinstance()`. Btw, I actually like the new behaviour. After PEP 560 types are no more classes, which emphasises that they should be used in static context, if one wants to do something in runtime, one needs to use an explicit conversion to a runtime class. 
-- Ivan On 23 June 2018 at 23:10, Guido van Rossum wrote: > First, the typing module is still provisional, so there is no strict > backwards compatibility guarantee. > > With that out of the way, I can reproduce your problem, and I assume it's > caused by the implementation of PEP 560, which is meant to speed up the > typing module (among other goals). > > I'm wondering about your claim that this breaks many "libraries that use > annotations". Most code using annotations would not need to use > issubclass() in this way. Where exactly did you encounter this? > > I'm CC'ing Ivan Levkivskyi, who knows the relevant code better and will be > able to explain whether this is an oversight or an intentional regression. > > On Sat, Jun 23, 2018 at 2:59 PM Rokker Ruslan > wrote: > >> Python 3.7 in the status of RC, but I do not see information about the >> fact that python3.7 is backwards incompatible with python3.5. >> >> $ ~ python3.5 >> Python 3.5.2 (default, Nov 23 2017, 16:37:01) >> [GCC 5.4.0 20160609] on linux >> Type "help", "copyright", "credits" or "license" for more information. >> >>> from typing import List >> >>> class MyList(list): >> ... pass >> ... >> >>> issubclass(MyList, List) >> True >> >>> issubclass(List, MyList) >> False >> >>> >> >> $ ~ python3.7 >> Python 3.7.0rc1+ (heads/3.7:3747dd16d5, Jun 22 2018, 22:53:42) >> [GCC 5.4.0 20160609] on linux >> Type "help", "copyright", "credits" or "license" for more information. >> >>> from typing import List >> >>> class MyList(list): >> ... pass >> ... >> >>> issubclass(MyList, List) >> True >> >>> issubclass(List, MyList) >> Traceback (most recent call last): >> File "", line 1, in >> TypeError: issubclass() arg 1 must be a class >> >>> >> >> And so with all types of "typing" module. >> This breaks down many libraries that use annotations as part of the >> functionality if we now can't use typing.* into issubclass function. 
>> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/ >> guido%40python.org >> > > > -- > --Guido van Rossum (python.org/~guido) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sun Jun 24 00:33:59 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jun 2018 14:33:59 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B2D80E7.5010501@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> Message-ID: On 23 June 2018 at 09:06, Greg Ewing wrote: > Nick Coghlan wrote: >> >> x:= f():" implies "x" is already defined as a target somewhere else in >> the current scope, while "if x := f() given x:" potentially introduces >> "x" as a new local target > > > Noooo..... this is just taking a bad idea and making it > worse, IMO. > > I'm -1 on any contortions designed to allow comprehensions > to assign to things in outer scopes. All the proposed use > cases I've seen for this have not improved readability > over writing a function that does things the usual way. > > Can we please leave comprehensions as declarative > constructs? The best tools do just one thing and do > it well. These proposals seem to be trying to turn > comprehensions into swiss army knives. If PEP 572 was proposing the use of regular local scoping for assignment expressions in comprehensions, such that they could still be used to avoiding repeating subexpressions within an iteration, but couldn't be used to export progress data, or to maintain a partial sum without having to rely on `locals().get("total", 0)` to provide an initial value, then I wouldn't be feeling obliged to present an alternative that offers the same state export capabilities in a more explicit form. 
Given that PEP 572 *is* proposing implicit comprehension state export, though, then I think it's important to make the case that seeing the proposed semantics as intuitive is only going to be the case for folks that have used Python 2 style comprehensions extensively - anyone that's never encountered the old state-leaking behaviour for iteration variables is going to be surprised when assignment expressions ignore the existence of the comprehension scope (even though the iteration variable pays attention to it). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From python-dev at mgmiller.net Sun Jun 24 00:57:06 2018 From: python-dev at mgmiller.net (Mike Miller) Date: Sat, 23 Jun 2018 21:57:06 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180623024616.GH14437@ando.pearwood.info> References: <20180622164710.4906f9b0@fsol> <20180623024616.GH14437@ando.pearwood.info> Message-ID: On 2018-06-22 19:46, Steven D'Aprano wrote: > - the inexplicable (to me) decision to say "for x of array" instead of > "for x in array"; Believe JavaScript has for...in, but as usual in the language it is broken and they needed a few more tries to get it right. for...of is the latest version and works as expected. -Mike From ncoghlan at gmail.com Sun Jun 24 01:27:07 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jun 2018 15:27:07 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180623034847.GI14437@ando.pearwood.info> References: <20180623034847.GI14437@ando.pearwood.info> Message-ID:
>> plenty of >> functional-language-inspired documentation to instead encourage folks >> to view comprehensions as tightly encapsulated declarative container >> construction syntax. > > I can't say I've done a broad survey, but the third-party documentation > I've read on comprehensions typically glosses over the scoping issues > without mentioning them. To the extent that scoping is even hinted at, > comprehensions are treated as expressions which are exactly equivalent > to re-writing them as a for-loop in the current scope. > > This is a typical example, found as the top result on googling for > "python comprehensions": > > https://www.google.com/search?q=python+comprehensions > > http://www.pythonforbeginners.com/basics/list-comprehensions-in-python > > Nothing is mentioned about scope, and it repeats the inaccurate but > simple equivalency: > > for item in list: > if conditional: > expression > > But perhaps that tutorial is too old. Okay this recent one is only a > little more than a year old: > > https://hackernoon.com/list-comprehension-in-python-8895a785550b > > Again, no mention of scoping issues, comprehensions are simply > expressions which presumably run in the same scope as any other > expression. > > I think you over-estimate how many newcomers to Python are even aware > that the scope of comprehensions is something to consider. I put quite a bit of work into making it possible for folks to gloss over the distinction and still come to mostly-correct conclusions about how particular code snippets would behave. 
I was only able to achieve it because the folks that designed lexical scoping before me had made *read* access to lexical scopes almost entirely transparent, and because generator expressions were designed to fail fast if there was a bug in the expression defining the outermost iterable (which meant that even at class scope, the outermost iterable expression still had access to class level variables, because it was evaluated *outside* the nested scope). *Write* access to lexically nested scopes, by contrast, was omitted entirely from the original lexical scoping design, and when it was later added by https://www.python.org/dev/peps/pep-3104/, it was done using an explicit "nonlocal" declaration statement (expressly akin to "global"), and PEP 3099 explicitly ruled out the use of ":=" to implicitly declare the target name as being non-local. PEP 572 is thus taking the position that: - we now want to make write access to outer scopes implicit (despite PEP 3099 explicitly ruling that out as desired design feature) - but only in comprehensions and generator expressions (not lambda expressions, and not full nested functions) - and only for assignment expressions, not for loop iteration variables - and we want it to implicitly choose between a "global NAME" declaration and a "nonlocal NAME" declaration based on where the comprehension is defined - and this is OK because "nobody" actually understands how comprehensions hide the iteration variable in practice, and "everybody" thinks they're still a simple for loop like they were in Python 2 - the fact that the language reference, the behaviour at class scopes, the disassembly output, and the behaviour in a debugger all indicate that comprehensions are full nested scopes isn't important This level of additional complication and complexity in the scoping semantics simply isn't warranted for such a minor readability enhancement as assignment expressions. Cheers, Nick. P.S. 
"You did such a good job of minimising the backwards compatibility breakage when we changed the semantics of scoping in comprehensions that we now consider your opinion on reasonable scoping semantics for comprehensions to be irrelevant, because everybody else still thinks they work the same way as they did in Python 2" is such a surreal position for folks to be taking that I'm having trouble working out how to effectively respond to it. Guido has complained that "I keep changing my mind about what I want", but that's not what's actually going on: what I want is to keep folks from taking our already complicated scoping semantics and making it close to impossible for anyone to ever infer how they work from experimentation at the interactive prompt. That goal has pretty much stayed consistent since the parent local scoping proposal was first put forward. What keeps changing is my tactics in pursuing that goal, based on my current perception of what the folks pushing that proposal are actually trying to achieve (which seems to be some combination of "We want to pretend that the Python 3 scoping changes in comprehensions never happened, but we still want to avoid leaking the iteration variables somehow" and "We want to enable new clever tricks with state export from comprehensions and generator expressions"), as well as repeatedly asking myself what *actually* bothers me about the proposal (which I've now distilled down to just the comprehension scoping issue, and the reliance on an arbitrary syntactic restriction against top level usage to avoid competing with traditional assignment statements). 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From steve at pearwood.info Sun Jun 24 01:56:47 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sun, 24 Jun 2018 15:56:47 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> Message-ID: <20180624055646.GL14437@ando.pearwood.info> On Sun, Jun 24, 2018 at 02:33:59PM +1000, Nick Coghlan wrote: > Given that PEP 572 *is* proposing implicit comprehension state export, "Implicit" and "explicit" are two terms which often get misused to mean "I don't like it" and "I do like it". Making the intentional choice to use an assignment expression is not really "implicit" in any meaningful sense. One might as well complain that "import this" implicitly creates a local variable "this". Well, yes, it does, in a very loose sense, but that's what imports are defined to do, and it is the whole purpose of making them. If PEP 572's proposal goes ahead, the behaviour of assignment expressions will be *defined* as creating assignments in the local scope rather than the sublocal comprehension scope. To call that "implicit" is rather like saying that regular assignment is implicit. > though, then I think it's important to make the case that seeing the > proposed semantics as intuitive is only going to be the case for folks > that have used Python 2 style comprehensions extensively - anyone > that's never encountered the old state-leaking behaviour for iteration > variables is going to be surprised when assignment expressions ignore > the existence of the comprehension scope (even though the iteration > variable pays attention to it). You are making the assumption that most people are even aware of "comprehension scope". I don't think that is the case. In my experience, scoping in Python is still typically seen as the LGB rule (locals/globals/builtins).
See for example this StackOverflow post from 2016: https://stackoverflow.com/questions/37211910/override-lgb-scope-rule Sometimes people remember the E/N (enclosing function/nonlocal) part. Hardly anyone remembers the C (class) part unless they are actively thinking in terms of code running inside a class definition, and even if they do, they typically aren't sure of exactly how it interacts with the rest. And I predict that even fewer think of comprehensions as a separate scope, except by omission: they *don't think about* the scope of the loop variable until it bites them. But as Tim Peters has previously discussed, the loop variable is special, and is especially prone to accidental shadowing. That won't be the case for assignment expressions. If there's shadowing going on, it will be deliberate. Aside: I've said before that I'm not a fan of sublocal comprehension scoping, since I personally found it helpful on occasion for the loop variable to be visible outside of the comprehension. But given that the only experience most people apparently had with comprehension scoping was to be bitten by it, I grudgingly accept that encapsulating the loop variable was the right decision to make, even if it personally inconvenienced me more than it saved me. Nor was I the only one: others have been bitten by the change to comprehension scope, see for example: https://www.reddit.com/r/Python/comments/425qmb/strange_python_27_34_difference/ There is no consensus that the change to comprehensions was a good thing or justified. The bottom line is that I don't think people will be surprised by assignment expression scope being local instead of sublocal. Rather I expect that they won't even think about it, until they do, and then *whatever* behaviour we pick, we'll annoy somebody.
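[Editorial note: the Python 2 vs Python 3 difference being referred to can be sketched as follows; the Python 3 behaviour is shown, with the Python 2 result noted in comments.]

```python
x = "outer"
squares = [x * x for x in range(3)]

# Python 3: the comprehension's iteration variable lives in its own
# implicit scope, so the outer x survives untouched.
# Under Python 2, the list comprehension's loop variable leaked, and
# x would now be 2.
print(squares)  # [0, 1, 4]
print(x)        # outer

# Generator expressions never leaked their loop variable, even in
# Python 2:
total = sum(x for x in range(3))
print(x)        # still "outer"
```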
-- Steve From ncoghlan at gmail.com Sun Jun 24 02:33:38 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jun 2018 16:33:38 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624055646.GL14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: On 24 June 2018 at 15:56, Steven D'Aprano wrote: > On Sun, Jun 24, 2018 at 02:33:59PM +1000, Nick Coghlan wrote: > >> Given that PEP 572 *is* proposing implicit comprehension state export, > > "Implicit" and "explicit" are two terms which often get misused to mean > "I don't like it" and "I do like it". > > Making the intentional choice to use an assignment expression is not > really "implicit" in any meaningful sense. No, it's actually implicit: there's an extra "global NAME" or "nonlocal NAME" in the equivalent code for a comprehension that isn't there in the as-written source code, and doesn't get emitted for a regular assignment expression or for the iteration variable in a comprehension - it only shows up due to the defined interaction between comprehensions and assignment expressions. > One might as well complain > that "import this" implicitly creates a local variable "this". Well, > yes, it does, in a very loose sense, but that's what imports are > defined as do and it is the whole purpose for making them. And they behave the same way in every context where they're permitted to appear. > If PEP 572's proposal goes ahead, the behaviour of assignment > expressions will be *defined* as creating assignments in the local scope > rather than the sublocal comprehension scope. To call that "implicit" > is rather like saying that regular assignment is implicit. I do say that regular assignments implicitly declare a name as local. 
"Python has implicit local variable declarations" is also regularly cited as one of the differences between it and languages that require explicit declarations, like C. Even augmented assignments implicitly declare a name as being a local (hence the infamous UnboundLocalError that arises when you attempt to use an augmented assignment to rebind a name from an outer scope). The problem I have with PEP 572 is that it proposes *breaking that otherwise universal pattern* - instead of having assignment expressions in comprehensions implicitly declare the name as local in the nested comprehension scope, it instead has them: 1. implicitly declare the name as global or as nonlocal in the comprehension (or else raise an exception), depending on the nature of the parent scope where the comprehension is used 2. in the nonlocal reference case, amend the symbol table analysis to act like there was a preceding "if 0:\n for NAME in ():\n pass" in the parent scope (so the compiler knows which outer function scope to target) The rationale being given for why that is OK is: 1. "Everyone" thinks comprehensions are just a for loop (even though that hasn't been true for the better part of a decade, and was never true for generator expressions) 2. If comprehensions are just a for loop, then assignment expressions inside them should be local to the containing scope 3. Therefore the implicit declarations required to tie everything together and allow folks to continue with an incorrect understanding of how comprehensions work aren't really implicit - they're explicit in the inaccurate expansion of the construct! Can you imagine the reaction if anyone other than Guido or Tim was attempting to argue for a change to the language that only makes sense if we grant a completely inaccurate understanding of how a particular language construct works as being a credible starting point? 
Because that's how this comprehension scoping argument feels to me: Proposal author: "If the language worked in a way other than it does, then this proposal would make perfect sense." Proposal reviewer: "Yes, but it doesn't work that way, it works this way. We deliberately changed it because the old way caused problems." Proposal author: "Ah, but it *used* to work that way, and a lot of people still think it works that way, and we can get the compiler to jump through hoops to pretend it still works that way, except for the parts of the new way that we want to keep." Proposal reviewer: "..." Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From rosuav at gmail.com Sun Jun 24 02:53:50 2018 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 24 Jun 2018 16:53:50 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 4:33 PM, Nick Coghlan wrote: > On 24 June 2018 at 15:56, Steven D'Aprano wrote: >> On Sun, Jun 24, 2018 at 02:33:59PM +1000, Nick Coghlan wrote: >> >>> Given that PEP 572 *is* proposing implicit comprehension state export, >> >> "Implicit" and "explicit" are two terms which often get misused to mean >> "I don't like it" and "I do like it". >> >> Making the intentional choice to use an assignment expression is not >> really "implicit" in any meaningful sense. > > No, it's actually implicit: there's an extra "global NAME" or > "nonlocal NAME" in the equivalent code for a comprehension that isn't > there in the as-written source code, and doesn't get emitted for a > regular assignment expression or for the iteration variable in a > comprehension - it only shows up due to the defined interaction > between comprehensions and assignment expressions. 
The implicit "nonlocal NAME" is only because there is an equally implicit function boundary. Why is there a function boundary marked by square brackets? It's not saying "def" or "lambda", which obviously create functions. It's a 'for' loop wrapped inside a list display. What part of that says "hey, I'm a nested function"? So if there's an implicit function, with implicit declaration of a magical parameter called ".0", why can't it have an equally implicit declaration that "spam" is a nonlocal name? ChrisA From ncoghlan at gmail.com Sun Jun 24 03:10:28 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jun 2018 17:10:28 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: On 24 June 2018 at 16:53, Chris Angelico wrote: > On Sun, Jun 24, 2018 at 4:33 PM, Nick Coghlan wrote: >> On 24 June 2018 at 15:56, Steven D'Aprano wrote: >>> On Sun, Jun 24, 2018 at 02:33:59PM +1000, Nick Coghlan wrote: >>> >>>> Given that PEP 572 *is* proposing implicit comprehension state export, >>> >>> "Implicit" and "explicit" are two terms which often get misused to mean >>> "I don't like it" and "I do like it". >>> >>> Making the intentional choice to use an assignment expression is not >>> really "implicit" in any meaningful sense. >> >> No, it's actually implicit: there's an extra "global NAME" or >> "nonlocal NAME" in the equivalent code for a comprehension that isn't >> there in the as-written source code, and doesn't get emitted for a >> regular assignment expression or for the iteration variable in a >> comprehension - it only shows up due to the defined interaction >> between comprehensions and assignment expressions. > > The implicit "nonlocal NAME" is only because there is an equally > implicit function boundary. Why is there a function boundary marked by > square brackets? 
It's not saying "def" or "lambda", which obviously > create functions. It's a 'for' loop wrapped inside a list display. > What part of that says "hey, I'm a nested function"? Nothing - that's why I refer to them as implicitly nested scopes (vs the explicitly nested scopes in functions and lambda expressions, where the scope is introduced via keyword). However, there's still a major behavioural tell at runtime that they're running in a nested scope: the iteration variables don't leak. (There are other tells as well, but not ones that most folks are likely to encounter) > So if there's an implicit function, with implicit declaration of a > magical parameter called ".0", why can't it have an equally implicit > declaration that "spam" is a nonlocal name? Because comprehensions don't do that for their iteration variables, because assignment expressions don't do that when used in explicitly nested scopes, because the required implicit scope declarations are context dependent, and because even such gyrations still can't hide the existence of the comprehension's implicitly nested scope when dealing with classes and the two-argument form of exec(). Since the implicitly nested scopes can't be hidden, it makes far more sense to me to just admit that they're there, and provide explicit syntax for cases where folks decide they really do want name bindings to leak out of that scope (whether those name bindings are assignment expression targets or the iteration variables themselves). Cheers, Nick. 
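[Editorial note: the runtime "tells" Nick mentions can be observed directly. This is a CPython-specific sketch; the ".0" name is an implementation detail of how the synthesized scope receives its iterator argument, not a language guarantee.]

```python
# The iteration variable of a comprehension does not leak into the
# surrounding scope:
x = "outer"
doubled = [2 * x for x in range(3)]
print(x)  # outer

# The synthesized scope is visible on a generator expression's code
# object: its sole parameter is the implicit ".0" iterator argument.
g = (n + 1 for n in range(3))
print(g.gi_code.co_varnames)  # e.g. ('.0', 'n') in CPython
```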
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From steve at pearwood.info Sun Jun 24 07:47:40 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sun, 24 Jun 2018 21:47:40 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624055646.GL14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: <20180624114740.GM14437@ando.pearwood.info> On Sun, Jun 24, 2018 at 03:56:47PM +1000, Steven D'Aprano wrote: > There is no consensus that the change to comprehensions was a good thing > or justified. On re-reading that, I think it's wrong -- it wasn't really what I intended to say. What I intended to say was, in hindsight, more like: *Despite the consensus to change comprehension scope*, there's a contingent of people who are not convinced that the change was a good thing or justified. Sorry for the inaccurate comment. Mea culpa. -- Steve From vano at mail.mipt.ru Sun Jun 24 10:24:12 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Sun, 24 Jun 2018 17:24:12 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: On 24.06.2018 9:53, Chris Angelico wrote: > On Sun, Jun 24, 2018 at 4:33 PM, Nick Coghlan wrote: >> On 24 June 2018 at 15:56, Steven D'Aprano wrote: >>> On Sun, Jun 24, 2018 at 02:33:59PM +1000, Nick Coghlan wrote: >>> >>>> Given that PEP 572 *is* proposing implicit comprehension state export, >>> "Implicit" and "explicit" are two terms which often get misused to mean >>> "I don't like it" and "I do like it". >>> >>> Making the intentional choice to use an assignment expression is not >>> really "implicit" in any meaningful sense. My 2c. An expression is intuitively thought to be self-contained, i.e.
without side effects. If I write `a = b + 1`, I'm not expecting it to do anything except assign `a`. Expressions with side effects have long since proven to be problematic because of the implicit (thus hard to see and track) links they create (and because the result depends on the order of evaluation). Moreover, Python's other design elements have consistently discouraged expressions with side effects, too (e.g. mutator methods intentionally return None instead of the new value, making them useless in expressions), so the proposition is in direct conflict with the language's design. Assignment expressions are a grey area: they carry the full implications of expressions with side effects described above, but their side effect is their only effect, i.e. they are explicit and prominent about the "evil" they do. >> No, it's actually implicit: there's an extra "global NAME" or >> "nonlocal NAME" in the equivalent code for a comprehension that isn't >> there in the as-written source code, and doesn't get emitted for a >> regular assignment expression or for the iteration variable in a >> comprehension - it only shows up due to the defined interaction >> between comprehensions and assignment expressions. > The implicit "nonlocal NAME" is only because there is an equally > implicit function boundary. Why is there a function boundary marked by > square brackets?
> > ChrisA > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From steve at pearwood.info Sun Jun 24 10:52:04 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 25 Jun 2018 00:52:04 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: <20180624145204.GN14437@ando.pearwood.info> On Sun, Jun 24, 2018 at 05:24:12PM +0300, Ivan Pozdeev via Python-Dev wrote: > An expression is intuitively thought to be self-contained i.e. without > side effects. > if I write `a=b+1`, I'm not expecting it to do anything except assigning > `a'. a = d.pop(1) a = d.setdefault(key, 0) chars_written = file.write(text) > Expressions with side effects has long since proven to be problematic > because of the implicit (thus hard to see and track) links they create > (and because the result depends on the order of evaluation). If you're going to take a hard-core functional approach to side-effects, I think you are using the wrong language. Nearly everything in Python *could* have side-effects (even if usually it won't). Even your own example of "b+1" (depending on what b.__add__ does). > Moreover, Python's other design elements have been consistently > discouraging expressions with side effects, too (e.g. mutator methods > intentionally return None instead of the new value, making them useless > in expressions), I don't think that's the reason why mutator methods return None. They return None rather than self to avoid confusion over whether they return a copy or not. 
https://docs.python.org/3/faq/design.html#why-doesn-t-list-sort-return-the-sorted-list > so the proposition is in direct conflict with the > language's design. Python is full of operations with side-effects. Besides, they're not quite useless: (alist.append(x) or alist) is functionally equivalent to alist.append returning self. Just a bit more verbose. Methods (and functions) all return a value, even if that value is None, so they can be used in expressions. If Guido wanted Pascal style procedures, which cannot be used in expressions, we would have them by now :-) -- Steve From ammar at ammaraskar.com Sun Jun 24 05:03:49 2018 From: ammar at ammaraskar.com (Ammar Askar) Date: Sun, 24 Jun 2018 02:03:49 -0700 Subject: [Python-Dev] We now have C code coverage! Message-ID: > Is it possible, given that we are not paying for those reports, to > customize the 'exclude_lines' definitions? Do you want to exclude python code or C code? For C code you can mark sections that exclude coverage in lcov with comments like "LCOV_EXCL_START" http://ltp.sourceforge.net/coverage/lcov/geninfo.1.php For Python code, coverage.py also has some comments you can put down to exclude lines: http://coverage.readthedocs.io/en/coverage-4.2/excluding.html If you don't need line level granularity, you can always add files to ignore in our codecov.yml file: https://docs.codecov.io/docs/ignoring-paths From guido at python.org Sun Jun 24 12:24:39 2018 From: guido at python.org (Guido van Rossum) Date: Sun, 24 Jun 2018 09:24:39 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624145204.GN14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: A quick follow-up: PEP 572 currently has two ideas: (a) introduce := for inline assignment, (b) when := is used in a comprehension, set the scope for the
target as if the assignment occurred outside any comprehensions. It seems we have more support for (a) than for (b) -- at least Nick and Greg seem to be +0 or better for (a) but -1 for (b). IIRC (b) originated with Tim. But his essay on the topic, included as Appendix A ( https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings) does not even mention comprehensions. However, he did post his motivation for (b) on python-ideas, IIRC a bit before PyCon; and the main text of the PEP gives a strong motivation ( https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). Nevertheless, maybe we should compromise and drop (b)? -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Sun Jun 24 14:06:38 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 25 Jun 2018 04:06:38 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> Message-ID: <20180624180636.GO14437@ando.pearwood.info> On Sun, Jun 24, 2018 at 04:33:38PM +1000, Nick Coghlan wrote: [...] > > Making the intentional choice to use an assignment expression is not > > really "implicit" in any meaningful sense. > > No, it's actually implicit: there's an extra "global NAME" or > "nonlocal NAME" in the equivalent code for a comprehension that isn't > there in the as-written source code, and doesn't get emitted for a > regular assignment expression or for the iteration variable in a > comprehension - it only shows up due to the defined interaction > between comprehensions and assignment expressions. You seem to be talking about an implementation which could change in the future. I'm talking semantics of the proposed language feature. As a programmer writing Python code, I have no visibility into the implementation. 
The implementation could change ten times a day for all I care, so long as the semantics remain the same. I want the desired semantics to drive the implementation, not the other way around. You seem to want the implementation to drive the semantics, by eliminating the proposed feature because it doesn't match your deep understanding of the implementation as a nested function. I want this feature because its useful, and without it the use-cases for assignment expressions are significantly reduced. As far as "implicit", for the sake of the discussion, I'll grant you that one. Okay, the proposed behaviour will implicitly enable comprehensions to export their state. Now what? Is that a good thing or a bad thing? If "implicit" (with or without the scare quotes) is such a bad thing to be avoided, why are comprehensions implemented using an implicit function? > The problem I have with PEP 572 is that it proposes *breaking that > otherwise universal pattern* - instead of having assignment > expressions in comprehensions implicitly declare the name as local in > the nested comprehension scope, it instead has them: You talk about "nested comprehension scope", and that's a critical point, but I'm going to skip answering that for now. I have a draft email responding to another of your posts on that topic, which I hope to polish in the next day. > 1. implicitly declare the name as global or as nonlocal in the > comprehension (or else raise an exception), depending on the nature of > the parent scope where the comprehension is used > 2. 
in the nonlocal reference case, amend the symbol table analysis to > act like there was a preceding "if 0:\n for NAME in ():\n pass" in the > parent scope (so the compiler knows which outer function scope to > target) If it is okay for you to amend the list comprehension to behave as if it were wrapped in an implicit nested function, why shouldn't it be okay to behave as if assignments inside the comprehension included an implicit nonlocal declaration? > The rationale being given for why that is OK is: > > 1. "Everyone" thinks comprehensions are just a for loop (even though > that hasn't been true for the better part of a decade, and was never > true for generator expressions) Obviously "everyone" is an exaggeration, but, yes, I stand by that -- most people don't even give comprehension scope a thought until they get bitten by it. Either because (Python 2) they don't realise the loop variable is local to their current scope: http://www.librador.com/2014/07/10/Variable-scope-in-list-comprehension-vs-generator-expression/ or (Python 3) they get bitten by the change: https://old.reddit.com/r/Python/comments/425qmb/strange_python_27_34_difference/ (As is so often the case, whatever behaviour we choose, we're going to surprise somebody.) It is hardly surprising that people don't think too hard about scoping of comprehensions. Without a way to perform assignments inside comprehensions, aside from the loop variables themselves, there's nothing going on inside a comprehension where it makes a visible difference whether it is a local scope or a sublocal scope. *IF* assignment expressions are introduced, that is going to change. We have some choices: 1. Keep assignment expressions encapsulated in their implicit function, and be prepared for people to be annoyed because (with no way to declare them global or non-local inside an expression), they can't use them to get data in and out of the comprehension. 2. 
Allow assignment expressions to be exported out of the comprehension, and be prepared for people to be annoyed because they clobbered a local. (But for the reasons Tim Peters has already set out, I doubt this will happen often.)

3. Allow some sort of extra comprehension syntax to allow global/nonlocal declarations inside comprehensions.

    x = 1
    [nonlocal x := x+i for i in sequence]

(Hmmm... I thought I would hate that more than I actually do.)

4. Have some sort of cunning plan whereby if the variable in question exists in the local scope, it is implicitly local inside the comprehension:

    x = 1
    [x := i+1 for i in (1, 2)]
    assert x == 3

but if it doesn't, then the variable is implicitly sublocal inside the comprehension:

    del x
    [x := i+1 for i in (1, 2)]
    x  # raises NameError

Remember, the driving use-case which started this (ever-so-long) discussion was the ability to push data into a comprehension and then update it on each iteration, something like this:

    x = initial_value()
    results = [x := transform(x, i) for i in sequence]

Please, Nick, take your implementor's hat off, forget everything you know about the implementation of comprehensions and their implicit nested function, and tell me that doesn't look like it should work.

-- Steve

From steve at pearwood.info Sun Jun 24 14:48:20 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 25 Jun 2018 04:48:20 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <20180624184820.GP14437@ando.pearwood.info> On Sun, Jun 24, 2018 at 09:24:39AM -0700, Guido van Rossum wrote: > A quick follow-up: PEP 572 currently has two ideas: (a) introduce := for > inline assignment, (b) when := is used in a comprehension, set the scope > for the target as if the assignment occurred outside any comprehensions.
It > seems we have more support for (a) than for (b) -- at least Nick and Greg > seem to be +0 or better for (a) but -1 for (b). IIRC (b) originated with > Tim. I'm not sure who came up with the idea first, but as I remember it, the first mention of this came in a separate thread on Python-Ideas: https://mail.python.org/pipermail/python-ideas/2018-April/049631.html so possibly I'm to blame :-) That thread starts here: https://mail.python.org/pipermail/python-ideas/2018-April/049622.html If I did get the idea from Tim, I don't remember doing so. > But his essay on the topic, included as Appendix A ( > https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings) > does not even mention comprehensions. However, he did post his motivation > for (b) on python-ideas, IIRC a bit before PyCon; and the main text of the > PEP gives a strong motivation ( > https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). > Nevertheless, maybe we should compromise and drop (b)? I will have more to say about the whole "comprehensions are their own scope" issue later. But I'd like to see Nick's proposed PEP, or at least a draft of it, before making any final decisions. If it came down to it, I'd be happy with the ability to declare an assignment target nonlocal in the comprehension if that's what it takes. What do you think of this syntax? 
    [global|nonlocal] simple_target := expression

Inside a comprehension, without a declaration, the target would be sublocal (comprehension scope); that should make Nick happier :-) -- Steve From tim.peters at gmail.com Sun Jun 24 15:03:07 2018 From: tim.peters at gmail.com (Tim Peters) Date: Sun, 24 Jun 2018 14:03:07 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: [Guido] > A quick follow-up: PEP 572 currently has two ideas: (a) introduce := for inline > assignment, (b) when := is used in a comprehension, set the scope for the > target as if the assignment occurred outside any comprehensions. It seems > we have more support for (a) than for (b) -- at least Nick and Greg seem to > be +0 or better for (a) but -1 for (b). IIRC (b) originated with Tim. But his > essay on the topic, included as Appendix A > ( https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings) > does not even mention comprehensions.
First, the original example I gave would be approximately as well addressed by allowing to declare intended scopes in magically synthesized functions; like (say) p = None # to establish the intended scope of `p` while any( # split across lines just for readability n % p == 0 for p in small_primes): n //= p It didn't really require an inline assignment, just a way to override the unwanted (in this case) "all `for` targets are local to the invisible function" rigid consequence of the implementation du jour. Second, if it's dropped, then the PEP needs more words to define what happens in cases like the following, because different textual parts of a comprehension execute in different scopes, and that can become visible when bindings can be embedded: def f(): y = -1 ys = [y for _ in range(y := 5)] print(y, ys) Here `range(y := 5)` is executed in f's scope. Presumably the `y` in `y for` also refers to f's scope, despite that `y` textually _appears_ to be assigned to in the body of the listcomp, and so would - for that reason - expected to be local to the synthesized function, and so raise `UnboundLocalError` when referenced. It's incoherent without detailed knowledge of the implementation. def g(): y = -1 ys = [y for y in range(y := 5)] print(y, ys) And here the `y` in `y for y` is local to the synthesized function, and presumably has nothing to do with the `y` in the `range()` call. That's incoherent in its own way. Under the current PEP, all instances of `y` in `f` refer to the f-local `y`, and the listcomp in `g` is a compile-time error. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rosuav at gmail.com Sun Jun 24 16:16:37 2018 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 25 Jun 2018 06:16:37 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624180636.GO14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: On Mon, Jun 25, 2018 at 4:06 AM, Steven D'Aprano wrote: > > Remember, the driving use-case which started this (ever-so-long) > discussion was the ability to push data into a comprehension and then > update it on each iteration, something like this: > > x = initial_value() > results = [x := transform(x, i) for i in sequence] Which means there is another option. 5. Have the assignment be local to the comprehension, but the initial value of ANY variable is looked up from the surrounding scopes. That is: you will NEVER get UnboundLocalError from a comprehension/genexp; instead, you will look up the name as if it were in the surrounding scope, either getting a value or bombing with regular old NameError. Or possibly variations on this such as "the immediately surrounding scope only", rather than full name lookups. It'd have an awkward boundary somewhere, whichever way you do it. This isn't able to send information *out* of a comprehension, but it is able to send information *in*. 
ChrisA From mike at selik.org Sun Jun 24 17:35:46 2018 From: mike at selik.org (Michael Selik) Date: Sun, 24 Jun 2018 14:35:46 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624180636.GO14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: This thread started with a request for educator feedback, which I took to mean observations of student reactions. I've only had the chance to test the proposal on ~20 students so far, but I'd like the chance to gather more data for your consideration before the PEP is accepted or rejected. On Sun, Jun 24, 2018 at 11:09 AM Steven D'Aprano wrote: > Remember, the driving use-case which started this (ever-so-long) > discussion was the ability to push data into a comprehension and then > update it on each iteration, something like this: > > x = initial_value() > results = [x := transform(x, i) for i in sequence] > If that is the driving use-case, then the proposal should be rejected. The ``itertools.accumulate`` function has been available for a little while now and it handles this exact case. The accumulate function may even be more readable, as it explains the purpose explicitly, not merely the algorithm. And heck, it's a one-liner. results = accumulate(sequence, transform) The benefits for ``any`` and ``all`` seem useful. Itertools has "first_seen" in the recipes section. While it feels intuitively useful, I can't recall ever writing something similar myself. For some reason, I (almost?) always want to find all (counter-)examples and aggregate them in some way -- min or max, perhaps -- rather than just get the first. Even so, if it turns out those uses are quite prevalent, wouldn't a new itertool be better than a new operator? 
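For instance, accumulate can thread an explicit initial value through a running update (a sketch; `transform` here is an assumed stand-in for whatever binary update the comprehension would perform):

```python
from itertools import accumulate, chain

def transform(acc, i):
    # assumed stand-in for the comprehension's update expression
    return acc * 2 + i

sequence = [1, 2, 3]
initial = 1

# chain() supplies the initial value and the slice drops it from the output,
# mirroring `x = initial` followed by `[x := transform(x, i) for i in sequence]`.
results = list(accumulate(chain([initial], sequence), transform))[1:]
```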
It's good to solve the general problem, but so far the in-comprehension usage seems to have only a handful of cases. On Fri, Jun 22, 2018 at 9:14 PM Chris Barker via Python-Dev < python-dev at python.org> wrote: > again, not a huge deal, just a little bit more complexity > I worry that Python may experience something of a "death by a thousand cuts" along the lines of the "Remember the Vasa" warning. Python's greatest strength is its appeal to beginners. Little bits of added complexity have a non-linear effect. One day, we may wake up and Python won't be recommended as a beginner's language. On Fri, Jun 22, 2018 at 7:48 PM Steven D'Aprano wrote: > On Fri, Jun 22, 2018 at 10:59:43AM -0700, Michael Selik wrote: > > Of course they do -- they're less fluent at reading code. They don't > have the experience to judge good code from bad. > On the other hand, an "expert" may be so steeped in a particular subculture that he no longer can distinguish esoteric from intuitive. Don't be so fast to reject the wisdom of the inexperienced. > The question we should be asking is, do we only add features to Python > if they are easy for beginners? It's not that I especially want to add > features which *aren't* easy for beginners, but Python isn't Scratch and > "easy for beginners" should only be a peripheral concern. > On the contrary, I believe that "easy for beginners" should be a major concern. Ease of use has been and is a, or even the main reason for Python's success. When some other language becomes a better teaching language, it will eventually take over in business and science as well. Right now, Python is Scratch for adults. That's a great thing. Given the growth of the field, there are far more beginner programmers working today than there ever have been experts. Mozilla's array comprehensions are almost identical to Python's, aside > from a couple of trivial differences: > I can't prove it, but I think the phrase ordering difference is not trivial. 
> Students who are completely new to programming can see the similarity of > > [Python] list comprehensions to spoken language. > > I've been using comprehensions for something like a decade, and I can't > Python: any(line.startswith('#') for line in file) English: Any line starts with "#" in the file? -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sun Jun 24 19:02:17 2018 From: guido at python.org (Guido van Rossum) Date: Sun, 24 Jun 2018 16:02:17 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624184820.GP14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <20180624184820.GP14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 11:50 AM Steven D'Aprano wrote: > [Guido] > > [...] IIRC (b) originated with Tim. > > I'm not sure who came up with the idea first, but as I remember it, the > first mention of this came in a separate thread on Python-Ideas: > > https://mail.python.org/pipermail/python-ideas/2018-April/049631.html > > so possibly I'm to blame :-) > Actually that post sounds like the OP of that thread (Peter O'Connor) is to blame -- he proposed a similar thing using `=` for the assignment and custom syntax (`from `) to specify the initial value, and it looks like that inspired you. > > But his essay on the topic, included as Appendix A ( > > > https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings > ) > > does not even mention comprehensions. However, he did post his motivation > > for (b) on python-ideas, IIRC a bit before PyCon; and the main text of > the > > PEP gives a strong motivation ( > > https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). > > Nevertheless, maybe we should compromise and drop (b)? 
> > I will have more to say about the whole "comprehensions are their own > scope" issue later. But I'd like to see Nick's proposed PEP, or at least > a draft of it, before making any final decisions. > Agreed, though I assume it's just `given` again. > If it came down to it, I'd be happy with the ability to declare an > assignment target nonlocal in the comprehension if that's what it takes. > What do you think of this syntax? > > [global|nonlocal] simple_target := expression > > Inside a comprehension, without a declaration, the target would be > sublocal (comprehension scope); that should make Nick happier :-) > It's more special syntax. Just taking part (a) of PEP 572 would make most people happy enough. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Sun Jun 24 19:25:03 2018 From: guido at python.org (Guido van Rossum) Date: Sun, 24 Jun 2018 16:25:03 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 12:03 PM Tim Peters wrote: > [Guido] > :> However, [Tim] did post his motivation for (b) on python-ideas, IIRC a > bit > > before PyCon; and the main text of the PEP gives a strong motivation > > (https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). > Nevertheless, > > maybe we should compromise and drop (b)? > > Two things to say about that. 
First, the original example I gave would be > approximately as well addressed by allowing to declare intended scopes in > magically synthesized functions; like (say) > > p = None # to establish the intended scope of `p` > while any( # split across lines just for readability > n % p == 0 for p in small_primes): > n //= p > > It didn't really require an inline assignment, just a way to override the > unwanted (in this case) "all `for` targets are local to the invisible > function" rigid consequence of the implementation du jour. > Hm, that's more special syntax. The nice bit about (b) as currently specified is that it adds no syntax -- it adds a scope rule, but (as IIRC Steven has convincingly argued) few people care about those. Python's scope rules, when fully specified, are intricate to the point of being arcane (e.g. for class scopes) but all that has a purpose -- to make them so DWIM ("Do what I Mean") that in practice you almost never have to worry about them, *especially* when reading non-obfuscated code (and also when writing, except for a few well-known patterns). > Second, if it's dropped, then the PEP needs more words to define what > happens in cases like the following, because different textual parts of a > comprehension execute in different scopes, and that can become visible > when bindings can be embedded: > > def f(): > y = -1 > ys = [y for _ in range(y := 5)] > print(y, ys) > > Here `range(y := 5)` is executed in f's scope. Presumably the `y` in `y > for` also refers to f's scope, despite that `y` textually _appears_ to be > assigned to in the body of the listcomp, and so would - for that reason - > expected to be local to the synthesized function, and so raise > `UnboundLocalError` when referenced. It's incoherent without detailed > knowledge of the implementation. > That code should have the same meaning regardless of whether we accept (b) or not -- there is only one `y`, in f's scope. 
I don't mind if we have to add more words to the PEP's scope rules to make this explicit, though I doubt it -- the existing weirdness (in the comprehension spec) about the "outermost iterable" being evaluated in the surrounding scope specifies this. I wouldn't call it incoherent -- I think what I said about scope rules above applies here, it just does what you expect. > def g(): > y = -1 > ys = [y for y in range(y := 5)] > print(y, ys) > > And here the `y` in `y for y` is local to the synthesized function, and > presumably has nothing to do with the `y` in the `range()` call. That's > incoherent in its own way. > > Under the current PEP, all instances of `y` in `f` refer to the f-local > `y`, and the listcomp in `g` is a compile-time error. > And under the (b)-less proposal, `g` would interpret `y for y` as both referring to a new variable created just for the comprehension, and `y := 5` as referring to g's scope. Again I don't think it needs extra words in the spec. And the end user docs might just say "don't do that" (with a link to the reference manual's rule about the "outermost iterable"). Even if in the end we did find a case where we'd have to write an explicit rule to make what happens here a consequence of the spec rather than the implementation, that doesn't count as an argument for keeping (b) to me. In favor of (b) we have a few examples (see https://www.python.org/dev/peps/pep-0572/#scope-of-the-target) that require it, and more that you described on python-ideas (and also the motivating use case from the thread that Steven dug up, starting here: https://mail.python.org/pipermail/python-ideas/2018-April/049622.html). A "neutral" argument about (b) is that despite the "horrified" reactions that Nick saw, in practice it's going to confuse very few people (again, due to my point about Python's scope rules). 
I'd wager that the people who might be most horrified about it would be people who feel strongly that the change to the comprehension scope rules in Python 3 is a big improvement, and who are familiar with the difference in implementation of comprehensions (though not generator expressions) in Python 2 vs. 3. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Sun Jun 24 19:30:24 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Mon, 25 Jun 2018 11:30:24 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <5B302990.8060807@canterbury.ac.nz> Guido van Rossum wrote: > Greg seem to be +0 or better for (a) Actually, I'm closer to -1 on (a) as well. I don't like := as a way of getting assignment in an expression. The only thing I would give a non-negative rating is some form of "where" or "given". 
Brief summary of reasons for disliking ":=":

* Cryptic use of punctuation
* Too much overlap in functionality with "="
* Asymmetry between first and subsequent uses of the bound value
* Makes expressions cluttered and hard to read to my eyes

-- Greg From guido at python.org Sun Jun 24 19:28:16 2018 From: guido at python.org (Guido van Rossum) Date: Sun, 24 Jun 2018 16:28:16 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 2:10 PM Chris Angelico wrote: > On Mon, Jun 25, 2018 at 4:06 AM, Steven D'Aprano > wrote: > > > > Remember, the driving use-case which started this (ever-so-long) > > discussion was the ability to push data into a comprehension and then > > update it on each iteration, something like this: > > > > x = initial_value() > > results = [x := transform(x, i) for i in sequence] > > Which means there is another option. > > 5. Have the assignment be local to the comprehension, but the initial > value of ANY variable is looked up from the surrounding scopes. > > That is: you will NEVER get UnboundLocalError from a > comprehension/genexp; instead, you will look up the name as if it were > in the surrounding scope, either getting a value or bombing with > regular old NameError. > > Or possibly variations on this such as "the immediately surrounding > scope only", rather than full name lookups. It'd have an awkward > boundary somewhere, whichever way you do it. > > This isn't able to send information *out* of a comprehension, but it > is able to send information *in*. >
> But this "horrifies" me for a slightly different reason: it effectively introduces a new case of dynamic scoping, which Python used to do everywhere but has long switched away from, with the exception of class scopes (whose difference with function scopes sometimes confuses people -- usually people who put too much code in their class scope). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Sun Jun 24 19:40:59 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Mon, 25 Jun 2018 11:40:59 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180624180636.GO14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: <5B302C0B.80705@canterbury.ac.nz> Steven D'Aprano wrote: > You seem to be talking about an implementation which could change in the > future. I'm talking semantics of the proposed language feature. The way I see it, it's not about implementation details, it's about having a mental model that's easy to reason about. "Comprehensions run in their own scope, like a def or lambda" is a clear and simple mental model. It's easy to explain and keep in your head. The proposed semantics are much more complicated, and as far as I can see, are only motivated by use cases that you shouldn't really be doing in the first place. 
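Spelled out, that mental model says a listcomp behaves like an explicit nested function (a sketch of the model, not the compiler's actual code):

```python
x = 99
squares = [x * x for x in range(3)]  # the comprehension's x never leaks

def _listcomp(iterable):
    # roughly what the "own scope" model says the compiler synthesizes
    result = []
    for x in iterable:  # this x is local to _listcomp
        result.append(x * x)
    return result

same = _listcomp(range(3))
```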
-- Greg From ben+python at benfinney.id.au Sun Jun 24 19:38:34 2018 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 25 Jun 2018 09:38:34 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> Message-ID: <85lgb3dabp.fsf@benfinney.id.au> Greg Ewing writes: > Actually, I'm closer to -1 on (a) as well. I don't like := as a > way of getting assignment in an expression. The only thing I would > give a non-negative rating is some form of "where" or "given". +1 to this; ":=" doesn't convey the meaning well. Python's syntax typically eschews cryptic punctuation in favour of existing words that convey an appropriate meaning, and I agree with Greg that would be a better way to get this effect. -- \ "Self-respect: The secure feeling that no one, as yet, is | `\ suspicious." --Henry L. Mencken | _o__) | Ben Finney From guido at python.org Sun Jun 24 19:57:55 2018 From: guido at python.org (Guido van Rossum) Date: Sun, 24 Jun 2018 16:57:55 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 2:41 PM Michael Selik wrote: > This thread started with a request for educator feedback, which I took to > mean observations of student reactions. I've only had the chance to test > the proposal on ~20 students so far, but I'd like the chance to gather more > data for your consideration before the PEP is accepted or rejected. > Sure. Since the target for the PEP is Python 3.8 I am in no particular > hurry. It would be important to know how you present it to your students.
> On Sun, Jun 24, 2018 at 11:09 AM Steven D'Aprano > wrote: > >> Remember, the driving use-case which started this (ever-so-long) >> discussion was the ability to push data into a comprehension and then >> update it on each iteration, something like this: >> >> x = initial_value() >> results = [x := transform(x, i) for i in sequence] >> > > If that is the driving use-case, then the proposal should be rejected. The > ``itertools.accumulate`` function has been available for a little while now > and it handles this exact case. The accumulate function may even be more > readable, as it explains the purpose explicitly, not merely the algorithm. > And heck, it's a one-liner. > > results = accumulate(sequence, transform) > I think that's a misunderstanding. At the very least the typical use case is *not* using an existing transform function which is readily passed to accumulate -- instead, it's typically written as a simple expression (e.g. `total := total + v` in the PEP) which would require a lambda. Plus, I don't know what kind of students you are teaching, but for me, whenever the solution requires a higher-order function (like accumulate), this implies a significant speed bump -- both when writing and when reading code. (Honestly, whenever I read code that uses itertools, I end up making a trip to StackOverflow :-). > The benefits for ``any`` and ``all`` seem useful. Itertools has > "first_seen" in the recipes section. > (I think you mean first_true().) > While it feels intuitively useful, I can't recall ever writing something > similar myself. For some reason, I (almost?) always want to find all > (counter-)examples and aggregate them in some way -- min or max, perhaps -- > rather than just get the first. > I trust Tim's intuition here, he's written about this. 
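For reference, the first_true() recipe from the itertools documentation is a one-liner:

```python
def first_true(iterable, default=False, pred=None):
    """Return the first true value in the iterable, or default.

    From the recipes section of the itertools documentation."""
    # filter(None, it) keeps truthy items; passing pred narrows the test
    return next(filter(pred, iterable), default)
```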
Also, Python's predecessor, ABC, had quantifiers (essentially any() and all()) built into the language, and the semantics included making the first (counter-)example available ( https://homepages.cwi.nl/~steven/abc/qr.html#TESTS). Essentially IF SOME x IN values HAS x < 0: WRITE "Found a negative x:", x equivalently IF EACH x IN values HAS x >= 0: # ... ELSE: WRITE "Found a negative x:", x and even IF NO x IN values HAS x < 0: # ... ELSE: WRITE "Found a negative x:", x > Even so, if it turns out those uses are quite prevalent, wouldn't a new > itertool be better than a new operator? It's good to solve the general > problem, but so far the in-comprehension usage seems to have only a handful > of cases. > Perhaps, but IMO the new itertool would be much less useful than the new syntax. > On Fri, Jun 22, 2018 at 9:14 PM Chris Barker via Python-Dev < > python-dev at python.org> wrote: > >> again, not a huge deal, just a little bit more complexity >> > > I worry that Python may experience something of a "death by a thousand > cuts" along the lines of the "Remember the Vasa" warning. Python's greatest > strength is its appeal to beginners. Little bits of added complexity have a > non-linear effect. One day, we may wake up and Python won't be recommended > as a beginner's language. > I don't think that appeal to beginners is Python's greatest strength. I'd rather say that it is its appeal to both beginners and experts (and everyone in between). If true appeal to beginners is needed, Scratch or Processing would probably be better. > On Fri, Jun 22, 2018 at 7:48 PM Steven D'Aprano > wrote: > >> On Fri, Jun 22, 2018 at 10:59:43AM -0700, Michael Selik wrote: >> >> Of course they do -- they're less fluent at reading code. They don't >> have the experience to judge good code from bad. >> > > On the other hand, an "expert" may be so steeped in a particular > subculture that [they] no longer can distinguish esoteric from intuitive. 
> Don't be so fast to reject the wisdom of the inexperienced. > Nor should we cater to them excessively though. While the user is indeed king, it's also well known that most users when they are asking for a feature don't know what they want (same for kings, actually, that's why they have advisors :-). > The question we should be asking is, do we only add features to Python >> if they are easy for beginners? It's not that I especially want to add >> features which *aren't* easy for beginners, but Python isn't Scratch and >> "easy for beginners" should only be a peripheral concern. >> > > On the contrary, I believe that "easy for beginners" should be a major > concern. Ease of use has been and is a, or even the main reason for > Python's success. When some other language becomes a better teaching > language, it will eventually take over in business and science as well. > Right now, Python is Scratch for adults. That's a great thing. Given the > growth of the field, there are far more beginner programmers working today > than there ever have been experts. > I'm sorry, but this offends me, and I don't believe it's true at all. Python is *not* a beginners language, and you are mixing ease of use and ease of learning. Python turns beginners into experts at an unprecedented rate, and that's the big difference with Scratch. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mike at selik.org Sun Jun 24 20:33:55 2018 From: mike at selik.org (Michael Selik) Date: Sun, 24 Jun 2018 17:33:55 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624180636.GO14437@ando.pearwood.info> Message-ID: On Sun, Jun 24, 2018 at 4:57 PM Guido van Rossum wrote: > On Sun, Jun 24, 2018 at 2:41 PM Michael Selik wrote: > >> This thread started with a request for educator feedback, which I took to >> mean observations of student reactions. I've only had the chance to test >> the proposal on ~20 students so far, but I'd like the chance to gather more >> data for your consideration before the PEP is accepted or rejected. >> > > Sure. Since the target for the PEP is Python 3.8 I am in no particular > hurry. It would be important to know how you present it to your students. > Absolutely. Since this has come up, I'll make an effort to be more systematic in data collection. > On Sun, Jun 24, 2018 at 11:09 AM Steven D'Aprano >> wrote: >> >>> Remember, the driving use-case which started this (ever-so-long) >>> discussion was the ability to push data into a comprehension and then >>> update it on each iteration, something like this: >>> >>> x = initial_value() >>> results = [x := transform(x, i) for i in sequence] >>> >> >> If that is the driving use-case, then the proposal should be rejected. >> The ``itertools.accumulate`` function has been available for a little while >> now and it handles this exact case. The accumulate function may even be >> more readable, as it explains the purpose explicitly, not merely the >> algorithm. And heck, it's a one-liner. >> >> results = accumulate(sequence, transform) >> > > I think that's a misunderstanding. 
At the very least the typical use case > is *not* using an existing transform function which is readily passed to > accumulate -- instead, it's typically written as a simple expression (e.g. > `total := total + v` in the PEP) which would require a lambda. > Plus, I don't know what kind of students you are teaching, but for me, > whenever the solution requires a higher-order function (like accumulate), > this implies a significant speed bump -- both when writing and when reading > code. (Honestly, whenever I read code that uses itertools, I end up making > a trip to StackOverflow :-). > Mostly mid-career professionals, of highly varying backgrounds. The higher-order functions do require some cushioning getting into, but I have some tricks I've learned over the years to make it go over pretty well. On Fri, Jun 22, 2018 at 7:48 PM Steven D'Aprano wrote: >> > On Fri, Jun 22, 2018 at 10:59:43AM -0700, Michael Selik wrote: >>> >>> Of course they do -- they're less fluent at reading code. They don't >>> have the experience to judge good code from bad. >>> >> >> On the other hand, an "expert" may be so steeped in a particular >> subculture that [they] no longer can distinguish esoteric from intuitive. >> Don't be so fast to reject the wisdom of the inexperienced. >> > > Nor should we cater to them excessively though. While the user is indeed > king, it's also well known that most users when they are asking for a > feature don't know what they want (same for kings, actually, that's why > they have advisors :-). > > >> The question we should be asking is, do we only add features to Python >>> if they are easy for beginners? It's not that I especially want to add >>> features which *aren't* easy for beginners, but Python isn't Scratch and >>> "easy for beginners" should only be a peripheral concern. >>> >> >> On the contrary, I believe that "easy for beginners" should be a major >> concern. Ease of use has been and is a, or even the main reason for >> Python's success. 
When some other language becomes a better teaching >> language, it will eventually take over in business and science as well. >> Right now, Python is Scratch for adults. That's a great thing. Given the >> growth of the field, there are far more beginner programmers working today >> than there ever have been experts. >> > > I'm sorry, but this offends me, and I don't believe it's true at all. > Python is *not* a beginners language, and you are mixing ease of use and > ease of learning. Python turns beginners into experts at an unprecedented > rate, and that's the big difference with Scratch. > By saying "Scratch for adults" I meant that Python is a language that can be adopted by beginners and rapidly make them professionals, not that it's exclusively a beginner's language. Also, Scratch and similar languages, like NetLogo, have some interesting features that allow beginners to write some sophisticated parallelism. I don't mean "beginner's language" in that it's overly simplistic, but that it enables what would be complex in other languages. I realize that my phrasing was likely to be misunderstood without knowing the context that I teach working professionals who are asked to be immediately productive at high-value tasks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From tim.peters at gmail.com Sun Jun 24 22:46:27 2018 From: tim.peters at gmail.com (Tim Peters) Date: Sun, 24 Jun 2018 21:46:27 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: > > [Tim] >> > . 
First, the original example I gave would be approximately as well >> addressed by allowing to declare intended scopes in magically synthesized >> functions; like (say) >> >> p = None # to establish the intended scope of `p` >> while any( # split across lines just for readability >> n % p == 0 for p in small_primes): >> n //= p >> >> It didn't really require an inline assignment, just a way to override the >> unwanted (in this case) "all `for` targets are local to the invisible >> function" rigid consequence of the implementation du jour. >> > [Guido] > Hm, that's more special syntax. > Of course - I'm anticipating that the PEP will be changed to throw out useful assignment expressions in comprehensions, but I still want a way to "export" comprehension for-targets at times ;-) > The nice bit about (b) as currently specified is that it adds no syntax -- > it adds a scope rule, but (as IIRC Steven has convincingly argued) few > people care about those. Python's scope rules, when fully specified, are > intricate to the point of being arcane (e.g. for class scopes) but all that > has a purpose -- to make them so DWIM ("Do what I Mean") that in practice > you almost never have to worry about them, *especially* when reading > non-obfuscated code (and also when writing, except for a few well-known > patterns). > You and Steven and i appear to be on the same page here - but it's in a book nobody else seems to own :-( To me it's just screamingly obvious that total = 0 cumsums = [total := total + value for value in data] "should do" what it obviously intends to do - and that the only thing stopping that is a bass-ackwards focus on what most trivially falls out of the current implementation. ... def f(): >> y = -1 >> ys = [y for _ in range(y := 5)] >> print(y, ys) >> >> Here `range(y := 5)` is executed in f's scope. 
Presumably the `y` in `y >> for` also refers to f's scope, despite that `y` textually _appears_ to be >> assigned to in the body of the listcomp, and so would - for that reason - >> expected to be local to the synthesized function, and so raise >> `UnboundLocalError` when referenced. It's incoherent without detailed >> knowledge of the implementation. >> > > That code should have the same meaning regardless of whether we accept (b) > or not -- there is only one `y`, in f's scope. I don't mind if we have to > add more words to the PEP's scope rules to make this explicit, though I > doubt it -- the existing weirdness (in the comprehension spec) about the > "outermost iterable" being evaluated in the surrounding scope specifies > this. I wouldn't call it incoherent -- I think what I said about scope > rules above applies here, it just does what you expect. > Remove "y = -1" and - voila! - we have the dreaded "parent local scoping" Nick finds so baffling to explain (or so he claims). That is, "has exactly the same scope in the comprehension as in the parent block, and will create a local in the latter if the name is otherwise unknown in the parent" comes with assignment expressions, regardless of whether _all_ such targets "leak" (the current PEP) or only targets in the expression defining the iterable of the outermost `for` (the PEP without leaking assignment expressions in comprehensions). As to whether it "does what you expect", no, not really! In a world where _all_ binding targets in a comprehension are claimed to be local to the comprehension, I _expect_ that `y := 5` appearing inside the listcomp means `y` is local to the listcomp. "Oh - unless the binding appears in the expression defining the iterable of the outermost `for`" comes from Mars. Not that it really matters much, but (b) provides consistent semantics in these cases. No need to search Mars for weird exceptions ;-) ... 
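Concretely, under (b)'s rule that an assignment-expression target in a comprehension binds in the containing scope, the cumulative sums example just works (a sketch assuming the PEP's specified semantics, runnable on any Python that implements them):

```python
data = [1, 2, 3, 4]
total = 0
# `total` binds in the containing scope under the PEP's rule, so the
# running value threads through the comprehension and survives it.
cumsums = [total := total + value for value in data]
```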
> A "neutral" argument about (b) is that despite the "horrified" reactions that Nick saw, in practice it's going to confuse very few people (again, due to my point about Python's scope rules). I'd wager that the people who might be most horrified about it would be people who feel strongly that the change to the comprehension scope rules in Python 3 is a big improvement, and who are familiar with the difference in implementation of comprehensions (though not generator expressions) in Python 2 vs. 3.

I also doubt it will generally confuse people in practice (to the contrary, I expect they _will_ be confused if things like the cumulative sums example blow up with UnboundLocalError). But I still don't get the source of the "horror". Assignment expression semantics are wholly consistent with ordinary nested lexical scoping, with or without (b). The only difference is in the scopes picked for assignment expression target names (except for those appearing in the expression defining the iterable yadda yadda yadda).

From songofacandy at gmail.com Mon Jun 25 05:01:51 2018
From: songofacandy at gmail.com (INADA Naoki)
Date: Mon, 25 Jun 2018 18:01:51 +0900
Subject: [Python-Dev] PEP 580 (C call protocol) draft implementation
In-Reply-To: <5B2D12D6.8010305@UGent.be>
References: <5B2D12D6.8010305@UGent.be>
Message-ID:

Thanks, Jeroen. I haven't reviewed your code yet, but the benchmark shows no significant slowdown. It's a good start!
$ ./python -m perf compare_to master.json pep580.json -G --min-speed=5
Slower (6):
- scimark_fft: 398 ms +- 20 ms -> 442 ms +- 42 ms: 1.11x slower (+11%)
- xml_etree_process: 99.6 ms +- 5.2 ms -> 109 ms +- 16 ms: 1.10x slower (+10%)
- crypto_pyaes: 138 ms +- 1 ms -> 149 ms +- 13 ms: 1.09x slower (+9%)
- pathlib: 24.8 ms +- 1.8 ms -> 27.0 ms +- 3.8 ms: 1.09x slower (+9%)
- spectral_norm: 155 ms +- 8 ms -> 165 ms +- 17 ms: 1.06x slower (+6%)
- django_template: 151 ms +- 5 ms -> 160 ms +- 8 ms: 1.06x slower (+6%)

Faster (6):
- pickle_list: 5.37 us +- 0.74 us -> 4.80 us +- 0.34 us: 1.12x faster (-11%)
- regex_v8: 29.5 ms +- 3.3 ms -> 27.1 ms +- 0.1 ms: 1.09x faster (-8%)
- telco: 8.08 ms +- 1.19 ms -> 7.45 ms +- 0.16 ms: 1.09x faster (-8%)
- regex_effbot: 3.84 ms +- 0.36 ms -> 3.56 ms +- 0.05 ms: 1.08x faster (-7%)
- sqlite_synth: 3.98 us +- 0.53 us -> 3.72 us +- 0.07 us: 1.07x faster (-6%)
- richards: 89.3 ms +- 9.9 ms -> 84.6 ms +- 5.7 ms: 1.06x faster (-5%)

Benchmark hidden because not significant (48)

Regards,

On Sat, Jun 23, 2018 at 12:32 AM Jeroen Demeyer wrote:
> Hello all,
>
> I have a first draft implementation of PEP 580 (introducing the C call protocol):
>
> https://github.com/jdemeyer/cpython/tree/pep580
>
> Almost all tests pass, only test_gdb and test_pydoc fail for me. I still have to fix those.
>
> Jeroen.
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com

--
INADA Naoki
From ncoghlan at gmail.com Mon Jun 25 07:44:37 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 25 Jun 2018 21:44:37 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 02:24, Guido van Rossum wrote:
> A quick follow-up: PEP 572 currently has two ideas: (a) introduce := for inline assignment, (b) when := is used in a comprehension, set the scope for the target as if the assignment occurred outside any comprehensions. It seems we have more support for (a) than for (b) -- at least Nick and Greg seem to be +0 or better for (a)

Right, the proposed blunt solution to "Should I use 'NAME = EXPR' or 'NAME := EXPR'?" bothers me a bit, but it's the implementation implications of parent local scoping that I fear will create a semantic tar pit we can't get out of later.

> but -1 for (b). IIRC (b) originated with Tim. But his essay on the topic, included as Appendix A (https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings) does not even mention comprehensions. However, he did post his motivation for (b) on python-ideas, IIRC a bit before PyCon; and the main text of the PEP gives a strong motivation (https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). Nevertheless, maybe we should compromise and drop (b)?
Unfortunately, I think the key rationale for (b) is that if you *don't* do something along those lines, then there's a different strange scoping discrepancy that arises between the non-comprehension forms of container displays and the comprehension forms:

    (NAME := EXPR,)                                # Binds a local
    tuple(NAME := EXPR for __ in range(1))         # Doesn't bind a local

    [NAME := EXPR]                                 # Binds a local
    [NAME := EXPR for __ in range(1)]              # Doesn't bind a local
    list(NAME := EXPR for __ in range(1))          # Doesn't bind a local

    {NAME := EXPR}                                 # Binds a local
    {NAME := EXPR for __ in range(1)}              # Doesn't bind a local
    set(NAME := EXPR for __ in range(1))           # Doesn't bind a local

    {NAME := EXPR : EXPR2}                         # Binds a local
    {NAME := EXPR : EXPR2 for __ in range(1)}      # Doesn't bind a local
    set((NAME := EXPR, EXPR2) for __ in range(1))  # Doesn't bind a local

Those scoping inconsistencies aren't *new*, but provoking them currently involves either class scopes, or messing about with locals().

The one virtue that choosing this particular set of discrepancies has is that the explanation for why they happen is the same as the explanation for how the iteration variable gets hidden from the containing scope: because "(EXPR for ....)" et al create an implicitly nested scope, and that nested scope behaves the same way as an explicitly nested scope as far as name binding and name resolution is concerned.

Parent local scoping tries to mitigate the surface inconsistency by changing how write semantics are defined for implicitly nested scopes, but that comes at the cost of making those semantics inconsistent with explicitly nested scopes and with the read semantics of implicitly nested scopes.
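For the record, the accepted version of PEP 572 resolved this discrepancy in the opposite direction: with parent-local scoping, a walrus target binds in the containing scope in both the display form and the comprehension form, while the iteration variable stays hidden. A sketch runnable on Python 3.8+:

```python
# Display form and comprehension form agree under PEP 572 as accepted:
# both bind the walrus target in the enclosing function's scope.
def demo():
    display = [y := 2]                 # binds y as a local of demo()
    comp = [z := n for n in range(3)]  # z also binds in demo() via parent-local scoping
    try:
        n                              # the for target does NOT leak out
    except NameError:                  # (UnboundLocalError on 3.12+ is a NameError subclass)
        leaked = False
    else:
        leaked = True
    return y, z, comp, leaked

assert demo() == (2, 2, [0, 1, 2], False)
```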
The early iterations of PEP 572 tried to duck this whole realm of potential semantic inconsistencies by introducing sublocal scoping instead, such that the scoping for assignment expression targets would be unusual, but they'd be consistently unusual regardless of where they appeared, and their quirks would clearly be the result of how assignment expressions were defined, rather than only showing up in how they interacted with other scoping design decisions made years ago.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From ncoghlan at gmail.com Mon Jun 25 08:17:09 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 25 Jun 2018 22:17:09 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 09:25, Guido van Rossum wrote:
> A "neutral" argument about (b) is that despite the "horrified" reactions that Nick saw, in practice it's going to confuse very few people (again, due to my point about Python's scope rules). I'd wager that the people who might be most horrified about it would be people who feel strongly that the change to the comprehension scope rules in Python 3 is a big improvement, and who are familiar with the difference in implementation of comprehensions (though not generator expressions) in Python 2 vs. 3.

FWIW, the most cryptic parent local scoping related exception I've been able to devise so far still exhibits PEP 572's desired "Omitting the comprehension scope entirely would give you the same name lookup behaviour" semantics:

>>> def outer(x=1):
...     def middle():
...         return [x := x + i for i in range(10)]
...     return middle()
...
>>> outer()
Traceback (most recent call last):
  ...
NameError: free variable 'x' referenced before assignment in enclosing scope

It isn't the parent local scoping, or even the assignment expression, that's at fault there, since you'd get exactly the same exception for:

    def outer(x=1):
        def middle():
            x = x + 1
            return x
        return middle()

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From p.f.moore at gmail.com Mon Jun 25 08:25:09 2018
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 25 Jun 2018 13:25:09 +0100
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 12:44, Nick Coghlan wrote:
> Unfortunately, I think the key rationale for (b) is that if you *don't* do something along those lines, then there's a different strange scoping discrepancy that arises between the non-comprehension forms of container displays and the comprehension forms:

I've been mostly ignoring this proposal for a while now, so I'm going to respond here in the context of someone with a bit of an idea of the underlying complexities, but otherwise coming at it as a new proposal.

> (NAME := EXPR,)                                # Binds a local
> tuple(NAME := EXPR for __ in range(1))         # Doesn't bind a local
>
> [NAME := EXPR]                                 # Binds a local
> [NAME := EXPR for __ in range(1)]              # Doesn't bind a local
> list(NAME := EXPR for __ in range(1))          # Doesn't bind a local
>
> {NAME := EXPR}                                 # Binds a local
> {NAME := EXPR for __ in range(1)}              # Doesn't bind a local
> set(NAME := EXPR for __ in range(1))           # Doesn't bind a local
>
> {NAME := EXPR : EXPR2}                         # Binds a local
> {NAME := EXPR : EXPR2 for __ in range(1)}      # Doesn't bind a local
> set((NAME := EXPR, EXPR2) for __ in range(1))  # Doesn't bind a local

None of those "discrepancies" bother me in the slightest, when taken in isolation as you present them here.
I suspect you could lead me through a chain of logic that left me understanding why you describe them as discrepancies, but without that explanation, I'm fine with all of them. I'd also say that they seem contrived (not just in the use of artificial names, but also in the sense that I'm not sure why I'd want to use this *pattern*) so I'd happily say "well, don't do that then" if things started behaving non-intuitively.

> Those scoping inconsistencies aren't *new*, but provoking them currently involves either class scopes, or messing about with locals().

And to reinforce my point above, I already consider putting significant code in class scopes, or using locals() to be techniques that should only be used sparingly and with a clear understanding of the subtleties. I'm sure you could say "but the examples above would be much more common" in response to which I'd like to see real use cases that behave non-intuitively in the way you're concerned about.

> The one virtue that choosing this particular set of discrepancies has is that the explanation for why they happen is the same as the explanation for how the iteration variable gets hidden from the containing scope: because "(EXPR for ....)" et al create an implicitly nested scope, and that nested scope behaves the same way as an explicitly nested scope as far as name binding and name resolution is concerned.

But that's precisely why I find the behaviour intuitive - the nested scope is the *reason* things behave this way, not some sort of easily-overlooked way the "problem" can be explained away.

> Parent local scoping tries to mitigate the surface inconsistency by changing how write semantics are defined for implicitly nested scopes, but that comes at the cost of making those semantics inconsistent with explicitly nested scopes and with the read semantics of implicitly nested scopes.
> The early iterations of PEP 572 tried to duck this whole realm of potential semantic inconsistencies by introducing sublocal scoping instead, such that the scoping for assignment expression targets would be unusual, but they'd be consistently unusual regardless of where they appeared, and their quirks would clearly be the result of how assignment expressions were defined, rather than only showing up in how they interacted with other scoping design decisions made years ago.

Those last two paragraphs made my head explode, as far as I can see by virtue of the fact that they try to over-analyze the fairly simple intuition I have that "there's a nested scope involved".

Disclaimer: I may well have got a *lot* of subtleties wrong here, and it's quite likely that my impressions don't stand up to the harsh reality of how the implementation works. But my comments are on the basis of my *intuition*, whether that's right or wrong. And if the reality violates my intuition, it's *other* constructs that I find non-intuitive, not this one. (I'm perfectly happy to concede that it's not possible to avoid *any* non-intuitive behaviour - all I'm trying to say is that my intuition doesn't balk at this one, unlike yours).

Paul

From p.f.moore at gmail.com Mon Jun 25 08:28:16 2018
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 25 Jun 2018 13:28:16 +0100
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 13:17, Nick Coghlan wrote:
> On 25 June 2018 at 09:25, Guido van Rossum wrote:
>> A "neutral" argument about (b) is that despite the "horrified" reactions that Nick saw, in practice it's going to confuse very few people (again, due to my point about Python's scope rules).
>> I'd wager that the people who might be most horrified about it would be people who feel strongly that the change to the comprehension scope rules in Python 3 is a big improvement, and who are familiar with the difference in implementation of comprehensions (though not generator expressions) in Python 2 vs. 3.
>
> FWIW, the most cryptic parent local scoping related exception I've been able to devise so far still exhibits PEP 572's desired "Omitting the comprehension scope entirely would give you the same name lookup behaviour" semantics:
>
> >>> def outer(x=1):
> ...     def middle():
> ...         return [x := x + i for i in range(10)]
> ...     return middle()
> ...
> >>> outer()
> Traceback (most recent call last):
> ...
> NameError: free variable 'x' referenced before assignment in enclosing scope
>
> It isn't the parent local scoping, or even the assignment expression, that's at fault there, since you'd get exactly the same exception for:
>
> def outer(x=1):
>     def middle():
>         x = x + 1
>         return x
>     return middle()

Once again offering an "intuition" based response:

1. That definition of outer() is very complicated, I don't *expect* to understand it without checking the details. So the NameError is simply "hmm, wonder what triggered that?" not "OMG that's not what I'd expect!" :-)

2. Given that your version with no assignment expression or comprehension exhibits the same behaviour, I'm not sure what your argument is here anyway...
Paul

From ncoghlan at gmail.com Mon Jun 25 08:24:20 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 25 Jun 2018 22:24:20 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 22:17, Nick Coghlan wrote:
> FWIW, the most cryptic parent local scoping related exception I've been able to devise so far still exhibits PEP 572's desired "Omitting the comprehension scope entirely would give you the same name lookup behaviour" semantics:
>
> >>> def outer(x=1):
> ...     def middle():
> ...         return [x := x + i for i in range(10)]
> ...     return middle()
> ...
> >>> outer()
> Traceback (most recent call last):
> ...
> NameError: free variable 'x' referenced before assignment in enclosing scope
>
> It isn't the parent local scoping, or even the assignment expression, that's at fault there, since you'd get exactly the same exception for:
>
> def outer(x=1):
>     def middle():
>         x = x + 1
>         return x
>     return middle()

Oops, I didn't mean to say "exactly the same exception" here, as the whole reason I'd settled on this example as the most cryptic one I'd found so far was the fact that the doubly nested version *doesn't* give you the same exception as the singly nested version: the version without the comprehension throws UnboundLocalError instead.

However, the resolution is the same either way: either 'x' has to be declared as 'nonlocal x' in 'middle', or else it has to be passed in to 'middle' as a parameter.

Cheers,
Nick.
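Both of the resolutions mentioned above can be checked directly; a quick sketch in plain Python 3, no assignment expressions required:

```python
# Without a fix, middle() raises UnboundLocalError: assigning to x makes
# x local to middle(), so the read on the right-hand side fails.
def broken(x=1):
    def middle():
        x = x + 1
        return x
    return middle()

# Fix 1: declare 'nonlocal x' so middle() rebinds the enclosing x.
def fixed_nonlocal(x=1):
    def middle():
        nonlocal x
        x = x + 1
        return x
    return middle()

# Fix 2: pass x in to middle() as a parameter instead.
def fixed_parameter(x=1):
    def middle(x):
        return x + 1
    return middle(x)

try:
    broken()
except UnboundLocalError:
    pass  # expected
else:
    raise AssertionError("expected UnboundLocalError")
assert fixed_nonlocal() == 2
assert fixed_parameter() == 2
```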
--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From ncoghlan at gmail.com Mon Jun 25 08:35:00 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 25 Jun 2018 22:35:00 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <20180624184820.GP14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 09:02, Guido van Rossum wrote:
> On Sun, Jun 24, 2018 at 11:50 AM Steven D'Aprano wrote:
>> I will have more to say about the whole "comprehensions are their own scope" issue later. But I'd like to see Nick's proposed PEP, or at least a draft of it, before making any final decisions.
>
> Agreed, though I assume it's just `given` again.

While I still have some TODO notes of my own to resolve before posting it to python-ideas, the examples section at https://github.com/ncoghlan/peps/pull/2/files#diff-7a25ca1769914c1141cb5c63dc781f32R202 already gives a pretty good idea of the differences relative to PEP 572: rebinding existing names is unchanged from PEP 572, but introducing new names requires a bit of "Yes, I really do want to introduce this new name here" repetition.

The big difference from previous iterations of the "given" idea is that it doesn't try to completely replace the proposed inline assignments, it just supplements them by providing a way to do inline name *declarations* (which may include declaring targets as global or nonlocal, just as regular function level declarations can).

Cheers,
Nick.
--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From p.f.moore at gmail.com Mon Jun 25 09:46:31 2018
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 25 Jun 2018 14:46:31 +0100
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

On 25 June 2018 at 13:24, Nick Coghlan wrote:
> On 25 June 2018 at 22:17, Nick Coghlan wrote:
>> FWIW, the most cryptic parent local scoping related exception I've been able to devise so far still exhibits PEP 572's desired "Omitting the comprehension scope entirely would give you the same name lookup behaviour" semantics:
>>
>> >>> def outer(x=1):
>> ...     def middle():
>> ...         return [x := x + i for i in range(10)]
>> ...     return middle()
>> ...
>> >>> outer()
>> Traceback (most recent call last):
>> ...
>> NameError: free variable 'x' referenced before assignment in enclosing scope
>>
>> It isn't the parent local scoping, or even the assignment expression, that's at fault there, since you'd get exactly the same exception for:
>>
>> def outer(x=1):
>>     def middle():
>>         x = x + 1
>>         return x
>>     return middle()
>
> Oops, I didn't mean to say "exactly the same exception" here, as the whole reason I'd settled on this example as the most cryptic one I'd found so far was the fact that the doubly nested version *doesn't* give you the same exception as the singly nested version: the version without the comprehension throws UnboundLocalError instead.

At the level of "what my intuition says" the result is the same in both cases - "it throws an exception". I have no intuition on *which* exception would be raised and would experiment (or look up the details) if I cared.
> However, the resolution is the same either way: either 'x' has to be declared as 'nonlocal x' in 'middle', or else it has to be passed in to 'middle' as a parameter.

Once someone told me that's what I needed, it's sufficiently obvious that I'm fine with that. If no-one was able to tell me what to do, I'd simply rewrite the code to be less obfuscated :-)

I've probably explained my intuition enough here. If we debate any further I'll just end up knowing what's going on, destroying my value as an "uninformed user" :-)

Paul

From vano at mail.mipt.ru Mon Jun 25 10:19:19 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Mon, 25 Jun 2018 17:19:19 +0300
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <5B302990.8060807@canterbury.ac.nz>
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz>
Message-ID: <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru>

On 25.06.2018 2:30, Greg Ewing wrote:
> Guido van Rossum wrote:
>> Greg seem to be +0 or better for (a)
>
> Actually, I'm closer to -1 on (a) as well. I don't like := as a way of getting assignment in an expression. The only thing I would give a non-negative rating is some form of "where" or "given".

"as" was suggested even before it became a keyword in `with':

    if (re.match(regex, line) as m) is not None:

The only objective objection I've heard is that it's already used in `import' and `with' -- but that's perfectly refutable.
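The `as` spelling suggested here never made it into the language, but the `:=` spelling that PEP 572 ultimately adopted expresses the same idiom on Python 3.8+. A sketch, with an illustrative regex and input line:

```python
import re

# Binding the match object inside the condition, in the := spelling that
# was eventually adopted. The pattern and line below are made up.
line = "error: disk full"
if (m := re.match(r"error: (.+)", line)) is not None:
    detail = m.group(1)
assert detail == "disk full"
```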
> Brief summary of reasons for disliking ":=":
>
> * Cryptic use of punctuation
>
> * Too much overlap in functionality with "="
>
> * Asymmetry between first and subsequent uses of the bound value
>
> * Makes expressions cluttered and hard to read to my eyes

--
Regards,
Ivan

From vano at mail.mipt.ru Mon Jun 25 11:01:54 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Mon, 25 Jun 2018 18:01:54 +0300
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID: <2f1d0937-7e27-6a68-e862-8cff0d2c9687@mail.mipt.ru>

On 25.06.2018 14:44, Nick Coghlan wrote:
> On 25 June 2018 at 02:24, Guido van Rossum wrote:
>> A quick follow-up: PEP 572 currently has two ideas: (a) introduce := for inline assignment, (b) when := is used in a comprehension, set the scope for the target as if the assignment occurred outside any comprehensions. It seems we have more support for (a) than for (b) -- at least Nick and Greg seem to be +0 or better for (a)
>
> Right, the proposed blunt solution to "Should I use 'NAME = EXPR' or 'NAME := EXPR'?" bothers me a bit, but it's the implementation implications of parent local scoping that I fear will create a semantic tar pit we can't get out of later.
>
>> but -1 for (b). IIRC (b) originated with Tim. But his essay on the topic, included as Appendix A (https://www.python.org/dev/peps/pep-0572/#appendix-a-tim-peters-s-findings) does not even mention comprehensions. However, he did post his motivation for (b) on python-ideas, IIRC a bit before PyCon; and the main text of the PEP gives a strong motivation (https://www.python.org/dev/peps/pep-0572/#scope-of-the-target). Nevertheless, maybe we should compromise and drop (b)?
> Unfortunately, I think the key rationale for (b) is that if you *don't* do something along those lines, then there's a different strange scoping discrepancy that arises between the non-comprehension forms of container displays and the comprehension forms:
>
> (NAME := EXPR,)                                # Binds a local
> tuple(NAME := EXPR for __ in range(1))         # Doesn't bind a local
>
> [NAME := EXPR]                                 # Binds a local
> [NAME := EXPR for __ in range(1)]              # Doesn't bind a local
> list(NAME := EXPR for __ in range(1))          # Doesn't bind a local
>
> {NAME := EXPR}                                 # Binds a local
> {NAME := EXPR for __ in range(1)}              # Doesn't bind a local
> set(NAME := EXPR for __ in range(1))           # Doesn't bind a local
>
> {NAME := EXPR : EXPR2}                         # Binds a local
> {NAME := EXPR : EXPR2 for __ in range(1)}      # Doesn't bind a local
> set((NAME := EXPR, EXPR2) for __ in range(1))  # Doesn't bind a local
>
> Those scoping inconsistencies aren't *new*, but provoking them currently involves either class scopes, or messing about with locals().

I've got an idea about this. The fact is, assignments don't make much sense in an arbitrary part of a comprehension: `for' variables are assigned every iteration, so when the result is returned, only the final value will be seen. (And if you need a value every iteration, just go the explicit way and add it to the returned tuple.)

Contrary to that, the "feeder" expression is only evaluated once at the start -- there, assignments do make sense. Effectively, it's equivalent to an additional line:

    seq = range(calculate_b() as bottom, calculate_t() as top)
    results = [calculate_r(bottom, r, top) for r in seq]

So, I suggest evaluating the "feeder" expression in the local scope, but expressions that are evaluated every iteration in a private scope.
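The asymmetry this idea leans on is observable in current Python: the outermost ("feeder") iterable is evaluated once, up front, while the rest of the comprehension runs once per iteration. A small check:

```python
# Count how many times the feeder expression is evaluated.
calls = []

def feeder():
    calls.append("feeder")
    return range(3)

results = [n * n for n in feeder()]
assert results == [0, 1, 4]
assert calls == ["feeder"]  # the outermost iterable was evaluated exactly once
```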
--
Regards,
Ivan

From devel at baptiste-carvello.net Mon Jun 25 09:55:25 2018
From: devel at baptiste-carvello.net (Baptiste Carvello)
Date: Mon, 25 Jun 2018 15:55:25 +0200
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <5B302990.8060807@canterbury.ac.nz>
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz>
Message-ID: <959ca6f2-18cc-db31-8783-f28b68089140@baptiste-carvello.net>

Not giving a vote, as I'm just a lurker, but:

On 25/06/2018 at 01:30, Greg Ewing wrote:
>
> Actually, I'm closer to -1 on (a) as well. I don't like := as a way of getting assignment in an expression. The only thing I would give a non-negative rating is some form of "where" or "given".

This resonates with me for a yet different reason: expressing the feature with a new operator makes it feel very important and fundamental, so that beginners would feel compelled to learn it early, and old-timers tend to have a strong gut reaction to it. Using merely a keyword makes it less prominent.

Baptiste

From guido at python.org Mon Jun 25 12:27:27 2018
From: guido at python.org (Guido van Rossum)
Date: Mon, 25 Jun 2018 09:27:27 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info>
Message-ID:

[This is my one reply in this thread today. I am trying to limit the amount of time I spend to avoid another overheated escalation.]

On Mon, Jun 25, 2018 at 4:44 AM Nick Coghlan wrote:
> Right, the proposed blunt solution to "Should I use 'NAME = EXPR' or 'NAME := EXPR'?"
> bothers me a bit, but it's the implementation implications of parent local scoping that I fear will create a semantic tar pit we can't get out of later.

Others have remarked this too, but it really bothers me that you are focusing so much on the implementation of parent local scoping rather than on the "intuitive" behavior which is super easy to explain -- especially to someone who isn't all that familiar (or interested) with the implicit scope created for the loop control variable(s). According to Steven (who noticed that this is barely mentioned in most tutorials about comprehensions) that is most people, however very few of them read python-dev.

It's not that much work for the compiler, since it just needs to do a little bit of (new) static analysis and then it can generate the bytecode to manipulate closure(s). The runtime proper doesn't need any new implementation effort. The fact that sometimes a closure must be introduced where no explicit initialization exists is irrelevant to the runtime -- this only affects the static analysis, at runtime it's no different than if the explicit initialization was inside `if 0`.

> Unfortunately, I think the key rationale for (b) is that if you *don't* do something along those lines, then there's a different strange scoping discrepancy that arises between the non-comprehension forms of container displays and the comprehension forms:
>
> (NAME := EXPR,)                                # Binds a local
> tuple(NAME := EXPR for __ in range(1))         # Doesn't bind a local
> [...]
>
> Those scoping inconsistencies aren't *new*, but provoking them currently involves either class scopes, or messing about with locals().

In what sense are they not new? This syntax doesn't exist yet.
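The `if 0` remark above is easy to demonstrate: whether a name is local is decided entirely by the compiler's static analysis, so even a never-executed assignment changes how a lookup behaves at runtime. A sketch:

```python
# The dead assignment below never runs, but it still marks x as a local of
# g() at compile time, so the read raises UnboundLocalError instead of
# finding the global x.
x = "global"

def g():
    if 0:
        x = None  # unreachable, yet it makes x local via static analysis
    return x

def h():
    return x      # no assignment anywhere in h(), so x resolves to the global

try:
    g()
except UnboundLocalError:
    pass  # expected
else:
    raise AssertionError("expected UnboundLocalError")
assert h() == "global"
```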
> The one virtue that choosing this particular set of discrepancies has is that the explanation for why they happen is the same as the explanation for how the iteration variable gets hidden from the containing scope: because "(EXPR for ....)" et al create an implicitly nested scope, and that nested scope behaves the same way as an explicitly nested scope as far as name binding and name resolution is concerned.

Yeah, but most people don't think much about that explanation. You left out another discrepancy, which is more likely to hit people in the face: according to your doctrine, := used in the "outermost iterable" would create a local in the containing scope, since that's where the outermost iterable is evaluated. So in this example

    a = [x := i+1 for i in range(y := 2)]

the scope of x would be the implicit function (i.e. it wouldn't leak) while the scope of y would be the same as that of a. (And there's an even more cryptic example, where the same name is assigned in both places.)

This is another detail of comprehensions that I assume tutorials (rightly, IMO) gloss over because it's so rarely relevant. But it would make the explanation of how := works in comprehensions more cumbersome: you'd have to draw attention to the outermost iterable, otherwise "inline assignment in comprehensions has the same scope as the comprehension's loop control variable(s)" would lead one to believe that y's scope above would also be that of the implicit function.

> Parent local scoping tries to mitigate the surface inconsistency by changing how write semantics are defined for implicitly nested scopes, but that comes at the cost of making those semantics inconsistent with explicitly nested scopes and with the read semantics of implicitly nested scopes.

Nobody thinks about write semantics though -- it's simply not the right abstraction to use here, you've introduced it because that's how *you* think about this.
> The early iterations of PEP 572 tried to duck this whole realm of potential semantic inconsistencies by introducing sublocal scoping instead, such that the scoping for assignment expression targets would be unusual, but they'd be consistently unusual regardless of where they appeared, and their quirks would clearly be the result of how assignment expressions were defined, rather than only showing up in how they interacted with other scoping design decisions made years ago.

There was also another variant in some iteration of PEP 572, after sublocal scopes were already eliminated -- a change to comprehensions that would evaluate the innermost iterable in the implicit function. This would make the explanation of inline assignment in comprehensions consistent again (they were always local to the comprehension in that iteration of the PEP), at the cost of a backward incompatibility that was ultimately withdrawn.

--
--Guido van Rossum (python.org/~guido)

From guido at python.org Mon Jun 25 15:15:17 2018
From: guido at python.org (Guido van Rossum)
Date: Mon, 25 Jun 2018 12:15:17 -0700
Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information
In-Reply-To:
References:
Message-ID:

OK, last call! I'll accept the current draft tomorrow unless someone pushes back.

On Fri, Jun 22, 2018 at 8:37 AM Nick Coghlan wrote:
> On 23 June 2018 at 01:16, Guido van Rossum wrote:
> > That sounds like you're supporting PEP 561 as is, right?
>
> Aye, I'm personally fine with it - we do need to do something about automatically reserving the derived names on PyPI, but I don't think that's a blocker for the initial PEP acceptance (instead, it will go the other way: PEP acceptance will drive Warehouse getting updated to handle the convention already being adopted by the client tools).
> > > Excuse my > > ignorance, but where are API testing stub interfaces described or used? > > They're not - it's just the context for Donald referring to "stubs" as > being a general technical term with other meanings beyond the "type > hinting stub file" one. > > As such, there's three parts to explaining why we're not worried about > the terminology clash: > > - Ethan searched for projects called "*-stubs" or "*_stubs" and didn't > find any, so the practical impact of any terminology clash will be low > - there isn't an established need to automatically find testing stub > libraries based on an existing project name the way there is for type > hints > - even if such a need did arise in the future, the "py.typed" marker > file and the different file extension for stub files within a package > still gives us an enormous amount of design flexibility > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Mon Jun 25 15:37:44 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 25 Jun 2018 15:37:44 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On 6/24/2018 7:25 PM, Guido van Rossum wrote: > I'd wager that the people who might be most horrified about it the (b) scoping rule change > would be people who feel strongly that the change to the > comprehension scope rules in Python 3 is a big improvement, I might not be one of those 'most horrified' by (b), but I increasingly don't like it, and I was at best -0 on the comprehension scope change. To me, iteration variable assignment in the current scope is a non-problem. 
So to me the change was mostly useless churn. Little benefit, little harm. And not worth fighting when others saw a benefit. However, having made the change to nested scopes, I think we should stick with them. Or repeal them. (I believe there is another way to isolate iteration names -- see below). To me, (b) amounts to half repealing the nested scope change, making comprehensions half-fowl, half-fish chimeras. > and who are familiar with the difference in implementation > of comprehensions (though not generator expressions) in Python 2 vs. 3. That I pretty much am, I think. In Python 2, comprehensions (the fish) were, at least in effect, expanded in-line to a normal for loop. Generator expressions (the fowls) were different. They were, and still are, expanded into a temporary generator function whose return value is dropped back into the original namespace. Python 3 turned comprehensions (with 2 new varieties thereof) into fowls also, temporary functions whose return value is dropped back in the original namespace. The result is that a list comprehension is equivalent to list(generator_expression), even though, for efficiency, it is not implemented that way. (To me, this unification is more a benefit than name hiding.) (b) proposes to add extra hidden code in and around the temporary function to partly undo the isolation. list comprehensions would no longer be equivalent to list(generator_expression), unless generator_expressions got the same treatment, in which case they would no longer be equivalent to calling the obvious generator function. Breaking either equivalence might break someone's code. --- How loop variables might be isolated without a nested scope: After a comprehension is parsed, so that names become strings, rename the loop variables to something otherwise illegal. For instance, i could become '<i>', just as lambda becomes '<lambda>' as the name of the resulting function.
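The equivalence and the naming precedent relied on above can be checked directly. A short illustrative sketch (Python 3 semantics):

```python
# Python 3: the comprehension body runs in an implicit function, so the
# loop variable does not leak into the containing scope.
i = 99
squares = [i * i for i in range(4)]
assert squares == [0, 1, 4, 9]
assert i == 99  # unchanged; in Python 2 this would now be 3

# The unification mentioned above: a list comprehension is equivalent
# to calling list() on the corresponding generator expression.
assert [i * i for i in range(4)] == list(i * i for i in range(4))

# Precedent for synthesized, otherwise-illegal names: a lambda's
# function object is named with the illegal identifier '<lambda>'.
assert (lambda: 0).__name__ == "<lambda>"
```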
Expand the comprehension as in Python 2, except for deleting the loop names along with the temporary result name. Assignment expressions within a comprehension would become assignment expressions within the for loop expansion and would automatically add or replace values in the namespace containing the comprehension. In other words, I am suggesting that if we want name expressions in comprehensions to act as they would in Python 2, then we should consider reverting to an altered version of the Python 2 expansion. --- In any case, I think (b) should be a separate PEP linked to a PEP for (a). The decision for (a) could be reject (making (b) moot), accept with (b), or accept unconditionally (but still consider (b)). -- Terry Jan Reedy From rosuav at gmail.com Mon Jun 25 15:42:43 2018 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 26 Jun 2018 05:42:43 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On Tue, Jun 26, 2018 at 5:37 AM, Terry Reedy wrote: > How loop variables might be isolated without a nested scope: After a > comprehension is parsed, so that names become strings, rename the loop > variables to something otherwise illegal. For instance, i could become > '<i>', just as lambda becomes '<lambda>' as the name of the resulting > function. Expand the comprehension as in Python 2, except for deleting the > loop names along with the temporary result name. > > Assignment expressions within a comprehension would become assignment > expressions within the for loop expansion and would automatically add or > replace values in the namespace containing the comprehension.
In other > words, I am suggesting that if we want name expressions in comprehensions to > act as they would in Python 2, then we should consider reverting to an > altered version of the Python 2 expansion. So..... sublocal scopes, like in the earliest versions of PEP 572? The wheel turns round and round, and the same spokes come up. ChrisA From tjreedy at udel.edu Mon Jun 25 15:50:34 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 25 Jun 2018 15:50:34 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On 6/25/2018 8:25 AM, Paul Moore wrote: > On 25 June 2018 at 12:44, Nick Coghlan wrote: >> Unfortunately, I think the key rationale for (b) is that if you >> *don't* do something along those lines, then there's a different >> strange scoping discrepancy that arises between the non-comprehension >> forms of container displays and the comprehension forms: > > I've been mostly ignoring this proposal for a while now, so I'm going > to respond here in the context of someone with a bit of an idea of the > underlying complexities, but otherwise coming at it as a new proposal. > >> >> (NAME := EXPR,) # Binds a local >> tuple(NAME := EXPR for __ in range(1)) # Doesn't bind a local Of course not, in local scopes where it is not executed. But it would, in the nested function where the assignment *is* executed. Ditto for all of the following.
>> [NAME := EXPR] # Binds a local >> [NAME := EXPR for __ in range(1)] # Doesn't bind a local >> list(NAME := EXPR for __ in range(1)) # Doesn't bind a local >> >> {NAME := EXPR} # Binds a local >> {NAME := EXPR for __ in range(1)} # Doesn't bind a local >> set(NAME := EXPR for __ in range(1)) # Doesn't bind a local >> >> {NAME := EXPR : EXPR2} # Binds a local >> {NAME := EXPR : EXPR2 for __ in range(1)} # Doesn't bind a local >> dict((NAME := EXPR, EXPR2) for __ in range(1)) # Doesn't bind a local > > None of those "discrepancies" bother me in the slightest, Me neither. I pretty much agree with the rest of what Paul said. If we don't want comprehensions to execute in a nested scope, then we should not create one. See my response to Guido for a possible alternative. -- Terry Jan Reedy From steve at holdenweb.com Mon Jun 25 17:13:52 2018 From: steve at holdenweb.com (Steve Holden) Date: Mon, 25 Jun 2018 22:13:52 +0100 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On Mon, Jun 25, 2018 at 8:37 PM, Terry Reedy wrote: > On 6/24/2018 7:25 PM, Guido van Rossum wrote: > >> I'd wager that the people who might be most horrified about it >> > > the (b) scoping rule change > > would be people who feel strongly that the change to the >> comprehension scope rules in Python 3 is a big improvement, >> > > I might not be one of those 'most horrified' by (b), but I increasingly > don't like it, and I was at best -0 on the comprehension scope change. To > me, iteration variable assignment in the current scope is a non-problem. > So to me the change was mostly useless churn. Little benefit, little > harm. And not worth fighting when others saw a benefit. > > However, having made the change to nested scopes, I think we should stick > with them.
Or repeal them. (I believe there is another way to isolate > iteration names -- see below). To me, (b) amounts to half repealing the > nested scope change, making comprehensions half-fowl, half-fish chimeras. > [...] > -- > Terry Jan Reedy > > I'd like to ask: how many readers of this email have ever deliberately taken advantage of the limited Python 3 scope in comprehensions and generator expressions to use what would otherwise be a conflicting local variable name? I appreciate that the scope limitation can sidestep accidental naming errors, which is a good thing. Unfortunately, unless we anticipate Python 4 (or whatever) also making for loops have an implicit scope, I am left wondering whether it's not too large a price to pay. After all, special cases aren't special enough to break the rules, and unless the language is headed towards implicit scope for all uses of "for" one could argue that the scope limitation is a special case too far. It certainly threatens to be yet another confusion for learners, and while that isn't the only consideration, it should be given due weight. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericfahlgren at gmail.com Mon Jun 25 18:09:00 2018 From: ericfahlgren at gmail.com (Eric Fahlgren) Date: Mon, 25 Jun 2018 15:09:00 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On Mon, Jun 25, 2018 at 2:16 PM Steve Holden wrote: > I'd like to ask: how many readers of > > this email have ever deliberately taken advantage of the limited Python 3 > scope in comprehensions and generator expressions to use what would > otherwise be a conflicting local variable name?
> No, never, but the opposite has bitten me in production code (as I related several months back, a class-level variable was being used on the lhs of a comprehension and that failed when it was run in Py3). The caveat is that our code base is Py2+Py3, so we have the mindset that comprehension variables always leak. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mertz at gnosis.cx Mon Jun 25 18:16:59 2018 From: mertz at gnosis.cx (David Mertz) Date: Mon, 25 Jun 2018 18:16:59 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On Mon, Jun 25, 2018 at 5:14 PM Steve Holden wrote: > I'd like to ask: how many readers of > > this email have ever deliberately taken advantage of the limited Python 3 > scope in comprehensions and generator expressions to use what would > otherwise be a conflicting local variable name? > I have never once *deliberately* utilized the Python 3 local scoping in comprehensions. There were a few times in Python 2 where I made an error of overwriting a surrounding name by using it in a comprehension, and probably Python 3 has saved me from that a handful of times. Where I ever made such an error, it was with names like 'x' and 'i' and 'n'. They are useful for quick use, but "more important" variables always get more distinctive names anyway. Had the Python 2 behavior remained, I would have been very little inconvenienced; and I suppose comprehensions would have been slightly less "magic" (but less functional-programming). > > > I appreciate that the scope limitation can sidestep accidental naming > errors, which is a good thing.
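The Py2-to-Py3 breakage Eric Fahlgren reports above (a class-level variable used in a comprehension) stems from the same implicit function: the comprehension body cannot see the class scope, although the outermost iterable can. An illustrative sketch, with hypothetical names:

```python
# In Python 3 the comprehension body runs in an implicit function, and
# class bodies are not enclosing scopes for functions, so a class-level
# name used in the body raises NameError (the same code worked in Py2).
def broken_class():
    class Scaled:
        factor = 2
        values = [factor * n for n in range(3)]  # NameError in Python 3
    return Scaled

try:
    broken_class()
    raised = False
except NameError:
    raised = True
assert raised is True

# The outermost iterable, by contrast, IS evaluated in the class scope:
class Ok:
    count = 3
    values = [n * 2 for n in range(count)]  # fine: count is in the iterable

assert Ok.values == [0, 2, 4]
```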
> > Unfortunately, unless we anticipate Python 4 (or whatever) also making for > loops have an implicit scope, I am left wondering whether it's not too > large a price to pay. After all, special cases aren't special enough to > break the rules, and unless the language is headed towards implicit scope > for all uses of "for" one could argue that the scope limitation is a > special case too far. It certainly threatens to be yet another confusion > for learners, and while that isn't the only consideration, it should be > given due weight. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th. -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Mon Jun 25 18:34:47 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 26 Jun 2018 10:34:47 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> Message-ID: <5B316E07.1060305@canterbury.ac.nz> Ivan Pozdeev via Python-Dev wrote: > "as" was suggested even before it became a keyword in `with'.
( if > (re.match(regex,line) as m) is not None: ) That's not equivalent to where/given, though, since it still has the asymmetry problem. -- Greg From greg.ewing at canterbury.ac.nz Mon Jun 25 18:54:12 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 26 Jun 2018 10:54:12 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <5B317294.7070609@canterbury.ac.nz> Terry Reedy wrote: > How loop variables might be isolated without a nested scope: After a > comprehension is parsed, so that names become strings, rename the loop > variables to something otherwise illegal. This doesn't change the situation conceptually, though, since the question arises of why not do the same mangling for names assigned within the comprehension. A decision still needs to be made about whether we *want* semantics that leak some things but not others. -- Greg From greg.ewing at canterbury.ac.nz Mon Jun 25 18:58:03 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 26 Jun 2018 10:58:03 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <5B31737B.5060708@canterbury.ac.nz> Chris Angelico wrote: > The wheel turns round and round, and the same spokes come up. A discussion long past, and a discussion yet to come. There are no beginnings or endings in the Wheel of Python... -- Greg From nad at python.org Tue Jun 26 02:39:35 2018 From: nad at python.org (Ned Deily) Date: Tue, 26 Jun 2018 02:39:35 -0400 Subject: [Python-Dev] 3.7.0 / 3.6.6 Update: all systems go for final releases!
Message-ID: <2EFA992F-1AF8-43D7-A350-8E9FEB6A3002@python.org> A quick update: after many months we are at the finish line. We are on track (mixing metaphors) to release 3.7.0 (and 3.6.6) this week on 2018-06-27. Since 3.7.0rc1 shipped 2 weeks ago, I am aware of only two noteworthy regressions that have been identified and now fixed. Since the issues for both have the potential to impact some (but small) subsets of 3.7.0 users and the fixes for both are straightforward and appear to be low-risk, I am planning to cherry-pick the fixes for them into 3.7.0 final without either another release candidate cycle or waiting for 3.7.1. There may be some doc fixes that get cherry-picked as well. At the moment, there are no plans for any bug cherry-picks for 3.6.6 final. As you know, a new feature release is a big deal and something for all of us to be proud of. A new feature release also has various, mostly minor, impacts to lots of different parts of our development infrastructure: to multiple branches of the cpython repo, to documentation builds, to different parts of the python.org web site, etc. You will start to see some of the changes roll out over the next 24 to 36 hours and it may take some time until everything is in place. So please be patient until the official release announcement goes out before reporting release-related issues. Also be advised that over the same period, there may be a few brief periods where commit access to various cpython branches is blocked in order to do the necessary release engineering. If you run into this, for example when trying to merge a PR, please try again in a few hours. Thanks and more later! https://bugs.python.org/issue33851 https://bugs.python.org/issue33932 -- Ned Deily nad at python.org -- [] From eric at trueblade.com Tue Jun 26 03:31:52 2018 From: eric at trueblade.com (Eric V. Smith) Date: Tue, 26 Jun 2018 03:31:52 -0400 Subject: [Python-Dev] [python-committers] 3.7.0 / 3.6.6 Update: all systems go for final releases! 
In-Reply-To: <2EFA992F-1AF8-43D7-A350-8E9FEB6A3002@python.org> References: <2EFA992F-1AF8-43D7-A350-8E9FEB6A3002@python.org> Message-ID: <5BA88066-C9F6-4ECF-86CC-4E6F1A4FD483@trueblade.com> Congrats, Ned. Thank you for all of your hard work! -- Eric > On Jun 26, 2018, at 2:39 AM, Ned Deily wrote: > > A quick update: after many months we are at the finish line. We are on > track (mixing metaphors) to release 3.7.0 (and 3.6.6) this week on > 2018-06-27. Since 3.7.0rc1 shipped 2 weeks ago, I am aware of only two > noteworthy regressions that have been identified and now fixed. Since > the issues for both have the potential to impact some (but small) > subsets of 3.7.0 users and the fixes for both are straightforward and > appear to be low-risk, I am planning to cherry-pick the fixes for them > into 3.7.0 final without either another release candidate cycle or > waiting for 3.7.1. There may be some doc fixes that get cherry-picked > as well. At the moment, there are no plans for any bug cherry-picks for > 3.6.6 final. > > As you know, a new feature release is a big deal and something for all > of us to be proud of. A new feature release also has various, mostly > minor, impacts to lots of different parts of our development > infrastructure: to multiple branches of the cpython repo, to > documentation builds, to different parts of the python.org web site, > etc. You will start to see some of the changes roll out over the next 24 > to 36 hours and it may take some time until everything is in place. > So please be patient until the official release announcement goes out > before reporting release-related issues. Also be advised that over the > same period, there may be a few brief periods where commit access to > various cpython branches is blocked in order to do the necessary > release engineering. If you run into this, for example when trying to > merge a PR, please try again in a few hours. > > Thanks and more later! 
> > https://bugs.python.org/issue33851 > https://bugs.python.org/issue33932 > > -- > Ned Deily > nad at python.org -- [] > > _______________________________________________ > python-committers mailing list > python-committers at python.org > https://mail.python.org/mailman/listinfo/python-committers > Code of Conduct: https://www.python.org/psf/codeofconduct/ From vano at mail.mipt.ru Tue Jun 26 03:41:04 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 10:41:04 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On 26.06.2018 0:13, Steve Holden wrote: > On Mon, Jun 25, 2018 at 8:37 PM, Terry Reedy > wrote: > > On 6/24/2018 7:25 PM, Guido van Rossum wrote: > > I'd wager that the people who might be most horrified about it > > > the (b) scoping rule change > > would be people who feel strongly that the change to the > comprehension scope rules in Python 3 is a big improvement, > > > I might not be one of those 'most horrified' by (b), but I > increasingly don't like it, and I was at best -0 on the > comprehension scope change. To me, iteration variable assignment > in the current scope is a non-problem. So to me the change was > mostly useless churn. Little benefit, little harm. And not worth > fighting when others saw a benefit. > > However, having made the change to nested scopes, I think we > should stick with them. Or repeal them. (I believe there is > another way to isolate iteration names -- see below). To me, (b) > amounts to half repealing the nested scope change, making > comprehensions half-fowl, half-fish chimeras. > > [...] > > -- > Terry Jan Reedy > > I'd like to ask: how many readers of
> this email have ever deliberately taken advantage of the limited > Python 3 scope in comprehensions and generator expressions to use what > would otherwise be a conflicting local variable name? I did: for l in (l.rstrip() for l in f): The provisional unstripped line variable is totally unneeded in the following code. > > I appreciate that the scope limitation can sidestep accidental naming > errors, which is a good thing. > > Unfortunately, unless we anticipate Python 4 (or whatever) also making > for loops have an implicit scope, I am left wondering whether it's not > too large a price to pay. After all, special cases aren't special > enough to break the rules, and unless the language is headed towards > implicit scope for all uses of "for" one could argue that the scope > limitation is a special case too far. It certainly threatens to be yet > another confusion for learners, and while that isn't the only > consideration, it should be given due weight. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From vano at mail.mipt.ru Tue Jun 26 04:08:43 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 11:08:43 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B31737B.5060708@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B31737B.5060708@canterbury.ac.nz> Message-ID: <386ee318-72de-f084-74b0-6e445f7b4b60@mail.mipt.ru> On 26.06.2018 1:58, Greg Ewing wrote: > Chris Angelico wrote: > >> The wheel turns round and round, and the same spokes come up.
> Unless there's a repository of prior discussion, no-one can be bothered to gather scraps from around the Net. Wikis solve this by keeping all the discussion in one place, and even they struggle if there are multiple. > A discussion long past, and a discussion yet to come. > > There are no beginnings or endings in the Wheel of Python... > -- Regards, Ivan From J.Demeyer at UGent.be Tue Jun 26 05:00:48 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 11:00:48 +0200 Subject: [Python-Dev] Policy on refactoring/clean up Message-ID: <5B3200C0.7090307@UGent.be> Hello, On https://github.com/python/cpython/pull/7909 I encountered friction for a PR which I expected to be uncontroversial: it just moves some code without changing any functionality. So basically my question is: is there some CPython policy *against* refactoring code to make it easier to read and write? (Note that I'm not talking about pure style issues here) Background: cpython has a source file "call.c" (introduced in https://github.com/python/cpython/pull/12) but the corresponding declarations are split over several .h files. While working on PEP 580, I found this slightly confusing. I decided that it would make more sense to group all these declarations in a new file "call.h". That's what PR 7909 does. In my opinion, the resulting code is easier to read. It also defines a clear place for declarations of future functionality added to "call.c" (for example, if we add a public API for FASTCALL). Finally, I added/clarified a few comments. I expected the PR to be either ignored or accepted. However, I received a negative reaction from Inada Naoki on it. I don't mind closing the PR and keeping the status quo if there is a general agreement. However, I'm afraid that a future reviewer of PEP 580 might say "your includes are a mess" and he will be right.
From songofacandy at gmail.com Tue Jun 26 05:11:16 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jun 2018 18:11:16 +0900 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B3200C0.7090307@UGent.be> References: <5B3200C0.7090307@UGent.be> Message-ID: FYI, I'm not against general refactoring when I agree it really makes the code cleaner and more readable. I'm against your PR because I didn't feel it really makes the code cleaner or more readable. I already commented about it on the PR. https://github.com/python/cpython/pull/7909#issuecomment-400219905 So it's not a problem of general policy about refactoring / clean up. It's just my preference. If Victor and Serhiy prefer the PR, I'm OK to merge it. Regards, -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Tue Jun 26 07:07:44 2018 From: storchaka at gmail.com (Serhiy Storchaka) Date: Tue, 26 Jun 2018 14:07:44 +0300 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: References: <5B3200C0.7090307@UGent.be> Message-ID: 26.06.18 12:11, INADA Naoki writes: > FYI, I'm not against general refactoring when I agree it really makes the > code cleaner and more readable. > > I'm against your PR because I didn't feel it really makes the code cleaner > or more readable. > I already commented about it on the PR. > https://github.com/python/cpython/pull/7909#issuecomment-400219905 > > So it's not a problem of general policy about refactoring / clean up. > It's just my preference. If Victor and Serhiy prefer the PR, I'm OK to > merge it. I'm with Inada.
From vano at mail.mipt.ru Tue Jun 26 07:11:14 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 14:11:14 +0300 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B3200C0.7090307@UGent.be> References: <5B3200C0.7090307@UGent.be> Message-ID: <92a4ddf3-15ab-b30f-df1f-f0f1d45cad98@mail.mipt.ru> On 26.06.2018 12:00, Jeroen Demeyer wrote: > Hello, > > On https://github.com/python/cpython/pull/7909 I encountered friction > for a PR which I expected to be uncontroversial: it just moves some > code without changing any functionality. > > So basically my question is: is there some CPython policy *against* > refactoring code to make it easier to read and write? (Note that I'm > not talking about pure style issues here) > > Background: cpython has a source file "call.c" (introduced in > https://github.com/python/cpython/pull/12) but the corresponding > declarations are split over several .h files. While working on PEP > 580, I found this slightly confusing. I decided that it would make > more sense to group all these declarations in a new file "call.h". > That's what PR 7909 does. In my opinion, the resulting code is easier > to read. It also defines a clear place for declarations of future > functionality added to "call.c" (for example, if we add a public API > for FASTCALL). Finally, I added/clarified a few comments. > > I expected the PR to be either ignored or accepted. However, I > received a negative reaction from Inada Naoki on it. > > I don't mind closing the PR and keeping the status quo if there is a > general agreement. However, I'm afraid that a future reviewer of PEP > 580 might say "your includes are a mess" and he will be right. > AFAICS, your PR is not a strict improvement, that's the reason for the "friction". You may suggest it as a supplemental PR to PEP 580. Or even a part of it, but since the changes are controversial, better make the refactorings into separate commits so they can be rolled back separately if needed. 
> > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan From vstinner at redhat.com Tue Jun 26 07:31:32 2018 From: vstinner at redhat.com (Victor Stinner) Date: Tue, 26 Jun 2018 13:31:32 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: References: <5B3200C0.7090307@UGent.be> Message-ID: Hi, I created call.c when I worked on optimizations. I saw that performance depends on code locality and that compilers are more eager to inline code when it's in the same file. Moreover, call.c now contains some private functions like function_code_fastcall() or _PyObject_CallFunctionVa() which are called from other public functions. Without call.c, some of these functions would have to be made "public" (exposed), which also has an impact on performance. Putting all these functions in the same file helps the compiler to produce more efficient code, but should also prevent inconsistencies when we modify a calling convention. I'm not sure of the rationale behind the proposed call.h header. Is it difficult to maintain the actual definitions in multiple header files? I'm used to them, it's easy to discover them, so *I* am not really convinced that call.h adds any value. I also expect fewer changes in headers than in the implementation (call.c). Victor 2018-06-26 11:11 GMT+02:00 INADA Naoki : > FYI, I'm not against general refactoring when I agree it really makes the code > cleaner and more readable. > > I'm against your PR because I didn't feel it really makes the code cleaner > or more readable. > I already commented about it on the PR. > https://github.com/python/cpython/pull/7909#issuecomment-400219905 > > So it's not a problem of general policy about refactoring / clean up. > It's just my preference. If Victor and Serhiy prefer the PR, I'm OK to merge it.
> > Regards, > > -- > INADA Naoki > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com > From J.Demeyer at UGent.be Tue Jun 26 07:43:32 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 13:43:32 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: References: <5B3200C0.7090307@UGent.be> Message-ID: <5B3226E4.8090107@UGent.be> On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: > AFAICS, your PR is not a strict improvement What does "strict improvement" even mean? Many changes are not strict improvements, but still useful to have. Inada pointed me to YAGNI (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I disagree with that premise: there is a large gray zone between "completely useless" and "really needed". My PR falls in that gap of "nice to have but we can do without it". > You may suggest it as a supplemental PR to PEP 580. Or even a part of > it, but since the changes are controversial, better make the > refactorings into separate commits so they can be rolled back separately > if needed. If those refactorings are rejected now, won't they be rejected as part of PEP 580 also? From vano at mail.mipt.ru Tue Jun 26 07:54:33 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 14:54:33 +0300 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B3226E4.8090107@UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> Message-ID: <055c6d1a-d8dd-74a5-3eaf-d28251cab448@mail.mipt.ru> On 26.06.2018 14:43, Jeroen Demeyer wrote: > On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: >> AFAICS, your PR is not a strict improvement > > What does "strict improvement" even mean? Many changes are not strict > improvements, but still useful to have. 
> > Inada pointed me to YAGNI > (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I > disagree with that premise: there is a large gray zone between > "completely useless" and "really needed". My PR falls in that gap of > "nice to have but we can do without it". > >> You may suggest it as a supplemental PR to PEP 580. Or even a part of >> it, but since the changes are controversial, better make the >> refactorings into separate commits so they can be rolled back separately >> if needed. > > If those refactorings are rejected now, won't they be rejected as part > of PEP 580 also? This is exactly what the YAGNI principle is about, and Inada was right to point to it. Until you have an immediate practical need for something, you don't really know the shape and form for it that you will be the most comfortable with. Thus any "would be nice to have" tinkerings are essentially a waste of time and possibly a degradation, too: you'll very likely have to change them again when the real need arises -- while having to live with any drawbacks in the meantime. So, if you suggest those changes together with the PEP 580 PR, they will be reviewed through the prism of the new codebase and its needs, which are different from the current codebase and its needs.
-- Regards, Ivan From songofacandy at gmail.com Tue Jun 26 07:54:49 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jun 2018 20:54:49 +0900 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B3226E4.8090107@UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> Message-ID: On Tue, Jun 26, 2018 at 8:46 PM Jeroen Demeyer wrote: > On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: > > AFAICS, your PR is not a strict improvement > > What does "strict improvement" even mean? Many changes are not strict > improvements, but still useful to have. > > Inada pointed me to YAGNI > No, YAGNI was posted by someone and they removed their comment. My point was: Moving code around makes it: > > - hard to track history. > > > - hard to backport patches to old branches. > > https://github.com/python/cpython/pull/7909#issuecomment-400219905 And I prefer keeping definitions relating to methods in methodobject.h to moving them to call.h only because they're used/implemented in call.c > (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I > disagree with that premise: there is a large gray zone between > "completely useless" and "really needed". My PR falls in that gap of > "nice to have but we can do without it". > > So I didn't think it is even "nice to have". > > You may suggest it as a supplemental PR to PEP 580. Or even a part of > > it, but since the changes are controversial, better make the > > refactorings into separate commits so they can be rolled back separately > > if needed. > > If those refactorings are rejected now, won't they be rejected as part > of PEP 580 also? > Real need is more important than my preference. If it is needed by PEP 580, I'm OK.
But I didn't know which part of the PR is required by PEP 580. Regards, -- INADA Naoki From J.Demeyer at UGent.be Tue Jun 26 07:56:54 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 13:56:54 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <005390857ee44516b04d972ecd9f203d@xmail101.UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> <005390857ee44516b04d972ecd9f203d@xmail101.UGent.be> Message-ID: <5B322A06.1000607@UGent.be> On 2018-06-26 13:54, INADA Naoki wrote: > No, YAGNI was posted by someone and they removed their comment. Sorry for that, I misunderstood the email that GitHub sent me. From vano at mail.mipt.ru Tue Jun 26 08:02:17 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 15:02:17 +0300 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> Message-ID: <612f43ef-20c0-122b-d728-4a8c0e6f191e@mail.mipt.ru> On 26.06.2018 14:54, INADA Naoki wrote: > > On Tue, Jun 26, 2018 at 8:46 PM Jeroen Demeyer > wrote: > > On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: > > AFAICS, your PR is not a strict improvement > > What does "strict improvement" even mean? Many changes are not strict > improvements, but still useful to have. > > Inada pointed me to YAGNI > > > No, YAGNI was posted by someone and they removed their comment. Yes, that was me instead. I posted it and then changed my mind. Apparently, notifications were sent nonetheless. I didn't watch the thread and kinda assumed that you pointed that out, too. (Just to put everything straight and not make anyone suspect I'm trying to pull the wool over anyone's eyes here.) > > My point was: > > Moving code around makes: > > * hard to track history. > > * hard to backport patches to old branches.
> > https://github.com/python/cpython/pull/7909#issuecomment-400219905 > > And I prefer keeping definitions relating to methods in methodobject.h to > moving them to call.h only because they're used/implemented in call.c > > (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I > disagree with that premise: there is a large gray zone between > "completely useless" and "really needed". My PR falls in that gap of > "nice to have but we can do without it". > > > So I didn't think it is even "nice to have". > > > You may suggest it as a supplemental PR to PEP 580. Or even a > part of > > it, but since the changes are controversial, better make the > > refactorings into separate commits so they can be rolled back > separately > > if needed. > > If those refactorings are rejected now, won't they be rejected as > part > of PEP 580 also? > > > Real need is more important than my preference. If it is needed by PEP 580, > I'm OK. > But I didn't know which part of the PR is required by PEP 580. > > Regards, > > -- > INADA Naoki -- Regards, Ivan From solipsis at pitrou.net Tue Jun 26 08:12:13 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Tue, 26 Jun 2018 14:12:13 +0200 Subject: [Python-Dev] Policy on refactoring/clean up References: <5B3200C0.7090307@UGent.be> Message-ID: <20180626141213.7b4e2f20@fsol> On Tue, 26 Jun 2018 11:00:48 +0200 Jeroen Demeyer wrote: > Hello, > > On https://github.com/python/cpython/pull/7909 I encountered friction > for a PR which I expected to be uncontroversial: it just moves some code > without changing any functionality.
> > So basically my question is: is there some CPython policy *against* > refactoring code to make it easier to read and write? (Note that I'm not > talking about pure style issues here) I think refactorings are generally ok, assuming they bring a substantial maintainability or readability benefit. I haven't studied your PR enough to decide whether that's the case for the changes you are proposing, though. Regards Antoine. From J.Demeyer at UGent.be Tue Jun 26 08:13:49 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 14:13:49 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <2d24772c149b450a95852b158575a57a@xmail101.UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> <2d24772c149b450a95852b158575a57a@xmail101.UGent.be> Message-ID: <5B322DFD.2040000@UGent.be> On 2018-06-26 13:54, Ivan Pozdeev via Python-Dev wrote: > This is exactly what that the YAGNI principle is about, and Inada was > right to point to it. Until you have an immediate practical need for > something, you don't really know the shape and form for it that you will > be the most comfortable with. Thus any "would be nice to have" > tinkerings are essentially a waste of time and possibly a degradation, > too: you'll very likely have to change them again when the real need > arises -- while having to live with any drawbacks in the meantime. It is important to clarify that this is exactly what I did. I *have* an implementation of PEP 580 and it's based on that PR 7909. I just think that this PR makes sense independently of whether PEP 580 will be accepted. > So, if you suggest those changes together with the PEP 580 PR That sounds like a bad idea because that would be mixing two issues in one PR. If I want to increase my chances of getting PEP 580 and its implementation accepted, I shouldn't bring in unrelated changes. 
To put it in a different perspective: if somebody else would make a PR to one of my projects doing a refactoring and adding new features, I would ask them to split it up. From J.Demeyer at UGent.be Tue Jun 26 08:14:09 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 14:14:09 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <005390857ee44516b04d972ecd9f203d@xmail101.UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> <005390857ee44516b04d972ecd9f203d@xmail101.UGent.be> Message-ID: <5B322E11.3050903@UGent.be> On 2018-06-26 13:54, INADA Naoki wrote: > Real need is important than my preference. If it is needed PEP 580, I'm OK. Of course it's not needed. I never claimed that it was. I think it's *nice to have* right now and slightly more *nice to have* when changes/additions are made to call.c in the future. From encukou at gmail.com Tue Jun 26 08:20:49 2018 From: encukou at gmail.com (Petr Viktorin) Date: Tue, 26 Jun 2018 14:20:49 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B322DFD.2040000@UGent.be> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> <2d24772c149b450a95852b158575a57a@xmail101.UGent.be> <5B322DFD.2040000@UGent.be> Message-ID: On 06/26/18 14:13, Jeroen Demeyer wrote: > On 2018-06-26 13:54, Ivan Pozdeev via Python-Dev wrote: >> This is exactly what that the YAGNI principle is about, and Inada was >> right to point to it. Until you have an immediate practical need for >> something, you don't really know the shape and form for it that you will >> be the most comfortable with. Thus any "would be nice to have" >> tinkerings are essentially a waste of time and possibly a degradation, >> too: you'll very likely have to change them again when the real need >> arises -- while having to live with any drawbacks in the meantime. > > It is important to clarify that this is exactly what I did. 
I *have* an > implementation of PEP 580 and it's based on that PR 7909. > > I just think that this PR makes sense independently of whether PEP 580 > will be accepted. > >> So, if you suggest those changes together with the PEP 580 PR > > That sounds like a bad idea because that would be mixing two issues in > one PR. If I want to increase my chances of getting PEP 580 and its > implementation accepted, I shouldn't bring in unrelated changes. > > To put it in a different perspective: if somebody else would make a PR > to one of my projects doing a refactoring and adding new features, I > would ask them to split it up. Actually, that's exactly what we *did* ask Jeroen with his earlier proposal for PEP 575, where the implementation ended up being quite big. Split the changes to make it more manageable. Unfortunately I haven't had time to study this PR yet (work is taking all my time lately), but I trust that Jeroen will propose actual improvements on top of the clean-up. From vano at mail.mipt.ru Tue Jun 26 08:25:29 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Tue, 26 Jun 2018 15:25:29 +0300 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <055c6d1a-d8dd-74a5-3eaf-d28251cab448@mail.mipt.ru> References: <5B3200C0.7090307@UGent.be> <5B3226E4.8090107@UGent.be> <055c6d1a-d8dd-74a5-3eaf-d28251cab448@mail.mipt.ru> Message-ID: On 26.06.2018 14:54, Ivan Pozdeev via Python-Dev wrote: > On 26.06.2018 14:43, Jeroen Demeyer wrote: >> On 2018-06-26 13:11, Ivan Pozdeev via Python-Dev wrote: >>> AFAICS, your PR is not a strict improvement >> >> What does "strict improvement" even mean? Many changes are not strict >> improvements, but still useful to have. >> >> Inada pointed me to YAGNI >> (https://en.wikipedia.org/wiki/You_aren%27t_gonna_need_it) but I >> disagree with that premise: there is a large gray zone between >> "completely useless" and "really needed". My PR falls in that gap of >> "nice to have but we can do without it". 
>> >>> You may suggest it as a supplemental PR to PEP 580. Or even a part of >>> it, but since the changes are controversial, better make the >>> refactorings into separate commits so they can be rolled back >>> separately >>> if needed. >> >> If those refactorings are rejected now, won't they be rejected as >> part of PEP 580 also? > > This is exactly what that the YAGNI principle is about, and Inada was > right to point to it. Strike this part out since he didn't actually say that as it turned out. > Until you have an immediate practical need for something, you don't > really know the shape and form for it that you will be the most > comfortable with. Thus any "would be nice to have" tinkerings are > essentially a waste of time and possibly a degradation, too: you'll > very likely have to change them again when the real need arises -- > while having to live with any drawbacks in the meantime. > > So, if you suggest those changes together with the PEP 580 PR, they > will be reviewed through the prism of the new codebase and its needs, > which are different from the current codebase and its needs. > >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru > -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From dimaqq at gmail.com Tue Jun 26 08:30:40 2018 From: dimaqq at gmail.com (Dima Tisnek) Date: Tue, 26 Jun 2018 20:30:40 +0800 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: This is awesome, thank you Ammar! On 23 June 2018 at 06:21, Brett Cannon wrote: > Thanks to a PR from Ammar Askar we now run Python under lcov as part of > the code coverage build. 
And thanks to codecov.io automatically merging > code coverage reports we get a complete report of our coverage (the first > results of which can now be seen at https://codecov.io/gh/python/cpython). > > And funny enough the coverage average changed less than 1%. :) From mark at hotpy.org Tue Jun 26 15:43:59 2018 From: mark at hotpy.org (Mark Shannon) Date: Tue, 26 Jun 2018 20:43:59 +0100 Subject: [Python-Dev] PEP 576 Message-ID: Hi all, Just a reminder that PEP 576 still exists as a lightweight alternative to PEP 575/580. It achieves the same goals as PEP 580 but is much smaller. https://github.com/markshannon/pep-576 Unless there is a big rush, I would like to do some experiments as to whether the new calling convention should be typedef PyObject *(*callptr)(PyObject *func, PyObject *const *stack, Py_ssize_t nargs, PyObject *kwnames); or whether the increased generality of: typedef PyObject *(*callptr)(PyObject *func, PyObject *const *stack, Py_ssize_t nargs, PyObject *kwnames, PyTupleObject *starargs, PyObject *kwdict); is a worthwhile enhancement. An implementation can be found here: https://github.com/python/cpython/compare/master...markshannon:pep-576-minimal Cheers, Mark.
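For readers following along, the difference between Mark's two candidate conventions can be modeled in pure Python. The helper names and unpacking rules below are illustrative stand-ins, not the real C API or its exact semantics: the point is only that the second convention additionally threads through an already-materialized *args tuple and **kwargs dict.

```python
# Hypothetical pure-Python stand-ins for the two C calling conventions
# quoted above (illustrative only, not the actual CPython API).

def call_fastcall(func, stack, kwnames):
    """First convention: positional args as a flat sequence, with the
    keyword values at the end of the stack, named by kwnames."""
    nkw = len(kwnames)
    args = stack[:len(stack) - nkw]
    kwargs = dict(zip(kwnames, stack[len(stack) - nkw:]))
    return func(*args, **kwargs)

def call_general(func, stack, kwnames, starargs, kwdict):
    """Second convention: also passes through a pre-built *args tuple
    and **kwargs dict unchanged, avoiding re-packing by the caller."""
    nkw = len(kwnames)
    args = list(stack[:len(stack) - nkw]) + list(starargs)
    kwargs = dict(zip(kwnames, stack[len(stack) - nkw:]))
    kwargs.update(kwdict)
    return func(*args, **kwargs)

def f(a, b, c=0, d=0):
    return (a, b, c, d)

assert call_fastcall(f, [1, 2, 3], ["d"]) == (1, 2, 0, 3)
assert call_general(f, [1], [], (2,), {"c": 9}) == (1, 2, 9, 0)
```

The sketch makes the trade-off visible: the simpler form forces callers with existing tuple/dict arguments to flatten them, while the general form carries two extra parameters on every call.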
From J.Demeyer at UGent.be Tue Jun 26 16:57:01 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Tue, 26 Jun 2018 22:57:01 +0200 Subject: [Python-Dev] PEP 576 In-Reply-To: <63a22544a9fb4285b361ea1b042c40b8@xmail101.UGent.be> References: <63a22544a9fb4285b361ea1b042c40b8@xmail101.UGent.be> Message-ID: <5B32A89D.6000103@UGent.be> On 2018-06-26 21:43, Mark Shannon wrote: > https://github.com/markshannon/pep-576 Your version of PEP 576 looks very different from the "official" PEP 576 at https://www.python.org/dev/peps/pep-0576/ So can you please make a pull request to https://github.com/python/peps/pulls Also feel free to add pointers to your PEP on PEP 579. Jeroen. From guido at python.org Tue Jun 26 18:02:21 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 26 Jun 2018 15:02:21 -0700 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B3200C0.7090307@UGent.be> References: <5B3200C0.7090307@UGent.be> Message-ID: I know there was a big follow-up already, but I'd like to point out that (while clearly not everyone feels the same) I am personally inclined to set the bar pretty high for refactorings that don't add functionality. It makes crawling through history using e.g. git blame harder, since the person who last refactored the code ends up owning it even though they weren't responsible for all its intricacies (which might separately be blame-able on many different commits). And TBH a desire to refactor a lot of code is often a sign of a relatively new contributor who hasn't learned their way around the code yet, so they tend to want to make the code follow their understanding rather than letting their understanding follow the code.
Also see https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence On Tue, Jun 26, 2018 at 2:03 AM Jeroen Demeyer wrote: > Hello, > > On https://github.com/python/cpython/pull/7909 I encountered friction > for a PR which I expected to be uncontroversial: it just moves some code > without changing any functionality. > > So basically my question is: is there some CPython policy *against* > refactoring code to make it easier to read and write? (Note that I'm not > talking about pure style issues here) > > Background: cpython has a source file "call.c" (introduced in > https://github.com/python/cpython/pull/12) but the corresponding > declarations are split over several .h files. While working on PEP 580, > I found this slightly confusing. I decided that it would make more sense > to group all these declarations in a new file "call.h". That's what PR > 7909 does. In my opinion, the resulting code is easier to read. It also > defines a clear place for declarations of future functionality added to > "call.c" (for example, if we add a public API for FASTCALL). Finally, I > added/clarified a few comments. > > I expected the PR to be either ignored or accepted. However, I received > a negative reaction from Inada Naoki on it. > > I don't mind closing the PR and keeping the status quo if there is a > general agreement. However, I'm afraid that a future reviewer of PEP 580 > might say "your includes are a mess" and he will be right. > > > Jeroen. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve at pearwood.info Tue Jun 26 21:42:19 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 11:42:19 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B302990.8060807@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> Message-ID: <20180627014217.GX14437@ando.pearwood.info> On Mon, Jun 25, 2018 at 11:30:24AM +1200, Greg Ewing wrote: > Guido van Rossum wrote: > >Greg seem to be +0 or better for (a) > > Actually, I'm closer to -1 on (a) as well. I don't like := as a > way of getting assignment in an expression. The only thing I would > give a non-negative rating is some form of "where" or "given". > > Brief summary of reasons for disliking ":=": > > * Cryptic use of punctuation ":=" is the second most common syntax used for assignment in common programming languages, not just Pascal. Even modern languages like Go use it. If that's "cryptic", what word would you use to describe @decorator syntax? *wink* Honestly Greg, can you put your hand on your heart and swear that if you came across "name := expression" in source code you wouldn't be able to hazard a guess as the meaning of the := operator? > * Too much overlap in functionality with "=" If you are willing to consider a non-negative rating under the "given/willing"spelling, presumably the "overlap in functionality" isn't that important. (Otherwise it would be an argument against the feature *regardless of spelling*.) So why should it be an argument against the := spelling? > * Asymmetry between first and subsequent uses of the bound value I don't know what this means. > * Makes expressions cluttered and hard to read to my eyes And Nick's more verbose "given" proposal makes expressions less cluttered? 
result = process(first=(spam := ham or eggs), second=spam*5) result = process(first=(spam given spam = ham or eggs), second=spam*5) The := spelling has three syntactic elements: the target name, the := operator itself, and the expression being assigned. The syntax you are willing to consider has five elements: an arbitrarily complex return expression, the keyword "given", the target name, the = operator, and the expression being assigned. It isn't rational to say that adding extra complexity and more syntactic elements *reduces* clutter. At the minimum, Nick's syntax requires: - an extra keyword ("given" or "where") - a repetitive, redundant, repeated use of the target name just to save one : character. That adds, not subtracts, clutter. Aside from the asymmetry issue (which I don't understand) it seems that most of your arguments against := apply equally, or even more strongly, to the "expr given name = expr" version. I know matters of taste are deeply subjective, but we ought to distinguish between *subjective* and *irrational* reasons for disliking proposed features, and try to resist the irrational ones: "We should change the spelling of set.add to set.append, as that will remove the troublesome double-letter, and reduce typing." 
*wink* -- Steve From guido at python.org Tue Jun 26 22:36:14 2018 From: guido at python.org (Guido van Rossum) Date: Tue, 26 Jun 2018 19:36:14 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: [This is my one response today] On Mon, Jun 25, 2018 at 12:40 PM Terry Reedy wrote: > On 6/24/2018 7:25 PM, Guido van Rossum wrote: > > I'd wager that the people who might be most horrified about it > > the (b) scoping rule change > > > would be people who feel strongly that the change to the > > comprehension scope rules in Python 3 is a big improvement, > > I might not be one of those 'most horrified' by (b), but I increasingly > don't like it, and I was at best -0 on the comprehension scope change. > To me, iteration variable assignment in the current scope is a > non-problem. So to me the change was mostly useless churn. Little > benefit, little harm. And not worth fighting when others saw a benefit. > Fair enough, and by itself this might not have been enough reason to make the change. But see below. > However, having made the change to nested scopes, I think we should > stick with them. Or repeal them. (I believe there is another way to > isolate iteration names -- see below). To me, (b) amounts to half > repealing the nested scope change, making comprehensions half-fowl, > half-fish chimeras. > That depends on how you see it -- to me (b) just means that there's an implicit nonlocal[1] to make the assignment have the (desirable) side-effect. The key thing to consider here is whether that side-effect is in fact desirable. For me, the side-effect of the comprehension's loop control variable was never desirable -- it was just an implementation detail leaking out. 
(And that's different from leaking a regular for-loop's control variable -- since we have 'break' (and 'else') there are some legitimate use cases. But comprehensions try to be expressions, and here the side effect is at best useless and at worst a nasty surprise.) > > and who are familiar with the difference in implementation > of comprehensions (though not generator expressions) in Python 2 vs. 3. > > That I pretty much am, I think. In Python 2, comprehensions (the fish) > were, at least in effect, expanded in-line to a normal for loop. > Generator expressions (the fowls) were different. They were, and still > are, expanded into a temporary generator function whose return value is > dropped back into the original namespace. Python 3 turned > comprehensions (with 2 new varieties thereof) into fowls also, > temporary functions whose return value is dropped back in the original > namespace. The result is that a list comprehension is equivalent to > list(generator_expression), even though, for efficiency, it is not > implemented that way. (To me, this unification is more a benefit than > name hiding.) > Right, and this consistency convinced me that the change was worth it. I just really like to be able to say "[... for ...]" is equivalent to "list(... for ...)", and similar for set and dict. > (b) proposes to add extra hidden code in and around the temporary > function to partly undo the isolation. But it just adds a nonlocal declaration. There's always some hidden code ('def' and 'return' at the very least). > list comprehensions would no > longer be equivalent to list(generator_expression), unless > generator_expressions got the same treatment, in which case they would > no longer be equivalent to calling the obvious generator function. > Breaking either equivalence might break someone's code. > Ah, there's the rub! I should probably apologize for not clarifying my terminology more. In the context of PEP 572, when I say "comprehensions" I include generators!
PEP 572 states this explicitly ( https://github.com/python/peps/blame/master/pep-0572.rst#L201-L202). Certainly PEP 572 intends to add that implicit nonlocal to both comprehensions and generator expressions. (I just got really tired of writing that phrase over and over, and at some point I forgot that this is only a parenthetical remark added in the PEP's latest revision, and not conventional terminology -- alas. :-) Part (b) of PEP 572 does several things to *retain* consistency: - The target of := lives in the same scope regardless of whether it occurs in a comprehension, a generator expression, or just in some other expression. - When it occurs in a comprehension or generator expression, the scope is the same regardless of whether it occurs in the "outermost iterable" or not. If we didn't have (b) the target would live in the comprehension/genexpr scope if it occurred in a comprehension/genexp but outside its "outermost iterable", and in the surrounding scope otherwise. > --- > > How loop variables might be isolated without a nested scope: After a > comprehension is parsed, so that names become strings, rename the loop > variables to something otherwise illegal. For instance, i could become > '<i>', just as lambda becomes '<lambda>' as the name of the resulting > function. Expand the comprehension as in Python 2, except for deleting > the loop names along with the temporary result name. > > Assignment expressions within a comprehension would become assignment > expressions within the for loop expansion and would automatically add or > replace values in the namespace containing the comprehension. In other > words, I am suggesting that if we want name expressions in > comprehensions to act as they would in Python 2, then we should consider > reverting to an altered version of the Python 2 expansion. > Possibly this is based on a misunderstanding of my use of "comprehensions".
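The part (b) behavior under discussion can be illustrated with a runnable sketch, using the semantics that eventually shipped in Python 3.8: the comprehension's loop variable stays isolated, while a := target escapes to the surrounding scope via the implicit nonlocal (or global) described above.

```python
# Sketch of part (b): loop variables stay isolated, := targets escape.
x = "outer"
squares = [x * x for x in range(4)]
assert squares == [0, 1, 4, 9]
assert x == "outer"  # the loop variable did not leak (Python 3 behavior)

total = 0
running = [total := total + n for n in range(5)]
assert running == [0, 1, 3, 6, 10]
assert total == 10   # the := target *did* escape the comprehension
```

Note that both lines behave the same whether written as a list comprehension or as list(...) around a generator expression, which is the consistency being argued for.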
Also, since your trick can only be used for list/set/dict comprehensions, but not for generator expressions (at least I assume you don't want it there) it would actually *reduce* consistency between list/set/dict comprehensions and generator expressions. > --- > > In any case, I think (b) should be a separate PEP linked to a PEP for > (a). The decision for (a) could be reject (making (b) moot), accept > with (b), or accept unconditionally (but still consider (b)). > For me personally, (b) makes the PEP more consistent, so I'm not in favor of breaking up the PEP. But we can certainly break up the discussion -- that's why I started using the labels (a) and (b). ---------- [1] Sometimes it's an implicit global instead of an implicit nonlocal -- when there's already a global for the same variable in the target scope. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From J.Demeyer at UGent.be Wed Jun 27 01:27:24 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 27 Jun 2018 07:27:24 +0200 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <6da2aefa032842c1a70c91c2e86077b5@xmail101.UGent.be> References: <5B3200C0.7090307@UGent.be> <6da2aefa032842c1a70c91c2e86077b5@xmail101.UGent.be> Message-ID: <5B33203C.70601@UGent.be> On 2018-06-27 00:02, Guido van Rossum wrote: > And TBH a desire to refactor a lot of code is often a sign of a > relatively new contributor who hasn't learned their way around the code > yet, so they tend to want to make the code follow their understanding > rather than letting their understanding follow the code. ...or it could be that the code is written the way it is only for historical reasons, instead of being purposely written that way. 
From greg.ewing at canterbury.ac.nz Wed Jun 27 01:38:38 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Wed, 27 Jun 2018 17:38:38 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627014217.GX14437@ando.pearwood.info> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <20180627014217.GX14437@ando.pearwood.info> Message-ID: <5B3322DE.5050005@canterbury.ac.nz> Steven D'Aprano wrote: > ":=" is the second most common syntax used for assignment in common > programming languages, Yes, but it represents an *ordinary* assignment in those languages. The special way that's being proposed to use it in Python is not obvious. -- Greg From songofacandy at gmail.com Wed Jun 27 01:40:31 2018 From: songofacandy at gmail.com (INADA Naoki) Date: Wed, 27 Jun 2018 14:40:31 +0900 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B33203C.70601@UGent.be> References: <5B3200C0.7090307@UGent.be> <6da2aefa032842c1a70c91c2e86077b5@xmail101.UGent.be> <5B33203C.70601@UGent.be> Message-ID: On Wed, Jun 27, 2018 at 2:27 PM Jeroen Demeyer wrote: > On 2018-06-27 00:02, Guido van Rossum wrote: > > And TBH a desire to refactor a lot of code is often a sign of a > > relatively new contributor who hasn't learned their way around the code > > yet, so they tend to want to make the code follow their understanding > > rather than letting their understanding follow the code. > > ...or it could be that the code is written the way it is only for > historical reasons, instead of being purposely written that way. > In this case, I suppose you thought .c <=> .h filenames should be matched. And we don't think so. Header files are organized for exposing APIs, and source files are organized for implementing the APIs. Since the goal is different, they aren't 
matched always. Regards, -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Wed Jun 27 01:30:10 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 27 Jun 2018 01:30:10 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On 6/26/2018 10:36 PM, Guido van Rossum wrote: > [This is my one response today] Thank you for clearly presenting how you see 'comprehension', 'generator expression' and by implication 'equivalent code'. The latter can either be a definition or an explanation. The difference is subtle but real, and, I believe, the essence of the disagreement over iteration variables. If the code equivalent to a comprehension is its definition, like a macro expansion, then survival of the iteration variable is to be expected. If the equivalent code is an explanation of the *result* of evaluating a *self-contained expression*, then leakage is easily seen as a wart, just as leakage of temporaries from any other expression would be. My interpretation of what you say below is that you always wanted, for instance, [i*i for i in iterable] == [j*j for j in iterable] to be true, and saw the leakage making this not quite true as a wart. In other words, within comprehensions (including generator expressions) iteration names should be regarded as dummy placeholders and not part of the value. If this is correct, the list comprehension syntax could have been [\0 * \0 for \0 in iterable] with \1, \2, ... used as needed. (I am using the regex back-reference notation in a way similar to the use of str.format forward reference notation.) I will stop here for now, as it is 1:30 am for me. 
Terry > On Mon, Jun 25, 2018 at 12:40 PM Terry Reedy > wrote: > > On 6/24/2018 7:25 PM, Guido van Rossum wrote: > > I'd wager that the people who might be most horrified about it > > the (b) scoping rule change > > > would be people who feel strongly that the change to the > > comprehension scope rules in Python 3 is a big improvement, > > I might not be one of those 'most horrified' by (b), but I increasingly > don't like it, and I was at best -0 on the comprehension scope change. > To me, iteration variable assignment in the current scope is a > non-problem. So to me the change was mostly useless churn. Little > benefit, little harm. And not worth fighting when others saw a benefit. > > > Fair enough, and by itself this might not have been enough reason to > make the change. But see below. > > However, having made the change to nested scopes, I think we should > stick with them. Or repeal them. (I believe there is another way to > isolate iteration names -- see below). To me, (b) amounts to half > repealing the nested scope change, making comprehensions half-fowl, > half-fish chimeras. > > > That depends on how you see it -- to me (b) just means that there's an > implicit nonlocal[1] to make the assignment have the (desirable) > side-effect. > > The key thing to consider here is whether that side-effect is in fact > desirable. For me, the side-effect of the comprehension's loop control > variable was never desirable -- it was just an implementation detail > leaking out. (And that's different from leaking a regular for-loop's > control variable -- since we have 'break' (and 'else') there are some > legitimate use cases. But comprehensions try to be expressions, and here > the side effect is at best useless and at worst a nasty surprise.) > > > and who are familiar with the difference in implementation > > of comprehensions (though not generator expressions) in Python 2 > vs. 3. > > That I pretty much am, I think. 
In Python 2, comprehensions (the fish) > were, at least in effect, expanded in-line to a normal for loop. > Generator expressions (the fowls) were different. They were, and still > are, expanded into a temporary generator function whose return value is > dropped back into the original namespace. Python 3 turned > comprehensions (with 2 new varieties thereof) into fowls also, > temporary functions whose return value is dropped back in the original > namespace. The result is that a list comprehension is equivalent to > list(generator_expression), even though, for efficiency, it is not > implemented that way. (To me, this unification is more a benefit than > name hiding.) > > > Right, and this consistency convinced me that the change was worth it. I > just really like to be able to say "[... for ...]" is equivalent to > "list(... for ...)", and similar for set and dict. > > (b) proposes to add extra hidden code in and around the temporary > function to partly undo the isolation. > > > But it just adds a nonlocal declaration. There's always some hidden code > ('def' and 'return' at the very least). > > list comprehensions would no > longer be equivalent to list(generator_expression), unless > generator_expressions got the same treatment, in which case they would > no longer be equivalent to calling the obvious generator function. > Breaking either equivalence might break someone's code. > > > Ah, there's the rub! I should probably apologize for not clarifying my > terminology more. In the context of PEP 572, when I say "comprehensions" > I include generators! PEP 572 states this explicitly > (https://github.com/python/peps/blame/master/pep-0572.rst#L201-L202). > > Certainly PEP 572 intends to add that implicit nonlocal to both > comprehensions and generator expressions. 
(I just got really tired of > writing that phrase over and over, and at some point I forgot that this > is only a parenthetical remark added in the PEP's latest revision, and > not conventional terminology -- alas. :-) > > Part (b) of PEP 572 does several things to *retain* consistency: > > - The target of := lives in the same scope regardless of whether it > occurs in a comprehension, a generator expression, or just in some other > expression. > > - When it occurs in a comprehension or generator expression, the scope > is the same regardless of whether it occurs in the "outermost iterable" > or not. > > If we didn't have (b) the target would live in the comprehension/genexpr > scope if it occurred in a comprehension/genexp but outside its > "outermost iterable", and in the surrounding scope otherwise. > > --- > > How loop variables might be isolated without a nested scope: After a > comprehension is parsed, so that names become strings, rename the loop > variables to something otherwise illegal. For instance, i could become > '<i>', just as lambda becomes '<lambda>' as the name of the resulting > function. Expand the comprehension as in Python 2, except for deleting > the loop names along with the temporary result name. > > Assignment expressions within a comprehension would become assignment > expressions within the for loop expansion and would automatically > add or > replace values in the namespace containing the comprehension. In other > words, I am suggesting that if we want name expressions in > comprehensions to act as they would in Python 2, then we should > consider > reverting to an altered version of the Python 2 expansion. > > > Possibly this is based on a misunderstanding of my use of > "comprehensions". 
Also, since your trick can only be used for > list/set/dict comprehensions, but not for generator expressions (at > least I assume you don't want it there) it would actually *reduce* > consistency between list/set/dict comprehensions and generator expressions. > > --- > > In any case, I think (b) should be a separate PEP linked to a PEP for > (a). The decision for (a) could be reject (making (b) moot), accept > with (b), or accept unconditionally (but still consider (b)). > > > For me personally, (b) makes the PEP more consistent, so I'm not in > favor of breaking up the PEP. But we can certainly break up the > discussion -- that's why I started using the labels (a) and (b). > ---------- > [1] Sometimes it's an implicit global instead of an implicit nonlocal -- > when there's already a global for the same variable in the target scope. > > -- > --Guido van Rossum (python.org/~guido ) > > -- Terry Jan Reedy From steve at pearwood.info Wed Jun 27 02:54:44 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 16:54:44 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <20180627065444.GA14437@ando.pearwood.info> On Tue, Jun 26, 2018 at 05:42:43AM +1000, Chris Angelico wrote: > So..... sublocal scopes, like in the earliest versions of PEP 572? > > The wheel turns round and round, and the same spokes come up. It isn't as if comprehensions (and generator expressions) run in a proper separate scope. It is more half-and-half, sometimes it is separate, sometimes it isn't: py> def show_subscope(): ... a, b = 1, 2 ... print("Comprehension scope, Part A") ... print(next(locals() for x in (1,))) ... print("Comprehension scope, Part B") ... print(next(obj for obj in (locals(),))) ... 
py> show_subscope() Comprehension scope, Part A {'x': 1, '.0': } Comprehension scope, Part B {'b': 2, 'a': 1} Comprehensions already run partly in the surrounding scope. I tried to take a survey of people on the Python-List mailing list, to see what their expectations of comprehension scope were. Disappointingly, not many people responded, but those who did, invariably think in terms of comprehensions running inside their enclosing scope, like any other expression: https://mail.python.org/pipermail/python-list/2018-June/734838.html (Please excuse the doubled-up posts, some misconfigured news server is periodically sending duplicate posts.) (Oh and ignore my comment about Python 2 -- I was thinking of something else.) Given the code shown: def test(): a = 1 b = 2 result = [value for key, value in locals().items()] return result nobody suggested that the result ought to be the empty list, which is what you should get if the comprehension ran in its own scope. Instead, they all expected some variation of [1, 2], which is what you would get if the comprehension ran in the enclosing scope. A decade or more since generator expressions started running in their own half-local-half-sublocal scope, people still think of scoping in terms of LEGB and don't think of comprehensions as running in their own scope *except* to the very limited degree that sometimes they are either surprised or pleased that "the loop variable doesn't leak". For example: http://nbviewer.jupyter.org/github/rasbt/python_reference/blob/master/tutorials/scope_resolution_legb_rule.ipynb doesn't mention comprehensions until the very end, almost in passing, and doesn't describe them as a separate scope at all. Rather, they are described as using closures "to prevent the for-loop variable to cut [sic] into the global namespace." 
This doesn't mention comprehension subscope at all: https://www.python-course.eu/python3_global_vs_local_variables.php Even the official documentation doesn't explicitly state that comprehensions are a separate scope: https://docs.python.org/3/reference/executionmodel.html#resolution-of-names rather leaving it to an afterthought, to mention in passing almost as if it were an implementation-dependent accident, that comprehensions cannot see variables defined in any surrounding class scope. Aside from the loop variable (which PEP 572 will not change!) I see no evidence that the average non-core developer Python programmer considers comprehensions as a separate scope, or wants them to be a separate scope. Regardless of comprehensions being implicitly wrapped in a function or not, the average developer doesn't want the loop variable to "leak", and that's as far as their consideration has needed to go until now. But when pressed to explicitly consider the scope inside a comprehension, the evidence I have seen is that they consider it the same as the local scope surrounding it. Which is not wrong, as can be seen from the example above. Unlike the loop variable, I don't believe that assignment-expression bindings quote-unquote "leaking" from comprehensions will come as a surprise. On the contrary -- given that Nick et al have gone to great lengths to ensure that as a first approximation, comprehensions are equivalent to a simple for-loop running in the current scope: result = [expr for a in seq] # is almost the same as result = [] for a in seq: result.append(expr) I expect that people will be surprised if explicit, non-loop variable assignments *don't* occur in the current scope. If all it takes to implement is something like an implicit "nonlocal", that's hardly worse than the implicit functions already used. 
-- Steve From p.f.moore at gmail.com Wed Jun 27 03:30:00 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 27 Jun 2018 08:30:00 +0100 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627065444.GA14437@ando.pearwood.info> References: <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <20180627065444.GA14437@ando.pearwood.info> Message-ID: On 27 June 2018 at 07:54, Steven D'Aprano wrote: > Comprehensions already run partly in the surrounding scope. > > I tried to take a survey of people on the Python-List mailing list, so > see what their expectations of comprehension scope was. Disappointingly, > not many people responded, but those who did, invariably think in terms > of comprehensions running inside their enclosing scope, like any other > expression: > > https://mail.python.org/pipermail/python-list/2018-June/734838.html > > (Please excuse the doubled-up posts, some misconfigured news server is > periodically sending duplicate posts.) > > (Oh and ignore my comment about Python 2 -- I was thinking of > something else.) > > Given the code shown: > > def test(): > a = 1 > b = 2 > result = [value for key, value in locals().items()] > return result > > > nobody suggested that the result ought to be the empty list, which is > what you should get if the comprehension ran in its own scope. Instead, > they all expected some variation of [1, 2], which is what you would get > if the comprehension ran in the enclosing scope. > > A decade or more since generator expressions started running in their > own half-local-half-sublocal scope, people still think of scoping in > terms of LEGB and don't think of comprehensions as running in their own > scope *except* to the very limited degree that sometimes they are either > surprised or pleased that "the loop variable doesn't leak". But test() returns [1, 2]. 
So does that say (as you claim above) that "the comprehension ran in the enclosing scope"? Doesn't it just say that the outermost iterable runs in the enclosing scope? So everybody expected the actual behaviour? (Disclaimer: in my response, I said that I had no clear expectation, which I stand by - locals() exposes implementation details that I don't normally feel that I need to know - but certainly the majority of respondents expected 1 and 2 to appear). On the other hand, >>> def test2(): ... a = 1 ... b = 2 ... result = [locals().items() for v in 'a'] ... return result ... >>> test2() [dict_items([('v', 'a'), ('.0', )])] and I bet no-one would have expected that if you'd posed that question (I certainly wouldn't). Although some might have said [('v', 'a')]. I suspect some would have expected a and b to appear there too, but that's just a guess... So yes, it's likely that people would have found the current behaviour unexpected in respect of locals(). But I imagine most people only care about the effective results when referencing variables, and >>> def test3(): ... a = 1 ... b = 2 ... result = [a for v in (1,)] ... return result ... >>> test3() [1] i.e., thanks to scope nesting, you can still reference locals from the enclosing scope. The problem is that := allows you to *change* values in a scope, and at that point you need to know *which* scope. So to that extent, the locals() question is important. However, I still suspect that most people would answer that they would like := to assign values *as if* they were in the enclosing scope, which is not really something that I think people would express in answer to a question about locals(). This can be achieved with an implicit "nonlocal" (and some extra shenanigans if the enclosing scope has a nonlocal or global declaration itself). Which, AIUI, is what the current proposal tries to do. IMO, the big question over the current PEP 572 proposal is whether it goes too far in the direction of "do what I mean". 
Superficially, the semantics are pretty clearly "what people would expect", and indeed that's been the whole focus recently to capture and satisfy *expected* behaviour. But there are edge cases (there always are when you work from messy real-world requirements rather than nice clean mathematical definitions ;-)) and the question is essentially whether any of those are bad enough to be an issue. I'm starting to feel that they aren't, and I'm moving towards a cautious +0 (or even +1) on the proposal. Paul From steve at pearwood.info Wed Jun 27 03:42:25 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 17:42:25 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B317294.7070609@canterbury.ac.nz> References: <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B317294.7070609@canterbury.ac.nz> Message-ID: <20180627074225.GB14437@ando.pearwood.info> On Tue, Jun 26, 2018 at 10:54:12AM +1200, Greg Ewing wrote: > A decision still needs to be made about whether we *want* > semantics that leak some things but not others. My sense (or bias, if you prefer) is that the answer to that depends on how you word the question. If you talk about "leaking", or give examples with trivial 1-character names that look all too easy to accidentally clobber, people will say "No": # Given this: x = 999 [(x := i)*x for i in (1, 2)] # should print(x) afterwards result in 4? but if you show a useful example that doesn't look like an accident waiting to happen, but a deliberate feature: # Given this: previous = 0 [previous + (previous := i) for i in (1, 2, 3)] # what value would you expect previous to have # at the completion of the loop? they'll be more receptive to the idea. (If they're not opposed to assignment expressions at all.) 
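On Python 3.8 and later, where PEP 572's assignment expressions eventually landed, the `previous` example above runs as-is, and the binding does escape to the scope containing the comprehension:

```python
previous = 0
result = [previous + (previous := i) for i in (1, 2, 3)]

# The left operand is read before := rebinds the name, so each element
# pairs the old value with the new one.
print(result)    # [1, 3, 5]

# At the completion of the loop the := binding is visible here,
# in the scope enclosing the comprehension.
print(previous)  # 3
```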
Avoiding leading questions is *hard*, and I believe that in general people don't know what they want until they've got it. I say that from considering all the times I've made a radical about face, features which I was *sure* would be awful actually turned out to be not awful at all -- augmented assignment, for instance. -- Steve From rosuav at gmail.com Wed Jun 27 03:52:16 2018 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 27 Jun 2018 17:52:16 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <20180627065444.GA14437@ando.pearwood.info> Message-ID: On Wed, Jun 27, 2018 at 5:30 PM, Paul Moore wrote: > But test() returns [1, 2]. So does that say (as you claim above) that > "the comprehension ran in the enclosing scope"? Doesn't it just say > that the outermost iterable runs in the enclosing scope? Yes - because the *outermost iterable* runs in the enclosing scope. But suppose you worded it like this: def test(): a = 1 b = 2 vars = {key: locals()[key] for key in locals()} return vars What would your intuition say? Should this be equivalent to dict(locals()) ? ChrisA From p.f.moore at gmail.com Wed Jun 27 04:03:58 2018 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 27 Jun 2018 09:03:58 +0100 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <20180627065444.GA14437@ando.pearwood.info> Message-ID: On 27 June 2018 at 08:52, Chris Angelico wrote: > On Wed, Jun 27, 2018 at 5:30 PM, Paul Moore wrote: >> But test() returns [1, 2]. So does that say (as you claim above) that >> "the comprehension ran in the enclosing scope"? Doesn't it just say >> that the outermost iterable runs in the enclosing scope? 
> > Yes - because the *outermost iterable* runs in the enclosing scope. > But suppose you worded it like this: > > def test(): > a = 1 > b = 2 > vars = {key: locals()[key] for key in locals()} > return vars > > What would your intuition say? Should this be equivalent to dict(locals()) ? As I said on python-list, my intuition doesn't apply to locals() - I simply have no idea what I'd "expect" from that code, other than a request to go back and write it more clearly :-) *After* staring at it for a while and trying to interpret it base on the detailed knowledge I've gained from this thread, I'd say it does nothing remotely useful, and if you want dict(locals()) you should write it. (No, test() is not equivalent, because the two instances of locals() refer to different scopes, but I can't imagine why I'd ever need to know that outside of solving artificial puzzles like this). Paul From steve at pearwood.info Wed Jun 27 05:14:28 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 19:14:28 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180624145204.GN14437@ando.pearwood.info> <20180627065444.GA14437@ando.pearwood.info> Message-ID: <20180627091427.GC14437@ando.pearwood.info> On Wed, Jun 27, 2018 at 08:30:00AM +0100, Paul Moore wrote: > On 27 June 2018 at 07:54, Steven D'Aprano wrote: > > Comprehensions already run partly in the surrounding scope. [...] > > Given the code shown: > > > > def test(): > > a = 1 > > b = 2 > > result = [value for key, value in locals().items()] > > return result [...] > But test() returns [1, 2]. So does that say (as you claim above) that > "the comprehension ran in the enclosing scope"? Doesn't it just say > that the outermost iterable runs in the enclosing scope? I think I was careful enough to only say that this was the same result you would get *if* the comprehension ran in the outer scope. 
Not to specifically say it *did* run in the outer scope. (If I slipped up anywhere, sorry.) I did say that the comprehension runs *partly* in the surrounding scope, and the example shows that the local namespace in the "... in iterable" part is not the same as the (sub)local namespace in the "expr for x in ..." part. *Parts* of the comprehension run in the surrounding scope, and parts of it run in an implicit sublocal scope inside a hidden function, giving us a quite complicated semantics for "comprehension scope": [expression for a in first_sequence for b in second ... ] |------sublocal-----|----local-----|------sublocal------| Try fitting *that* in the LEGB (+class) acronym :-) This becomes quite relevant once we include assignment expressions. To make the point that this is not specific to := but applies equally to Nick's "given" syntax as well, I'm going to use his syntax: result = [a for a in (x given x = expensive_function(), x+1, 2*x, x**3)] Here, the assignment to x runs in the local part. I can simulate that right now, using locals, but only outside of a function due to CPython's namespace optimization inside functions. (For simplicity, I'm just going to replace the call to "expensive_function" with just a constant.) py> del x py> [a for a in (locals().__setitem__('x', 2) or x, x+1, 2*x, x**3)] [2, 3, 4, 8] py> x 2 This confirms that the first sequence part of the comprehension runs in the surrounding local scope. So far so good. What if we move that assignment one level deep? Unfortunately, I can no longer use locals for this simulation, due to a peculiarity of the CPython function implementation. But replacing the call to locals() with globals() does the trick: del x # simulate [b*a for b in (1,) for a in (x given x = 2, x+1, 2*x, x**3)] [b*a for b in (1,) for a in (globals().__setitem__('x', 2) or x, x+1, 2*x, x**3)] That also works. 
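The globals() simulation above can be packaged as a self-contained check (a sketch: the `__setitem__` call stands in for the assignment expression under discussion, since no := syntax existed at the time):

```python
# The second "for" clause's iterable is evaluated inside the
# comprehension's hidden function, yet the simulated assignment
# still lands in, and remains visible from, the module namespace.
globals().pop('x', None)  # start with x unbound

result = [b * a
          for b in (1,)
          for a in (globals().__setitem__('x', 2) or x, x + 1, 2 * x, x ** 3)]

print(result)  # [2, 3, 4, 8]
print(x)       # 2
```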
But the problem comes if the user tries to assign to x in both the local and a sublocal section: # no simulation here, sorry [b*a for b in (x given x = 2, x**2) for a in (x given x = x + 1, x**3)] That looks like it should work. You're assigning to the same x in two parts of the same expression. Where's the problem? But given the "implicit function" implementation of comprehensions, I expect that this ought to raise an UnboundLocalError. The local scope part is okay: # needs a fixed-width font for best results [b*a for b in (x given x = 2, x**2) for a in (x given x = x + 1, x**3)] ..............|-----local part----|.....|--------sublocal part--------| but the sublocal part defines x as a sublocal variable, shadowing the surrounding local x, then tries to get a value for that sublocal x before it is defined. If we had assignment expressions before generator expressions and comprehensions, I don't think this would have been the behaviour we desired. (We might, I guess, accept it as an acceptable cost of the implicit function implementation. But we surely wouldn't argue for this complicated scoping behaviour as a good thing in and of itself.) In any case, we can work around this (at some cost of clarity and unobviousness) by changing the name of the variable. Not a big burden when the variable is a single character x: [b*a for b in (x given x = 2, x**2) for a in (y given y = x + 1, y**3)] but if x is a more descriptive name, that becomes more annoying. Nevermind, it is a way around this. Or we could Just Make It Work by treating the entire comprehension as the same scope for assignment expressions. (I stress, not for the loop variable.) 
Instead of having to remember which bits of the comprehension run in which scope, we have a conceptually much simpler rule: - comprehensions are expressions, and assignments inside them bind to the enclosing local scope, just like other expressions: - except for the loop variables, which are intentionally encapsulated inside the comprehension and don't "leak". The *implementation details* of how that works are not conceptually relevant. We may or may not want to advertise the fact that comprehensions use an implicit hidden function to do the encapsulation, and implicit hidden nonlocal to undo the effects of that hidden function. Or whatever implementation we happen to use. > So everybody expected the actual behaviour? More or less, if we ignore a few misapprehensions about how locals works. > On the other hand, > > >>> def test2(): > ... a = 1 > ... b = 2 > ... result = [locals().items() for v in 'a'] > ... return result > ... > >>> test2() > [dict_items([('v', 'a'), ('.0', )])] > > and I bet no-one would have expected that if you'd posed that question I suspect not. To be honest, I didn't even think of asking that question until after I had asked the first. > The problem is that := allows you to *change* values in a scope, and > at that point you need to know *which* scope. So to that extent, the > locals() question is important. However, I still suspect that most > people would answer that they would like := to assign values *as if* > they were in the enclosing scope, That is my belief as well. But that was intentionally not the question I was asking. I was interested in seeing whether people thought of comprehensions as a separate scope, or part of the enclosing scope. 
-- Steve From steve at pearwood.info Wed Jun 27 05:19:29 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 19:19:29 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180624145204.GN14437@ando.pearwood.info> <20180627065444.GA14437@ando.pearwood.info> Message-ID: <20180627091928.GD14437@ando.pearwood.info> On Wed, Jun 27, 2018 at 05:52:16PM +1000, Chris Angelico wrote: > def test(): > a = 1 > b = 2 > vars = {key: locals()[key] for key in locals()} > return vars > > What would your intuition say? Should this be equivalent to dict(locals()) ? That example is so elegant it makes me want to cry. And not just because you shadowed the vars() builtin *wink* -- Steve From vano at mail.mipt.ru Wed Jun 27 06:00:31 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 27 Jun 2018 13:00:31 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B316E07.1060305@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> Message-ID: <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> On 26.06.2018 1:34, Greg Ewing wrote: > Ivan Pozdeev via Python-Dev wrote: >> "as" was suggested even before is became a keyword in `with'. ( if >> (re.match(regex,line) as m) is not None: ) > > That's not equivalent where/given, though, since it still > has the asymmetry problem. > What do you mean by "asymmetry"? The fact that the first time around, it's the expression and after that, the variable? If that, it's not a "problem". The whole idea is to assign the result of a subexpression to something. 
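The `as` spelling quoted above never landed, but the := form that eventually shipped in Python 3.8 expresses the same idea of naming a subexpression's result inside the condition; a small sketch, with a made-up pattern and input:

```python
import re

lines = ["x = 1", "# comment", "y = 2"]
pattern = r"(\w+) = (\d+)"

bindings = {}
for line in lines:
    # Bind the match object and test it in a single expression.
    if (m := re.match(pattern, line)) is not None:
        bindings[m.group(1)] = int(m.group(2))

print(bindings)  # {'x': 1, 'y': 2}
```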
If you force any assignments to be outside, it won't be a subexpression anymore, but effectively a separate statement -- if not syntactically, then visually at least -- both of which are the things the feature's purpose is to avoid. If you seek to force assignments outside, you should've rather suggested inline code blocks e.g. like anonymous methods in C# ( { a=foo(); b=bar(); return a+b;} ). Using this assigned result elsewhere in the same expression (akin to regex backreferences) is not a part of the basic idea actually. It depends on the evaluation order (and whether something is evaluated at all), so I doubt it should even be allowed -- but even if it is, it's a side benefit at best. -- Regards, Ivan From J.Demeyer at UGent.be Wed Jun 27 06:00:54 2018 From: J.Demeyer at UGent.be (Jeroen Demeyer) Date: Wed, 27 Jun 2018 12:00:54 +0200 Subject: [Python-Dev] PEP 576 In-Reply-To: <63a22544a9fb4285b361ea1b042c40b8@xmail101.UGent.be> References: <63a22544a9fb4285b361ea1b042c40b8@xmail101.UGent.be> Message-ID: <5B336056.8040606@UGent.be> On 2018-06-26 21:43, Mark Shannon wrote: > https://github.com/markshannon/pep-576 This actually looks close to Victor Stinner's bpo-29259. But instead of storing the function pointer in the class, you're storing it in the instance. One concern that I have is that this might lead to code duplication. You require that every class implements its own specialized _FOO_FastcallKeywords() function. So you end up with _PyCFunction_FastCallKeywords(), _PyMethodDescr_FastCallKeywords(), _PyFunction_FastCallKeywords(). If I want to implement a similar class myself, I have to reinvent that same wheel again. With PEP 580, I replace all those _FOO_FastCallKeywords() functions by one PyCCall_FASTCALL() function. Admittedly, my PyCCall_FASTCALL() is more complex than each of those _FOO_FastcallKeywords() individually. But overall, I think that PEP 580 leads to simpler code. Second, you still have a performance problem for methods. 
You made sure that the method optimizations in the Python bytecode interpreter continue to work, but method calls from C will be slowed down. I don't know to what extent and whether it really matters, but it's something to note. Jeroen. From ncoghlan at gmail.com Wed Jun 27 07:01:07 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 27 Jun 2018 21:01:07 +1000 Subject: [Python-Dev] Policy on refactoring/clean up In-Reply-To: <5B33203C.70601@UGent.be> References: <5B3200C0.7090307@UGent.be> <6da2aefa032842c1a70c91c2e86077b5@xmail101.UGent.be> <5B33203C.70601@UGent.be> Message-ID: On 27 June 2018 at 15:27, Jeroen Demeyer wrote: > On 2018-06-27 00:02, Guido van Rossum wrote: >> >> And TBH a desire to refactor a lot of code is often a sign of a >> relatively new contributor who hasn't learned their way around the code >> yet, so they tend to want to make the code follow their understanding >> rather than letting their understanding follow the code. > > > ...or it could be that the code is written the way it is only for historical > reasons, instead of being purposely written that way. Even so, we're still wary of change-for-change's sake since it tends to complicate maintenance of multiple concurrent release streams. In this particular case though, since it's header file related, those typically can't be backported for reasons of policy, so the risk of interfering with an otherwise acceptable backport is likely to be low. That leaves the "is there a clear motivation for API header consolidation?" question, and I don't think the proposed consolidation meets that bar, since it takes APIs from the abstract object API and multiple concrete object APIs, as well as the legacy (no longer documented) PyEval_* alternatives to the abstract object calling API, and combines them into a single header based on a shared categorisation of "calling things". 
CPython's public API isn't structured that way, as we keep the abstract
object API and the concrete object API clearly distinct (compare
https://docs.python.org/3/c-api/abstract.html and
https://docs.python.org/3/c-api/concrete.html).

That said, the one part of the proposed change that I think could
definitely be reasonable is pulling the abstract call API (and
supporting definitions) out of "abstract.h" and into their own "call.h"
file (while leaving the concrete object API headers alone). That gives
a solid correlation between Include/call.h and Objects/call.c, and I
don't believe it's ever been the case that abstract.h declared all the
API functions that are part of the PyObject_* namespace.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia

From rosuav at gmail.com  Wed Jun 27 07:08:28 2018
From: rosuav at gmail.com (Chris Angelico)
Date: Wed, 27 Jun 2018 21:08:28 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <20180627091928.GD14437@ando.pearwood.info>
References: <20180624145204.GN14437@ando.pearwood.info>
 <20180627065444.GA14437@ando.pearwood.info>
 <20180627091928.GD14437@ando.pearwood.info>
Message-ID: 

On Wed, Jun 27, 2018 at 7:19 PM, Steven D'Aprano wrote:
> On Wed, Jun 27, 2018 at 05:52:16PM +1000, Chris Angelico wrote:
>
>> def test():
>>     a = 1
>>     b = 2
>>     vars = {key: locals()[key] for key in locals()}
>>     return vars
>>
>> What would your intuition say? Should this be equivalent to dict(locals()) ?
>
> That example is so elegant it makes me want to cry.
>
> And not just because you shadowed the vars() builtin *wink*

It gets funnier with nested loops. Or scarier. I've lost the ability
to distinguish those two.

def test():
    spam = 1
    ham = 2
    vars = [key1+key2 for key1 in locals() for key2 in locals()]
    return vars

Wanna guess what that's gonna return?

ChrisA

From eric at trueblade.com  Wed Jun 27 08:00:20 2018
From: eric at trueblade.com (Eric V. Smith)
Date: Wed, 27 Jun 2018 08:00:20 -0400
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: 
References: <20180624145204.GN14437@ando.pearwood.info>
 <20180627065444.GA14437@ando.pearwood.info>
 <20180627091928.GD14437@ando.pearwood.info>
Message-ID: 

On 6/27/2018 7:08 AM, Chris Angelico wrote:
> It gets funnier with nested loops. Or scarier. I've lost the ability
> to distinguish those two.
>
> def test():
>     spam = 1
>     ham = 2
>     vars = [key1+key2 for key1 in locals() for key2 in locals()]
>     return vars
>
> Wanna guess what that's gonna return?

I'm not singling out Chris here, but these discussions would be easier
to follow and more illuminating if the answers to such puzzles were
presented when they're posed.

Eric

From ncoghlan at gmail.com  Wed Jun 27 08:27:30 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 27 Jun 2018 22:27:30 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: 
References: <5B2D80E7.5010501@canterbury.ac.nz>
 <20180624055646.GL14437@ando.pearwood.info>
 <20180624145204.GN14437@ando.pearwood.info>
Message-ID: 

On 26 June 2018 at 02:27, Guido van Rossum wrote:
> [This is my one reply in this thread today. I am trying to limit the amount
> of time I spend to avoid another overheated escalation.]

Aye, I'm trying to do the same, and deliberately spending some
evenings entirely offline is helping with that :)

> On Mon, Jun 25, 2018 at 4:44 AM Nick Coghlan wrote:
>>
>> Right, the proposed blunt solution to "Should I use 'NAME = EXPR' or
>> 'NAME := EXPR'?" bothers me a bit, but it's the implementation
>> implications of parent local scoping that I fear will create a
>> semantic tar pit we can't get out of later.
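[Editorial note, taking up Eric's request above that puzzle answers accompany the puzzles: the two locals() calls in Chris's nested-loops example run in different scopes. The outermost iterable is evaluated in test()'s own scope, so key1 ranges over 'spam' and 'ham'; the inner locals() runs inside the comprehension's hidden function, so on the CPython implementation discussed in this thread key2 ranges over the hidden iterator argument `.0`, `key1`, and — from the second outer pass onwards — `key2` itself. A runnable version, with deliberately version-tolerant checks since later CPython releases changed how comprehension scopes are implemented:]

```python
def test():
    spam = 1
    ham = 2
    result = [key1 + key2 for key1 in locals() for key2 in locals()]
    return result

out = test()
print(out)
# e.g. ['spam.0', 'spamkey1', 'ham.0', 'hamkey1', 'hamkey2'] on CPython 3.7;
# the exact contents are version-dependent.

# What is stable across versions: every key1 comes from test()'s namespace.
assert all(item.startswith(("spam", "ham")) for item in out)
```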
> > Others have remarked this too, but it really bother me that you are focusing > so much on the implementation of parent local scoping rather than on the > "intuitive" behavior which is super easy to explain -- especially to someone > who isn't all that familiar (or interested) with the implicit scope created > for the loop control variable(s). According to Steven (who noticed that this > is barely mentioned in most tutorials about comprehensions) that is most > people, however very few of them read python-dev. > > It's not that much work for the compiler, since it just needs to do a little > bit of (new) static analysis and then it can generate the bytecode to > manipulate closure(s). The runtime proper doesn't need any new > implementation effort. The fact that sometimes a closure must be introduced > where no explicit initialization exists is irrelevant to the runtime -- this > only affects the static analysis, at runtime it's no different than if the > explicit initialization was inside `if 0`. One of the things I prize about Python's current code generator is how many of the constructs can be formulated as simple content-and-context independent boilerplate removal, which is why parent local scoping (as currently defined in PEP 572) bothers me: rather than being a new primitive in its own right, the PEP instead makes the notion of "an assignment expression in a comprehension or generator expression" a construct that can't readily decomposed into lower level building blocks the way that both assignment expressions on their own and comprehensions and generator expressions on their own can be. Instead, completely new language semantics arise from the interaction between two otherwise independent features. Even changes as complicated as PEP 343's with statement, PEP 380's yield from, and PEP 492's native coroutines all include examples of how they could be written *without* the benefit of the new syntax. 
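[Editorial note: PEP 343 is a concrete illustration of Nick's point — its documented expansion means any with statement can still be written out by hand without the syntax. A simplified sketch (the full expansion also forwards exception details to `__exit__`):]

```python
import threading

lock = threading.Lock()

# The syntactic form:
with lock:
    pass

# The (simplified) longhand equivalent, no 'with' required:
mgr = lock
mgr.__enter__()
try:
    pass
finally:
    mgr.__exit__(None, None, None)

print(lock.locked())  # False - both forms released the lock
```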
By contrast, PEP 572's parent local scoping can't currently be defined
that way. Instead, to explain how the code generator is going to be
expected to handle comprehensions, you have to take the current
comprehension semantics and add two new loops to link up the bound names
correctly::

    [item := x for x in items]

becomes:

    # Each bound name gets declared as local in the parent scope
    if 0:
        for item in ():
            pass
    def _list_comp(_outermost_iter):
        # Each bound name gets declared as:
        # - nonlocal if outer scope is a function scope
        # - global item if outer scope is a module scope
        # - an error, otherwise
        _result = []
        for x in _outermost_iter:
            item = x
            _result.append(x)
        return _result
    _expr_result = _list_comp(items)

This is why my objections would be reduced significantly if the PEP
explicitly admitted that it was defining a new kind of scoping semantics,
and actually made those semantics available as an explicit "parentlocal
NAME" declaration (behind a "from __future__ import parent_locals"
guard), such that the translation of the above example to an explicitly
nested scope could just be the visually straightforward::

    def _list_comp(_outermost_iter):
        parentlocal item
        _result = []
        for x in _outermost_iter:
            item = x
            _result.append(x)
        return _result
    _expr_result = _list_comp(items)

That splits up the learning process for anyone trying to really
understand how this particular aspect of Python's code generation works
into two distinct pieces:

- "assignment expressions inside comprehensions and generator expressions use parent local scoping"
- "parent local scoping works "

If the PEP did that, we could likely even make parent locals work
sensibly for classes by saying that "parent local" for a method
definition in a class body refers to the closure namespace where we
already stash __class__ references for the benefit of zero-arg super
(this would also be a far more robust way of defining private class
variables than name mangling is able to offer).
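[Editorial note: the "parentlocal" declaration in Nick's second expansion is hypothetical and can't run as-is, but when the outer scope is a function, today's `nonlocal` gives the same effect, which makes the intended semantics easy to check directly:]

```python
def outer():
    item = None  # must pre-exist for nonlocal to be legal today;
                 # a real "parentlocal" would declare it implicitly

    def _list_comp(_outermost_iter):
        nonlocal item
        _result = []
        for x in _outermost_iter:
            item = x  # the "item := x" binding, exported to outer()
            _result.append(x)
        return _result

    _expr_result = _list_comp([1, 2, 3])
    return _expr_result, item

print(outer())  # ([1, 2, 3], 3) - the last bound value leaks to outer()
```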
Having parent locals available as a language level concept (rather than
solely as an interaction between assignment expressions and implicitly
nested scopes) also gets us to a point where context-independent code
thunks that work both at module level and inside another function can be
built as nested functions which declare all their working variables as
parentlocal (you still need to define the thunks inline in the scope you
want them to affect, since this isn't dynamic scoping, but when
describing the code, you don't need to say "as a module level function
define it this way, as a nested function define it that way").

An explicit "parentlocal NAME" concept at the PEP 572 layer would also
change the nature of the draft "given" proposal from competing with PEP
572, to instead being a follow-up proposal that focused on providing
control of target name declarations in lambda expressions,
comprehensions, and generator expressions such that:

- (lambda arg: value := arg given parentlocal value) # Exports "value" to parent scope
- any(x for x in items given parentlocal x) # Exports "x" to parent scope
- [y for x in data if (y := f(x)) given y] # *Avoids* exporting "y" to parent scope

With parent local scoping in the mix the proposed "given" syntax could
also dispense with initialiser and type hinting support entirely and
instead only allow:

- "... given NAME" (always local, no matter the default scoping)
- "... given parentlocal NAME" (always parent local, declaring if necessary)
- "... given nonlocal NAME" (always nonlocal, error if not declared in outer scope)
- "... given global NAME" (always global, no matter how nested the current scope is)
- "... given (TARGET1, TARGET2, ...)" (declaring multiple assignment targets)

If you want an initialiser or a type hint, then you'd use parentlocal
semantics. If you want to keep names local (e.g. to avoid exporting them
as part of a module's public API) then you can do that, too.
>> Unfortunately, I think the key rationale for (b) is that if you
>> *don't* do something along those lines, then there's a different
>> strange scoping discrepancy that arises between the non-comprehension
>> forms of container displays and the comprehension forms:
>>
>>     (NAME := EXPR,) # Binds a local
>>     tuple(NAME := EXPR for __ in range(1)) # Doesn't bind a local
>> [...]
>> Those scoping inconsistencies aren't *new*, but provoking them
>> currently involves either class scopes, or messing about with
>> locals().

> In what sense are they not new? This syntax doesn't exist yet.

The simplest way to illustrate the scope distinction today is with
"len(locals())":

>>> [len(locals()) for i in range(1)]
[2]
>>> [len(locals())]
[7]

But essentially nobody ever does that, so the distinction doesn't
currently matter. By contrast, where assignment expressions bind their
targets matters a *lot*, so PEP 572 makes the existing scoping oddities
a lot more significant.

> You left out another discrepancy, which is more likely to hit people in the
> face: according to your doctrine, := used in the "outermost iterable" would
> create a local in the containing scope, since that's where the outermost
> iterable is evaluated. So in this example
>
>     a = [x := i+1 for i in range(y := 2)]
>
> the scope of x would be the implicit function (i.e. it wouldn't leak) while
> the scope of y would be the same as that of a. (And there's an even more
> cryptic example, where the same name is assigned in both places.)

Yeah, the fact it deals with this problem nicely is one aspect of the
parent local scoping that I find genuinely attractive.
> > > Nobody thinks about write semantics though -- it's simply not the right > abstraction to use here, you've introduced it because that's how *you* think > about this. The truth of the last part of that paragraph means that the only way for the first part of it to be true is to decide that my way of thinking is *so* unusual that nobody else in the 10 years that Python 3 has worked the way it does now has used the language reference, the source code, the disassembler, or the debugger to formulate a similar mental model of how they expect comprehensions and generator expressions to behave. I'll grant that I may be unusual in thinking about comprehensions and generator expressions the way I do, and I definitely accept that most folks simply don't think about the subtleties of how they handle scopes in the first place, but I *don't* accept the assertion that I'm unique in thinking about them that way. There are simply too many edge cases in their current runtime behaviour where the "Aha!" moment at the end of a debugging effort is going to be the realisation that they're implemented as an implicitly nested scope, and we've had a decade of Python 3 use where folks prone towards writing overly clever comprehensions have been in a position to independently make that discovery. >> The early iterations of PEP 572 tried to duck this whole realm of >> potential semantic inconsistencies by introducing sublocal scoping > There was also another variant in some iteration or PEP 572, after sublocal > scopes were already eliminated -- a change to comprehensions that would > evaluate the innermost iterable in the implicit function. This would make > the explanation of inline assignment in comprehensions consistent again > (they were always local to the comprehension in that iteration of the PEP), > at the cost of a backward incompatibility that was ultimately withdrawn. 
Yeah, the current "given" draft has an open question around the idea of having the presence of a "given" clause pull the outermost iterable evaluation inside the nested scope. It still doesn't really solve the problem, though, so I think I'd actually consider PEP-572-with-explicit-parent-local-scoping-support the version of assignment expressions that most cleanly handles the interaction with comprehension scopes without making that interaction rely on opaque magic (instead, it would be relying on an implicit target scope declaration, the same as any other name binding - the only unusual aspect is that the implicit declaration would be "parentlocal NAME" rather than the more typical local variable declaration). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From vano at mail.mipt.ru Wed Jun 27 08:45:44 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 27 Jun 2018 15:45:44 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <62ebcf13-075e-cf1e-c98a-0ab9a4416a57@mail.mipt.ru> On 27.06.2018 5:36, Guido van Rossum wrote: > [This is my one response today] > > On Mon, Jun 25, 2018 at 12:40 PM Terry Reedy > wrote: > > On 6/24/2018 7:25 PM, Guido van Rossum wrote: > > I'd wager that the people who might be most horrified about it > > the (b) scoping rule change > > > would be people who feel strongly that the change to the > > comprehension scope rules in Python 3 is a big improvement, > > I might not be one of those 'most horrified' by (b), but I > increasingly > don't like it, and I was at best -0 on the comprehension scope > change. > To me, iteration variable assignment in the current scope is a > non-problem.? So to me the change was mostly useless churn. Little > benefit, little harm.? 
And not worth fighting when others saw a > benefit. > > > Fair enough, and by itself this might not have been enough reason to > make the change. But see below. > > However, having made the change to nested scopes, I think we should > stick with them.? Or repeal them.? (I believe there is another way to > isolate iteration names -- see? below).? To me, (b) amounts to half > repealing the nested scope change, making comprehensions half-fowl, > half-fish chimeras. > > > That depends on how you see it -- to me (b) just means that there's an > implicit nonlocal[1] to make the assignment have the (desirable) > side-effect. > > The key thing to consider here is whether that side-effect is in fact > desirable. For me, the side-effect of the comprehension's loop control > variable was never desirable -- it was just an implementation detail > leaking out. (And that's different from leaking a regular for-loop's > control variable -- since we have 'break' (and 'else') there are some > legitimate use cases. But comprehensions try to be expressions, and > here the side effect is at best useless and at worst a nasty surprise.) > > > and who are familiar with the difference in implementation > > of comprehensions (though not generator expressions) in Python 2 > vs. 3. > > That I pretty much am, I think.? In Python 2, comprehensions (the > fish) > were, at least in effect, expanded in-line to a normal for loop. > Generator expressions (the fowls) were different.? They were, and > still > are, expanded into a temporary generator function whose return > value is > dropped back into the original namespace.? Python 3 turned > comprehensions (with 2 news varieties thereof) into fowls also, > temporary functions whose return value is dropped back in the > original > namespace.? The result is that a list comprehension is equivalent to > list(generator_ expression), even though, for efficiency, it is not > implemented that way.? 
(To me, this unification is more a benefit > than > name hiding.) > > > Right, and this consistency convinced me that the change was worth it. > I just really like to be able to say "[... for ...]" is equivalent to > "list(... for ...)", and similar for set and dict. "A shorthand to list()/dict()/set()" is actually how I thought of comprehensions when I studied them. And I was actually using list() in my code for some time before I learned of their existence. > (b) proposes to add extra hidden code in and around the temporary > function to partly undo the isolation. > > > But it just adds a nonlocal declaration. There's always some hidden > code ('def' and 'return' at the very least). > > list comprehensions would no > longer be equivalent to list(generator_expression), unless > generator_expressions got the same treatment, in which case they > would > no longer be equivalent to calling the obvious generator function. > Breaking either equivalence might break someone's code. > > > Ah, there's the rub! I should probably apologize for not clarifying my > terminology more. In the context of PEP 572, when I say > "comprehensions" I include generators! PEP 572 states this explicitly > (https://github.com/python/peps/blame/master/pep-0572.rst#L201-L202). > > Certainly PEP 572 intends to add that implicit nonlocal to both > comprehensions and generator expressions. (I just got really tired of > writing that phrase over and over, and at some point I forgot that > this is only a parenthetical remark added in the PEP's latest > revision, and not conventional terminology -- alas. :-) > > Part (b) of PEP 572 does several things of things to *retain* consistency: > > - The target of := lives in the same scope regardless of whether it > occurs in a comprehension, a generator expression, or just in some > other expression. > > - When it occurs in a comprehension or generator expression, the scope > is the same regardless of whether it occurs in the "outermost > iterable" or not. 
> > If we didn't have (b) the target would live in the > comprehension/genexpr scope if it occurred in a comprehension/genexp > but outside its "outermost iterable", and in the surrounding scope > otherwise. > > --- > > How loop variables might be isolated without a nested scope: After a > comprehension is parsed, so that names become strings, rename the > loop > variables to something otherwise illegal.? For instance, i could > become > '', just as lambda becomes '' as the name of the resulting > function.? Expand the comprehension as in Python 2, except for > deleting > the loop names along with the temporary result name. > > Assignment expressions within a comprehension would become assignment > expressions within the for loop expansion and would automatically > add or > replace values in the namespace containing the comprehension.? In > other > words, I am suggesting that if we want name expressions in > comprehensions to act as they would in Python 2, then we should > consider > reverting to an altered version of the Python 2 expansion. > > > Possibly this is based on a misunderstanding of my use of > "comprehensions". Also, since your trick can only be used for > list/set/dict comprehensions, but not for generator expressions (at > least I assume you don't want it there) it would actually *reduce* > consistency between list/set/dict comprehensions and generator > expressions. > > --- > > In any case, I think (b) should be a separate PEP linked to a PEP for > (a).? The decision for (a) could be reject (making (b) moot), accept > with (b), or accept unconditionally (but still consider (b)). > > > For me personally, (b) makes the PEP more consistent, so I'm not in > favor of breaking up the PEP. But we can certainly break up the > discussion -- that's why I started using the labels (a) and (b). > ---------- > [1] Sometimes it's an implicit global instead of an implicit nonlocal > -- when there's already a global for the same variable in the target > scope. 
> > -- > --Guido van Rossum (python.org/~guido ) > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vano%40mail.mipt.ru -- Regards, Ivan -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Wed Jun 27 09:25:14 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 01:25:14 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> Message-ID: <5B33903A.3060104@canterbury.ac.nz> Ivan Pozdeev via Python-Dev wrote: > Using this assigned result elsewhere in the same expression (akin to > regex backreferences) is not a part of the basic idea actually. If that's true, then the proposal has mutated into something that has *no* overlap whatsoever with the use case that started this whole discussion, which was about binding a temporary variable in a comprehension, for use *within* the comprehension. > It depends on the evaluation order (and whether something is evaluated > at all), Which to my mind is yet another reason not to like ":=". 
-- Greg From cstratak at redhat.com Wed Jun 27 09:18:24 2018 From: cstratak at redhat.com (Charalampos Stratakis) Date: Wed, 27 Jun 2018 09:18:24 -0400 (EDT) Subject: [Python-Dev] Python and Linux Standard Base In-Reply-To: <1622396855.42006758.1530103502934.JavaMail.zimbra@redhat.com> Message-ID: <835981563.42014101.1530105504575.JavaMail.zimbra@redhat.com> LSB (Linux Standard Base) is a set of standards defined from the Linux Foundation for linux distributions [0][1] with the latest version (LSB 5.0) released on 3rd of June, 2015. Python is also mentioned there but the information is horribly outdated [2]. For example here are the necessary modules that a python interpreter should include in an lsb compliant system [3] and the minimum python version should be 2.4.2. Also the python3 interpreter is never mentioned [4]. My question is, if there is any incentive to try and ask for modernization/amendment of the standards? I really doubt that any linux distro at that point can be considered lsb compliant at least from the python side of things. 
[0] https://en.wikipedia.org/wiki/Linux_Standard_Base [1] https://wiki.linuxfoundation.org/lsb/lsb-50 [2] https://refspecs.linuxfoundation.org/LSB_5.0.0/LSB-Languages/LSB-Languages/python.html [3] https://refspecs.linuxfoundation.org/LSB_5.0.0/LSB-Languages/LSB-Languages/pymodules.html [4] https://lsbbugs.linuxfoundation.org/show_bug.cgi?id=3677 -- Regards, Charalampos Stratakis Software Engineer Python Maintenance Team, Red Hat From greg.ewing at canterbury.ac.nz Wed Jun 27 09:38:54 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 01:38:54 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: <5B33936E.8080309@canterbury.ac.nz> Nick Coghlan wrote: > actually made those semantics available as an explicit > "parentlocal NAME" declaration ...: > > def _list_comp(_outermost_iter): > parentlocal item > _result = [] > for x in _outermost_iter: > item = x > _result.append(x) > return _result > > _expr_result = _list_comp(items) I'm not sure that's possible. If I understand correctly, part of the definition of "parent local" is that "parent" refers to the nearest enclosing *non-comprehension* scope, to give the expected result for nested comprehensions. If that's so, then it's impossible to fully decouple its definition from comprehensions. 
-- Greg From solipsis at pitrou.net Wed Jun 27 09:41:23 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 27 Jun 2018 15:41:23 +0200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> Message-ID: <20180627154123.63446423@fsol> Why is this discussion talking about comprehensions at all? Is there a decent use case for using assignments in comprehensions (as opposed to language lawyering or deliberate obfuscation)? Regards Antoine. On Thu, 28 Jun 2018 01:25:14 +1200 Greg Ewing wrote: > Ivan Pozdeev via Python-Dev wrote: > > Using this assigned result elsewhere in the same expression (akin to > > regex backreferences) is not a part of the basic idea actually. > > If that's true, then the proposal has mutated into something > that has *no* overlap whatsoever with the use case that started > this whole discussion, which was about binding a temporary > variable in a comprehension, for use *within* the comprehension. > > > It depends on the evaluation order (and whether something is evaluated > > at all), > > Which to my mind is yet another reason not to like ":=". 
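[Editorial note: one concrete answer to Antoine's question, drawn from the use cases driving PEP 572 (runnable on any Python that implements `:=`; `f()` here is a stand-in for any expensive or effectful call): binding inside the filter clause lets the comprehension call the function once per element instead of twice.]

```python
def f(x):
    # Stand-in for an expensive computation.
    return x * 2 if x % 2 == 0 else None

data = [1, 2, 3, 4]

# Without binding, f(x) must be called twice per element:
without_binding = [f(x) for x in data if f(x) is not None]

# With an assignment expression, f(x) is called once per element:
with_binding = [y for x in data if (y := f(x)) is not None]

assert with_binding == without_binding == [4, 8]
```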
> From solipsis at pitrou.net Wed Jun 27 09:42:53 2018 From: solipsis at pitrou.net (Antoine Pitrou) Date: Wed, 27 Jun 2018 15:42:53 +0200 Subject: [Python-Dev] Python and Linux Standard Base References: <1622396855.42006758.1530103502934.JavaMail.zimbra@redhat.com> <835981563.42014101.1530105504575.JavaMail.zimbra@redhat.com> Message-ID: <20180627154253.15e36b61@fsol> On Wed, 27 Jun 2018 09:18:24 -0400 (EDT) Charalampos Stratakis wrote: > > My question is, if there is any incentive to try and ask for modernization/amendment of the standards? > I really doubt that any linux distro at that point can be considered lsb compliant at least from the > python side of things. One question: who maintains the LSB? The fact that the Python portion was never updated may hint that nobody uses it... Regards Antoine. From steve at pearwood.info Wed Jun 27 09:49:49 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Wed, 27 Jun 2018 23:49:49 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> Message-ID: <20180627134948.GF14437@ando.pearwood.info> On Wed, Jun 27, 2018 at 08:00:20AM -0400, Eric V. Smith wrote: > On 6/27/2018 7:08 AM, Chris Angelico wrote: > >It gets funnier with nested loops. Or scarier. I've lost the ability > >to distinguish those two. > > > >def test(): > > spam = 1 > > ham = 2 > > vars = [key1+key2 for key1 in locals() for key2 in locals()] > > return vars > > > >Wanna guess what that's gonna return? > > I'm not singling out Chris here, but these discussions would be easier > to follow and more illuminating if the answers to such puzzles were > presented when they're posed. You can just copy and paste the function into the interactive interpreter and run it :-) But where's the fun in that? 
The point of the exercise is to learn first hand just how complicated it
is to try to predict the *current* scope behaviour of comprehensions.
Without the ability to perform assignment inside them, aside from the
loop variable, we've managed to avoid thinking too much about this until
now.

It also demonstrates the unrealisticness of treating comprehensions as a
separate scope -- they're a hybrid scope, with parts of the comprehension
running in the surrounding local scope, and parts running in a sublocal
scope. Earlier in this thread, Nick tried to justify the idea that
comprehensions run in their own scope, no matter how people think of
them -- but that's an over-simplification, as Chris' example above
shows. Parts of the comprehension do in fact behave exactly as the naive
model would suggest (even if Nick is right that other parts don't).

As complicated and hairy as the above example is, (1) it is a pretty
weird thing to do, so most of us will almost never need to consider it;
and (2) backwards compatibility requires that we live with it now (at
least unless we introduce a __future__ import).

If we can't simplify the scope of comprehensions, we can at least
simplify the parts that actually matter. What matters are the loop
variables (already guaranteed to be sublocal and not "leak" out of the
comprehension) and the behaviour of assignment expressions (open to
discussion).

Broadly speaking, there are two positions we can take:

1. Let the current implementation of comprehensions as an implicit
hidden function drive the functionality; that means we duplicate the
hairiness of the locals() behaviour seen above, although it won't be
obvious at first glance. What this means in practice is that assignments
will go to different scopes depending on *where* they are in the
comprehension:

    [ expr   for x in iter1  for y in iter2  if cond   ...]
    [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...]
Assignments in the section marked "AAAAAA" will be in the local scope; assignments in the BBBBBB sections will be in the sublocal scope. That's not too bad, up to the point you try to assign to the same name in AAAAAA and BBBBBB. And then you are likely to get confusing, hard-to-debug UnboundLocalErrors.

2. Or we can keep the current behaviour for locals and the loop variables, but we can keep assignment expressions simple by ensuring they always bind to the enclosing scope. Compared to the complexity of the above, we have the relatively straightforward:

[ AAAAAA for x in AAAAAA for y in AAAAAA if AAAAAA ...]

The loop variables continue to be hidden away in the invisible, implicit comprehension function, where they can't leak out, while explicit assignments to variables (using := or given or however it is spelled) will always go into the surrounding local scope, like they do in every other expression.

Does it matter that the implementation of this requires an implicit nonlocal declaration for each assignment? No more than it matters that comprehensions themselves require an implicit function.

And what we get out of this is simpler semantics at the Python level:

- Unless previously declared global, assignment expressions always bind to the current scope, even if they're inside a comprehension;

- and we don't have to deal with the oddity that different bits of a comprehension run in different scopes (unless we go out of our way to use locals()); merely using assignment expressions will just work consistently and simply, and loop variables will still be confined to the comprehension as they are now.
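To make option 2 concrete, here is roughly the hand-written equivalent of a comprehension containing an assignment expression under those semantics (a sketch only -- the running-total example and helper names are mine, and an explicit nonlocal stands in for the implicit declaration):

```python
def f(data):
    # Hand-expansion of: running = [total := total + x for x in data]
    # under option 2: the assignment target binds in the enclosing scope.
    total = 0
    def _listcomp(_outermost_iter):
        nonlocal total              # the implicit declaration option 2 adds
        _result = []
        for x in _outermost_iter:   # x stays hidden inside the comprehension
            total = total + x
            _result.append(total)
        return _result
    running = _listcomp(iter(data))
    return running, total           # total survives the comprehension

print(f([1, 2, 3]))  # ([1, 3, 6], 6)
```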
-- Steve From mertz at gnosis.cx Wed Jun 27 09:56:45 2018 From: mertz at gnosis.cx (David Mertz) Date: Wed, 27 Jun 2018 09:56:45 -0400 Subject: [Python-Dev] Python and Linux Standard Base In-Reply-To: <20180627154253.15e36b61@fsol> References: <1622396855.42006758.1530103502934.JavaMail.zimbra@redhat.com> <835981563.42014101.1530105504575.JavaMail.zimbra@redhat.com> <20180627154253.15e36b61@fsol> Message-ID: The main wiki page was last touched at all in 2016. The mailing list in Jan 2018 had about 8 comments, none of them actually related to LSB. They stopped archiving the ML altogether in Feb 2018. I think it's safe to say the parrot is dead. On Wed, Jun 27, 2018, 9:50 AM Antoine Pitrou wrote: > On Wed, 27 Jun 2018 09:18:24 -0400 (EDT) > Charalampos Stratakis wrote: > > > > My question is, if there is any incentive to try and ask for > modernization/amendment of the standards? > > I really doubt that any linux distro at that point can be considered lsb > compliant at least from the > > python side of things. > > One question: who maintains the LSB? > > The fact that the Python portion was never updated may hint that nobody > uses it... > > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mertz%40gnosis.cx > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cstratak at redhat.com Wed Jun 27 09:57:03 2018 From: cstratak at redhat.com (Charalampos Stratakis) Date: Wed, 27 Jun 2018 09:57:03 -0400 (EDT) Subject: [Python-Dev] Python and Linux Standard Base In-Reply-To: <20180627154253.15e36b61@fsol> References: <1622396855.42006758.1530103502934.JavaMail.zimbra@redhat.com> <835981563.42014101.1530105504575.JavaMail.zimbra@redhat.com> <20180627154253.15e36b61@fsol> Message-ID: <640562110.42019603.1530107823789.JavaMail.zimbra@redhat.com> ----- Original Message ----- > From: "Antoine Pitrou" > To: python-dev at python.org > Sent: Wednesday, June 27, 2018 3:42:53 PM > Subject: Re: [Python-Dev] Python and Linux Standard Base > > On Wed, 27 Jun 2018 09:18:24 -0400 (EDT) > Charalampos Stratakis wrote: > > > > My question is, if there is any incentive to try and ask for > > modernization/amendment of the standards? > > I really doubt that any linux distro at that point can be considered lsb > > compliant at least from the > > python side of things. > > One question: who maintains the LSB? > > The fact that the Python portion was never updated may hint that nobody > uses it... > > Regards > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/cstratak%40redhat.com > That could definitely be the case here. I stumbled upon that when checking shebang requirements on Fedora and apparently every distro has a sort of meta package that adheres to those standards. In Fedora's case [0]. I don't have a good answer on who maintains it or even how compliant some distros are, but I was wondering if that topic came up beforehand and if any requirements were placed from either side. 
[0] https://src.fedoraproject.org/rpms/redhat-lsb/blob/master/f/redhat-lsb.spec#_419

--
Regards,

Charalampos Stratakis
Software Engineer
Python Maintenance Team, Red Hat

From steve at pearwood.info Wed Jun 27 10:04:29 2018
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 28 Jun 2018 00:04:29 +1000
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <20180627154123.63446423@fsol>
References: <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <20180627154123.63446423@fsol>
Message-ID: <20180627140428.GG14437@ando.pearwood.info>

On Wed, Jun 27, 2018 at 03:41:23PM +0200, Antoine Pitrou wrote:

> > Why is this discussion talking about comprehensions at all?
> Is there a decent use case for using assignments in comprehensions (as
> opposed to language lawyering or deliberate obfuscation)?

Yes. The *very first* motivating example for this proposal came from a comprehension.

I think it is both unfortunate and inevitable that the discussion bogged down in comprehension-hell. Unfortunate because I don't think that the most compelling use-cases involve comprehensions at all. But inevitable because *comprehensions are the hard case*, thanks to the (justifiable!) decision to implement them as implicit hidden functions.

In my opinion, the two really BIG wins for assignment expressions are while loops and cascades of if... blocks. Tim Peters has also given a couple of good examples of mathematical code that would benefit strongly from this feature.

Going back a few months now, they were the examples that tipped me over from the opinion "Oh, just re-write the comprehension as a loop" to the opinion "You know, I think this feature actually is useful...
and as a bonus, you can keep using the comprehension"

But that requires that we get the comprehension scoping right. Not just leave it as an unspecified implementation detail.

-- Steve

From vano at mail.mipt.ru Wed Jun 27 10:07:24 2018
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Wed, 27 Jun 2018 17:07:24 +0300
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <5B33903A.3060104@canterbury.ac.nz>
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz>
Message-ID: <05f368c2-3cd2-d7e0-9f91-27afb40d5b35@mail.mipt.ru>

On 27.06.2018 16:25, Greg Ewing wrote:
> Ivan Pozdeev via Python-Dev wrote:
>> Using this assigned result elsewhere in the same expression (akin to
>> regex backreferences) is not a part of the basic idea actually.
>
> If that's true, then the proposal has mutated into something
> that has *no* overlap whatsoever with the use case that started
> this whole discussion,

I don't know what "started" it or where (AFAIK the idea has been around for years) but for me, the primary use case for an assignment expression is to be able to "catch" a value into a variable in places where I can't put an assignment statement, like the infamous `if re.match() is not None'.

> which was about binding a temporary
> variable in a comprehension, for use *within* the comprehension.

Then I can't understand all the current fuss about scoping.
AFAICS, it's already like I described in https://mail.python.org/pipermail/python-dev/2018-June/154067.html : the outermost iterable is evaluated in the local scope while others in the internal one:

In [13]: [(l,i) for l in list(locals())[:5] for i in locals()]
Out[13]:
[('__name__', 'l'),
 ('__name__', '.0'),
 ('__builtin__', 'l'),
 ('__builtin__', '.0'),
 ('__builtin__', 'i'),
 ('__builtins__', 'l'),
 ('__builtins__', '.0'),
 ('__builtins__', 'i'),
 ('_ih', 'l'),
 ('_ih', '.0'),
 ('_ih', 'i'),
 ('_oh', 'l'),
 ('_oh', '.0'),
 ('_oh', 'i')]

(note that `i' is bound after the first evaluation of internal `locals()' btw, as to be expected)

If the "temporary variables" are for use inside the comprehension only, the assignment expression needs to bind in the current scope like the regular assignment statement, no changes are needed!

>> It depends on the evaluation order (and whether something is
>> evaluated at all),

> Which to my mind is yet another reason not to like ":=".

--
Regards,
Ivan

From ncoghlan at gmail.com Wed Jun 27 10:26:05 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 28 Jun 2018 00:26:05 +1000
Subject: [Python-Dev] Python and Linux Standard Base
In-Reply-To: <640562110.42019603.1530107823789.JavaMail.zimbra@redhat.com>
References: <1622396855.42006758.1530103502934.JavaMail.zimbra@redhat.com> <835981563.42014101.1530105504575.JavaMail.zimbra@redhat.com> <20180627154253.15e36b61@fsol> <640562110.42019603.1530107823789.JavaMail.zimbra@redhat.com>
Message-ID:

On 27 June 2018 at 23:57, Charalampos Stratakis wrote:
> From: "Antoine Pitrou"
>> One question: who maintains the LSB?
>>
>> The fact that the Python portion was never updated may hint that nobody
>> uses it...
>
> That could definitely be the case here. I stumbled upon that when checking shebang requirements on Fedora
> and apparently every distro has a sort of meta package that adheres to those standards. In Fedora's case [0].
> > I don't have a good answer on who maintains it or even how compliant some distros are, but I was wondering
> if that topic came up beforehand and if any requirements were placed from either side.

My impression while working for Red Hat was that LSB ended up being one of those bureaucratic standards that sprawled so far beyond being a minimal system, while still leaving core capabilities that real world apps rely on underspecified, that compatibility and compliance testing became sufficiently painful that folks that cared about certifications started certifying a handful of major stable distros instead (with a common modern selection being Ubuntu LTS, Debian Stable, RHEL/CentOS, and SLES).

https://en.wikipedia.org/wiki/Linux_Standard_Base seems to back up that impression, with neither Debian nor Ubuntu claiming LSB support at all these days.

Given the rise of Flatpak, Snappy, and Linux containers in general, it may make sense to suggest that LSB drop Python entirely (similar to what they did for Java, albeit for different reasons), and instead recommend that portable applications requiring Python bundle their own interpreter.

Cheers, Nick.
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Wed Jun 27 10:39:38 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 28 Jun 2018 00:39:38 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B33936E.8080309@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> Message-ID: On 27 June 2018 at 23:38, Greg Ewing wrote: > Nick Coghlan wrote: >> >> actually made those semantics available as an explicit >> "parentlocal NAME" declaration ...: >> >> def _list_comp(_outermost_iter): >> parentlocal item >> _result = [] >> for x in _outermost_iter: >> item = x >> _result.append(x) >> return _result >> >> _expr_result = _list_comp(items) > > > I'm not sure that's possible. If I understand correctly, > part of the definition of "parent local" is that "parent" > refers to the nearest enclosing *non-comprehension* scope, > to give the expected result for nested comprehensions. > If that's so, then it's impossible to fully decouple its > definition from comprehensions. I'm OK with a target scope declaration construct having lexical-scope-dependent behaviour - exactly what "nonlocal NAME" will do depends on both the nature of the current scope, and on which names are declared as local in which outer scopes, and that's also implicitly the case for all name lookups. However, PEP 572 in its current form takes the position "parent local scoping is sufficiently useful to make it a required pre-requisite for adding assignment expressions, but not useful enough to expose as a new scope declaration primitive", and I've come to the view that it really is the "A+B=MAGIC!" 
aspect of the current proposal that bothers me, whereas "A+B implies C for " doesn't bother me any more than the implicit non-local references introduced as part of the original lexical scoping changes bother me. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From vano at mail.mipt.ru Wed Jun 27 10:52:16 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Wed, 27 Jun 2018 17:52:16 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627134948.GF14437@ando.pearwood.info> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> Message-ID: <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> On 27.06.2018 16:49, Steven D'Aprano wrote: > On Wed, Jun 27, 2018 at 08:00:20AM -0400, Eric V. Smith wrote: >> On 6/27/2018 7:08 AM, Chris Angelico wrote: >>> It gets funnier with nested loops. Or scarier. I've lost the ability >>> to distinguish those two. >>> >>> def test(): >>> spam = 1 >>> ham = 2 >>> vars = [key1+key2 for key1 in locals() for key2 in locals()] >>> return vars >>> >>> Wanna guess what that's gonna return? >> I'm not singling out Chris here, but these discussions would be easier >> to follow and more illuminating if the answers to such puzzles were >> presented when they're posed. > You can just copy and paste the function into the interactive > interpreter and run it :-) > > But where's the fun in that? The point of the exercise is to learn first > hand just how complicated it is to try to predict the *current* scope > behaviour of comprehensions. Without the ability to perform assignment > inside them, aside from the loop variable, we've managed to avoid > thinking too much about this until now. 
> > It also demonstrates the unrealisticness of treating comprehensions as a > separate scope -- they're hybrid scope, with parts of the comprehension > running in the surrounding local scope, and parts running in an sublocal > scope. > > Earlier in this thread, Nick tried to justify the idea that > comprehensions run in their own scope, no matter how people think of > them -- but that's an over-simplification, as Chris' example above > shows. Parts of the comprehension do in fact behave exactly as the naive > model would suggest (even if Nick is right that other parts don't). > > As complicated and hairy as the above example is, (1) it is a pretty > weird thing to do, so most of us will almost never need to consider it; > and (2) backwards compatibility requires that we live with it now (at > least unless we introduce a __future__ import). > > If we can't simplify the scope of comprehensions, we can at least > simplify the parts that actually matters. What matters are the loop > variables (already guaranteed to be sublocal and not "leak" out of the > comprehension) and the behaviour of assignment expressions (open to > discussion). > > Broadly speaking, there are two positions we can take: > > 1. Let the current implementation of comprehensions as an implicit > hidden function drive the functionality; that means we duplicate the > hairiness of the locals() behaviour seen above, although it won't be > obvious at first glance. > > What this means in practice is that assignments will go to different > scopes depending on *where* they are in the comprehension: > > [ expr for x in iter1 for y in iter2 if cond ...] > [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] > > Assignments in the section marked "AAAAAA" will be in the local scope; > assignments in the BBBBBB sections will be in the sublocal scope. That's > not too bad, up to the point you try to assign to the same name in > AAAAAA and BBBBBB. 
And then you are likely to get confusing hard to > debug UnboundLocalErrors. This isn't as messy as you make it sound if you remember that the outermost iterable is evaluated only once at the start and all the others -- each iteration. Anyone using comprehensions has to know this fact. The very readable syntax also makes it rather straightforward (though admittedly requiring some hand-tracing) to figure out what is evaluated after what. > > 2. Or we can keep the current behaviour for locals and the loop > variables, but we can keep assignment expressions simple by ensuring > they always bind to the enclosing scope. Compared to the complexity of > the above, we have the relatively straight forward: > > [ AAAAAA for x in AAAAAA for y in AAAAAA if AAAAAA ...] > > The loop variables continue to be hidden away in the invisible, implicit > comprehension function, where they can't leak out, while explicit > assignments to variables (using := or given or however it is spelled) > will always go into the surrounding local scope, like they do in every > other expression. > > Does it matter that the implementation of this requires an implicit > nonlocal declaration for each assignment? No more than it matters that > comprehensions themselves require an implicit function. > > And what we get out of this is simpler semantics at the Python level: > > - Unless previous declared global, assignment expressions always bind to > the current scope, even if they're inside a comprehension; > > - and we don't have to deal with the oddity that different bits of a > comprehension run in different scopes (unless we go out of our way to > use locals()); merely using assignment expressions will just work > consistently and simply, and loop variables will still be confined to > the comprehension as they are now. 
> > -- Regards, Ivan

From p.f.moore at gmail.com Wed Jun 27 11:23:49 2018
From: p.f.moore at gmail.com (Paul Moore)
Date: Wed, 27 Jun 2018 16:23:49 +0100
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz>
Message-ID:

On 27 June 2018 at 15:39, Nick Coghlan wrote:
> However, PEP 572 in its current form takes the position "parent local
> scoping is sufficiently useful to make it a required pre-requisite for
> adding assignment expressions, but not useful enough to expose as a
> new scope declaration primitive", and I've come to the view that it
> really is the "A+B=MAGIC!" aspect of the current proposal that bothers
> me, whereas "A+B implies C for " doesn't bother me
> any more than the implicit non-local references introduced as part of
> the original lexical scoping changes bother me.

From my reading, PEP 572 takes the position that "parent local scoping" is what people expect from assignment expressions *in comprehensions* and it's useful enough that there is no reason not to make that the behaviour. The behaviour isn't generally useful enough to be worth exposing as a primitive (it's not even useful enough for the PEP to give it an explicit name!) so it's just a special case for assignment expressions in comprehensions/generators.

That seems to me like a classic example of practicality beating purity.
Paul From tim.peters at gmail.com Wed Jun 27 12:20:53 2018 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 27 Jun 2018 11:20:53 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B33936E.8080309@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> Message-ID: [Nick Coghlan]> > actually made those semantics available as an explicit > "parentlocal NAME" declaration ...: > > > > def _list_comp(_outermost_iter): > > parentlocal item > > _result = [] > > for x in _outermost_iter: > > item = x > > _result.append(x) > > return _result > > > > _expr_result = _list_comp(items) > [Greg Ewing] I'm not sure that's possible. If I understand correctly, > part of the definition of "parent local" is that "parent" > refers to the nearest enclosing *non-comprehension* scope, > to give the expected result for nested comprehensions. > If that's so, then it's impossible to fully decouple its > definition from comprehensions. > > Nick's "parentlocal" does refer to the parent, but makes no distinction between synthesized and user-written functions. If the parent has a matching parentlocal declaration for the same name then the original really refers to the grandparent - and so on. Ultimately, it resolves to the closest enclosing scope in which the name is _not_ declared parentlocal. In that scope, a "nonlocal" or "global" declaration settles it if one appears, else the name is local to that scope. So a nested comprehension would declare its assignment expression targets as parentlocal in its synthesized function, and in all the containing synthesized functions generated for containing comprehensions. This appears in some strained ;-) way "natural" only because there is no explicit way to declare something "local" in Python. 
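Spelled out by hand for a doubly-nested case (a sketch only: today's nonlocal stands in for the hypothetical "parentlocal" chain, and the helper names are invented):

```python
# Hand conversion of  [[(y := x) for x in range(3)] for _ in range(2)]
# with y intended to bind in f's scope: each synthesized function
# forwards the name outward, so the chain resolves to the closest
# enclosing scope without a "parentlocal" declaration.
def f():
    y = None                    # the scope y ultimately resolves to
    def _outer(_it):
        nonlocal y              # "parentlocal y" in the outer comprehension
        _result = []
        for _ in _it:
            def _inner(_it2):
                nonlocal y      # "parentlocal y" in the inner comprehension
                _r = []
                for x in _it2:
                    y = x
                    _r.append(y)
                return _r
            _result.append(_inner(iter(range(3))))
        return _result
    return _outer(iter(range(2))), y

print(f())  # ([[0, 1, 2], [0, 1, 2]], 2)
```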
In just about any other language with closures and nested lexical scopes, comprehensions and generator expressions would have been implemented via nested functions that explicitly declared their "for" target names "local", and nothing else. The only change needed then for PEP 572 (b) semantics would be to declare assignment expression target names local (if their scope wasn't already known) in the closest containing non-synthesized block.

None of which really matters. The real question is which semantics are desired.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From tim.peters at gmail.com Wed Jun 27 12:51:35 2018
From: tim.peters at gmail.com (Tim Peters)
Date: Wed, 27 Jun 2018 11:51:35 -0500
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz>
Message-ID:

[Nick Coghlan]
> However, PEP 572 in its current form takes the position "parent local
> scoping is sufficiently useful to make it a required pre-requisite for
> adding assignment expressions, but not useful enough to expose as a
> new scope declaration primitive",

Of course the PEP doesn't take that position at all: it doesn't even contain the term "parent local scoping". That's your term, which nobody else uses unless they're replying to you ;-)

What the PEP does say:

"""
an assignment expression occurring in a list, set or dict comprehension or in a generator expression (below collectively referred to as "comprehensions") binds the target in the containing scope, honoring a nonlocal or global declaration for the target in that scope, if one exists. For the purpose of this rule the containing scope of a nested comprehension is the scope that contains the outermost comprehension. A lambda counts as a containing scope.
""" It's a small collection of plainly stated rules for specifying the intended semantics. If you want to claim that this _is_ "useful enough to expose as a new scope declaration primitive", it's really on you to present use cases to justify that claim. I'd present some for you, but I don't have any (I don't care that "by hand" conversion of nested comprehensions to workalike Python nested functions may require a bit of thought to establish the intended scope of assignment expression target names - all of which is easily doable without adding any new statements). I don't _expect_ that other good use cases exist. The gimmick's purpose is to make code that visually _appears_ to belong to a block act as if embedded assignments do occur in that block. If there's an explicitly nested function, that fundamental motivation no longer applies. -------------- next part -------------- An HTML attachment was scrubbed... URL: From hervinhioslash at gmail.com Wed Jun 27 07:06:22 2018 From: hervinhioslash at gmail.com (=?UTF-8?B?SGVydsOpICJLeWxlIiBNVVRPTUJP?=) Date: Wed, 27 Jun 2018 12:06:22 +0100 Subject: [Python-Dev] Policy on refactoring/clean up Message-ID: In itself, the code clean up you have done is a good thing in the sense that you re-organized things and in my understanding, they look good now. In some of the teams I've been granted to work, there was a rule stating that whenever a dev would work on an enhancement/bugfix, he would create a separate ticket that would track all code refactoring related to his work. The strategy is quite different here and the suggestion was to have your refactoring be part of an enhancement/bugfix. The most valuable argument in favor of all the current reviewers is Guido van Rossum's comment on the effect of having git blame made harder. For some of the projects I'm currently working on, we have code cleanup teams where members are affected expressly to refactoring code. 
We might end up needing such things in the future but for now, it is wiser to leave the code as is and focus on fixing actual problems. -- *Q::Drone's Co-Founder and Co-CEO* *Herv? "Kyle" MUTOMBO* *Envoy? depuis le web.* -------------- next part -------------- An HTML attachment was scrubbed... URL: From eric at trueblade.com Wed Jun 27 15:01:03 2018 From: eric at trueblade.com (Eric V. Smith) Date: Wed, 27 Jun 2018 15:01:03 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627134948.GF14437@ando.pearwood.info> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> Message-ID: <54588082-5401-40D9-94D6-676DAAE5172E@trueblade.com> > On Jun 27, 2018, at 9:49 AM, Steven D'Aprano wrote: > >> On Wed, Jun 27, 2018 at 08:00:20AM -0400, Eric V. Smith wrote: >>> On 6/27/2018 7:08 AM, Chris Angelico wrote: >>> It gets funnier with nested loops. Or scarier. I've lost the ability >>> to distinguish those two. >>> >>> def test(): >>> spam = 1 >>> ham = 2 >>> vars = [key1+key2 for key1 in locals() for key2 in locals()] >>> return vars >>> >>> Wanna guess what that's gonna return? >> >> I'm not singling out Chris here, but these discussions would be easier >> to follow and more illuminating if the answers to such puzzles were >> presented when they're posed. > > You can just copy and paste the function into the interactive > interpreter and run it :-) Not on my phone when I?m riding a bus, I can?t. I?m trying to more or less follow the discussion, but the ?guess what this will do? aspect of the discussion makes it hard. Eric > > But where's the fun in that? The point of the exercise is to learn first > hand just how complicated it is to try to predict the *current* scope > behaviour of comprehensions. 
Without the ability to perform assignment > inside them, aside from the loop variable, we've managed to avoid > thinking too much about this until now. > > It also demonstrates the unrealisticness of treating comprehensions as a > separate scope -- they're hybrid scope, with parts of the comprehension > running in the surrounding local scope, and parts running in an sublocal > scope. > > Earlier in this thread, Nick tried to justify the idea that > comprehensions run in their own scope, no matter how people think of > them -- but that's an over-simplification, as Chris' example above > shows. Parts of the comprehension do in fact behave exactly as the naive > model would suggest (even if Nick is right that other parts don't). > > As complicated and hairy as the above example is, (1) it is a pretty > weird thing to do, so most of us will almost never need to consider it; > and (2) backwards compatibility requires that we live with it now (at > least unless we introduce a __future__ import). > > If we can't simplify the scope of comprehensions, we can at least > simplify the parts that actually matters. What matters are the loop > variables (already guaranteed to be sublocal and not "leak" out of the > comprehension) and the behaviour of assignment expressions (open to > discussion). > > Broadly speaking, there are two positions we can take: > > 1. Let the current implementation of comprehensions as an implicit > hidden function drive the functionality; that means we duplicate the > hairiness of the locals() behaviour seen above, although it won't be > obvious at first glance. > > What this means in practice is that assignments will go to different > scopes depending on *where* they are in the comprehension: > > [ expr for x in iter1 for y in iter2 if cond ...] > [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] > > Assignments in the section marked "AAAAAA" will be in the local scope; > assignments in the BBBBBB sections will be in the sublocal scope. 
That's > not too bad, up to the point you try to assign to the same name in > AAAAAA and BBBBBB. And then you are likely to get confusing hard to > debug UnboundLocalErrors. > > > 2. Or we can keep the current behaviour for locals and the loop > variables, but we can keep assignment expressions simple by ensuring > they always bind to the enclosing scope. Compared to the complexity of > the above, we have the relatively straight forward: > > [ AAAAAA for x in AAAAAA for y in AAAAAA if AAAAAA ...] > > The loop variables continue to be hidden away in the invisible, implicit > comprehension function, where they can't leak out, while explicit > assignments to variables (using := or given or however it is spelled) > will always go into the surrounding local scope, like they do in every > other expression. > > Does it matter that the implementation of this requires an implicit > nonlocal declaration for each assignment? No more than it matters that > comprehensions themselves require an implicit function. > > And what we get out of this is simpler semantics at the Python level: > > - Unless previous declared global, assignment expressions always bind to > the current scope, even if they're inside a comprehension; > > - and we don't have to deal with the oddity that different bits of a > comprehension run in different scopes (unless we go out of our way to > use locals()); merely using assignment expressions will just work > consistently and simply, and loop variables will still be confined to > the comprehension as they are now. 
> > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/eric%2Ba-python-dev%40trueblade.com From guido at python.org Wed Jun 27 18:31:42 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 27 Jun 2018 15:31:42 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: So IIUC you are okay with the behavior described by the PEP but you want an explicit language feature to specify it? I don't particularly like adding a `parentlocal` statement to the language, because I don't think it'll be generally useful. (We don't have `goto` in the language even though it could be used in the formal specification of `if`, for example. :-) But as a descriptive mechanism to make the PEP's spec clearer I'm fine with it. Let's call it `__parentlocal` for now. It would work a bit like `nonlocal` but also different, since in the normal case (when there's no matching `nonlocal` in the parent scope) it would make the target a local in that scope rather than trying to look for a definition of the target name in surrounding (non-class, non-global) scopes. Also if there's a matching `global` in the parent scope, `__parentlocal` itself changes its meaning to `global`. If you want to push a target through several level of target scopes you can do that by having a `__parentlocal` in each scope that it should push through (this is needed for nested comprehensions, see below). 
Given that definition of `__parentlocal`, in first approximation the scoping rule proposed by PEP 572 would then be: In comprehensions (which in my use in the PEP 572 discussion includes generator expressions) the targets of inline assignments are automatically endowed with a `__parentlocal` declaration, except inside the "outermost iterable" (since that already runs in the parent scope). There would have to be additional words when comprehensions themselves are nested (e.g. `[[a for a in range(i)] for i in range(10)]`) since the PEP's intention is that inline assignments anywhere there end up targeting the scope containing the outermost comprehension. But this can all be expressed by adding `__parentlocal` for various variables in various places (including in the "outermost iterable" of inner comprehensions). I'd also like to keep the rule prohibiting use of the same name as a comprehension loop control variable and as an inline assignment target; this rule would also prohibit shenanigans with nested comprehensions (for any set of nested comprehensions, any name that's a loop control variable in any of them cannot be an inline assignment target in any of them). This would also apply to the "outermost iterable". Does this help at all, or did I miss something? --Guido On Wed, Jun 27, 2018 at 5:27 AM Nick Coghlan wrote: > On 26 June 2018 at 02:27, Guido van Rossum wrote: > > [This is my one reply in this thread today. I am trying to limit the > amount > > of time I spend to avoid another overheated escalation.] > > Aye, I'm trying to do the same, and deliberately spending some > evenings entirely offline is helping with that :) > > > On Mon, Jun 25, 2018 at 4:44 AM Nick Coghlan wrote: > >> > >> Right, the proposed blunt solution to "Should I use 'NAME = EXPR' or > >> 'NAME := EXPR'?" bothers me a bit, but it's the implementation > >> implications of parent local scoping that I fear will create a > >> semantic tar pit we can't get out of later. 
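[Editorial note: the rule Guido mentions, forbidding the same name as both a comprehension loop control variable and an assignment-expression target, did make it into Python 3.8, where it is rejected at compile time; a quick check:]

```python
# Python 3.8+ refuses to compile a comprehension whose iteration
# variable is also rebound with :=, per the rule discussed here.
src = "[i := 1 for i in range(3)]"
try:
    compile(src, "<demo>", "eval")
    outcome = "compiled"
except SyntaxError:
    outcome = "SyntaxError"
print(outcome)  # SyntaxError
```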
> > > Others have remarked this too, but it really bothers me that you are > focusing > so much on the implementation of parent local scoping rather than on the > > "intuitive" behavior which is super easy to explain -- especially to > someone > > who isn't all that familiar (or interested) with the implicit scope > created > > for the loop control variable(s). According to Steven (who noticed that > this > > is barely mentioned in most tutorials about comprehensions) that is most > > people, however very few of them read python-dev. > > > > It's not that much work for the compiler, since it just needs to do a > little > > bit of (new) static analysis and then it can generate the bytecode to > > manipulate closure(s). The runtime proper doesn't need any new > > implementation effort. The fact that sometimes a closure must be > introduced > > where no explicit initialization exists is irrelevant to the runtime -- > this > > only affects the static analysis, at runtime it's no different than if > the > > explicit initialization was inside `if 0`. > > One of the things I prize about Python's current code generator is how > many of the constructs can be formulated as simple content-and-context > independent boilerplate removal, which is why parent local scoping (as > currently defined in PEP 572) bothers me: rather than being a new > primitive in its own right, the PEP instead makes the notion of "an > assignment expression in a comprehension or generator expression" a > construct that can't readily be decomposed into lower level building > blocks the way that both assignment expressions on their own and > comprehensions and generator expressions on their own can be. Instead, > completely new language semantics arise from the interaction between > two otherwise independent features.
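[Editorial note: the expansion Nick describes in this message can be written out and run by hand inside a function; the helper names `_list_comp` and `item` are illustrative only. This mirrors how `[item := x for x in items]` behaves on Python 3.8+:]

```python
def demo(items):
    # [item := x for x in items] expands, roughly, to:
    if 0:
        item = None        # dead code that still makes `item` a local of demo()
    def _list_comp(_outermost_iter):
        nonlocal item      # the implicit "parent local" binding
        _result = []
        for x in _outermost_iter:
            item = x
            _result.append(x)
        return _result
    return _list_comp(items), item

print(demo(range(3)))  # ([0, 1, 2], 2)
```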
> > Even changes as complicated as PEP 343's with statement, PEP 380's > yield from, and PEP 492's native coroutines all include examples of > how they could be written *without* the benefit of the new syntax. > > By contrast, PEP 572's parent local scoping can't currently be defined > that way. Instead, to explain how the code generator is going to be > expected to handle comprehensions, you have to take the current > comprehension semantics and add two new loops to link up the bound > names correctly:: > > [item := x for x in items] > > becomes: > > # Each bound name gets declared as local in the parent scope > if 0: > for item in (): pass > def _list_comp(_outermost_iter): > # Each bound name gets declared as: > # - nonlocal if outer scope is a function scope > # - global item if outer scope is a module scope > # - an error, otherwise > _result = [] > for x in _outermost_iter: > _result.append(x) > return _result > > _expr_result = _list_comp(items) > > This is why my objections would be reduced significantly if the PEP > explicitly admitted that it was defining a new kind of scoping > semantics, and actually made those semantics available as an explicit > "parentlocal NAME" declaration (behind a "from __future__ import > parent_locals" guard), such that the translation of the above example > to an explicitly nested scope could just be the visually > straightforward:: > > def _list_comp(_outermost_iter): > parentlocal item > _result = [] > for x in _outermost_iter: > item = x > _result.append(x) > return _result > > _expr_result = _list_comp(items) > > That splits up the learning process for anyone trying to really > understand how this particular aspect of Python's code generation > works into two distinct pieces: > > - "assignment expressions inside comprehensions and generator > expressions use parent local scoping" > - "parent local scoping works " > > If the PEP did that, we could likely even make parent locals work > sensibly for classes by saying that 
"parent local" for a method > definition in a class body refers to the closure namespace where we > already stash __class__ references for the benefit of zero-arg super > (this would also be a far more robust way of defining private class > variables than name mangling is able to offer). > > Having parent locals available as a language level concept (rather > than solely as an interaction between assignment expressions and > implicitly nested scopes) also gets us to a point where > context-independent code thunks that work both at module level and > inside another function can be built as nested functions which declare > all their working variables as parentlocal (you still need to define > the thunks inline in the scope you want them to affect, since this > isn't dynamic scoping, but when describing the code, you don't need to > say "as a module level function define it this way, as a nested > function define it that way"). > > An explicit "parentlocal NAME" concept at the PEP 572 layer would also > change the nature of the draft "given" proposal from competing with > PEP 572, to instead being a follow-up proposal that focused on > providing control of target name declarations in lambda expressions, > comprehensions, and generator expressions such that: > > - (lambda arg: value := arg given parentlocal value) # Exports "value" > to parent scope > - any(x for x in items given parentlocal x) # Exports "x" to parent scope > - [y for x in data if (y := f(x)) given y] # *Avoids* exporting "y" to > parent scope > > With parent local scoping in the mix the proposed "given" syntax could > also dispense with initialiser and type hinting support entirely and > instead only allow: > > - "... given NAME" (always local, no matter the default scoping) > - "... given parentlocal NAME" (always parent local, declaring if > necessary) > - "... given nonlocal NAME" (always nonlocal, error if not declared in > outer scope) > - "... 
given global NAME" (always global, no matter how nested the > current scope is) > - "... given (TARGET1, TARGET2, ...)" (declaring multiple assignment > targets) > > If you want an initialiser or a type hint, then you'd use parentlocal > semantics. If you want to keep names local (e.g. to avoid exporting > them as part of a module's public API) then you can do that, too. > > >> Unfortunately, I think the key rationale for (b) is that if you > >> *don't* do something along those lines, then there's a different > >> strange scoping discrepancy that arises between the non-comprehension > >> forms of container displays and the comprehension forms: > >> > >> (NAME := EXPR,) # Binds a local > >> tuple(NAME := EXPR for __ in range(1)) # Doesn't bind a local > >> [...] > >> Those scoping inconsistencies aren't *new*, but provoking them > >> currently involves either class scopes, or messing about with > >> locals(). > > > > In what sense are they not new? This syntax doesn't exist yet. > > The simplest way to illustrate the scope distinction today is with > "len(locals())": > > >>> [len(locals()) for i in range(1)] > [2] > >>> [len(locals())] > [7] > > But essentially nobody ever does that, so the distinction doesn't > currently matter. > > By contrast, where assignment expressions bind their targets matters a > *lot*, so PEP 572 makes the existing scoping oddities a lot more > significant. > > > You left out another discrepancy, which is more likely to hit people in > the > > face: according to your doctrine, := used in the "outermost iterable" > would > > create a local in the containing scope, since that's where the outermost > > iterable is evaluated. So in this example > > > > a = [x := i+1 for i in range(y := 2)] > > > > the scope of x would be the implicit function (i.e. it wouldn't leak) > while > > the scope of y would be the same as that of a. (And there's an even more > > cryptic example, where the same name is assigned in both places.) 
> > Yeah, the fact it deals with this problem nicely is one aspect of the > parent local scoping that I find genuinely attractive. > > >> Parent local scoping tries to mitigate the surface inconsistency by > >> changing how write semantics are defined for implicitly nested scopes, > >> but that comes at the cost of making those semantics inconsistent with > >> explicitly nested scopes and with the read semantics of implicitly > >> nested scopes. > > > > > > Nobody thinks about write semantics though -- it's simply not the right > > abstraction to use here, you've introduced it because that's how *you* > think > > about this. > > The truth of the last part of that paragraph means that the only way > for the first part of it to be true is to decide that my way of > thinking is *so* unusual that nobody else in the 10 years that Python > 3 has worked the way it does now has used the language reference, the > source code, the disassembler, or the debugger to formulate a similar > mental model of how they expect comprehensions and generator > expressions to behave. > > I'll grant that I may be unusual in thinking about comprehensions and > generator expressions the way I do, and I definitely accept that most > folks simply don't think about the subtleties of how they handle > scopes in the first place, but I *don't* accept the assertion that I'm > unique in thinking about them that way. There are simply too many edge > cases in their current runtime behaviour where the "Aha!" moment at > the end of a debugging effort is going to be the realisation that > they're implemented as an implicitly nested scope, and we've had a > decade of Python 3 use where folks prone towards writing overly clever > comprehensions have been in a position to independently make that > discovery. 
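[Editorial note: a common way people make the discovery Nick describes is the class-body gotcha. The outermost iterable is evaluated in the class scope, but the comprehension body runs in an implicit function scope, which, like any nested function, cannot see class attributes:]

```python
try:
    class C:
        size = 3
        # range(size) works: the outermost iterable is evaluated in the
        # class scope.  The body's `size` lookup happens in the implicit
        # function scope and falls through to the (empty) global scope.
        values = [size * i for i in range(size)]
    outcome = "ok"
except NameError:
    outcome = "NameError"
print(outcome)  # NameError
```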
> > >> The early iterations of PEP 572 tried to duck this whole realm of > >> potential semantic inconsistencies by introducing sublocal scoping > > > There was also another variant in some iteration or PEP 572, after > sublocal > > scopes were already eliminated -- a change to comprehensions that would > > evaluate the innermost iterable in the implicit function. This would make > > the explanation of inline assignment in comprehensions consistent again > > (they were always local to the comprehension in that iteration of the > PEP), > > at the cost of a backward incompatibility that was ultimately withdrawn. > > Yeah, the current "given" draft has an open question around the idea > of having the presence of a "given" clause pull the outermost iterable > evaluation inside the nested scope. It still doesn't really solve the > problem, though, so I think I'd actually consider > PEP-572-with-explicit-parent-local-scoping-support the version of > assignment expressions that most cleanly handles the interaction with > comprehension scopes without making that interaction rely on opaque > magic (instead, it would be relying on an implicit target scope > declaration, the same as any other name binding - the only unusual > aspect is that the implicit declaration would be "parentlocal NAME" > rather than the more typical local variable declaration). > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steve at pearwood.info Wed Jun 27 18:42:50 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Thu, 28 Jun 2018 08:42:50 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> Message-ID: <20180627224250.GJ14437@ando.pearwood.info> On Wed, Jun 27, 2018 at 05:52:16PM +0300, Ivan Pozdeev via Python-Dev wrote: > >What this means in practice is that assignments will go to different > >scopes depending on *where* they are in the comprehension: > > > > [ expr for x in iter1 for y in iter2 if cond ...] > > [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] > > > >Assignments in the section marked "AAAAAA" will be in the local scope; > >assignments in the BBBBBB sections will be in the sublocal scope. That's > >not too bad, up to the point you try to assign to the same name in > >AAAAAA and BBBBBB. And then you are likely to get confusing hard to > >debug UnboundLocalErrors. > > This isn't as messy as you make it sound if you remember that the > outermost iterable is evaluated only once at the start and all the > others -- each iteration. The question isn't *how often* they are evaluated, or how many loops you have, but *what scope* they are evaluated in. Even in a single loop comprehension, parts of it are evaluated in the local scope and parts are evaluated in an implicit sublocal scope. The overlap between the two is the trap, if you try to assign to the same variable in the loop header and then update it in the loop body. Not to mention the inconsistency that some assignments are accessible from the surrounding code: [expr for a in (x := func(), ...) 
] print(x) # works while the most useful ones, those in the body, will be locked up in an implicit sublocal scope where they are unreachable from outside of the comprehension: [x := something ... for a in sequence ] print(x) # fails -- Steve From guido at python.org Wed Jun 27 19:11:06 2018 From: guido at python.org (Guido van Rossum) Date: Wed, 27 Jun 2018 16:11:06 -0700 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: Well, with that, I am hereby accepting PEP 561. Ethan has done a tremendous job writing this PEP and implementing it, and I am sure that package and stub authors will be very glad to hear that there are now officially supported ways other than typeshed to distribute type annotations. Congrats Ethan! --Guido On Mon, Jun 25, 2018 at 12:15 PM Guido van Rossum wrote: > OK, last call! I'll accept the current draft tomorrow unless someone > pushes back. > > On Fri, Jun 22, 2018 at 8:37 AM Nick Coghlan wrote: > >> On 23 June 2018 at 01:16, Guido van Rossum wrote: >> > That sounds like you're supporting PEP 561 as is, right? >> >> Aye, I'm personally fine with it - we do need to do something about >> automatically reserving the derived names on PyPI, but I don't think >> that's a blocker for the initial PEP acceptance (instead, it will go >> the other way: PEP acceptance will drive Warehouse getting updated to >> handle the convention already being adopted by the client tools). >> >> > Excuse my >> > ignorance, but where are API testing stub interfaces described or used? >> >> They're not - it's just the context for Donald referring to "stubs" as >> being a general technical term with other meanings beyond the "type >> hinting stub file" one. 
>> >> As such, there's three parts to explaining why we're not worried about >> the terminology clash: >> >> - Ethan searched for projects called "*-stubs" or "*_stubs" and didn't >> find any, so the practical impact of any terminology clash will be low >> - there isn't an established need to automatically find testing stub >> libraries based on an existing project name the way there is for type >> hints >> - even if such a need did arise in the future, the "py.typed" marker >> file and the different file extension for stub files within a package >> still gives us an enormous amount of design flexibility >> >> Cheers, >> Nick. >> >> -- >> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia >> > > > -- > --Guido van Rossum (python.org/~guido) > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From levkivskyi at gmail.com Wed Jun 27 19:18:30 2018 From: levkivskyi at gmail.com (Ivan Levkivskyi) Date: Thu, 28 Jun 2018 00:18:30 +0100 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: Congrats Ethan, Well done! I think PEP 561 will significantly simplify typing third party modules. -- Ivan On 28 June 2018 at 00:11, Guido van Rossum wrote: > Well, with that, I am hereby accepting PEP 561. > > Ethan has done a tremendous job writing this PEP and implementing it, and > I am sure that package and stub authors will be very glad to hear that > there are now officially supported ways other than typeshed to distribute > type annotations. > > Congrats Ethan! > > --Guido > > On Mon, Jun 25, 2018 at 12:15 PM Guido van Rossum > wrote: > >> OK, last call! I'll accept the current draft tomorrow unless someone >> pushes back. >> >> On Fri, Jun 22, 2018 at 8:37 AM Nick Coghlan wrote: >> >>> On 23 June 2018 at 01:16, Guido van Rossum wrote: >>> > That sounds like you're supporting PEP 561 as is, right? 
>>> >>> Aye, I'm personally fine with it - we do need to do something about >>> automatically reserving the derived names on PyPI, but I don't think >>> that's a blocker for the initial PEP acceptance (instead, it will go >>> the other way: PEP acceptance will drive Warehouse getting updated to >>> handle the convention already being adopted by the client tools). >>> >>> > Excuse my >>> > ignorance, but where are API testing stub interfaces described or used? >>> >>> They're not - it's just the context for Donald referring to "stubs" as >>> being a general technical term with other meanings beyond the "type >>> hinting stub file" one. >>> >>> As such, there's three parts to explaining why we're not worried about >>> the terminology clash: >>> >>> - Ethan searched for projects called "*-stubs" or "*_stubs" and didn't >>> find any, so the practical impact of any terminology clash will be low >>> - there isn't an established need to automatically find testing stub >>> libraries based on an existing project name the way there is for type >>> hints >>> - even if such a need did arise in the future, the "py.typed" marker >>> file and the different file extension for stub files within a package >>> still gives us an enormous amount of design flexibility >>> >>> Cheers, >>> Nick. >>> >>> -- >>> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia >>> >> >> >> -- >> --Guido van Rossum (python.org/~guido) >> > > > -- > --Guido van Rossum (python.org/~guido) > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/ > levkivskyi%40gmail.com > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ericfahlgren at gmail.com Wed Jun 27 16:46:10 2018 From: ericfahlgren at gmail.com (Eric Fahlgren) Date: Wed, 27 Jun 2018 13:46:10 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> Message-ID: On Wed, Jun 27, 2018 at 9:27 AM Paul Moore wrote: > From my reading, PEP 572 takes the position that "parent local > scoping" is what people expect from assignment expressions *in > comprehensions* and it's useful enough that there is no reason not to > make that the behaviour. The behaviour isn't generally useful enough > to be worth exposing as a primitive (it's not even useful enough for > the PEP to give it an explicit name!) so it's just a special case for > assignment expressions in comprehensions/generators. > So, my interpretation is that it will behave like this?

x = 2
y = [x := 3 for i in range(1)]
print(x)
3

def f():
    x = 4
    y = [x := 5 for i in range(1)]
    print(x)
f()
5

class C:
    x = 6
    y = [x := 7 for i in range(1)]
    print(x)
C()
6
print(x)
7

-------------- next part -------------- An HTML attachment was scrubbed...
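[Editorial note: for the class-body case Eric asks about, the Python 3.8 implementation sidestepped the question entirely: an assignment expression inside a comprehension in a class body is rejected at compile time (initially as `TargetScopeError`, a `SyntaxError` subclass, later plain `SyntaxError`):]

```python
# Python 3.8+ rejects := inside a comprehension in a class body.
src = """
class C:
    x = 6
    y = [x := 7 for i in range(1)]
"""
try:
    compile(src, "<demo>", "exec")
    outcome = "compiled"
except SyntaxError:
    outcome = "SyntaxError"
print(outcome)  # SyntaxError
```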
URL: From greg.ewing at canterbury.ac.nz Wed Jun 27 19:31:43 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 11:31:43 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627140428.GG14437@ando.pearwood.info> References: <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <20180627154123.63446423@fsol> <20180627140428.GG14437@ando.pearwood.info> Message-ID: <5B341E5F.8050708@canterbury.ac.nz> Steven D'Aprano wrote: > The *very first* motivating example for this proposal came from a > comprehension. > > I think it is both unfortunate and inevitable that the discussion bogged > down in comprehension-hell. I think the unfortunateness started when we crossed over from talking about binding a temporary name for use *within* a comprehension or expression, to binding a name for use *outside* the comprehension or expression where it's bound. As long as it's for internal use, whether it's in a comprehension or not isn't an issue. > Tim Peters has also given a > couple of good examples of mathematical code that would benefit strongly > from this feature. > > Going back a few months now, they were the examples that tipped me over Well, I remain profoundly unconvinced that writing comprehensions with side effects is ever a good idea, and Tim's examples did nothing to change that. 
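[Editorial note: the mathematical examples referred to here were of this general flavour, carrying state across iterations with `:=`, e.g. a running total whose final value is also visible afterwards (Python 3.8+):]

```python
values = [1, 2, 3, 4]
total = 0
running = [total := total + v for v in values]  # partial sums via :=

print(running)  # [1, 3, 6, 10]
print(total)    # 10 -- the final total survives the comprehension
```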
-- Greg From vano at mail.mipt.ru Wed Jun 27 19:31:55 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 28 Jun 2018 02:31:55 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180627224250.GJ14437@ando.pearwood.info> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> <20180627224250.GJ14437@ando.pearwood.info> Message-ID: On 28.06.2018 1:42, Steven D'Aprano wrote: > On Wed, Jun 27, 2018 at 05:52:16PM +0300, Ivan Pozdeev via Python-Dev wrote: > >>> What this means in practice is that assignments will go to different >>> scopes depending on *where* they are in the comprehension: >>> >>> [ expr for x in iter1 for y in iter2 if cond ...] >>> [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] >>> >>> Assignments in the section marked "AAAAAA" will be in the local scope; >>> assignments in the BBBBBB sections will be in the sublocal scope. That's >>> not too bad, up to the point you try to assign to the same name in >>> AAAAAA and BBBBBB. And then you are likely to get confusing hard to >>> debug UnboundLocalErrors. >> This isn't as messy as you make it sound if you remember that the >> outermost iterable is evaluated only once at the start and all the >> others -- each iteration. > The question isn't *how often* they are evaluated, or how many loops you > have, but *what scope* they are evaluated in. Even in a single loop > comprehension, parts of it are evaluated in the local scope and parts > are evaluated in an implicit sublocal scope. All expressions inside the comprehension other than the initial iterable have access to the loop variables generated by the previous parts. So they are necessarily evaluated in the internal scope for that to be possible. 
Since this, too, is essential semantics that one has to know to use the construct sensibly, I kinda assumed you could make that connection... E.g.:

[(x*y) for x in range(5) if x%2 for y in range(x,5) if not (x+y)%2]
   A              B          C             D              E

C and D have access to the current x; E and A to both x and y. > > The overlap between the two is the trap, if you try to assign to the > same variable in the loop header and then update it in the loop body. > > Not to mention the inconsistency that some assignments are accessible > from the surrounding code: > > [expr for a in (x := func(), ...) ] > print(x) # works > > while the most useful ones, those in the body, will be locked up in an > implicit sublocal scope where they are unreachable from outside of the > comprehension: > > [x := something ... for a in sequence ] > print(x) # fails > > -- Regards, Ivan From ethan at stoneleaf.us Wed Jun 27 19:38:59 2018 From: ethan at stoneleaf.us (Ethan Furman) Date: Wed, 27 Jun 2018 16:38:59 -0700 Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging Type Information In-Reply-To: References: Message-ID: <5B342013.4030207@stoneleaf.us> On 06/27/2018 04:11 PM, Guido van Rossum wrote: > Well, with that, I am hereby accepting PEP 561. Congratulations, Ethan!
:) -- ~Ethan~ From vano at mail.mipt.ru Wed Jun 27 19:45:05 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 28 Jun 2018 02:45:05 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> <20180627224250.GJ14437@ando.pearwood.info> Message-ID: On 28.06.2018 1:42, Steven D'Aprano wrote: > On Wed, Jun 27, 2018 at 05:52:16PM +0300, Ivan Pozdeev via Python-Dev wrote: > >>> What this means in practice is that assignments will go to different >>> scopes depending on *where* they are in the comprehension: >>> >>>      [ expr   for x in iter1  for y in iter2  if cond   ...] >>>      [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] >>> >>> Assignments in the section marked "AAAAAA" will be in the local scope; >>> assignments in the BBBBBB sections will be in the sublocal scope. That's >>> not too bad, up to the point you try to assign to the same name in >>> AAAAAA and BBBBBB. And then you are likely to get confusing hard to >>> debug UnboundLocalErrors. >> This isn't as messy as you make it sound if you remember that the >> outermost iterable is evaluated only once at the start and all the >> others -- each iteration. > The question isn't *how often* they are evaluated, or how many loops you > have, but *what scope* they are evaluated in. Even in a single loop > comprehension, parts of it are evaluated in the local scope and parts > are evaluated in an implicit sublocal scope. All expressions inside the comprehension other than the initial iterable have access to the loop variables generated by the previous parts. So they are necessarily evaluated in the internal scope for that to be possible. Since this, too, is essential semantics that one has to know to use the construct sensibly, I kinda assumed you could make that connection... E.g.:

[(x*y) for x in range(5) if x%2 for y in range(x,5) if not (x+y)%2]
   A              B          C             D              E

C and D have access to the current x; E and A to both x and y. > > The overlap between the two is the trap, if you try to assign to the > same variable in the loop header and then update it in the loop body. > > Not to mention the inconsistency that some assignments are accessible > from the surrounding code: > >      [expr for a in (x := func(), ...) ] >      print(x)  # works > > while the most useful ones, those in the body, will be locked up in an > implicit sublocal scope where they are unreachable from outside of the > comprehension: > >      [x := something ...  for a in sequence ] >      print(x) 
# fails >> >> > -- Regards, Ivan From greg.ewing at canterbury.ac.nz Wed Jun 27 19:55:20 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 11:55:20 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> Message-ID: <5B3423E8.3080208@canterbury.ac.nz> Nick Coghlan wrote: > I'm OK with a target scope declaration construct having > lexical-scope-dependent behaviour - exactly what "nonlocal NAME" will > do depends on both the nature of the current scope, Yes, but my point is that having an explicit "parentlocal" scope declaration doesn't help to make anything more orthogonal, because there's no way it can have *exactly* the same effect as a comprehension's implicit parent-local scoping. In other words, taking a comprehension and manually expanding it into a function with parentlocal declarations wouldn't give you something exactly equivalent to the original. If that's the purpose of having an explicit parentlocal, then it fails at that purpose. If that's *not* the purpose, then I'm not really sure what the purpose is, because I can't think of a situation where I'd choose to use parentlocal instead of nonlocal with an explicit assignment in the outer scope. Except maybe for the class-scope situation, which seems like an extremely obscure reason to introduce a whole new scoping concept with its own keyword. 
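[Editorial note: the `nonlocal`-plus-explicit-assignment alternative Greg alludes to looks like this today, with no new keyword, at the cost of a pre-binding in the outer scope:]

```python
def outer():
    hit = None              # explicit pre-binding in the outer scope
    def scan(xs):
        nonlocal hit        # plain nonlocal now does what "parentlocal" would
        for x in xs:
            if x > 2:
                hit = x
    scan([1, 3, 5])
    return hit

print(outer())  # 5
```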
-- Greg From vano at mail.mipt.ru Wed Jun 27 19:57:26 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 28 Jun 2018 02:57:26 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5de53c24-c604-efd2-7072-5c8dd17dd636@mail.mipt.ru> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> <20180627224250.GJ14437@ando.pearwood.info> <5de53c24-c604-efd2-7072-5c8dd17dd636@mail.mipt.ru> Message-ID: On 28.06.2018 2:45, Ivan Pozdeev via Python-Dev wrote: > On 28.06.2018 2:31, Ivan Pozdeev via Python-Dev wrote: >> On 28.06.2018 1:42, Steven D'Aprano wrote: >>> On Wed, Jun 27, 2018 at 05:52:16PM +0300, Ivan Pozdeev via >>> Python-Dev wrote: >>> >>>>> What this means in practice is that assignments will go to different >>>>> scopes depending on *where* they are in the comprehension: >>>>> >>>>>     [ expr   for x in iter1  for y in iter2  if cond ...] >>>>>     [ BBBBBB for x in AAAAAA for y in BBBBBB if BBBBBB ...] >>>>> >>>>> Assignments in the section marked "AAAAAA" will be in the local >>>>> scope; >>>>> assignments in the BBBBBB sections will be in the sublocal scope. >>>>> That's >>>>> not too bad, up to the point you try to assign to the same name in >>>>> AAAAAA and BBBBBB. And then you are likely to get confusing hard to >>>>> debug UnboundLocalErrors. >>>> This isn't as messy as you make it sound if you remember that the >>>> outermost iterable is evaluated only once at the start and all the >>>> others -- each iteration. >>> The question isn't *how often* they are evaluated, or how many loops >>> you >>> have, but *what scope* they are evaluated in. Even in a single loop >>> comprehension, parts of it are evaluated in the local scope and parts >>> are evaluated in an implicit sublocal scope.
>> >> All expressions inside the comprehension other than the initial >> iterable have access to the loop variables generated by the previous >> parts. So they are necessarily evaluated in the internal scope for >> that to be possible. >> >> Since this, too, is an essential semantic that one has to know to use >> the construct sensibly, I kinda assumed you could make that >> connection... >> E.g.: >> >> [(x*y) for x in range(5) if x%2 for y in range(x,5) if not (x+y)%2] >>    A              B          C              D               E >> >> C and D have access to the current x; E and A to both x and y. >> > This means btw that users cannot rely on there being a single internal > scope, or a scope at all. > The public guarantee is only the access to the loop variables (and, > with the PEP, additional variables from assignments), of the current > iteration, generated by the previous parts. > The expressions in the comprehension just somehow automagically determine which of the variables are internal and which are local. How they do that is an implementation detail. And the PEP doesn't need to (and probably shouldn't) make guarantees here other than where the variables from expressions are promised to be accessible. >>> >>> The overlap between the two is the trap, if you try to assign to the >>> same variable in the loop header and then update it in the loop body. >>> >>> Not to mention the inconsistency that some assignments are accessible >>> from the surrounding code: >>> >>>     [expr for a in (x := func(), ...) ] >>>     print(x)  # works >>> >>> while the most useful ones, those in the body, will be locked up in an >>> implicit sublocal scope where they are unreachable from outside of the >>> comprehension: >>> >>>     [x := something ... for a in sequence ] >>>     print(x)
# fails >>> >>> >> > -- Regards, Ivan From greg.ewing at canterbury.ac.nz Wed Jun 27 20:03:32 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 12:03:32 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> References: <20180627065444.GA14437@ando.pearwood.info> <20180627091928.GD14437@ando.pearwood.info> <20180627134948.GF14437@ando.pearwood.info> <5723993a-0bd2-4be6-e3ef-19ea6d0e1092@mail.mipt.ru> Message-ID: <5B3425D4.8000403@canterbury.ac.nz> Ivan Pozdeev via Python-Dev wrote: > This isn't as messy as you make it sound if you remember that the > outermost iterable is evaluated only once at the start and all the > others -- each iteration. > Anyone using comprehensions has to know this fact. That fact alone doesn't imply anything about the *scopes* in which those iterators are evaluated, however. Currently the only situation where the scoping makes a difference is a generator expression that isn't immediately used, and you can get a long way into your Python career without ever encountering that case.
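(A sketch of that case, with illustrative names: the outermost iterable is evaluated in the enclosing scope when the genexp is created, while the body runs lazily in the genexp's own scope:

```python
# iter() of the outermost iterable is taken at creation time, so later
# rebinding of xs in the enclosing scope does not affect the genexp.
def make():
    xs = [1, 2, 3]
    g = (x * 2 for x in xs)  # xs evaluated now; body not executed yet
    xs = [10]                # rebinding xs doesn't change what g iterates
    return list(g)           # body runs only when g is consumed
```

Here the genexp still sees the original list, which is exactly where the scoping and timing rules become observable.)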
-- Greg From vano at mail.mipt.ru Wed Jun 27 20:10:40 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 28 Jun 2018 03:10:40 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B341E5F.8050708@canterbury.ac.nz> References: <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <20180627154123.63446423@fsol> <20180627140428.GG14437@ando.pearwood.info> <5B341E5F.8050708@canterbury.ac.nz> Message-ID: On 28.06.2018 2:31, Greg Ewing wrote: > Steven D'Aprano wrote: >> The *very first* motivating example for this proposal came from a >> comprehension. >> >> I think it is both unfortunate and inevitable that the discussion bogged >> down in comprehension-hell. > > I think the unfortunateness started when we crossed over from > talking about binding a temporary name for use *within* a > comprehension or expression, to binding a name for use *outside* > the comprehension or expression where it's bound. > I've shown in <05f368c2-3cd2-d7e0-9f91-27afb40d5b35 at mail.mipt.ru> (27 Jun 2018 17:07:24 +0300) that assignment expressions are fine in most use cases without any changes to scoping whatsoever. So, as Guido suggested in (26 Jun 2018 19:36:14 -0700), the scoping matter can be split into a separate PEP and discussion. > As long as it's for internal use, whether it's in a comprehension > or not isn't an issue. > >> Tim Peters has also given a couple of good examples of mathematical >> code that would benefit strongly from this feature. >> >> Going back a few months now, they were the examples that tipped me over > > Well, I remain profoundly unconvinced that writing comprehensions > with side effects is ever a good idea, and Tim's examples did > nothing to change that. 
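(The kind of non-comprehension use case meant here can be sketched as follows - Python 3.8+ syntax, with the function name and pattern purely illustrative:

```python
import re

def parse_port(line):
    # Capture the match object right in the condition, where an
    # assignment statement cannot go; no scoping changes are involved.
    if (m := re.match(r"port=(\d+)", line)) is not None:
        return int(m.group(1))
    return None
```

`m` is an ordinary local of `parse_port`, exactly as if it had been assigned on the preceding line.)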
-- Regards, Ivan From greg.ewing at canterbury.ac.nz Wed Jun 27 20:16:41 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 12:16:41 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> Message-ID: <5B3428E9.8000601@canterbury.ac.nz> Tim Peters wrote: > If the parent has a > matching parentlocal declaration for the same name then the original > really refers to the grandparent - and so on. Ah, I missed that part, sorry -- I withdraw that particular objection. Still, this seems like a major addition (seeing as it comes with a new keyword) whose justification is very little more than "it makes explaining comprehension scopes easier". -- Greg From tim.peters at gmail.com Wed Jun 27 20:29:52 2018 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 27 Jun 2018 19:29:52 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B3423E8.3080208@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> <5B3423E8.3080208@canterbury.ac.nz> Message-ID: > [Nick Coghlan] > I'm OK with a target scope declaration construct having > > lexical-scope-dependent behaviour - exactly what "nonlocal NAME" will > > do depends on both the nature of the current scope, > [Greg Ewing] > Yes, but my point is that having an explicit "parentlocal" scope > declaration doesn't help to make anything more orthogonal, > because there's no way it can have *exactly* the same effect > as a comprehension's implicit parent-local scoping. > Sure it can - but I already explained that.
This is the analogy to "nonlocal" Nick is making: neither "nonlocal" nor "parentlocal" tells you which scope a declared name _does_ belong to. Instead they both say "it's not this scope" and specify algorithms you can follow to determine the scope to which the name does belong. "parentlocal" isn't an accurate name because the owning scope may not be the parent block at all, and it may even be a synonym for "global". I think "by hand" translations of nested comprehensions into nested functions are clearer _without_ the "parentlocal" invention - then you have to be explicit about what the context requires. Nick hates that because it isn't uniform. I like that because I don't want to pretend a non-uniform thing is uniform ;-) The only real use case here is synthesizing nested functions to implement comprehensions/genexps. In other words, taking a comprehension and manually expanding > it into a function with parentlocal declarations wouldn't > give you something exactly equivalent to the original. > If that's the purpose of having an explicit parentlocal, > then it fails at that purpose. > You can add (a sufficient number of) parentlocal declarations to get the precise intended semantics. Then again, that can also be done today (without the "parentlocal" invention). > > If that's *not* the purpose, then I'm not really sure what > the purpose is, because I can't think of a situation where > I'd choose to use parentlocal instead of nonlocal with an > explicit assignment in the outer scope. > For example, if the name is declared "global" in the outer scope, you'll get a compile-time error if you try to declare it "nonlocal" in the contained scope. "parentlocal" adjusts its meaning accordingly, becoming a synonym for "global" in that specific case. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From greg.ewing at canterbury.ac.nz Wed Jun 27 19:44:19 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 28 Jun 2018 11:44:19 +1200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <05f368c2-3cd2-d7e0-9f91-27afb40d5b35@mail.mipt.ru> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <05f368c2-3cd2-d7e0-9f91-27afb40d5b35@mail.mipt.ru> Message-ID: <5B342153.8050303@canterbury.ac.nz> Ivan Pozdeev via Python-Dev wrote: > for me, the primary use case for an assignment expression > is to be able to "catch" a value into a variable in places where I can't > put an assignment statement in, like the infamous `if re.match() is not > None'. This seems to be one of only about two uses for assignment expressions that gets regularly brought up. The other is the loop-and-a-half, which is already adequately addressed by iterators. So maybe instead of introducing an out-of-control sledgehammer in the form of ":=", we could think about addressing this particular case. 
Like maybe adding an "as" clause to if-statements: if pattern.match(s) as m: do_something_with(m) -- Greg From tim.peters at gmail.com Wed Jun 27 20:44:42 2018 From: tim.peters at gmail.com (Tim Peters) Date: Wed, 27 Jun 2018 19:44:42 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B3428E9.8000601@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B33936E.8080309@canterbury.ac.nz> <5B3428E9.8000601@canterbury.ac.nz> Message-ID: [Tim] > If the parent has a matching parentlocal declaration for the same > name then the original > > really refers to the grandparent - and so on. > [Greg] > Ah, I missed that part, sorry -- I withdraw that particular > objection. > Good! I have another reply that crossed in the mail. > Still, this seems like a major addition (seeing as it comes > with a new keyword) whose justification is very little more > than "it makes explaining comprehension scopes easier". > > I agree - it has no other sane use case I can see, and "parentlocal" isn't _needed_ to capture the intended semantics in by-hand translations of comprehensions. I don't even think it makes "explaining" easier. It doesn't eliminate any corner cases, it just pushes them into the definition of what "parentlocal" means. What it would do is make writing synthetic functions "by hand" to implement comprehensions more uniform, because "parentlocal" would handle the corner cases by itself instead of making the programmer figure out when and where they need to type "nonlocal", "global", and/or cruft to establish a name as local to a block in which the name otherwise doesn't appear as a binding target. But to the extent that doing such translations by hand is meant to be "educational", it's more educational to learn how to do that stuff yourself.
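(For illustration, one such by-hand translation - the helper name is made up, and this is the simple case with no assignment expressions involved:

```python
# [x * y for x in range(3) for y in range(2)] expanded into an explicit
# function: the outermost iterable is evaluated by the caller, and
# everything else lives in the synthesized function's own local scope.
def listcomp(outermost):
    result = []
    for x in outermost:
        for y in range(2):
            result.append(x * y)
    return result

l = listcomp(range(3))  # the caller evaluates range(3) in its own scope
```

Making `x`, `y`, or an assignment-expression target visible outside `listcomp` is exactly where the "nonlocal"/"global"/cruft decisions mentioned above come in.)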
-------------- next part -------------- An HTML attachment was scrubbed... URL: From nad at python.org Wed Jun 27 20:58:03 2018 From: nad at python.org (Ned Deily) Date: Wed, 27 Jun 2018 20:58:03 -0400 Subject: [Python-Dev] Python 3.7.0 is now available! (and so is 3.6.6) Message-ID: <6FF553CD-6580-4939-A5E4-78143633BF1F@python.org> On behalf of the Python development community and the Python 3.7 release team, we are pleased to announce the availability of Python 3.7.0. Python 3.7.0 is the newest feature release of the Python language, and it contains many new features and optimizations. You can find Python 3.7.0 here: https://www.python.org/downloads/release/python-370/ Most third-party distributors of Python should be making 3.7.0 packages available soon. See the "What's New In Python 3.7" document (https://docs.python.org/3.7/whatsnew/3.7.html) for more information about features included in the 3.7 series. Detailed information about the changes made in 3.7.0 can be found in its change log. Maintenance releases for the 3.7 series will follow at regular intervals starting in July of 2018. We hope you enjoy Python 3.7! P.S. We are also happy to announce the availability of Python 3.6.6, the next maintenance release of Python 3.6: https://www.python.org/downloads/release/python-366/ Thanks to all of the many volunteers who help make Python Development and these releases possible! Please consider supporting our efforts by volunteering yourself or through organization contributions to the Python Software Foundation.
https://www.python.org/psf/ -- Ned Deily nad at python.org -- [] From vano at mail.mipt.ru Wed Jun 27 21:29:46 2018 From: vano at mail.mipt.ru (Ivan Pozdeev) Date: Thu, 28 Jun 2018 04:29:46 +0300 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <5B342153.8050303@canterbury.ac.nz> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <05f368c2-3cd2-d7e0-9f91-27afb40d5b35@mail.mipt.ru> <5B342153.8050303@canterbury.ac.nz> Message-ID: <85903285-8821-8483-8303-587be09f2844@mail.mipt.ru> On 28.06.2018 2:44, Greg Ewing wrote: > Ivan Pozdeev via Python-Dev wrote: >> for me, the primary use case for an assignment expression is to be >> able to "catch" a value into a variable in places where I can't put >> an assignment statement in, like the infamous `if re.match() is not >> None'. > > This seems to be one of only about two uses for assignment > expressions that gets regularly brought up. The other is > the loop-and-a-half, which is already adequately addressed > by iterators. > > So maybe instead of introducing an out-of-control sledgehammer > in the form of ":=", we could think about addressing this > particular case. > > Like maybe adding an "as" clause to if-statements: > >    if pattern.match(s) as m: >      
do_something_with(m) > I've skimmed for the origins of "as" (which I remember seeing maybe even before Py3 was a thing) and found this excellent analysis of modern languages which is also part of the PEP 572 discussion: https://mail.python.org/pipermail/python-ideas/2018-May/050920.html It basically concludes that most recently-created languages do not have assignment expressions; they rather allow assignment statement(s?) before the tested expression in block statements (only if/while is mentioned. `for' is not applicable because its exit condition in Python is always the iterable's exhaustion, there's nothing in it that could be used as a variable). It, however, doesn't say anything about constructs that are not block statements but are equivalent to them, like the ternary operator. (In comprehensions, filter conditions are the bits equivalent to if/while statements.) -- Regards, Ivan From rosuav at gmail.com Wed Jun 27 21:57:45 2018 From: rosuav at gmail.com (Chris Angelico) Date: Thu, 28 Jun 2018 11:57:45 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <85903285-8821-8483-8303-587be09f2844@mail.mipt.ru> References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> <5B302990.8060807@canterbury.ac.nz> <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru> <5B316E07.1060305@canterbury.ac.nz> <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru> <5B33903A.3060104@canterbury.ac.nz> <05f368c2-3cd2-d7e0-9f91-27afb40d5b35@mail.mipt.ru> <5B342153.8050303@canterbury.ac.nz> <85903285-8821-8483-8303-587be09f2844@mail.mipt.ru> Message-ID: On Thu, Jun 28, 2018 at 11:29 AM, Ivan Pozdeev via Python-Dev wrote: > On 28.06.2018 2:44, Greg Ewing wrote: >> >> Ivan Pozdeev via Python-Dev wrote: >>> >>> for me, the primary use case for an assignment expression is to be able >>> to "catch" a value into a variable in places where
I can't put an assignment >>> statement in, like the infamous `if re.match() is not None'. >> >> >> This seems to be one of only about two uses for assignment >> expressions that gets regularly brought up. The other is >> the loop-and-a-half, which is already adequately addressed >> by iterators. >> >> So maybe instead of introducing an out-of-control sledgehammer >> in the form of ":=", we could think about addressing this >> particular case. >> >> Like maybe adding an "as" clause to if-statements: >> >> if pattern.match(s) as m: >> do_something_with(m) >> > > I've skimmed for the origins of "as" (which I remember seeing maybe even > before Py3 was a thing) and found this excellent analysis of modern > languages which is too a part of the PEP 572 discussion: > https://mail.python.org/pipermail/python-ideas/2018-May/050920.html > > It basically concludes that most recently-created languages do not have > assignment expressions; they rather allow assignment statement(s?) before > the tested expression in block statements (only if/while is mentioned. `for' > is not applicable because its exit condition in Python is always the > iterable's exhaustion, there's nothing in it that could be used as a > variable). > Now read this response. https://mail.python.org/pipermail/python-ideas/2018-May/050938.html ChrisA From tim.peters at gmail.com Thu Jun 28 01:38:08 2018 From: tim.peters at gmail.com (Tim Peters) Date: Thu, 28 Jun 2018 00:38:08 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: [Guido] > .. 
> Given that definition of `__parentlocal`, in first approximation the > scoping rule proposed by PEP 572 would then be: In comprehensions > (which in my use in the PEP 572 discussion includes generator > expressions) the targets of inline assignments are automatically > endowed with a `__parentlocal` declaration, except inside the > "outermost iterable" (since that already runs in the parent scope). If this has to be done ;-) , I suggest removing that last exception. That is, "[all] targets of inline assignments in comprehensions are declared __parentlocal", period, should work fine for (b). In case one appears in the outermost iterable of the outermost comprehension, I believe such declaration is merely semantically redundant, not harmful. Where "redundant" means someone is so familiar with the implementation that the scope implications of "already runs in the parent scope" are immediately clear. For someone muddy about that, it would be a positive help to have the intent clarified by removing the exception. Plus 99% of the point of "parentlocal" seemed to be to allow mindless ("uniform") by-hand translation of nested comprehensions to nested Python functions, and an exception for the outermost iterable would work against that intent. -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.barker at noaa.gov Thu Jun 28 00:52:43 2018 From: chris.barker at noaa.gov (Chris Barker) Date: Wed, 27 Jun 2018 21:52:43 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) Message-ID: On Wed, Jun 27, 2018 at 12:01 PM, Eric V. Smith wrote: > >>> def test(): > >>> spam = 1 > >>> ham = 2 > >>> vars = [key1+key2 for key1 in locals() for key2 in locals()] > >>> return vars > >>> > >>> Wanna guess what that's gonna return? > Guessing aside, messing around with locals() isn't really helpful for the usual case of common code. 
But the real problem I see with all of this is the distinction between generator expressions and comprehensions: """ an assignment expression occurring in a list, set or dict comprehension or in a generator expression (below collectively referred to as "comprehensions") binds the target in the containing scope, honoring a nonlocal or global declaration for the target in that scope, if one exists. For the purpose of this rule the containing scope of a nested comprehension is the scope that contains the outermost comprehension. A lambda counts as a containing scope. """ It seems everyone agrees that scoping rules should be the same for generator expressions and comprehensions, which is a good reason for python3's non-leaking comprehensions:

(py2)

In [5]: i = 0

In [6]: l = [i for i in range(3)]

In [7]: i
Out[7]: 2

In [8]: i = 0

In [9]: g = (i for i in range(3))

In [10]: i
Out[10]: 0

In [11]: for j in g:
    ...:     pass
    ...:

In [12]: i
Out[12]: 0

so comprehensions and generator expressions behave differently -- not great.

(py3)

In [4]: i = 0

In [5]: l = [i for i in range(3)]

In [6]: i
Out[6]: 0

In [7]: g = (i for i in range(3))

In [8]: i
Out[8]: 0

In [9]: list(g)
Out[9]: [0, 1, 2]

In [10]: i
Out[10]: 0

The loop name doesn't "leak" and comprehensions and generator expressions are the same in this regard -- nice. So what about: l = [x:=i for i in range(3)] vs g = (x:=i for i in range(3)) Is there any way to keep these consistent if the "x" is in the regular local scope? Note that this thread is titled "Informal educator feedback on PEP 572". As an educator -- this is looking harder and harder to explain to newbies... Though easier if any assignments made in a "comprehension" don't "leak out". Which does not mean that we'd need a "proper" new local scope (i.e. locals() returning something new) -- as long as the common usage was "intuitive".
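(For comparison, a sketch of how the two spellings behave under the PEP's proposed parent-local scoping, as later implemented in Python 3.8 - the names are illustrative:

```python
# Both targets bind in the containing scope; the difference is *when*,
# because the genexp body only runs once the genexp is consumed.
def demo():
    l = [x := i for i in range(3)]  # body runs immediately: x bound now
    after_list = x                   # x == 2 already
    g = (y := i for i in range(3))  # y is NOT bound yet at this point
    list(g)                          # consuming g runs the body
    return after_list, y             # now y == 2 as well
```

So the two forms end up consistent in *where* the name lands, just not in *when* it lands there.)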
>> I'm not singling out Chris here, but these discussions would be easier > >> to follow and more illuminating if the answers to such puzzles were > >> presented when they're posed. > well, I think the point there was that it wasn't obvious without running the code -- and that point is made regardless of the answer. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker at noaa.gov -------------- next part -------------- An HTML attachment was scrubbed... URL: From tim.peters at gmail.com Thu Jun 28 02:31:04 2018 From: tim.peters at gmail.com (Tim Peters) Date: Thu, 28 Jun 2018 01:31:04 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: [Chris Barker] > ... > So what about: > > l = [x:=i for i in range(3)] > > vs > > g = (x:=i for i in range(3)) > > Is there any way to keep these consistent if the "x" is in the regular local scope? I'm not clear on what the question is. The list comprehension would bind ` l ` to [0, 1, 2] and leave the local `x` bound to 2. The second example binds `g` to a generator object, which just sits there unexecuted. That has nothing to do with the PEP, though. If you go on to do, e.g., l = list(g) then, same as the listcomp, `l` will be bound to [0, 1, 2] and the local `x` will be left bound to 2. The only real difference is in _when_ the `x:=i for i in range(3)` part gets executed. There's no new twist here due to the PEP. Put a body B in a listcomp and any side effects due to executing B happen right away, but put B in a genexp and they don't happen until you force the genexp to yield results. For example, do you think these two are "consistent" today? l = [print(i) for i in range(3)] g = (print(i) for i in range(3)) ? 
If so, nothing essential changes by replacing "print(i)" with "x := i" - in either case the side effects happen when the body is executed. But if you don't think they're already consistent, then nothing gets less consistent either ;-) -------------- next part -------------- An HTML attachment was scrubbed... URL: From vstinner at redhat.com Thu Jun 28 06:00:06 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 28 Jun 2018 12:00:06 +0200 Subject: [Python-Dev] Python 3.7.0 is now available! (and so is 3.6.6) In-Reply-To: <6FF553CD-6580-4939-A5E4-78143633BF1F@python.org> References: <6FF553CD-6580-4939-A5E4-78143633BF1F@python.org> Message-ID: Hi, I updated my list of Python known vulnerabilities and the good news is that Python 3.6.6 and 3.7.0 have no known vulnerability :-) Python 3.7.0 comes with fixes for: * CVE-2018-1000117: Buffer overflow vulnerability in os.symlink on Windows * CVE-2018-1060: difflib and poplib catastrophic backtracking * Expat 2.2.3 (ex: CVE-2017-11742) * urllib FTP protocol stream injection * update zlib to 1.2.11 (CVE-2016-9840, CVE-2016-9841, CVE-2016-9842, CVE-2016-9843; only Windows and macOS are impacted) More information at: http://python-security.readthedocs.io/vulnerabilities.html Victor 2018-06-28 2:58 GMT+02:00 Ned Deily : > On behalf of the Python development community and the Python 3.7 release > team, we are pleased to announce the availability of Python 3.7.0. > Python 3.7.0 is the newest feature release of the Python language, and > it contains many new features and optimizations. You can find Python > 3.7.0 here: > > https://www.python.org/downloads/release/python-370/ > > Most third-party distributors of Python should be making 3.7.0 packages > available soon. > > See the "What's New In Python 3.7" document > (https://docs.python.org/3.7/whatsnew/3.7.html) for more information > about features included in the 3.7 series. Detailed information about > the changes made in 3.7.0 can be found in its change log.
Maintenance > releases for the 3.7 series will follow at regular intervals starting in > July of 2018. > > We hope you enjoy Python 3.7! > > P.S. We are also happy to announce the availability of Python 3.6.6, the > next maintenance release of Python 3.6: > > https://www.python.org/downloads/release/python-366/ > > Thanks to all of the many volunteers who help make Python Development > and these releases possible! Please consider supporting our efforts by > volunteering yourself or through organization contributions to the > Python Software Foundation. > > https://www.python.org/psf/ > > -- > Ned Deily > nad at python.org -- [] > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com From vstinner at redhat.com Thu Jun 28 06:56:08 2018 From: vstinner at redhat.com (Victor Stinner) Date: Thu, 28 Jun 2018 12:56:08 +0200 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: Congrats Nick on your 100 emails thread! You won a virtual piece of cake: 2018-06-22 16:22 GMT+02:00 Nick Coghlan : > On 22 June 2018 at 02:26, Antoine Pitrou wrote: >> Indeed. But, for a syntax addition such as PEP 572, I think it would be >> a good idea to ask their opinion to teaching/education specialists. >> >> As far as I'm concerned, if teachers and/or education specialists were >> to say PEP 572 is not a problem, my position would shift from negative >> towards neutral. > > I asked a handful of folks at the Education Summit the next day about it: > > * for the basic notion of allowing expression level name binding using > the "NAME := EXPR" notation, the reactions ranged from mildly negative > (I read it as only a "-0" rather than a "-1") to outright positive.
> * for the reactions to my description of the currently proposed parent > local scoping behaviour in comprehensions, I'd use the word > "horrified", and feel I wasn't overstating the response :) > > While I try to account for the fact that I implemented the current > comprehension semantics for the 3.x series, and am hence biased > towards considering them the now obvious interpretation, it's also the > case that generator expressions have worked like nested functions > since they were introduced in Python 2.4 (more than 13 years ago now), > and comprehensions have worked the same way as generator expressions > since Python 3.0 (which has its 10th birthday coming up in December > this year). > > This means that I take any claims that the legacy Python 2.x > interpretation of comprehension behaviour is intuitively obvious with > an enormous grain of salt - for the better part of a decade now, every > tool at a Python 3 user's disposal (the fact that the iteration > variable is hidden from the current scope, reading the language > reference [1], printing out locals(), using the dis module, stepping > through code in a debugger, writing their own tracing function, and > even observing the quirky interaction with class scopes) will have > nudged them towards the "it's a hidden nested function" interpretation > of expected comprehension behaviour. > > Acquiring the old mental model for the way comprehensions work pretty > much requires a developer to have started with Python 2.x themselves > (perhaps even before comprehensions and lexical closures were part of > the language), or else have been taught the Python 2 comprehension > model by someone else - there's nothing in Python 3's behaviour to > encourage that point of view, and plenty of > functional-language-inspired documentation to instead encourage folks > to view comprehensions as tightly encapsulated declarative container > construction syntax. 
>
> I'm currently working on a concept proposal at
> https://github.com/ncoghlan/peps/pull/2 that's much closer to PEP 572
> than any of my previous `given` based suggestions: for already
> declared locals, it devolves to being the same as PEP 572 (except that
> expressions are allowed as top level statements), but for any names
> that haven't been previously introduced, it prohibits assigning to a
> name that doesn't already have a defined scope, and instead relies on
> a new `given` clause on various constructs that allows new target
> declarations to be introduced into the current scope (such that "if
> x := f():" implies "x" is already defined as a target somewhere else in
> the current scope, while "if x := f() given x:" potentially introduces
> "x" as a new local target the same way a regular assignment statement
> does).
>
> One of the nicer features of the draft proposal is that if all you
> want to do is export the iteration variable from a comprehension, you
> don't need to use an assignment expression at all: you can just append
> "... given global x" or "... given nonlocal x" and export the
> iteration variable directly to the desired outer scope, the same way
> you can in the fully spelled out nested function equivalent.
>
> Cheers,
> Nick.
>
> [1] From https://docs.python.org/3.0/reference/expressions.html#displays-for-lists-sets-and-dictionaries:
> 'Note that the comprehension is executed in a separate scope, so names
> assigned to in the target list don't "leak" in the enclosing scope.'
> --
> Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/vstinner%40redhat.com

From devel at baptiste-carvello.net Thu Jun 28 08:05:26 2018
From: devel at baptiste-carvello.net (Baptiste Carvello)
Date: Thu, 28 Jun 2018 14:05:26 +0200
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <5B341E5F.8050708@canterbury.ac.nz>
References: <20180624145204.GN14437@ando.pearwood.info>
 <5B302990.8060807@canterbury.ac.nz>
 <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru>
 <5B316E07.1060305@canterbury.ac.nz>
 <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru>
 <5B33903A.3060104@canterbury.ac.nz>
 <20180627154123.63446423@fsol>
 <20180627140428.GG14437@ando.pearwood.info>
 <5B341E5F.8050708@canterbury.ac.nz>
Message-ID: <4e75c6bf-4eb1-cdab-ebc1-7de087543ea2@baptiste-carvello.net>

On 28/06/2018 at 01:31, Greg Ewing wrote:
> Well, I remain profoundly unconvinced that writing comprehensions
> with side effects is ever a good idea, and Tim's examples did
> nothing to change that.

Comprehensions with side effects feel scary indeed. But I could see
myself using some variant of the "cumsum" example (for scientific work
at the command prompt):

>>> x=0; [x:=x+i for i in range(5)]

Here the side effects are irrelevant, the "x" variable won't be reused.
But it needs to be initialized at the start of the comprehension.

I would happily get rid of the side-effects, but then what would be a
non-cryptic alternative to the above example?
Baptiste

From chris.barker at noaa.gov Thu Jun 28 11:35:13 2018
From: chris.barker at noaa.gov (Chris Barker - NOAA Federal)
Date: Thu, 28 Jun 2018 11:35:13 -0400
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

Sent from my iPhone

>>> So what about:
>>>
>>> l = [x:=i for i in range(3)]
>>>
>>> vs
>>>
>>> g = (x:=i for i in range(3))
>>>
>>> Is there any way to keep these consistent if the "x" is in the regular local scope?
>
> I'm not clear on what the question is. The list comprehension would
> bind `l` to [0, 1, 2] and leave the local `x` bound to 2. The second
> example binds `g` to a generator object, which just sits there
> unexecuted. That has nothing to do with the PEP, though.
>
> If you go on to do, e.g.,
>
> l = list(g)
>
> then, same as the listcomp, `l` will be bound to [0, 1, 2] and the
> local `x` will be left bound to 2.

OK, it has been said that the priority is that

    list(a_gen_expression)

behave the same as

    [the_same_expression]

So we're good there. And maybe it's correct that leaving the running of
the gen_exp till later is pretty uncommon, particularly for newbies, but:

If the execution of the gen_exp is put off, it really confuses things -
that name being changed would happen at some arbitrary time, and at
least in theory, the gen_exp could be passed off to somewhere else in
the code, and be run or not run completely remotely from where the name
is used.

So while this is technically the same as the comprehension, it is not
the same as a generator function, which does get its own scope. And we
should be clear on how it will work - after all, in py2, the looping
name was handled differently in gen_exps vs comprehensions.

So I think a local scope for all comprehension-like things would be the
way to go.

But getting back to the original thread topic - python has a number of
places where you can only use expressions -
adding the ability to bind a name in all these places complicates the
language significantly.

> Put a body B in a listcomp and any side effects due to executing B

Maybe it's just me, but re-binding a name seems like a whole new
category of side effect.

-CHB

From tim.peters at gmail.com Thu Jun 28 12:28:10 2018
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 28 Jun 2018 11:28:10 -0500
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

[Chris Barker]
>>> So what about:
>>>
>>> l = [x:=i for i in range(3)]
>>>
>>> vs
>>>
>>> g = (x:=i for i in range(3))
>>>
>>> Is there any way to keep these consistent if the "x" is in the regular local scope?

[Tim]
>> I'm not clear on what the question is. The list comprehension would
>> bind `l` to [0, 1, 2] and leave the local `x` bound to 2. The second
>> example binds `g` to a generator object, which just sits there
>> unexecuted. That has nothing to do with the PEP, though.
>>
>> If you go on to do, e.g.,
>>
>> l = list(g)
>>
>> then, same as the listcomp, `l` will be bound to [0, 1, 2] and the local `x` will
>> be left bound to 2.

[Chris]
> OK, it has been said that the priority is that
>
> list(a_gen_expression)
>
> behave the same as
>
> [the_same_expression]

That's certainly desirable.

> So we're good there. And maybe it's correct that leaving the running
> of the gen_exp till later is pretty uncommon, particularly for newbies,

Common or not, I have no idea why anyone would write a genexp like the
one you gave, except to contrive an example of silly behavior exhibited
by silly code ;-) It's really not interesting to me to make up code as
goofy as you can conceive of - the interesting questions are about
plausible code (including plausible coding errors).

> but:
>
> If the execution of the gen_exp is put off, it really confuses things
> -
> that name being changed would happen at some arbitrary time, and at
> least in theory, the gen_exp could be passed off to somewhere else in
> the code, and be run or not run completely remotely from where the
> name is used.

Sure.

> So while this is technically the same as the comprehension, it is not
> the same as a generator function which does get its own scope.

It is the same as a generator function with appropriate scope
declarations - a generator expression is, after all, implemented _by_ a
nested generator function. You can write a workalike to your code above
today, but nobody worries about that because nobody does that ;-)

    def f():
        def bashx(outermost):
            nonlocal x
            for i in outermost:
                x = i
                yield i
        x = 12
        g = bashx(range(3))
        print("x before", x)
        L = list(g)
        print("L", L)
        print("x after", x)

Then calling `f()` prints:

    x before 12
    L [0, 1, 2]
    x after 2

> And we should be clear how it will work - after all, in py2, the
> looping name was handled differently in gen_exp vs comprehensions.

The PEP specifies the semantics. If it's accepted, that will be folded
into the docs.

> So I think a local scope for all comprehension-like things would be
> the way to go.
>
> But getting back to the original thread topic - python has a number of
> places that you can only use expressions - adding the ability to bind
> a name in all these places complicates the language significantly.

Did adding ternary `if` (truepart if expression else falsepart)
complicate the language significantly? Python has rarely expanded the
number of expression forms, but whenever it has the sky didn't actually
fall despite earnest warnings that disaster was inevitable ;-)

>> Put a body B in a listcomp and any side effects due to executing B

> Maybe it's just me, but re-binding a name seems like a whole new
> category of side effect.

With no trickery at all, you've always been able to rebind attributes,
and mutate containers, in comprehensions and genexps.
Because `for` targets aren't limited to plain names; e.g.,

    g = (x+y for object.attribute, a[i][j] in zip(range(3), range(3)))

is already "legal", and will stomp all over the complex `for` targets
when executed - there's nothing "local" about them. But nobody worries
about that because nobody does stuff like that. And as in my goofy code
above, mucking with binding of plain names is also possible today.
Indeed, straightforward if that's what you _want_ to do. But nobody
does. It's just not one of Python's goals to make it impossible to
write useless code ;-)

From tjreedy at udel.edu Thu Jun 28 13:18:36 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 28 Jun 2018 13:18:36 -0400
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <4e75c6bf-4eb1-cdab-ebc1-7de087543ea2@baptiste-carvello.net>
References: <20180624145204.GN14437@ando.pearwood.info>
 <5B302990.8060807@canterbury.ac.nz>
 <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru>
 <5B316E07.1060305@canterbury.ac.nz>
 <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru>
 <5B33903A.3060104@canterbury.ac.nz>
 <20180627154123.63446423@fsol>
 <20180627140428.GG14437@ando.pearwood.info>
 <5B341E5F.8050708@canterbury.ac.nz>
 <4e75c6bf-4eb1-cdab-ebc1-7de087543ea2@baptiste-carvello.net>
Message-ID:

On 6/28/2018 8:05 AM, Baptiste Carvello wrote:
> On 28/06/2018 at 01:31, Greg Ewing wrote:
>> Well, I remain profoundly unconvinced that writing comprehensions
>> with side effects is ever a good idea, and Tim's examples did
>> nothing to change that.
>
> Comprehensions with side effects feel scary indeed. But I could see
> myself using some variant of the "cumsum" example (for scientific work
> at the command prompt):
>
>>>> x=0; [x:=x+i for i in range(5)]

Creating an unneeded list with a comprehension purely for side effects
is considered a bad idea by many.
    x = 0
    for i in range(5):
        x += i

> Here the side effects are irrelevant, the "x" variable won't be reused.

If we ignore the side effect on x, the above is equivalent to 'pass' ;-)
Perhaps you meant

    x = 0
    cum = [x:=x+i for i in range(5)]

which is equivalent to

    x, cum = 0, []
    for i in range(5):
        x += i; cum.append(x)

> But it needs to be initialized at the start of the comprehension.
>
> I would happily get rid of the side-effects, but then what would be a
> non-cryptic alternative to the above example?

The above as likely intended can also be written

    import itertools as it
    cum = list(it.accumulate(range(5)))

We have two good existing alternatives to the proposed innovation.

--
Terry Jan Reedy

From chris.barker at noaa.gov Thu Jun 28 18:42:49 2018
From: chris.barker at noaa.gov (Chris Barker)
Date: Thu, 28 Jun 2018 15:42:49 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

On Thu, Jun 28, 2018 at 9:28 AM, Tim Peters wrote:

>>>> g = (x:=i for i in range(3))

> Common or not, I have no idea why anyone would write a genexp like the
> one you gave, except to contrive an example of silly behavior exhibited
> by silly code ;-)

yes, it was a contrived example, but the simplest one I could think of
off the top of my head that re-bound a name in the loop -- which was
what I thought was the entire point of this discussion?

If we think hardly anyone is ever going to do that -- then I guess it
doesn't matter how it's handled.

>> So while this is technically the same as the comprehension, it is not
>> the same as a generator function which does get its own scope.
>
> It is the same as a generator function with appropriate scope
> declarations - a generator expression is, after all, implemented _by_ a
> nested generator function.
> You can write a workalike to your code above today, but nobody
> worries about that because nobody does that ;-)
>
> def f():
>     def bashx(outermost):
>         nonlocal x
>         for i in outermost:
>             x = i
>             yield i

but here the keyword "nonlocal" is used -- you are clearly declaring
that you are messing with a nonlocal name here -- that is a lot more
obvious than simply using a :=

And "nonlocal" is not used that often, and when it is it's for careful
closure trickery -- I'm guessing := will be far more common. And, of
course, when a newbie encounters it, they can google it and see what it
means -- far different from seeing a := in a comprehension and
understanding (by osmosis??) that it might make changes in the local
scope.

And I don't think you can even do that with generator expressions now --
as they can only contain expressions. Which is my point -- this would
allow the local namespace to be manipulated in places it never could
before.

Maybe it's only comprehensions, and maybe it'll be rare to have a
confusing version of those, so it'll be no big deal, but this thread
started talking about educators' take on this -- and as an educator, I
think this really does complicate the language.

Python got much of its "fame" by being "executable pseudo code" -- it's
been moving farther and farther away from those roots. That's generally
a good thing, as we've gained expressiveness in exchange, but we
shouldn't pretend it isn't happening, or that this proposal doesn't
contribute to that trend.

> Did adding ternary `if` (truepart if expression else falsepart)
> complicate the language significantly?

I don't think so -- no. For two reasons:

1) the final chosen form is kind of verbose, but therefore more like
"executable pseudo code" :-) As opposed to the C version, for instance.

2) it added one new construct, that if, when someone sees it for the
first (or twenty fifth) time and doesn't understand it, they can look it
up, and find out -- and it only affects that line of code.
So adding ANYTHING does complicate the language, by simply making it a
bit larger, but some things are far more complicating than others.

> Python has rarely expanded the number of expression forms, but whenever
> it has the sky didn't actually fall despite earnest warnings that
> disaster was inevitable ;-)

Well, I've been surprised by what confused students before, and I will
be again. But I don't think there is any doubt that Python 3.7 is
notably harder to learn than Python 1.5 was...

>> Maybe it's just me, but re-binding a name seems like a whole new
>> category of side effect.
>
> With no trickery at all, you've always been able to rebind attributes,
> and mutate containers, in comprehensions and genexps. Because `for`
> targets aren't limited to plain names; e.g.,
>
> g = (x+y for object.attribute, a[i][j] in zip(range(3), range(3)))

sure, but you are explicitly using the names "object" and "a" here -- so
while side effects in comprehensions are discouraged, it's not really a
surprise that namespaces specifically named are changed.

and this:

    In [55]: x = 0
    In [56]: [x for x in range(3)]
    Out[56]: [0, 1, 2]
    In [57]: x
    Out[57]: 0

doesn't change x in the local scope -- if that was a good idea, why is
it a good idea to have := in a comprehension affect the local scope??

But maybe it is just me.

> And as in my goofy code above, mucking with binding of plain names is
> also possible today. Indeed, straightforward if that's what you _want_
> to do. But nobody does.
>
> It's just not one of Python's goals to make it impossible to write
> useless code ;-)

I suppose we need to go back and look at the "real" examples of
where/how folks think they'll use := in comprehensions, and see how
confusing it may be.

One of these conversations was started with an example something like
this:

    [(f(x), g(f(x))) for x in an_iterable]

The OP didn't like having to call f() twice.
So that would become:

    [(temp := f(x), g(temp)) for x in an_iterable]

so now the question is: should "temp" be created / changed in the
enclosing local scope?

This sure looks a lot like letting the iteration name (x in this
example) leak out -- so I'd say no. And I don't think this kind of
thing would be rare.

Someone mentioned that one problem with letting the iteration name leak
out is that people tend to use short, common names, like "i" -- I'm
thinking that would also be the case for this kind of temp variable.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker at noaa.gov

From greg.ewing at canterbury.ac.nz Thu Jun 28 19:11:25 2018
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 29 Jun 2018 11:11:25 +1200
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To: <4e75c6bf-4eb1-cdab-ebc1-7de087543ea2@baptiste-carvello.net>
References: <20180624145204.GN14437@ando.pearwood.info>
 <5B302990.8060807@canterbury.ac.nz>
 <222b030b-6f6b-1644-6f74-8e7b6d2b9857@mail.mipt.ru>
 <5B316E07.1060305@canterbury.ac.nz>
 <74a4002f-7496-b2e0-4ecb-e4ceaf7f1e51@mail.mipt.ru>
 <5B33903A.3060104@canterbury.ac.nz>
 <20180627154123.63446423@fsol>
 <20180627140428.GG14437@ando.pearwood.info>
 <5B341E5F.8050708@canterbury.ac.nz>
 <4e75c6bf-4eb1-cdab-ebc1-7de087543ea2@baptiste-carvello.net>
Message-ID: <5B356B1D.1080204@canterbury.ac.nz>

Baptiste Carvello wrote:
>>>> x=0; [x:=x+i for i in range(5)]
>
> what would be a
> non-cryptic alternative to the above example?
Personally I wouldn't insist on trying to do it with a comprehension at
all, but if forced to come up with a readable syntax for that, it would
probably be something like

    [x for i in range(5) letting x = x + i given x = 0]

--
Greg

From tjreedy at udel.edu Thu Jun 28 20:28:13 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 28 Jun 2018 20:28:13 -0400
Subject: [Python-Dev] We now have C code coverage!
In-Reply-To:
References:
Message-ID:

On 6/24/2018 5:03 AM, Ammar Askar wrote:
>> Is it possible, given that we are not paying for those reports, to
>> customize the 'exclude_lines' definitions?
>
> Do you want to exclude python code or C code?

Python code.

> For Python code, coverage.py also has some comments you can
> put down to exclude lines:
> http://coverage.readthedocs.io/en/coverage-4.2/excluding.html

Yes, by default, one can use '# pragma: no cover' and, if one uses the
--branch flag, '# pragma: no branch'. For more 'advanced exclusion',
one can use the following, normally in .coveragerc.

    [report]
    exclude_lines = ...

"This is useful if you have often-used constructs to exclude that can be
matched with a regex. You can exclude them all at once without littering
your code with exclusion pragmas."

For IDLE's test suite, I use a customized .coveragerc. I strongly
prefer to not abandon that and litter the code with # pragmas.

In order to make sense of the coverage report and have it be truthful,
one needs to know what options are being used. Is the --branch flag
set? Is .coveragerc or some other configuration file in use? If so,
what is the content? Do we have any control over the use and content of
exclusion settings?
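[For readers unfamiliar with the mechanism Terry describes: the exclusion regexes live in the [report] section of .coveragerc, and once exclude_lines is overridden the default pragma must be re-listed. A minimal sketch - the patterns here are illustrative examples, not IDLE's actual configuration:]

```ini
# .coveragerc - hypothetical example, not IDLE's real settings
[run]
branch = True

[report]
exclude_lines =
    # re-add the default pragma, since overriding the list drops it
    pragma: no cover
    # often-used constructs excluded by regex instead of per-line pragmas
    if __name__ == .__main__.:
    raise NotImplementedError
```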
--
Terry Jan Reedy

From tim.peters at gmail.com Thu Jun 28 23:21:01 2018
From: tim.peters at gmail.com (Tim Peters)
Date: Thu, 28 Jun 2018 22:21:01 -0500
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

[Chris]
> yes, it was a contrived example, but the simplest one I could think of off
> the top of my head that re-bound a name in the loop -- which was what I
> thought was the entire point of this discussion?

But why off the top of your head? There are literally hundreds &
hundreds of prior messages about this PEP, not to mention that you could
also find examples in the PEP. Why make up a senseless example?

> If we think hardly anyone is ever going to do that -- then I guess it
> doesn't matter how it's handled.

So look at real examples. One that's been repeated at least a hundred
times wants a local to "leak into" a listcomp:

    total = 0
    cumsums = [total := total + value for value in data]

As an educator, how are you going to explain that blowing up with
UnboundLocalError instead? Do you currently teach that comprehensions
and genexps are implemented via invisible magically generated lexically
nested functions? If not, you're going to have to start for people to
even begin to make sense of UnboundLocalError if `total` _doesn't_ "leak
into" that example. My belief is that just about everyone who doesn't
know "too much" about the current implementation will be astonished &
baffled if that example doesn't "just work".

In other cases it's desired that targets "leak out":

    while any(n % (divisor := p) == 0 for p in small_primes):
        n //= divisor

And in still other cases no leaking (neither in nor out) is desired.

Same as `for` targets in that way, but in the opposite direction: they
don't leak and there's no way to make them leak, not even when that's
wanted.
Which _is_ wanted in the last example above, which would be clearer
still written as:

    while any(n % p == 0 for p in small_primes):
        n //= p

But that ship has sailed.

> ...
> And "nonlocal" is not used that often, and when it is it's for careful closure
> trickery -- I'm guessing := will be far more common.

My guess (recorded in the PEP's Appendix A) is that assignment
expressions _overall_ will be used more often than ternary `if` but
significantly less often than augmented assignment. I expect their use
in genexps and comprehensions will be minimal. There are real use cases
for them, but the vast majority of genexps and comprehensions apparently
have no use for them at all.

> And, of course, when a newbie encounters it, they can google it and see what
> it means -- far different from seeing a := in a comprehension and understanding
> (by osmosis??) that it might make changes in the local scope.

Which relates to the above: how do you teach these things? The idea
that "a newbie" even _suspects_ that genexps and listcomps have
something to do with lexically nested scopes and invisible nested
functions strikes me as hilarious ;-)

Regardless of how assignment expressions work in listcomps and genexps,
this example (which uses neither) _will_ rebind the containing block's
`x`:

    [x := 1]

How then are you going to explain that this seemingly trivial variation
_doesn't_?

    [x := 1 for ignore in "a"]

For all the world they both appear to be binding `x` in the code block
containing the brackets. So let them.

Even worse,

    [x for ignore in range(x := 1)]

will rebind `x` in the containing block _regardless_ of how assignment
expression targets are treated in "most of" a comprehension, because the
expression defining the iterable of the outermost "for" _is_ evaluated
in the containing block (it is _not_ evaluated in the scope of the
synthetic function). That's not a special case for targets if they all
"leak", but is if they don't.
> And I don't think you can even do that with generator expressions now -- as
> they can only contain expressions.

Expressions can invoke arbitrary functions, which in turn can do
anything whatsoever.

> Which is my point -- this would allow the local namespace to be manipulated
> in places it never could before.

As above, not true. However, it would make it _easier_ to write
senseless code mucking with the local namespace - if that's what you
want to do.

> Maybe it's only comprehensions, and maybe it'll be rare to have a confusing
> version of those, so it'll be no big deal, but this thread started talking about
> educators' take on this -- and as an educator, I think this really does
> complicate the language.

I'll grant that it certainly doesn't simplify the language ;-)

> Python got much of its "fame" by being "executable pseudo code" -- it's been
> moving farther and farther away from those roots. That's generally a good thing,
> as we've gained expressiveness in exchange, but we shouldn't pretend it isn't
> happening, or that this proposal doesn't contribute to that trend.

I didn't say a word about that one way or the other. I mostly agree,
but at the start Guido was aiming to fill a niche between shell
scripting languages and C. It was a very "clean" language from the
start, but not aimed at beginners. Thanks to his experience working on
ABC, it carried over some key ideas that were beginner-friendly, though.

I view assignment expressions as being aimed at much the same audience
as augmented assignments: experienced programmers who already know the
pros and cons from vast experience with them in a large number of other
widely used languages. That's also a key Python audience.

> ...
> Well, I've been surprised by what confused students before, and I will be
> again. But I don't think there is any doubt that Python 3.7 is notably
> harder to learn than Python 1.5 was...

Absolutely.
It doesn't much bother me, though - at this point the language and its
widely used libraries are so sprawling that I doubt anyone is fluent in
all of it. That's a sign of worldly success.

> ...
> and this:
>
> In [55]: x = 0
> In [56]: [x for x in range(3)]
> Out[56]: [0, 1, 2]
> In [57]: x
> Out[57]: 0
>
> doesn't change x in the local scope --

In Python 3, yes; in Python 2 it rebinds `x` to 2.

> if that was a good idea, why is it a good idea
> to have := in a comprehension affect the local scope??

Because you can't write a genexp or comprehension AT ALL without
specifying `for` targets, and in the overwhelming majority of genexps
and comprehensions anyone ever looked at, "leaking" of for-targets was
not wanted. "So don't let them leak" was pretty much a no-brainer for
Python 3.

But assignment expressions are NEVER required to write a genexp or
comprehension, and there are only a handful of patterns known so far in
which assignment expressions appear to be of real value in those
contexts. In at least half those patterns, leaking _is_ wanted -
indeed, essential. In the rest, leaking isn't.

So it goes. Also don't ignore other examples given before, showing how
having assignment expressions _at all_ argues for "leaking" in order to
be consistent with what assignment expressions do outside of
comprehensions and genexps.

> But maybe it is just me.

Nope. But it has been discussed so often before that this is the last
time I'm going to repeat it all again ;-)

> ...
> One of these conversations was started with an example something like this:
>
> [(f(x), g(f(x))) for x in an_iterable]
>
> The OP didn't like having to call f() twice. So that would become:
>
> [(temp := f(x), g(temp)) for x in an_iterable]
>
> so now the question is: should "temp" be created / changed in the
> enclosing local scope?
>
> This sure looks a lot like letting the iteration name (x in this
> example) leak out - so I'd say no.

In that example, right, leaking `temp` almost certainly isn't wanted.
So it goes.
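[For the record, the "leak into the containing scope" behaviour argued for here is what the accepted PEP 572 ended up specifying, so on Python 3.8+ both the cumulative-sum idiom and the `temp` example behave as described above. A sketch - the `f`/`g` definitions are hypothetical stand-ins for the ones in the example:]

```python
# Python 3.8+: a ":=" target in a comprehension binds in the containing
# scope, so it "leaks" out - by design.
total = 0
data = [1, 2, 3, 4]
cumsums = [total := total + value for value in data]
print(cumsums)  # [1, 3, 6, 10]
print(total)    # 10

# The temp-variable rewrite of the f(x)/g(f(x)) example works the same
# way: `temp` is left bound after the comprehension finishes.
def f(x):          # hypothetical stand-in
    return x * 2

def g(y):          # hypothetical stand-in
    return y + 1

an_iterable = range(3)
pairs = [(temp := f(x), g(temp)) for x in an_iterable]
print(pairs, temp)  # [(0, 1), (2, 3), (4, 5)] 4

# Two leak-free ways to write the same thing, for comparison:
pairs2 = [(fx, g(fx)) for x in an_iterable for fx in [f(x)]]
pairs3 = [(v, g(v)) for v in map(f, an_iterable)]
assert pairs == pairs2 == pairs3
```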
> And I don't think this kind of thing would be rare.

I do. It's dead easy to make up examples to "prove" anything people
like, but I'm unswayed unless examples come from real code, or are
obviously compelling.

Since we're not going to get a way to explicitly say which targets
(neither `for` nor assignment expression) do and don't leak, it's a
reasonably satisfying compromise to say that one kind never leaks and
the other kind always leaks. So pick your poison accordingly.

In the example above, note that they _could_ already do, e.g.,

    [(fx, g(fx)) for x in an_iterable for fx in [f(x)]]

Then nothing leaks (well, unless f() or g() do tricky things).

I personally wouldn't care that `temp` leaks - but then I probably would
have written that example as the shorter (& clearer to my eyes):

    [(v, g(v)) for v in map(f, an_iterable)]

to begin with.

From tjreedy at udel.edu Fri Jun 29 00:57:51 2018
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 29 Jun 2018 00:57:51 -0400
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

On 6/28/2018 11:21 PM, Tim Peters wrote:
[somewhere below] this is the last time I'm going to repeat it all again ;-)

For me, this is your most convincing exposition and summary of why the
proposal is at least ok. Thank you.
> > So look at real examples.? One that's been repeated at least a hundred > times wants a local to "leak into" a listcomp: > > total = 0 > cumsums = [total ::= total?+ value for value in data] > > As an educator, how are you going to explain that blowing up with > UnboundLocalError instead?? Do you currently teach that comprehensions > and genexps are implemented via invisible magically generated lexically > nested functions?? If not, you're going to have to start for people to > even begin to make sense of UnboundLocalError if `total` _doesn't_ "leak > into" that example.? My belief is that just about everyone who doesn't > know "too much" about the current implementation will be astonished & > baffled if that example doesn't "just work". > > In other cases it's desired that targets "leak out": > > while any(n % (divisor := p) == 0 for p in small_primes): > ? ? n //= divisor > > And in still other cases no leaking (neither in nor out) is desired. > > Same as `for` targets in that way,. but in the opposite direction:? they > don't leak and there's no way to make them leak, not even when that's > wanted.? Which _is_ wanted in the last example above, which would be > clearer still written as: > > while any(n % p == 0 for p in small_primes): > ? ? n //= p > > But that ship has sailed. > > > > ... > > And "nonlocal" is not used that often, and when it is it's for > careful closure > > trickery -- I'm guessing := will be far more common. > > My guess (recorded in the PEP's Appendix A) is that assignment > expressions _overall_ will be used more often than ternary `if` but > significantly less often than augmented assignment.? I expect their use > in genexps and comprehensions will be minimal.? There are real use cases > for them, but the vast majority of genexps and comprehensions apparently > have no use for them at all. 
> > > > And, of course, when a newbie encounters it, they can google it and > see what > > it means -- far different that seeing a := in a comprehension and > understanding > > (by osmosis??) that it might make changes in the local scope. > > Which relates to the above:? how do you teach these things?? The idea > that "a newbie" even _suspects_ that genexps and listcomps have > something to do with lexically nested scopes and invisible nested > functions strikes me as hilarious ;-) > > Regardless of how assignment expressions work in listcomps and genexps, > this example (which uses neither) _will_ rebind the containing block's `x`: > > [x := 1] > > How then are you going to explain that this seemingly trivial variation > _doesn't_? > > [x := 1 for ignore in "a"] > > For all the world they both appear to be binding `x` in the code block > containing the brackets.? So let them. > > Even worse, > > [x for ignore in range(x := 1)] > > will rebind `x` in the containing block _regardless_ of how assignment > expression targets are treated in "most of" a comprehension, because the > expression defining the iterable of the outermost "for" _is_ evaluated > in the containing block (it is _not_ evaluated in the scope of the > synthetic function). > > That's not a special case for targets if they all "leak", but is if they > don't. > > > > And? I don't think you can even do that with generator expressions > now -- as > > they can only contain expressions. > > Expressions can invoke arbitrary functions, which in turn can do > anything whatsoever. > > > Which is my point -- this would allow the local namespace to be > manipulated > > in places it never could before. > > As above, not true.? However, it would make it _easier_ to write > senseless code mucking with the local namespace - if that's what you > want to do. 
> > Maybe it's only comprehensions, and maybe it'll be rare to have a
> > confusing version of those, so it'll be no big deal, but this thread
> > started talking about educators' take on this -- and as an educator,
> > I think this really does complicate the language.
>
> I'll grant that it certainly doesn't simplify the language ;-)
>
> > Python got much of its "fame" by being "executable pseudo code" --
> > it's been moving farther and farther away from those roots. That's
> > generally a good thing, as we've gained expressiveness in exchange,
> > but we shouldn't pretend it isn't happening, or that this proposal
> > doesn't contribute to that trend.
>
> I didn't say a word about that one way or the other.  I mostly agree,
> but at the start Guido was aiming to fill a niche between shell
> scripting languages and C.  It was a very "clean" language from the
> start, but not aimed at beginners.  Thanks to his experience working on
> ABC, it carried over some key ideas that were beginner-friendly, though.
>
> I view assignment expressions as being aimed at much the same audience
> as augmented assignments:  experienced programmers who already know the
> pros and cons from vast experience with them in a large number of other
> widely used languages.  That's also a key Python audience.
>
> > ...
> > Well, I've been surprised by what confused students before, and I
> > will again. But I don't think there is any doubt that Python 3.7 is
> > notably harder to learn than Python 1.5 was...
>
> Absolutely.  It doesn't much bother me, though - at this point the
> language and its widely used libraries are so sprawling that I doubt
> anyone is fluent in all of it.  That's a sign of worldly success.
>
> > ...
> > and this:
> >
> > In [55]: x = 0
> > In [56]: [x for x in range(3)]
> > Out[56]: [0, 1, 2]
> > In [57]: x
> > Out[57]: 0
> >
> > doesn't change x in the local scope --
>
> In Python 3, yes; in Python 2 it rebinds `x` to 2.
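The IPython transcript above is easy to reproduce; in Python 3 the comprehension's loop variable lives in an implicit nested scope and never leaks:

```python
x = 0
result = [x for x in range(3)]
assert result == [0, 1, 2]
assert x == 0  # the comprehension's `x` lived in its own implicit function scope
```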
> > if that was a good idea, why is it a good idea
> > to have := in a comprehension affect the local scope?
>
> Because you can't write a genexp or comprehension AT ALL without
> specifying `for` targets, and in the overwhelming majority of genexps
> and comprehensions anyone ever looked at, "leaking" of for-targets was
> not wanted.  "So don't let them leak" was pretty much a no-brainer for
> Python 3.
>
> But assignment expressions are NEVER required to write a genexp or
> comprehension, and there are only a handful of patterns known so far in
> which assignment expressions appear to be of real value in those
> contexts.  In at least half those patterns, leaking _is_ wanted -
> indeed, essential.  In the rest, leaking isn't.
>
> So it goes.  Also don't ignore other examples given before, showing how
> having assignment expressions _at all_ argues for "leaking" in order to
> be consistent with what assignment expressions do outside of
> comprehensions and genexps.
>
> > But maybe it is just me.
>
> Nope.  But it has been discussed so often before this is the last time
> I'm going to repeat it all again ;-)
>
> > ...
> > One of these conversations was started with an example something like
> > this:
> >
> >> [(f(x), g(f(x))) for x in an_iterable]
> >
> > The OP didn't like having to call f() twice. So that would become:
> >
> >> [ (temp:=f(x), g(temp)) for x in an_iterable]
> >
> > so now the question is: should "temp" be created / changed in the
> > enclosing local scope?
> >
> > This sure looks a lot like letting the iteration name (x in this
> > example) leak out - so I'd say no.
>
> In that example, right, leaking `temp` almost certainly isn't wanted.
> So it goes.
>
> > And I don't think this kind of thing would be rare.
>
> I do.  It's dead easy to make up examples to "prove" anything people
> like, but I'm unswayed unless examples come from real code, or are
> obviously compelling.
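For what it's worth, the competing spellings from this exchange all agree on Python 3.8+ (`f`, `g`, and `an_iterable` below are stand-ins invented for illustration, not the OP's actual code):

```python
def f(x):          # stand-in for the expensive function called twice
    return x * 2

def g(v):
    return v + 1

an_iterable = [1, 2, 3]

with_walrus = [(temp := f(x), g(temp)) for x in an_iterable]          # 3.8+
with_extra_for = [(fx, g(fx)) for x in an_iterable for fx in [f(x)]]  # no walrus
with_map = [(v, g(v)) for v in map(f, an_iterable)]                   # no walrus
assert with_walrus == with_extra_for == with_map == [(2, 3), (4, 5), (6, 7)]
```

Note that only the walrus spelling leaves `temp` bound in the enclosing scope afterward, which is exactly the leak being debated.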
> Since we're not going to get a way to explicitly say which targets
> (neither `for` nor assignment expression) do and don't leak, it's a
> reasonably satisfying compromise to say that one kind never leaks and
> the other kind always leaks.  So pick your poison accordingly.
>
> In the example above, note that they _could_ already do, e.g.,
>
>     [(fx, g(fx)) for x in an_iterable for fx in [f(x)]]
>
> Then nothing leaks (well, unless f() or g() do tricky things).  I
> personally wouldn't care that `temp` leaks - but then I probably would
> have written that example as the shorter (& clearer to my eyes):
>
>     [(v, g(v)) for v in map(f, an_iterable)]
>
> to begin with.

--
Terry Jan Reedy

From brett at python.org  Fri Jun 29 09:25:07 2018
From: brett at python.org (Brett Cannon)
Date: Fri, 29 Jun 2018 10:25:07 -0300
Subject: [Python-Dev] We now have C code coverage!
In-Reply-To:
References:
Message-ID:

On Thu, Jun 28, 2018, 21:28 Terry Reedy, wrote:

> On 6/24/2018 5:03 AM, Ammar Askar wrote:
> >> Is it possible, given that we are not paying for those reports, to
> >> customize the 'exclude_lines' definitions?
> >
> > Do you want to exclude python code or C code?
>
> Python code.
>
> > For Python code, coverage.py also has some comments you can
> > put down to exclude lines:
> > http://coverage.readthedocs.io/en/coverage-4.2/excluding.html
>
> Yes, by default, one can use '# pragma: no cover' and if one uses the
> --branch flag, '# pragma: no branch'. For more 'advanced exclusion',
> one can use the following, normally in .coveragerc.
> [report]
> exclude_lines = ...
> "This is useful if you have often-used constructs to exclude that can be
> matched with a regex. You can exclude them all at once without littering
> your code with exclusion pragmas."
>
> For IDLE's test suite, I use a customized .coveragerc. I strongly
> prefer to not abandon that and litter the code with # pragmas.
> > In order to make sense of the coverage report and have it be truthful, > one needs to know what options are being used. > Is the --branch flag set? > Is .coveragerc or some other configuration file in use? > If so, what is the content? > Do we have any control over the use and content of exclusion settings? > Everything is either covered by the Travis or codecov configuration files which are both checked into the cpython repo. (I'm on vacation or else I would provide links to the files themselves.) > -- > Terry Jan Reedy > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ericsnowcurrently at gmail.com Fri Jun 29 12:05:16 2018 From: ericsnowcurrently at gmail.com (Eric Snow) Date: Fri, 29 Jun 2018 10:05:16 -0600 Subject: [Python-Dev] Python 3.7.0 is now available! (and so is 3.6.6) In-Reply-To: <6FF553CD-6580-4939-A5E4-78143633BF1F@python.org> References: <6FF553CD-6580-4939-A5E4-78143633BF1F@python.org> Message-ID: On Wed, Jun 27, 2018 at 7:05 PM Ned Deily wrote: > On behalf of the Python development community and the Python 3.7 release > team, we are pleased to announce the availability of Python 3.7.0. Thanks, Ned (and everyone), for a great job on this release! And thanks to all for yet another great Python version! -eric From status at bugs.python.org Fri Jun 29 12:09:57 2018 From: status at bugs.python.org (Python tracker) Date: Fri, 29 Jun 2018 18:09:57 +0200 (CEST) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20180629160957.6C00556868@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2018-06-22 - 2018-06-29) Python tracker at https://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 6705 ( +5) closed 39046 (+52) total 45751 (+57) Open issues with patches: 2664 Issues opened (42) ================== #33932: Calling Py_Initialize() twice now triggers a fatal error (Pyth https://bugs.python.org/issue33932 reopened by vstinner #33944: Deprecate and remove pth files https://bugs.python.org/issue33944 opened by barry #33947: Dataclasses can raise RecursionError in __repr__ https://bugs.python.org/issue33947 opened by eric.smith #33948: doc truncated lines in PDF https://bugs.python.org/issue33948 opened by Mikhail_D #33949: tests: allow to select tests using loadTestsFromName https://bugs.python.org/issue33949 opened by blueyed #33953: The DEFAULT_ENTROPY variable used to store the current default https://bugs.python.org/issue33953 opened by lig #33954: float.__format__('n') fails with _PyUnicode_CheckConsistency a https://bugs.python.org/issue33954 opened by vstinner #33955: Implement PyOS_CheckStack on macOS using pthread_get_stack*_np https://bugs.python.org/issue33955 opened by ronaldoussoren #33959: doc Remove time complexity mention from list Glossary entry https://bugs.python.org/issue33959 opened by adelfino #33960: IDLE REPL: Strange indentation https://bugs.python.org/issue33960 opened by mdk #33961: Inconsistency in exceptions for dataclasses.dataclass document https://bugs.python.org/issue33961 opened by chriscog #33962: IDLE: use ttk.spinbox https://bugs.python.org/issue33962 opened by terry.reedy #33963: IDLE macosx: add tests. 
https://bugs.python.org/issue33963 opened by terry.reedy #33964: IDLE maxosc.overrideRootMenu: remove unused menudict https://bugs.python.org/issue33964 opened by terry.reedy #33965: [Windows WSL] Fatal Python error: _Py_InitializeMainInterprete https://bugs.python.org/issue33965 opened by vstinner #33966: test_multiprocessing_spawn.WithProcessesTestPool.test_tracebac https://bugs.python.org/issue33966 opened by vstinner #33967: functools.singledispatch: Misleading exception when calling wi https://bugs.python.org/issue33967 opened by doerwalter #33968: os.makedirs and empty string https://bugs.python.org/issue33968 opened by CarlAndersson #33969: "copytree" refuses to copy to a mount point https://bugs.python.org/issue33969 opened by james_r_c_stevens #33971: os.mknod is subject to "umask" https://bugs.python.org/issue33971 opened by james_r_c_stevens #33972: AttributeError in email.message.iter_attachments() https://bugs.python.org/issue33972 opened by skrohlas #33973: HTTP request-line parsing splits on Unicode whitespace https://bugs.python.org/issue33973 opened by tburke #33974: _stringify handles quoted strings incorrectly https://bugs.python.org/issue33974 opened by gauchj #33976: Enums don't support nested classes https://bugs.python.org/issue33976 opened by edwardw #33977: [Windows] test_compileall fails randomly with PermissionError: https://bugs.python.org/issue33977 opened by vstinner #33978: logging.config.dictConfig with file handler leaks resources https://bugs.python.org/issue33978 opened by maggyero #33980: SSL Error when uploading package to your own pypi https://bugs.python.org/issue33980 opened by javidr #33982: cgi.FieldStorage doesn't parse QUERY_STRING with POST that is https://bugs.python.org/issue33982 opened by Daniel Klein #33983: unify types for lib2to3.pytree.Base.children https://bugs.python.org/issue33983 opened by jreese #33984: test_multiprocessing_forkserver leaked [1, 2, 1] memory blocks https://bugs.python.org/issue33984 opened 
by vstinner #33986: asyncio: Typo in documentation: BaseSubprocessTransport -> Sub https://bugs.python.org/issue33986 opened by kbumsik #33987: need ttk.Frame inside Toplevel(s) https://bugs.python.org/issue33987 opened by markroseman #33988: [EASY] [3.7] test_platform fails when run with -Werror https://bugs.python.org/issue33988 opened by vstinner #33989: ms.key_compare is not initialized in all pathes of list_sort_i https://bugs.python.org/issue33989 opened by johnchen902 #33990: CPPFLAGS during ./configure are not passed-through in sysconfi https://bugs.python.org/issue33990 opened by ericvw #33991: lib2to3 should parse f-strings https://bugs.python.org/issue33991 opened by skreft #33994: python build egg fails with error while compiling test cases https://bugs.python.org/issue33994 opened by sabakauser #33995: test_min_max_version in test_ssl.py fails when Python is built https://bugs.python.org/issue33995 opened by Alan.Huang #33996: Crash in gen_send_ex(): _PyErr_GetTopmostException() returns f https://bugs.python.org/issue33996 opened by vstinner #33997: multiprocessing Pool hangs in terminate() https://bugs.python.org/issue33997 opened by Erik Wolf #33998: random.randrange completely ignores the step argument when sto https://bugs.python.org/issue33998 opened by bup #34000: Document when compile returns a code object v. AST https://bugs.python.org/issue34000 opened by gsnedders Most recent 15 issues with no replies (15) ========================================== #34000: Document when compile returns a code object v. 
AST https://bugs.python.org/issue34000 #33997: multiprocessing Pool hangs in terminate() https://bugs.python.org/issue33997 #33995: test_min_max_version in test_ssl.py fails when Python is built https://bugs.python.org/issue33995 #33994: python build egg fails with error while compiling test cases https://bugs.python.org/issue33994 #33990: CPPFLAGS during ./configure are not passed-through in sysconfi https://bugs.python.org/issue33990 #33989: ms.key_compare is not initialized in all pathes of list_sort_i https://bugs.python.org/issue33989 #33987: need ttk.Frame inside Toplevel(s) https://bugs.python.org/issue33987 #33986: asyncio: Typo in documentation: BaseSubprocessTransport -> Sub https://bugs.python.org/issue33986 #33982: cgi.FieldStorage doesn't parse QUERY_STRING with POST that is https://bugs.python.org/issue33982 #33980: SSL Error when uploading package to your own pypi https://bugs.python.org/issue33980 #33972: AttributeError in email.message.iter_attachments() https://bugs.python.org/issue33972 #33971: os.mknod is subject to "umask" https://bugs.python.org/issue33971 #33969: "copytree" refuses to copy to a mount point https://bugs.python.org/issue33969 #33967: functools.singledispatch: Misleading exception when calling wi https://bugs.python.org/issue33967 #33964: IDLE maxosc.overrideRootMenu: remove unused menudict https://bugs.python.org/issue33964 Most recent 15 issues waiting for review (15) ============================================= #33998: random.randrange completely ignores the step argument when sto https://bugs.python.org/issue33998 #33997: multiprocessing Pool hangs in terminate() https://bugs.python.org/issue33997 #33990: CPPFLAGS during ./configure are not passed-through in sysconfi https://bugs.python.org/issue33990 #33988: [EASY] [3.7] test_platform fails when run with -Werror https://bugs.python.org/issue33988 #33986: asyncio: Typo in documentation: BaseSubprocessTransport -> Sub https://bugs.python.org/issue33986 #33983: unify types 
for lib2to3.pytree.Base.children https://bugs.python.org/issue33983 #33978: logging.config.dictConfig with file handler leaks resources https://bugs.python.org/issue33978 #33976: Enums don't support nested classes https://bugs.python.org/issue33976 #33974: _stringify handles quoted strings incorrectly https://bugs.python.org/issue33974 #33973: HTTP request-line parsing splits on Unicode whitespace https://bugs.python.org/issue33973 #33966: test_multiprocessing_spawn.WithProcessesTestPool.test_tracebac https://bugs.python.org/issue33966 #33964: IDLE maxosc.overrideRootMenu: remove unused menudict https://bugs.python.org/issue33964 #33961: Inconsistency in exceptions for dataclasses.dataclass document https://bugs.python.org/issue33961 #33959: doc Remove time complexity mention from list Glossary entry https://bugs.python.org/issue33959 #33936: OPENSSL_VERSION_1_1 never defined in _hashopenssl.c https://bugs.python.org/issue33936 Top 10 most discussed issues (10) ================================= #33944: Deprecate and remove pth files https://bugs.python.org/issue33944 14 msgs #33613: test_multiprocessing_fork: test_semaphore_tracker_sigint() fai https://bugs.python.org/issue33613 12 msgs #33932: Calling Py_Initialize() twice now triggers a fatal error (Pyth https://bugs.python.org/issue33932 12 msgs #33919: Expose _PyCoreConfig structure to Python https://bugs.python.org/issue33919 11 msgs #33939: Provide a robust O(1) mechanism to check for infinite iterator https://bugs.python.org/issue33939 10 msgs #33930: Segfault with deep recursion into object().__dir__ https://bugs.python.org/issue33930 7 msgs #33934: locale.getlocale() seems wrong when the locale is yet unset (p https://bugs.python.org/issue33934 7 msgs #27500: ProactorEventLoop cannot open connection to ::1 https://bugs.python.org/issue27500 6 msgs #33968: os.makedirs and empty string https://bugs.python.org/issue33968 6 msgs #33927: Allow json.tool to have identical infile and outfile 
https://bugs.python.org/issue33927 5 msgs Issues closed (50) ================== #7060: test_multiprocessing dictionary changed size errors and hang https://bugs.python.org/issue7060 closed by vstinner #14117: Turtledemo: exception and minor glitches. https://bugs.python.org/issue14117 closed by terry.reedy #18932: Optimize selectors.EpollSelector.modify() https://bugs.python.org/issue18932 closed by giampaolo.rodola #20934: test_multiprocessing is broken by design https://bugs.python.org/issue20934 closed by vstinner #22051: Turtledemo: stop reloading demos https://bugs.python.org/issue22051 closed by terry.reedy #24033: Update _test_multiprocessing.py to use script helpers https://bugs.python.org/issue24033 closed by vstinner #24546: sequence index bug in random.choice https://bugs.python.org/issue24546 closed by rhettinger #24567: random.choice IndexError due to double-rounding https://bugs.python.org/issue24567 closed by rhettinger #25007: Add support of copy protocol to zlib compressors and decompres https://bugs.python.org/issue25007 closed by serhiy.storchaka #30339: test_multiprocessing_main_handling: "RuntimeError: Timed out w https://bugs.python.org/issue30339 closed by vstinner #30356: test_mymanager_context() of test_multiprocessing_spawn: manage https://bugs.python.org/issue30356 closed by vstinner #31463: test_multiprocessing_fork hangs test_subprocess https://bugs.python.org/issue31463 closed by vstinner #31815: Make itertools iterators interruptible https://bugs.python.org/issue31815 closed by rhettinger #32063: test_multiprocessing_forkserver failed with OSError: [Errno 48 https://bugs.python.org/issue32063 closed by vstinner #33278: libexpat uses HAVE_SYSCALL_GETRANDOM instead of HAVE_GETRANDOM https://bugs.python.org/issue33278 closed by benjamin.peterson #33573: statistics.median does not work with ordinal scale, add doc https://bugs.python.org/issue33573 closed by taleinat #33711: Could not find externals/db-* in msi.py on license generation 
https://bugs.python.org/issue33711 closed by zach.ware #33805: dataclasses: replace() give poor error message if using InitVa https://bugs.python.org/issue33805 closed by eric.smith #33823: concurrent.futures: cannot specify the number of cores https://bugs.python.org/issue33823 closed by vstinner #33842: Remove tarfile.filemode https://bugs.python.org/issue33842 closed by inada.naoki #33851: 3.7 regression: ast.get_docstring() for a node that lacks a do https://bugs.python.org/issue33851 closed by ned.deily #33873: False positives when running leak tests with -R 1:1 https://bugs.python.org/issue33873 closed by vstinner #33877: Doc: Delete UNIX qualification for script running instructions https://bugs.python.org/issue33877 closed by adelfino #33885: doc Replace "hook function" with "callable" in urllib.request. https://bugs.python.org/issue33885 closed by terry.reedy #33887: doc Add TOC in Design and History FAQ https://bugs.python.org/issue33887 closed by Mariatta #33897: Add a restart option to logging.basicConfig() https://bugs.python.org/issue33897 closed by vinay.sajip #33913: test_multiprocessing_spawn random failures on x86 Windows7 3.6 https://bugs.python.org/issue33913 closed by vstinner #33914: test_gdb fails for Python 2.7.15 https://bugs.python.org/issue33914 closed by vstinner #33916: test_lzma: test_refleaks_in_decompressor___init__() leaks 100 https://bugs.python.org/issue33916 closed by vstinner #33929: test_multiprocessing_spawn: WithProcessesTestProcess.test_many https://bugs.python.org/issue33929 closed by vstinner #33933: Error message says dict has no len https://bugs.python.org/issue33933 closed by serhiy.storchaka #33938: Cross compilation fail for ARM https://bugs.python.org/issue33938 closed by n0s69z #33942: IDLE: Problems using IDLE and 2.7, 3.6 macOS 64-/32-bit instal https://bugs.python.org/issue33942 closed by ned.deily #33943: doc Add references to logging.basicConfig https://bugs.python.org/issue33943 closed by taleinat #33945: 
concurrent.futures ProcessPoolExecutor submit() blocks on resu https://bugs.python.org/issue33945 closed by dbarcay #33946: os.symlink on Windows should use the new non-admin flag https://bugs.python.org/issue33946 closed by eryksun #33950: IDLE htest: remove spec for deleted tabbedpages.py https://bugs.python.org/issue33950 closed by terry.reedy #33951: IDLE test failing only when called by itself: HighPageTest.tes https://bugs.python.org/issue33951 closed by terry.reedy #33952: doc Fix typo in str.upper() documentation https://bugs.python.org/issue33952 closed by adelfino #33956: update vendored expat to 2.2.5 https://bugs.python.org/issue33956 closed by benjamin.peterson #33957: use standard term than generic wording https://bugs.python.org/issue33957 closed by inada.naoki #33958: Unused variable in pure embedding example https://bugs.python.org/issue33958 closed by inada.naoki #33970: bugs.python.org silently refuses registrations https://bugs.python.org/issue33970 closed by Mariatta #33975: IDLE: adjust DPI before Tk() for htests. 
https://bugs.python.org/issue33975 closed by terry.reedy #33979: [Exception message] Display type of not JSON serializable obje https://bugs.python.org/issue33979 closed by serhiy.storchaka #33981: test_asyncio: test_sendfile_close_peer_in_the_middle_of_receiv https://bugs.python.org/issue33981 closed by vstinner #33985: ContextVar does not have a "name" attribute https://bugs.python.org/issue33985 closed by yselivanov #33992: Compilation fails on AMD64 Windows8.1 Non-Debug 3.6: The Windo https://bugs.python.org/issue33992 closed by zach.ware #33993: zipfile module weird behavior when used with zipinfo https://bugs.python.org/issue33993 closed by serhiy.storchaka #33999: `pip3 install past` does not work https://bugs.python.org/issue33999 closed by serhiy.storchaka From tjreedy at udel.edu Fri Jun 29 12:14:52 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 29 Jun 2018 12:14:52 -0400 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: On 6/29/2018 9:25 AM, Brett Cannon wrote: > On Thu, Jun 28, 2018, 21:28 Terry Reedy, > wrote: [question about our coverage bot] > Everything is either covered by the Travis or codecov configuration > files which are both checked into the cpython repo. (I'm on vacation or > else I would provide links to the files themselves.) Ammar Askar privately asked for my coveragerc file so he could experiment with the configuration. -- Terry Jan Reedy From ammar at ammaraskar.com Fri Jun 29 13:31:21 2018 From: ammar at ammaraskar.com (Ammar Askar) Date: Fri, 29 Jun 2018 10:31:21 -0700 Subject: [Python-Dev] We now have C code coverage! In-Reply-To: References: Message-ID: Oh whoops, sorry about that. I haven't really used mailing lists before and so I assumed hitting reply in gmail would send it to python-dev, not just your personal email. Just so the config file locations are publicly documented, here's what I responded with: > For IDLE's test suite, I use a customized .coveragerc. 
> I strongly prefer to not abandon that and litter the code with # pragmas.

Yup, I agree completely. Having pragmas everywhere is really annoying;
it's really only useful for small one-off stuff.

> In order to make sense of the coverage report and have it be truthful,
> one needs to know what options are being used.
> Is the --branch flag set?
> Is .coveragerc or some other configuration file in use?
> If so, what is the content?
> Do we have any control over the use and content of exclusion settings?

To answer all these questions at once, yeah, we have complete control
over all the coverage parameters. Currently we aren't building with
`--branch` and don't have a .coveragerc file. We could put one in the
root of CPython (which would get picked up automatically) but personally
I think CI related crud should go into its own folder. Luckily we do
have a ".github" folder, so I'd suggest putting the coveragerc file in
there and then adding the parameter `--rcfile=.github/coveragerc`.

All the parameters to coverage.py can be configured here:
https://github.com/python/cpython/blob/master/.travis.yml#L82

If you wanna send over your IDLE coveragerc file, I can experiment and
try to get it working for Travis, or you can explore it yourself if you
want.

On Fri, Jun 29, 2018 at 9:14 AM, Terry Reedy wrote:
> On 6/29/2018 9:25 AM, Brett Cannon wrote:
>> On Thu, Jun 28, 2018, 21:28 Terry Reedy, wrote:
>
> [question about our coverage bot]
>
>> Everything is either covered by the Travis or codecov configuration
>> files which are both checked into the cpython repo. (I'm on vacation
>> or else I would provide links to the files themselves.)
>
> Ammar Askar privately asked for my coveragerc file so he could
> experiment with the configuration.
>
> --
> Terry Jan Reedy
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/ammar%40ammaraskar.com

From chris.barker at noaa.gov  Fri Jun 29 19:49:41 2018
From: chris.barker at noaa.gov (Chris Barker - NOAA Federal)
Date: Fri, 29 Jun 2018 16:49:41 -0700
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

> On Jun 28, 2018, at 8:21 PM, Tim Peters wrote:

Seems it's all been said, and Tim's latest response made an excellent
case for consistency. But:

> Regardless of how assignment expressions work in listcomps and genexps,
> this example (which uses neither) _will_ rebind the containing block's `x`:
>
> [x := 1]

This reinforces my point that it's not just about comprehensions, but
rather that the local namespace can be altered anywhere an expression is
used -- which is everywhere.

That trivial example is unsurprising, but as soon as your line of code
gets a bit longer, it could be far more hidden.

I'm not saying it's not worth it, but it's a more significant
complication than simply adding a new feature like augmented assignment
or ternary expressions, where the effect is seen only where it is used.

A key problem with thinking about this is that we can scan existing
code to find places where this would improve the code, and decide if
those use-cases would cause confusion. But we really can't anticipate
all the places where it might get used (perhaps inappropriately) that
would cause confusion. We can hope that people won't tend to do that,
but who knows?
Example: in a function argument:

result = call_a_func(arg1, arg2, kwarg1=x, kwarg2=x:=2*y)

Sure, there are always ways to write bad code, and most people wouldn't
do that, but someone, somewhere, that thinks shorter code is better code
might well do it. Or something like it. After all, expressions can be
virtually anywhere in your code.

Is this a real risk? Maybe not, but it is a complication.

-CHB

From tim.peters at gmail.com  Fri Jun 29 23:58:45 2018
From: tim.peters at gmail.com (Tim Peters)
Date: Fri, 29 Jun 2018 22:58:45 -0500
Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018
 Python Language Summit coverage, last part)
In-Reply-To:
References:
Message-ID:

[Tim]
>> Regardless of how assignment expressions work in listcomps and genexps,
>> this example (which uses neither) _will_ rebind the containing block's `x`:
>>
>> [x := 1]

[Chris Barker]
> This reinforces my point that it's not just about comprehensions,

I agree, it's not at all - and I'm amazed at the over-the-top passion
that minor issues of scope in comprehensions have ... inspired.  It's
the tip of the tail of the dog.

> but rather that the local namespace can be altered anywhere
> an expression is used -- which is everywhere.

Yes, everywhere.  But what of it?  Have you read the PEP?  The examples
are all simple and straightforward and "local".  My example above was
wholly contrived to make a specific point, and I expect we'll _never_
see that line in real code.

> That trivial example is unsurprising, but as soon as your
> line of code gets a bit longer, it could be far more hidden.

It's not possible to prevent people from writing horrible code, and I'm
hard pressed to think of _any_ programming feature that can't be so
abused.  From ridiculouslyLongVariableNamesWhoseVerbositySeemsToBeAGoalInItself,
massive overuse of globals, insanely deep nesting, horridly redundant
parenthesization, functions with 20 undocumented arguments, creating
Byzantine class structures spread over a directory full of modules to
implement a concept that _could_ have been done faster and better with a
list, ...

So on a scale of 1 ("wake me up when it's over") to 100 ("OMG!  It's the
end of the world!!!"), "but it can be horridly abused" rates about a 2
on my weighting scale.  Do we really think so little of our fellow
Pythoneers?

Key point: absolutely nobody has expressed a fear that they _themself_
will abuse assignment expressions.  It's always some seemingly
existential dread that someone else will ;-)

> I'm not saying it's not worth it, but it's a more significant
> complication than simply adding a new feature like augmented
> assignment or ternary expressions, where the effect is seen only
> where it is used.

Which is good to keep in mind when using a feature like this.  Python is
an imperative language, and side effects are rampant.  Controlling them
is important.

> A key problem with thinking about this is that we can scan existing
> code to find places where this would improve the code, and decide if
> those use-cases would cause confusion.

I went through that exercise for the PEP's Appendix A.  I assume you
haven't read it.  I found many places where assignment expressions would
make for a small improvement, and got surprised by concluding it was
really the multitude of tiny, extremely-local improvements that "added
up" to the real win overall, not the much rarer cases where assignment
expressions really shine (such as in collapsing chains of semantically
misleading ever-increasing indentation in long
assign/if/else/assign/if/else/assign/if/else ... structures).

I also gave examples of places where, despite being "small and local"
changes, using assignment expressions appeared to be a _bad_ idea.
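The "tiny, extremely-local improvements" Appendix A tallies are mostly of the following flavor (illustrative patterns in the appendix's spirit, not verbatim excerpts from it; Python 3.8+):

```python
import io
import re

# Read-in-the-condition loop: replaces a priming read before the loop
# plus a second read statement inside it.
f = io.StringIO("spam eggs")
chunks = []
while chunk := f.read(4):
    chunks.append(chunk)
assert "".join(chunks) == "spam eggs"

# Bind-and-test: avoids either calling re.match() twice or splitting
# the bind and the test across two statements.
if m := re.match(r"(\w+) (\w+)", "spam eggs"):
    second_word = m.group(2)
assert second_word == "eggs"
```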
> But we really can't anticipate all the places where it might get used
> (perhaps inappropriately) that would cause confusion. We can hope that
> people won't tend to do that, but who knows?

Having spent considerable time on it myself (see just above), I do not
assume that other Pythonistas are incapable of reaching sane conclusions
too ;-)

> Example: in a function argument:
>
> result = call_a_func(arg1, arg2, kwarg1=x, kwarg2=x:=2*y)

The PEP already calls that one a SyntaxError.  I can't imagine why a
sane programmer would want to do that, but if they really must the PEP
_will_ allow it if they parenthesize the assignment expression in this
context (so "kwarg2=(x:=2*y)" instead).

> Sure, there are always ways to write bad code, and most people
> wouldn't do that, but someone, somewhere, that thinks shorter code is
> better code might well do it. Or something like it.

Someone will!  No doubt about it.  But what of it?  If someone is
programming for their own amusement, why should I care?  If they're
working with a group, bad practice should be discouraged by the group's
coding standards and enforced by the group's code review process.  For
this feature, and all others.

"Consenting adults" is a key Python principle too.  And I believe in it,
despite that I wrote tabnanny.py ;-)

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From ncoghlan at gmail.com  Sat Jun 30 03:47:38 2018
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 30 Jun 2018 17:47:38 +1000
Subject: [Python-Dev] Intent to accept PEP 561 -- Distributing and Packaging
 Type Information
In-Reply-To:
References:
Message-ID:

On 28 June 2018 at 09:11, Guido van Rossum wrote:
> Well, with that, I am hereby accepting PEP 561.
>
> Ethan has done a tremendous job writing this PEP and implementing it,
> and I am sure that package and stub authors will be very glad to hear
> that there are now officially supported ways other than typeshed to
> distribute type annotations.
Very cool! Congratulations Ethan! Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat Jun 30 03:56:37 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 30 Jun 2018 17:56:37 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: <5B2D80E7.5010501@canterbury.ac.nz> <20180624055646.GL14437@ando.pearwood.info> <20180624145204.GN14437@ando.pearwood.info> Message-ID: On 28 June 2018 at 08:31, Guido van Rossum wrote: > So IIUC you are okay with the behavior described by the PEP but you want an > explicit language feature to specify it? > > I don't particularly like adding a `parentlocal` statement to the language, > because I don't think it'll be generally useful. (We don't have `goto` in > the language even though it could be used in the formal specification of > `if`, for example. :-) > > But as a descriptive mechanism to make the PEP's spec clearer I'm fine with > it. Let's call it `__parentlocal` for now. It would work a bit like > `nonlocal` but also different, since in the normal case (when there's no > matching `nonlocal` in the parent scope) it would make the target a local in > that scope rather than trying to look for a definition of the target name in > surrounding (non-class, non-global) scopes. Also if there's a matching > `global` in the parent scope, `__parentlocal` itself changes its meaning to > `global`. If you want to push a target through several levels of target > scopes you can do that by having a `__parentlocal` in each scope that it > should push through (this is needed for nested comprehensions, see below).
> > Given that definition of `__parentlocal`, in first approximation the scoping > rule proposed by PEP 572 would then be: In comprehensions (which in my use > in the PEP 572 discussion includes generator expressions) the targets of > inline assignments are automatically endowed with a `__parentlocal` > declaration, except inside the "outermost iterable" (since that already runs > in the parent scope). > > There would have to be additional words when comprehensions themselves are > nested (e.g. `[[a for a in range(i)] for i in range(10)]`) since the PEP's > intention is that inline assignments anywhere there end up targeting the > scope containing the outermost comprehension. But this can all be expressed > by adding `__parentlocal` for various variables in various places (including > in the "outermost iterable" of inner comprehensions). > > I'd also like to keep the rule prohibiting use of the same name as a > comprehension loop control variable and as an inline assignment target; this > rule would also prohibit shenanigans with nested comprehensions (for any set > of nested comprehensions, any name that's a loop control variable in any of > them cannot be an inline assignment target in any of them). This would also > apply to the "outermost iterable". > > Does this help at all, or did I miss something? Yep, it does, and I don't think you missed anything. Using "__parentlocal" to indicate "parent local scoping semantics apply here" still gives the concept a name and descriptive shorthand for use in pseudo-code expansions of assignment expressions in comprehensions, without needing to give it an actually usable statement level syntax, similar to the way we use "_expr_result" and "_outermost_iter" to indicate references that in reality are entries in an interpreter's stack or register set, or else a pseudo-variable that doesn't have a normal attribute identifier. 
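As a concrete sketch of the parent-local behaviour being described (`running_sums` is a hypothetical example of my own, not from the PEP; it runs as shown under the semantics as eventually accepted, Python 3.8+):

```python
def running_sums(values):
    total = 0
    # The assignment expression inside the comprehension binds `total`
    # in this function's scope (the "parent local"), while the loop
    # variable `v` stays hidden inside the comprehension's implicit scope.
    sums = [total := total + v for v in values]
    return sums, total
```

After the comprehension runs, `total` holds the final sum in the enclosing function, exactly the "push the target through the implicit scope" effect that `__parentlocal` is meant to describe.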
And if anyone does want to make the case for the syntax being generally available, they don't need to specify how it should work - they just need to provide evidence of cases where it would clarify code unrelated to the PEP 572 use case. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From steve at pearwood.info Sat Jun 30 04:17:02 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 30 Jun 2018 18:17:02 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: <20180630081700.GS14437@ando.pearwood.info> On Wed, Jun 27, 2018 at 09:52:43PM -0700, Chris Barker wrote: > It seems everyone agrees that scoping rules should be the same for > generator expressions and comprehensions, Yes. I dislike saying "comprehensions and generator expressions" over and over again, so I just say "comprehensions". Principle One: - we consider generator expressions to be a lazy comprehension; - or perhaps comprehensions are eager generator expressions; - either way, they behave the same in regard to scoping rules. Principle Two: - the scope of the loop variable stays hidden inside the sub-local ("comprehension" or "implicit hidden function") scope; - i.e. it does not "leak", even if you want it to. Principle Three: - glossing over the builtin name look-up, calling list(genexpr) will remain equivalent to using a list comprehension; - similarly for set and dict comprehensions. Principle Four: - comprehensions (and genexprs) already behave "funny" inside class scope; any proposal to fix class scope is beyond the, er, scope of this PEP and can wait for another day. So far, there should be (I hope!) no disagreement with those first four principles. With those four principles in place, teaching and using comprehensions (genexprs) in the absence of assignment expressions does not need to change one iota.
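A minimal sketch of Principles Two and Three in today's Python (names are illustrative):

```python
def demo_principles():
    # Principle Three: a list comprehension and list() over the equivalent
    # generator expression produce the same result.
    squares_comp = [i * i for i in range(5)]
    squares_gen = list(i * i for i in range(5))
    assert squares_comp == squares_gen
    # Principle Two: the loop variable does not leak out of the
    # comprehension's implicit scope.
    try:
        i
    except NameError:
        return "loop variable stayed hidden"
    return "loop variable leaked"
```

Calling `demo_principles()` reports that the loop variable stayed hidden, as it has since Python 3.0.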
Normal cases stay normal; weird cases mucking about with locals() inside the comprehension are already weird and won't change. > So what about: > > l = [x:=i for i in range(3)] > > vs > > g = (x:=i for i in range(3)) > > Is there any way to keep these consistent if the "x" is in the regular > local scope? Yes. That is what closures already do. We already have such nonlocal effects in Python 3. Move the loop inside an inner (nested) function, and then either call it immediately to simulate the effect of a list comprehension, or delay calling it to behave more like a generator expression. Of course the *runtime* effects depend on whether or not the generator expression is actually evaluated. But that's no mystery, and is precisely analogous to this case: def demo(): x = 1 def inner(): nonlocal x x = 99 inner() # call the inner function print(x) This prints 99. But if you comment out the call to the inner function, it prints 1. I trust that doesn't come as a surprise. Nor should this come as a surprise: def demo(): x = 1 # assuming assignment scope is local rather than sublocal g = (x:= i for i in (99,)) L = list(g) print(x) The value of x printed will depend on whether or not you comment out the call to list(g). > Note that this thread is titled "Informal educator feedback on PEP 572". > > As an educator -- this is looking harder and harder to explain to newbies... > > Though easier if any assignments made in a "comprehension" don't "leak out". Let me introduce two more principles. Principle Five: - all expressions are executed in the local scope. Principle Six: - the scope of an assignment expression variable inside a comprehension (genexpr) should not depend on where inside the comprehension it sits. Five is, I think, so intuitive that we forget about it in the same way that we forget about the air we breathe.
It would be surprising, even shocking, if two expressions in the same context were executed in different scopes: result = [x + 1, x - 2] If the first x were local and the second was global, that would be disturbing. The same rule ought to apply if we include assignment expressions: result = [(x := expr) + 1, x := x - 2] It would be disturbing if the first assignment (x := expr) executed in the local scope, and the second (x := x - 2) failed with NameError because it was executed in the global scope. Or worse, *didn't* fail with NameError, but instead returned something totally unexpected. Now bring in a comprehension: result = [(x := expr) + 1] + [x := x - 2 for a in (None,)] Do you still want the x inside the comprehension to be a different x to the one outside the comprehension? How are you going to explain that UnboundLocalError to your students? That's not actually a rhetorical question. I recognise that while Principle Five seems self-evidently desirable to me, you might consider it less important than the idea that "assignments inside comprehensions shouldn't leak". I believe that these two expressions should give the same results even to the side-effects: [(x := expr) + 1, x := x - 2] [(x := expr) + 1] + [x := x - 2 for a in (None,)] I think that is the simplest and most intuitive behaviour, the one which will be the least surprising, cause the fewest unexpected NameErrors, and be the simplest to explain. If you still prefer the "assignments shouldn't leak" idea, consider this: under the current implementation of comprehensions as an implicit hidden function, the scope of a variable depends on *where* it is, violating Principle Six. (That was the point of my introducing locals() into a previous post: to demonstrate that, today, right now, "comprehension scope" is a misnomer. Comprehensions actually execute in a hybrid of at least two scopes, the surrounding local scope and the sublocal hidden implicit function scope.) 
Let me bring in another equivalency: [(x := expr) + 1, x := x - 2] [(x := expr) + 1] + [x := x - 2 for a in (None,)] [(x := expr) + 1] + [a for a in (x := x - 2,)] By Principle Six, the side-effect of assigning to x shouldn't depend on where inside the comprehension it is. The two comprehension expressions shown ought to be referring to the same "x" variable (in the same scope) regardless of whether that is the surrounding local scope, or a sublocal comprehension scope. (In the case of it being a sublocal scope, the two comprehensions will raise UnboundLocalError.) But -- and this is why I raised all that hoo-ha about locals() -- according to the current implementation, they *don't*. This version would assign to x in the sublocal scope: # best viewed in a monospaced font [x := x - 2 for a in (None,)] ^^^^^^^^^^ this is sublocal scope but this would assign in the surrounding local scope: [a for a in (x := x - 2,)] ^^^^^^^^^^^^^ this is local scope I strongly believe that all three ought to be equivalent, including side-effects. (Remember that by Principle Two, we agree that the loop variable doesn't leak. The loop variable is invisible from the outside and doesn't count as a side-effect for this discussion.) So here are three possibilities (assuming assignment expressions are permitted): 1. Nick doesn't like the idea of having to inject an implicit "nonlocal" into the comprehension hidden implicit function; if we don't, that gives us the case where the scope of assignment variables depends on where they are in the comprehension, and will sometimes leak and sometimes not. This torpedoes Principle Six, and leaves you having to explain why assignment sometimes "works" inside comprehensions and sometimes gives UnboundLocalError. 2. If we decide that assignment inside a comprehension should always be sublocal, the implementation becomes more complex in order to bury the otherwise-local scope beneath another layer of even more hidden implicit functions.
That rules out some interesting (but probably not critical) uses of assignment expressions inside comprehensions, such as using them as a side-channel to sneak out debugging information. And it adds a great big screaming special case to Principle Five: - all expressions, EXCEPT FOR THE INSIDE OF COMPREHENSIONS, are executed in the local scope. 3. Or we just make all assignments inside comprehensions (including gen exprs) occur in the surrounding local scope. Number 3 is my strong preference. It complicates the implementation a bit (needing to add some implicit nonlocals) but not as much as needing to hide the otherwise-local scope beneath another implicit function. And it gives by far the most consistent, useful and obvious semantics out of the three options. My not-very-extensive survey on the Python-List mailing lists suggests that, if you don't ask people explicitly about "assignment expressions", they already think of the inside of comprehensions as being part of the surrounding local scope rather than a hidden inner function. So I do not believe that this will be hard to teach. These two expressions ought to give the same result with the same side-effect: [x := 1] [x := a for a in (1,)] That, I strongly believe, is the intuitive behaviour to people who aren't immersed in the implementation details of comprehensions, as well as being the most useful. -- Steve From ncoghlan at gmail.com Sat Jun 30 04:27:37 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 30 Jun 2018 18:27:37 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: On 29 June 2018 at 08:42, Chris Barker via Python-Dev wrote: > On Thu, Jun 28, 2018 at 9:28 AM, Tim Peters wrote: >> Did adding ternary `if` (truepart if expression else falsepart) complicate >> the language significantly? > > > I don't think so -- no.
For two reasons: > > 1) the final chosen form is kind of verbose, but therefore more like > "executable pseudo code" :-) As opposed to the C version, for instance. > > 2) it added one new construct, such that if, when someone sees it for the first > (or twenty-fifth) time and doesn't understand it, they can look it up, and > find out, and it only affects that line of code. > > So adding ANYTHING does complicate the language, by simply making it a bit > larger, but some things are far more complicating than others. It's worth noting that without the bug-prone "C and A or B" construct (which gives the wrong result when "not A" is True), we'd likely never have gotten "A if C else B" (which gives the right result regardless of the truth value of A). In the case of PEP 308, the new construction roughly matched the existing idiom in expressive power, it just handled it correctly by being able to exactly match the developer's intent. "NAME := EXPR" exists on a different level of complexity, since it adds name binding in arbitrary expressions for the sake of minor performance improvement in code written by developers that are exceptionally averse to the use of vertical screen real estate, and making a couple of moderately common coding patterns (loop-and-a-half, if-elif-chains with target binding) more regular, and hence easier to spot. I think the current incarnation of PEP 572 does an excellent job of making the case that says "If we add assignment expressions, we should add them this particular way" - there are a lot of syntactic and semantic complexities to navigate, and it manages to make its way through them and still come out the other side with a coherent and self-consistent proposal that copes with some thoroughly quirky existing scoping behaviour. That only leaves the question of "Does the gain in expressive power match the increase in the cognitive burden imposed on newcomers to the language?", and my personal answer to that is still "No, I don't think it does".
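The PEP 308 point above can be made concrete with a small sketch (function names are mine; the `and/or` version mis-handles a falsy `a`):

```python
def pick_old(cond, a, b):
    # The pre-PEP-308 idiom: "cond and a or b".
    # Broken whenever `a` is falsy: the `and` yields `a`, which then
    # falls through the `or` to `b` even though `cond` was true.
    return cond and a or b

def pick_new(cond, a, b):
    # The ternary expression respects the developer's intent
    # regardless of a's truth value.
    return a if cond else b
```

With `a = 0`, `pick_old(True, 0, 1)` wrongly returns 1, while `pick_new(True, 0, 1)` correctly returns 0.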
It isn't my opinion on that that matters, though: I think that's now going to be a conversation between Guido and folks that are actively teaching Python to new developers, and are willing to get their students involved in some experiments. Cheers, Nick. P.S. It does make me wonder if it would be possible to interest the folks behind https://quorumlanguage.com/evidence.html in designing and conducting fully controlled experiments comparing the comprehensibility of pre-PEP-572 code with post-PEP-572 code *before* the syntax gets added to the language :) -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sat Jun 30 04:30:56 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 30 Jun 2018 18:30:56 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: On 30 June 2018 at 09:49, Chris Barker - NOAA Federal via Python-Dev wrote: >> On Jun 28, 2018, at 8:21 PM, Tim Peters wrote: > > Seems it?s all been said, and Tim?s latest response made an excellent > case for consistency. > > But: > >> Regardless of how assignment expressions work in listcomps and genexps, this example (which uses neither) _will_ rebind the containing block's `x`: >> >> [x := 1] > > This reinforces my point that it?s not just about comprehensions, but > rather that the local namespace can be altered anywhere an expression > is used ? which is everywhere. > > That trivial example is unsurprising, but as soon as your line of code > gets a bit longer, it could be far more hidden. The significant semantic differences between "{x : 1}" and "{x := 1}" are also rather surprising :) Cheers, Nick. 
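A minimal demonstration of that one-character difference (the `:=` form requires Python 3.8+; names are illustrative):

```python
x = "key"
a_dict = {x: 1}   # colon: a dict display mapping the current *value* of x to 1
a_set = {x := 1}  # walrus: a set display that also rebinds x to 1
```

After these two lines, `a_dict` is `{"key": 1}`, `a_set` is `{1}`, and `x` has been rebound to `1`.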
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From steve at pearwood.info Sat Jun 30 05:58:47 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 30 Jun 2018 19:58:47 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: <20180630095837.GU14437@ando.pearwood.info> On Sat, Jun 30, 2018 at 06:30:56PM +1000, Nick Coghlan wrote: > The significant semantic differences between "{x : 1}" and "{x := 1}" > are also rather surprising :) *Significant* and obvious differences are good. It's the subtle differences that you don't notice immediately that really hurt: {x+1} versus {x-1} x > y versus x < y x/y versus x//y alist = [a, b] alist = (a, b) Sometimes small differences in punctuation or spelling make a big difference to semantics. Punctuation Saves Lives! "Let's eat, grandma!" "Let's eat grandma!" Unless you propose to ban all operators and insist on a minimum string distance between all identifiers: https://docs.python.org/3/library/os.html#os.spawnl picking out little differences in functionality caused by little differences in code is a game we could play all day. At least we won't have the "=" versus "==" bug magnet from C, or the "==" versus "===" confusion from Javascript. Compared to that, the in-your-face obvious consequences of {x: 1} versus {x := 1} are pretty harmless. -- Steve From steve at pearwood.info Sat Jun 30 05:35:00 2018 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 30 Jun 2018 19:35:00 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: <20180630093459.GT14437@ando.pearwood.info> On Thu, Jun 28, 2018 at 03:42:49PM -0700, Chris Barker via Python-Dev wrote: > If we think hardly anyone is ever going to do that -- then I guess it > doesn't matter how it's handled. 
That's how you get a language with surprising special cases, gotchas and landmines in its behaviour. (Cough PHP cough.) It is one thing when gotchas occur because nobody thought of them, or because there is nothing you can do about them. But I do not think it is a good idea to *intentionally* leave gotchas lying around because "oh, I didn't think anyone would ever do that...". *wink* [...] > but here the keyword "nonlocal" is used -- you are clearly declaring that > you are messing with a nonlocal name here -- that is a lot more obvious > than simply using a := But from the point of view of somebody reading the code, there is no need for a nonlocal declaration, since the assignment is just a local assignment. Forget the loop variable -- it is a special case where "practicality beats purity" and it makes sense to have it run in a sublocal scope. Everything else is just a local, regardless of whether it is in a comprehension or not. > And "nonlocal" is not used that often, and when it is it's for careful > closure trickery -- I'm guessing := will be far more common. And, of > course, when a newbie encounters it, they can google it and see what it > means -- far different that seeing a := in a comprehension and > understanding (by osmosis??) that it might make changes in the local scope. I've given reasons why I believe that people will expect assignments in comprehensions to occur in the local scope. Aside from the special case of loop variables, people don't think of comprehensions as a separate scope. There's no Comprehension Sublocal-Local-Enclosing Local-Global-Builtin scoping rule. (Do you want there to be?) Even *class scope* comes as an unfamiliar surprise to people. I do not believe that people will "intuitively" expect assignments in a comprehension to disappear when the comprehension finishes -- I expect that most of the time they won't even think about it, but when they do, they'll expect it to hang around like *every other use* of assignment expressions. 
> And I don't think you can even do that with generator expressions now -- > as they can only contain expressions. It makes me cry to think of the hours I spent -- and the brownie points I lost with my wife -- showing how you can already simulate this with locals() or globals(). Did nobody read it? :-( https://mail.python.org/pipermail/python-dev/2018-June/154114.html Yes, you can do this *right now*. We just don't because playing around with locals() is a dodgy thing to do. > Maybe it's only comprehensions, and maybe it'll be rare to have a confusing > version of those, so it'll be no big deal, but this thread started talking > about educators' take on this -- and as an educator, I think this really > does complicate the language. See my recent post here: https://mail.python.org/pipermail/python-dev/2018-June/154184.html I strongly believe that the "comprehensions are local, like everything else" scenario is simpler and less surprising and easier to explain than hiding assignments inside a sublocal comprehension scope that hardly anyone even knows exists. Especially if we end up doing it inconsistently and let variables sometimes leak. > Python got much of its "fame" by being "executable pseudo code" -- it's > been moving farther and farther away from those roots. That's generally a > good thing, as we've gained expressiveness in exchange, but we shouldn't > pretend it isn't happening, or that this proposal doesn't contribute to > that trend. I think there are two distinct forms of "complication" here. 1. Are assignment expressions in isolation complicated? 2. Given assignment expressions, can people write obfuscated, complex code? Of course the answer to Q2 is yes, the opportunity will be there. Despite my general cynicism about my fellow programmers, I actually do believe that the Python community does a brilliant job of self-policing to prevent the worst excesses of obfuscatory one-liners. I don't think that will change. So I think Q1 is the critical one.
And I think the answer is, no, they're conceptually bloody simple. They evaluate the expression on the right, assign it to the name on the left, and return that value. Here is a question and answer: Question: after ``result = (x := 2) + 3``, what is the value of x? Answer: 2. Question: what if we put the assignment inside a function call? ``f((x:=2), x+3)`` Answer: still 2. Question: how about inside a list display? ``[1, x:=2, 3]`` Answer: still 2. Question: what about a dict display? ``{key: x:=2}`` A tuple? A set? Answer: still 2 to all of them. Question: how about a list comprehension? Answer: ah, now, that's complicated, it depends on which bit of the comprehension you put it in, the answer could be that x is 2 as you would expect, or it could be undefined. Yes, I can see why as an educator you would prefer that over *my* answer, which would be: Answer: it's still ^&*@^!$ two, what did you think it would be??? *wink* > > > Maybe it's just me, but re-binding a name seems like a whole new > > > category of side effect. > > > > With no trickery at all, you've always been able to rebind attributes, and > > mutate containers, in comprehensions and genexps. Because `for` targets > > aren't limited to plain names; e.g., > > > > g = (x+y for object.attribute, a[i][j] in zip(range(3), range(3))) > > > > sure, but you are explicitly using the names "object" and "a" here -- so > while side effects in comprehensions are discouraged, it's not really a > surprise that namespaces specifically named are changed. Trust me on this Chris, assuming the arguments over this PEP are finished by then (I give 50:50 odds *wink*) by the time 3.8 comes out you'll be saying "sure, but you are explicitly assigning to a local variable with the := operator, it's not really a surprise that the local variable specifically named is being changed, it would be weird if it wasn't..." and you'll have forgotten that there was ever a time we seriously discussed comprehension-scope as a thing.
:-) [example with the loop variable of a comprehension] > doesn't change x in the local scope -- if that was a good idea, why is it a > good idea to have := in a comprehension affect the local scope?? > > But maybe it is just me. Some of us don't think that it was a good idea *wink* But I know when I'm outvoted and when "practicality beats purity". Loop variables are semantically special. They tend to have short, often one-character names, taken from a small set of common examples (i, j, x are especially common). We don't tend to think of them as quite the same as ordinary variables: once the loop is complete, we typically ignore the loop variable. It takes a conscious effort to remember that it actually still hangs around. I've written code like this: for x in sequence: last_x = x do_stuff() # outside the loop process(last_x) and then looked at it a day later and figuratively kicked myself. What the hell was I thinking? So it is hardly surprising that people would sometimes write: for x in sequence: L = [expression for x in something] process(x) and be unpleasantly surprised in Python 2. But this is not likely to happen BY ACCIDENT with assignment expressions, and if it does, well, the answer is, change the variable name to something a little less generic, or at least *different*: for x in sequence: L = [x := a+1 for a in something] process(x) I am reluctantly forced to agree that, purity be damned, burying the loop variable inside a hidden implicit function scope is the right thing to do, even if it is sometimes annoying. But the reasons for doing so don't apply to explicit assignment expressions. We have no obligation to protect people from every accidental name clobbering caused by carelessness and poor choice of names. py> import math as module, string as module, sys as module py> module > I suppose we need to go back and look at the "real" examples of where/how > folks think they'll use := in comprehensions, and see how confusing it may > be.
One of the simplest but most useful examples is as a debugging aid. [function(x) for x in whatever] crashes part of the way through and raises an exception. How to debug? If it were a for-loop, one simple solution would be to wrap it in a try...except and then print the last seen value of x. try: for x in whatever: function(x) except: print(x) Can't do that with a list comprehension, not in Python 3. (It works in Python 2.) Assignment expressions to the rescue! try: [function(a := x) for x in whatever] except: print(a) That's simple and lightweight enough that with a bit of polishing, you could even leave it in production code to log a hard-to-track-down error. Oh, and it doesn't easily translate to a map(), since lambda functions do execute in their own scope: map(lambda x: function(a := x), whatever) Another reason to prefer comprehensions to map. > One of these conversations was started with an example something like this: > > [(f(x), g(f(x))) for x in an_iterable] > > The OP didn't like having to call f() twice. So that would become: > > [ (temp:=f(x), g(temp)) for x in an_iterable] > > so now the question is: should "temp" be created / changed in the enclosing > local scope? The comprehension here is a red herring. result = (temp := f(x), g(temp)) Should temp be a local, or should tuple displays exist in their own, special, sublocal scope? In either case, there's no *specific* advantage to letting temp "leak" (scare quotes) out of the expression. But there's no disadvantage either, and if we pick a less prejudicial name, it might actually be advantageous: result = (useful_data := f(x), g(useful_data)) process(useful_data) Using a comprehension isn't special enough to change the rule that assignment expressions are local (unless declared global).
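A runnable sketch of the "call f() once" example above, with hypothetical stand-ins for `f` and `g` (behaviour as under the PEP's parent-local rule, Python 3.8+):

```python
# Hypothetical stand-ins for the expensive calls in the example above.
def f(x):
    return x + 1

def g(y):
    return y * 10

an_iterable = range(3)

# One call to f per element instead of two; `temp` remains bound in the
# containing scope after the comprehension finishes.
pairs = [(temp := f(x), g(temp)) for x in an_iterable]
```

Here `pairs` is `[(1, 10), (2, 20), (3, 30)]`, and afterwards `temp` holds the last value, `3`.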
-- Steve From ncoghlan at gmail.com Sat Jun 30 08:57:01 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 30 Jun 2018 22:57:01 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180630093459.GT14437@ando.pearwood.info> References: <20180630093459.GT14437@ando.pearwood.info> Message-ID: On 30 June 2018 at 19:35, Steven D'Aprano wrote: > So I think Q1 is the critical one. And I think the answer is, no, > they're conceptually bloody simple. They evaluate the expression on the > right, assign it to the name on the left, and return that value. And the proposed parent local scoping in PEP 572 has the virtue of making all of the following do *exactly the same thing*, just as they would in the version without the assignment expression: ref = object() container = [item := ref] container = [x for x in [item := ref]] container = [x for x in [item := ref] for i in range(1)] container = list(x for x in [item := ref]) container = list(x for x in [item := ref] for i in range(1)) # All variants pass this assertion, keeping the implicit sublocal scope almost entirely hidden assert container == [ref] and item is ref and item is container[0] My own objections were primarily based on the code-generation-centric concept of wanting Python's statement level scoping semantics to continue to be a superset of its expression level semantics, and Guido's offer to define "__parentlocal" in the PEP as a conventional shorthand for describing comprehension style assignment expression scoping when writing out their statement level equivalents as pseudo-code addresses that. 
(To put it another way: it turned out it wasn't really the semantics themselves that bothered me, since they have a lot of very attractive properties as shown above, it was the lack of a clear way of referring to those semantics other than "the way assignment expressions behave in implicit comprehension and generator expression scopes"). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From alfred at freebsd.org Sat Jun 30 11:14:17 2018 From: alfred at freebsd.org (Alfred Perlstein) Date: Sat, 30 Jun 2018 08:14:17 -0700 Subject: [Python-Dev] Help preventing SIGPIPE/SIG_DFL anti-pattern. Message-ID: <922692be-4089-0162-4c3c-93bad5a2c3d8@freebsd.org> Hello, I'm looking for someone in the Python community to help with a problem of anti-patterns showing up dealing with SIGPIPE. Specifically I've noticed an anti-pattern developing where folks will try to suppress broken pipe errors written to stdout by setting SIGPIPE's disposition to SIG_DFL. This is actually very common, and also rather broken due to the fact that for all but the most simple text filters this opens up a problem where the process can exit unexpectedly due to SIGPIPE being generated from a remote connection the program makes. I have attached a test program which shows the problem. To use this program it takes several args. # 1. Illustrate the 'ugly output to stderr' that folks want to avoid: % python3 t0.py nocatch | head -1 # 2. Illustrate the anti-pattern, the program exits on about line 47, which most folks do not understand % python3 t0.py dfl | head -1 # 3. Show a better solution where we catch the pipe error and clean up to avoid the message: % python3 t0.py | head -1 I did a recent audit of a few code bases and saw this pattern pop up often enough that I am asking if there's a way we can discourage the use of "signal(SIGPIPE, SIG_DFL)" unless the user really understands what they are doing.
I do have a pull req here: https://github.com/python/cpython/pull/6773 where I am trying to document this on the signal page, but I can't sort out how to land this doc change. thank you, -Alfred -------------- next part -------------- A non-text attachment was scrubbed... Name: t0.py Type: text/x-python-script Size: 2223 bytes Desc: not available URL: From alfred at freebsd.org Sat Jun 30 12:31:35 2018 From: alfred at freebsd.org (Alfred Perlstein) Date: Sat, 30 Jun 2018 09:31:35 -0700 Subject: [Python-Dev] Help preventing SIGPIPE/SIG_DFL anti-pattern. Message-ID: <8b05d9f3-b359-9d4b-91da-44d7e2a57cad@freebsd.org> (sorry for the double post, looks like maybe attachments are dropped, inlined the attachment this time.) Hello, I'm looking for someone in the python community to help with a problem of anti-patterns showing up dealing with SIGPIPE. Specifically I've noticed an anti-pattern developing where folks will try to suppress broken pipe errors written to stdout by setting SIGPIPE's disposition to SIG_DFL. This is actually very common, and also rather broken, because for all but the simplest text filters it opens up a problem where the process can exit unexpectedly due to SIGPIPE being generated from a remote connection the program makes. I have attached a test program which shows the problem. To use this program, pass one of several args:

# 1. Illustrate the 'ugly output to stderr' that folks want to avoid:
% python3 t0.py nocatch | head -1
# 2. Illustrate the anti-pattern; the program exits on about line 47, which most folks do not understand:
% python3 t0.py dfl | head -1
# 3. Show a better solution where we catch the pipe error and clean up to avoid the message:
% python3 t0.py | head -1

I did a recent audit of a few code bases and saw this pattern pop up often enough that I am asking if there's a way we can discourage the use of "signal(SIGPIPE, SIG_DFL)" unless the user really understands what they are doing.
I do have a pull req here: https://github.com/python/cpython/pull/6773 where I am trying to document this on the signal page, but I can't sort out how to land this doc change. thank you, -Alfred

=== CUT HERE ===
#
# Program showing the dangers of setting the SIGPIPE handler to the default handler (SIG_DFL).
#
# To illustrate the problem run with:
#   ./t0.py dfl
#
# The program will exit in do_network_stuff() even though there is an "except" clause.
# do_network_stuff() simulates a remote connection that closes before it can be written to,
# which happens often enough to be a hazard in practice.
#

import signal
import sys
import socket
import os


def sigpipe_handler(sig, frame):
    sys.stderr.write("Got sigpipe \n\n\n")
    sys.stderr.flush()


def get_server_connection():
    # Simulate making a connection to a remote service that closes the connection
    # before we can write to it.  (In practice a host rebooting, or otherwise exiting
    # while we are trying to interact with it, will be the true source of such behavior.)
    s1, s2 = socket.socketpair()
    s2.close()
    return s1


def do_network_stuff():
    # Simulate interacting with a remote service that closes its connection
    # before we can write to it.  Example: connecting to an http service and
    # issuing a GET request, but the remote server is shutting down between
    # when our connection finishes the 3-way handshake and when we are able
    # to write our "GET /" request to it.
    # In theory this function should be resilient to this; however, if SIGPIPE is set
    # to SIG_DFL then this code will cause termination of the program.
    if 'dfl' in sys.argv[1:]:
        signal.signal(signal.SIGPIPE, signal.SIG_DFL)
    for x in range(5):
        server_conn = get_server_connection()
        sys.stderr.write("about to write to server socket...\n")
        try:
            server_conn.send(b"GET /")
        except BrokenPipeError as bpe:
            sys.stderr.write("caught broken pipe on talking to server, retrying...")


def work():
    do_network_stuff()
    for x in range(10000):
        print("y")
    sys.stdout.flush()


def main():
    if 'nocatch' in sys.argv[1:]:
        work()
    else:
        try:
            work()
        except BrokenPipeError as bpe:
            signal.signal(signal.SIGPIPE, signal.SIG_DFL)
            os.kill(os.getpid(), signal.SIGPIPE)


if __name__ == '__main__':
    main()

From tim.peters at gmail.com Sat Jun 30 12:37:53 2018 From: tim.peters at gmail.com (Tim Peters) Date: Sat, 30 Jun 2018 11:37:53 -0500 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: [Nick Coghlan] > > ... > > "NAME := EXPR" exists on a different level of complexity, since it > > adds name binding in arbitrary expressions for the sake of minor > > performance improvement in code written by developers that are > > exceptionally averse to the use of vertical screen real estate, > ... Note that PEP 572 doesn't contain a single word about "performance" (neither that specific word nor any synonym), and I gave only one thought to it when writing Appendix A: "is this going to slow anything down significantly?". The answer was never "yes", which I thought was self-evident, so I never mentioned it. Neither did Chris or Guido. Best I can recall, nobody has argued for it on the grounds of "performance", except in the indirect sense that sometimes it allows a more compact way of reusing an expensive subexpression by giving it a name. Which they already do by giving it a name in a separate statement, so the possible improvement would be in brevity rather than performance.
The attractions are instead in the areas of reducing redundancy, improving clarity, allowing the removal of semantically pointless indentation levels in some cases, indeed trading away some horizontal whitespace in otherwise nearly empty lines for freeing up a bit of vertical screen space, and in the case of comprehensions/genexps adding straightforward ways to accomplish some conceptually trivial things that at best require trickery now (like emulating a cell object by hand). Calling all that "for the sake of minor performance improvements" - which isn't even in the list - is sooooo far off base it should have left me speechless - but it didn't ;-) But now that you mention it, ya, there will be a trivial performance improvement in some cases. I couldn't care less about that, and can confidently channel that Guido doesn't either. It would remain fine by me if assignment expressions ran trivially slower. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike at selik.org Sat Jun 30 15:35:26 2018 From: mike at selik.org (Michael Selik) Date: Sat, 30 Jun 2018 12:35:26 -0700 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: On Sat, Jun 30, 2018 at 9:43 AM Tim Peters wrote: > The attractions are instead in the areas of reducing redundancy, improving > clarity, allowing the removal of semantically pointless indentation levels in > some cases, indeed trading away some horizontal whitespace in otherwise > nearly empty lines for freeing up a bit of vertical screen space, and in > the case of comprehensions/genexps adding straightforward ways to > accomplish some conceptually trivial things that at best require trickery > now (like emulating a cell object by hand). > The examples you provided (some were new in this thread, I think) are compelling. While my initial reaction to the proposal was mild horror, I'm not troubled by the scoping questions.
Issues still bothering me:

1. Initial reactions from students were confusion over := vs =
2. This seems inconsistent with the push for type hints

To be fair, I felt a similar gut reaction to f-strings, and now I can't live without them. Have I become a cranky old man, resistant to change? Your examples have put me into the "on the fence, slightly worried" category instead of "clearly a bad idea". On scoping, beginners seem more confused by UnboundLocalError than by variables bleeding between what they perceive as separate scopes. The concept of a scope can be tricky to communicate. Heck, I still make the mistake of looking up class attributes in instance methods as if they were globals. Same-scope is natural. Natural language is happy with ambiguity. Separate-scope is something programmers dreamed up. Only experienced C, Java, etc. programmers get surprised when they make assumptions about what syntax in Python creates separate scopes, and I'm not so worried about those folks. I remind them that the oldest versions of C didn't have block scopes (1975?) and they quiet down. The PEP lists many exclusions: cases where the new := operator is invalid [0]. I unfortunately didn't have a chance to read the initial discussion of the operator. I'm sure it was thorough :-). What I can observe is that each syntactical exclusion was caused by a different confusion, probably teased out by that discussion. Many exclusions means many confusions. My intuition is that the awkwardness stems from avoiding the replacement of = with :=. Languages that use := seem to avoid the Yoda-style comparison recommendation that is common to languages that use = for assignment expressions. I understand the reluctance for such a major change to the appearance of Python code, but it would avoid the laundry list of exclusions. There's some value in parsimony. Anyway, we've got some time for testing the idea on live subjects. Have a good weekend, everyone. -- Michael PS.
Pepe just tied it up for Portugal vs Uruguay. Woo! ... and now Cavani scored again :-( [0] https://www.python.org/dev/peps/pep-0572/#exceptional-cases -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Sat Jun 30 16:06:39 2018 From: tjreedy at udel.edu (Terry Reedy) Date: Sat, 30 Jun 2018 16:06:39 -0400 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: <20180630093459.GT14437@ando.pearwood.info> References: <20180630093459.GT14437@ando.pearwood.info> Message-ID: On 6/30/2018 5:35 AM, Steven D'Aprano wrote: > I've given reasons why I believe that people will expect assignments in > comprehensions to occur in the local scope. Aside from the special case > of loop variables, people don't think of comprehensions as a separate > scope. I think this is because comprehensions other than generator expressions were originally *defined* in terms of equivalent code in the *same* local scope, are still easily thought of in those terms, and, as I explained in my response to Guido, could, at least in simple cases, still be implemented in the local scope, so that assignment expressions would be executed and affect the expected local scope without 'nonlocal'. Generator expressions, on the other hand, have always been defined in terms of equivalent code in a *nested* scope, and must be implemented that way, so some re-definition and re-implementation is needed for assignment expressions to affect the local scope in which the g.e. is defined and for that effect to be comprehensible. It might be enough to add something like "any names that are targets of assignment expressions are added to global or nonlocal declarations within the implementation generator function." If the equality [generator-expression] == list(generator-expression) is preserved, then it could serve as the definition of the list comprehension.
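The defining equality, including assignment-expression targets under PEP 572's parent-local rule, can be spot-checked on Python 3.8+; a sketch with illustrative names:

```python
data = range(5)

# The classic defining equality: a list comprehension equals list()
# applied to the corresponding generator expression.
assert [x * x for x in data] == list(x * x for x in data)

# With assignment expressions, the two forms still agree, and both bind
# the walrus target in this (the defining) scope once consumed.
lc = [last_lc := x for x in data]
ge = list(last_ge := x for x in data)
assert lc == ge == [0, 1, 2, 3, 4]
assert last_lc == last_ge == 4
```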
The same could be true for set and dict comprehensions, with the understanding that the equality is modified to {a:b for ...} == dict((a,b) for ...). It should also be mentioned that the defining equality is not necessarily the implementation. -- Terry Jan Reedy From tinchester at gmail.com Sat Jun 30 18:54:07 2018 From: tinchester at gmail.com (Tin Tvrtković) Date: Sun, 1 Jul 2018 00:54:07 +0200 Subject: [Python-Dev] PEP 544 (Protocols): adding a protocol to a class post-hoc Message-ID: Hi, PEP 544 specifies this address as "Discussions-To" so I hope I'm at the right address. I think protocols as defined in the PEP are a very interesting idea and I'm thinking of ways of applying them. The first use case is in the context of attrs. attrs has a number of functions that work only on attrs classes; asdict for example (turns an attrs class into a dictionary recursively). We can't really type hint this properly using nominal subtyping since attrs classes don't have an exclusive ancestor. But it sounds like we could use structural subtyping! An attrs class has a special class-level field, __attrs_attrs__, which holds the attribute definitions. So maybe we can define a protocol:

class AttrsClass(Protocol):
    __attrs_attrs__: ClassVar[Tuple[Attribute, ...]]

then we could define asdict as (simplified):

def asdict(inst: AttrsClass) -> Dict[str, Any]: ...

and it should work out. My question is how to actually add this protocol to attrs classes. Now, we currently have an attrs plugin in mypy so maybe some magic in there could make it happen in this particular case. My second use case is a small library I've developed for work, which basically wraps attrs and generates and sticks methods on a class for serialization/deserialization. Consider the following short program, which does not typecheck on the current mypy.

class Serializable(Protocol):
    def __serialize__(self) -> int:
        ...

def make_serializable(cl: Type) -> Type:
    cl = attr.s(cl)
    cl.__serialize__ = lambda self: 1
    return cl

@make_serializable
class A:
    a: int = attr.ib()

def serialize(inst: Serializable) -> int:
    return inst.__serialize__()

serialize(A(1))

error: Argument 1 to "serialize" has incompatible type "A"; expected "Serializable"
error: Too many arguments for "A"

I have no desire to write a mypy plugin for this library. So I guess what is needed is a way to annotate the class decorator, telling mypy it's adding a protocol to a class. It seems to have trouble getting to this conclusion by itself. (The second error implies the attrs plugin doesn't handle wrapping attr.s, which is unfortunate but a different issue.) I have found this pattern of decorating classes and enriching them with additional methods at run-time really powerful, especially when used with run-time parsing of type information (that often gets a bad rep on this list, I've noticed :) The structural typing subsystem seems like a good fit for use cases like this, and I think it'd be a shame if we couldn't make it work somehow. -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Sat Jun 30 19:20:26 2018 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Sun, 01 Jul 2018 11:20:26 +1200 Subject: [Python-Dev] Help preventing SIGPIPE/SIG_DFL anti-pattern. In-Reply-To: <922692be-4089-0162-4c3c-93bad5a2c3d8@freebsd.org> References: <922692be-4089-0162-4c3c-93bad5a2c3d8@freebsd.org> Message-ID: <5B38103A.2050002@canterbury.ac.nz> Alfred Perlstein wrote: > I am asking if there's a way we can discourage the use > of "signal(SIGPIPE, SIG_DFL)" unless the user really understands what > they are doing. Maybe there's some way that SIGPIPEs on stdout could be handled differently by default, so that the process exits silently instead of producing an ugly message. That would remove the source of pain that's leading people to do this.
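The recipe that eventually landed in the signal module docs' "Note on SIGPIPE" is close to what Greg describes: catch BrokenPipeError at the top level, point stdout at devnull, and exit non-zero, so the interpreter's shutdown-time flush cannot raise a second error. A sketch, with an illustrative function name:

```python
import os
import sys

def emit(lines):
    """Write lines to stdout, exiting quietly if the reader has gone away."""
    try:
        for line in lines:
            print(line)
        sys.stdout.flush()
    except BrokenPipeError:
        # Python flushes the standard streams at interpreter shutdown; point
        # stdout at devnull first so that flush cannot raise a second
        # BrokenPipeError and print the ugly traceback after this handler runs.
        devnull = os.open(os.devnull, os.O_WRONLY)
        os.dup2(devnull, sys.stdout.fileno())
        sys.exit(1)  # non-zero, since not all output was written
```

Piped as `python3 script.py | head -1`, a program built on this exits with status 1 and no traceback, without touching the process-wide SIGPIPE disposition.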
-- Greg From ncoghlan at gmail.com Sat Jun 30 22:50:23 2018 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 1 Jul 2018 12:50:23 +1000 Subject: [Python-Dev] Informal educator feedback on PEP 572 (was Re: 2018 Python Language Summit coverage, last part) In-Reply-To: References: Message-ID: On 1 July 2018 at 02:37, Tim Peters wrote: > [Nick Coghlan] > >> ... > >> "NAME := EXPR" exists on a different level of complexity, since it > >> adds name binding in arbitrary expressions for the sake of minor > >> performance improvement in code written by developers that are > >> exceptionally averse to the use of vertical screen real estate, >> ... > > Note that PEP 572 doesn't contain a single word about "performance" (neither > that specific word nor any synonym), and I gave only one thought to it when > writing Appendix A: "is this going to slow anything down significantly?". > The answer was never "yes", which I thought was self-evident, so I never > mentioned it. Neither did Chris or Guido. > > Best I can recall, nobody has argued for it on the grounds of "performance", > except in the indirect sense that sometimes it allows a more compact way of > reusing an expensive subexpression by giving it a name. Which they already > do by giving it a name in a separate statement, so the possible improvement > would be in brevity rather than performance.
The PEP specifically cites this example as motivation:

group = re.match(data).group(1) if re.match(data) else None

That code's already perfectly straightforward to read and write as a single line, so the only reason to quibble about it is that it's slower than the arguably less clear two-line alternative:

_m = re.match(data)
group = _m.group(1) if _m else None

Thus the PEP's argument is that it wants to allow the faster version to remain a one-liner that preserves the overall structure of the version that repeats the subexpression:

group = _m.group(1) if _m := re.match(data) else None

That's a performance argument, not a readability one (as, if you don't care about performance, you can just repeat the subexpression). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
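The three spellings Nick compares can be exercised side by side on Python 3.8+; the concrete regex and data below are stand-ins, since the PEP's snippet leaves them implicit:

```python
import re

data = "2018-06-30: PEP 572"

# Repeated-subexpression form: runs the regex twice.
group = re.match(r"(\d{4})", data).group(1) if re.match(r"(\d{4})", data) else None
assert group == "2018"

# Two-statement form: runs the regex once.
_m = re.match(r"(\d{4})", data)
group = _m.group(1) if _m else None
assert group == "2018"

# PEP 572 one-liner: runs the regex once and keeps the one-line structure.
# (Unlike the PEP's illustrative snippet, the walrus needs parentheses here.)
group = _m.group(1) if (_m := re.match(r"(\d{4})", data)) else None
assert group == "2018"

# All three collapse to None on a non-matching input.
group = _m.group(1) if (_m := re.match(r"(\d{4})", "no digits")) else None
assert group is None
```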