From status at bugs.python.org Fri Jan 1 12:08:34 2016 From: status at bugs.python.org (Python tracker) Date: Fri, 1 Jan 2016 18:08:34 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20160101170834.CF291566E9@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2015-12-25 - 2016-01-01) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 5341 ( -4) closed 32402 (+37) total 37743 (+33) Open issues with patches: 2351 Issues opened (18) ================== #25953: re fails to identify invalid numeric group references in repla http://bugs.python.org/issue25953 opened by bazwal #25954: Python 3.5.1 installer fails on Windows 7 http://bugs.python.org/issue25954 opened by nedbat #25957: sockaddr_l2 lacks CID, address type (AF_BLUETOOTH sockets) http://bugs.python.org/issue25957 opened by mikeryan #25958: Implicit ABCs have no means of "anti-registration" http://bugs.python.org/issue25958 opened by abarnert #25959: tkinter - PhotoImage.zoom() causes segfault http://bugs.python.org/issue25959 opened by hcbd #25962: Ctrl+C Can't Exit Script with Pool on Windows http://bugs.python.org/issue25962 opened by Meng-Yuan Huang #25963: strptime not parsing some timezones http://bugs.python.org/issue25963 opened by utkonos #25966: Bug in asyncio.corotuines._format_coroutine http://bugs.python.org/issue25966 opened by Brett Rosen #25967: Devguide: add 2to3 to the "Changing CPython's Grammar" checkli http://bugs.python.org/issue25967 opened by ezio.melotti #25968: Cannot import modules out of zip file with extended-length pat http://bugs.python.org/issue25968 opened by Chris Moore #25969: Update lib2to3 grammar to include missing unpacking generaliza http://bugs.python.org/issue25969 opened by ezio.melotti #25970: py_compile.compile fails if existing bytecode file is unwritab http://bugs.python.org/issue25970 opened by byrnes #25974: Fix 
statistics.py after the Decimal.as_integer_ratio() change http://bugs.python.org/issue25974 opened by skrah #25981: Intern namedtuple field names http://bugs.python.org/issue25981 opened by serhiy.storchaka #25982: multiprocessing docs for Namespace lacks class definition http://bugs.python.org/issue25982 opened by davin #25983: Add tests for multi-argument type() http://bugs.python.org/issue25983 opened by serhiy.storchaka #25984: Expose a simple "is IEEE 754" flag in sys.float_info http://bugs.python.org/issue25984 opened by random832 #25985: Use sys.version_info instead of sys.version http://bugs.python.org/issue25985 opened by serhiy.storchaka Most recent 15 issues with no replies (15) ========================================== #25985: Use sys.version_info instead of sys.version http://bugs.python.org/issue25985 #25984: Expose a simple "is IEEE 754" flag in sys.float_info http://bugs.python.org/issue25984 #25982: multiprocessing docs for Namespace lacks class definition http://bugs.python.org/issue25982 #25970: py_compile.compile fails if existing bytecode file is unwritab http://bugs.python.org/issue25970 #25969: Update lib2to3 grammar to include missing unpacking generaliza http://bugs.python.org/issue25969 #25968: Cannot import modules out of zip file with extended-length pat http://bugs.python.org/issue25968 #25967: Devguide: add 2to3 to the "Changing CPython's Grammar" checkli http://bugs.python.org/issue25967 #25966: Bug in asyncio.corotuines._format_coroutine http://bugs.python.org/issue25966 #25962: Ctrl+C Can't Exit Script with Pool on Windows http://bugs.python.org/issue25962 #25952: code_context not available in exec() http://bugs.python.org/issue25952 #25951: SSLSocket.sendall() does not return None on success like socke http://bugs.python.org/issue25951 #25948: Invalid MIME encoding generated by email.mime (line too long) http://bugs.python.org/issue25948 #25946: configure should pick /usr/bin/g++ automatically if present 
http://bugs.python.org/issue25946 #25943: Integer overflow in _bsddb leads to heap corruption http://bugs.python.org/issue25943 #25934: ICC compiler: ICC treats denormal floating point numbers as 0. http://bugs.python.org/issue25934 Most recent 15 issues waiting for review (15) ============================================= #25985: Use sys.version_info instead of sys.version http://bugs.python.org/issue25985 #25983: Add tests for multi-argument type() http://bugs.python.org/issue25983 #25981: Intern namedtuple field names http://bugs.python.org/issue25981 #25974: Fix statistics.py after the Decimal.as_integer_ratio() change http://bugs.python.org/issue25974 #25953: re fails to identify invalid numeric group references in repla http://bugs.python.org/issue25953 #25949: Lazy creation of __dict__ in OrderedDict http://bugs.python.org/issue25949 #25945: Type confusion in partial_setstate and partial_call leads to m http://bugs.python.org/issue25945 #25942: subprocess.call SIGKILLs too liberally http://bugs.python.org/issue25942 #25941: Add 'How to Review a Patch' section to devguide http://bugs.python.org/issue25941 #25939: _ssl.enum_certificates() fails with ERROR_ACCESS_DENIED if pyt http://bugs.python.org/issue25939 #25937: DIfference between utf8 and utf-8 when i define python source http://bugs.python.org/issue25937 #25935: OrderedDict prevents garbage collection if a circulary referen http://bugs.python.org/issue25935 #25933: Unhandled exception (TypeError) with ftplib in function retrbi http://bugs.python.org/issue25933 #25925: Coverage support for CPython 2 http://bugs.python.org/issue25925 #25919: http.client PUT method ignores error responses sent immediatly http://bugs.python.org/issue25919 Top 10 most discussed issues (10) ================================= #25937: DIfference between utf8 and utf-8 when i define python source http://bugs.python.org/issue25937 12 msgs #25864: collections.abc.Mapping should include a __reversed__ that rai 
http://bugs.python.org/issue25864 10 msgs #25959: tkinter - PhotoImage.zoom() causes segfault http://bugs.python.org/issue25959 6 msgs #25981: Intern namedtuple field names http://bugs.python.org/issue25981 6 msgs #19475: Add timespec optional flag to datetime isoformat() to choose t http://bugs.python.org/issue19475 5 msgs #25954: Python 3.5.1 installer fails on Windows 7 http://bugs.python.org/issue25954 5 msgs #25939: _ssl.enum_certificates() fails with ERROR_ACCESS_DENIED if pyt http://bugs.python.org/issue25939 4 msgs #25942: subprocess.call SIGKILLs too liberally http://bugs.python.org/issue25942 4 msgs #25958: Implicit ABCs have no means of "anti-registration" http://bugs.python.org/issue25958 4 msgs #25974: Fix statistics.py after the Decimal.as_integer_ratio() change http://bugs.python.org/issue25974 4 msgs Issues closed (30) ================== #12484: The Py_InitModule functions no longer exist, but remain in the http://bugs.python.org/issue12484 closed by brett.cannon #19511: lib2to3 Grammar file is no longer a Python 3 superset http://bugs.python.org/issue19511 closed by ezio.melotti #19873: There is a duplicate function in Lib/test/test_pathlib.py http://bugs.python.org/issue19873 closed by ezio.melotti #20607: multiprocessing cx_Freeze windows GUI bug (& easy fixes) http://bugs.python.org/issue20607 closed by davin #21579: Python 3.4: tempfile.close attribute does not work http://bugs.python.org/issue21579 closed by mmarkk #22995: Restrict default pickleability http://bugs.python.org/issue22995 closed by serhiy.storchaka #23166: urllib2 ignores opener configuration under certain circumstanc http://bugs.python.org/issue23166 closed by martin.panter #24725: test_socket testFDPassEmpty fails on OS X 10.11 DP with "Canno http://bugs.python.org/issue24725 closed by brett.cannon #25157: Installing Python 3.5.0 32bit on Windows 8.1 64bit system give http://bugs.python.org/issue25157 closed by terry.reedy #25360: pyw should search for pythonw to implement 
#!/usr/bin/env pyt http://bugs.python.org/issue25360 closed by python-dev #25664: Logging cannot handle Unicode logger names http://bugs.python.org/issue25664 closed by python-dev #25685: Inefficiency with SocketHandler - may send log record message http://bugs.python.org/issue25685 closed by python-dev #25789: py launcher stderr is not piped to subprocess.Popen.stderr http://bugs.python.org/issue25789 closed by python-dev #25923: More const char http://bugs.python.org/issue25923 closed by serhiy.storchaka #25928: Add Decimal.as_integer_ratio() http://bugs.python.org/issue25928 closed by skrah #25955: email.utils.formataddr does not support RFC 6532 http://bugs.python.org/issue25955 closed by r.david.murray #25956: Unambiguous representation of recursive objects http://bugs.python.org/issue25956 closed by serhiy.storchaka #25960: Popen.wait() hangs when called from a signal handler when os.w http://bugs.python.org/issue25960 closed by gregory.p.smith #25961: Disallow the null character in type name http://bugs.python.org/issue25961 closed by serhiy.storchaka #25964: optparse.py:1668: (file) shadows builtin http://bugs.python.org/issue25964 closed by SilentGhost #25965: decimal.py: issues reported by pychecker http://bugs.python.org/issue25965 closed by r.david.murray #25971: Optimize converting float and Decimal to Fraction http://bugs.python.org/issue25971 closed by serhiy.storchaka #25972: All Windows buildbots fail compile/clean http://bugs.python.org/issue25972 closed by zach.ware #25973: Segmentation fault with nonlocal and two underscores http://bugs.python.org/issue25973 closed by python-dev #25975: Weird multiplication http://bugs.python.org/issue25975 closed by skrah #25976: telnetlib SyntaxError: invalid syntax http://bugs.python.org/issue25976 closed by SilentGhost #25977: Typo fixes in Lib/tokenize.py http://bugs.python.org/issue25977 closed by berker.peksag #25978: escape sequence r'\' giving compilation error http://bugs.python.org/issue25978 closed 
by r.david.murray #25979: String functions lstrip are not working properly when you have http://bugs.python.org/issue25979 closed by ethan.furman #25980: not able to find module request in lib urllib - Python35-32 http://bugs.python.org/issue25980 closed by brett.cannon From mike.romberg at comcast.net Sat Jan 2 14:32:01 2016 From: mike.romberg at comcast.net (mike.romberg at comcast.net) Date: Sat, 2 Jan 2016 12:32:01 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages Message-ID: <22152.9649.202823.766027@lrd.home.lan> BRIEF INTRODUCTION: I've been using Python since the early 1.x releases, mostly for application development. On occasion I've contributed bits to the core: > grep Romberg Misc/ACKS Mike Romberg I've recently ported a large application to Python 3 (it started life using 1.1, so it has been a long road for this codebase). The one big killer feature of Python 3 I'm attempting to use is implicit namespace packages. But they are broken with the zipimport.c module. It seems that zipimport.c never worked with these, as it compares paths in the form 'a.b.c' to paths in the form 'a/b/c'. I created a patch that fixes this and makes zipimport work exactly the same as a standard filesystem import. I was getting my patch ready to submit when I found that this problem has already been reported: https://bugs.python.org/issue17633 Is there anything I can do to help fix this issue? I could polish up my patch, create test cases, and submit them. But it looks like the above patch does the same thing and is in "the process". But it has been "in the process" for three years. What else needs to be done? I'll help if I can.
Mike From brett at python.org Sat Jan 2 15:36:23 2016 From: brett at python.org (Brett Cannon) Date: Sat, 02 Jan 2016 20:36:23 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: <22152.9649.202823.766027@lrd.home.lan> References: <22152.9649.202823.766027@lrd.home.lan> Message-ID: First off, Mike, sorry about the bug. We have unfortunately let zipimport get into a sorry state that has made no one want to work on the code anymore. That being said, I opened https://bugs.python.org/issue25711 to specifically try to fix this issue once and for all and along the way modernize zipimport by rewriting it from scratch to be more maintainable (or whatever the module is named in case we break backwards-compatibility). At this point the best option might be, Mike, if you do a code review for https://bugs.python.org/issue17633, even if it is simply an LGTM. I will then personally make sure the approved patch gets checked in for Python 3.6 in case the rewrite of zipimport misses the release. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike.romberg at comcast.net Sat Jan 2 17:26:33 2016 From: mike.romberg at comcast.net (mike.romberg at comcast.net) Date: Sat, 2 Jan 2016 15:26:33 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> Message-ID: <22152.20121.966791.318438@lrd.home.lan> -- >>>>> "Brett" == Brett Cannon writes: > I opened > https://bugs.python.org/issue25711 to specifically try to > fix this issue once and for all and along the way modernize > zipimport by rewriting it from scratch to be more > maintainable Every time I read about implementing a custom loader: https://docs.python.org/3/library/importlib.html I've wondered why Python does not have some sort of virtual filesystem layer to deal with locating modules/packages/support files. Virtual file systems seem like a good way to store data on a wide range of storage devices. A VFSLoader object would interface with importlib and deal with: - implementing a finder and loader - Determine the actual type of file to load (.py, .pyc, .pyo, __pycache__, etc).
- Do all of its work by calling virtual functions such as: * listdir(path) * read(path) * stat(path) # for things like mtime, size, etc * write(path, data) # not all VFS implement this Then things like a ziploader would just inherit from VFSLoader, implement the straightforward methods, and everything should "just work" :). I see no reason why every type of loader (real filesystem, http, ssh, sql database, etc) would not do this as well. Leave all the details such as implicit namespace packages, presence of __init__.py[oc] files, .pth files, etc. in one single location, and put the details on how to interact with the actual storage device in leaf classes which don't know or care about the high-level details. They would not even know they are loading Python modules. It is just blobs of data to them. I may try my hand at creating a prototype of this for just the zipimporter and see how it goes. > At this point the best option might be, Mike, if you do a > code review for https://bugs.python.org/issue17633, even if > it is simply an LGTM. I will then personally make sure the > approved patch gets checked in for Python 3.6 in case the > rewrite of zipimport misses the release. Cool. I'll see what I can do. I was having a bit of trouble with the register/login part of the bug tracker, which is why I came here. I'll battle with it one more time and see if I can get it to log me in. The patch should be fairly simple. In a nutshell it just does a: path.replace('.', '/') + '/' in two locations: one where it checks for the path being a directory entry in the zip file, and the second to return an implicit namespace path (instead of not found) if it is a match. I'll check the patch on the tracker and see if it still works with 3.5.1. If not I'll attach mine.
Mike From guido at python.org Sat Jan 2 22:41:28 2016 From: guido at python.org (Guido van Rossum) Date: Sat, 2 Jan 2016 20:41:28 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: <22152.20121.966791.318438@lrd.home.lan> References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> Message-ID: On Sat, Jan 2, 2016 at 3:26 PM, wrote: > > -- > >>>>> "Brett" == Brett Cannon writes: > > > I opened > > https://bugs.python.org/issue25711 to specifically try to > > fix this issue once and for all and along the way modernize > > zipimport by rewriting it from scratch to be more > > maintainable > > Every time I read about implementing a custom loader: > > https://docs.python.org/3/library/importlib.html > > I've wondered why Python does not have some sort of virtual > filesystem layer to deal with locating modules/packages/support > files. Virtual file systems seem like a good way to store data on a > wide range of storage devices. > Yeah, but most devices already implement a *real* filesystem, so the only time the VFS would come in handy would be for zipfiles, where we already have a solution. > A VFSLoader object would interface with importlib and deal with: > > - implementing a finder and loader > > - Determine the actual type of file to load (.py, .pyc, .pyo, > __pycache__, etc). > > - Do all of its work by calling virtual functions such as: > * listdir(path) > * read(path) > * stat(path) # for things like mtime, size, etc > * write(path, data) # not all VFS implement this > Emulating a decent filesystem API requires you to implement functionality that would never be used by an import loader (write() is an example -- many of the stat() fields are another example). So it would just be overkill. > Then things like a ziploader would just inherit from VFSLoader, > implement the straightforward methods, and everything should "just > work" :). I see no reason why every type of loader (real filesystem, > http, ssh, sql database, etc) would not do this as well. All those examples except "real filesystem" are of very limited practical value. > Leave all > the details such as implicit namespace packages, presence of > __init__.py[oc] files, .pth files, etc. in one single > location, and put the details on how to interact with the actual > storage device in leaf classes which don't know or care about the > high-level details. They would not even know they are loading Python > modules. It is just blobs of data to them. > Actually the import loader API is much more suitable and less work to implement than a VFS. > I may try my hand at creating a prototype of this for just the > zipimporter and see how it goes. > That would nevertheless be an interesting exercise -- I hope you do it and report back. > > At this point the best option might be, Mike, if you do a > > code review for https://bugs.python.org/issue17633, even if > > it is simply an LGTM. I will then personally make sure the > > approved patch gets checked in for Python 3.6 in case the > > rewrite of zipimport misses the release. > > Cool. I'll see what I can do. I was having a bit of trouble with > the register/login part of the bug tracker, which is why I came > here. I'll battle with it one more time and see if I can get it to > log me in. > If you really can't manage to comment in the tracker (which sounds unlikely -- many people have succeeded :-) you can post your LGTM on the specific patch here. > The patch should be fairly simple. In a nutshell it just does a: > > path.replace('.', '/') + '/' in two locations: one where it checks > for the path being a directory entry in the zip file, and the second to > return an implicit namespace path (instead of not found) if it is a > match. I'll check the patch on the tracker and see if it still works > with 3.5.1. If not I'll attach mine. Well, Brett would like to see your feedback on the specific patch.
Does it work for you? -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Sat Jan 2 23:42:08 2016 From: brett at python.org (Brett Cannon) Date: Sun, 03 Jan 2016 04:42:08 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> Message-ID: I just wanted to quickly say that Guido's observation as to how a VFS is overkill is right. Imagine implementing a loader using sqlite and you quickly realize that doing a dull VFS is more than necessary to implement what import needs to function. -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Sat Jan 2 23:43:13 2016 From: brett at python.org (Brett Cannon) Date: Sun, 03 Jan 2016 04:43:13 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> Message-ID: On Sat, 2 Jan 2016, 20:42 Brett Cannon wrote: > I just wanted to quickly say that Guido's observation as to how a VFS is > overkill is right. Imagine implementing a loader using sqlite and you > quickly realize that doing a dull VFS is more > "dull" -> "full" than necessary to implement what import needs to function.
-------------- next part -------------- An HTML attachment was scrubbed... URL: From mike.romberg at comcast.net Sun Jan 3 00:29:52 2016 From: mike.romberg at comcast.net (mike.romberg at comcast.net) Date: Sat, 2 Jan 2016 22:29:52 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> Message-ID: <22152.45520.973803.741563@lrd.home.lan> >>>>> " " == Brett Cannon writes: > I just wanted to quickly say that Guido's observation as to how > a VFS is overkill is right. Imagine implementing a loader using > sqlite and you quickly realize that doing a full VFS is more > than necessary to implement what import needs to function. I fear I've made a poor choice in calling this abstract class a VFS (I'm terrible with names). I'm not thinking of anything along the lines of a full file system that supports open(), seek(), read() and everything else. That for sure would be overkill and way more complicated than it needs to be. All I'm really thinking about is a simple abstract interface that is used by an importer to actually locate and retrieve the binary objects that will be loaded. For the simple case I think just two methods would/could serve a read-only "blob/byte database": listdir(path) # returns an iterable container of "files"/"dirs" found # at path get(path) # returns a bytes object for the given path I think with those two virtual calls a more high-level import layer can locate and retrieve modules to be loaded without even knowing where they came from.
The higher level would know about things such as the difference between .py and .pyc "files" or the possible existence of __pycache__ directories and what may be found in them. Right now the zipimporter contains a list of file extensions to try and load (and in what order). It also lacks any knowledge of __pycache__ directories (which is one of the outstanding bugs with it). It just seems to me that this sorta logic would be better moved to a higher layer and the zip layer just translates paths into reads of byte blobs. I mentioned write()/put() for two reasons: 1) When I import a .py file then a .pyc file is created on my filesystem. I don't really know what piece of code created it. But a write to the filesystem (assuming it is writeable and permissions set etc) occurs. It might be nice if other storage systems (zip, sql, etc) could optionally support this. They could if the code that created the .pyc simply did a put() to the object that pulled in the .py file. The interface is expanded by two calls (put() and delete()). 2) Integration with package data. I know there are modules/packages out there that help a module try and locate data files that may be associated with a package. I think it would be kinda cool for a module to instead be able to get a handle to the abstract class that loaded it. It could then use the same listdir() get() and possibly write methods the importer did. The writing bit of this may or may not be a good idea :) Anyway, hope I did not muddy the waters. I was just thinking a bit out loud and none of this may live past my own experiments. I was/am just trying to think of why the importers like the zipimporter don't work like a filesystem importer and how they would be cleaner if they just dealt with paths and byte blobs to store/get based on those paths.
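For concreteness, the two-method read-only "blob/byte database" interface sketched above might look like this in Python (a hypothetical illustration; the class names and the toy dict-backed implementation are invented here, not part of any proposal):

```python
import abc

class BlobStore(abc.ABC):
    """Minimal read-only blob/byte database an importer could consume."""

    @abc.abstractmethod
    def listdir(self, path):
        """Return an iterable container of "file"/"dir" names at *path*."""

    @abc.abstractmethod
    def get(self, path):
        """Return a bytes object for the blob stored at *path*."""

class DictStore(BlobStore):
    """Toy in-memory backend: a flat dict mapping paths to bytes."""

    def __init__(self, blobs):
        self._blobs = blobs  # e.g. {'pkg/mod.py': b'...'}

    def listdir(self, path):
        prefix = path.rstrip('/') + '/' if path else ''
        # Report only the first path component below the prefix,
        # so nested blobs show up as a single "dir" name.
        return {name[len(prefix):].split('/')[0]
                for name in self._blobs if name.startswith(prefix)}

    def get(self, path):
        return self._blobs[path]
```

A higher-level import layer could then be written purely against `BlobStore`, with zip, sqlite, or filesystem backends as interchangeable leaf classes, as the message above suggests.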
Mike From ncoghlan at gmail.com Sun Jan 3 01:34:21 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 3 Jan 2016 16:34:21 +1000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <22152.45520.973803.741563@lrd.home.lan> Message-ID: On 3 January 2016 at 16:32, Nick Coghlan wrote: > For folks that are interested in that, folks that aren't following > import-sig in addition to python-dev may want to take a look at > Brett's design for the importlib.resources API: > http://nbviewer.jupyter.org/gist/brettcannon/9c4681a77a7fa09c5347 Sorry, I meant to include a link to the import-sig thread as well: https://mail.python.org/pipermail/import-sig/2015-November/001041.html Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sun Jan 3 01:32:24 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 3 Jan 2016 16:32:24 +1000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: <22152.45520.973803.741563@lrd.home.lan> References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <22152.45520.973803.741563@lrd.home.lan> Message-ID: On 3 January 2016 at 15:29, wrote: >>>>>> " " == Brett Cannon writes: > > > I just wanted to quickly say that Guido's observation as to how > > a VFS is overkill is right. Imagine implementing a loader using > > sqlite and you quickly realize that doing a full VFS is more > > than necessary to implement what import needs to function. > > I fear I've made a poor choice in calling this abstract class a VFS > (I'm terrible with names). I'm not thinking of anything along the > lines of a full file system that supports open(), seek(), read() and > everything else. That for sure would be overkill and way more > complicated than it needs to be.
> > All I'm really thinking about is a simple abstract interface that is > used by an importer to actually locate and retrieve the binary objects > that will be loaded. For the simple case I think just two methods > would/could serve a read-only "blob/byte database": > > listdir(path) # returns an iterable container of "files"/"dirs" found > # at path > > get(path) # returns a bytes object for the given path We already have the latter: https://docs.python.org/3/library/importlib.html#importlib.abc.ResourceLoader.get_data It's the former that has taken a while to get to, as the 3rd party pkg_resources module (part of setuptools) already provides a pragmatic API that also has the virtue of being compatible with both Python 2 & 3, and there are a few subtleties related to the possible use of temporary files that make a robust API design trickier than it first appears to be. For folks that are interested in that, folks that aren't following import-sig in addition to python-dev may want to take a look at Brett's design for the importlib.resources API: http://nbviewer.jupyter.org/gist/brettcannon/9c4681a77a7fa09c5347 Cheers, Nick. P.S.
If anyone actually *does* want a full "virtual file system layer" API for non-filesystem storage locations: http://docs.pyfilesystem.org/en/latest/filesystems.html -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From phil at riverbankcomputing.com Sun Jan 3 05:43:48 2016 From: phil at riverbankcomputing.com (Phil Thompson) Date: Sun, 3 Jan 2016 10:43:48 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> Message-ID: <0DC6BE63-6816-4288-B2D3-11ADDD661BFA@riverbankcomputing.com> On 3 Jan 2016, at 3:41 am, Guido van Rossum wrote: > > On Sat, Jan 2, 2016 at 3:26 PM, wrote: > > -- > >>>>> "Brett" == Brett Cannon writes: > > > I opened > > https://bugs.python.org/issue25711 to specifically try to > > fix this issue once and for all and along the way modernize > > zipimport by rewriting it from scratch to be more > > maintainable > > Every time I read about implementing a custom loader: > > https://docs.python.org/3/library/importlib.html > > I've wondered why python does not have some sort of virtual > filesystem layer to deal with locating modules/packages/support > files. Virtual file systems seem like a good way to store data on a > wide range of storage devices. > > Yeah, but most devices already implement a *real* filesystem, so the only time the VFS would come in handy would be for zipfiles, where we already have a solution. Just to point out that it would be nice to have an easier way to use something other than zipfiles. I have a need to exploit a different solution and have to patch the bootstrap code (because the zipfile support is handled as a special case). BTW the need is to create iOS and Android executables from frozen Python code.
Phil From brett at python.org Sun Jan 3 12:31:28 2016 From: brett at python.org (Brett Cannon) Date: Sun, 03 Jan 2016 17:31:28 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: <22152.45520.973803.741563@lrd.home.lan> References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <22152.45520.973803.741563@lrd.home.lan> Message-ID: On Sat, 2 Jan 2016 at 21:31 wrote: > >>>>> " " == Brett Cannon writes: > > > I just wanted to quickly say that Guido's observation as to how > > a VFS is overkill is right. Imagine implementing a loader using > > sqlite and you quickly realize that doing a full VFS is more > > than necessary to implement what import needs to function. > > I fear I've made a poor choice in calling this abstract class a VFS > (I'm terrible with names). I'm not thinking of anything along the > lines of a full file system that supports open(), seek(), read() and > everything else. That for sure would be overkill and way more > complicated than it needs to be. > > All I'm really thinking about is a simple abstract interface that is > used by an importer to actually locate and retrieve the binary objects > that will be loaded. For the simple case I think just two methods > would/could serve a read-only "blob/byte database": > > listdir(path) # returns an iterable container of "files"/"dirs" found > # at path > > get(path) # returns a bytes object for the given path > > I think with those two virtual calls a more high level import layer > can locate and retrieve modules to be loaded without even knowing > where they came from. > > The higher level would know about things such as the difference > between .py and .pyc "files" or the possible existence of __pycache__ > directories and what may be found in them. Right now the zipimporter > contains a list of file extensions to try and load (and in what > order).
It also lacks any knowledge of __pycache__ directories (which > is one of the outstanding bugs with it). It just seems to me that > this sorta logic would be better moved to a higher layer and the zip > layer just translates paths into reads of byte blobs. > > I mentioned write()/put() for two reasons: > > 1) When I import a .py file then a .pyc file is created on my > filesystem. I don't really know what piece of code created it. > But a write to the filesystem (assuming it is writeable and > permissions set etc) occurs. It might be nice if other > storage systems (zip, sql, etc) could optionally support this. > They could if the code that created the .pyc simply did a put() > to the object that pulled in the .py file. The interface is > expanded by two calls (put() and delete()). > > 2) Integration with package data. I know there are > modules/packages out there that help a module try and locate > data files that may be associated with a package. I think it > would be kinda cool for a module to instead be able to get a > handle to the abstract class that loaded it. It could then use > the same listdir() get() and possibly write methods the importer > did. The writing bit of this may or may not be a good idea :) > > So it's possible that some abstraction might be possible, but up to this point there hasn't been a motivating factor for importlib to do this as zipimport is written in C and thus won't benefit from any abstraction that importlib uses in its Python code (hence why zipimport needs to be rewritten). Maybe after we have a zip-backed importer written in Python a common API will fall out and we can abstract that out, but I wouldn't want to guess until the code is written and someone can stare at the two implementations. It should also be said that there is nothing requiring that an importer support any concept of a file.
Going back to the sqlite database example, you could make it nothing more than a table that maps module name to source code and bytecode: CREATE TABLE code ( name TEXT PRIMARY KEY NOT NULL, source BLOB, bytecode BLOB ); /* A trigger that re-generates `bytecode` when `source` is set would be nice in this example, but that's beyond my SQL abilities. */ That's enough to implement an importer and there is no mention of file paths anywhere. If you really wanted to, you could do a second table that acted as a database-backed file system for reading package data files, but that is not required to meet the minimum functionality -- and then some thanks to bytecode support -- for an importer. > > Anyway, hope I did not muddy the waters. I was just thinking a bit > out loud and none of this may live past my own experiments. I was/am > just trying to think of why the importers like the zipimporter don't > work like a filesystem importer and how they would be cleaner if they > just dealt with paths and byte blobs to store/get based on those paths. > It's a reasonable thing to consider, but it would be better to get zipimport fixed for you, then rewritten, and then look at code for the rewritten zipimport and what importlib.machinery.SourceFileLoader has to see if there is a common API to abstract out and build upon. -------------- next part -------------- An HTML attachment was scrubbed...
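Brett's sqlite example can be rounded out into a minimal working importer with surprisingly little code. The sketch below is hypothetical -- the thread only specifies the `code` table -- and it ignores the `bytecode` column, packages, and caching, using importlib's finder/loader API:

```python
import sys
import sqlite3
import importlib.abc
import importlib.util

class SQLiteImporter(importlib.abc.MetaPathFinder, importlib.abc.Loader):
    """Import modules from the `code` table sketched above (sketch only)."""

    def __init__(self, connection):
        self._conn = connection

    def find_spec(self, fullname, path=None, target=None):
        row = self._conn.execute(
            'SELECT 1 FROM code WHERE name = ?', (fullname,)).fetchone()
        if row is None:
            return None  # not in the database; let other finders try
        return importlib.util.spec_from_loader(fullname, self)

    def exec_module(self, module):
        (source,) = self._conn.execute(
            'SELECT source FROM code WHERE name = ?',
            (module.__name__,)).fetchone()
        # Compile and run the stored source in the new module's namespace.
        exec(compile(source, '<sqlite: %s>' % module.__name__, 'exec'),
             module.__dict__)

def install(connection):
    """Register an importer for this database; return it for later removal."""
    importer = SQLiteImporter(connection)
    sys.meta_path.insert(0, importer)
    return importer
```

With `install()` called on a populated database, a plain `import` statement is served straight from the table, with no file paths involved anywhere.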
URL: From brett at python.org Sun Jan 3 12:33:14 2016 From: brett at python.org (Brett Cannon) Date: Sun, 03 Jan 2016 17:33:14 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: <0DC6BE63-6816-4288-B2D3-11ADDD661BFA@riverbankcomputing.com> References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <0DC6BE63-6816-4288-B2D3-11ADDD661BFA@riverbankcomputing.com> Message-ID: On Sun, 3 Jan 2016 at 02:55 Phil Thompson wrote: > On 3 Jan 2016, at 3:41 am, Guido van Rossum wrote: > > > > On Sat, Jan 2, 2016 at 3:26 PM, wrote: > > > > -- > > >>>>> "Brett" == Brett Cannon writes: > > > > > I opened > > > https://bugs.python.org/issue25711 to specifically try to > > > fix this issue once and for all and along the way modernize > > > zipimport by rewriting it from scratch to be more > > > maintainable > > > > Every time I read about implementing a custom loader: > > > > https://docs.python.org/3/library/importlib.html > > > > I've wondered why python does not have some sort of virtual > > filesystem layer to deal with locating modules/packages/support > > files. Virtual file systems seem like a good way to store data on a > > wide range of storage devices. > > > > Yeah, but most devices already implement a *real* filesystem, so the > only time the VFS would come in handy would be for zipfiles, where we > already have a solution. > > Just to point out that it would be nice to have an easier way to use > something other than zipfiles. I have a need to exploit a different > solution and have to patch the bootstrap code (because the zipfile support > is handled as a special case). BTW the need is to create iOS and Android > executables from frozen Python code. > Not quite sure about how zip files are a special-case beyond just being put in sys.meta_path automatically. You can get the same results with a .pth file or a sitecustomize.py depending on how pervasive your need is.
Otherwise feel free to file an issue at bugs.python.org and we can talk over there about what you specifically need and if it's reasonable to try and support. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike.romberg at comcast.net Sun Jan 3 17:37:05 2016 From: mike.romberg at comcast.net (mike.romberg at comcast.net) Date: Sun, 3 Jan 2016 15:37:05 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <22152.45520.973803.741563@lrd.home.lan> Message-ID: <22153.41617.619847.318185@lrd.home.lan> >>>>> " " == Brett Cannon writes: ... > So it's possible that some abstraction might be possible, but > up to this point there hasn't been a motivating factor for > importlib to do this as zipimport is written in C and thus > won't benefit from any abstraction that importlib uses in its > Python code (hence why zipimport needs to be rewritten). Maybe > after we have a zip-backed importer written in Python a common > API will fall out and we can abstract that out, but I > wouldn't want to guess until the code is written and someone > can stare at the two implementations. Fair enough. I too think it is a good idea to make base classes after there is a need for them and not before. Some argument could be made that there is a need now as zipimported modules/packages don't work exactly the same way as "normal" ones. But since you plan a rewrite of zipimport collecting the common stuff after that makes sense. > It should also be said that there is nothing requiring that an > importer support any concept of a file. Going back to the > sqlite database example, you could make it nothing more than a > table that maps module name to source code and bytecode: Yep. I was not thinking file either.
I may have said file (again I'm terrible with names) but I'm thinking an array of bytes with a string id (which I call a path). The actual storage could simply be storing the byte array using the stringID/path. ... > That's enough to implement an importer and there is no mention > of file paths anywhere. If you really wanted to, you could do a > second table that acted as a database-backed file system for > reading package data files, but that is not required to meet > the minimum functionality -- and then some thanks to bytecode > support -- for an importer. Yea the python module name could be viewed as a path (package.subpackage.module) and stored in a hierarchical way. Or it could simply be viewed as a string and stored in some flat database. Anyway, back at the ranch... I've got an extended version of the patch for issue17633 working on my system. I've added to the test case to check for the proper functioning of implicit namespace packages spread between two zip files. I still need to add tests for a namespace package spread between a normal filesystem and a zip file. I expect I'll have that done in a day or two. I'll post it to the bug tracker. Mike From guido at python.org Sun Jan 3 18:21:32 2016 From: guido at python.org (Guido van Rossum) Date: Sun, 3 Jan 2016 15:21:32 -0800 Subject: [Python-Dev] PEP 257 and __init__ In-Reply-To: References: Message-ID: On Tue, Dec 29, 2015 at 1:03 PM, Facundo Batista wrote: > On Tue, Dec 29, 2015 at 4:38 PM, Andrew Barnert > wrote: > > > Isn't the same thing true for every special method? There are lots of > classes where __add__ just says "a.__add__(b) = a + b" or (better > following the PEP) "Return self + value." But, in the rare case where the > semantics of "a + b" are a little tricky (think of "a / b" for > pathlib.Path), where else could you put it but __add__?
> > > > Similarly, for most classes, there's only one of __init__ or __new__, > and the construction/initialization semantics are simple enough to describe > in one line of the class docstring--but when things are more complicated > and need to be documented, where else would you put it? > > Yeap. Note that I'm ok to include a docstring when the actual > behaviour would deviate from the expected one as per Reference Docs. > My point is to not make it mandatory. > > > > I usually just don't bother. You can violate PEP 257 when it makes > sense, just like PEP 8. They're just guidelines, not iron-clad rules. > > Yeap, but pep257 (the tool [0]) complains for __init__, and wanted to > know how serious was it. > > > [0] https://pypi.python.org/pypi/pep257 That is the tool's fault. I personally hate with a vengeance that there are tools named after style guide PEPs that claim to enforce the guidelines from those PEPs. The tools' rigidity and simplicity reflect badly on the PEPs, which try hard not to be rigid or simplistic. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Mon Jan 4 02:01:15 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 4 Jan 2016 02:01:15 -0500 Subject: [Python-Dev] PEP 257 and __init__ In-Reply-To: References: Message-ID: On 1/3/2016 6:21 PM, Guido van Rossum wrote: > On Tue, Dec 29, 2015 at 1:03 PM, Facundo Batista > > wrote: > > On Tue, Dec 29, 2015 at 4:38 PM, Andrew Barnert > wrote: > > > Isn't the same thing true for every special method? There are lots of classes where __add__ just says "a.__add__(b) = a + b" or (better following the PEP) "Return self + value." But, in the rare case where the semantics of "a + b" are a little tricky (think of "a / b" for pathlib.Path), where else could you put it but __add__?
> > > > Similarly, for most classes, there's only one of __init__ or __new__, and the construction/initialization semantics are simple enough to describe in one line of the class docstring--but when things are more complicated and need to be documented, where else would you put it? > > Yeap. Note that I'm ok to include a docstring when the actual > behaviour would deviate from the expected one as per Reference Docs. > My point is to not make it mandatory. > > > > I usually just don't bother. You can violate PEP 257 when it makes sense, just like PEP 8. They're just guidelines, not iron-clad rules. > > Yeap, but pep257 (the tool [0]) complains for __init__, and wanted to > know how serious was it. > > > [0] https://pypi.python.org/pypi/pep257 > > > That is the tool's fault. I personally hate with a vengeance that there > are tools named after style guide PEPs that claim to enforce the > guidelines from those PEPs. The tools' rigidity and simplicity reflect > badly on the PEPs, which try hard not to be rigid or simplistic. Ask the PSF/pypi people to either prohibit such names or require a disclaimer of some sort. They are inherently confusing: "I took a look at pep008" does not mean that one even looked at the PEP. Even when the context makes clear that the referent is the module, there is confusion as to its authoritativeness. That Facundo would post here about the module's output illustrates that. To me, the name copying violates our informal trademark within Pythonland on 'PEP####'. -- Terry Jan Reedy From ncoghlan at gmail.com Mon Jan 4 02:49:32 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 4 Jan 2016 17:49:32 +1000 Subject: [Python-Dev] PEP 257 and __init__ In-Reply-To: References: Message-ID: On 4 January 2016 at 17:01, Terry Reedy wrote: > Ask the PSF/pypi people to either prohibit such names or require a > disclaimer of some sort. They are inherently confusing: "I took a look at > pep008" does not mean that one even looked at the PEP.
Even when the > context makes clear that the referent is the module, there is confusion as > to its authoritativeness. That Facundo would post here about the module's > output illustrates that. To me, the name copying violates our informal > trademark within Pythonland on 'PEP####'. I don't think that's the right answer, as opinionated tools do serve a useful purpose in preventing bikeshedding during code review (people *expect* computers to be annoyingly pedantic, which frees up the human reviewers to focus on higher level concerns). As projects evolve over time, they may develop their own tweaks and customisations in their style guide and switch to a more configurable tool, or they may not. When some of the default settings for the pep8 utility became a problem, I was able to talk to the developers and persuade them to tune their defaults to be more in line with the actual PEP text, and keep their extensions to optional settings. A similar approach may work for PEP 257, by clarifying which aspects tools should be leaving to human judgement (beyond the question of whether or not to opt in to following PEP 257 at all - it's far less universal than PEP 8). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From facundobatista at gmail.com Mon Jan 4 06:32:08 2016 From: facundobatista at gmail.com (Facundo Batista) Date: Mon, 4 Jan 2016 08:32:08 -0300 Subject: [Python-Dev] PEP 257 and __init__ In-Reply-To: References: Message-ID: On Mon, Jan 4, 2016 at 4:49 AM, Nick Coghlan wrote: > When some of the default settings for the pep8 utility became a > problem, I was able to talk to the developers and persuade them to > tune their defaults to be more in line with the actual PEP text, and > keep their extensions to optional settings. In that spirit, I opened an issue [0] in the pep257 project to be able to configure that and bypass the default behaviour, so the tool can be used in a wider set of projects. Thanks everybody for all the info.
Regards, [0] https://github.com/GreenSteam/pep257/issues/171 -- . Facundo Blog: http://www.taniquetil.com.ar/plog/ PyAr: http://www.python.org/ar/ Twitter: @facundobatista From phil at riverbankcomputing.com Mon Jan 4 11:16:06 2016 From: phil at riverbankcomputing.com (Phil Thompson) Date: Mon, 4 Jan 2016 16:16:06 +0000 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <0DC6BE63-6816-4288-B2D3-11ADDD661BFA@riverbankcomputing.com> Message-ID: <5AF2F684-CBD1-4D88-AE60-1CA2E6357C38@riverbankcomputing.com> On 3 Jan 2016, at 5:33 pm, Brett Cannon wrote: > > > > On Sun, 3 Jan 2016 at 02:55 Phil Thompson wrote: > On 3 Jan 2016, at 3:41 am, Guido van Rossum wrote: > > > > On Sat, Jan 2, 2016 at 3:26 PM, wrote: > > > > -- > > >>>>> "Brett" == Brett Cannon writes: > > > > > I opened > > > https://bugs.python.org/issue25711 to specifically try to > > > fix this issue once and for all and along the way modernize > > > zipimport by rewriting it from scratch to be more > > > maintainable > > > > Every time I read about implementing a custom loader: > > > > https://docs.python.org/3/library/importlib.html > > > > I've wondered why python does not have some sort of virtual > > filesystem layer to deal with locating modules/packages/support > > files. Virtual file systems seem like a good way to store data on a > > wide range of storage devices. > > > > Yeah, but most devices already implement a *real* filesystem, so the only time the VFS would come in handy would be for zipfiles, where we already have a solution. > > Just to point out that it would be nice to have an easier way to use something other than zipfiles. I have a need to exploit a different solution and have to patch the bootstrap code (because the zipfile support is handled as a special case). BTW the need is to create iOS and Android executables from frozen Python code.
> > Not quite sure about how zip files are a special-case beyond just being put in sys.meta_path automatically. You can get the same results with a .pth file or a sitecustomize.py depending on how pervasive your need is. Otherwise feel free to file an issue at bugs.python.org and we can talk over there about what you specifically need and if it's reasonable to try and support. I've created http://bugs.python.org/issue26007 and hope it's clear enough what the issue is. Thanks, Phil From guido at python.org Mon Jan 4 12:18:01 2016 From: guido at python.org (Guido van Rossum) Date: Mon, 4 Jan 2016 09:18:01 -0800 Subject: [Python-Dev] PEP 257 and __init__ In-Reply-To: References: Message-ID: On Sun, Jan 3, 2016 at 11:49 PM, Nick Coghlan wrote: > On 4 January 2016 at 17:01, Terry Reedy wrote: > > Ask the PSF/pypi people to either prohibit such names or require a > > disclaimer of some sort. They are inherently confusing: "I took a look > at > > pep008" does not mean that one even looked at the PEP. Even when the > > context makes clear that the referent is the module, there is confusion > as > > to its authoritativeness. That Facundo would post here about the module's > > output illustrates that. To me, the name copying violates our informal > > trademark within Pythonland on 'PEP####'. > > I don't think that's the right answer, as opinionated tools do serve a > useful purpose in preventing bikeshedding during code review (people > *expect* computers to be annoyingly pedantic, which frees up the human > reviewers to focus on higher level concerns). As projects evolve over > time, they may develop their own tweaks and customisations in their > style guide and switch to a more configurable tool, or they may not. > > When some of the default settings for the pep8 utility became a > problem, I was able to talk to the developers and persuade them to > tune their defaults to be more in line with the actual PEP text, and > keep their extensions to optional settings.
> > A similar approach may work for PEP 257, by clarifying which aspects > tools should be leaving to human judgement (beyond the question of > whether or not to opt in to following PEP 257 at all - it's far less > universal than PEP 8). > Hm. I don't want the PSF to flex its muscles about trademarks, but I still don't like that there are tools named after PEPs (especially since the tools are not written by the same people that wrote the PEPs). I still recall the first time someone emailed me about a "pep8" issue (I had never heard of the tool by that name) and I was thoroughly confused for a long time. That said I expect it's too late to try and get the pep8 authors to rename it; but I filed an issue with the pep257 project and they are going to change the name: https://github.com/GreenSteam/pep257/issues/172 . FWIW I am happy that the tools exist! They can be very useful and I use pep8 myself. But I always let it know who's boss. :-) -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From mike.romberg at comcast.net Mon Jan 4 13:27:42 2016 From: mike.romberg at comcast.net (mike.romberg at comcast.net) Date: Mon, 4 Jan 2016 11:27:42 -0700 Subject: [Python-Dev] zipimport.c broken with implicit namespace packages In-Reply-To: References: <22152.9649.202823.766027@lrd.home.lan> <22152.20121.966791.318438@lrd.home.lan> <22152.45520.973803.741563@lrd.home.lan> Message-ID: <22154.47518.523647.735239@lrd.home.lan> >>>>> " " == Brett Cannon writes: ... > It's a reasonable thing to consider, but it would be better to > get zipimport fixed for you, then rewritten To that end, I've added a patch to the issue tracker: https://bugs.python.org/issue17633 My patch is issue17633-3.diff which builds upon issue17633-2.diff in that it fixes an issue with the enumerated value used by find_loader (find_loader was returning -1 which was not even a valid enumerated value). 
I also expanded the test cases for zipimport to cover namespace packages spread between multiple zip archives and zip archives and real filesystems. Please let me know if there is anything else I can do. Thanks, Mike From barry at python.org Tue Jan 5 18:49:21 2016 From: barry at python.org (Barry Warsaw) Date: Tue, 5 Jan 2016 18:49:21 -0500 Subject: [Python-Dev] PEP 9 - plaintext PEP format - is officially deprecated Message-ID: <20160105184921.317ac5ec@limelight.wooz.org> I don't think this will be at all controversial. Brett suggested, and there was no disagreement from the PEP editors, that plain text PEPs be deprecated. reStructuredText is clearly a better format, and all recent PEP submissions have been in reST for a while now anyway. I am therefore withdrawing[*] PEP 9 and have made other appropriate changes to make it clear that only PEP 12 format is acceptable going forward. The PEP editors will not be converting the legacy PEPs to reST, nor will we currently be renaming the relevant PEP source files to end with ".rst" since there's too much tooling that would have to change to do so. However, if either task really interests you, please get in touch with the PEP editors. it-only-took-15-years-ly y'rs, -Barry (on behalf of the PEP editors) [*] Status: Withdrawn being about the only currently appropriate resolution status for process PEPs. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 819 bytes Desc: OpenPGP digital signature URL: From vadmium+py at gmail.com Wed Jan 6 22:06:13 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Thu, 7 Jan 2016 03:06:13 +0000 Subject: [Python-Dev] Branches in which to fix the SSL tests Message-ID: Currently some SSL tests in the test suite are broken by a recent certificate change at https://svn.python.org/; see for the bug report. The tests are broken when the test suite is run with the '-unetwork'
option enabled, and most of the buildbots appear to be affected. (In 3.6 the tests have temporarily been disabled as a workaround.) I have a simple patch that substitutes the old root certificate for the new which I would like to commit, but I'm not sure which branches to apply it to, or even which branches are open to normal maintenance and bug fixes. According to Larry , 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 branch should now be in security-fixes-only mode. However this branch still seems to get a lot of non-security action, for example the most recent bunch of changes were some work on the provisional 'pathlib' module. So firstly I would like some clarification on the status of 3.4 and what its future is. Secondly, I would normally say a fix for the test suite isn't really appropriate for the older security branches. But in the bug report, Koobs specifically requested this be fixed in 3.4 and possibly earlier branches as well. What do others think about this? From guido at python.org Wed Jan 6 23:17:10 2016 From: guido at python.org (Guido van Rossum) Date: Wed, 6 Jan 2016 20:17:10 -0800 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: Message-ID: On Wed, Jan 6, 2016 at 7:06 PM, Martin Panter wrote: > Currently some SSL tests in the test suite are broken by a recent > certificate change at https://svn.python.org/; see > for the bug report. The tests are > broken when the test suite is run with the '-unetwork' option enabled, > and most of the buildbots appear to be affected. (In 3.6 the tests > have temporarily been disabled as a workaround.) I have a simple patch > that substitutes the old root certificate for the new which I would > like to commit, but I'm not sure which branches to apply it to, or > even which branches are open to normal maintenance and bug fixes.
> > According to Larry > , > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > branch should now be in security-fixes-only mode. However this branch > still seems to get a lot of non-security action, for example the most > recent bunch of changes were some work on the provisional 'pathlib' > module. So firstly I would like some clarification on the status of > 3.4 and what its future is. > To me Larry's email mainly indicates that we're not going to do more binary releases in the 3.4 branch. The work I did on pathlib is probably never going to be released in that branch -- but since I merged it into 3.5 it's not going to waste, and the effort was pretty minimal. (And people *could* still pick it up from the source.) > Secondly, I would normally say a fix for the test suite isn't really > appropriate for the older security branches. But in the bug report, > Koobs specifically requested this be fixed in 3.4 and possibly earlier > branches as well. What do others think about this? > It should definitely be fixed in 2.7, 3.5 and 3.6. If you want to do it in 3.4 too that sounds totally fine (it's just one extra merge). Are there still any active buildbots for 3.4? The question is, why? I guess one reason might be to ensure that if we *do* have to do an emergency release of 3.4 (it's always a possibility), it would be easier if the tests were all known to be passing, otherwise doing a release (even a source-only release) would be a big hassle. OTOH if we don't make changes the tests will generally not start failing -- the current issue notwithstanding. :-) So if buildbots are a scarce resource it's fine to repurpose them for more recent releases. Oh, finally. I'm eager to answer this but I'm not actually the best resource here -- I'm pretty rusty where it comes to our build and test practices. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From benjamin at python.org Thu Jan 7 01:02:04 2016 From: benjamin at python.org (Benjamin Peterson) Date: Wed, 06 Jan 2016 22:02:04 -0800 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: Message-ID: <1452146524.2360668.485185474.2107A656@webmail.messagingengine.com> We should fix hard-failing tests in all branches to which releases might be made in the future. It's important to be able to run the tests successfully before releasing. On Wed, Jan 6, 2016, at 19:06, Martin Panter wrote: > Currently some SSL tests in the test suite are broken by a recent > certificate change at https://svn.python.org/; see > for the bug report. The tests are > broken when the test suite is run with the '-unetwork' option enabled, > and most of the buildbots appear to be affected. (In 3.6 the tests > have temporarily been disabled as a workaround.) I have a simple patch > that substitutes the old root certificate for the new one, which I would > like to commit, but I'm not sure which branches to apply it to, or > even which branches are open to normal maintenance and bug fixes. > > According to Larry > , > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > branch should now be in security-fixes-only mode. However this branch > still seems to get a lot of non-security action, for example the most > recent bunch of changes were some work on the provisional 'pathlib' > module. So firstly I would like some clarification on the status of > 3.4 and what its future is. > > Secondly, I would normally say a fix for the test suite isn't really > appropriate for the older security branches. But in the bug report, > Koobs specifically requested this be fixed in 3.4 and possibly earlier > branches as well. What do others think about this? 
> _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/benjamin%40python.org From nad at python.org Thu Jan 7 01:52:17 2016 From: nad at python.org (Ned Deily) Date: Thu, 7 Jan 2016 01:52:17 -0500 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: Message-ID: <95858B31-AFC9-48FF-8735-55A13FE6CAD3@python.org> On Jan 6, 2016, at 23:17, Guido van Rossum wrote: > On Wed, Jan 6, 2016 at 7:06 PM, Martin Panter wrote: > Currently some SSL tests in the test suite are broken by a recent > certificate change at https://svn.python.org/; see > for the bug report. The tests are > broken when the test suite is run with the '-unetwork' option enabled, > and most of the buildbots appear to be affected. (In 3.6 the tests > have temporarily been disabled as a workaround.) I have a simple patch > that substitutes the old root certificate for the new one, which I would > like to commit, but I'm not sure which branches to apply it to, or > even which branches are open to normal maintenance and bug fixes. > > According to Larry > , > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > branch should now be in security-fixes-only mode. However this branch > still seems to get a lot of non-security action, for example the most > recent bunch of changes were some work on the provisional 'pathlib' > module. So firstly I would like some clarification on the status of > 3.4 and what its future is. > > To me Larry's email mainly indicates that we're not going to do more binary releases in the 3.4 branch. The work I did on pathlib is probably never going to be released in that branch -- but since I merged it into 3.5 it's not going to waste, and the effort was pretty minimal. (And people *could* still pick it up from the source.) 
> > Secondly, I would normally say a fix for the test suite isn't really > appropriate for the older security branches. But in the bug report, > Koobs specifically requested this be fixed in 3.4 and possibly earlier > branches as well. What do others think about this? > > It should definitely be fixed in 2.7, 3.5 and 3.6. If you want to do it in 3.4 too that sounds totally fine (it's just one extra merge). The Developer's Guide describes current practices for backporting of fixes to security-fix branches (https://docs.python.org/devguide/devcycle.html#security-branches): "The only changes made to a security branch are those fixing issues exploitable by attackers such as crashes, privilege escalation and, optionally, other issues such as denial of service attacks. Any other changes are **not** considered a security risk and thus not backported to a security branch." Benjamin brings up a good point, though, about the importance of fixing hard-failures in the test suite. I've added the following to the above paragraph in the guide: "You should also consider fixing hard-failing tests in open security branches since it is important to be able to run the tests successfully before releasing." Also, I've tried to update the information in the Developer's Guide regarding branches and releases to match the current state of the world, e.g. 3.6 is the feature release under development, 3.5 and 2.7 are the current maintenance branches, and the current security-fix-only branches are 3.4, 3.3, and (for one more month) 3.2. (The web site should update within the next day.) Hope that helps! --Ned -- Ned Deily nad at python.org -- [] From mal at egenix.com Thu Jan 7 04:57:08 2016 From: mal at egenix.com (M.-A. 
Lemburg) Date: Thu, 7 Jan 2016 10:57:08 +0100 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: Message-ID: <568E3674.3010203@egenix.com> On 07.01.2016 04:06, Martin Panter wrote: > Currently some SSL tests in the test suite are broken by a recent > certificate change at https://svn.python.org/; see > for the bug report. The tests are > broken when the test suite is run with the '-unetwork' option enabled, > and most of the buildbots appear to be affected. (In 3.6 the tests > have temporarily been disabled as a workaround.) I have a simple patch > that substitutes the old root certificate for the new one, which I would > like to commit, but I'm not sure which branches to apply it to, or > even which branches are open to normal maintenance and bug fixes. As mentioned on the issue tracker I'm not convinced that your patch is a good solution going forward. Rather than basing the test on svn.python.org, which can change again in the future, we should use the domain we have specifically reserved for stdlib tests, namely pythontest.net, and get this set up in a way so that we can run SSL tests again. I can help with getting SSL certificates for pythontest.net, since I'm managing the PSF SSL certificates (well, trying to keep those managed anyway ;-) - I was not involved in the choice of new CA for svn.python.org): https://wiki.python.org/psf/PSF%20SSL%20Certificates Regarding which branches to apply the final fix, others have already chimed in. Essentially, any branch which we'll need to run tests on in the foreseeable future will need to be fixed, since the change in svn.python.org's SSL setup (the expired certificate which caused the tests to fail was replaced with a new one from a different CA, again causing tests to fail) affects all released test suites. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jan 07 2016) >>> Python Projects, Coaching and Consulting ... 
http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ From vadmium+py at gmail.com Thu Jan 7 06:36:54 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Thu, 7 Jan 2016 11:36:54 +0000 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: <568E3674.3010203@egenix.com> References: <568E3674.3010203@egenix.com> Message-ID: On 7 January 2016 at 09:57, M.-A. Lemburg wrote: > As mentioned on the issue tracker I'm not convinced that your > patch is a good solution going forward. Rather than basing > the test on svn.python.org, which can change again in the > future, we should use the domain we have specifically > reserved for stdlib tests, namely pythontest.net and get this > set up in a way so that we can run SSL tests again. I agree my first patch is not really sustainable; I only thought of it as a temporary workaround. If we want to change to pythontest.net, it seems there is already an SSL certificate set up with https://self-signed.pythontest.net, already used by other tests. If people want, I should be able to make a straightforward patch to switch to that. But I would prefer to go with something like my newer local-server.v2.patch, which essentially switches to a local server run for the duration of each test. It is just a question of finding someone to review my changes. > Regarding which branches to apply the final fix, others have > already chimed in. 
Essentially, any branch which we'll need > to run tests on in the foreseeable future will need to be fixed Yes, it seems that is a reasonable conclusion. In this case that would indeed mean all branches including 3.2. Thanks everyone for the input. From dario670 at gmail.com Thu Jan 7 03:14:38 2016 From: dario670 at gmail.com (Dario Roman) Date: Thu, 7 Jan 2016 03:14:38 -0500 Subject: [Python-Dev] Build bots situations Message-ID: I have develop few years ago a small net bit that could help , I was giving my self into it but unpleasant to my life , that they want java I never stood from them . It was develop to run that bit in python strictly but a combination with servers and I net stream sockets I could fix the problem . Nice to meet with you I consider your explanation as mentor you know the industry I built the bot to work with the Alice project and the dictionary recipients of sparkov , it it function right know but feds and Police try to hold the main program , when I find out I try to get online with the webmaster and he could not pass the permit he neglected so fast , that for me was conducting right . Anyways take care I you want copy of my work send me email to serocom13 at gmail.com, I born in this code an still understanding what was before , me my self being human and loves beers . I would like to overcome your language and try it ti put your self in another way of life , you are my mentor .... Thanks for every thing ... Serocom13 at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From larry at hastings.org Thu Jan 7 11:32:44 2016 From: larry at hastings.org (Larry Hastings) Date: Thu, 7 Jan 2016 11:32:44 -0500 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: Message-ID: <568E932C.4080509@hastings.org> On 01/06/2016 10:06 PM, Martin Panter wrote: > According to Larry > , > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > branch should now be in security-fixes-only mode. You assume correctly. > However this branch > still seems to get a lot of non-security action, for example the most > recent bunch of changes were some work on the provisional ?pathlib? > module. I haven't looked at the changes you mention (I'm on a trip) but... what am I to do? Beyond tagging all future 3.4 releases from a repository only I have write access to and never pulling from hg.python.org, I don't really have any power to enforce the condition. I'm left trusting the better judgment of the Python core dev community. I feel like a parent who's caught his kid reading after bedtime using a flashlight under the covers. I mean, no, that's not what you should be doing right now. But fixing bugs in Python is hardly the worst thing in the world. //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu Jan 7 11:52:09 2016 From: brett at python.org (Brett Cannon) Date: Thu, 07 Jan 2016 16:52:09 +0000 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: <568E3674.3010203@egenix.com> Message-ID: On Thu, 7 Jan 2016 at 03:37 Martin Panter wrote: > On 7 January 2016 at 09:57, M.-A. Lemburg wrote: > > As mentioned on the issue tracker I'm not convinced that your > > patch is a good solution going forward. 
Rather than basing > > the test on svn.python.org, which can change again in the > > future, we should use the domain we have specifically > > reserved for stdlib tests, namely pythontest.net and get this > > set up in a way so that we can run SSL tests again. > > I agree my first patch is not really sustainable; I only thought of it > as a temporary workaround. > > If we want to change to pythontest.net, it seems there is already an > SSL certificate set up with https://self-signed.pythontest.net, > already used by other tests. If people want, I should be able to make > a straightforward patch to switch to that. > > But I would prefer to go with something like my newer > local-server.v2.patch, which essentially switches to a local server > run for the duration of each test. It is just a question of finding > someone to review my changes. > I say update bugfix branches with simple pythontest.net and consider updating default w/ your local server solution as there we can watch the tests to make sure the test is stable, etc. -Brett > > > Regarding which branches to apply the final fix, others have > > already chimed in. Essentially, any branch which we'll need > > to run tests on in the foreseeable future will need to be fixed > > Yes, it seems that is a reasonable conclusion. In this case that would > indeed mean all branches including 3.2. Thanks everyone for the input. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
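Brett's reply above distinguishes the quick fix (pointing bugfix branches at pythontest.net) from Martin's local-server approach for default. A minimal model of the latter — an editor's sketch, not the actual local-server.v2.patch, and using plain HTTP rather than the patch's TLS setup so it stays self-contained — spins up a throwaway server that lives only for the duration of the test:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a fixed body; the real patch would serve over TLS.
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

# Port 0 asks the OS for any free port, so parallel tests never collide.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
try:
    url = "http://127.0.0.1:%d/" % server.server_address[1]
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
finally:
    server.shutdown()
```

Because the server is created and torn down inside the test, nothing depends on an external host's certificate chain.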
URL: From victor.stinner at gmail.com Thu Jan 7 12:04:12 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 7 Jan 2016 18:04:12 +0100 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: <568E932C.4080509@hastings.org> References: <568E932C.4080509@hastings.org> Message-ID: Hi, 2016-01-07 17:32 GMT+01:00 Larry Hastings : > On 01/06/2016 10:06 PM, Martin Panter wrote: > > According to Larry > , > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > branch should now be in security-fixes-only mode. Would it be possible to have a (clear and up to date) table like http://docs.openstack.org/releases/ in the Developer Guide? List of Python versions with their status (end of life, security supported, current stable release, under development). Victor From brett at python.org Thu Jan 7 12:10:33 2016 From: brett at python.org (Brett Cannon) Date: Thu, 07 Jan 2016 17:10:33 +0000 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: <568E932C.4080509@hastings.org> Message-ID: On Thu, 7 Jan 2016 at 09:06 Victor Stinner wrote: > Hi, > > 2016-01-07 17:32 GMT+01:00 Larry Hastings : > > On 01/06/2016 10:06 PM, Martin Panter wrote: > > > > According to Larry > > >, > > 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 > > branch should now be in security-fixes-only mode. > > Would it be possible to have a (clear and up to date) table like > http://docs.openstack.org/releases/ in the Developer Guide? List of > Python versions with their status (end of life, security supported, > current stable release, under development). > Sure, someone just has to write it and then either make sure it stays up-to-date or add it as another release step for RMs to follow. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mal at egenix.com Thu Jan 7 12:56:41 2016 From: mal at egenix.com (M.-A. 
Lemburg) Date: Thu, 7 Jan 2016 18:56:41 +0100 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: <568E3674.3010203@egenix.com> Message-ID: <568EA6D9.3070703@egenix.com> On 07.01.2016 17:52, Brett Cannon wrote: > On Thu, 7 Jan 2016 at 03:37 Martin Panter wrote: > >> On 7 January 2016 at 09:57, M.-A. Lemburg wrote: >>> As mentioned on the issue tracker I'm not convinced that your >>> patch is a good solution going forward. Rather than basing >>> the test on svn.python.org, which can change again in the >>> future, we should use the domain we have specifically >>> reserved for stdlib tests, namely pythontest.net and get this >>> set up in a way so that we can run SSL tests again. >> >> I agree my first patch is not really sustainable; I only thought of it >> as a temporary workaround. >> >> If we want to change to pythontest.net, it seems there is already an >> SSL certificate set up with https://self-signed.pythontest.net, >> already used by other tests. If people want, I should be able to make >> a straightforward patch to switch to that. >> >> But I would prefer to go with something like my newer >> local-server.v2.patch, which essentially switches to a local server >> run for the duration of each test. It is just a question of finding >> someone to review my changes. >> > > I say update bugfix branches with simple pythontest.net and consider > updating default w/ your local server solution as there we can watch the > tests to make sure the test is stable, etc. +1 > -Brett > > >> >>> Regarding which branches to apply the final fix, others have >>> already chimed in. Essentially, any branch which we'll need >>> to run tests on in the foreseeable future will need to be fixed >> >> Yes, it seems that is a reasonable conclusion. In this case that would >> indeed mean all branches including 3.2. Thanks everyone for the input. 
>> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jan 07 2016) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ From nad at python.org Thu Jan 7 14:38:43 2016 From: nad at python.org (Ned Deily) Date: Thu, 7 Jan 2016 14:38:43 -0500 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: References: <568E932C.4080509@hastings.org> Message-ID: <4D3394EC-9E9E-45B8-8C55-79C6965D2AE0@python.org> On Jan 7, 2016, at 12:04, Victor Stinner wrote: > 2016-01-07 17:32 GMT+01:00 Larry Hastings : >> On 01/06/2016 10:06 PM, Martin Panter wrote: >> >> According to Larry >> , >> 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 >> branch should now be in security-fixes-only mode. > > Would it be possible to have a (clear and up to date) table like > http://docs.openstack.org/releases/ in the Developer Guide? 
List of > Python versions with their status (end of life, security supported, > current stable release, under development). Yes, it's a good idea. It's on my to-do list. -- Ned Deily nad at python.org -- [] From blake.a.griffith at gmail.com Thu Jan 7 17:26:23 2016 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 7 Jan 2016 23:26:23 +0100 Subject: [Python-Dev] bitwise operations for bytes and bytearray Message-ID: Hi! I'm interested in adding the functionality to do something like: >>> b'a' ^ b'b' b'\x03' Instead of the good ol' TypeError. I think both bytes and bytearray should support all the bitwise operations. I've never hacked on cpython before. I'm starting by just trying to add xor to bytearray. I have a ByteArray_Xor function that I think should do this here https://github.com/cowlicks/cpython/commit/d6dddb11cdb33032b39dcb9dfdaa7b10d4377b5f But I'm not sure how to hook this in to the rest of cypython. I tried adding it where bytearray_as_sequence is declared in this bytearrayobject.c file. But that gave me compiler warnings and broke things. So now that I have this ByteArray_Xor function, how do I make it be bytearray.__xor___? Thanks! Blake -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu Jan 7 17:38:29 2016 From: brett at python.org (Brett Cannon) Date: Thu, 07 Jan 2016 22:38:29 +0000 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: Message-ID: On Thu, 7 Jan 2016 at 14:29 Blake Griffith wrote: > Hi! > > I'm interested in adding the functionality to do something like: > > >>> b'a' ^ b'b' > b'\x03' > > > Instead of the good ol' TypeError. > > I think both bytes and bytearray should support all the bitwise operations. > > I've never hacked on cpython before. I'm starting by just trying to add > xor to bytearray. 
I have a ByteArray_Xor function that I think should do > this here > https://github.com/cowlicks/cpython/commit/d6dddb11cdb33032b39dcb9dfdaa7b10d4377b5f > > But I'm not sure how to hook this in to the rest of cypython. I tried > adding it where bytearray_as_sequence is declared in this > bytearrayobject.c file. But that gave me compiler warnings and broke things. > > So now that I have this ByteArray_Xor function, how do I make it be > bytearray.__xor___? > You need to set the PyNumberMethods struct with the appropriate function and then set that on the PyTypeObject. Look at https://hg.python.org/cpython/file/tip/Include/object.h#l237 and https://hg.python.org/cpython/file/tip/Objects/longobject.c#l5238 for an idea of what it takes. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jjevnik at quantopian.com Thu Jan 7 17:39:12 2016 From: jjevnik at quantopian.com (Joe Jevnik) Date: Thu, 7 Jan 2016 17:39:12 -0500 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: Message-ID: You want to put the `xor` method in the `nb_xor` field of the `PyNumberMethods` structure that lives in the `tp_as_number` field of the bytes type object. Two things I noticed in a quick pass: you might want to add some type checking around the case where `a` or `b` is not a `PyByteArray` object. Also, variable length arrays are added in C99, so you will need to manually allocate the `raw_*` arrays in the heap. This seems like a cool feature! On Thu, Jan 7, 2016 at 5:26 PM, Blake Griffith wrote: > Hi! > > I'm interested in adding the functionality to do something like: > > >>> b'a' ^ b'b' > b'\x03' > > > Instead of the good ol' TypeError. > > I think both bytes and bytearray should support all the bitwise operations. > > I've never hacked on cpython before. I'm starting by just trying to add > xor to bytearray. 
I have a ByteArray_Xor function that I think should do > this here > https://github.com/cowlicks/cpython/commit/d6dddb11cdb33032b39dcb9dfdaa7b10d4377b5f > > But I'm not sure how to hook this in to the rest of cypython. I tried > adding it where bytearray_as_sequence is declared in this > bytearrayobject.c file. But that gave me compiler warnings and broke things. > > So now that I have this ByteArray_Xor function, how do I make it be > bytearray.__xor___? > > Thanks! > > Blake > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/joe%40quantopian.com > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vadmium+py at gmail.com Thu Jan 7 18:57:24 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Thu, 7 Jan 2016 23:57:24 +0000 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: Message-ID: On 7 January 2016 at 22:26, Blake Griffith wrote: > I'm interested in adding the functionality to do something like: > >>>> b'a' ^ b'b' > b'\x03' > > > Instead of the good ol' TypeError. > > I think both bytes and bytearray should support all the bitwise operations. There is a bug open about adding this kind of functionality: . From abarnert at yahoo.com Thu Jan 7 19:12:06 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Thu, 7 Jan 2016 16:12:06 -0800 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: Message-ID: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> On Jan 7, 2016, at 15:57, Martin Panter wrote: > >> On 7 January 2016 at 22:26, Blake Griffith wrote: >> I'm interested in adding the functionality to do something like: >> >>>>> b'a' ^ b'b' >> b'\x03' >> >> >> Instead of the good ol' TypeError. >> >> I think both bytes and bytearray should support all the bitwise operations. 
> > There is a bug open about adding this kind of functionality: > . And it's in the needs patch stage, which makes it perfect for the OP: in addition to learning how to hack on builtin types, he can also learn the other parts of the dev process. (Even if the bug is eventually rejected, as seems likely given that it sat around for three years with no compelling use case and then Guido added a "very skeptical" comment.) From blake.a.griffith at gmail.com Thu Jan 7 20:06:06 2016 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Fri, 8 Jan 2016 02:06:06 +0100 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> References: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> Message-ID: Thanks for the quick responses y'all. I have something compiling on my branch, which is enough for me tonight. I asked a question about this on stackoverflow a while ago, it wasn't very popular https://stackoverflow.com/questions/32658420/why-cant-you-xor-bytes-objects-in-python Someone there pointed out this feature was suggested on the mailing list a while back (2006) https://mail.python.org/pipermail/python-dev/2006-March/061980.html On Fri, Jan 8, 2016 at 1:12 AM, Andrew Barnert wrote: > On Jan 7, 2016, at 15:57, Martin Panter wrote: > > > >> On 7 January 2016 at 22:26, Blake Griffith > wrote: > >> I'm interested in adding the functionality to do something like: > >> > >>>>> b'a' ^ b'b' > >> b'\x03' > >> > >> > >> Instead of the good ol' TypeError. > >> > >> I think both bytes and bytearray should support all the bitwise > operations. > > > > There is a bug open about adding this kind of functionality: > > . > > And it's in the needs patch stage, which makes it perfect for the OP: in > addition to learning how to hack on builtin types, he can also learn the > other parts of the dev process. 
(Even if the bug is eventually rejected, as > seems likely given that it sat around for three years with no compelling > use case and then Guido added a "very skeptical" comment.) -------------- next part -------------- An HTML attachment was scrubbed... URL: From cs at zip.com.au Thu Jan 7 20:08:20 2016 From: cs at zip.com.au (Cameron Simpson) Date: Fri, 8 Jan 2016 12:08:20 +1100 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> References: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> Message-ID: <20160108010820.GA6099@cskk.homeip.net> On 07Jan2016 16:12, Python-Dev wrote: >On Jan 7, 2016, at 15:57, Martin Panter wrote: >>> On 7 January 2016 at 22:26, Blake Griffith wrote: >>> I'm interested in adding the functionality to do something like: >>>>>> b'a' ^ b'b' >>> b'\x03' >>> Instead of the good ol' TypeError. >>> >>> I think both bytes and bytearray should support all the bitwise operations. >> >> There is a bug open about adding this kind of functionality: >> . > >And it's in the needs patch stage, which makes it perfect for the OP: in >addition to learning how to hack on builtin types, he can also learn the other >parts of the dev process. (Even if the bug is eventually rejected, as seems >likely given that it sat around for three years with no compelling use case >and then Guido added a "very skeptical" comment.) The use case which springs immediately to my mind is cryptography. To encrypt a stream symmetrically you can go: cleartext-bytes ^ cryptographicly-random-bytes-from-cipher so with this one could write: def crypted(byteses, crypto_source): ''' Accept an iterable source of bytes objects and a preprimed source of crypto bytes, yield encrypted versions of the bytes objects. 
''' for bs in byteses: cbs = crypto_source.next_bytes(len(bs)) yield bs ^ cbs Cheers, Cameron Simpson From ncoghlan at gmail.com Fri Jan 8 07:45:39 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 8 Jan 2016 22:45:39 +1000 Subject: [Python-Dev] Branches in which to fix the SSL tests In-Reply-To: <4D3394EC-9E9E-45B8-8C55-79C6965D2AE0@python.org> References: <568E932C.4080509@hastings.org> <4D3394EC-9E9E-45B8-8C55-79C6965D2AE0@python.org> Message-ID: On 8 January 2016 at 05:38, Ned Deily wrote: > On Jan 7, 2016, at 12:04, Victor Stinner wrote: >> 2016-01-07 17:32 GMT+01:00 Larry Hastings : >>> On 01/06/2016 10:06 PM, Martin Panter wrote: >>> >>> According to Larry >>> , >>> 3.4.4 was the last bug fix release for 3.4, so I assumed the 3.4 >>> branch should now be in security-fixes-only mode. >> >> Would it be possible to have a (clear and up to date) table like >> http://docs.openstack.org/releases/ in the Developer Guide? List of >> Python versions with their status (end of life, security supported, >> current stable release, under development). > > Yes, it's a good idea. It's on my to-do list. PHP's support status page is one of the nicest examples of this I've seen (someone mentioned it last time this question came up): http://php.net/supported-versions.php Like a lot of things though, "yes, that's a good idea" is easy, having it actually get to the top of someone's todo list is a different question :) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From status at bugs.python.org Fri Jan 8 12:08:35 2016 From: status at bugs.python.org (Python tracker) Date: Fri, 8 Jan 2016 18:08:35 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20160108170835.83A12566EA@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2016-01-01 - 2016-01-08) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
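Cameron Simpson's `crypted()` sketch above relies on the proposed `bytes ^ bytes`; it can be run in today's Python by spelling the XOR out explicitly. This is an editor's sketch: his `crypto_source.next_bytes()` is modeled with `os.urandom` plus a wrapper that records the keystream so decryption can be demonstrated.

```python
import os

def xor_bytes(a, b):
    # The byte-wise XOR the proposal would spell simply as a ^ b.
    return bytes(x ^ y for x, y in zip(a, b))

def crypted(byteses, next_bytes):
    # Cameron's generator, with the ^ replaced by xor_bytes().
    for bs in byteses:
        yield xor_bytes(bs, next_bytes(len(bs)))

cleartext = [b"hello", b"world"]
key_chunks = []

def next_bytes(n):
    k = os.urandom(n)     # stand-in for a real cipher keystream
    key_chunks.append(k)  # recorded so we can decrypt below
    return k

encrypted = list(crypted(cleartext, next_bytes))
# XORing with the same keystream again recovers the cleartext.
decrypted = [xor_bytes(c, k) for c, k in zip(encrypted, key_chunks)]
assert decrypted == cleartext
```

As in Cameron's example, symmetric stream encryption and decryption are the same operation, which is exactly what makes a built-in `bytes.__xor__` attractive for this use case.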
Issues counts and deltas: open 5371 (+30) closed 32436 (+34) total 37807 (+64) Open issues with patches: 2362 Issues opened (50) ================== #6500: urllib2 maximum recursion depth exceeded http://bugs.python.org/issue6500 reopened by berker.peksag #22995: Restrict default pickleability http://bugs.python.org/issue22995 reopened by barry #25986: Collections.deque maxlen: added in 2.6 or 2.7? http://bugs.python.org/issue25986 opened by terry.reedy #25987: collections.abc.Reversible http://bugs.python.org/issue25987 opened by abarnert #25988: collections.abc.Indexable http://bugs.python.org/issue25988 opened by abarnert #25989: documentation version switcher is broken fro 2.6, 3.2, 3.3 http://bugs.python.org/issue25989 opened by Vincentdavis #25991: readline example eventually consumes all memory http://bugs.python.org/issue25991 opened by dangyogi #25992: test_gdb fails http://bugs.python.org/issue25992 opened by Bryce Miller #25993: Crashed when call time.time() after using _mm_xor_si64 http://bugs.python.org/issue25993 opened by Xiongzhi Gao #25994: File descriptor leaks in os.scandir() http://bugs.python.org/issue25994 opened by serhiy.storchaka #25995: os.walk() consumes a lot of file descriptors http://bugs.python.org/issue25995 opened by serhiy.storchaka #25996: Add support of file descriptor in os.scandir() http://bugs.python.org/issue25996 opened by serhiy.storchaka #25998: doctest terminates when accessing __wrapped__ raises an error http://bugs.python.org/issue25998 opened by Lasse Schuirmann #26000: Crash in Tokenizer - Heap-use-after-free http://bugs.python.org/issue26000 opened by William Bowling #26001: Tutorial: write() does not expect string in binary mode http://bugs.python.org/issue26001 opened by Dimitri Papadopoulos Orfanos #26002: make statistics.median_grouped more efficient http://bugs.python.org/issue26002 opened by upendra-k14 #26003: Issues with PyEval_InitThreads and PyGILState_Ensure http://bugs.python.org/issue26003 opened by 
tzickel #26004: pip install lifetimes - throwing error and unable to install p http://bugs.python.org/issue26004 opened by dudestc #26005: Denial of Service in SimpleHTTPServer and BaseHTTPServer http://bugs.python.org/issue26005 opened by Richard Clifford #26007: Request for Support for Embedding the Standard Library in an E http://bugs.python.org/issue26007 opened by philthompson10 #26009: HTMLParser lacking a few features to reconstruct input exactly http://bugs.python.org/issue26009 opened by jason_s #26010: document CO_* constants http://bugs.python.org/issue26010 opened by yselivanov #26011: Document necesities for cmp argument of sorted http://bugs.python.org/issue26011 opened by krichter #26013: Pickle protocol 2.0 not loading in python 3.5 http://bugs.python.org/issue26013 opened by anilredshift #26014: Guide users to the newer package install instructions http://bugs.python.org/issue26014 opened by ncoghlan #26015: Add new tests for pickling iterators of mutable sequences http://bugs.python.org/issue26015 opened by serhiy.storchaka #26016: io.TextIOWrapper.tell() report 65bit number when mix readline( http://bugs.python.org/issue26016 opened by EcmaXp_ #26017: Update https://docs.python.org/3/installing/index.html to alwa http://bugs.python.org/issue26017 opened by brett.cannon #26018: documentation of ZipFile file name encoding http://bugs.python.org/issue26018 opened by gagern #26019: collections.abc documentation incomplete http://bugs.python.org/issue26019 opened by abarnert #26020: set_display evaluation order doesn't match documented behaviou http://bugs.python.org/issue26020 opened by Hamish Campbell #26023: Missing signatures operator module http://bugs.python.org/issue26023 opened by Freddy Rietdijk #26024: Non-ascii Windows locale names http://bugs.python.org/issue26024 opened by vidartf #26025: Document pathlib.Path.__truediv__() http://bugs.python.org/issue26025 opened by brett.cannon #26027: Support Path objects in the posix module 
http://bugs.python.org/issue26027 opened by serhiy.storchaka #26029: Broken sentence in extending documentation http://bugs.python.org/issue26029 opened by sizeof #26030: Use PEP8 in documentation examples http://bugs.python.org/issue26030 opened by sizeof #26031: Add stat caching option to pathlib http://bugs.python.org/issue26031 opened by gvanrossum #26032: Use scandir() to speed up pathlib globbing http://bugs.python.org/issue26032 opened by gvanrossum #26033: distutils default compiler API is incomplete http://bugs.python.org/issue26033 opened by stefan #26034: venv documentation out of date http://bugs.python.org/issue26034 opened by dsadowski #26035: traceback.print_tb() takes `tb`, not `traceback` as a keyword http://bugs.python.org/issue26035 opened by Nicholas Chammas #26036: Unnecessary arguments on smtpd.SMTPServer http://bugs.python.org/issue26036 opened by sleepycal #26037: Crash when reading sys.stdin.buffer in a daemon thread http://bugs.python.org/issue26037 opened by eph ??? 
#26038: zipfile cannot handle zip files where the archive size for a f http://bugs.python.org/issue26038 opened by Brett Rosen #26039: More flexibility in zipfile interface http://bugs.python.org/issue26039 opened by takluyver #26040: Improve coverage and rigour of test.test_math http://bugs.python.org/issue26040 opened by jeff.allen #26041: Update deprecation messages of platform.dist() and platform.li http://bugs.python.org/issue26041 opened by berker.peksag #26045: Improve error message for http.client when posting unicode str http://bugs.python.org/issue26045 opened by Emil Stenström #26049: Poor performance when reading large xmlrpc data http://bugs.python.org/issue26049 opened by pokoli Most recent 15 issues with no replies (15) ========================================== #26041: Update deprecation messages of platform.dist() and platform.li http://bugs.python.org/issue26041 #26040: Improve coverage and rigour of test.test_math http://bugs.python.org/issue26040 #26038: zipfile cannot handle zip files where the archive size for a f http://bugs.python.org/issue26038 #26034: venv documentation out of date http://bugs.python.org/issue26034 #26033: distutils default compiler API is incomplete http://bugs.python.org/issue26033 #26027: Support Path objects in the posix module http://bugs.python.org/issue26027 #26025: Document pathlib.Path.__truediv__() http://bugs.python.org/issue26025 #26018: documentation of ZipFile file name encoding http://bugs.python.org/issue26018 #26015: Add new tests for pickling iterators of mutable sequences http://bugs.python.org/issue26015 #26014: Guide users to the newer package install instructions http://bugs.python.org/issue26014 #26003: Issues with PyEval_InitThreads and PyGILState_Ensure http://bugs.python.org/issue26003 #26002: make statistics.median_grouped more efficient http://bugs.python.org/issue26002 #25993: Crashed when call time.time() after using _mm_xor_si64 http://bugs.python.org/issue25993 #25991: readline example
eventually consumes all memory http://bugs.python.org/issue25991 #25989: documentation version switcher is broken fro 2.6, 3.2, 3.3 http://bugs.python.org/issue25989 Most recent 15 issues waiting for review (15) ============================================= #26049: Poor performance when reading large xmlrpc data http://bugs.python.org/issue26049 #26045: Improve error message for http.client when posting unicode str http://bugs.python.org/issue26045 #26039: More flexibility in zipfile interface http://bugs.python.org/issue26039 #26038: zipfile cannot handle zip files where the archive size for a f http://bugs.python.org/issue26038 #26035: traceback.print_tb() takes `tb`, not `traceback` as a keyword http://bugs.python.org/issue26035 #26031: Add stat caching option to pathlib http://bugs.python.org/issue26031 #26029: Broken sentence in extending documentation http://bugs.python.org/issue26029 #26020: set_display evaluation order doesn't match documented behaviou http://bugs.python.org/issue26020 #26019: collections.abc documentation incomplete http://bugs.python.org/issue26019 #26017: Update https://docs.python.org/3/installing/index.html to alwa http://bugs.python.org/issue26017 #26015: Add new tests for pickling iterators of mutable sequences http://bugs.python.org/issue26015 #26013: Pickle protocol 2.0 not loading in python 3.5 http://bugs.python.org/issue26013 #26010: document CO_* constants http://bugs.python.org/issue26010 #26002: make statistics.median_grouped more efficient http://bugs.python.org/issue26002 #26001: Tutorial: write() does not expect string in binary mode http://bugs.python.org/issue26001 Top 10 most discussed issues (10) ================================= #25958: Implicit ABCs have no means of "anti-registration" http://bugs.python.org/issue25958 25 msgs #22570: Better stdlib support for Path objects http://bugs.python.org/issue22570 17 msgs #25596: regular files handled as directories in the glob module http://bugs.python.org/issue25596 10 
msgs #26007: Request for Support for Embedding the Standard Library in an E http://bugs.python.org/issue26007 10 msgs #19251: bitwise ops for bytes of equal length http://bugs.python.org/issue19251 9 msgs #22995: Restrict default pickleability http://bugs.python.org/issue22995 9 msgs #25940: SSL tests failed due to expired svn.python.org SSL certificate http://bugs.python.org/issue25940 9 msgs #25864: collections.abc.Mapping should include a __reversed__ that rai http://bugs.python.org/issue25864 7 msgs #26032: Use scandir() to speed up pathlib globbing http://bugs.python.org/issue26032 7 msgs #6500: urllib2 maximum recursion depth exceeded http://bugs.python.org/issue6500 6 msgs Issues closed (33) ================== #5501: Update multiprocessing docs re: freeze_support http://bugs.python.org/issue5501 closed by berker.peksag #16544: Add external link to ast docs http://bugs.python.org/issue16544 closed by orsenthil #18918: help('FILES') finds no documentation http://bugs.python.org/issue18918 closed by orsenthil #20440: Use the Py_SETREF macro http://bugs.python.org/issue20440 closed by serhiy.storchaka #20969: Author of EPUB version of Python docs is set to Unknown instea http://bugs.python.org/issue20969 closed by orsenthil #21221: Minor struct_time documentation bug http://bugs.python.org/issue21221 closed by orsenthil #21815: imaplib truncates some untagged responses http://bugs.python.org/issue21815 closed by r.david.murray #22709: restore accepting detached stdin in fileinput binary mode http://bugs.python.org/issue22709 closed by r.david.murray #24036: GB2312 codec is using a wrong covert table http://bugs.python.org/issue24036 closed by lemburg #24104: Use after free in xmlparser_setevents (2) http://bugs.python.org/issue24104 closed by serhiy.storchaka #24120: pathlib.(r)glob stops on PermissionDenied exception http://bugs.python.org/issue24120 closed by gvanrossum #24733: Logically Dead Code http://bugs.python.org/issue24733 closed by orsenthil #24898: 
Documentation for str.find() is confusing http://bugs.python.org/issue24898 closed by orsenthil #25637: Move non-collections-related ABCs out of collections.abc http://bugs.python.org/issue25637 closed by brett.cannon #25672: set SSL_MODE_RELEASE_BUFFERS http://bugs.python.org/issue25672 closed by python-dev #25813: co_flags section of inspect module docs out of date http://bugs.python.org/issue25813 closed by BreamoreBoy #25917: Fixing howto links in docs http://bugs.python.org/issue25917 closed by orsenthil #25990: Pydoc fails on Python file with nonlocal http://bugs.python.org/issue25990 closed by serhiy.storchaka #25997: Tarfile.add with bytes path is failing http://bugs.python.org/issue25997 closed by Patrik Dufresne #25999: Add support of negative number in bin() http://bugs.python.org/issue25999 closed by larry #26006: 32 bits python ctypes creates 64 bits process from 32 bits exe http://bugs.python.org/issue26006 closed by Artur Korobeynyk #26008: Different behaviour platform.linux_distribution() on Python2.7 http://bugs.python.org/issue26008 closed by eric.smith #26012: pathlib.Path().rglob() is fooled by symlink loops http://bugs.python.org/issue26012 closed by gvanrossum #26021: Missing IPv6 support for pypi.python.org http://bugs.python.org/issue26021 closed by dstufft #26022: string.replace(' ',' ') has to be called 2 times before it wo http://bugs.python.org/issue26022 closed by orsenthil #26026: True%2 is True http://bugs.python.org/issue26026 closed by SilentGhost #26028: .sort() Causing Strings to Be Listed on the same line http://bugs.python.org/issue26028 closed by SilentGhost #26042: Consider dropping magic number for more detailed .pyc file nam http://bugs.python.org/issue26042 closed by brett.cannon #26043: ON DELETE CASCADE does not work when using sqlite3 library http://bugs.python.org/issue26043 closed by Vitaminus Maximus #26044: Name mangling overrides externally defined names http://bugs.python.org/issue26044 closed by ethan.furman 
#26046: Typo in documentation of unittest http://bugs.python.org/issue26046 closed by python-dev #26047: argparse.ArgumentError documentation wrong http://bugs.python.org/issue26047 closed by SilentGhost #26048: New user in community http://bugs.python.org/issue26048 closed by ezio.melotti From blake.a.griffith at gmail.com Sat Jan 9 19:17:49 2016 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Sun, 10 Jan 2016 01:17:49 +0100 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: <20160108010820.GA6099@cskk.homeip.net> References: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> <20160108010820.GA6099@cskk.homeip.net> Message-ID: A little update, I got ^, &, and | working for bytearrays. You can view the diff here: https://github.com/python/cpython/compare/master...cowlicks:bitwise-bytes?expand=1 How does it look? Joe, is this how I should allocate the arrays? Am I freeing them properly? Am I checking the input enough? After some feedback, I'll probably add bitshifting and negation. Then work on bytes objects. Does this warrant a pep? On Fri, Jan 8, 2016 at 2:08 AM, Cameron Simpson wrote: > On 07Jan2016 16:12, Python-Dev wrote: > >> On Jan 7, 2016, at 15:57, Martin Panter wrote: >> >>> On 7 January 2016 at 22:26, Blake Griffith >>>> wrote: >>>> I'm interested in adding the functionality to do something like: >>>> >>>>> b'a' ^ b'b' >>>>>>> >>>>>> b'\x03' >>>> Instead of the good ol' TypeError. >>>> >>>> I think both bytes and bytearray should support all the bitwise >>>> operations. >>>> >>> >>> There is a bug open about adding this kind of functionality: >>> . >>> >> >> And it's in the needs patch stage, which makes it perfect for the OP: in >> addition to learning how to hack on builtin types, he can also learn the >> other parts of the dev process. (Even if the bug is eventually rejected, as >> seems likely given that it sat around for three years with no compelling >> use case and then Guido added a "very skeptical" comment.) 
>> > The use case which springs immediately to my mind is cryptography. To > encrypt a stream symmetrically you can go: > > cleartext-bytes ^ cryptographically-random-bytes-from-cipher > > so with this one could write: > > def crypted(byteses, crypto_source): > ''' Accept an iterable source of bytes objects and a preprimed source > of crypto bytes, yield encrypted versions of the bytes objects. > ''' > for bs in byteses: > cbs = crypto_source.next_bytes(len(bs)) > yield bs ^ cbs > > Cheers, > Cameron Simpson > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/blake.a.griffith%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From abarnert at yahoo.com Sat Jan 9 19:31:58 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Sat, 9 Jan 2016 16:31:58 -0800 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> <20160108010820.GA6099@cskk.homeip.net> Message-ID: <3C8E1B3A-1282-4932-B2B0-2D6FF51298D2@yahoo.com> On Jan 9, 2016, at 16:17, Blake Griffith wrote: > > A little update, I got ^, &, and | working for bytearrays. You can view the diff here: > https://github.com/python/cpython/compare/master...cowlicks:bitwise-bytes?expand=1 If you upload the diff to the issue on the tracker, the Rietveld code review app should be able to pick it up automatically, allowing people to comment on it inline, in a much nicer format than a mailing list thread. It's especially nice if you're adding things in stages--people who have been following along can just look at the changes between patch 3 and 4, while new people can look at all the changes in one go, etc. > How does it look? > Joe, is this how I should allocate the arrays? Am I freeing them properly?
> Am I checking the input enough? > > After some feedback, I'll probably add bitshifting and negation. Then work on bytes objects. > > Does this warrant a pep? Personally, I'd just make the case for the feature on the tracker issue. If one of the core devs thinks it needs a PEP, or further discussion on this list or -ideas, they'll say so there. At present, it seems like there's not much support for the idea, but I think that's at least partly because people want to see realistic use cases (that aren't served better by the existing bitarray/bitstring/etc. modules on PyPI, or using a NumPy array, or just using ints, etc.). >> On Fri, Jan 8, 2016 at 2:08 AM, Cameron Simpson wrote: >>> On 07Jan2016 16:12, Python-Dev wrote: >>> On Jan 7, 2016, at 15:57, Martin Panter wrote: >>>>> On 7 January 2016 at 22:26, Blake Griffith wrote: >>>>> I'm interested in adding the functionality to do something like: >>>>>>>> b'a' ^ b'b' >>>>> b'\x03' >>>>> Instead of the good ol' TypeError. >>>>> >>>>> I think both bytes and bytearray should support all the bitwise operations. >>>> >>>> There is a bug open about adding this kind of functionality: >>>> . >>> >>> And it's in the needs patch stage, which makes it perfect for the OP: in addition to learning how to hack on builtin types, he can also learn the other parts of the dev process. (Even if the bug is eventually rejected, as seems likely given that it sat around for three years with no compelling use case and then Guido added a "very skeptical" comment.) >> >> The use case which springs immediately to my mind is cryptography. To encrypt a stream symmetrically you can go: >> >> cleartext-bytes ^ cryptographically-random-bytes-from-cipher >> >> so with this one could write: >> >> def crypted(byteses, crypto_source): >> ''' Accept an iterable source of bytes objects and a preprimed source of crypto bytes, yield encrypted versions of the bytes objects.
>> ''' >> for bs in byteses: >> cbs = crypto_source.next_bytes(len(bs)) >> yield bs ^ cbs >> >> Cheers, >> Cameron Simpson >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/blake.a.griffith%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vadmium+py at gmail.com Sat Jan 9 22:21:37 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Sun, 10 Jan 2016 03:21:37 +0000 Subject: [Python-Dev] bitwise operations for bytes and bytearray In-Reply-To: References: <6E0FFB72-142F-493C-BE29-D260302EBA16@yahoo.com> <20160108010820.GA6099@cskk.homeip.net> Message-ID: On 10 January 2016 at 00:17, Blake Griffith wrote: > A little update, I got ^, &, and | working for bytearrays. You can view the > diff here: > https://github.com/python/cpython/compare/master...cowlicks:bitwise-bytes?expand=1 > > How does it look? I left some comments against your commits on GitHub. Unfortunately they seem to be made in the base Python repository, rather than your fork, which I did not expect. > Joe, is this how I should allocate the arrays? Am I freeing them properly? > Am I checking the input enough? > > After some feedback, I'll probably add bitshifting and negation. Then work > on bytes objects. > > Does this warrant a pep? From brett at python.org Sun Jan 10 12:43:48 2016 From: brett at python.org (Brett Cannon) Date: Sun, 10 Jan 2016 17:43:48 +0000 Subject: [Python-Dev] GitHub migration planning has started Message-ID: For those of you who have not heard, I made the decision a little over a week ago to move Python's development from our home-grown workflow to one hosted on GitHub (mainly for code hosting and code review; we're keeping bugs.python.org for our issue tracker).
The hope is that this will let core developers work through patches faster so that we have a better turn-around time while being at least as good as our current workflow for external contributors (but I will be shocked if it isn't better). There are also people involved with the migration who plan to put in the effort to make sure external contributors can still submit patches without ever interacting with GitHub. If you want to help with the transition, then feel free to join the core-workflow mailing list where all the discussions on the details of the migration are occurring (including the PEP I'm starting to write to outline the steps we will be taking): https://mail.python.org/mailman/listinfo/core-workflow -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Sun Jan 10 12:48:37 2016 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Sun, 10 Jan 2016 11:48:37 -0600 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: <4896FEAF-63DE-4E49-8271-34CF7BB9AAFF@gmail.com> Is it possible to contribute to this, even if you're not part of the core dev team? On January 10, 2016 11:43:48 AM CST, Brett Cannon wrote: >For those of you who have not heard, I made the decision a little over >a >week ago to move Python's development from our home-grown workflow to >one >hosted on GitHub (mainly for code hosting and code review; we're >keeping >bugs.python.org for our issue tracker). The hope is that this will let >core >developers work through patches faster so that we have a better >turn-around >time while being at least as good as our current workflow for external >contributors (but I will be shocked if it isn't better). There are also >people involved with the migration who plan to put in the effort to >make >sure external contributors can still submit patches without ever >interacting with GitHub. 
> >If you want to help with the transition, then feel free to join the >core-workflow mailing list where all the discussions on the details of >the >migration are occurring (including the PEP I'm starting to write to >outline >the steps we will be taking): >https://mail.python.org/mailman/listinfo/core-workflow > > >------------------------------------------------------------------------ > >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Sun Jan 10 12:59:45 2016 From: brett at python.org (Brett Cannon) Date: Sun, 10 Jan 2016 17:59:45 +0000 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: <4896FEAF-63DE-4E49-8271-34CF7BB9AAFF@gmail.com> References: <4896FEAF-63DE-4E49-8271-34CF7BB9AAFF@gmail.com> Message-ID: On Sun, 10 Jan 2016 at 09:48 Ryan Gonzalez wrote: > Is it possible to contribute to this, even if you're not part of the core > dev team? > Sure! There's going to be plenty of code to write, decisions to be made, etc. While I will be making most of the final decisions, I will be asking for feedback from people, asking some to go off and write some code, etc. -Brett > > On January 10, 2016 11:43:48 AM CST, Brett Cannon > wrote: > >> For those of you who have not heard, I made the decision a little over a >> week ago to move Python's development from our home-grown workflow to one >> hosted on GitHub (mainly for code hosting and code review; we're keeping >> bugs.python.org for our issue tracker).
The hope is that this will let >> core developers work through patches faster so that we have a better >> turn-around time while being at least as good as our current workflow for >> external contributors (but I will be shocked if it isn't better). There are >> also people involved with the migration who plan to put in the effort to >> make sure external contributors can still submit patches without ever >> interacting with GitHub. >> >> If you want to help with the transition, then feel free to join the >> core-workflow mailing list where all the discussions on the details of the >> migration are occurring (including the PEP I'm starting to write to outline >> the steps we will be taking): >> https://mail.python.org/mailman/listinfo/core-workflow >> >> ------------------------------ >> >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com >> >> > -- > Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephane at wirtel.be Sun Jan 10 15:36:42 2016 From: stephane at wirtel.be (Stephane Wirtel) Date: Sun, 10 Jan 2016 21:36:42 +0100 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: <20160110203642.GD28792@sg1> On 01/10, Brett Cannon wrote: > For those of you who have not heard, I made the decision a little over a > week ago to move Python's development from our home-grown workflow to one > hosted on GitHub (mainly for code hosting and code review; we're keeping > bugs.python.org for our issue tracker). The hope is that this will let core > developers work through patches faster so that we have a better turn-around > time while being at least as good as our current workflow for external > contributors (but I will be shocked if it isn't better). 
There are also > people involved with the migration who plan to put in the effort to make > sure external contributors can still submit patches without ever > interacting with GitHub. > > If you want to help with the transition, then feel free to join the > core-workflow mailing list where all the discussions on the details of the > migration are occurring (including the PEP I'm starting to write to outline > the steps we will be taking): > https://mail.python.org/mailman/listinfo/core-workflow > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/stephane%40wirtel.be \o/ Already on core-workflow. -- Stéphane Wirtel - http://wirtel.be - @matrixise From tjreedy at udel.edu Sun Jan 10 22:52:49 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 10 Jan 2016 22:52:49 -0500 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: On 1/10/2016 12:43 PM, Brett Cannon wrote: > For those of you who have not heard, I made the decision a little over a > week ago to move Python's development from our home-grown workflow to > one hosted on GitHub (mainly for code hosting and code review; we're > keeping bugs.python.org for our issue tracker).
> > If you want to help with the transition, then feel free to join the > core-workflow mailing list where all the discussions on the details of > the migration are occurring (including the PEP I'm starting to write to > outline the steps we will be taking): > https://mail.python.org/mailman/listinfo/core-workflow Is there a gmane mirror, or do you think this is too limited (and temporary) for that? -- Terry Jan Reedy From brett at python.org Mon Jan 11 00:44:04 2016 From: brett at python.org (Brett Cannon) Date: Mon, 11 Jan 2016 05:44:04 +0000 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: On Sun, 10 Jan 2016 at 19:53 Terry Reedy wrote: > On 1/10/2016 12:43 PM, Brett Cannon wrote: > > For those of you who have not heard, I made the decision a little over a > > week ago to move Python's development from our home-grown workflow to > > one hosted on GitHub (mainly for code hosting and code review; we're > > keeping bugs.python.org for our issue tracker). > > The hope is that this will let core developers work through patches > > faster so that we have a better turn-around time while being at least as > > good as our current workflow for external contributors (but I will be > > shocked if it isn't better). There are also people involved with the > > migration who plan to put in the effort to make sure external > > contributors can still submit patches without ever interacting with > GitHub. > > > > If you want to help with the transition, then feel free to join the > > core-workflow mailing list where all the discussions on the details of > > the migration are occurring (including the PEP I'm starting to write to > > outline the steps we will be taking): > > https://mail.python.org/mailman/listinfo/core-workflow > > Is there a gmane mirror, or do you think this is too limited (and > temporary) for that? > I have no idea if a gmane mirror was set up. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From vadmium+py at gmail.com Mon Jan 11 00:58:38 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Mon, 11 Jan 2016 05:58:38 +0000 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: On 11 January 2016 at 03:52, Terry Reedy wrote: > On 1/10/2016 12:43 PM, Brett Cannon wrote: >> If you want to help with the transition, then feel free to join the >> core-workflow mailing list where all the discussions on the details of >> the migration are occurring (including the PEP I'm starting to write to >> outline the steps we will be taking): >> https://mail.python.org/mailman/listinfo/core-workflow > > > Is there a gmane mirror, or do you think this is too limited (and temporary) > for that? I guess this is it: From tjreedy at udel.edu Mon Jan 11 01:46:45 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Mon, 11 Jan 2016 01:46:45 -0500 Subject: [Python-Dev] GitHub migration planning has started In-Reply-To: References: Message-ID: On 1/11/2016 12:58 AM, Martin Panter wrote: > On 11 January 2016 at 03:52, Terry Reedy wrote: >> On 1/10/2016 12:43 PM, Brett Cannon wrote: >>> If you want to help with the transition, then feel free to join the >>> core-workflow mailing list where all the discussions on the details of >>> the migration are occurring (including the PEP I'm starting to write to >>> outline the steps we will be taking): >>> https://mail.python.org/mailman/listinfo/core-workflow >> >> >> Is there a gmane mirror, or do you think this is too limited (and temporary) >> for that? > > I guess this is it: Thanks. When I right-click news.gmane.org (in Thunderbird) and select subscribe, it does not appear in the 'Current groups' tab. I just discovered that there is also a 'New groups' tab that has it. 
-- 
Terry Jan Reedy

From barry at python.org  Mon Jan 11 10:10:41 2016
From: barry at python.org (Barry Warsaw)
Date: Mon, 11 Jan 2016 10:10:41 -0500
Subject: [Python-Dev] [Python-ideas] PEP 9 - plaintext PEP format - is officially deprecated
In-Reply-To: 
References: <20160105184921.317ac5ec@limelight.wooz.org>
Message-ID: <20160111101041.1ba6177d@limelight.wooz.org>

On Jan 11, 2016, at 03:25 PM, anatoly techtonik wrote:

>On Wed, Jan 6, 2016 at 2:49 AM, Barry Warsaw wrote:
>
>> reStructuredText is clearly a better format
>
>Can you expand on that? I use markdown everywhere

reST is better than plain text. Markdown is not a PEP format option.

>> all recent PEP submissions have been in reST for a while now anyway.
>
>Is it possible to query exact numbers automatically?

Feel free to grep the PEPs hg repo.

>What is the tooling support for handling PEP 9 and PEP 12?

UTSL. Everything is in the PEPs hg repo.

Cheers,
-Barry
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: application/pgp-signature
Size: 819 bytes
Desc: OpenPGP digital signature
URL: 

From victor.stinner at gmail.com  Mon Jan 11 11:49:19 2016
From: victor.stinner at gmail.com (Victor Stinner)
Date: Mon, 11 Jan 2016 17:49:19 +0100
Subject: [Python-Dev] PEP 509: Add a private version to dict
Message-ID: 

Hi,

After a first round on python-ideas, here is the second version of my PEP. The main changes since the first version are that the dictionary version is no longer exposed at the Python level and the field type now also has a size of 64 bits on 32-bit platforms.

The PEP is part of a series of 3 PEPs adding an API to implement a static Python optimizer specializing functions with guards. The second PEP is currently being discussed on python-ideas and I'm still working on the third PEP.

Thanks to Red Hat for giving me time to experiment on this.
HTML version:
https://www.python.org/dev/peps/pep-0509/

PEP: 509
Title: Add a private version to dict
Version: $Revision$
Last-Modified: $Date$
Author: Victor Stinner
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 4-January-2016
Python-Version: 3.6

Abstract
========

Add a new private version to the builtin ``dict`` type, incremented at each change, to implement fast guards on namespaces.

Rationale
=========

In Python, the builtin ``dict`` type is used by many instructions. For example, the ``LOAD_GLOBAL`` instruction searches for a variable in the global namespace, or in the builtins namespace (two dict lookups). Python uses ``dict`` for the builtins namespace, globals namespace, type namespaces, instance namespaces, etc. The local namespace (the namespace of a function) is usually optimized to an array, but it can be a dict too.

Python is hard to optimize because almost everything is mutable: builtin functions, function code, global variables, local variables, ... can be modified at runtime. Implementing optimizations that respect the Python semantics requires detecting when "something changes": we will call these checks "guards".

The speedup of an optimization depends on the speed of its guard checks. This PEP proposes to add a version to dictionaries to implement fast guards on namespaces.

Dictionary lookups can be skipped if the version does not change, which is the common case for most namespaces. The performance of a guard does not depend on the number of watched dictionary entries: its complexity is O(1) when the dictionary version does not change.

Example of optimization: copy the value of a global variable to function constants. This optimization requires a guard on the global variable to check whether it was modified. If the variable is modified, it must be loaded at runtime when the function is called, instead of using the constant.
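The global-to-constant optimization just described can be illustrated with a hand-written sketch in plain Python (the names and the default-argument trick are invented for illustration; the real optimizer specializes bytecode and relies on the dictionary-version guard instead of manual rebinding):

```python
import math

def area(r):
    # Generic version: ``math`` is looked up in the global namespace on
    # every call (LOAD_GLOBAL), then ``pi`` on the module.
    return math.pi * r * r

def area_specialized(r, _pi=math.pi):
    # Specialized version: the value of the global was copied into a
    # default-argument "constant" at definition time. It stays valid
    # only while a guard confirms that ``math.pi`` was not rebound.
    return _pi * r * r

print(area(2.0) == area_specialized(2.0))  # True
```

Rebinding ``math.pi`` (or ``math`` itself) would invalidate the specialized variant, which is exactly what the dictionary-version guard is meant to detect cheaply.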
See the `PEP 510 -- Specialized functions with guards `_ for the concrete usage of guards to specialize functions and for the rationale on Python static optimizers.

Guard example
=============

Pseudo-code of a fast guard to check if a dictionary entry was modified (created, updated or deleted) using a hypothetical ``dict_get_version(dict)`` function::

    UNSET = object()

    class GuardDictKey:
        def __init__(self, dict, key):
            self.dict = dict
            self.key = key
            self.value = dict.get(key, UNSET)
            self.version = dict_get_version(dict)

        def check(self):
            """Return True if the dictionary entry did not change."""

            # read the version field of the dict structure
            version = dict_get_version(self.dict)
            if version == self.version:
                # Fast-path: dictionary lookup avoided
                return True

            # lookup in the dictionary
            value = self.dict.get(self.key, UNSET)
            if value is self.value:
                # another key was modified:
                # cache the new dictionary version
                self.version = version
                return True

            # the key was modified
            return False

Usage of the dict version
=========================

Specialized functions using guards
----------------------------------

The `PEP 510 -- Specialized functions with guards `_ proposes an API to support specialized functions with guards. It makes it possible to implement static optimizers for Python without breaking the Python semantics.

Example of a static Python optimizer: the astoptimizer of the `FAT Python `_ project implements many optimizations which require guards on namespaces. Examples:

* Call pure builtins: to replace ``len("abc")`` with ``3``, guards on ``builtins.__dict__['len']`` and ``globals()['len']`` are required
* Loop unrolling: to unroll the loop ``for i in range(...): ...``, guards on ``builtins.__dict__['range']`` and ``globals()['range']`` are required

Pyjion
------

According to Brett Cannon, one of the two main developers of Pyjion, Pyjion can also benefit from the dictionary version to implement optimizations.
Pyjion is a JIT compiler for Python based upon CoreCLR (the Microsoft .NET Core runtime).

Unladen Swallow
---------------

Even if the dictionary version was not explicitly mentioned, optimizing globals and builtins lookups was part of the Unladen Swallow plan: "Implement one of the several proposed schemes for speeding lookups of globals and builtins." Source: `Unladen Swallow ProjectPlan `_.

Unladen Swallow is a fork of CPython 2.6.1 adding a JIT compiler implemented with LLVM. The project stopped in 2011: `Unladen Swallow Retrospective `_.

Changes
=======

Add a ``ma_version`` field to the ``PyDictObject`` structure with the C type ``PY_INT64_T``, a 64-bit unsigned integer. New empty dictionaries are initialized to version ``0``. The version is incremented at each change:

* ``clear()`` if the dict was non-empty
* ``pop(key)`` if the key exists
* ``popitem()`` if the dict is non-empty
* ``setdefault(key, value)`` if the `key` does not exist
* ``__delitem__(key)`` if the key exists
* ``__setitem__(key, value)`` if the `key` doesn't exist or if the value is different
* ``update(...)`` if new values are different from existing values (the version can be incremented multiple times)

Example using a hypothetical ``dict_get_version(dict)`` function::

    >>> d = {}
    >>> dict_get_version(d)
    0
    >>> d['key'] = 'value'
    >>> dict_get_version(d)
    1
    >>> d['key'] = 'new value'
    >>> dict_get_version(d)
    2
    >>> del d['key']
    >>> dict_get_version(d)
    3

If a dictionary is created with items, the version is also incremented at each dictionary insertion. Example::

    >>> d = dict(x=7, y=33)
    >>> dict_get_version(d)
    2

The version is not incremented if an existing key is set to the same value. For efficiency, values are compared by their identity: ``new_value is old_value``, not by their content: ``new_value == old_value``. Example::

    >>> d = {}
    >>> value = object()
    >>> d['key'] = value
    >>> dict_get_version(d)
    1
    >>> d['key'] = value
    >>> dict_get_version(d)
    1

..
note::
   CPython uses some singletons, like integers in the range [-5; 257], the empty tuple, empty strings, Unicode strings of a single character in the range [U+0000; U+00FF], etc. When a key is set twice to the same singleton, the version is not modified.

Implementation
==============

The `issue #26058: PEP 509: Add ma_version to PyDictObject `_ contains a patch implementing this PEP.

On pybench and timeit microbenchmarks, the patch does not seem to add any overhead on dictionary operations. When the version does not change, ``PyDict_GetItem()`` takes 14.8 ns for a dictionary lookup, whereas a guard check only takes 3.8 ns. Moreover, a guard can watch for multiple keys. For example, for an optimization using 10 global variables in a function, 10 dictionary lookups cost 148 ns, whereas the guard still only costs 3.8 ns when the version does not change (39x as fast).

Integer overflow
================

The implementation uses the C unsigned integer type ``PY_INT64_T`` to store the version, a 64-bit unsigned integer. The C code uses ``version++``. On integer overflow, the version wraps to ``0`` (and then continues to be incremented) according to the C standard.

After an integer overflow, a guard can succeed even though the watched dictionary key was modified. The bug occurs if the dictionary is modified at least ``2 ** 64`` times between two checks of the guard and if the new version (the theoretical value with no integer overflow) is equal to the old version modulo ``2 ** 64``.

If a dictionary is modified each nanosecond, an overflow takes longer than 584 years. Using a 32-bit version, the overflow would occur after only 4 seconds. That's why a 64-bit unsigned type is also used on 32-bit systems. A dictionary lookup at the C level takes 14.8 ns.

A risk of a bug every 584 years is acceptable.
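The version semantics from the `Changes` section and the guard from the `Guard example` section can be exercised with a small pure-Python model, and the overflow arithmetic above can be checked at the same time. This is only an illustration sketch: ``VersionedDict`` and ``dict_get_version()`` are invented stand-ins; in the PEP the version lives in the C-level ``PyDictObject`` and is invisible from Python.

```python
UNSET = object()

class VersionedDict(dict):
    """Toy model of a dict carrying a version, per the PEP's rules."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._version = len(self)   # one increment per initial insertion

    def __setitem__(self, key, value):
        # Per the PEP, values are compared by identity, not equality.
        if dict.get(self, key, UNSET) is not value:
            self._version += 1
        super().__setitem__(key, value)

    def __delitem__(self, key):
        super().__delitem__(key)
        self._version += 1

def dict_get_version(d):
    return d._version

class GuardDictKey:
    """The guard from the PEP's pseudo-code."""

    def __init__(self, d, key):
        self.dict = d
        self.key = key
        self.value = d.get(key, UNSET)
        self.version = dict_get_version(d)

    def check(self):
        """Return True if the watched dictionary entry did not change."""
        version = dict_get_version(self.dict)
        if version == self.version:
            return True                 # fast path: no change at all
        value = self.dict.get(self.key, UNSET)
        if value is self.value:
            self.version = version      # another key changed: re-arm
            return True
        return False                    # the watched key changed

d = VersionedDict(x=1)
guard = GuardDictKey(d, 'x')
assert guard.check()                    # nothing changed
d['y'] = 2
assert guard.check()                    # another key changed: still valid
d['x'] = object()
assert not guard.check()                # the watched key changed

# Check the overflow arithmetic: one modification per nanosecond.
NS_PER_YEAR = 1_000_000_000 * 3600 * 24 * 36525 // 100  # 365.25-day year
assert 2**64 // NS_PER_YEAR == 584      # "longer than 584 years"
assert round(2**32 / 1e9, 1) == 4.3     # 32 bits would wrap in ~4 s
```

Note how the second ``check()`` re-arms the guard: a mutation of an unrelated key costs one dictionary lookup once, then the fast path applies again.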
Alternatives
============

Expose the version at Python level as a read-only __version__ property
----------------------------------------------------------------------

The first version of the PEP proposed to expose the dictionary version as a read-only ``__version__`` property at the Python level, and also to add the property to ``collections.UserDict`` (since this type must mimic the ``dict`` API).

There are multiple issues:

* To be consistent and avoid bad surprises, the version would have to be added to all mapping types. Implementing a new mapping type would require extra work for no benefit, since the version is only required on the ``dict`` type in practice.
* All Python implementations would have to implement this new property; it gives more work to other implementations, whereas they may not use the dictionary version at all.
* The ``__version__`` can wrap on integer overflow. It is error prone: using ``dict.__version__ <= guard_version`` is wrong; ``dict.__version__ == guard_version`` must be used instead to reduce the risk of bugs on integer overflow (even if an integer overflow is unlikely in practice).
* Exposing the dictionary version at the Python level can lead to false assumptions about performance. Checking ``dict.__version__`` at the Python level is not faster than a dictionary lookup. A dictionary lookup has a cost of 48.7 ns and checking a guard has a cost of 47.5 ns; the difference is only 1.2 ns (3%)::

    $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd["33"] == 33'
    10000000 loops, best of 3: 0.0487 usec per loop
    $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd.__version__ == 100'
    10000000 loops, best of 3: 0.0475 usec per loop

Bikeshedding on the property name:

* ``__cache_token__``: name proposed by Nick Coghlan, coming from `abc.get_cache_token() `_.
* ``__version__``
* ``__timestamp__``

Add a version to each dict entry
--------------------------------

A single version per dictionary requires keeping a strong reference to the value, which can keep the value alive longer than expected. If we also add a version per dictionary entry, the guard can store only the entry version, avoiding the strong reference to the value (only strong references to the dictionary and to the key are needed).

Changes: add a ``me_version`` field to the ``PyDictKeyEntry`` structure; the field has the C type ``PY_INT64_T``. When a key is created or modified, the entry version is set to the dictionary version, which is incremented at any change (create, modify, delete).

Pseudo-code of a fast guard to check if a dictionary key was modified, using hypothetical ``dict_get_version(dict)`` and ``dict_get_entry_version(dict, key)`` functions::

    UNSET = object()

    class GuardDictKey:
        def __init__(self, dict, key):
            self.dict = dict
            self.key = key
            self.dict_version = dict_get_version(dict)
            self.entry_version = dict_get_entry_version(dict, key)

        def check(self):
            """Return True if the dictionary entry did not change."""

            # read the version field of the dict structure
            dict_version = dict_get_version(self.dict)
            if dict_version == self.dict_version:
                # Fast-path: dictionary lookup avoided
                return True

            # lookup in the dictionary
            entry_version = dict_get_entry_version(self.dict, self.key)
            if entry_version == self.entry_version:
                # another key was modified:
                # cache the new dictionary version
                self.dict_version = dict_version
                return True

            # the key was modified
            return False

The main drawback of this option is the impact on the memory footprint. It increases the size of each dictionary entry, so the overhead depends on the number of buckets (dictionary entries, used or not yet used). For example, it increases the size of each dictionary entry by 8 bytes on a 64-bit system.

In Python, the memory footprint matters and the trend is to reduce it.
Examples:

* `PEP 393 -- Flexible String Representation `_
* `PEP 412 -- Key-Sharing Dictionary `_

Add a new dict subtype
----------------------

Add a new ``verdict`` type, a subtype of ``dict``. When guards are needed, use ``verdict`` for namespaces (module namespace, type namespace, instance namespace, etc.) instead of ``dict``.

Leave the ``dict`` type unchanged to not add any overhead (memory footprint) when guards are not needed.

Technical issue: a lot of C code in the wild, including the CPython core, expects the exact ``dict`` type. Issues:

* ``exec()`` requires a ``dict`` for globals and locals. A lot of code uses ``globals={}``. It is not possible to cast the ``dict`` to a ``dict`` subtype because the caller expects the ``globals`` parameter to be modified (``dict`` is mutable).
* Functions directly call ``PyDict_xxx()`` functions, instead of calling ``PyObject_xxx()`` if the object is a ``dict`` subtype.
* The ``PyDict_CheckExact()`` check fails on ``dict`` subtypes, whereas some functions require the exact ``dict`` type.
* ``Python/ceval.c`` does not completely support dict subtypes for namespaces.

The ``exec()`` issue is a blocker issue.

Other issues:

* The garbage collector has special code to "untrack" ``dict`` instances. If a ``dict`` subtype is used for namespaces, the garbage collector can be unable to break some reference cycles.
* Some functions have a fast-path for ``dict`` which would not be taken for ``dict`` subtypes, and so it would make Python a little bit slower.

Prior Art
=========

Method cache and type version tag
---------------------------------

In 2007, Armin Rigo wrote a patch to implement a cache of methods. It was merged into Python 2.6. The patch adds a "type attribute cache version tag" (``tp_version_tag``) and a "valid version tag" flag to types (the ``PyTypeObject`` structure).

The type version tag is not available at the Python level.

The version tag has the C type ``unsigned int``.
The cache is a global hash table of 4096 entries, shared by all types. The cache is global to "make it fast, have a deterministic and low memory footprint, and be easy to invalidate". Each cache entry has a version tag. A global version tag is used to create the next version tag; it also has the C type ``unsigned int``.

By default, a type has its "valid version tag" flag cleared to indicate that the version tag is invalid. When the first method of the type is cached, the version tag and the "valid version tag" flag are set. When a type is modified, the "valid version tag" flag of the type and its subclasses is cleared. Later, when a cache entry of these types is used, the entry is removed because its version tag is outdated.

On integer overflow, the whole cache is cleared and the global version tag is reset to ``0``.

See `Method cache (issue #1685986) `_ and `Armin's method cache optimization updated for Python 2.6 (issue #1700288) `_.

Globals / builtins cache
------------------------

In 2010, Antoine Pitrou proposed a `Globals / builtins cache (issue #10401) `_ which adds a private ``ma_version`` field to the ``PyDictObject`` structure (the ``dict`` type); the field has the C type ``Py_ssize_t``.

The patch adds a "global and builtin cache" to functions and frames, and changes the ``LOAD_GLOBAL`` and ``STORE_GLOBAL`` instructions to use the cache.

The change on the ``PyDictObject`` structure is very similar to this PEP.

Cached globals+builtins lookup
------------------------------

In 2006, Andrea Griffini proposed a patch implementing a `Cached globals+builtins lookup optimization `_. The patch adds a private ``timestamp`` field to the ``PyDictObject`` structure (the ``dict`` type); the field has the C type ``size_t``.

Thread on python-dev: `About dictionary lookup caching `_.
Guard against changing dict during iteration
--------------------------------------------

In 2013, Serhiy Storchaka proposed `Guard against changing dict during iteration (issue #19332) `_ which adds a ``ma_count`` field to the ``PyDictObject`` structure (the ``dict`` type); the field has the C type ``size_t``. This field is incremented when the dictionary is modified, and so is very similar to the proposed dictionary version.

Sadly, the dictionary version proposed in this PEP doesn't help to detect dictionary mutation. The dictionary version changes when values are replaced, whereas modifying dictionary values while iterating on dictionary keys is legitimate in Python.

PySizer
-------

`PySizer `_: a memory profiler for Python, a Google Summer of Code 2005 project by Nick Smallbone.

This project has a patch for CPython 2.4 which adds ``key_time`` and ``value_time`` fields to dictionary entries. It uses a global process-wide counter for dictionaries, incremented each time a dictionary is modified. The times are used to decide when child objects first appeared in their parent objects.

Discussion
==========

Thread on the python-ideas mailing list: `RFC: PEP: Add dict.__version__ `_.

Copyright
=========

This document has been placed in the public domain.

From fijall at gmail.com  Mon Jan 11 14:09:18 2016
From: fijall at gmail.com (Maciej Fijalkowski)
Date: Mon, 11 Jan 2016 21:09:18 +0200
Subject: [Python-Dev] PEP 509: Add a private version to dict
In-Reply-To: 
References: 
Message-ID: 

Hi Victor. You know that pypy does this stuff without changing and exposing Python semantics, right? We have a version dict that does not leak abstractions to the user. In general, doing stuff like that, where there is a public API that leaks details of certain optimizations, makes it harder and harder for optimizing compilers to do their job properly, if you want to do something slightly different.
Can we make this happen (as you noted in the prior art) WITHOUT changing ANY of the things exposed to the user? On Mon, Jan 11, 2016 at 6:49 PM, Victor Stinner wrote: > Hi, > > After a first round on python-ideas, here is the second version of my > PEP. The main changes since the first version are that the dictionary > version is no more exposed at the Python level and the field type now > also has a size of 64-bit on 32-bit platforms. > > The PEP is part of a serie of 3 PEP adding an API to implement a > static Python optimizer specializing functions with guards. The second > PEP is currently discussed on python-ideas and I'm still working on > the third PEP. > > Thanks to Red Hat for giving me time to experiment on this. > > > HTML version: > https://www.python.org/dev/peps/pep-0509/ > > > PEP: 509 > Title: Add a private version to dict > Version: $Revision$ > Last-Modified: $Date$ > Author: Victor Stinner > Status: Draft > Type: Standards Track > Content-Type: text/x-rst > Created: 4-January-2016 > Python-Version: 3.6 > > > Abstract > ======== > > Add a new private version to builtin ``dict`` type, incremented at each > change, to implement fast guards on namespaces. > > > Rationale > ========= > > In Python, the builtin ``dict`` type is used by many instructions. For > example, the ``LOAD_GLOBAL`` instruction searchs for a variable in the > global namespace, or in the builtins namespace (two dict lookups). > Python uses ``dict`` for the builtins namespace, globals namespace, type > namespaces, instance namespaces, etc. The local namespace (namespace of > a function) is usually optimized to an array, but it can be a dict too. > > Python is hard to optimize because almost everything is mutable: builtin > functions, function code, global variables, local variables, ... can be > modified at runtime. Implementing optimizations respecting the Python > semantics requires to detect when "something changes": we will call > these checks "guards". 
> > The speedup of optimizations depends on the speed of guard checks. This > PEP proposes to add a version to dictionaries to implement fast guards > on namespaces. > > Dictionary lookups can be skipped if the version does not change which > is the common case for most namespaces. The performance of a guard does > not depend on the number of watched dictionary entries, complexity of > O(1), if the dictionary version does not change. > > Example of optimization: copy the value of a global variable to function > constants. This optimization requires a guard on the global variable to > check if it was modified. If the variable is modified, the variable must > be loaded at runtime when the function is called, instead of using the > constant. > > See the `PEP 510 -- Specialized functions with guards > `_ for the concrete usage of > guards to specialize functions and for the rationale on Python static > optimizers. > > > Guard example > ============= > > Pseudo-code of an fast guard to check if a dictionary entry was modified > (created, updated or deleted) using an hypothetical > ``dict_get_version(dict)`` function:: > > UNSET = object() > > class GuardDictKey: > def __init__(self, dict, key): > self.dict = dict > self.key = key > self.value = dict.get(key, UNSET) > self.version = dict_get_version(dict) > > def check(self): > """Return True if the dictionary entry did not changed.""" > > # read the version field of the dict structure > version = dict_get_version(self.dict) > if version == self.version: > # Fast-path: dictionary lookup avoided > return True > > # lookup in the dictionary > value = self.dict.get(self.key, UNSET) > if value is self.value: > # another key was modified: > # cache the new dictionary version > self.version = version > return True > > # the key was modified > return False > > > Usage of the dict version > ========================= > > Specialized functions using guards > ---------------------------------- > > The `PEP 510 -- Specialized 
functions with guards > `_ proposes an API to support > specialized functions with guards. It allows to implement static > optimizers for Python without breaking the Python semantics. > > Example of a static Python optimizer: the astoptimizer of the `FAT > Python `_ project > implements many optimizations which require guards on namespaces. > Examples: > > * Call pure builtins: to replace ``len("abc")`` with ``3``, guards on > ``builtins.__dict__['len']`` and ``globals()['len']`` are required > * Loop unrolling: to unroll the loop ``for i in range(...): ...``, > guards on ``builtins.__dict__['range']`` and ``globals()['range']`` > are required > > > Pyjion > ------ > > According of Brett Cannon, one of the two main developers of Pyjion, > Pyjion can also benefit from dictionary version to implement > optimizations. > > Pyjion is a JIT compiler for Python based upon CoreCLR (Microsoft .NET > Core runtime). > > > Unladen Swallow > --------------- > > Even if dictionary version was not explicitly mentionned, optimization > globals and builtins lookup was part of the Unladen Swallow plan: > "Implement one of the several proposed schemes for speeding lookups of > globals and builtins." Source: `Unladen Swallow ProjectPlan > `_. > > Unladen Swallow is a fork of CPython 2.6.1 adding a JIT compiler > implemented with LLVM. The project stopped in 2011: `Unladen Swallow > Retrospective > `_. > > > Changes > ======= > > Add a ``ma_version`` field to the ``PyDictObject`` structure with the C > type ``PY_INT64_T``, 64-bit unsigned integer. New empty dictionaries are > initilized to version ``0``. 
The version is incremented at each change: > > * ``clear()`` if the dict was non-empty > * ``pop(key)`` if the key exists > * ``popitem()`` if the dict is non-empty > * ``setdefault(key, value)`` if the `key` does not exist > * ``__detitem__(key)`` if the key exists > * ``__setitem__(key, value)`` if the `key` doesn't exist or if the value > is different > * ``update(...)`` if new values are different than existing values (the > version can be incremented multiple times) > > Example using an hypothetical ``dict_get_version(dict)`` function:: > > >>> d = {} > >>> dict_get_version(d) > 0 > >>> d['key'] = 'value' > >>> dict_get_version(d) > 1 > >>> d['key'] = 'new value' > >>> dict_get_version(d) > 2 > >>> del d['key'] > >>> dict_get_version(d) > 3 > > If a dictionary is created with items, the version is also incremented > at each dictionary insertion. Example:: > > >>> d=dict(x=7, y=33) > >>> dict_get_version(d) > 2 > > The version is not incremented if an existing key is set to the same > value. For efficiency, values are compared by their identity: > ``new_value is old_value``, not by their content: > ``new_value == old_value``. Example:: > > >>> d={} > >>> value = object() > >>> d['key'] = value > >>> dict_get_version(d) > 2 > >>> d['key'] = value > >>> dict_get_version(d) > 2 > > .. note:: > CPython uses some singleton like integers in the range [-5; 257], > empty tuple, empty strings, Unicode strings of a single character in > the range [U+0000; U+00FF], etc. When a key is set twice to the same > singleton, the version is not modified. > > > Implementation > ============== > > The `issue #26058: PEP 509: Add ma_version to PyDictObject > `_ contains a patch implementing > this PEP. > > On pybench and timeit microbenchmarks, the patch does not seem to add > any overhead on dictionary operations. > > When the version does not change, ``PyDict_GetItem()`` takes 14.8 ns for > a dictioanry lookup, whereas a guard check only takes 3.8 ns. 
Moreover, > a guard can watch for multiple keys. For example, for an optimization > using 10 global variables in a function, 10 dictionary lookups costs 148 > ns, whereas the guard still only costs 3.8 ns when the version does not > change (39x as fast). > > > Integer overflow > ================ > > The implementation uses the C unsigned integer type ``PY_INT64_T`` to > store the version, a 64 bits unsigned integer. The C code uses > ``version++``. On integer overflow, the version is wrapped to ``0`` (and > then continue to be incremented) according to the C standard. > > After an integer overflow, a guard can succeed whereas the watched > dictionary key was modified. The bug occurs if the dictionary is > modified at least ``2 ** 64`` times between two checks of the guard and > if the new version (theorical value with no integer overflow) is equal > to the old version modulo ``2 ** 64``. > > If a dictionary is modified each nanosecond, an overflow takes longer > than 584 years. Using a 32-bit version, the overflow occurs only after 4 > seconds. That's why a 64-bit unsigned type is also used on 32-bit > systems. A dictionary lookup at the C level takes 14.8 ns. > > A risk of a bug every 584 years is acceptable. > > > Alternatives > ============ > > Expose the version at Python level as a read-only __version__ property > ---------------------------------------------------------------------- > > The first version of the PEP proposed to expose the dictionary version > as a read-only ``__version__`` property at Python level, and also to add > the property to ``collections.UserDict`` (since this type must mimick > the ``dict`` API). > > There are multiple issues: > > * To be consistent and avoid bad surprises, the version must be added to > all mapping types. Implementing a new mapping type would require extra > work for no benefit, since the version is only required on the > ``dict`` type in practice. 
> * All Python implementations must implement this new property, it gives > more work to other implementations, whereas they may not use the > dictionary version at all. > * The ``__version__`` can be wrapped on integer overflow. It is error > prone: using ``dict.__version__ <= guard_version`` is wrong, > ``dict.__version__ == guard_version`` must be used instead to reduce > the risk of bug on integer overflow (even if the integer overflow is > unlikely in practice). > * Exposing the dictionary version at Python level can lead the > false assumption on performances. Checking ``dict.__version__`` at > the Python level is not faster than a dictionary lookup. A dictionary > lookup has a cost of 48.7 ns and checking a guard has a cost of 47.5 > ns, the difference is only 1.2 ns (3%):: > > > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd["33"] == 33' > 10000000 loops, best of 3: 0.0487 usec per loop > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' > 'd.__version__ == 100' > 10000000 loops, best of 3: 0.0475 usec per loop > > Bikeshedding on the property name: > > * ``__cache_token__``: name proposed by Nick Coghlan, name coming from > `abc.get_cache_token() > `_. > * ``__version__`` > * ``__timestamp__`` > > > Add a version to each dict entry > -------------------------------- > > A single version per dictionary requires to keep a strong reference to > the value which can keep the value alive longer than expected. If we add > also a version per dictionary entry, the guard can only store the entry > version to avoid the strong reference to the value (only strong > references to the dictionary and to the key are needed). > > Changes: add a ``me_version`` field to the ``PyDictKeyEntry`` structure, > the field has the C type ``PY_INT64_T``. When a key is created or > modified, the entry version is set to the dictionary version which is > incremented at any change (create, modify, delete). 
> > Pseudo-code of an fast guard to check if a dictionary key was modified > using hypothetical ``dict_get_version(dict)`` > ``dict_get_entry_version(dict)`` functions:: > > UNSET = object() > > class GuardDictKey: > def __init__(self, dict, key): > self.dict = dict > self.key = key > self.dict_version = dict_get_version(dict) > self.entry_version = dict_get_entry_version(dict, key) > > def check(self): > """Return True if the dictionary entry did not changed.""" > > # read the version field of the dict structure > dict_version = dict_get_version(self.dict) > if dict_version == self.version: > # Fast-path: dictionary lookup avoided > return True > > # lookup in the dictionary > entry_version = get_dict_key_version(dict, key) > if entry_version == self.entry_version: > # another key was modified: > # cache the new dictionary version > self.dict_version = dict_version > return True > > # the key was modified > return False > > The main drawback of this option is the impact on the memory footprint. > It increases the size of each dictionary entry, so the overhead depends > on the number of buckets (dictionary entries, used or unused yet). For > example, it increases the size of each dictionary entry by 8 bytes on > 64-bit system. > > In Python, the memory footprint matters and the trend is to reduce it. > Examples: > > * `PEP 393 -- Flexible String Representation > `_ > * `PEP 412 -- Key-Sharing Dictionary > `_ > > > Add a new dict subtype > ---------------------- > > Add a new ``verdict`` type, subtype of ``dict``. When guards are needed, > use the ``verdict`` for namespaces (module namespace, type namespace, > instance namespace, etc.) instead of ``dict``. > > Leave the ``dict`` type unchanged to not add any overhead (memory > footprint) when guards are not needed. > > Technical issue: a lot of C code in the wild, including CPython core, > expecting the exact ``dict`` type. Issues: > > * ``exec()`` requires a ``dict`` for globals and locals. 
A lot of code > uses ``globals={}``. It is not possible to cast the ``dict`` to a > ``dict`` subtype because the caller expects the ``globals`` parameter > to be modified (``dict`` is mutable). > * Some functions call ``PyDict_xxx()`` functions directly, instead of calling > ``PyObject_xxx()`` functions when the object is a ``dict`` subtype. > * The ``PyDict_CheckExact()`` check fails on ``dict`` subtypes, whereas some > functions require the exact ``dict`` type. > * ``Python/ceval.c`` does not completely support dict subtypes for > namespaces. > > > The ``exec()`` issue is a blocker issue. > > Other issues: > > * The garbage collector has special code to "untrack" ``dict`` > instances. If a ``dict`` subtype is used for namespaces, the garbage > collector may be unable to break some reference cycles. > * Some functions have a fast-path for ``dict`` which would not be taken > for ``dict`` subtypes, and so it would make Python a little bit > slower. > > > Prior Art > ========= > > Method cache and type version tag > --------------------------------- > > In 2007, Armin Rigo wrote a patch to implement a cache of methods. It > was merged into Python 2.6. The patch adds a "type attribute cache > version tag" (``tp_version_tag``) and a "valid version tag" flag to > types (the ``PyTypeObject`` structure). > > The type version tag is not available at the Python level. > > The version tag has the C type ``unsigned int``. The cache is a global > hash table of 4096 entries, shared by all types. The cache is global to > "make it fast, have a deterministic and low memory footprint, and be > easy to invalidate". Each cache entry has a version tag. A global > version tag is used to create the next version tag; it also has the C > type ``unsigned int``. > > By default, a type has its "valid version tag" flag cleared to indicate > that the version tag is invalid. When the first method of the type is > cached, the version tag and the "valid version tag" flag are set. 
When a > type is modified, the "valid version tag" flag of the type and its > subclasses is cleared. Later, when a cache entry of these types is used, > the entry is removed because its version tag is outdated. > > On integer overflow, the whole cache is cleared and the global version > tag is reset to ``0``. > > See `Method cache (issue #1685986) > `_ and `Armin's method cache > optimization updated for Python 2.6 (issue #1700288) > `_. > > > Globals / builtins cache > ------------------------ > > In 2010, Antoine Pitrou proposed a `Globals / builtins cache (issue > #10401) `_ which adds a private > ``ma_version`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``Py_ssize_t``. > > The patch adds a "global and builtin cache" to functions and frames, and > changes ``LOAD_GLOBAL`` and ``STORE_GLOBAL`` instructions to use the > cache. > > The change on the ``PyDictObject`` structure is very similar to this > PEP. > > > Cached globals+builtins lookup > ------------------------------ > > In 2006, Andrea Griffini proposed a patch implementing a `Cached > globals+builtins lookup optimization > `_. The patch adds a private > ``timestamp`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``size_t``. > > Thread on python-dev: `About dictionary lookup caching > `_. > > > Guard against changing dict during iteration > -------------------------------------------- > > In 2013, Serhiy Storchaka proposed `Guard against changing dict during > iteration (issue #19332) `_ which > adds a ``ma_count`` field to the ``PyDictObject`` structure (``dict`` > type), the field has the C type ``size_t``. This field is incremented > when the dictionary is modified, and so is very similar to the proposed > dictionary version. > > Sadly, the dictionary version proposed in this PEP doesn't help to > detect dictionary mutation. 
The dictionary version changes when values > are replaced, whereas modifying dictionary values while iterating on > dictionary keys is legit in Python. > > > PySizer > ------- > > `PySizer `_: a memory profiler for Python, > Google Summer of Code 2005 project by Nick Smallbone. > > This project has a patch for CPython 2.4 which adds ``key_time`` and > ``value_time`` fields to dictionary entries. It uses a global > process-wide counter for dictionaries, incremented each time that a > dictionary is modified. The times are used to decide when child objects > first appeared in their parent objects. > > > Discussion > ========== > > Thread on the python-ideas mailing list: `RFC: PEP: Add dict.__version__ > `_. > > > Copyright > ========= > > This document has been placed in the public domain. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com From victor.stinner at gmail.com Mon Jan 11 14:56:26 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 11 Jan 2016 20:56:26 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: Le 11 janv. 2016 8:09 PM, "Maciej Fijalkowski" a écrit : > Hi Victor. > > You know that pypy does this stuff without changing and exposing > python semantics right? We have a version dict that does not leak > abstractions to the user. The PEP adds a field to the C structure PyDictObject. Are you asking me to hide it from the C structure? The first version of my PEP added a public read-only property at Python level, but I changed the PEP. See the alternatives section for more detail. 
Victor > In general, doing stuff like that where there is a public API that > leaks details of certain optimizations makes it harder and harder for > optimizing compilers to do their job properly, if you want to do > something slightly different. > > Can we make this happen (as you noted in the prior art) WITHOUT > changing ANY of the things exposed to the user? > > On Mon, Jan 11, 2016 at 6:49 PM, Victor Stinner > wrote: > > Hi, > > > > After a first round on python-ideas, here is the second version of my > > PEP. The main changes since the first version are that the dictionary > > version is no longer exposed at the Python level and the field type now > > also has a size of 64 bits on 32-bit platforms. > > > > The PEP is part of a series of 3 PEPs adding an API to implement a > > static Python optimizer specializing functions with guards. The second > > PEP is currently discussed on python-ideas and I'm still working on > > the third PEP. > > > > Thanks to Red Hat for giving me time to experiment on this. > > > > > > HTML version: > > https://www.python.org/dev/peps/pep-0509/ > > > > > > PEP: 509 > > Title: Add a private version to dict > > Version: $Revision$ > > Last-Modified: $Date$ > > Author: Victor Stinner > > Status: Draft > > Type: Standards Track > > Content-Type: text/x-rst > > Created: 4-January-2016 > > Python-Version: 3.6 > > > > > > Abstract > > ======== > > > > Add a new private version to the builtin ``dict`` type, incremented at each > > change, to implement fast guards on namespaces. > > > > > > Rationale > > ========= > > > > In Python, the builtin ``dict`` type is used by many instructions. For > > example, the ``LOAD_GLOBAL`` instruction searches for a variable in the > > global namespace, or in the builtins namespace (two dict lookups). > > Python uses ``dict`` for the builtins namespace, globals namespace, type > > namespaces, instance namespaces, etc. 
The local namespace (namespace of > > a function) is usually optimized to an array, but it can be a dict too. > > > > Python is hard to optimize because almost everything is mutable: builtin > > functions, function code, global variables, local variables, ... can be > > modified at runtime. Implementing optimizations respecting the Python > > semantics requires detecting when "something changes": we will call > > these checks "guards". > > > > The speedup of optimizations depends on the speed of guard checks. This > > PEP proposes to add a version to dictionaries to implement fast guards > > on namespaces. > > > > Dictionary lookups can be skipped if the version does not change, which > > is the common case for most namespaces. The performance of a guard does > > not depend on the number of watched dictionary entries (O(1) complexity) > > if the dictionary version does not change. > > > > Example of optimization: copy the value of a global variable to function > > constants. This optimization requires a guard on the global variable to > > check if it was modified. If the variable is modified, the variable must > > be loaded at runtime when the function is called, instead of using the > > constant. > > > > See the `PEP 510 -- Specialized functions with guards > > `_ for the concrete usage of > > guards to specialize functions and for the rationale on Python static > > optimizers. 
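The ``LOAD_GLOBAL`` lookups described in the Rationale can be observed directly with the stdlib ``dis`` module. This is a minimal illustration independent of the PEP's patch; it only shows which bytecode instruction performs the globals/builtins dictionary lookups that a version guard would let an optimizer skip:

```python
import dis
import io

def f():
    return len("abc")   # 'len' is searched in globals, then builtins

# Capture the bytecode listing instead of printing it.
buf = io.StringIO()
dis.dis(f, file=buf)
listing = buf.getvalue()

# The name 'len' is resolved at runtime through LOAD_GLOBAL,
# i.e. up to two dict lookups on every call unless a guard lets
# the optimizer substitute a constant.
assert "LOAD_GLOBAL" in listing
```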
> > > > > > Guard example > > ============= > > > > Pseudo-code of a fast guard to check if a dictionary entry was modified > > (created, updated or deleted) using a hypothetical > > ``dict_get_version(dict)`` function:: > > > > UNSET = object() > > > > class GuardDictKey: > > def __init__(self, dict, key): > > self.dict = dict > > self.key = key > > self.value = dict.get(key, UNSET) > > self.version = dict_get_version(dict) > > > > def check(self): > > """Return True if the dictionary entry was not modified.""" > > > > # read the version field of the dict structure > > version = dict_get_version(self.dict) > > if version == self.version: > > # Fast-path: dictionary lookup avoided > > return True > > > > # lookup in the dictionary > > value = self.dict.get(self.key, UNSET) > > if value is self.value: > > # another key was modified: > > # cache the new dictionary version > > self.version = version > > return True > > > > # the key was modified > > return False > > > > > > Usage of the dict version > > ========================= > > > > Specialized functions using guards > > ---------------------------------- > > > > The `PEP 510 -- Specialized functions with guards > > `_ proposes an API to support > > specialized functions with guards. It makes it possible to implement static > > optimizers for Python without breaking the Python semantics. > > > > Example of a static Python optimizer: the astoptimizer of the `FAT > > Python `_ project > > implements many optimizations which require guards on namespaces. 
> > Examples: > > > > * Call pure builtins: to replace ``len("abc")`` with ``3``, guards on > > ``builtins.__dict__['len']`` and ``globals()['len']`` are required > > * Loop unrolling: to unroll the loop ``for i in range(...): ...``, > > guards on ``builtins.__dict__['range']`` and ``globals()['range']`` > > are required > > > > > > Pyjion > > ------ > > > > According to Brett Cannon, one of the two main developers of Pyjion, > > Pyjion can also benefit from the dictionary version to implement > > optimizations. > > > > Pyjion is a JIT compiler for Python based upon CoreCLR (Microsoft .NET > > Core runtime). > > > > > > Unladen Swallow > > --------------- > > > > Even if the dictionary version was not explicitly mentioned, optimizing > > globals and builtins lookups was part of the Unladen Swallow plan: > > "Implement one of the several proposed schemes for speeding lookups of > > globals and builtins." Source: `Unladen Swallow ProjectPlan > > `_. > > > > Unladen Swallow is a fork of CPython 2.6.1 adding a JIT compiler > > implemented with LLVM. The project stopped in 2011: `Unladen Swallow > > Retrospective > > `_. > > > > > > Changes > > ======= > > > > Add a ``ma_version`` field to the ``PyDictObject`` structure with the C > > type ``PY_INT64_T``, a 64-bit unsigned integer. New empty dictionaries are > > initialized to version ``0``. 
The version is incremented at each change: > > > > * ``clear()`` if the dict was non-empty > > * ``pop(key)`` if the key exists > > * ``popitem()`` if the dict is non-empty > > * ``setdefault(key, value)`` if the `key` does not exist > > * ``__delitem__(key)`` if the key exists > > * ``__setitem__(key, value)`` if the `key` doesn't exist or if the value > > is different > > * ``update(...)`` if new values are different from existing values (the > > version can be incremented multiple times) > > > > Example using a hypothetical ``dict_get_version(dict)`` function:: > > > > >>> d = {} > > >>> dict_get_version(d) > > 0 > > >>> d['key'] = 'value' > > >>> dict_get_version(d) > > 1 > > >>> d['key'] = 'new value' > > >>> dict_get_version(d) > > 2 > > >>> del d['key'] > > >>> dict_get_version(d) > > 3 > > > > If a dictionary is created with items, the version is also incremented > > at each dictionary insertion. Example:: > > > > >>> d = dict(x=7, y=33) > > >>> dict_get_version(d) > > 2 > > > > The version is not incremented if an existing key is set to the same > > value. For efficiency, values are compared by their identity: > > ``new_value is old_value``, not by their content: > > ``new_value == old_value``. Example:: > > > > >>> d = {} > > >>> value = object() > > >>> d['key'] = value > > >>> dict_get_version(d) > > 1 > > >>> d['key'] = value > > >>> dict_get_version(d) > > 1 > > > > .. note:: > > CPython uses some singletons, like integers in the range [-5; 257], > > the empty tuple, empty strings, Unicode strings of a single character in > > the range [U+0000; U+00FF], etc. When a key is set twice to the same > > singleton, the version is not modified. > > > > > > Implementation > > ============== > > > > The `issue #26058: PEP 509: Add ma_version to PyDictObject > > `_ contains a patch implementing > > this PEP. > > > > On pybench and timeit microbenchmarks, the patch does not seem to add > > any overhead on dictionary operations. 
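For readers following along without the C patch, the increment rules from the Changes section can be modeled in pure Python. This is only a sketch: ``VersionedDict`` and ``dict_get_version()`` are stand-ins for the proposed ``ma_version`` field, not part of the patch, and only ``__setitem__``, ``__delitem__`` and ``update()`` are modeled here:

```python
class VersionedDict(dict):
    """Pure-Python model of the proposed per-dict version (sketch only)."""

    def __init__(self, *args, **kwargs):
        self._version = 0
        super().__init__()
        self.update(*args, **kwargs)   # each insertion bumps the version

    def __setitem__(self, key, value):
        # Bump only if the key is new or the value is a different object:
        # an identity test ('is'), not equality, as the PEP specifies.
        if key not in self or self[key] is not value:
            self._version += 1
        super().__setitem__(key, value)

    def __delitem__(self, key):
        if key in self:
            self._version += 1
        super().__delitem__(key)

    def update(self, *args, **kwargs):
        for key, value in dict(*args, **kwargs).items():
            self[key] = value          # route through __setitem__

def dict_get_version(d):
    return d._version

d = VersionedDict()
assert dict_get_version(d) == 0
d['key'] = 'first'
d['key'] = 'second'
del d['key']
assert dict_get_version(d) == 3        # create + modify + delete

value = object()
d['key'] = value
version = dict_get_version(d)
d['key'] = value                       # same object: version unchanged
assert dict_get_version(d) == version
```

The real implementation lives in the C ``PyDictObject``; ``clear()``, ``pop()``, ``popitem()`` and ``setdefault()`` would need the same treatment.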
> > > > When the version does not change, ``PyDict_GetItem()`` takes 14.8 ns for > > a dictionary lookup, whereas a guard check only takes 3.8 ns. Moreover, > > a guard can watch for multiple keys. For example, for an optimization > > using 10 global variables in a function, 10 dictionary lookups cost 148 > > ns, whereas the guard still only costs 3.8 ns when the version does not > > change (39x as fast). > > > > > > Integer overflow > > ================ > > > > The implementation uses the C unsigned integer type ``PY_INT64_T`` to > > store the version, a 64-bit unsigned integer. The C code uses > > ``version++``. On integer overflow, the version wraps to ``0`` (and > > then continues to be incremented) according to the C standard. > > > > After an integer overflow, a guard can succeed even though the watched > > dictionary key was modified. The bug occurs if the dictionary is > > modified at least ``2 ** 64`` times between two checks of the guard and > > if the new version (theoretical value with no integer overflow) is equal > > to the old version modulo ``2 ** 64``. > > > > If a dictionary is modified each nanosecond, an overflow takes longer > > than 584 years. Using a 32-bit version, the overflow would occur after > > only 4 seconds. That's why a 64-bit unsigned type is also used on 32-bit > > systems. A dictionary lookup at the C level takes 14.8 ns. > > > > A risk of a bug every 584 years is acceptable. > > > > > > Alternatives > > ============ > > > > Expose the version at Python level as a read-only __version__ property > > ---------------------------------------------------------------------- > > > > The first version of the PEP proposed to expose the dictionary version > > as a read-only ``__version__`` property at Python level, and also to add > > the property to ``collections.UserDict`` (since this type must mimic > > the ``dict`` API). 
> > > > There are multiple issues: > > > > * To be consistent and avoid bad surprises, the version must be added to > > all mapping types. Implementing a new mapping type would require extra > > work for no benefit, since the version is only required on the > > ``dict`` type in practice. > > * All Python implementations must implement this new property; it creates > > more work for other implementations, whereas they may not use the > > dictionary version at all. > > * The ``__version__`` can wrap around on integer overflow. It is > > error-prone: using ``dict.__version__ <= guard_version`` is wrong; > > ``dict.__version__ == guard_version`` must be used instead to reduce > > the risk of bugs on integer overflow (even if an integer overflow is > > unlikely in practice). > > * Exposing the dictionary version at the Python level can lead to false > > assumptions about performance. Checking ``dict.__version__`` at > > the Python level is not faster than a dictionary lookup. A dictionary > > lookup has a cost of 48.7 ns and checking a guard has a cost of 47.5 > > ns; the difference is only 1.2 ns (3%):: > > > > > > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd["33"] == 33' > > 10000000 loops, best of 3: 0.0487 usec per loop > > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' > > 'd.__version__ == 100' > > 10000000 loops, best of 3: 0.0475 usec per loop > > > > Bikeshedding on the property name: > > > > * ``__cache_token__``: name proposed by Nick Coghlan, name coming from > > `abc.get_cache_token() > > `_. > > * ``__version__`` > > * ``__timestamp__`` > > > > > > Add a version to each dict entry > > -------------------------------- > > > > A single version per dictionary requires keeping a strong reference to > > the value, which can keep the value alive longer than expected. 
If we also add > > a version per dictionary entry, the guard can store only the entry > > version, avoiding the strong reference to the value (only strong > > references to the dictionary and to the key are needed). > > > > Changes: add a ``me_version`` field to the ``PyDictKeyEntry`` structure; > > the field has the C type ``PY_INT64_T``. When a key is created or > > modified, the entry version is set to the dictionary version, which is > > incremented at any change (create, modify, delete). > > > > Pseudo-code of a fast guard to check if a dictionary key was modified, > > using the hypothetical ``dict_get_version(dict)`` and > > ``dict_get_entry_version(dict, key)`` functions:: > > > > UNSET = object() > > > > class GuardDictKey: > > def __init__(self, dict, key): > > self.dict = dict > > self.key = key > > self.dict_version = dict_get_version(dict) > > self.entry_version = dict_get_entry_version(dict, key) > > > > def check(self): > > """Return True if the dictionary entry was not modified.""" > > > > # read the version field of the dict structure > > dict_version = dict_get_version(self.dict) > > if dict_version == self.dict_version: > > # Fast-path: dictionary lookup avoided > > return True > > > > # lookup in the dictionary > > entry_version = dict_get_entry_version(self.dict, self.key) > > if entry_version == self.entry_version: > > # another key was modified: > > # cache the new dictionary version > > self.dict_version = dict_version > > return True > > > > # the key was modified > > return False > > > > The main drawback of this option is the impact on the memory footprint. > > It increases the size of each dictionary entry, so the overhead depends > > on the number of buckets (dictionary entries, whether used or not yet > > used). For example, it increases the size of each dictionary entry by > > 8 bytes on 64-bit systems. > > > > In Python, the memory footprint matters and the trend is to reduce it. 
> > Examples: > > > > * `PEP 393 -- Flexible String Representation > > `_ > > * `PEP 412 -- Key-Sharing Dictionary > > `_ > > > > > > Add a new dict subtype > > ---------------------- > > > > Add a new ``verdict`` type, a subtype of ``dict``. When guards are needed, > > use ``verdict`` for namespaces (module namespace, type namespace, > > instance namespace, etc.) instead of ``dict``. > > > > Leave the ``dict`` type unchanged so as not to add any overhead (memory > > footprint) when guards are not needed. > > > > Technical issue: a lot of C code in the wild, including the CPython core, > > expects the exact ``dict`` type. Issues: > > > > * ``exec()`` requires a ``dict`` for globals and locals. A lot of code > > uses ``globals={}``. It is not possible to cast the ``dict`` to a > > ``dict`` subtype because the caller expects the ``globals`` parameter > > to be modified (``dict`` is mutable). > > * Some functions call ``PyDict_xxx()`` functions directly, instead of calling > > ``PyObject_xxx()`` functions when the object is a ``dict`` subtype. > > * The ``PyDict_CheckExact()`` check fails on ``dict`` subtypes, whereas some > > functions require the exact ``dict`` type. > > * ``Python/ceval.c`` does not completely support dict subtypes for > > namespaces. > > > > > > The ``exec()`` issue is a blocker issue. > > > > Other issues: > > > > * The garbage collector has special code to "untrack" ``dict`` > > instances. If a ``dict`` subtype is used for namespaces, the garbage > > collector may be unable to break some reference cycles. > > * Some functions have a fast-path for ``dict`` which would not be taken > > for ``dict`` subtypes, and so it would make Python a little bit > > slower. > > > > > > Prior Art > > ========= > > > > Method cache and type version tag > > --------------------------------- > > > > In 2007, Armin Rigo wrote a patch to implement a cache of methods. It > > was merged into Python 2.6. 
The patch adds a "type attribute cache > > version tag" (``tp_version_tag``) and a "valid version tag" flag to > > types (the ``PyTypeObject`` structure). > > > > The type version tag is not available at the Python level. > > > > The version tag has the C type ``unsigned int``. The cache is a global > > hash table of 4096 entries, shared by all types. The cache is global to > > "make it fast, have a deterministic and low memory footprint, and be > > easy to invalidate". Each cache entry has a version tag. A global > > version tag is used to create the next version tag, it also has the C > > type ``unsigned int``. > > > > By default, a type has its "valid version tag" flag cleared to indicate > > that the version tag is invalid. When the first method of the type is > > cached, the version tag and the "valid version tag" flag are set. When a > > type is modified, the "valid version tag" flag of the type and its > > subclasses is cleared. Later, when a cache entry of these types is used, > > the entry is removed because its version tag is outdated. > > > > On integer overflow, the whole cache is cleared and the global version > > tag is reset to ``0``. > > > > See `Method cache (issue #1685986) > > `_ and `Armin's method cache > > optimization updated for Python 2.6 (issue #1700288) > > `_. > > > > > > Globals / builtins cache > > ------------------------ > > > > In 2010, Antoine Pitrou proposed a `Globals / builtins cache (issue > > #10401) `_ which adds a private > > ``ma_version`` field to the ``PyDictObject`` structure (``dict`` type), > > the field has the C type ``Py_ssize_t``. > > > > The patch adds a "global and builtin cache" to functions and frames, and > > changes ``LOAD_GLOBAL`` and ``STORE_GLOBAL`` instructions to use the > > cache. > > > > The change on the ``PyDictObject`` structure is very similar to this > > PEP. 
> > > > > > Cached globals+builtins lookup > > ------------------------------ > > > > In 2006, Andrea Griffini proposed a patch implementing a `Cached > > globals+builtins lookup optimization > > `_. The patch adds a private > > ``timestamp`` field to the ``PyDictObject`` structure (``dict`` type), > > the field has the C type ``size_t``. > > > > Thread on python-dev: `About dictionary lookup caching > > `_. > > > > > > Guard against changing dict during iteration > > -------------------------------------------- > > > > In 2013, Serhiy Storchaka proposed `Guard against changing dict during > > iteration (issue #19332) `_ which > > adds a ``ma_count`` field to the ``PyDictObject`` structure (``dict`` > > type), the field has the C type ``size_t``. This field is incremented > > when the dictionary is modified, and so is very similar to the proposed > > dictionary version. > > > > Sadly, the dictionary version proposed in this PEP doesn't help to > > detect dictionary mutation. The dictionary version changes when values > > are replaced, whereas modifying dictionary values while iterating on > > dictionary keys is legit in Python. > > > > > > PySizer > > ------- > > > > `PySizer `_: a memory profiler for Python, > > Google Summer of Code 2005 project by Nick Smallbone. > > > > This project has a patch for CPython 2.4 which adds ``key_time`` and > > ``value_time`` fields to dictionary entries. It uses a global > > process-wide counter for dictionaries, incremented each time that a > > dictionary is modified. The times are used to decide when child objects > > first appeared in their parent objects. > > > > > > Discussion > > ========== > > > > Thread on the python-ideas mailing list: `RFC: PEP: Add dict.__version__ > > `_. > > > > > > Copyright > > ========= > > > > This document has been placed in the public domain. 
> > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From fijall at gmail.com Mon Jan 11 15:10:30 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 11 Jan 2016 22:10:30 +0200 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: On Mon, Jan 11, 2016 at 9:56 PM, Victor Stinner wrote: > Le 11 janv. 2016 8:09 PM, "Maciej Fijalkowski" a écrit : >> Hi Victor. >> >> You know that pypy does this stuff without changing and exposing >> python semantics right? We have a version dict that does not leak >> abstractions to the user. > > The PEP adds a field to the C structure PyDictObject. Are you asking me to > hide it from the C structure? > > The first version of my PEP added a public read-only property at Python > level, but I changed the PEP. See the alternatives section for more detail. > > Victor I asked you to hide it from Python; I read the wrong version :-) Cool! From greg at krypto.org Mon Jan 11 18:07:33 2016 From: greg at krypto.org (Gregory P. Smith) Date: Mon, 11 Jan 2016 23:07:33 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: On Mon, Jan 11, 2016 at 8:50 AM Victor Stinner wrote: > Hi, > > After a first round on python-ideas, here is the second version of my > PEP. The main changes since the first version are that the dictionary > version is no longer exposed at the Python level and the field type now > also has a size of 64 bits on 32-bit platforms. > > The PEP is part of a series of 3 PEPs adding an API to implement a > static Python optimizer specializing functions with guards. The second > PEP is currently discussed on python-ideas and I'm still working on > the third PEP. 
> > Thanks to Red Hat for giving me time to experiment on this. > > > HTML version: > https://www.python.org/dev/peps/pep-0509/ > > > PEP: 509 > Title: Add a private version to dict > Version: $Revision$ > Last-Modified: $Date$ > Author: Victor Stinner > Status: Draft > Type: Standards Track > Content-Type: text/x-rst > Created: 4-January-2016 > Python-Version: 3.6 > > > Abstract > ======== > > Add a new private version to the builtin ``dict`` type, incremented at each > change, to implement fast guards on namespaces. > > > Rationale > ========= > > In Python, the builtin ``dict`` type is used by many instructions. For > example, the ``LOAD_GLOBAL`` instruction searches for a variable in the > global namespace, or in the builtins namespace (two dict lookups). > Python uses ``dict`` for the builtins namespace, globals namespace, type > namespaces, instance namespaces, etc. The local namespace (namespace of > a function) is usually optimized to an array, but it can be a dict too. > > Python is hard to optimize because almost everything is mutable: builtin > functions, function code, global variables, local variables, ... can be > modified at runtime. Implementing optimizations respecting the Python > semantics requires detecting when "something changes": we will call > these checks "guards". > > The speedup of optimizations depends on the speed of guard checks. This > PEP proposes to add a version to dictionaries to implement fast guards > on namespaces. > > Dictionary lookups can be skipped if the version does not change, which > is the common case for most namespaces. The performance of a guard does > not depend on the number of watched dictionary entries (O(1) complexity) > if the dictionary version does not change. > > Example of optimization: copy the value of a global variable to function > constants. This optimization requires a guard on the global variable to > check if it was modified. 
If the variable is modified, the variable must > be loaded at runtime when the function is called, instead of using the > constant. > > See the `PEP 510 -- Specialized functions with guards > `_ for the concrete usage of > guards to specialize functions and for the rationale on Python static > optimizers. > > > Guard example > ============= > > Pseudo-code of a fast guard to check if a dictionary entry was modified > (created, updated or deleted) using a hypothetical > ``dict_get_version(dict)`` function:: > > UNSET = object() > > class GuardDictKey: > def __init__(self, dict, key): > self.dict = dict > self.key = key > self.value = dict.get(key, UNSET) > self.version = dict_get_version(dict) > > def check(self): > """Return True if the dictionary entry was not modified.""" > > # read the version field of the dict structure > version = dict_get_version(self.dict) > if version == self.version: > # Fast-path: dictionary lookup avoided > return True > > # lookup in the dictionary > value = self.dict.get(self.key, UNSET) > if value is self.value: > # another key was modified: > # cache the new dictionary version > self.version = version > return True > > # the key was modified > return False > > > Usage of the dict version > ========================= > > Specialized functions using guards > ---------------------------------- > > The `PEP 510 -- Specialized functions with guards > `_ proposes an API to support > specialized functions with guards. It makes it possible to implement static > optimizers for Python without breaking the Python semantics. > > Example of a static Python optimizer: the astoptimizer of the `FAT > Python `_ project > implements many optimizations which require guards on namespaces. 
> Examples: > > * Call pure builtins: to replace ``len("abc")`` with ``3``, guards on > ``builtins.__dict__['len']`` and ``globals()['len']`` are required > * Loop unrolling: to unroll the loop ``for i in range(...): ...``, > guards on ``builtins.__dict__['range']`` and ``globals()['range']`` > are required > > > Pyjion > ------ > > According to Brett Cannon, one of the two main developers of Pyjion, > Pyjion can also benefit from the dictionary version to implement > optimizations. > > Pyjion is a JIT compiler for Python based upon CoreCLR (Microsoft .NET > Core runtime). > > > Unladen Swallow > --------------- > > Even if the dictionary version was not explicitly mentioned, optimizing > globals and builtins lookups was part of the Unladen Swallow plan: > "Implement one of the several proposed schemes for speeding lookups of > globals and builtins." Source: `Unladen Swallow ProjectPlan > `_. > > Unladen Swallow is a fork of CPython 2.6.1 adding a JIT compiler > implemented with LLVM. The project stopped in 2011: `Unladen Swallow > Retrospective > >`_. > > > Changes > ======= > > Add a ``ma_version`` field to the ``PyDictObject`` structure with the C > type ``PY_INT64_T``, a 64-bit unsigned integer. New empty dictionaries are > initialized to version ``0``. The version is incremented at each change: > > * ``clear()`` if the dict was non-empty > * ``pop(key)`` if the key exists > * ``popitem()`` if the dict is non-empty > * ``setdefault(key, value)`` if the `key` does not exist > * ``__delitem__(key)`` if the key exists > * ``__setitem__(key, value)`` if the `key` doesn't exist or if the value > is different > * ``update(...)`` if new values are different from existing values (the > version can be incremented multiple times) > Please be more explicit about what tests you are performing on the values. Setitem's "if the value is different" really should mean "if value is not dict['key']". Similarly for update, there should never be equality checks performed on the values. 
just an "is" test of whether they are the same object or not. > > Example using a hypothetical ``dict_get_version(dict)`` function:: > > >>> d = {} > >>> dict_get_version(d) > 0 > >>> d['key'] = 'value' > >>> dict_get_version(d) > 1 > >>> d['key'] = 'new value' > >>> dict_get_version(d) > 2 > >>> del d['key'] > >>> dict_get_version(d) > 3 > > If a dictionary is created with items, the version is also incremented > at each dictionary insertion. Example:: > > >>> d = dict(x=7, y=33) > >>> dict_get_version(d) > 2 > > The version is not incremented if an existing key is set to the same > value. For efficiency, values are compared by their identity: > ``new_value is old_value``, not by their content: > ``new_value == old_value``. Example:: > > >>> d = {} > >>> value = object() > >>> d['key'] = value > >>> dict_get_version(d) > 1 > >>> d['key'] = value > >>> dict_get_version(d) > 1 > > .. note:: > CPython uses some singletons, like integers in the range [-5; 257], > the empty tuple, empty strings, Unicode strings of a single character in > the range [U+0000; U+00FF], etc. When a key is set twice to the same > singleton, the version is not modified. > > > Implementation > ============== > > The `issue #26058: PEP 509: Add ma_version to PyDictObject > `_ contains a patch implementing > this PEP. > > On pybench and timeit microbenchmarks, the patch does not seem to add > any overhead on dictionary operations. > > When the version does not change, ``PyDict_GetItem()`` takes 14.8 ns for > a dictionary lookup, whereas a guard check only takes 3.8 ns. Moreover, > a guard can watch for multiple keys. For example, for an optimization > using 10 global variables in a function, 10 dictionary lookups cost 148 > ns, whereas the guard still only costs 3.8 ns when the version does not > change (39x as fast). > > > Integer overflow > ================ > > The implementation uses the C unsigned integer type ``PY_INT64_T`` to > store the version, a 64-bit unsigned integer.
The C code uses > ``version++``. On integer overflow, the version wraps to ``0`` (and > then continues to be incremented) according to the C standard. > > After an integer overflow, a guard can succeed even though the watched > dictionary key was modified. The bug occurs if the dictionary is > modified at least ``2 ** 64`` times between two checks of the guard and > if the new version (theoretical value with no integer overflow) is equal > to the old version modulo ``2 ** 64``. > > If a dictionary is modified each nanosecond, an overflow takes longer > than 584 years. Using a 32-bit version, the overflow occurs after only 4 > seconds. That's why a 64-bit unsigned type is also used on 32-bit > systems. A dictionary lookup at the C level takes 14.8 ns. > > A risk of a bug every 584 years is acceptable. > > > Alternatives > ============ > Expose the version at Python level as a read-only __version__ property > ---------------------------------------------------------------------- > > The first version of the PEP proposed to expose the dictionary version > as a read-only ``__version__`` property at the Python level, and also to add > the property to ``collections.UserDict`` (since this type must mimic > the ``dict`` API). > > There are multiple issues: > > * To be consistent and avoid bad surprises, the version must be added to > all mapping types. Implementing a new mapping type would require extra > work for no benefit, since the version is only required on the > ``dict`` type in practice. > * All Python implementations must implement this new property; it creates > more work for other implementations, even though they may not use the > dictionary version at all. > * The ``__version__`` can wrap on integer overflow. It is error-prone: > using ``dict.__version__ <= guard_version`` is wrong; > ``dict.__version__ == guard_version`` must be used instead to reduce > the risk of a bug on integer overflow (even if an integer overflow is > unlikely in practice).
> * Exposing the dictionary version at the Python level can lead to > false assumptions about performance. Checking ``dict.__version__`` at > the Python level is not faster than a dictionary lookup. A dictionary > lookup has a cost of 48.7 ns and checking the version has a cost of 47.5 > ns; the difference is only 1.2 ns (3%):: > > > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd["33"] > == 33' > 10000000 loops, best of 3: 0.0487 usec per loop > $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' > 'd.__version__ == 100' > 10000000 loops, best of 3: 0.0475 usec per loop > > Bikeshedding on the property name: > > * ``__cache_token__``: name proposed by Nick Coghlan, name coming from > `abc.get_cache_token() > `_. > * ``__version__`` > * ``__timestamp__`` > > > Add a version to each dict entry > -------------------------------- > > A single version per dictionary requires keeping a strong reference to > the value, which can keep the value alive longer than expected. If we > also add a version per dictionary entry, the guard can store only the entry > version, avoiding the strong reference to the value (only strong > references to the dictionary and to the key are needed). > > Changes: add a ``me_version`` field to the ``PyDictKeyEntry`` structure; > the field has the C type ``PY_INT64_T``. When a key is created or > modified, the entry version is set to the dictionary version, which is > incremented at any change (create, modify, delete).
> > Pseudo-code of a fast guard to check if a dictionary key was modified, > using hypothetical ``dict_get_version(dict)`` and > ``dict_get_entry_version(dict, key)`` functions:: > > UNSET = object() > > class GuardDictKey: > def __init__(self, dict, key): > self.dict = dict > self.key = key > self.dict_version = dict_get_version(dict) > self.entry_version = dict_get_entry_version(dict, key) > > def check(self): > """Return True if the dictionary entry has not changed.""" > > # read the version field of the dict structure > dict_version = dict_get_version(self.dict) > if dict_version == self.dict_version: > # Fast-path: dictionary lookup avoided > return True > > # lookup in the dictionary > entry_version = dict_get_entry_version(self.dict, self.key) > if entry_version == self.entry_version: > # another key was modified: > # cache the new dictionary version > self.dict_version = dict_version > return True > > # the key was modified > return False > > The main drawback of this option is the impact on the memory footprint. > It increases the size of each dictionary entry, so the overhead depends > on the number of buckets (dictionary entries, used or unused yet). For > example, it increases the size of each dictionary entry by 8 bytes on > 64-bit systems. > > In Python, the memory footprint matters and the trend is to reduce it. > Examples: > > * `PEP 393 -- Flexible String Representation > `_ > * `PEP 412 -- Key-Sharing Dictionary > `_ > > > Add a new dict subtype > ---------------------- > > Add a new ``verdict`` type, a subtype of ``dict``. When guards are needed, > use ``verdict`` for namespaces (module namespace, type namespace, > instance namespace, etc.) instead of ``dict``. > > Leave the ``dict`` type unchanged to not add any overhead (memory > footprint) when guards are not needed. > > Technical issue: a lot of C code in the wild, including CPython core, > expects the exact ``dict`` type. Issues: > > * ``exec()`` requires a ``dict`` for globals and locals.
A lot of code > uses ``globals={}``. It is not possible to cast the ``dict`` to a > ``dict`` subtype because the caller expects the ``globals`` parameter > to be modified (``dict`` is mutable). > * Functions call ``PyDict_xxx()`` functions directly, instead of calling > ``PyObject_xxx()`` functions, even if the object is a ``dict`` subtype > * The ``PyDict_CheckExact()`` check fails on ``dict`` subtypes, whereas some > functions require the exact ``dict`` type. > * ``Python/ceval.c`` does not completely support dict subtypes for > namespaces > > > The ``exec()`` issue is a blocker issue. > > Other issues: > > * The garbage collector has special code to "untrack" ``dict`` > instances. If a ``dict`` subtype is used for namespaces, the garbage > collector can be unable to break some reference cycles. > * Some functions have a fast-path for ``dict`` which would not be taken > for ``dict`` subtypes, and so it would make Python a little bit > slower. > > > Prior Art > ========= > > Method cache and type version tag > --------------------------------- > > In 2007, Armin Rigo wrote a patch to implement a cache of methods. It > was merged into Python 2.6. The patch adds a "type attribute cache > version tag" (``tp_version_tag``) and a "valid version tag" flag to > types (the ``PyTypeObject`` structure). > > The type version tag is not available at the Python level. > > The version tag has the C type ``unsigned int``. The cache is a global > hash table of 4096 entries, shared by all types. The cache is global to > "make it fast, have a deterministic and low memory footprint, and be > easy to invalidate". Each cache entry has a version tag. A global > version tag is used to create the next version tag; it also has the C > type ``unsigned int``. > > By default, a type has its "valid version tag" flag cleared to indicate > that the version tag is invalid. When the first method of the type is > cached, the version tag and the "valid version tag" flag are set.
When a > type is modified, the "valid version tag" flag of the type and its > subclasses is cleared. Later, when a cache entry of these types is used, > the entry is removed because its version tag is outdated. > > On integer overflow, the whole cache is cleared and the global version > tag is reset to ``0``. > > See `Method cache (issue #1685986) > `_ and `Armin's method cache > optimization updated for Python 2.6 (issue #1700288) > `_. > > > Globals / builtins cache > ------------------------ > > In 2010, Antoine Pitrou proposed a `Globals / builtins cache (issue > #10401) `_ which adds a private > ``ma_version`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``Py_ssize_t``. > > The patch adds a "global and builtin cache" to functions and frames, and > changes ``LOAD_GLOBAL`` and ``STORE_GLOBAL`` instructions to use the > cache. > > The change on the ``PyDictObject`` structure is very similar to this > PEP. > > > Cached globals+builtins lookup > ------------------------------ > > In 2006, Andrea Griffini proposed a patch implementing a `Cached > globals+builtins lookup optimization > `_. The patch adds a private > ``timestamp`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``size_t``. > > Thread on python-dev: `About dictionary lookup caching > >`_. > > > Guard against changing dict during iteration > -------------------------------------------- > > In 2013, Serhiy Storchaka proposed `Guard against changing dict during > iteration (issue #19332) `_ which > adds a ``ma_count`` field to the ``PyDictObject`` structure (``dict`` > type), the field has the C type ``size_t``. This field is incremented > when the dictionary is modified, and so is very similar to the proposed > dictionary version. > > Sadly, the dictionary version proposed in this PEP doesn't help to > detect dictionary mutation. 
The dictionary version changes when values > are replaced, whereas modifying dictionary values while iterating on > dictionary keys is legitimate in Python. > > > PySizer > ------- > > `PySizer `_: a memory profiler for Python, > Google Summer of Code 2005 project by Nick Smallbone. > > This project has a patch for CPython 2.4 which adds ``key_time`` and > ``value_time`` fields to dictionary entries. It uses a global > process-wide counter for dictionaries, incremented each time a > dictionary is modified. The times are used to decide when child objects > first appeared in their parent objects. > > > Discussion > ========== > > Thread on the python-ideas mailing list: `RFC: PEP: Add dict.__version__ > >`_. > > > Copyright > ========= > > This document has been placed in the public domain. > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/greg%40krypto.org From victor.stinner at gmail.com Mon Jan 11 18:24:03 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Jan 2016 00:24:03 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: 2016-01-12 0:07 GMT+01:00 Gregory P. Smith : >> Changes >> ======= >> >> (...) > > Please be more explicit about what tests you are performing on the values. > setitem's "if the value is different" really should mean "if value is not > dict['key']". similarly for update, there should never be equality checks > performed on the values. just an "is" test of it they are the same object > or not. Ok, done. By the way, it's also explained below: values are compared by their identity, not by their content.
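These identity-based rules can be modelled in pure Python. A minimal sketch for illustration only: the real change is a C field on ``PyDictObject``, and ``VersionedDict`` is a hypothetical name, not part of the proposal:

```python
class VersionedDict(dict):
    """Pure-Python model of the proposed dict versioning rules."""

    _UNSET = object()

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.version = len(self)  # one increment per initial insertion

    def __setitem__(self, key, value):
        old = self.get(key, self._UNSET)
        if old is not value:       # identity test ("is"), never "=="
            self.version += 1
        super().__setitem__(key, value)

    def __delitem__(self, key):
        if key in self:
            self.version += 1
        super().__delitem__(key)

d = VersionedDict()
obj = object()
d['key'] = obj         # new key: version becomes 1
d['key'] = obj         # same object again: version stays 1
d['key'] = object()    # different object: version becomes 2
```

Setting a key to the very same object does not change the version, which is exactly the behaviour Gregory asked to be spelled out.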
For best dict efficiency, we could skip this micro-optimization (to avoid a potential branch misprediction in the CPU) and always increment the version. But for guards, the micro-optimization can avoid a lot of dictionary lookups, especially when a guard watches for a large number of keys. Victor From abarnert at yahoo.com Mon Jan 11 18:35:35 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 11 Jan 2016 15:35:35 -0800 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: <3FEDF0A4-9B54-4087-B7F8-405C11169F3F@yahoo.com> On Jan 11, 2016, at 15:24, Victor Stinner wrote: > > 2016-01-12 0:07 GMT+01:00 Gregory P. Smith : >>> Changes >>> ======= >>> >>> (...) >> >> Please be more explicit about what tests you are performing on the values. >> setitem's "if the value is different" really should mean "if value is not >> dict['key']". similarly for update, there should never be equality checks >> performed on the values. just an "is" test of it they are the same object >> or not. > > Ok, done. By the way, it's also explained below: values are compared > by their identify, not by their content. > > For best dict efficiency, we can not implement this micro-optimization > (to avoid a potential branch misprediction in the CPU) and always > increase the version. But for guards, the micro-optimization can avoid > a lot of dictionary lookups, especially when a guard watches for a > large number of keys. Are you saying that d[key] = d[key] may or may not increment the version, so any optimizer can't rely on the fact that it doesn't? If so, that seems reasonable. (The worst case of incrementing the version unnecessarily is that you miss an optimization that would have been safe, right?).
From victor.stinner at gmail.com Tue Jan 12 09:25:22 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Jan 2016 15:25:22 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <3FEDF0A4-9B54-4087-B7F8-405C11169F3F@yahoo.com> References: <3FEDF0A4-9B54-4087-B7F8-405C11169F3F@yahoo.com> Message-ID: Well, it was just a remark. 2016-01-12 0:35 GMT+01:00 Andrew Barnert : > Are you saying that d[key] = d[key] may or may not increment the version, so any optimizer can't rely on the fact that it doesn't? Optimizers don't have to rely on this exact behaviour. Not incrementing the version in such cases avoids dictionary lookups in the guard. My current patch does not increment if the value is the same, and I'm unable to see any performance regression on *micro* benchmarks: https://bugs.python.org/issue26058 So I'm in favor of making guards as efficient as possible and not incrementing the version in dict ;-) Victor From jimjjewett at gmail.com Tue Jan 12 13:34:07 2016 From: jimjjewett at gmail.com (Jim J. Jewett) Date: Tue, 12 Jan 2016 13:34:07 -0500 Subject: [Python-Dev] PEP 509 Message-ID: (1) Please make it clear within the abstract what counts as a change. (1a) E.g., a second paragraph such as "Adding or removing a key, or replacing a value, counts as a change. Modifying an object in place, or replacing it with itself, may not be picked up." (1b) Is there a way to force a version update? d[k]=d[k] seems like it should do that (absent the optimization to prevent it), but I confess that I can't come up with a good use case that doesn't start seeming internal to a specific optimizer. (1c) Section "Guard against changing dict during iteration" says "Sadly, the dictionary version proposed in this PEP doesn't help to detect dictionary mutation." Why not? Wouldn't that mutation involve replacing a value, which ought to trigger a version change?
(2) I would like to see a .get on the guard object, so that it could be used in place of the dict lookup even from python. If this doesn't make sense (e.g., doesn't really save time since the guard has to be used from python), please mention that in the Guard Example text. (3) It would be possible to define the field as reserved in the main header, and require another header to use it even from C. (3a) This level of privacy might be overkill, but I would prefer that the decision be explicit. (3b) The change should almost certainly be hidden from the ABI / Py_LIMITED_API -jJ From ethan at stoneleaf.us Tue Jan 12 13:52:24 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 12 Jan 2016 10:52:24 -0800 Subject: [Python-Dev] PEP 509 In-Reply-To: References: Message-ID: <56954B68.3090208@stoneleaf.us> On 01/12/2016 10:34 AM, Jim J. Jewett wrote: > (1c) Section "Guard against changing dict during iteration" says > "Sadly, the dictionary version proposed in this PEP doesn't help to > detect dictionary mutation." Why not? Wouldn't that mutation involve > replacing a value, which ought to trigger a version change? Yes it would, but mutating a dictionary value during iteration is legal, so we cannot use the __version__ [1] change to tell us that something illegal happened. [1] We're not going to call it __version__ are we? Seems like __cache_token__ is a much better name. -- ~Ethan~ From victor.stinner at gmail.com Tue Jan 12 16:34:09 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Jan 2016 22:34:09 +0100 Subject: [Python-Dev] PEP 509 In-Reply-To: <56954B68.3090208@stoneleaf.us> References: <56954B68.3090208@stoneleaf.us> Message-ID: 2016-01-12 19:52 GMT+01:00 Ethan Furman : > [1] We're not going to call it __version__ are we? Seems like > __cache_token__ is a much better name. 
See the online version for the most recent version of the PEP: https://www.python.org/dev/peps/pep-0509/ In the first version I proposed to expose the version, but I changed my mind: it's now private (only exposed in the C API). Victor From victor.stinner at gmail.com Tue Jan 12 16:42:35 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Jan 2016 22:42:35 +0100 Subject: [Python-Dev] PEP 509 In-Reply-To: References: Message-ID: 2016-01-12 19:34 GMT+01:00 Jim J. Jewett : > (1) Please make it clear within the abstract what counts as a change. I don't think that an abstract must give the long list of cases when the version is modified or not. It's explained in detail at: https://www.python.org/dev/peps/pep-0509/#changes > (1b) Is there a way to force a version update? No. Why would you do that? (What is your use case?) FYI there is a private API in _testcapi to set the version, for unit tests. > (2) I would like to see a .get on the guard object, so that it could > be used in place of the dict lookup even from python. If this doesn't > make sense (e.g., doesn't really save time since the guard has to be > used from python), please mention that in the Guard Example text. Optimizations are out of the scope of this PEP. Please see https://www.python.org/dev/peps/pep-0510/ for more examples of specialization with guards. See also https://faster-cpython.readthedocs.org/fat_python.html for concrete optimizations using specialization with guards. > (3) It would be possible to define the field as reserved in the main > header, and require another header to use it even from C. > > (3a) This level of privacy might be overkill, but I would prefer that > the decision be explicit. > > (3b) The change should almost certainly be hidden from the ABI / Py_LIMITED_API Oh, the PyDictObject structure is not part of the stable ABI. It seems worth mentioning in the PEP.
Victor From victor.stinner at gmail.com Tue Jan 12 17:13:22 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 12 Jan 2016 23:13:22 +0100 Subject: [Python-Dev] PEP 510: Specialize functions with guards Message-ID: Hi, I posted a first version of this PEP on python-ideas and I got interesting feedback. The main change in this second version is that the whole API is now private: it is no longer exposed in Python at all, only in the Python C API (C level). The PEP is part of a series of 3 PEPs (509, 510, 511) adding an API to implement a static Python optimizer specializing functions with guards. I will post a first version of the PEP 511 to python-ideas soon. HTML version: https://www.python.org/dev/peps/pep-0510/#changes Victor PEP: 510 Title: Specialize functions with guards Version: $Revision$ Last-Modified: $Date$ Author: Victor Stinner Status: Draft Type: Standards Track Content-Type: text/x-rst Created: 4-January-2016 Python-Version: 3.6 Abstract ======== Add functions to the Python C API to specialize pure Python functions: add specialized codes with guards. It makes it possible to implement static optimizers respecting the Python semantics. Rationale ========= Python semantics ---------------- Python is hard to optimize because almost everything is mutable: builtin functions, function code, global variables, local variables, ... can be modified at runtime. Implementing optimizations that respect the Python semantics requires detecting when "something changes"; we will call these checks "guards". This PEP proposes to add a public API to the Python C API to add specialized codes with guards to a function. When the function is called, a specialized code is used if nothing changed; otherwise the original bytecode is used. Even if guards help to respect most parts of the Python semantics, it's hard to optimize Python without making subtle changes to the exact behaviour. CPython has a long history and many applications rely on implementation details.
A compromise must be found between "everything is mutable" and performance. Writing an optimizer is out of the scope of this PEP. Why not a JIT compiler? ----------------------- There are multiple JIT compilers for Python actively developed: * `PyPy `_ * `Pyston `_ * `Numba `_ * `Pyjion `_ Numba is specific to numerical computation. Pyston and Pyjion are still young. PyPy is the most complete Python interpreter; it is much faster than CPython and has very good compatibility with CPython (it respects the Python semantics). There are still issues with Python JIT compilers which prevent them from being widely used instead of CPython. Many popular libraries like numpy, PyGTK, PyQt, PySide and wxPython are implemented in C or C++ and use the Python C API. To have a small memory footprint and better performance, Python JIT compilers do not use reference counting (they use a faster garbage collector), do not use the C structures of CPython objects, and manage memory allocations differently. PyPy has a ``cpyext`` module which emulates the Python C API, but it has worse performance than CPython and does not support the full Python C API. New features are first developed in CPython. In January 2016, the latest CPython stable version is 3.5, whereas PyPy only supports Python 2.7 and 3.2, and Pyston only supports Python 2.7. Even if PyPy has very good compatibility with Python, some modules are still not compatible with PyPy: see `PyPy Compatibility Wiki `_. The incomplete support of the Python C API is part of this problem. There are also subtle differences between PyPy and CPython like reference counting: object destructors are always called in PyPy, but can be called "later" than in CPython. Using context managers helps to control when resources are released. Even if PyPy is much faster than CPython in a wide range of benchmarks, some users still report worse performance than CPython on specific use cases, or unstable performance.
When Python is used as a scripting language, for programs running less than 1 minute, JIT compilers can be slower because their startup time is higher and the JIT compiler takes time to optimize the code. For example, most Mercurial commands take a few seconds. Numba now supports ahead-of-time compilation, but it requires a decorator to specify argument types, and it only supports numerical types. CPython 3.5 has almost no optimization: the peephole optimizer only implements basic optimizations. A static compiler is a compromise between CPython 3.5 and PyPy. .. note:: There was also the Unladen Swallow project, but it was abandoned in 2011. Examples ======== The following examples are not written to show powerful optimizations promising important speedups, but to be short and easy to understand, just to explain the principle. Hypothetical myoptimizer module ------------------------------- Examples in this PEP use a hypothetical ``myoptimizer`` module which provides the following functions and types: * ``specialize(func, code, guards)``: add the specialized code `code` with guards `guards` to the function `func` * ``get_specialized(func)``: get the list of specialized codes as a list of ``(code, guards)`` tuples where `code` is a callable or code object and `guards` is a list of guards * ``GuardBuiltins(name)``: guard watching for ``builtins.__dict__[name]`` and ``globals()[name]``. The guard fails if ``builtins.__dict__[name]`` is replaced, or if ``globals()[name]`` is set.
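The hypothetical module can be roughly emulated in pure Python for experimentation. A sketch only: it models the registry and guard checks, not the actual call dispatch, and all names are assumptions based on the API described above (the guard ``check()`` return codes follow the convention used later in this PEP: ``0`` for success, ``2`` for permanent failure):

```python
import builtins

_registry = {}  # func -> list of (code, guards); stand-in for C-level storage

def specialize(func, code, guards):
    """Register a specialized code with its guards (mock)."""
    _registry.setdefault(func, []).append((code, guards))

def get_specialized(func):
    """Return surviving (code, guards) pairs, dropping entries whose
    guards fail permanently, mimicking the removal described below."""
    entries = _registry.get(func, [])
    entries[:] = [(code, guards) for (code, guards) in entries
                  if all(guard.check() == 0 for guard in guards)]
    return list(entries)

class GuardBuiltins:
    """Fail permanently if builtins.__dict__[name] is replaced
    or if globals()[name] is set."""
    def __init__(self, name):
        self.name = name
        self.builtin = builtins.__dict__.get(name)

    def check(self):
        if builtins.__dict__.get(self.name) is not self.builtin:
            return 2   # builtin replaced: will always fail
        if self.name in globals():
            return 2   # shadowed by a global: will always fail
        return 0
```

With this mock, the ``chr(65)`` example below behaves as described: after ``builtins.chr`` is replaced, the guard returns ``2`` and the specialized entry disappears from ``get_specialized()``.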
Using bytecode -------------- Add specialized bytecode where the call to the pure builtin function ``chr(65)`` is replaced with its result ``"A"``:: import myoptimizer def func(): return chr(65) def fast_func(): return "A" myoptimizer.specialize(func, fast_func.__code__, [myoptimizer.GuardBuiltins("chr")]) del fast_func Example showing the behaviour of the guard:: print("func(): %s" % func()) print("#specialized: %s" % len(myoptimizer.get_specialized(func))) print() import builtins builtins.chr = lambda obj: "mock" print("func(): %s" % func()) print("#specialized: %s" % len(myoptimizer.get_specialized(func))) Output:: func(): A #specialized: 1 func(): mock #specialized: 0 The first call uses the specialized bytecode which returns the string ``"A"``. The second call removes the specialized code because the builtin ``chr()`` function was replaced, and executes the original bytecode calling ``chr(65)``. On a microbenchmark, calling the specialized bytecode takes 88 ns, whereas the original function takes 145 ns (+57 ns): 1.6 times as fast. Using builtin function ---------------------- Add the C builtin ``chr()`` function as the specialized code instead of a bytecode calling ``chr(obj)``:: import myoptimizer def func(arg): return chr(arg) myoptimizer.specialize(func, chr, [myoptimizer.GuardBuiltins("chr")]) Example showing the behaviour of the guard:: print("func(65): %s" % func(65)) print("#specialized: %s" % len(myoptimizer.get_specialized(func))) print() import builtins builtins.chr = lambda obj: "mock" print("func(65): %s" % func(65)) print("#specialized: %s" % len(myoptimizer.get_specialized(func))) Output:: func(65): A #specialized: 1 func(65): mock #specialized: 0 The first call calls the C builtin ``chr()`` function (without creating a Python frame). The second call removes the specialized code because the builtin ``chr()`` function was replaced, and executes the original bytecode.
On a microbenchmark, calling the C builtin takes 95 ns, whereas the original bytecode takes 155 ns (+60 ns): 1.6 times as fast. Calling directly ``chr(65)`` takes 76 ns. Choose the specialized code =========================== Pseudo-code to choose the specialized code to call a pure Python function:: def call_func(func, args, kwargs): specialized = myoptimizer.get_specialized(func) nspecialized = len(specialized) index = 0 while index < nspecialized: specialized_code, guards = specialized[index] check = 0 for guard in guards: check = guard(args, kwargs) if check: break if not check: # all guards succeeded: # use the specialized code return specialized_code elif check == 1: # a guard failed temporarily: # try the next specialized code index += 1 else: assert check == 2 # a guard will always fail: # remove the specialized code del specialized[index] nspecialized -= 1 # if a guard of each specialized code failed, or if the function # has no specialized code, use original bytecode code = func.__code__ Changes ======= Changes to the Python C API: * Add a ``PyFuncGuardObject`` object and a ``PyFuncGuard_Type`` type * Add a ``PySpecializedCode`` structure * Add the following fields to the ``PyFunctionObject`` structure:: Py_ssize_t nb_specialized; PySpecializedCode *specialized; * Add function methods: * ``PyFunction_Specialize()`` * ``PyFunction_GetSpecializedCodes()`` * ``PyFunction_GetSpecializedCode()`` None of these functions and types are exposed at the Python level. All these additions are explicitly excluded from the stable ABI. When a function code is replaced (``func.__code__ = new_code``), all specialized codes and guards are removed. When a function is serialized by ``pickle``, specialized codes and guards are ignored (not serialized). Specialized codes and guards are not stored in ``.pyc`` files but created and registered at runtime, when a module is loaded.
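The selection algorithm from the "Choose the specialized code" section above can be exercised with plain Python stand-ins. A sketch only: ``StubGuard``, ``select_code`` and the string "codes" are placeholders for illustration, not the proposed C types:

```python
class StubGuard:
    """Stand-in guard: calling it returns 0 (success), 1 (temporary
    failure) or 2 (permanent failure), as specified for guards below."""
    def __init__(self, result=0):
        self.result = result

    def __call__(self, args, kwargs):
        return self.result

def select_code(specialized, original_code, args=(), kwargs=None):
    # Same algorithm as call_func() above, returning the code to run.
    kwargs = kwargs or {}
    nspecialized = len(specialized)
    index = 0
    while index < nspecialized:
        specialized_code, guards = specialized[index]
        check = 0
        for guard in guards:
            check = guard(args, kwargs)
            if check:
                break
        if not check:
            return specialized_code   # all guards succeeded
        elif check == 1:
            index += 1                # try the next specialized code
        else:
            del specialized[index]    # guard will always fail: remove entry
            nspecialized -= 1
    return original_code              # fall back to the original bytecode

specialized = [('fast', [StubGuard(2)]),    # permanently broken entry
               ('medium', [StubGuard(0)])]  # valid entry
assert select_code(specialized, 'original') == 'medium'
assert len(specialized) == 1                # broken entry was removed
```

Note that a permanently failing entry is removed in place, so the next call skips it entirely, while a temporary failure (return code ``1``) merely moves on to the next specialization.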
Function guard -------------- Add a function guard object:: typedef struct { PyObject ob_base; int (*init) (PyObject *guard, PyObject *func); int (*check) (PyObject *guard, PyObject **stack, int na, int nk); } PyFuncGuardObject; The ``init()`` function initializes a guard: * Return ``0`` on success * Return ``1`` if the guard will always fail: ``PyFunction_Specialize()`` must ignore the specialized code * Raise an exception and return ``-1`` on error The ``check()`` function checks a guard: * Return ``0`` on success * Return ``1`` if the guard failed temporarily * Return ``2`` if the guard will always fail: the specialized code must be removed * Raise an exception and return ``-1`` on error *stack* is an array of arguments: indexed arguments followed by (*key*, *value*) pairs of keyword arguments. *na* is the number of indexed arguments. *nk* is the number of keyword arguments: the number of (*key*, *value*) pairs. `stack` contains ``na + nk * 2`` objects. Specialized code ---------------- Add a specialized code structure:: typedef struct { PyObject *code; /* callable or code object */ Py_ssize_t nb_guard; PyObject **guards; /* PyFuncGuardObject objects */ } PySpecializedCode; Function methods ---------------- Add a function method to specialize the function, adding a specialized code with guards:: int PyFunction_Specialize(PyObject *func, PyObject *code, PyObject *guards) Result: * Return ``0`` on success * Return ``1`` if the specialization has been ignored * Raise an exception and return ``-1`` on error Add a function method to get the list of specialized codes:: PyObject* PyFunction_GetSpecializedCodes(PyObject *func) Return a list of (*code*, *guards*) tuples where *code* is a callable or code object and *guards* is a list of ``PyFuncGuard`` objects. Raise an exception and return ``NULL`` on error.
Add a function method to get the specialized code:: PyObject* PyFunction_GetSpecializedCode(PyObject *func, PyObject **stack, int na, int nk) See the ``check()`` function of guards for the *stack*, *na* and *nk* arguments. Return a callable or a code object on success. Raise an exception and return ``NULL`` on error. Benchmark --------- Microbenchmark on ``python3.6 -m timeit -s 'def f(): pass' 'f()'`` (best of 3 runs): * Original Python: 79 ns * Patched Python: 79 ns According to this microbenchmark, the changes have no overhead on calling a Python function without specialization. Other implementations of Python =============================== This PEP only contains changes to the Python C API; the Python API is unchanged. Other implementations of Python are free not to implement the new additions, or to implement the added functions as no-ops: * ``PyFunction_Specialize()``: always return ``1`` (the specialization has been ignored) * ``PyFunction_GetSpecializedCodes()``: always return an empty list * ``PyFunction_GetSpecializedCode()``: return the function code object, as the existing ``PyFunction_GET_CODE()`` macro Discussion ========== Thread on the python-ideas mailing list: `RFC: PEP: Specialized functions with guards `_. Copyright ========= This document has been placed in the public domain. From ethan at stoneleaf.us Tue Jan 12 17:24:50 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 12 Jan 2016 14:24:50 -0800 Subject: [Python-Dev] PEP 509 In-Reply-To: References: <56954B68.3090208@stoneleaf.us> Message-ID: <56957D32.6020401@stoneleaf.us> On 01/12/2016 01:34 PM, Victor Stinner wrote: > 2016-01-12 19:52 GMT+01:00 Ethan Furman : >> [1] We're not going to call it __version__ are we? Seems like >> __cache_token__ is a much better name. > > See the online version to the most recent version of the PEP: > https://www.python.org/dev/peps/pep-0509/ > > In the first version I proposed to expose the version, but I changed > my mind, it's not private (only exposed in the C API).
Even if not exposed at the Python layer, it's still exposed when working at the C layer. Is __version__ any less confusing there? (I only work in C when working on Python, and only occasionally, so my question is real.)

--
~Ethan~

From tjreedy at udel.edu Tue Jan 12 17:57:15 2016
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 12 Jan 2016 17:57:15 -0500
Subject: [Python-Dev] PEP 509
In-Reply-To: <56957D32.6020401@stoneleaf.us>
References: <56954B68.3090208@stoneleaf.us> <56957D32.6020401@stoneleaf.us>
Message-ID:

On 1/12/2016 5:24 PM, Ethan Furman wrote:
> On 01/12/2016 01:34 PM, Victor Stinner wrote:
>> 2016-01-12 19:52 GMT+01:00 Ethan Furman :
>>> [1] We're not going to call it __version__ are we? Seems like
>>> __cache_token__ is a much better name.

While I understand the rationale against __version__, it strikes me as a better description of what it is, and easier on the brain than __cache_token__. Maybe there is something even better, such as __seqnum__. This is literally what the attribute is, a sequence/revision number, as with the Python repository, without the connotations of 'version', as in 'Python version'. Each commit to CPython changes the repository state without changing the 'version'. A dict.update may run through 1000s of sequence numbers, but only the final result is a 'version' from the programmer's point of view.

--
Terry Jan Reedy

From victor.stinner at gmail.com Tue Jan 12 18:24:32 2016
From: victor.stinner at gmail.com (Victor Stinner)
Date: Wed, 13 Jan 2016 00:24:32 +0100
Subject: [Python-Dev] PEP 509
In-Reply-To: <56957D32.6020401@stoneleaf.us>
References: <56954B68.3090208@stoneleaf.us> <56957D32.6020401@stoneleaf.us>
Message-ID:

2016-01-12 23:24 GMT+01:00 Ethan Furman :
> Even if not exposed at the Python layer, it's still exposed when working at
> the C layer. Is __version__ any less confusing there? (I only work in C
> when working on Python, and only occasionally, so my question is real.)
Fields of the PyDictObject must be prefixed with "ma_". If you read the prior art of the PEP, you will see that Antoine Pitrou also proposed the "ma_version" name. The existing version on types uses the "version_tag" name. Maybe I should pick this one.

Dunder names like __version__ are not used in the C language.

Do you expect "__version__" to be somehow "protected" or "private"? The field is definitely public in the Python C API, and I don't think that it's a problem. It's not really a choice, there is no way to hide a field of a C structure. If you start to do random things on the C API, it's your responsibility :-)

Victor

From ethan at stoneleaf.us Tue Jan 12 18:50:56 2016
From: ethan at stoneleaf.us (Ethan Furman)
Date: Tue, 12 Jan 2016 15:50:56 -0800
Subject: [Python-Dev] PEP 509
In-Reply-To:
References: <56954B68.3090208@stoneleaf.us> <56957D32.6020401@stoneleaf.us>
Message-ID: <56959160.3020203@stoneleaf.us>

On 01/12/2016 03:24 PM, Victor Stinner wrote:
> 2016-01-12 23:24 GMT+01:00 Ethan Furman wrote:
>> Even if not exposed at the Python layer, it's still exposed when
>> working at the C layer. Is __version__ any less confusing there?
>> (I only work in C when working on Python, and only occasionally, so
>> my question is real.)
>
> Fields of the PyDictObject must be prefixed with "ma_". If you read
> the prior art of the PEP, you will see that Antoine Pitrou also
> proposed the "ma_version" name. The existing version on types uses the
> "version_tag" name. Maybe I should pick this one.
>
> Dunder names like __version__ are not used in the C language.
>
> Do you expect "__version__" to be somehow "protected" or "private"?

Nope, I just don't want to be misdirected when I see the name. :)

I think ma_version (or ma_seqnum) will be fine.
--
~Ethan~

From Eddy at Quicksall.com Tue Jan 12 21:55:50 2016
From: Eddy at Quicksall.com (Eddy Quicksall)
Date: Tue, 12 Jan 2016 21:55:50 -0500
Subject: [Python-Dev] Building with VS2015
Message-ID: <01f401d14dad$e984f1e0$bc8ed5a0$@com>

I downloaded https://www.python.org/ftp/python/3.4.4/Python-3.4.4.tgz.

After reading Python-3.4.4\PCbuild\readme.txt I opened pcbuild.sln and selected release/win32 then tried a build all (F7). But that gives lots of errors and missing .h files (mostly sqlite3.h).

Does anyone know what I may be doing wrong or have misunderstood?

Here is the output:

1>------ Build started: Project: make_versioninfo, Configuration: Release Win32 ------
2>------ Build started: Project: kill_python, Configuration: Release Win32 ------
3>------ Skipped Build: Project: bdist_wininst, Configuration: Release Win32 ------
3>Project not selected to build for this solution configuration
4>------ Build started: Project: python3dll, Configuration: Release Win32 ------
5>------ Build started: Project: xxlimited, Configuration: Release Win32 ------
1> make_versioninfo.c
5> xxlimited.c
2> kill_python.c
4>
4> Microsoft (R) Program Maintenance Utility Version 14.00.23026.0
4> Copyright (C) Microsoft Corporation. All rights reserved.
4>
4> lib /def:python34stub.def /out:W:\Python-3.4.4\PCbuild\python34stub.lib /MACHINE:x86
5>LINK : fatal error LNK1104: cannot open file 'python3.lib'
4> Microsoft (R) Library Manager Version 14.00.23026.0
4> Copyright (C) Microsoft Corporation. All rights reserved.
4>
4> Creating library W:\Python-3.4.4\PCbuild\python34stub.lib and object W:\Python-3.4.4\PCbuild\python34stub.exp
4> cl /LD /FeW:\Python-3.4.4\PCbuild\python3.dll python3dll.c python3.def W:\Python-3.4.4\PCbuild\python34stub.lib
4> Microsoft (R) C/C++ Optimizing Compiler Version 19.00.23026 for x86
4> Copyright (C) Microsoft Corporation. All rights reserved.
4> 4> python3dll.c 1> make_versioninfo.vcxproj -> W:\Python-3.4.4\PCbuild\make_versioninfo.exe 2> Generating code 2> All 5 functions were compiled because no usable IPDB/IOBJ from previous compilation was found. 2> Finished generating code 4> Microsoft (R) Incremental Linker Version 14.00.23026.0 4> Copyright (C) Microsoft Corporation. All rights reserved. 4> 4> /dll 4> /implib:W:\Python-3.4.4\PCbuild\python3.lib 4> /out:W:\Python-3.4.4\PCbuild\python3.dll 4> /def:python3.def 4> python3dll.obj 4> W:\Python-3.4.4\PCbuild\python34stub.lib 4> Creating library W:\Python-3.4.4\PCbuild\python3.lib and object W:\Python-3.4.4\PCbuild\python3.exp 6>------ Build started: Project: pylauncher, Configuration: Release Win32 ------ 7>------ Build started: Project: pywlauncher, Configuration: Release Win32 ------ 7> launcher.c 6> launcher.c 2> kill_python.vcxproj -> W:\Python-3.4.4\PCbuild\kill_python.exe 8>------ Build started: Project: pythoncore, Configuration: Release Win32 ------ 9>------ Build started: Project: sqlite3, Configuration: Release Win32 ------ 9> sqlite3.c 9>c1 : fatal error C1083: Cannot open source file: '..\externals\sqlite-3.8.11.0\sqlite3.c': No such file or directory 7> Generating code 7> All 41 functions were compiled because no usable IPDB/IOBJ from previous compilation was found. 7> Finished generating code 8> _bisectmodule.c 8> _codecsmodule.c 6> Generating code 6> All 41 functions were compiled because no usable IPDB/IOBJ from previous compilation was found. 6> Finished generating code 8> _collectionsmodule.c 7> pywlauncher.vcxproj -> W:\Python-3.4.4\PCbuild\pyw.exe 8> _csv.c 6> pylauncher.vcxproj -> W:\Python-3.4.4\PCbuild\py.exe 8> _functoolsmodule.c 8> _heapqmodule.c 8> _json.c 8> _localemodule.c 8> _lsprof.c 8> _math.c 8> _pickle.c 8> _randommodule.c 8> _sre.c 8> _stat.c 8> _struct.c 8> _weakref.c 8> arraymodule.c 8> atexitmodule.c 8> audioop.c 8> binascii.c 8> Generating Code... 8> Compiling... 
8> cmathmodule.c 8> _datetimemodule.c 8>..\Modules\_datetimemodule.c(4843): error C2065: 'daylight': undeclared identifier 8>..\Modules\_datetimemodule.c(4844): error C2065: 'tzname': undeclared identifier 8>..\Modules\_datetimemodule.c(4844): error C2109: subscript requires array or pointer type 8>..\Modules\_datetimemodule.c(4846): error C2065: 'tzname': undeclared identifier 8>..\Modules\_datetimemodule.c(4846): error C2109: subscript requires array or pointer type 8> errnomodule.c 8> faulthandler.c 8> gcmodule.c 8> hashtable.c 8> itertoolsmodule.c 8> main.c 8>..\Modules\main.c(521): error C2198: 'wcstok': too few arguments for call 8>..\Modules\main.c(523): error C2198: 'wcstok': too few arguments for call 8> mathmodule.c 8> md5module.c 8> mmapmodule.c 8> _opcode.c 8> _operator.c 8> parsermodule.c 8> posixmodule.c 8> rotatingtree.c 8> sha1module.c 8> sha256module.c 8> sha512module.c 8> signalmodule.c 8> Generating Code... 8> Compiling... 8> symtablemodule.c 8> _threadmodule.c 8> _tracemalloc.c 8> timemodule.c 8>..\Modules\timemodule.c(1325): error C2065: 'timezone': undeclared identifier 8>..\Modules\timemodule.c(1329): error C2065: 'timezone': undeclared identifier 8>..\Modules\timemodule.c(1331): error C2065: 'daylight': undeclared identifier 8>..\Modules\timemodule.c(1332): error C2065: 'tzname': undeclared identifier 8>..\Modules\timemodule.c(1332): error C2109: subscript requires array or pointer type 8>..\Modules\timemodule.c(1332): error C2198: 'PyUnicode_DecodeLocale': too few arguments for call 8>..\Modules\timemodule.c(1333): error C2065: 'tzname': undeclared identifier 8>..\Modules\timemodule.c(1333): error C2109: subscript requires array or pointer type 8>..\Modules\timemodule.c(1333): error C2198: 'PyUnicode_DecodeLocale': too few arguments for call 8> xxsubtype.c 8> zipimport.c 8> zlibmodule.c 8> fileio.c 8> bytesio.c 8> stringio.c 8> bufferedio.c 8>..\Modules\_io\bufferedio.c(321): warning C4244: 'function': conversion from 'double' to '__int64', 
possible loss of data 8> iobase.c 8> textio.c 8> _iomodule.c 8> adler32.c 8> compress.c 8> crc32.c 8> deflate.c 8> infback.c 8> inffast.c 8> Generating Code... 8> Compiling... 8> inflate.c 8> inftrees.c 8> trees.c 8> uncompr.c 8> zutil.c 8> _codecs_cn.c 8> _codecs_hk.c 8> _codecs_iso2022.c 8> _codecs_jp.c 8> _codecs_kr.c 8> _codecs_tw.c 8> multibytecodec.c 8> _winapi.c 8>..\Modules\_winapi.c(875): warning C4996: 'GetVersion': was declared deprecated 8> C:\Program Files (x86)\Windows Kits\8.1\Include\um\sysinfoapi.h(110): note: see declaration of 'GetVersion' 8> abstract.c 8> accu.c 8> boolobject.c 8> bytes_methods.c 8> bytearrayobject.c 8> bytesobject.c 8> capsule.c 8> Generating Code... 8> Compiling... 8> cellobject.c 8> classobject.c 8> codeobject.c 8> complexobject.c 8> descrobject.c 8> dictobject.c 8> enumobject.c 8> exceptions.c 8> fileobject.c 8> floatobject.c 8> frameobject.c 8> funcobject.c 8> genobject.c 8> iterobject.c 8> listobject.c 8> longobject.c 8> memoryobject.c 8> methodobject.c 8> moduleobject.c 8> namespaceobject.c 8> Generating Code... 8> Compiling... 8> object.c 8> obmalloc.c 8> rangeobject.c 8> setobject.c 8> sliceobject.c 8> structseq.c 8> tupleobject.c 8> typeobject.c 8> unicodectype.c 8> unicodeobject.c 8>..\Objects\unicodeobject.c(15011): warning C4996: 'GetVersionExA': was declared deprecated 8> C:\Program Files (x86)\Windows Kits\8.1\Include\um\sysinfoapi.h(433): note: see declaration of 'GetVersionExA' 8> weakrefobject.c 8> acceler.c 8> bitset.c 8> firstsets.c 8> grammar.c 8> grammar1.c 8> listnode.c 8> metagrammar.c 8> myreadline.c 8> node.c 8> Generating Code... 8> Compiling... 
8> parser.c 8> parsetok.c 8> tokenizer.c 8> winreg.c 8> config.c 8> dl_nt.c 8> getpathp.c 8>..\PC\getpathp.c(454): error C2198: 'wcstok': too few arguments for call 8>..\PC\getpathp.c(456): error C2198: 'wcstok': too few arguments for call 8>..\PC\getpathp.c(458): error C2198: 'wcstok': too few arguments for call 8> msvcrtmodule.c 8> pyhash.c 8> random.c 8> _warnings.c 8> asdl.c 8> ast.c 8> bltinmodule.c 8> ceval.c 8> codecs.c 8> compile.c 8> dynamic_annotations.c 8> dynload_win.c 8> errors.c 8> Generating Code... 8> Compiling... 8> fileutils.c 8> formatter_unicode.c 8> frozen.c 8> future.c 8> getargs.c 8> getcompiler.c 8> getcopyright.c 8> getopt.c 8> getplatform.c 8> getversion.c 8> graminit.c 8> import.c 8> importdl.c 8> marshal.c 8> modsupport.c 8> mysnprintf.c 8> mystrtoul.c 8> peephole.c 8> pyarena.c 8> pyctype.c 8> Generating Code... 8> Compiling... 8> pyfpe.c 8> pymath.c 8> pytime.c 8> pystate.c 8> pystrcmp.c 8> pystrtod.c 8> dtoa.c 8> Python-ast.c 8> pythonrun.c 8> structmember.c 8> symtable.c 8> sysmodule.c 8>..\Python\sysmodule.c(782): warning C4996: 'GetVersionExA': was declared deprecated 8> C:\Program Files (x86)\Windows Kits\8.1\Include\um\sysinfoapi.h(433): note: see declaration of 'GetVersionExA' 8> thread.c 8> traceback.c 8> Generating Code... 
10>------ Build started: Project: python, Configuration: Release Win32 ------ 11>------ Build started: Project: _socket, Configuration: Release Win32 ------ 12>------ Build started: Project: _decimal, Configuration: Release Win32 ------ 13>------ Build started: Project: _ctypes, Configuration: Release Win32 ------ 11> socketmodule.c 12> _decimal.c 13> _ctypes.c 12> basearith.c 10> python.c 12> constants.c 12> context.c 11>..\Modules\socketmodule.c(866): warning C4996: 'inet_addr': Use inet_pton() or InetPton() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr' 11>..\Modules\socketmodule.c(3749): warning C4996: 'WSADuplicateSocketA': Use WSADuplicateSocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2872): note: see declaration of 'WSADuplicateSocketA' 11>..\Modules\socketmodule.c(3959): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA' 11>..\Modules\socketmodule.c(3993): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA' 11>..\Modules\socketmodule.c(4399): warning C4996: 'gethostbyname': Use getaddrinfo() or GetAddrInfoW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2238): note: see declaration of 'gethostbyname' 11>..\Modules\socketmodule.c(4497): warning C4996: 'gethostbyaddr': Use getnameinfo() or GetNameInfoW() instead or define 
_WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2216): note: see declaration of 'gethostbyaddr' 11>..\Modules\socketmodule.c(4627): warning C4996: 'WSADuplicateSocketA': Use WSADuplicateSocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2872): note: see declaration of 'WSADuplicateSocketA' 11>..\Modules\socketmodule.c(4632): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA' 11>..\Modules\socketmodule.c(4921): warning C4996: 'inet_addr': Use inet_pton() or InetPton() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr' 11>..\Modules\socketmodule.c(4963): warning C4996: 'inet_ntoa': Use inet_ntop() or InetNtop() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1868): note: see declaration of 'inet_ntoa' 11>..\Modules\socketmodule.c(5037): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 12> convolute.c 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA' 11>..\Modules\socketmodule.c(5175): warning C4996: 'WSAAddressToStringA': Use WSAAddressToStringW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3556): note: see declaration of 'WSAAddressToStringA' 
11>..\Modules\socketmodule.c(5783): warning C4996: 'GetVersion': was declared deprecated 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\sysinfoapi.h(110): note: see declaration of 'GetVersion' 11>..\Modules\socketmodule.c(6922): warning C4996: 'inet_addr': Use inet_pton() or InetPton() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr' 11>..\Modules\socketmodule.c(6941): warning C4996: 'inet_ntoa': Use inet_ntop() or InetNtop() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 11> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1868): note: see declaration of 'inet_ntoa' 12> crt.c 13> callbacks.c 12> difradix2.c 12> fnt.c 12> fourstep.c 12> io.c 10>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 11>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 12> memory.c 14>------ Build started: Project: ssl, Configuration: Release Win32 ------ 15>------ Build started: Project: _ctypes_test, Configuration: Release Win32 ------ 13> callproc.c 12> mpdecimal.c 14> '"W:\Python-3.4.4\PCbuild\python.exe"' is not recognized as an internal or external command, 14> operable program or batch file. 14>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error MSB3073: The command "cd "W:\Python-3.4.4\PCbuild\" 14>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error MSB3073: "W:\Python-3.4.4\PCbuild\python.exe" build_ssl.py Release Win32 -a 14>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error MSB3073: " exited with code 9009. 
16>------ Build started: Project: _elementtree, Configuration: Release Win32 ------ 15> _ctypes_test.c 12> numbertheory.c 13> cfield.c 15>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 17>------ Build started: Project: _msi, Configuration: Release Win32 ------ 12> sixstep.c 12> transpose.c 12>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 18>------ Build started: Project: _sqlite3, Configuration: Release Win32 ------ 19>------ Build started: Project: _ssl, Configuration: Release Win32 ------ 18> cache.c 19> _ssl.c 18> connection.c 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 18> cursor.c 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 18> microprotocols.c 19>..\Modules\_ssl.c(59): fatal error C1083: Cannot open include file: 'openssl/rsa.h': No such file or directory 20>------ Build started: Project: _testcapi, Configuration: Release Win32 ------ 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 18> module.c 13> ffi.c 17> _msi.c 13> malloc_closure.c 13> prep_cif.c 13> stgdict.c 13> win32.c 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 18> prepare_protocol.c 17>LINK : fatal error LNK1181: cannot open input file 'fci.lib' 21>------ Build started: Project: _testimportmultiple, Configuration: Release Win32 ------ 18> row.c 13>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 20> _testcapimodule.c 22>------ Build started: Project: _tkinter, Configuration: Release Win32 ------ 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such 
file or directory 18> statement.c 21> _testimportmultiple.c 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 18> util.c 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error C1083: Cannot open include file: 'sqlite3.h': No such file or directory 21>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 23>------ Build started: Project: _bz2, Configuration: Release Win32 ------ 24>------ Build started: Project: select, Configuration: Release Win32 ------ 22> _tkinter.c 20>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 25>------ Build started: Project: _lzma, Configuration: Release Win32 ------ 25> _lzmamodule.c 23> _bz2module.c 25>..\Modules\_lzmamodule.c(19): fatal error C1083: Cannot open include file: 'lzma.h': No such file or directory 26>------ Build started: Project: unicodedata, Configuration: Release Win32 ------ 24> selectmodule.c 23>..\Modules\_bz2module.c(12): fatal error C1083: Cannot open include file: 'bzlib.h': No such file or directory 22>..\Modules\_tkinter.c(55): fatal error C1083: Cannot open include file: 'tcl.h': No such file or directory 23> blocksort.c 22> tkappinit.c 22>..\Modules\tkappinit.c(16): fatal error C1083: Cannot open include file: 'tcl.h': No such file or directory 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\blocksort.c': No such file or directory 23> bzlib.c 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\bzlib.c': No such file or directory 23> compress.c 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\compress.c': No such file or directory 23> crctable.c 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\crctable.c': No such file or directory 23> decompress.c 23>c1 : fatal error C1083: Cannot open source file: 
'..\externals\bzip2-1.0.6\decompress.c': No such file or directory 23> huffman.c 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\huffman.c': No such file or directory 23> randtable.c 23>c1 : fatal error C1083: Cannot open source file: '..\externals\bzip2-1.0.6\randtable.c': No such file or directory 27>------ Build started: Project: pyexpat, Configuration: Release Win32 ------ 24>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 26> unicodedata.c 28>------ Build started: Project: _hashlib, Configuration: Release Win32 ------ 29>------ Build started: Project: _multiprocessing, Configuration: Release Win32 ------ 27> pyexpat.c 28> _hashopenssl.c 29> multiprocessing.c 27> xmlparse.c 28>..\Modules\_hashopenssl.c(22): fatal error C1083: Cannot open include file: 'openssl/evp.h': No such file or directory 30>------ Build started: Project: pythonw, Configuration: Release Win32 ------ 27> xmlrole.c 27> xmltok.c 29> semaphore.c 27>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 31>------ Build started: Project: winsound, Configuration: Release Win32 ------ 30> WinMain.c 31> winsound.c 29>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 32>------ Build started: Project: _testbuffer, Configuration: Release Win32 ------ 26>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 33>------ Skipped Build: Project: _freeze_importlib, Configuration: Release Win32 ------ 33>Project not selected to build for this solution configuration 34>------ Build started: Project: _overlapped, Configuration: Release Win32 ------ 32> _testbuffer.c 34> overlapped.c 30>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib' 35>------ Build started: Project: _testembed, Configuration: Release Win32 ------ 35> _testembed.c 32>LINK : fatal error LNK1181: cannot open input file 
'W:\Python-3.4.4\PCbuild\python34.lib'
31>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib'
34>..\Modules\overlapped.c(977): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
34> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA'
34>..\Modules\overlapped.c(988): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
34> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA'
35>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib'
34>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.4.4\PCbuild\python34.lib'
========== Build: 5 succeeded, 28 failed, 1 up-to-date, 2 skipped ==========

From tritium-list at sdamon.com Tue Jan 12 22:32:26 2016
From: tritium-list at sdamon.com (Alexander Walters)
Date: Tue, 12 Jan 2016 22:32:26 -0500
Subject: [Python-Dev] Building with VS2015
In-Reply-To: <01f401d14dad$e984f1e0$bc8ed5a0$@com>
References: <01f401d14dad$e984f1e0$bc8ed5a0$@com>
Message-ID: <5695C54A.3070101@sdamon.com>

This is a mailing list for the development of Python itself, not support for building it. That said... 3.4 uses Visual Studio 2010, for starters. 3.5 uses 2015. It also looks like you have a lot of missing dependencies.

On 1/12/2016 21:55, Eddy Quicksall wrote:
> I downloaded https://www.python.org/ftp/python/3.4.4/Python-3.4.4.tgz.
>
> After reading Python-3.4.4\PCbuild\readme.txt I opened pcbuild.sln and
> selected release/win32 then tried a build all (F7). But that gives
> lots of errors and missing .h files (mostly sqlite3.h).
>
> Does anyone know what I may be doing wrong or have misunderstood?
>
> [...]
'WSADuplicateSocketA' > > 11>..\Modules\socketmodule.c(3959): warning C4996: 'WSASocketA': Use > WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to > disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of > 'WSASocketA' > > 11>..\Modules\socketmodule.c(3993): warning C4996: 'WSASocketA': Use > WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to > disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of > 'WSASocketA' > > 11>..\Modules\socketmodule.c(4399): warning C4996: 'gethostbyname': > Use getaddrinfo() or GetAddrInfoW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(2238): note: see declaration of > 'gethostbyname' > > 11>..\Modules\socketmodule.c(4497): warning C4996: 'gethostbyaddr': > Use getnameinfo() or GetNameInfoW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(2216): note: see declaration of > 'gethostbyaddr' > > 11>..\Modules\socketmodule.c(4627): warning C4996: > 'WSADuplicateSocketA': Use WSADuplicateSocketW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(2872): note: see declaration of > 'WSADuplicateSocketA' > > 11>..\Modules\socketmodule.c(4632): warning C4996: 'WSASocketA': Use > WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to > disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of > 'WSASocketA' > > 11>..\Modules\socketmodule.c(4921): warning C4996: 'inet_addr': Use > inet_pton() or InetPton() instead or define > 
_WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr' > > 11>..\Modules\socketmodule.c(4963): warning C4996: 'inet_ntoa': Use > inet_ntop() or InetNtop() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(1868): note: see declaration of 'inet_ntoa' > > 11>..\Modules\socketmodule.c(5037): warning C4996: > 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 12> convolute.c > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of > 'WSAStringToAddressA' > > 11>..\Modules\socketmodule.c(5175): warning C4996: > 'WSAAddressToStringA': Use WSAAddressToStringW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3556): note: see declaration of > 'WSAAddressToStringA' > > 11>..\Modules\socketmodule.c(5783): warning C4996: 'GetVersion': was > declared deprecated > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\sysinfoapi.h(110): note: see declaration of > 'GetVersion' > > 11>..\Modules\socketmodule.c(6922): warning C4996: 'inet_addr': Use > inet_pton() or InetPton() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr' > > 11>..\Modules\socketmodule.c(6941): warning C4996: 'inet_ntoa': Use > inet_ntop() or InetNtop() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 11> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(1868): note: see declaration of 'inet_ntoa' > > 12> crt.c > > 13> 
callbacks.c > > 12> difradix2.c > > 12> fnt.c > > 12> fourstep.c > > 12> io.c > > 10>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 11>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 12> memory.c > > 14>------ Build started: Project: ssl, Configuration: Release Win32 ------ > > 15>------ Build started: Project: _ctypes_test, Configuration: Release > Win32 ------ > > 13> callproc.c > > 12> mpdecimal.c > > 14> '"W:\Python-3.4.4\PCbuild\python.exe"' is not recognized as an > internal or external command, > > 14> operable program or batch file. > > 14>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error > MSB3073: The command "cd "W:\Python-3.4.4\PCbuild\" > > 14>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error > MSB3073: "W:\Python-3.4.4\PCbuild\python.exe" build_ssl.py Release > Win32 -a > > 14>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(37,5): error > MSB3073: " exited with code 9009. 
> > 16>------ Build started: Project: _elementtree, Configuration: Release > Win32 ------ > > 15> _ctypes_test.c > > 12> numbertheory.c > > 13> cfield.c > > 15>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 17>------ Build started: Project: _msi, Configuration: Release Win32 > ------ > > 12> sixstep.c > > 12> transpose.c > > 12>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 18>------ Build started: Project: _sqlite3, Configuration: Release > Win32 ------ > > 19>------ Build started: Project: _ssl, Configuration: Release Win32 > ------ > > 18> cache.c > > 19> _ssl.c > > 18> connection.c > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> cursor.c > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> microprotocols.c > > 19>..\Modules\_ssl.c(59): fatal error C1083: Cannot open include file: > 'openssl/rsa.h': No such file or directory > > 20>------ Build started: Project: _testcapi, Configuration: Release > Win32 ------ > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> module.c > > 13> ffi.c > > 17> _msi.c > > 13> malloc_closure.c > > 13> prep_cif.c > > 13> stgdict.c > > 13> win32.c > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> prepare_protocol.c > > 17>LINK : fatal error LNK1181: cannot open input file 'fci.lib' > > 21>------ Build started: Project: _testimportmultiple, Configuration: > Release Win32 ------ > > 18> row.c > > 13>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 20> _testcapimodule.c > > 22>------ Build started: 
Project: _tkinter, Configuration: Release > Win32 ------ > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> statement.c > > 21> _testimportmultiple.c > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 18> util.c > > 18>w:\python-3.4.4\modules\_sqlite\connection.h(33): fatal error > C1083: Cannot open include file: 'sqlite3.h': No such file or directory > > 21>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 23>------ Build started: Project: _bz2, Configuration: Release Win32 > ------ > > 24>------ Build started: Project: select, Configuration: Release Win32 > ------ > > 22> _tkinter.c > > 20>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 25>------ Build started: Project: _lzma, Configuration: Release Win32 > ------ > > 25> _lzmamodule.c > > 23> _bz2module.c > > 25>..\Modules\_lzmamodule.c(19): fatal error C1083: Cannot open > include file: 'lzma.h': No such file or directory > > 26>------ Build started: Project: unicodedata, Configuration: Release > Win32 ------ > > 24> selectmodule.c > > 23>..\Modules\_bz2module.c(12): fatal error C1083: Cannot open include > file: 'bzlib.h': No such file or directory > > 22>..\Modules\_tkinter.c(55): fatal error C1083: Cannot open include > file: 'tcl.h': No such file or directory > > 23> blocksort.c > > 22> tkappinit.c > > 22>..\Modules\tkappinit.c(16): fatal error C1083: Cannot open include > file: 'tcl.h': No such file or directory > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\blocksort.c': No such file or directory > > 23> bzlib.c > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\bzlib.c': No such file or directory > > 23> compress.c > > 23>c1 : fatal error C1083: 
Cannot open source file: > '..\externals\bzip2-1.0.6\compress.c': No such file or directory > > 23> crctable.c > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\crctable.c': No such file or directory > > 23> decompress.c > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\decompress.c': No such file or directory > > 23> huffman.c > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\huffman.c': No such file or directory > > 23> randtable.c > > 23>c1 : fatal error C1083: Cannot open source file: > '..\externals\bzip2-1.0.6\randtable.c': No such file or directory > > 27>------ Build started: Project: pyexpat, Configuration: Release > Win32 ------ > > 24>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 26> unicodedata.c > > 28>------ Build started: Project: _hashlib, Configuration: Release > Win32 ------ > > 29>------ Build started: Project: _multiprocessing, Configuration: > Release Win32 ------ > > 27> pyexpat.c > > 28> _hashopenssl.c > > 29> multiprocessing.c > > 27> xmlparse.c > > 28>..\Modules\_hashopenssl.c(22): fatal error C1083: Cannot open > include file: 'openssl/evp.h': No such file or directory > > 30>------ Build started: Project: pythonw, Configuration: Release > Win32 ------ > > 27> xmlrole.c > > 27> xmltok.c > > 29> semaphore.c > > 27>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 31>------ Build started: Project: winsound, Configuration: Release > Win32 ------ > > 30> WinMain.c > > 31> winsound.c > > 29>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 32>------ Build started: Project: _testbuffer, Configuration: Release > Win32 ------ > > 26>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 33>------ Skipped Build: Project: _freeze_importlib, Configuration: > Release 
Win32 ------ > > 33>Project not selected to build for this solution configuration > > 34>------ Build started: Project: _overlapped, Configuration: Release > Win32 ------ > > 32> _testbuffer.c > > 34> overlapped.c > > 30>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 35>------ Build started: Project: _testembed, Configuration: Release > Win32 ------ > > 35> _testembed.c > > 32>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 31>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 34>..\Modules\overlapped.c(977): warning C4996: 'WSAStringToAddressA': > Use WSAStringToAddressW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 34> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of > 'WSAStringToAddressA' > > 34>..\Modules\overlapped.c(988): warning C4996: 'WSAStringToAddressA': > Use WSAStringToAddressW() instead or define > _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings > > 34> C:\Program Files (x86)\Windows > Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of > 'WSAStringToAddressA' > > 35>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > 34>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.4.4\PCbuild\python34.lib' > > ========== Build: 5 succeeded, 28 failed, 1 up-to-date, 2 skipped > ========== > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zachary.ware+pydev at gmail.com Tue Jan 12 23:03:47 2016 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Tue, 12 Jan 2016 22:03:47 -0600 Subject: [Python-Dev] Building with VS2015 In-Reply-To: <5695C54A.3070101@sdamon.com> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> Message-ID: On Tue, Jan 12, 2016 at 9:32 PM, Alexander Walters wrote: > This is a mailing list for the development of python itself, not support for > building it. That said... > > 3.4 uses visual studio 2010, for starters. 3.5 uses 2015. Agreed with all of the above. You'll be much happier using either 3.5 with VS 2015, or 3.4 with VS 2010. You might coerce things the other way around, but you won't get official support for it and you won't match the rest of the world (in particular, binary packages from PyPI won't work). > It also looks like you have a lot of missing dependencies. Run `PCbuild\get_externals.bat` after installing Subversion (and ensuring that svn.exe is on PATH) to resolve the missing dependencies. The other errors will require patching the source, and it's too late in the 3.4 release cycle for those patches to be accepted for 3.4.5. I also find it much simpler to use `PCbuild\build.bat` instead of building from the VS GUI. A 32-bit Release build with all the optional pieces can be accomplished with `PCbuild\build.bat -e`; add '-p x64' for a 64-bit build. -- Zach From paulson at busiq.com Wed Jan 13 14:32:23 2016 From: paulson at busiq.com (Matthew Paulson) Date: Wed, 13 Jan 2016 14:32:23 -0500 Subject: [Python-Dev] Discussion related to memory leaks requested Message-ID: <5696A647.6090501@busiq.com> Hi: I've spent some time performing memory leak analysis while using Python in an embedded configuration. The pattern is: Py_Initialize(); ... run empty python source file ... Py_Finalize(); I've identified several suspect areas including dictionary maintenance in import.c: ~414 /* Clear the modules dict. 
*/ PyDict_Clear(modules); /* Restore the original builtins dict, to ensure that any user data gets cleared. */ dict = PyDict_Copy(interp->builtins); if (dict == NULL) PyErr_Clear(); PyDict_Clear(interp->builtins); if (PyDict_Update(interp->builtins, interp->builtins_copy)) PyErr_Clear(); Py_XDECREF(dict); /* Clear module dict copies stored in the interpreter state */ Is there someone in the group that would like to discuss this topic? There seem to be other leaks as well. I'm new to Python-dev, but willing to help or work with someone who is more familiar with these areas than I. Thanks, Matt -- -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MattSig.JPG Type: image/jpeg Size: 38491 bytes Desc: not available URL: From brett at python.org Wed Jan 13 15:06:11 2016 From: brett at python.org (Brett Cannon) Date: Wed, 13 Jan 2016 20:06:11 +0000 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5696A647.6090501@busiq.com> References: <5696A647.6090501@busiq.com> Message-ID: Probably the best way to handle this, Matthew, is to open issues at bugs.python.org for each of the leaks you have found and then they can be discussed there. And thanks for being willing to report these! On Wed, 13 Jan 2016 at 11:42 Matthew Paulson wrote: > Hi: > > I've spent some time performing memory leak analysis while using Python in > an embedded configuration. > > The pattern is: > > Py_Initialize(); > > ... run empty python source file ... > > Py_Finalize(); > > > I've identified several suspect areas including dictionary maintenance in > import.c: ~414 > > /* Clear the modules dict. */ > PyDict_Clear(modules); > /* Restore the original builtins dict, to ensure that any > user data gets cleared. 
*/ > dict = PyDict_Copy(interp->builtins); > if (dict == NULL) > PyErr_Clear(); > PyDict_Clear(interp->builtins); > if (PyDict_Update(interp->builtins, interp->builtins_copy)) > PyErr_Clear(); > Py_XDECREF(dict); > /* Clear module dict copies stored in the interpreter state */ > > > Is there someone in the group that would like to discuss this topic? > There seem to be other leaks as well. I'm new to Python-dev, but willing > to help or work with someone who is more familiar with these areas than I. > > Thanks, > > Matt > > > -- > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MattSig.JPG Type: image/jpeg Size: 38491 bytes Desc: not available URL: From vadmium+py at gmail.com Wed Jan 13 16:15:29 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Wed, 13 Jan 2016 21:15:29 +0000 Subject: [Python-Dev] Modifying the self-signed.pythontest.net certificate Message-ID: In order to fix the SSL test suite , I would like to modify the certificate used by https://self-signed.pythontest.net. So far I have a patch ready for the pythontestdotnet repository, but I want to know if I can just push to that repository, or if other steps are required. Judging by , Georg Brandl was involved in setting the server up. So far I haven't heard anything from him on the SSL tests bug, so I am asking here in case somebody else can help. 
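[For context on the kind of change discussed above: a replacement self-signed certificate can be produced with the openssl command-line tool. This is a minimal sketch only; the file names, key size, validity period, and subject are illustrative assumptions, not the actual pythontest.net configuration.]

```shell
# Generate a self-signed certificate and its private key in one step.
# All parameters (file names, 2048-bit RSA key, 10-year lifetime,
# subject CN) are assumptions for illustration.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout selfsigned.key -out selfsigned.pem \
    -days 3650 -subj "/CN=self-signed.pythontest.net"

# Show the subject and validity window of the new certificate.
openssl x509 -in selfsigned.pem -noout -subject -dates
```

[A test client can then trust such a certificate explicitly by loading selfsigned.pem into its verification store, e.g. with Python's ssl.SSLContext.load_verify_locations().]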
From Eddy at Quicksall.com Wed Jan 13 15:05:11 2016 From: Eddy at Quicksall.com (Eddy Quicksall) Date: Wed, 13 Jan 2016 15:05:11 -0500 Subject: [Python-Dev] Debugging using VS 2015 In-Reply-To: <5695C54A.3070101@sdamon.com> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> Message-ID: <003201d14e3d$b6560f80$23022e80$@com> I am using 3.5.1. I'm adding an extension for my special case. I know this list is for development and that's what I'm doing, but I would like to use VS 2015 to do my debugging. If there is another list I should use to get the build to work, then please let me know. I have been able to build using "PCbuild\build.bat -e" but I'm not able to build using PCbuild\pcbuild.sln. It gives lots of errors. I'm using Debug/Win32. Below is the output. Note that lots of .c files are missing... Is there another package I should get (but then why would it build correctly with build.bat)? It could be that I need to add the equivalent of the "-e" option. 1>------ Rebuild All started: Project: pythoncore, Configuration: Debug Win32 ------ 2>------ Rebuild All started: Project: tcl, Configuration: Debug Win32 ------ 3>------ Rebuild All started: Project: ssleay, Configuration: Debug Win32 ------ 4>------ Rebuild All started: Project: sqlite3, Configuration: Debug Win32 ------ 1> Killing any running python_d.exe instances... 1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(403,5): warning MSB3073: The command "hg id -b > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgbranch.txt"" exited with code 9009. 1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(404,5): warning MSB3073: The command "hg id -i > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgversion.txt"" exited with code 9009. 
1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(405,5): warning MSB3073: The command "hg id -t > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgtag.txt"" exited with code 9009. 2>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. Skipping... 5>------ Rebuild All started: Project: python3dll, Configuration: Debug Win32 ------ 1> _bisectmodule.c 5> python3dll.c 4> sqlite3.c 3> d1_both.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_both.c': No such file or directory 3> d1_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_lib.c': No such file or directory 1> _codecsmodule.c 3> d1_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_pkt.c': No such file or directory 3> d1_srtp.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_srtp.c': No such file or directory 3> s2_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_clnt.c': No such file or directory 3> s2_enc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_enc.c': No such file or directory 3> s2_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_lib.c': No such file or directory 3> s2_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_meth.c': No such file or directory 3> s2_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_pkt.c': No such file or directory 3> s2_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 
'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_srvr.c': No such file or directory 3> s23_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_clnt.c': No such file or directory 3> s23_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_lib.c': No such file or directory 3> s23_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_meth.c': No such file or directory 3> s23_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_pkt.c': No such file or directory 3> s23_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_srvr.c': No such file or directory 3> s3_both.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_both.c': No such file or directory 3> s3_cbc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_cbc.c': No such file or directory 3> s3_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_clnt.c': No such file or directory 3> s3_enc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_enc.c': No such file or directory 3> s3_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_lib.c': No such file or directory 3> Generating Code... 3> Compiling... 
3> s3_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_meth.c': No such file or directory 3> s3_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_pkt.c': No such file or directory 3> s3_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_srvr.c': No such file or directory 3> ssl_algs.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_algs.c': No such file or directory 3> ssl_asn1.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_asn1.c': No such file or directory 3> ssl_cert.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_cert.c': No such file or directory 3> ssl_ciph.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_ciph.c': No such file or directory 3> ssl_err.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err.c': No such file or directory 3> ssl_err2.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err2.c': No such file or directory 3> ssl_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_lib.c': No such file or directory 3> ssl_rsa.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_rsa.c': No such file or directory 3> ssl_sess.c 1> _collectionsmodule.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_sess.c': No such file or directory 3> t1_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_clnt.c': No such file or directory 3> t1_enc.c 3>c1 : fatal error C1083: 
Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_enc.c': No such file or directory 3> t1_lib.c 1> _csv.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_lib.c': No such file or directory 3> t1_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_meth.c': No such file or directory 3> t1_reneg.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_reneg.c': No such file or directory 3> t1_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_srvr.c': No such file or directory 3> tls_srp.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\tls_srp.c': No such file or directory 3> Generating Code... 6>------ Rebuild All started: Project: tk, Configuration: Debug Win32 ------ 1> _functoolsmodule.c 1> _heapqmodule.c 1> _json.c 1> _localemodule.c 4> Creating library W:\Python-3.5.1\PCBuild\win32\sqlite3_d.lib and object W:\Python-3.5.1\PCBuild\win32\sqlite3_d.exp 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_dstub.lib and object W:\Python-3.5.1\PCBuild\win32\python3_dstub.exp 1> _lsprof.c 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_d.lib and object W:\Python-3.5.1\PCBuild\win32\python3_d.exp 5> python3dll.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python3_d.dll 7>------ Rebuild All started: Project: libeay, Configuration: Debug Win32 ------ 6>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. Skipping... 
8>------ Skipped Rebuild All: Project: bdist_wininst, Configuration: Debug Win32 ------ 8>Project not selected to build for this solution configuration 9>------ Skipped Rebuild All: Project: xxlimited, Configuration: Release Win32 ------ 9>Project not selected to build for this solution configuration 10>------ Rebuild All started: Project: pylauncher, Configuration: Debug Win32 ------ 1> _math.c 4> sqlite3.vcxproj -> W:\Python-3.5.1\PCBuild\win32\sqlite3_d.dll 11>------ Rebuild All started: Project: pywlauncher, Configuration: Debug Win32 ------ 10> launcher.c 1> _pickle.c 7> nasm: fatal: unable to open input file `W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm' 1> _randommodule.c 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: The command "setlocal 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: set PATH=W:\Python-3.5.1\externals\\nasm-2.11.06\;%PATH% 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: nasm.exe -f win32 -o "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp\\win32_Debug\libeay\aes-586.ob j" "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm"" exited with code 1. 
12>------ Rebuild All started: Project: tix, Configuration: Debug Win32 ------
11> launcher.c
1> _sre.c
1> _stat.c
1> _struct.c
12> md Debug_VC13
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixClass.obj ..\generic\tixClass.c
12> tixClass.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixCmds.obj ..\generic\tixCmds.c
12> tixCmds.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixCompat.obj ..\generic\tixCompat.c
12> tixCompat.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiImg.obj ..\generic\tixDiImg.c
12> tixDiImg.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiITxt.obj ..\generic\tixDiITxt.c
12> tixDiITxt.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiStyle.obj ..\generic\tixDiStyle.c
12> tixDiStyle.c
11> pywlauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pyw_d.exe
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDItem.obj ..\generic\tixDItem.c
12> tixDItem.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiText.obj ..\generic\tixDiText.c
12> tixDiText.c
1> _weakref.c
1> arraymodule.c
1> atexitmodule.c
1> audioop.c
1> binascii.c
1> Generating Code...
1> Compiling...
1> cmathmodule.c
1> _datetimemodule.c
1> errnomodule.c
1> faulthandler.c
1> gcmodule.c
10> pylauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\py_d.exe
1> hashtable.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiWin.obj ..\generic\tixDiWin.c
12> tixDiWin.c
1> itertoolsmodule.c
1> main.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixError.obj ..\generic\tixError.c
12> tixError.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixForm.obj ..\generic\tixForm.c
1> mathmodule.c
12> tixForm.c
1> md5module.c
1> mmapmodule.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixFormMisc.obj ..\generic\tixFormMisc.c
12> tixFormMisc.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGeometry.obj ..\generic\tixGeometry.c
12> tixGeometry.c
1> _opcode.c
1> _operator.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrid.obj ..\generic\tixGrid.c
1> parsermodule.c
12> tixGrid.c
1> posixmodule.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrData.obj ..\generic\tixGrData.c
12> tixGrData.c
1> rotatingtree.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrRC.obj ..\generic\tixGrRC.c
1> sha1module.c
12> tixGrRC.c
1> sha256module.c
1> sha512module.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrFmt.obj ..\generic\tixGrFmt.c
12> tixGrFmt.c
1> signalmodule.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrSel.obj ..\generic\tixGrSel.c
12> tixGrSel.c
1> Generating Code...
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrUtl.obj ..\generic\tixGrUtl.c
12> tixGrUtl.c
1> Compiling...
1> symtablemodule.c
1> _threadmodule.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLCol.obj ..\generic\tixHLCol.c
12> tixHLCol.c
1> _tracemalloc.c
1> timemodule.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLHdr.obj ..\generic\tixHLHdr.c
12> tixHLHdr.c
1> xxsubtype.c
1> zipimport.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLInd.obj ..\generic\tixHLInd.c
12> tixHLInd.c
1> zlibmodule.c
1> fileio.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHList.obj ..\generic\tixHList.c
12> tixHList.c
1> bytesio.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixImgCmp.obj ..\generic\tixImgCmp.c
12> tixImgCmp.c
1> stringio.c
1> bufferedio.c
12>..\generic\tixImgCmp.c(169): warning C4090: 'function': different 'const' qualifiers
12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 2 different from declaration
12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 5 different from declaration
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixImgXpm.obj ..\generic\tixImgXpm.c
12> tixImgXpm.c
1> iobase.c
1> textio.c
12>..\generic\tixImgXpm.c(88): warning C4090: 'function': different 'const' qualifiers
12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 2 different from declaration
12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 5 different from declaration
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixInit.obj ..\generic\tixInit.c
1> _iomodule.c
12> tixInit.c
1> adler32.c
1> compress.c
1> crc32.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixList.obj ..\generic\tixList.c
1> deflate.c
12> tixList.c
1> infback.c
1> inffast.c
1> Generating Code...
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixMethod.obj ..\generic\tixMethod.c
1> Compiling...
1> inflate.c
12> tixMethod.c
1> inftrees.c
1> trees.c
1> uncompr.c
1> zutil.c
1> _codecs_cn.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixNBFrame.obj ..\generic\tixNBFrame.c
12> tixNBFrame.c
1> _codecs_hk.c
1> _codecs_iso2022.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixOption.obj ..\generic\tixOption.c
12> tixOption.c
1> _codecs_jp.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixSmpLs.obj ..\generic\tixSmpLs.c
1> _codecs_kr.c
12> tixSmpLs.c
1> _codecs_tw.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixScroll.obj ..\generic\tixScroll.c
12> tixScroll.c
1> multibytecodec.c
1> _winapi.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixTList.obj ..\generic\tixTList.c
12> tixTList.c
1> abstract.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixUtils.obj ..\generic\tixUtils.c
12> tixUtils.c
1> accu.c
1> boolobject.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWCmpt.obj ..\win\tixWCmpt.c
1> bytes_methods.c
12> tixWCmpt.c
1> bytearrayobject.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWidget.obj ..\generic\tixWidget.c
12> tixWidget.c
1> bytesobject.c
1> capsule.c
1> Generating Code...
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinDraw.obj ..\win\tixWinDraw.c
12> tixWinDraw.c
1> Compiling...
1> cellobject.c
1> classobject.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinXpm.obj ..\win\tixWinXpm.c
12> tixWinXpm.c
1> codeobject.c
1> complexobject.c
12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinWm.obj ..\win\tixWinWm.c
1> descrobject.c
12> tixWinWm.c
1> dictobject.c
12> link.exe -debug -debugtype:cv /RELEASE /NOLOGO /MACHINE:IX86 -dll W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib W:\Python-3.5.1\externals\tcl-core-8.6.4.2\win\Debug_VC13\tclstub86.lib kernel32.lib advapi32.lib user32.lib gdi32.lib comdlg32.lib -out:Debug_VC13\tix84g.dll @C:\Users\eddyq\AppData\Local\Temp\nmF029.tmp
12>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib'
12>NMAKE : fatal error U1077: '"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\link.exe"' : return code '0x49d'
12> Stop.
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: The command "setlocal
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: if not exist "W:\Python-3.5.1\externals\tcltk\lib\tix8.4.3\tix84g.dll" goto build
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: goto :eof
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: :build
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: set VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: cd /D "W:\Python-3.5.1\externals\tix-8.4.3.6\win"
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: nmake /nologo -f makefile.vc MACHINE=IX86 DEBUG=1 NODEBUG=0 TCL_DBGX=g TK_DBGX=g TCL_MAJOR=8 TCL_MINOR=6 TCL_PATCH=4 BUILDDIRTOP="Debug_VC13" TCL_DIR="W:\Python-3.5.1\externals\tcl-core-8.6.4.2" TK_DIR="W:\Python-3.5.1\externals\tk-8.6.4.2" INSTALL_DIR="W:\Python-3.5.1\externals\tcltk" all install
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: " exited with code 2.
1> enumobject.c
1> exceptions.c
1> fileobject.c
1> floatobject.c
1> frameobject.c
1> funcobject.c
1> genobject.c
1> iterobject.c
1> listobject.c
1> longobject.c
1> memoryobject.c
1> methodobject.c
1> moduleobject.c
1> namespaceobject.c
1> Generating Code...
1> Compiling...
1> object.c
1> obmalloc.c
1> odictobject.c
1> rangeobject.c
1> setobject.c
1> sliceobject.c
1> structseq.c
1> tupleobject.c
1> typeobject.c
1> unicodectype.c
1> unicodeobject.c
1> weakrefobject.c
1> acceler.c
1> bitset.c
1> firstsets.c
1> grammar.c
1> grammar1.c
1> listnode.c
1> metagrammar.c
1> myreadline.c
1> Generating Code...
1> Compiling...
1> node.c
1> parser.c
1> parsetok.c
1> tokenizer.c
1> invalid_parameter_handler.c
1> winreg.c
1> config.c
1> getpathp.c
1> msvcrtmodule.c
1> pyhash.c
1> random.c
1> _warnings.c
1> asdl.c
1> ast.c
1> bltinmodule.c
1> ceval.c
1> codecs.c
1> compile.c
1> dynamic_annotations.c
1> dynload_win.c
1> Generating Code...
1> Compiling...
1> errors.c
1> fileutils.c
1> formatter_unicode.c
1> frozen.c
1> future.c
1> getargs.c
1> getcompiler.c
1> getcopyright.c
1> getopt.c
1> getplatform.c
1> getversion.c
1> graminit.c
1> import.c
1> importdl.c
1> marshal.c
1> modsupport.c
1> mysnprintf.c
1> mystrtoul.c
1> peephole.c
1> pyarena.c
1> Generating Code...
1> Compiling...
1> pyctype.c
1> pyfpe.c
1> pylifecycle.c
1> pymath.c
1> pytime.c
1> pystate.c
1> pystrcmp.c
1> pystrhex.c
1> pystrtod.c
1> dtoa.c
1> Python-ast.c
1> pythonrun.c
1> structmember.c
1> symtable.c
1> sysmodule.c
1> thread.c
1> traceback.c
1> dl_nt.c
1> Generating Code...
1> getbuildinfo.c
1> Creating library W:\Python-3.5.1\PCBuild\win32\python35_d.lib and object W:\Python-3.5.1\PCBuild\win32\python35_d.exp
1> pythoncore.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python35_d.dll
13>------ Rebuild All started: Project: _socket, Configuration: Debug Win32 ------
14>------ Rebuild All started: Project: _ctypes_test, Configuration: Debug Win32 ------
15>------ Rebuild All started: Project: _elementtree, Configuration: Debug Win32 ------
16>------ Rebuild All started: Project: _msi, Configuration: Debug Win32 ------
15> _elementtree.c
13> socketmodule.c
16> _msi.c
14> _ctypes_test.c
15> xmlparse.c
15> xmlrole.c
15> xmltok.c
15> Generating Code...
13>..\Modules\socketmodule.c(1014): warning C4996: 'inet_addr': Use inet_pton() or InetPton() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr'
13>..\Modules\socketmodule.c(4040): warning C4996: 'WSADuplicateSocketA': Use WSADuplicateSocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2872): note: see declaration of 'WSADuplicateSocketA'
13>..\Modules\socketmodule.c(4254): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA'
13>..\Modules\socketmodule.c(4288): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA'
13>..\Modules\socketmodule.c(4694): warning C4996: 'gethostbyname': Use getaddrinfo() or GetAddrInfoW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2238): note: see declaration of 'gethostbyname'
13>..\Modules\socketmodule.c(4792): warning C4996: 'gethostbyaddr': Use getnameinfo() or GetNameInfoW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2216): note: see declaration of 'gethostbyaddr'
13>..\Modules\socketmodule.c(4922): warning C4996: 'WSADuplicateSocketA': Use WSADuplicateSocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(2872): note: see declaration of 'WSADuplicateSocketA'
13>..\Modules\socketmodule.c(4927): warning C4996: 'WSASocketA': Use WSASocketW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3457): note: see declaration of 'WSASocketA'
13>..\Modules\socketmodule.c(5216): warning C4996: 'inet_addr': Use inet_pton() or InetPton() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1850): note: see declaration of 'inet_addr'
13>..\Modules\socketmodule.c(5259): warning C4996: 'inet_ntoa': Use inet_ntop() or InetNtop() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(1868): note: see declaration of 'inet_ntoa'
13>..\Modules\socketmodule.c(5333): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA'
13>..\Modules\socketmodule.c(5477): warning C4996: 'WSAAddressToStringA': Use WSAAddressToStringW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings
13> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3556): note: see declaration of 'WSAAddressToStringA'
14> Creating library W:\Python-3.5.1\PCBuild\win32\_ctypes_test_d.lib and object W:\Python-3.5.1\PCBuild\win32\_ctypes_test_d.exp
16> Creating library W:\Python-3.5.1\PCBuild\win32\_msi_d.lib and object W:\Python-3.5.1\PCBuild\win32\_msi_d.exp
13> Creating library W:\Python-3.5.1\PCBuild\win32\_socket_d.lib and object W:\Python-3.5.1\PCBuild\win32\_socket_d.exp
15> Creating library W:\Python-3.5.1\PCBuild\win32\_elementtree_d.lib and object W:\Python-3.5.1\PCBuild\win32\_elementtree_d.exp
14> _ctypes_test.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_ctypes_test_d.pyd
16> _msi.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_msi_d.pyd
13> _socket.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_socket_d.pyd
15> _elementtree.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_elementtree_d.pyd
17>------ Rebuild All started: Project: _sqlite3, Configuration: Debug Win32 ------
18>------ Rebuild All started: Project: _ssl, Configuration: Debug Win32 ------
19>------ Rebuild All started: Project: _testcapi, Configuration: Debug Win32 ------
20>------ Rebuild All started: Project: _testimportmultiple, Configuration: Debug Win32 ------
19> _testcapimodule.c
20> _testimportmultiple.c
17> cache.c
18> _ssl.c
17> connection.c
17> cursor.c
20> Creating library W:\Python-3.5.1\PCBuild\win32\_testimportmultiple_d.lib and object W:\Python-3.5.1\PCBuild\win32\_testimportmultiple_d.exp
17> microprotocols.c
20> _testimportmultiple.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_testimportmultiple_d.pyd
18>..\Modules\_ssl.c(59): fatal error C1083: Cannot open include file: 'openssl/rsa.h': No such file or directory
21>------ Rebuild All started: Project: _tkinter, Configuration: Debug Win32 ------
17> module.c
22>------ Rebuild All started: Project: _bz2, Configuration: Debug Win32 ------
17> prepare_protocol.c
19> Creating library W:\Python-3.5.1\PCBuild\win32\_testcapi_d.lib and object W:\Python-3.5.1\PCBuild\win32\_testcapi_d.exp
22> _bz2module.c
17> row.c
19> _testcapi.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_testcapi_d.pyd
23>------ Rebuild All started: Project: select, Configuration: Debug Win32 ------
23> selectmodule.c
22> blocksort.c
17> statement.c
21> _tkinter.c
17> util.c
17> Generating Code...
23> Creating library W:\Python-3.5.1\PCBuild\win32\select_d.lib and object W:\Python-3.5.1\PCBuild\win32\select_d.exp 21>..\Modules\_tkinter.c(49): fatal error C1083: Cannot open include file: 'tcl.h': No such file or directory 21> tkappinit.c 21>..\Modules\tkappinit.c(16): fatal error C1083: Cannot open include file: 'tcl.h': No such file or directory 21> Generating Code... 24>------ Rebuild All started: Project: _lzma, Configuration: Debug Win32 ------ 22> bzlib.c 23> select.vcxproj -> W:\Python-3.5.1\PCBuild\win32\select_d.pyd 25>------ Rebuild All started: Project: pyexpat, Configuration: Debug Win32 ------ 24> _lzmamodule.c 17> Creating library W:\Python-3.5.1\PCBuild\win32\_sqlite3_d.lib and object W:\Python-3.5.1\PCBuild\win32\_sqlite3_d.exp 17> _sqlite3.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_sqlite3_d.pyd 24> Creating library W:\Python-3.5.1\PCBuild\win32\_lzma_d.lib and object W:\Python-3.5.1\PCBuild\win32\_lzma_d.exp 24> _lzma.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_lzma_d.pyd 26>------ Rebuild All started: Project: _hashlib, Configuration: Debug Win32 ------ 26> _hashopenssl.c 26>..\Modules\_hashopenssl.c(23): fatal error C1083: Cannot open include file: 'openssl/evp.h': No such file or directory 27>------ Rebuild All started: Project: python, Configuration: Debug Win32 ------ 27> python.c 25> pyexpat.c 25> xmlparse.c 22> compress.c 25> xmlrole.c 25> xmltok.c 25> Generating Code... 22> crctable.c 22> decompress.c 22> huffman.c 22> randtable.c 28>------ Rebuild All started: Project: _multiprocessing, Configuration: Debug Win32 ------ 27> python.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python_d.exe 29>------ Rebuild All started: Project: pythonw, Configuration: Debug Win32 ------ 28> multiprocessing.c 22> Generating Code... 
29> WinMain.c 25> Creating library W:\Python-3.5.1\PCBuild\win32\pyexpat_d.lib and object W:\Python-3.5.1\PCBuild\win32\pyexpat_d.exp 25> pyexpat.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pyexpat_d.pyd 28> semaphore.c 30>------ Rebuild All started: Project: _testbuffer, Configuration: Debug Win32 ------ 30> _testbuffer.c 28> Generating Code... 22> Creating library W:\Python-3.5.1\PCBuild\win32\_bz2_d.lib and object W:\Python-3.5.1\PCBuild\win32\_bz2_d.exp 28> Creating library W:\Python-3.5.1\PCBuild\win32\_multiprocessing_d.lib and object W:\Python-3.5.1\PCBuild\win32\_multiprocessing_d.exp 22> _bz2.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_bz2_d.pyd 31>------ Skipped Rebuild All: Project: _freeze_importlib, Configuration: Debug Win32 ------ 31>Project not selected to build for this solution configuration 32>------ Rebuild All started: Project: _overlapped, Configuration: Debug Win32 ------ 30> Creating library W:\Python-3.5.1\PCBuild\win32\_testbuffer_d.lib and object W:\Python-3.5.1\PCBuild\win32\_testbuffer_d.exp 30> _testbuffer.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_testbuffer_d.pyd 33>------ Rebuild All started: Project: _testembed, Configuration: Debug Win32 ------ 32> overlapped.c 33> _testembed.c 28> _multiprocessing.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_multiprocessing_d.pyd 34>------ Rebuild All started: Project: _testmultiphase, Configuration: Debug Win32 ------ 34> _testmultiphase.c 29> pythonw.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pythonw_d.exe 35>------ Rebuild All started: Project: winsound, Configuration: Debug Win32 ------ 32>..\Modules\overlapped.c(977): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define _WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 32> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA' 32>..\Modules\overlapped.c(988): warning C4996: 'WSAStringToAddressA': Use WSAStringToAddressW() instead or define 
_WINSOCK_DEPRECATED_NO_WARNINGS to disable deprecated API warnings 32> C:\Program Files (x86)\Windows Kits\8.1\Include\um\winsock2.h(3623): note: see declaration of 'WSAStringToAddressA' 34> Creating library W:\Python-3.5.1\PCBuild\win32\_testmultiphase_d.lib and object W:\Python-3.5.1\PCBuild\win32\_testmultiphase_d.exp 35> winsound.c 33> _testembed.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_testembed_d.exe 36>------ Rebuild All started: Project: _decimal, Configuration: Debug Win32 ------ 32> Creating library W:\Python-3.5.1\PCBuild\win32\_overlapped_d.lib and object W:\Python-3.5.1\PCBuild\win32\_overlapped_d.exp 34> _testmultiphase.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_testmultiphase_d.pyd 37>------ Rebuild All started: Project: _ctypes, Configuration: Debug Win32 ------ 36> _decimal.c 37> _ctypes.c 36> basearith.c 36> constants.c 36> context.c 32> _overlapped.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_overlapped_d.pyd 38>------ Rebuild All started: Project: unicodedata, Configuration: Debug Win32 ------ 36> convolute.c 36> crt.c 38> unicodedata.c 36> difradix2.c 35> Creating library W:\Python-3.5.1\PCBuild\win32\winsound_d.lib and object W:\Python-3.5.1\PCBuild\win32\winsound_d.exp 37> callbacks.c 36> fnt.c 36> fourstep.c 35> winsound.vcxproj -> W:\Python-3.5.1\PCBuild\win32\winsound_d.pyd 36> io.c 36> memory.c 36> mpdecimal.c 37> callproc.c 36> numbertheory.c 36> sixstep.c 36> transpose.c 36> Generating Code... 38> Creating library W:\Python-3.5.1\PCBuild\win32\unicodedata_d.lib and object W:\Python-3.5.1\PCBuild\win32\unicodedata_d.exp 37> cfield.c 38> unicodedata.vcxproj -> W:\Python-3.5.1\PCBuild\win32\unicodedata_d.pyd 36> Creating library W:\Python-3.5.1\PCBuild\win32\_decimal_d.lib and object W:\Python-3.5.1\PCBuild\win32\_decimal_d.exp 36> _decimal.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_decimal_d.pyd 37> ffi.c 37> malloc_closure.c 37> prep_cif.c 37> stgdict.c 37> win32.c 37> Generating Code... 
37> Creating library W:\Python-3.5.1\PCBuild\win32\_ctypes_d.lib and object W:\Python-3.5.1\PCBuild\win32\_ctypes_d.exp
37> _ctypes.vcxproj -> W:\Python-3.5.1\PCBuild\win32\_ctypes_d.pyd
========== Rebuild All: 29 succeeded, 6 failed, 3 skipped ==========
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From brett at python.org Wed Jan 13 16:30:07 2016
From: brett at python.org (Brett Cannon)
Date: Wed, 13 Jan 2016 21:30:07 +0000
Subject: [Python-Dev] Debugging using VS 2015
In-Reply-To: <003201d14e3d$b6560f80$23022e80$@com>
References: <01f401d14dad$e984f1e0$bc8ed5a0$@com>
 <5695C54A.3070101@sdamon.com>
 <003201d14e3d$b6560f80$23022e80$@com>
Message-ID:

Use PCbuild/get_externals.bat to download the external dependencies.

On Wed, 13 Jan 2016 at 13:25 Eddy Quicksall wrote:

> I am using 3.5.1. I'm adding an extension for my special case.
>
> I know this list is for development and that's what I'm doing, but I would
> like to use VS 2015 to do my debugging. If there is another list I should
> use to get the build to work then please let me know the list.
>
> I have been able to build using "PCbuild\build.bat -e", but I'm not able to
> build using PCbuild\pcbuild.sln. It gives lots of errors.
>
> I'm using Debug/Win32.
>
> Below is the output. Note that lots of .c files are missing. Is there
> another package I should get (but then why would it build correctly with
> build.bat)? It could be that I need to add the equivalent of the "-e"
> option.
>
> 1>------ Rebuild All started: Project: pythoncore, Configuration: Debug Win32 ------
> 2>------ Rebuild All started: Project: tcl, Configuration: Debug Win32 ------
> 3>------ Rebuild All started: Project: ssleay, Configuration: Debug Win32 ------
> 4>------ Rebuild All started: Project: sqlite3, Configuration: Debug Win32 ------
> 1> Killing any running python_d.exe instances...
> > 1> 'hg' is not recognized as an internal or external command, > > 1> operable program or batch file. > > 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(403,5): warning MSB3073: The > command "hg id -b > > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgbranch.txt"" exited > with code 9009. > > 1> 'hg' is not recognized as an internal or external command, > > 1> operable program or batch file. > > 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(404,5): warning MSB3073: The > command "hg id -i > > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgversion.txt"" exited > with code 9009. > > 1> 'hg' is not recognized as an internal or external command, > > 1> operable program or batch file. > > 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(405,5): warning MSB3073: The > command "hg id -t > > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgtag.txt"" exited > with code 9009. > > 2>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): > warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. > Skipping... 
> > 5>------ Rebuild All started: Project: python3dll, Configuration: Debug > Win32 ------ > > 1> _bisectmodule.c > > 5> python3dll.c > > 4> sqlite3.c > > 3> d1_both.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_both.c': No such file or > directory > > 3> d1_lib.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_lib.c': No such file or > directory > > 1> _codecsmodule.c > > 3> d1_pkt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_pkt.c': No such file or > directory > > 3> d1_srtp.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_srtp.c': No such file or > directory > > 3> s2_clnt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_clnt.c': No such file or > directory > > 3> s2_enc.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_enc.c': No such file or > directory > > 3> s2_lib.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_lib.c': No such file or > directory > > 3> s2_meth.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_meth.c': No such file or > directory > > 3> s2_pkt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_pkt.c': No such file or > directory > > 3> s2_srvr.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_srvr.c': No such file or > directory > > 3> s23_clnt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_clnt.c': No such file or > directory > > 3> s23_lib.c > > 3>c1 : fatal error C1083: Cannot open source file: > 
'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_lib.c': No such file or > directory > > 3> s23_meth.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_meth.c': No such file or > directory > > 3> s23_pkt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_pkt.c': No such file or > directory > > 3> s23_srvr.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_srvr.c': No such file or > directory > > 3> s3_both.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_both.c': No such file or > directory > > 3> s3_cbc.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_cbc.c': No such file or > directory > > 3> s3_clnt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_clnt.c': No such file or > directory > > 3> s3_enc.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_enc.c': No such file or > directory > > 3> s3_lib.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_lib.c': No such file or > directory > > 3> Generating Code... > > 3> Compiling... 
> > 3> s3_meth.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_meth.c': No such file or > directory > > 3> s3_pkt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_pkt.c': No such file or > directory > > 3> s3_srvr.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_srvr.c': No such file or > directory > > 3> ssl_algs.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_algs.c': No such file or > directory > > 3> ssl_asn1.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_asn1.c': No such file or > directory > > 3> ssl_cert.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_cert.c': No such file or > directory > > 3> ssl_ciph.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_ciph.c': No such file or > directory > > 3> ssl_err.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err.c': No such file or > directory > > 3> ssl_err2.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err2.c': No such file or > directory > > 3> ssl_lib.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_lib.c': No such file or > directory > > 3> ssl_rsa.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_rsa.c': No such file or > directory > > 3> ssl_sess.c > > 1> _collectionsmodule.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_sess.c': No such file or > directory > > 3> t1_clnt.c > > 3>c1 : fatal error 
C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_clnt.c': No such file or > directory > > 3> t1_enc.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_enc.c': No such file or > directory > > 3> t1_lib.c > > 1> _csv.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_lib.c': No such file or > directory > > 3> t1_meth.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_meth.c': No such file or > directory > > 3> t1_reneg.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_reneg.c': No such file or > directory > > 3> t1_srvr.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_srvr.c': No such file or > directory > > 3> tls_srp.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\tls_srp.c': No such file or > directory > > 3> Generating Code... 
> > 6>------ Rebuild All started: Project: tk, Configuration: Debug Win32 > ------ > > 1> _functoolsmodule.c > > 1> _heapqmodule.c > > 1> _json.c > > 1> _localemodule.c > > 4> Creating library W:\Python-3.5.1\PCBuild\win32\sqlite3_d.lib and > object W:\Python-3.5.1\PCBuild\win32\sqlite3_d.exp > > 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_dstub.lib > and object W:\Python-3.5.1\PCBuild\win32\python3_dstub.exp > > 1> _lsprof.c > > 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_d.lib and > object W:\Python-3.5.1\PCBuild\win32\python3_d.exp > > 5> python3dll.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python3_d.dll > > 7>------ Rebuild All started: Project: libeay, Configuration: Debug Win32 > ------ > > 6>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): > warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. > Skipping... > > 8>------ Skipped Rebuild All: Project: bdist_wininst, Configuration: Debug > Win32 ------ > > 8>Project not selected to build for this solution configuration > > 9>------ Skipped Rebuild All: Project: xxlimited, Configuration: Release > Win32 ------ > > 9>Project not selected to build for this solution configuration > > 10>------ Rebuild All started: Project: pylauncher, Configuration: Debug > Win32 ------ > > 1> _math.c > > 4> sqlite3.vcxproj -> W:\Python-3.5.1\PCBuild\win32\sqlite3_d.dll > > 11>------ Rebuild All started: Project: pywlauncher, Configuration: Debug > Win32 ------ > > 10> launcher.c > > 1> _pickle.c > > 7> nasm: fatal: unable to open input file > `W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm' > > 1> _randommodule.c > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: The command > "setlocal > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: set > PATH=W:\Python-3.5.1\externals\\nasm-2.11.06\;%PATH% > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: nasm.exe -f > win32 -o > 
"W:\Python-3.5.1\externals\openssl-1.0.2d\tmp\\win32_Debug\libeay\aes-586.obj" > "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm"" exited with > code 1. > > 12>------ Rebuild All started: Project: tix, Configuration: Debug Win32 > ------ > > 11> launcher.c > > 1> _sre.c > > 1> _stat.c > > 1> _struct.c > > 12> md Debug_VC13 > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixClass.obj ..\generic\tixClass.c > > 12> tixClass.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixCmds.obj ..\generic\tixCmds.c > > 12> tixCmds.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS 
-DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixCompat.obj ..\generic\tixCompat.c > > 12> tixCompat.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiImg.obj ..\generic\tixDiImg.c > > 12> tixDiImg.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiITxt.obj ..\generic\tixDiITxt.c > > 12> tixDiITxt.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiStyle.obj ..\generic\tixDiStyle.c > > 12> tixDiStyle.c > > 11> pywlauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pyw_d.exe > > 12> cl.exe -DWIN32 -D_WIN32 
-D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDItem.obj ..\generic\tixDItem.c > > 12> tixDItem.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiText.obj ..\generic\tixDiText.c > > 12> tixDiText.c > > 1> _weakref.c > > 1> arraymodule.c > > 1> atexitmodule.c > > 1> audioop.c > > 1> binascii.c > > 1> Generating Code... > > 1> Compiling... 
> > 1> cmathmodule.c > > 1> _datetimemodule.c > > 1> errnomodule.c > > 1> faulthandler.c > > 1> gcmodule.c > > 10> pylauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\py_d.exe > > 1> hashtable.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiWin.obj ..\generic\tixDiWin.c > > 12> tixDiWin.c > > 1> itertoolsmodule.c > > 1> main.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixError.obj ..\generic\tixError.c > > 12> tixError.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixForm.obj ..\generic\tixForm.c > > 1> mathmodule.c 
> > 12> tixForm.c > > 1> md5module.c > > 1> mmapmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixFormMisc.obj ..\generic\tixFormMisc.c > > 12> tixFormMisc.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGeometry.obj ..\generic\tixGeometry.c > > 12> tixGeometry.c > > 1> _opcode.c > > 1> _operator.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrid.obj ..\generic\tixGrid.c > > 1> parsermodule.c > > 12> tixGrid.c > > 1> posixmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD 
-FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrData.obj ..\generic\tixGrData.c > > 12> tixGrData.c > > 1> rotatingtree.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrRC.obj ..\generic\tixGrRC.c > > 1> sha1module.c > > 12> tixGrRC.c > > 1> sha256module.c > > 1> sha512module.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrFmt.obj ..\generic\tixGrFmt.c > > 12> tixGrFmt.c > > 1> signalmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > 
-IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrSel.obj ..\generic\tixGrSel.c > > 12> tixGrSel.c > > 1> Generating Code... > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrUtl.obj ..\generic\tixGrUtl.c > > 12> tixGrUtl.c > > 1> Compiling... 
> > 1> symtablemodule.c > > 1> _threadmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLCol.obj ..\generic\tixHLCol.c > > 12> tixHLCol.c > > 1> _tracemalloc.c > > 1> timemodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLHdr.obj ..\generic\tixHLHdr.c > > 12> tixHLHdr.c > > 1> xxsubtype.c > > 1> zipimport.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLInd.obj ..\generic\tixHLInd.c > > 12> tixHLInd.c > > 1> zlibmodule.c > > 1> fileio.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 
-nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHList.obj ..\generic\tixHList.c > > 12> tixHList.c > > 1> bytesio.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixImgCmp.obj ..\generic\tixImgCmp.c > > 12> tixImgCmp.c > > 1> stringio.c > > 1> bufferedio.c > > 12>..\generic\tixImgCmp.c(169): warning C4090: 'function': different > 'const' qualifiers > > 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 2 > different from declaration > > 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 5 > different from declaration > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > 
-FoDebug_VC13\tixImgXpm.obj ..\generic\tixImgXpm.c > > 12> tixImgXpm.c > > 1> iobase.c > > 1> textio.c > > 12>..\generic\tixImgXpm.c(88): warning C4090: 'function': different > 'const' qualifiers > > 12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 2 different > from declaration > > 12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 5 different > from declaration > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixInit.obj ..\generic\tixInit.c > > 1> _iomodule.c > > 12> tixInit.c > > 1> adler32.c > > 1> compress.c > > 1> crc32.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixList.obj ..\generic\tixList.c > > 1> deflate.c > > 12> tixList.c > > 1> infback.c > > 1> inffast.c > > 1> Generating Code... 
> > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixMethod.obj ..\generic\tixMethod.c > > 1> Compiling... > > 1> inflate.c > > 12> tixMethod.c > > 1> inftrees.c > > 1> trees.c > > 1> uncompr.c > > 1> zutil.c > > 1> _codecs_cn.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixNBFrame.obj ..\generic\tixNBFrame.c > > 12> tixNBFrame.c > > 1> _codecs_hk.c > > 1> _codecs_iso2022.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixOption.obj ..\generic\tixOption.c > > 12> tixOption.c > > 1> _codecs_jp.c > > 12> cl.exe -DWIN32 -D_WIN32 
-D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixSmpLs.obj ..\generic\tixSmpLs.c > > 1> _codecs_kr.c > > 12> tixSmpLs.c > > 1> _codecs_tw.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixScroll.obj ..\generic\tixScroll.c > > 12> tixScroll.c > > 1> multibytecodec.c > > 1> _winapi.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixTList.obj ..\generic\tixTList.c > > 12> tixTList.c > > 1> abstract.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > 
Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixUtils.obj ..\generic\tixUtils.c > > 12> tixUtils.c > > 1> accu.c > > 1> boolobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWCmpt.obj ..\win\tixWCmpt.c > > 1> bytes_methods.c > > 12> tixWCmpt.c > > 1> bytearrayobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWidget.obj ..\generic\tixWidget.c > > 12> tixWidget.c > > 1> bytesobject.c > > 1> capsule.c > > 1> Generating Code... 
> > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinDraw.obj ..\win\tixWinDraw.c > > 12> tixWinDraw.c > > 1> Compiling... > > 1> cellobject.c > > 1> classobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinXpm.obj ..\win\tixWinXpm.c > > 12> tixWinXpm.c > > 1> codeobject.c > > 1> complexobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinWm.obj ..\win\tixWinWm.c > > 1> descrobject.c > > 12> tixWinWm.c > > 1> dictobject.c > > 12> link.exe -debug -debugtype:cv /RELEASE /NOLOGO /MACHINE:IX86 -dll > 
W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib > W:\Python-3.5.1\externals\tcl-core-8.6.4.2\win\Debug_VC13\tclstub86.lib > kernel32.lib advapi32.lib user32.lib gdi32.lib comdlg32.lib > -out:Debug_VC13\tix84g.dll @C:\Users\eddyq\AppData\Local\Temp\nmF029.tmp > > 12>LINK : fatal error LNK1181: cannot open input file > 'W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib' > > 12>NMAKE : fatal error U1077: '"C:\Program Files (x86)\Microsoft Visual > Studio 14.0\VC\bin\link.exe"' : return code '0x49d' > > 12> Stop. > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: The command "setlocal > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: if not exist > "W:\Python-3.5.1\externals\tcltk\lib\tix8.4.3\tix84g.dll" goto build > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: goto :eof > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: :build > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: set VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual > Studio 14.0\VC\ > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: cd /D "W:\Python-3.5.1\externals\tix-8.4.3.6\win" > > 12>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: nmake /nologo -f makefile.vc MACHINE=IX86 DEBUG=1 > NODEBUG=0 TCL_DBGX=g TK_DBGX=g TCL_MAJOR=8 TCL_MINOR=6 TCL_PATCH=4 > BUILDDIRTOP="Debug_VC13" > TCL_DIR="W:\Python-3.5.1\externals\tcl-core-8.6.4.2" > TK_DIR="W:\Python-3.5.1\externals\tk-8.6.4.2" > INSTALL_DIR="W:\Python-3.5.1\externals\tcltk" all install > > 12>C:\Program Files > 
(x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): > error MSB3073: " exited with code 2. > > 1> enumobject.c > > 1> exceptions.c > > 1> fileobject.c > > 1> floatobject.c > > 1> frameobject.c > > 1> funcobject.c > > 1> genobject.c > > 1> iterobject.c > > 1> listobject.c > > 1> longobject.c > > 1> memoryobject.c > > 1> methodobject.c > > 1> moduleobject.c > > 1> namespaceobject.c > > 1> Generating Code... > > 1> Compiling... > > 1> object.c > > 1> obmalloc.c > > 1> odictobject.c > > 1> rangeobject.c > > 1> setobject.c > > 1> sliceobject.c > > 1> structseq.c > > 1> tupleobject.c > > 1> typeobject.c > > 1> unicodectype.c > > 1> unicodeobject.c > > 1> weakrefobject.c > > 1> acceler.c > > 1> bitset.c > > 1> firstsets.c > > 1> grammar.c > > 1> grammar1.c > > 1> listnode.c > > 1> metagrammar.c > > 1> myreadline.c > > 1> Generating Code... > > 1> Compiling... > > 1> node.c > > 1> parser.c > > 1> parsetok.c > > 1> tokenizer.c > > 1> invalid_parameter_handler.c > > 1> winreg.c > > 1> config.c > > 1> getpathp.c > > 1> msvcrtmodule.c > > 1> pyhash.c > > 1> random.c > > 1> _warnings.c > > 1> asdl.c > > 1> ast.c > > 1> bltinmodule.c > > 1> ceval.c > > 1> codecs.c > > 1> compile.c > < > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Wed Jan 13 17:10:30 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 13 Jan 2016 23:10:30 +0100 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5696A647.6090501@busiq.com> References: <5696A647.6090501@busiq.com> Message-ID: Hi, 2016-01-13 20:32 GMT+01:00 Matthew Paulson : > I've spent some time performing memory leak analysis while using Python in an embedded configuration. Hum, did you try tracemalloc? https://docs.python.org/dev/library/tracemalloc.html https://pytracemalloc.readthedocs.org/ > Is there someone in the group that would like to discuss this topic. There seems to be other leaks as well. 
I'm new to Python-dev, but willing to help or work with someone who is more familiar with these areas than I. Are you able to reproduce the leak with a simple program? Victor From victor.stinner at gmail.com Wed Jan 13 17:32:15 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 13 Jan 2016 23:32:15 +0100 Subject: [Python-Dev] PEP 510: Specialize functions with guards In-Reply-To: References: Message-ID: I extracted a patch from my FAT Python project to implement the PEP 510: https://bugs.python.org/issue26098 FYI I also extracted the runtime part of the FAT Python optimizer and put it on GitHub: https://github.com/haypo/fat The fat module provides specialize(), get_specialized() and replace_consts() functions, but also 6 different guards: GuardArgType, GuardBuiltins, GuardDict, GuardFunc, GuardGlobals, GuardTypeDict. Victor From paulson at busiq.com Wed Jan 13 17:49:41 2016 From: paulson at busiq.com (Matthew Paulson) Date: Wed, 13 Jan 2016 17:49:41 -0500 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: References: <5696A647.6090501@busiq.com> Message-ID: <5696D485.2030607@busiq.com> Hi Victor: No, I'm using the new heap analysis functions in DS2015. We think we have found one issue. In the following sequence, dict has no side effects, yet it is used -- unless someone can shed light on why dict is used in this case:

    /* Clear the modules dict. */
    PyDict_Clear(modules);
    /* Restore the original builtins dict, to ensure that any
       user data gets cleared. */
    dict = PyDict_Copy(interp->builtins);
    if (dict == NULL)
        PyErr_Clear();
    PyDict_Clear(interp->builtins);
    if (PyDict_Update(interp->builtins, interp->builtins_copy))
        PyErr_Clear();
    Py_XDECREF(dict);

And removing dict from this sequence seems to have fixed one of the issues, yielding 14k per iteration. Simple program: Good idea. We will try that -- right now it's embedded in a more complex environment, but we have tried to strip it down to a very simple sequence.
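[For reference, the tracemalloc workflow suggested earlier in the thread boils down to taking two snapshots around the suspect code and diffing them. A minimal, self-contained sketch -- the "leak" here is simulated with a list kept alive; in the embedded case the suspect init/run/finalize sequence would sit between the snapshots:]

```python
import tracemalloc

# Start tracing Python-level allocations, then snapshot before and
# after the code under suspicion.
tracemalloc.start()
before = tracemalloc.take_snapshot()

leaked = [bytearray(1024) for _ in range(100)]  # ~100 KiB kept alive

after = tracemalloc.take_snapshot()

# Diff the snapshots, grouped by source line; the largest growth
# comes first.
top = after.compare_to(before, 'lineno')
for stat in top[:3]:
    print(stat)
```

This only sees allocations made through Python's allocators while tracing is active, so it complements (rather than replaces) a native heap analyzer such as the DS2015 tools discussed here.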
The next item on our list is memory that is not getting freed after running simple string. It's in the parsertok sequence -- it seems that the syntax tree is not getting cleared -- but this opinion is preliminary. Best, Matt On 1/13/2016 5:10 PM, Victor Stinner wrote: > Hi, > > 2016-01-13 20:32 GMT+01:00 Matthew Paulson : >> I've spent some time performing memory leak analysis while using Python in an embedded configuration. > Hum, did you try tracemalloc? > > https://docs.python.org/dev/library/tracemalloc.html > https://pytracemalloc.readthedocs.org/ > >> Is there someone in the group that would like to discuss this topic. There seems to be other leaks as well. I'm new to Python-dev, but willing to help or work with someone who is more familiar with these areas than I. > Are you able to reproduce the leak with a simple program? > > Victor > > -- -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MattSig.JPG Type: image/jpeg Size: 38491 bytes Desc: not available URL: From abarnert at yahoo.com Wed Jan 13 18:45:52 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Wed, 13 Jan 2016 15:45:52 -0800 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5696D485.2030607@busiq.com> References: <5696A647.6090501@busiq.com> <5696D485.2030607@busiq.com> Message-ID: <6AFE80F6-AFF2-4B2E-BA98-E3F24C4CCF73@yahoo.com> On Jan 13, 2016, at 14:49, Matthew Paulson wrote: > > Hi Victor: > > No, I'm using the new heap analysis functions in DS2015. Isn't that going to report any memory that Python's higher level allocators hold in their freelists as leaked, even though it isn't leaked? > We think we have found one issue. In the following sequence, dict has no side effects, yet it is used -- unless someone can shed light on why dict is used in this case: Where do you see an issue here? 
The dict will have one ref, so the decref at the end should return it to the freelist. Also, it looks like there _is_ a side effect here. When you add a bunch of elements to a dict, it grows. When you delete a bunch of elements, it generally doesn't shrink. But when you clear the dict, it does shrink. So, copying it to a temporary dict, clearing it, updating it from the temporary dict, and then releasing the temporary dict should force it to shrink. So, the overall effect should be that you have a smaller hash table for the builtins dict, and a chunk of memory sitting on the freelists ready to be reused. If your analyzer is showing the freelists as leaked, this will look like a net leak rather than a net recovery, but that's just a problem in the analyzer. Of course I could be wrong, but I think the first step is to rule out the possibility that you're measuring the wrong thing... > /* Clear the modules dict. */ > PyDict_Clear(modules); > /* Restore the original builtins dict, to ensure that any > user data gets cleared. */ > dict = PyDict_Copy(interp->builtins); > if (dict == NULL) > PyErr_Clear(); > PyDict_Clear(interp->builtins); > if (PyDict_Update(interp->builtins, interp->builtins_copy)) > PyErr_Clear(); > Py_XDECREF(dict); > > And removing dict from this sequence seems to have fixed one of the issues, yielding 14k per iteration. > Simple program: Good idea. We will try that -- right now it's embedded in a more complex environment, but we have tried to strip it down to a very simple sequence. > > The next item on our list is memory that is not getting freed after running simple string. It's in the parsertok sequence -- it seems that the syntax tree is not getting cleared -- but this opinion is preliminary. > > Best, > > Matt > >> On 1/13/2016 5:10 PM, Victor Stinner wrote: >> Hi, >> >> 2016-01-13 20:32 GMT+01:00 Matthew Paulson : >>> I've spent some time performing memory leak analysis while using Python in an embedded configuration. 
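[The grow-on-insert / shrink-only-on-clear behaviour described above can be observed from pure Python with sys.getsizeof. A CPython-specific sketch; exact byte counts vary by version:]

```python
import sys

# Grow a dict: the hash table is enlarged as keys are inserted.
d = {i: i for i in range(1000)}
grown = sys.getsizeof(d)

# Deleting keys one by one does NOT shrink the table.
for k in list(d):
    del d[k]
after_del = sys.getsizeof(d)

# clear() releases the table, so the dict drops back to minimal size.
d.clear()
after_clear = sys.getsizeof(d)

print(grown, after_del, after_clear)
```

So a copy/clear/update cycle like the one quoted above ends with a freshly built (and therefore compact) table, with the old storage returned to the allocator -- which a naive heap diff can misreport as a leak.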
>> Hum, did you try tracemalloc? >> >> https://docs.python.org/dev/library/tracemalloc.html >> https://pytracemalloc.readthedocs.org/ >> >>> Is there someone in the group that would like to discuss this topic. There seems to be other leaks as well. I'm new to Python-dev, but willing to help or work with someone who is more familiar with these areas than I. >> Are you able to reproduce the leak with a simple program? >> >> Victor >> >> > > -- > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/abarnert%40yahoo.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From Eddy at Quicksall.com Wed Jan 13 18:56:35 2016 From: Eddy at Quicksall.com (Eddy Quicksall) Date: Wed, 13 Jan 2016 18:56:35 -0500 Subject: [Python-Dev] Debugging using VS 2015 In-Reply-To: References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> Message-ID: <005e01d14e5e$09660bb0$1c322310$@com> Those files already exist:

    W:\Python-3.5.1>PCbuild\get_externals.bat
    Fetching external libraries...
    bzip2-1.0.6 already exists, skipping.
    nasm-2.11.06 already exists, skipping.
    openssl-1.0.2d already exists, skipping.
    sqlite-3.8.11.0 already exists, skipping.
    tcl-core-8.6.4.2 already exists, skipping.
    tk-8.6.4.2 already exists, skipping.
    tix-8.4.3.6 already exists, skipping.
    xz-5.0.5 already exists, skipping.
    Finished.
    W:\Python-3.5.1>

From: Brett Cannon [mailto:brett at python.org] Sent: Wednesday, January 13, 2016 4:30 PM To: Eddy Quicksall; python-dev at python.org Subject: Re: [Python-Dev] Debugging using VS 2015 Use PCbuild/get_externals.bat to download the external dependencies. On Wed, 13 Jan 2016 at 13:25 Eddy Quicksall wrote: I am using 3.5.1. I'm adding an extension for my special case.
I know this list is for development and that's what I'm doing but I would like to use VS 2015 to do my debugging. If there is another list I should use to get the build to work then please let me know the list. I have been able to build using "PCbuild\build.bat -e" but I'm not able to build using PCbuild\pcbuild.sln. It gives lots of errors. I'm using Debug/Win32. Below is the output. Note that lots of .c files are missing... Is there another package I should get (but then why would it build correctly with build.bat)? It could be that I need to add the equivalent to the "-e" option. 1>------ Rebuild All started: Project: pythoncore, Configuration: Debug Win32 ------ 2>------ Rebuild All started: Project: tcl, Configuration: Debug Win32 ------ 3>------ Rebuild All started: Project: ssleay, Configuration: Debug Win32 ------ 4>------ Rebuild All started: Project: sqlite3, Configuration: Debug Win32 ------ 1> Killing any running python_d.exe instances... 1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(403,5): warning MSB3073: The command "hg id -b > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgbranch.txt"" exited with code 9009. 1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(404,5): warning MSB3073: The command "hg id -i > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgversion.txt"" exited with code 9009. 1> 'hg' is not recognized as an internal or external command, 1> operable program or batch file. 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(405,5): warning MSB3073: The command "hg id -t > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgtag.txt"" exited with code 9009. 2>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. Skipping...
5>------ Rebuild All started: Project: python3dll, Configuration: Debug Win32 ------ 1> _bisectmodule.c 5> python3dll.c 4> sqlite3.c 3> d1_both.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_both.c': No such file or directory 3> d1_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_lib.c': No such file or directory 1> _codecsmodule.c 3> d1_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_pkt.c': No such file or directory 3> d1_srtp.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_srtp.c': No such file or directory 3> s2_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_clnt.c': No such file or directory 3> s2_enc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_enc.c': No such file or directory 3> s2_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_lib.c': No such file or directory 3> s2_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_meth.c': No such file or directory 3> s2_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_pkt.c': No such file or directory 3> s2_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_srvr.c': No such file or directory 3> s23_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_clnt.c': No such file or directory 3> s23_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_lib.c': No such file or directory 3> s23_meth.c 3>c1 : fatal error C1083: Cannot open source file: 
'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_meth.c': No such file or directory 3> s23_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_pkt.c': No such file or directory 3> s23_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_srvr.c': No such file or directory 3> s3_both.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_both.c': No such file or directory 3> s3_cbc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_cbc.c': No such file or directory 3> s3_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_clnt.c': No such file or directory 3> s3_enc.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_enc.c': No such file or directory 3> s3_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_lib.c': No such file or directory 3> Generating Code... 3> Compiling... 
3> s3_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_meth.c': No such file or directory 3> s3_pkt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_pkt.c': No such file or directory 3> s3_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_srvr.c': No such file or directory 3> ssl_algs.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_algs.c': No such file or directory 3> ssl_asn1.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_asn1.c': No such file or directory 3> ssl_cert.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_cert.c': No such file or directory 3> ssl_ciph.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_ciph.c': No such file or directory 3> ssl_err.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err.c': No such file or directory 3> ssl_err2.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err2.c': No such file or directory 3> ssl_lib.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_lib.c': No such file or directory 3> ssl_rsa.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_rsa.c': No such file or directory 3> ssl_sess.c 1> _collectionsmodule.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_sess.c': No such file or directory 3> t1_clnt.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_clnt.c': No such file or directory 3> t1_enc.c 3>c1 : fatal error C1083: 
Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_enc.c': No such file or directory 3> t1_lib.c 1> _csv.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_lib.c': No such file or directory 3> t1_meth.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_meth.c': No such file or directory 3> t1_reneg.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_reneg.c': No such file or directory 3> t1_srvr.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_srvr.c': No such file or directory 3> tls_srp.c 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\tls_srp.c': No such file or directory 3> Generating Code... 6>------ Rebuild All started: Project: tk, Configuration: Debug Win32 ------ 1> _functoolsmodule.c 1> _heapqmodule.c 1> _json.c 1> _localemodule.c 4> Creating library W:\Python-3.5.1\PCBuild\win32\sqlite3_d.lib and object W:\Python-3.5.1\PCBuild\win32\sqlite3_d.exp 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_dstub.lib and object W:\Python-3.5.1\PCBuild\win32\python3_dstub.exp 1> _lsprof.c 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_d.lib and object W:\Python-3.5.1\PCBuild\win32\python3_d.exp 5> python3dll.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python3_d.dll 7>------ Rebuild All started: Project: libeay, Configuration: Debug Win32 ------ 6>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. Skipping... 
8>------ Skipped Rebuild All: Project: bdist_wininst, Configuration: Debug Win32 ------
8>Project not selected to build for this solution configuration
9>------ Skipped Rebuild All: Project: xxlimited, Configuration: Release Win32 ------
9>Project not selected to build for this solution configuration
10>------ Rebuild All started: Project: pylauncher, Configuration: Debug Win32 ------
1> _math.c
4> sqlite3.vcxproj -> W:\Python-3.5.1\PCBuild\win32\sqlite3_d.dll
11>------ Rebuild All started: Project: pywlauncher, Configuration: Debug Win32 ------
10> launcher.c
1> _pickle.c
7> nasm: fatal: unable to open input file `W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm'
1> _randommodule.c
7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: The command "setlocal
7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: set PATH=W:\Python-3.5.1\externals\\nasm-2.11.06\;%PATH%
7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: nasm.exe -f win32 -o "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp\\win32_Debug\libeay\aes-586.obj" "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm"" exited with code 1.
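The C1083 and nasm failures above all point at the same thing: the OpenSSL sources and generated files expected under `externals\openssl-1.0.2d` are not where the projects look for them, either because the externals were never fetched or because the checkout's paths do not match what the project files expect. A minimal recovery sketch, using the `W:\Python-3.5.1` checkout path from the log (assumption: a 3.5-era source tree where `build.bat` accepts `-e` to fetch externals and `-d` for a Debug configuration):

```bat
rem Sketch only - run from a Developer Command Prompt; paths taken from the log
cd /D W:\Python-3.5.1
rem Populate externals\ (openssl, tcl/tk, tix, nasm, sqlite, ...)
PCbuild\get_externals.bat
rem Build Debug^|Win32; -e re-fetches any still-missing externals
PCbuild\build.bat -e -d
```

Once `build.bat` succeeds, opening `PCbuild\pcbuild.sln` in the same tree should find the same externals, since both use the relative `externals\` layout.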
12>------ Rebuild All started: Project: tix, Configuration: Debug Win32 ------ 11> launcher.c 1> _sre.c 1> _stat.c 1> _struct.c 12> md Debug_VC13 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixClass.obj ..\generic\tixClass.c 12> tixClass.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixCmds.obj ..\generic\tixCmds.c 12> tixCmds.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixCompat.obj ..\generic\tixCompat.c 12> tixCompat.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win 
-I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiImg.obj ..\generic\tixDiImg.c 12> tixDiImg.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiITxt.obj ..\generic\tixDiITxt.c 12> tixDiITxt.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiStyle.obj ..\generic\tixDiStyle.c 12> tixDiStyle.c 11> pywlauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pyw_d.exe 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic 
-IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDItem.obj ..\generic\tixDItem.c 12> tixDItem.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiText.obj ..\generic\tixDiText.c 12> tixDiText.c 1> _weakref.c 1> arraymodule.c 1> atexitmodule.c 1> audioop.c 1> binascii.c 1> Generating Code... 1> Compiling... 1> cmathmodule.c 1> _datetimemodule.c 1> errnomodule.c 1> faulthandler.c 1> gcmodule.c 10> pylauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\py_d.exe 1> hashtable.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixDiWin.obj ..\generic\tixDiWin.c 12> tixDiWin.c 1> itertoolsmodule.c 1> main.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib 
-IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixError.obj ..\generic\tixError.c 12> tixError.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixForm.obj ..\generic\tixForm.c 1> mathmodule.c 12> tixForm.c 1> md5module.c 1> mmapmodule.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixFormMisc.obj ..\generic\tixFormMisc.c 12> tixFormMisc.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGeometry.obj ..\generic\tixGeometry.c 12> 
tixGeometry.c 1> _opcode.c 1> _operator.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrid.obj ..\generic\tixGrid.c 1> parsermodule.c 12> tixGrid.c 1> posixmodule.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrData.obj ..\generic\tixGrData.c 12> tixGrData.c 1> rotatingtree.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrRC.obj ..\generic\tixGrRC.c 1> sha1module.c 12> tixGrRC.c 1> sha256module.c 1> sha512module.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win 
-I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrFmt.obj ..\generic\tixGrFmt.c 12> tixGrFmt.c 1> signalmodule.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrSel.obj ..\generic\tixGrSel.c 12> tixGrSel.c 1> Generating Code... 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixGrUtl.obj ..\generic\tixGrUtl.c 12> tixGrUtl.c 1> Compiling... 
1> symtablemodule.c 1> _threadmodule.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLCol.obj ..\generic\tixHLCol.c 12> tixHLCol.c 1> _tracemalloc.c 1> timemodule.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLHdr.obj ..\generic\tixHLHdr.c 12> tixHLHdr.c 1> xxsubtype.c 1> zipimport.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHLInd.obj ..\generic\tixHLInd.c 12> tixHLInd.c 1> zlibmodule.c 1> fileio.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic 
-IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixHList.obj ..\generic\tixHList.c 12> tixHList.c 1> bytesio.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixImgCmp.obj ..\generic\tixImgCmp.c 12> tixImgCmp.c 1> stringio.c 1> bufferedio.c 12>..\generic\tixImgCmp.c(169): warning C4090: 'function': different 'const' qualifiers 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 2 different from declaration 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 5 different from declaration 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixImgXpm.obj ..\generic\tixImgXpm.c 12> tixImgXpm.c 1> iobase.c 1> textio.c 12>..\generic\tixImgXpm.c(88): warning C4090: 'function': different 'const' qualifiers 12>..\generic\tixImgXpm.c(88): warning C4028: formal 
parameter 2 different from declaration 12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 5 different from declaration 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixInit.obj ..\generic\tixInit.c 1> _iomodule.c 12> tixInit.c 1> adler32.c 1> compress.c 1> crc32.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixList.obj ..\generic\tixList.c 1> deflate.c 12> tixList.c 1> infback.c 1> inffast.c 1> Generating Code... 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixMethod.obj ..\generic\tixMethod.c 1> Compiling... 
1> inflate.c 12> tixMethod.c 1> inftrees.c 1> trees.c 1> uncompr.c 1> zutil.c 1> _codecs_cn.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixNBFrame.obj ..\generic\tixNBFrame.c 12> tixNBFrame.c 1> _codecs_hk.c 1> _codecs_iso2022.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixOption.obj ..\generic\tixOption.c 12> tixOption.c 1> _codecs_jp.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixSmpLs.obj ..\generic\tixSmpLs.c 1> _codecs_kr.c 12> tixSmpLs.c 1> _codecs_tw.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program 
Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixScroll.obj ..\generic\tixScroll.c 12> tixScroll.c 1> multibytecodec.c 1> _winapi.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixTList.obj ..\generic\tixTList.c 12> tixTList.c 1> abstract.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixUtils.obj ..\generic\tixUtils.c 12> tixUtils.c 1> accu.c 1> boolobject.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib 
-IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWCmpt.obj ..\win\tixWCmpt.c 1> bytes_methods.c 12> tixWCmpt.c 1> bytearrayobject.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWidget.obj ..\generic\tixWidget.c 12> tixWidget.c 1> bytesobject.c 1> capsule.c 1> Generating Code... 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinDraw.obj ..\win\tixWinDraw.c 12> tixWinDraw.c 1> Compiling... 
1> cellobject.c 1> classobject.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinXpm.obj ..\win\tixWinXpm.c 12> tixWinXpm.c 1> codeobject.c 1> complexobject.c 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual Studio\VC98\include" -I..\win -I..\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic -IW:\Python-3.5.1\externals\tk-8.6.4.2\win -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix -FoDebug_VC13\tixWinWm.obj ..\win\tixWinWm.c 1> descrobject.c 12> tixWinWm.c 1> dictobject.c 12> link.exe -debug -debugtype:cv /RELEASE /NOLOGO /MACHINE:IX86 -dll W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib W:\Python-3.5.1\externals\tcl-core-8.6.4.2\win\Debug_VC13\tclstub86.lib kernel32.lib advapi32.lib user32.lib gdi32.lib comdlg32.lib -out:Debug_VC13\tix84g.dll @C:\Users\eddyq\AppData\Local\Temp\nmF029.tmp 12>LINK : fatal error LNK1181: cannot open input file 'W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib' 12>NMAKE : fatal error U1077: '"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\link.exe"' : return code '0x49d' 12> Stop. 
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: The command "setlocal
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: if not exist "W:\Python-3.5.1\externals\tcltk\lib\tix8.4.3\tix84g.dll" goto build
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: goto :eof
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: :build
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: set VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: cd /D "W:\Python-3.5.1\externals\tix-8.4.3.6\win"
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: nmake /nologo -f makefile.vc MACHINE=IX86 DEBUG=1 NODEBUG=0 TCL_DBGX=g TK_DBGX=g TCL_MAJOR=8 TCL_MINOR=6 TCL_PATCH=4 BUILDDIRTOP="Debug_VC13" TCL_DIR="W:\Python-3.5.1\externals\tcl-core-8.6.4.2" TK_DIR="W:\Python-3.5.1\externals\tk-8.6.4.2" INSTALL_DIR="W:\Python-3.5.1\externals\tcltk" all install
12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(44,5): error MSB3073: " exited with code 2.
1> enumobject.c
1> exceptions.c
1> fileobject.c
1> floatobject.c
1> frameobject.c
1> funcobject.c
1> genobject.c
1> iterobject.c
1> listobject.c
1> longobject.c
1> memoryobject.c
1> methodobject.c
1> moduleobject.c
1> namespaceobject.c
1> Generating Code...
1> Compiling...
1> object.c
1> obmalloc.c
1> odictobject.c
1> rangeobject.c
1> setobject.c
1> sliceobject.c
1> structseq.c
1> tupleobject.c
1> typeobject.c
1> unicodectype.c
1> unicodeobject.c
1> weakrefobject.c
1> acceler.c
1> bitset.c
1> firstsets.c
1> grammar.c
1> grammar1.c
1> listnode.c
1> metagrammar.c
1> myreadline.c
1> Generating Code...
1> Compiling...
1> node.c
1> parser.c
1> parsetok.c
1> tokenizer.c
1> invalid_parameter_handler.c
1> winreg.c
1> config.c
1> getpathp.c
1> msvcrtmodule.c
1> pyhash.c
1> random.c
1> _warnings.c
1> asdl.c
1> ast.c
1> bltinmodule.c
1> ceval.c
1> codecs.c
1> compile.c
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From brett at python.org  Wed Jan 13 19:14:05 2016
From: brett at python.org (Brett Cannon)
Date: Thu, 14 Jan 2016 00:14:05 +0000
Subject: [Python-Dev] Debugging using VS 2015
In-Reply-To: <005e01d14e5e$09660bb0$1c322310$@com>
References: <01f401d14dad$e984f1e0$bc8ed5a0$@com>
	<5695C54A.3070101@sdamon.com>
	<003201d14e3d$b6560f80$23022e80$@com>
	<005e01d14e5e$09660bb0$1c322310$@com>
Message-ID:

Then your paths might be off; I would try with a fresh checkout and make sure you can build Python cleanly before trying your embedded case to make sure your tooling is set up (e.g., you don't have Mercurial installed so that's a cause of one of your error messages).

On Wed, 13 Jan 2016 at 15:58 Eddy Quicksall wrote:

> Those files already exist:
>
> W:\Python-3.5.1>PCbuild\get_externals.bat
> Fetching external libraries...
> bzip2-1.0.6 already exists, skipping.
> nasm-2.11.06 already exists, skipping.
> openssl-1.0.2d already exists, skipping.
> sqlite-3.8.11.0 already exists, skipping.
> tcl-core-8.6.4.2 already exists, skipping.
> tk-8.6.4.2 already exists, skipping.
> tix-8.4.3.6 already exists, skipping.
> xz-5.0.5 already exists, skipping.
> Finished.
> W:\Python-3.5.1>
>
> *From:* Brett Cannon [mailto:brett at python.org]
> *Sent:* Wednesday, January 13, 2016 4:30 PM
> *To:* Eddy Quicksall; python-dev at python.org
> *Subject:* Re: [Python-Dev] Debugging using VS 2015
>
> Use PCbuild/get_externals.bat to download the external dependencies.
>
> On Wed, 13 Jan 2016 at 13:25 Eddy Quicksall wrote:
>
> I am using 3.5.1. I'm adding an extension for my special case.
>
> I know this list is for development and that's what I'm doing, but I would like to use VS 2015 to do my debugging. If there is another list I should use to get the build to work then please let me know the list.
>
> I have been able to build using "PCbuild\build.bat -e" but I'm not able to build using PCbuild\pcbuild.sln. It gives lots of errors.
>
> I'm using Debug/Win32.
>
> Below is the output. Note that lots of .c files are missing. Is there another package I should get (but then why would it build correctly with build.bat)? It could be that I need to add the equivalent of the "-e" option.
>
> 1>------ Rebuild All started: Project: pythoncore, Configuration: Debug Win32 ------
> 2>------ Rebuild All started: Project: tcl, Configuration: Debug Win32 ------
> 3>------ Rebuild All started: Project: ssleay, Configuration: Debug Win32 ------
> 4>------ Rebuild All started: Project: sqlite3, Configuration: Debug Win32 ------
> 1> Killing any running python_d.exe instances...
> 1> 'hg' is not recognized as an internal or external command,
> 1> operable program or batch file.
> 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(403,5): warning MSB3073: The command "hg id -b > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgbranch.txt"" exited with code 9009.
> 1> 'hg' is not recognized as an internal or external command,
> 1> operable program or batch file.
> 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(404,5): warning MSB3073: The command "hg id -i > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgversion.txt"" exited with code 9009.
> 1> 'hg' is not recognized as an internal or external command,
> 1> operable program or batch file.
> 1>W:\Python-3.5.1\PCbuild\pythoncore.vcxproj(405,5): warning MSB3073: The command "hg id -t > "W:\Python-3.5.1\PCbuild\obj\\win32_Debug\pythoncore\hgtag.txt"" exited with code 9009.
> 2>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. Skipping...
> 5>------ Rebuild All started: Project: python3dll, Configuration: Debug Win32 ------
> 1> _bisectmodule.c
> 5> python3dll.c
> 4> sqlite3.c
> 3> d1_both.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_both.c': No such file or directory
> 3> d1_lib.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_lib.c': No such file or directory
> 1> _codecsmodule.c
> 3> d1_pkt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_pkt.c': No such file or directory
> 3> d1_srtp.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\d1_srtp.c': No such file or directory
> 3> s2_clnt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_clnt.c': No such file or directory
> 3> s2_enc.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_enc.c': No such file or directory
> 3> s2_lib.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_lib.c': No such file or directory
> 3> s2_meth.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_meth.c': No such file or directory
> 3> s2_pkt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_pkt.c': No such file or directory
> 3> s2_srvr.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s2_srvr.c': No such file or directory
> 3> s23_clnt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_clnt.c': No such file or directory
> 3> s23_lib.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_lib.c': No such file or directory
> 3> s23_meth.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_meth.c': No such file or directory
> 3> s23_pkt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_pkt.c': No such file or directory
> 3> s23_srvr.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s23_srvr.c': No such file or directory
> 3> s3_both.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_both.c': No such file or directory
> 3> s3_cbc.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_cbc.c': No such file or directory
> 3> s3_clnt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_clnt.c': No such file or directory
> 3> s3_enc.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_enc.c': No such file or directory
> 3> s3_lib.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_lib.c': No such file or directory
> 3> Generating Code...
> 3> Compiling...
> 3> s3_meth.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_meth.c': No such file or directory
> 3> s3_pkt.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_pkt.c': No such file or directory
> 3> s3_srvr.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\s3_srvr.c': No such file or directory
> 3> ssl_algs.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_algs.c': No such file or directory
> 3> ssl_asn1.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_asn1.c': No such file or directory
> 3> ssl_cert.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_cert.c': No such file or directory
> 3> ssl_ciph.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_ciph.c': No such file or directory
> 3> ssl_err.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err.c': No such file or directory
> 3> ssl_err2.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_err2.c': No such file or directory
> 3> ssl_lib.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_lib.c': No such file or directory
> 3> ssl_rsa.c
> 3>c1 : fatal error C1083: Cannot open source file: 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_rsa.c': No such file or directory
> 3> ssl_sess.c
> 1> _collectionsmodule.c
> 3>c1 : fatal error C1083: Cannot open source file:
> 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\ssl_sess.c': No such file or > directory > > 3> t1_clnt.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_clnt.c': No such file or > directory > > 3> t1_enc.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_enc.c': No such file or > directory > > 3> t1_lib.c > > 1> _csv.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_lib.c': No such file or > directory > > 3> t1_meth.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_meth.c': No such file or > directory > > 3> t1_reneg.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_reneg.c': No such file or > directory > > 3> t1_srvr.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\t1_srvr.c': No such file or > directory > > 3> tls_srp.c > > 3>c1 : fatal error C1083: Cannot open source file: > 'W:\Python-3.5.1\externals\openssl-1.0.2d\ssl\tls_srp.c': No such file or > directory > > 3> Generating Code... 
> > 6>------ Rebuild All started: Project: tk, Configuration: Debug Win32 > ------ > > 1> _functoolsmodule.c > > 1> _heapqmodule.c > > 1> _json.c > > 1> _localemodule.c > > 4> Creating library W:\Python-3.5.1\PCBuild\win32\sqlite3_d.lib and > object W:\Python-3.5.1\PCBuild\win32\sqlite3_d.exp > > 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_dstub.lib > and object W:\Python-3.5.1\PCBuild\win32\python3_dstub.exp > > 1> _lsprof.c > > 5> Creating library W:\Python-3.5.1\PCBuild\win32\python3_d.lib and > object W:\Python-3.5.1\PCBuild\win32\python3_d.exp > > 5> python3dll.vcxproj -> W:\Python-3.5.1\PCBuild\win32\python3_d.dll > > 7>------ Rebuild All started: Project: libeay, Configuration: Debug Win32 > ------ > > 6>C:\Program Files > (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.MakeFile.Targets(41,5): > warning MSB8005: The property 'NMakeReBuildCommandLine' doesn't exist. > Skipping... > > 8>------ Skipped Rebuild All: Project: bdist_wininst, Configuration: Debug > Win32 ------ > > 8>Project not selected to build for this solution configuration > > 9>------ Skipped Rebuild All: Project: xxlimited, Configuration: Release > Win32 ------ > > 9>Project not selected to build for this solution configuration > > 10>------ Rebuild All started: Project: pylauncher, Configuration: Debug > Win32 ------ > > 1> _math.c > > 4> sqlite3.vcxproj -> W:\Python-3.5.1\PCBuild\win32\sqlite3_d.dll > > 11>------ Rebuild All started: Project: pywlauncher, Configuration: Debug > Win32 ------ > > 10> launcher.c > > 1> _pickle.c > > 7> nasm: fatal: unable to open input file > `W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm' > > 1> _randommodule.c > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: The command > "setlocal > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: set > PATH=W:\Python-3.5.1\externals\\nasm-2.11.06\;%PATH% > > 7>W:\Python-3.5.1\PCbuild\openssl.props(69,5): error MSB3073: nasm.exe -f > win32 -o > 
"W:\Python-3.5.1\externals\openssl-1.0.2d\tmp\\win32_Debug\libeay\aes-586.obj" > "W:\Python-3.5.1\externals\openssl-1.0.2d\tmp32\aes-586.asm"" exited with > code 1. > > 12>------ Rebuild All started: Project: tix, Configuration: Debug Win32 > ------ > > 11> launcher.c > > 1> _sre.c > > 1> _stat.c > > 1> _struct.c > > 12> md Debug_VC13 > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixClass.obj ..\generic\tixClass.c > > 12> tixClass.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixCmds.obj ..\generic\tixCmds.c > > 12> tixCmds.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS 
-DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixCompat.obj ..\generic\tixCompat.c > > 12> tixCompat.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiImg.obj ..\generic\tixDiImg.c > > 12> tixDiImg.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiITxt.obj ..\generic\tixDiITxt.c > > 12> tixDiITxt.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiStyle.obj ..\generic\tixDiStyle.c > > 12> tixDiStyle.c > > 11> pywlauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\pyw_d.exe > > 12> cl.exe -DWIN32 -D_WIN32 
-D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDItem.obj ..\generic\tixDItem.c > > 12> tixDItem.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiText.obj ..\generic\tixDiText.c > > 12> tixDiText.c > > 1> _weakref.c > > 1> arraymodule.c > > 1> atexitmodule.c > > 1> audioop.c > > 1> binascii.c > > 1> Generating Code... > > 1> Compiling... 
> > 1> cmathmodule.c > > 1> _datetimemodule.c > > 1> errnomodule.c > > 1> faulthandler.c > > 1> gcmodule.c > > 10> pylauncher.vcxproj -> W:\Python-3.5.1\PCBuild\win32\py_d.exe > > 1> hashtable.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixDiWin.obj ..\generic\tixDiWin.c > > 12> tixDiWin.c > > 1> itertoolsmodule.c > > 1> main.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixError.obj ..\generic\tixError.c > > 12> tixError.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixForm.obj ..\generic\tixForm.c > > 1> mathmodule.c 
> > 12> tixForm.c > > 1> md5module.c > > 1> mmapmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixFormMisc.obj ..\generic\tixFormMisc.c > > 12> tixFormMisc.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGeometry.obj ..\generic\tixGeometry.c > > 12> tixGeometry.c > > 1> _opcode.c > > 1> _operator.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrid.obj ..\generic\tixGrid.c > > 1> parsermodule.c > > 12> tixGrid.c > > 1> posixmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD 
-FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrData.obj ..\generic\tixGrData.c > > 12> tixGrData.c > > 1> rotatingtree.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrRC.obj ..\generic\tixGrRC.c > > 1> sha1module.c > > 12> tixGrRC.c > > 1> sha256module.c > > 1> sha512module.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrFmt.obj ..\generic\tixGrFmt.c > > 12> tixGrFmt.c > > 1> signalmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > 
-IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrSel.obj ..\generic\tixGrSel.c > > 12> tixGrSel.c > > 1> Generating Code... > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixGrUtl.obj ..\generic\tixGrUtl.c > > 12> tixGrUtl.c > > 1> Compiling... 
> > 1> symtablemodule.c > > 1> _threadmodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLCol.obj ..\generic\tixHLCol.c > > 12> tixHLCol.c > > 1> _tracemalloc.c > > 1> timemodule.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLHdr.obj ..\generic\tixHLHdr.c > > 12> tixHLHdr.c > > 1> xxsubtype.c > > 1> zipimport.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHLInd.obj ..\generic\tixHLInd.c > > 12> tixHLInd.c > > 1> zlibmodule.c > > 1> fileio.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 
-nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixHList.obj ..\generic\tixHList.c > > 12> tixHList.c > > 1> bytesio.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixImgCmp.obj ..\generic\tixImgCmp.c > > 12> tixImgCmp.c > > 1> stringio.c > > 1> bufferedio.c > > 12>..\generic\tixImgCmp.c(169): warning C4090: 'function': different > 'const' qualifiers > > 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 2 > different from declaration > > 12>..\generic\tixImgCmp.c(169): warning C4028: formal parameter 5 > different from declaration > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > 
-FoDebug_VC13\tixImgXpm.obj ..\generic\tixImgXpm.c > > 12> tixImgXpm.c > > 1> iobase.c > > 1> textio.c > > 12>..\generic\tixImgXpm.c(88): warning C4090: 'function': different > 'const' qualifiers > > 12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 2 different > from declaration > > 12>..\generic\tixImgXpm.c(88): warning C4028: formal parameter 5 different > from declaration > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixInit.obj ..\generic\tixInit.c > > 1> _iomodule.c > > 12> tixInit.c > > 1> adler32.c > > 1> compress.c > > 1> crc32.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixList.obj ..\generic\tixList.c > > 1> deflate.c > > 12> tixList.c > > 1> infback.c > > 1> inffast.c > > 1> Generating Code... 
> > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixMethod.obj ..\generic\tixMethod.c > > 1> Compiling... > > 1> inflate.c > > 12> tixMethod.c > > 1> inftrees.c > > 1> trees.c > > 1> uncompr.c > > 1> zutil.c > > 1> _codecs_cn.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixNBFrame.obj ..\generic\tixNBFrame.c > > 12> tixNBFrame.c > > 1> _codecs_hk.c > > 1> _codecs_iso2022.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixOption.obj ..\generic\tixOption.c > > 12> tixOption.c > > 1> _codecs_jp.c > > 12> cl.exe -DWIN32 -D_WIN32 
-D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixSmpLs.obj ..\generic\tixSmpLs.c > > 1> _codecs_kr.c > > 12> tixSmpLs.c > > 1> _codecs_tw.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixScroll.obj ..\generic\tixScroll.c > > 12> tixScroll.c > > 1> multibytecodec.c > > 1> _winapi.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixTList.obj ..\generic\tixTList.c > > 12> tixTList.c > > 1> abstract.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > 
Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixUtils.obj ..\generic\tixUtils.c > > 12> tixUtils.c > > 1> accu.c > > 1> boolobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWCmpt.obj ..\win\tixWCmpt.c > > 1> bytes_methods.c > > 12> tixWCmpt.c > > 1> bytearrayobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWidget.obj ..\generic\tixWidget.c > > 12> tixWidget.c > > 1> bytesobject.c > > 1> capsule.c > > 1> Generating Code... 
> > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinDraw.obj ..\win\tixWinDraw.c > > 12> tixWinDraw.c > > 1> Compiling... > > 1> cellobject.c > > 1> classobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinXpm.obj ..\win\tixWinXpm.c > > 12> tixWinXpm.c > > 1> codeobject.c > > 1> complexobject.c > > 12> cl.exe -DWIN32 -D_WIN32 -D_MT -DSTDC_HEADERS -D_DLL -c -W3 -nologo > -MD -FpDebug_VC13\ -Od -Zi -I"c:\Program Files\Microsoft Visual > Studio\VC98\include" -I..\win -I..\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tk-8.6.4.2\win > -IW:\Python-3.5.1\externals\tk-8.6.4.2\xlib > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\generic > -IW:\Python-3.5.1\externals\tcl-core-8.6.4.2\win -D__WIN32__ > -D_CRT_SECURE_NO_WARNINGS -DUSE_TCL_STUBS=1 -DUSE_TK_STUBS=1 -DBUILD_tix > -FoDebug_VC13\tixWinWm.obj ..\win\tixWinWm.c > > 1> descrobject.c > > 12> tixWinWm.c > > 1> dictobject.c > > 12> link.exe -debug -debugtype:cv /RELEASE /NOLOGO /MACHINE:IX86 -dll > 
W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib
> W:\Python-3.5.1\externals\tcl-core-8.6.4.2\win\Debug_VC13\tclstub86.lib
> kernel32.lib advapi32.lib user32.lib gdi32.lib comdlg32.lib
> -out:Debug_VC13\tix84g.dll @C:\Users\eddyq\AppData\Local\Temp\nmF029.tmp
>
> 12>LINK : fatal error LNK1181: cannot open input file
> 'W:\Python-3.5.1\externals\tk-8.6.4.2\win\Debug_VC13\tkstub86.lib'
>
> 12>NMAKE : fatal error U1077: '"C:\Program Files (x86)\Microsoft Visual
> Studio 14.0\VC\bin\link.exe"' : return code '0x49d'
>
> 12> Stop.
>
> 12>C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Microsoft.Make
>

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From paulson at busiq.com  Wed Jan 13 19:18:21 2016
From: paulson at busiq.com (Matthew Paulson)
Date: Wed, 13 Jan 2016 19:18:21 -0500
Subject: [Python-Dev] Discussion related to memory leaks requested
In-Reply-To: <6AFE80F6-AFF2-4B2E-BA98-E3F24C4CCF73@yahoo.com>
References: <5696A647.6090501@busiq.com> <5696D485.2030607@busiq.com>
 <6AFE80F6-AFF2-4B2E-BA98-E3F24C4CCF73@yahoo.com>
Message-ID: <5696E94D.3060508@busiq.com>

Hi Andrew:

These are all good points, and I defer to your experience -- I am new to
Python internals, but the fact remains that after multiple iterations of
our embedded test case, we are seeing continued allocations (DS2015) and
growth of the working set (Windows Task Manager). If you are pooling
resources on the free list, wouldn't you expect these items to get reused
and for things to stabilize after a while? We're not seeing that.

I think Victor's suggestion of a very simple test case is probably the
best idea. I'll try to put that together in the next few days and if it
also demonstrates the problem, then I'll submit it here.

Thanks for your time and help.

Best,

Matt

On 1/13/2016 6:45 PM, Andrew Barnert wrote:
> On Jan 13, 2016, at 14:49, Matthew Paulson <paulson at busiq.com> wrote:
>
>> Hi Victor:
>>
>> No, I'm using the new heap analysis functions in DS2015.
> > Isn't that going to report any memory that Python's higher level > allocators hold in their freelists as leaked, even though it isn't leaked? > >> We think we have found one issue. In the following sequence, dict has >> no side effects, yet it is used -- unless someone can shed light on >> why dict is used in this case: > > Where do you see an issue here? The dict will have one ref, so the > decref at the end should return it to the freelist. > > Also, it looks like there _is_ a side effect here. When you add a > bunch of elements to a dict, it grows. When you delete a bunch of > elements, it generally doesn't shrink. But when you clear the dict, it > does shrink. So, copying it to a temporary dict, clearing it, updating > it from the temporary dict, and then releasing the temporary dict > should force it to shrink. > > So, the overall effect should be that you have a smaller hash table > for the builtins dict, and a chunk of memory sitting on the freelists > ready to be reused. If your analyzer is showing the freelists as > leaked, this will look like a net leak rather than a net recovery, but > that's just a problem in the analyzer. > > Of course I could be wrong, but I think the first step is to rule out > the possibility that you're measuring the wrong thing... > >> /* Clear the modules dict. */ >> PyDict_Clear(modules); >> /* Restore the original builtins dict, to ensure that any >> user data gets cleared. */ >> dict = PyDict_Copy(interp->builtins); >> if (dict == NULL) >> PyErr_Clear(); >> PyDict_Clear(interp->builtins); >> if (PyDict_Update(interp->builtins, interp->builtins_copy)) >> PyErr_Clear(); >> Py_XDECREF(dict); >> >> And removing dict from this sequence seems to have fixed one of the >> issues, yielding 14k per iteration. > >> Simple program: Good idea. We will try that -- right now it's >> embedded in a more complex environment, but we have tried to strip it >> down to a very simple sequence. 
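Andrew's point here -- that deleting keys leaves the dict's enlarged hash table in place, while clear() swaps in a minimal empty table -- can be demonstrated directly from Python (a quick illustrative sketch; exact byte counts vary by CPython version):

```python
import sys

# Grow three identical dicts.
grown   = {i: i for i in range(10000)}
deleted = {i: i for i in range(10000)}
cleared = {i: i for i in range(10000)}

for k in list(deleted):
    del deleted[k]          # empties the dict but keeps the large table
cleared.clear()             # replaces the table with a minimal empty one

print(sys.getsizeof(grown), sys.getsizeof(deleted), sys.getsizeof(cleared))
```

On a typical CPython build the cleared dict reports roughly the size of a brand-new empty dict, while the dict emptied with del still reports close to the size of the fully populated one.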
>>
>> The next item on our list is memory that is not getting freed after
>> running a simple string. It's in the parsetok sequence -- it seems
>> that the syntax tree is not getting cleared -- but this opinion is
>> preliminary.
>>
>> Best,
>>
>> Matt
>>
>> On 1/13/2016 5:10 PM, Victor Stinner wrote:
>>> Hi,
>>>
>>> 2016-01-13 20:32 GMT+01:00 Matthew Paulson:
>>>> I've spent some time performing memory leak analysis while using
>>>> Python in an embedded configuration.
>>> Hum, did you try tracemalloc?
>>>
>>> https://docs.python.org/dev/library/tracemalloc.html
>>> https://pytracemalloc.readthedocs.org/
>>>
>>>> Is there someone in the group that would like to discuss this topic?
>>>> There seems to be other leaks as well. I'm new to Python-dev, but
>>>> willing to help or work with someone who is more familiar with these
>>>> areas than I.
>>> Are you able to reproduce the leak with a simple program?
>>>
>>> Victor
>>
>> --
>>
>> _______________________________________________
>> Python-Dev mailing list
>> Python-Dev at python.org
>> https://mail.python.org/mailman/listinfo/python-dev
>> Unsubscribe:
>> https://mail.python.org/mailman/options/python-dev/abarnert%40yahoo.com

-- 

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: MattSig.JPG Type: image/jpeg Size: 38491 bytes Desc: not available URL: From steve.dower at python.org Wed Jan 13 19:50:46 2016 From: steve.dower at python.org (Steve Dower) Date: Wed, 13 Jan 2016 16:50:46 -0800 Subject: [Python-Dev] Debugging using VS 2015 In-Reply-To: <005e01d14e5e$09660bb0$1c322310$@com> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> <005e01d14e5e$09660bb0$1c322310$@com> Message-ID: <5696F0E6.8090700@python.org> On 13Jan2016 1556, Eddy Quicksall wrote: > Those files already exist: > > W:\Python-3.5.1>PCbuild\get_externals.bat > Fetching external libraries... > bzip2-1.0.6 already exists, skipping. > nasm-2.11.06 already exists, skipping. > openssl-1.0.2d already exists, skipping. > sqlite-3.8.11.0 already exists, skipping. > tcl-core-8.6.4.2 already exists, skipping. > tk-8.6.4.2 already exists, skipping. > tix-8.4.3.6 already exists, skipping. > xz-5.0.5 already exists, skipping. > Finished. > W:\Python-3.5.1> You may need to delete the externals/openssl-1.0.2d directory and run get_externals.bat again. Sometimes when you try to build without having downloaded the externals it creates enough directories that the script thinks it exists when it isn't all there. 
Cheers, Steve From ncoghlan at gmail.com Wed Jan 13 20:58:47 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 14 Jan 2016 11:58:47 +1000 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5696E94D.3060508@busiq.com> References: <5696A647.6090501@busiq.com> <5696D485.2030607@busiq.com> <6AFE80F6-AFF2-4B2E-BA98-E3F24C4CCF73@yahoo.com> <5696E94D.3060508@busiq.com> Message-ID: On 14 January 2016 at 10:18, Matthew Paulson wrote: > Hi Andrew: > > These are all good points, and I defer to your experience -- I am new to > python internals, but the fact remains that after multiple iterations of > our embedded test case, we are seeing continued allocations (DS2015) and > growth of the working set (windows task manager). If your are pooling > resources on the free list, wouldn't you expect these items to get reused > and for things to stabilize after a while? We're not seeing that. > > I think Victor's suggestion of a very simple test case is probably the > best idea. I'll try to put that together in the next few days and if it > also demonstrates the problem, then I'll submit it here. > If you want to throw your debugger at it, there's an existing subinterpreter test case in _testembed that should exhibit any Initialize/Finalize leaks: https://github.com/python/cpython/blob/master/Programs/_testembed.c#L36 However, if there is one, our existing automated leak monitoring unfortunately wouldn't pick it up, as the embedding tests run in a subprocess rather than the main test process. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From trent at trent.me  Wed Jan 13 22:47:15 2016
From: trent at trent.me (Trent Nelson)
Date: Thu, 14 Jan 2016 03:47:15 +0000
Subject: [Python-Dev] Discussion related to memory leaks requested
In-Reply-To: <5696E94D.3060508@busiq.com>
References: <5696A647.6090501@busiq.com> <5696D485.2030607@busiq.com>
 <6AFE80F6-AFF2-4B2E-BA98-E3F24C4CCF73@yahoo.com> <5696E94D.3060508@busiq.com>
Message-ID: 

Gflags/umdh is pretty useful on Windows, I used it to track down a few
quirky PyParallel memory leaks. Steps:

1. Enable global flags: gflags -i python.exe +ust
2. Launch Python.
3. Enable the umdh tracer: umdh -p:<pid> -f:d1.log
4. Kill it after a short run.
5. Re-launch Python.
6. Enable it again: umdh -p:<pid> -f:d2.log
7. Let it run for longer, long enough to make sure it's leaking memory.
8. Kill it, then generate a dump file: umdh -d d1.log d2.log > dump.txt

(Those steps were pretty specific to my particular situation, but it
should at least be a reasonable starting point for what to google to
find out more.)
Here are two sample outputs that pin-pointed the exact leak path: + 49116 ( 49116 - 0) 6 allocs BackTrace9763CA0 + 6 ( 6 - 0) BackTrace9763CA0 allocations ntdll!RtlpCallInterceptRoutine+40 ntdll!RtlAllocateHeap+79846 SQLSRV32!SQLAllocateMemory+26 SQLSRV32!SQLAllocConnect+F6 SQLSRV32!SQLAllocHandle+83 ODBC32!RetcodeDriverInit+2D9 ODBC32!SQLInternalDriverConnectW+2F ODBC32!CDispenser::CreateResource+DB comsvcs!CHolder::SafeDispenserDriver::CreateResource+43 comsvcs!CHolder::AllocResource+24D ODBC32!CDispenser::TryAllocResource+6E ODBC32!CDispenser::GetActiveConnection+72 ODBC32!SQLDriverConnectW+9D4 pyodbc!Connect+14F (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\connection.cpp, 85) pyodbc!Connection_New+CD (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\connection.cpp, 166) pyodbc!mod_connect+579 (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\pyodbcmodule.cpp, 378) python33!PyCFunction_Call+F3 (c:\users\trent\home\src\pyparallel\objects\methodobject.c, 84) python33!call_function+371 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4130) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!PyEval_EvalCodeEx+B4D (c:\users\trent\home\src\pyparallel\python\ceval.c, 3500) python33!function_call+1BB (c:\users\trent\home\src\pyparallel\objects\funcobject.c, 639) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!method_call+F9
(c:\users\trent\home\src\pyparallel\objects\classobject.c, 353) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!PyEval_CallObjectWithKeywords+16C (c:\users\trent\home\src\pyparallel\python\ceval.c, 4011) python33!PxSocket_IOLoop+1249 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 9128) + 48432 ( 48432 - 0) 6 allocs BackTrace97635E0 + 6 ( 6 - 0) BackTrace97635E0 allocations ntdll!RtlpCallInterceptRoutine+40 ntdll!RtlAllocateHeap+79846 SQLSRV32!SQLAllocateMemory+26 SQLSRV32!SQLAllocConnect+4D SQLSRV32!SQLAllocHandle+83 ODBC32!RetcodeDriverInit+2D9 ODBC32!SQLInternalDriverConnectW+2F ODBC32!CDispenser::CreateResource+DB comsvcs!CHolder::SafeDispenserDriver::CreateResource+43 comsvcs!CHolder::AllocResource+24D ODBC32!CDispenser::TryAllocResource+6E ODBC32!CDispenser::GetActiveConnection+72 ODBC32!SQLDriverConnectW+9D4 pyodbc!Connect+14F (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\connection.cpp, 85) pyodbc!Connection_New+CD (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\connection.cpp, 166) pyodbc!mod_connect+579 (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\pyodbcmodule.cpp, 378) python33!PyCFunction_Call+F3 (c:\users\trent\home\src\pyparallel\objects\methodobject.c, 84) python33!call_function+371 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4130) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) 
python33!PyEval_EvalCodeEx+B4D (c:\users\trent\home\src\pyparallel\python\ceval.c, 3500) python33!function_call+1BB (c:\users\trent\home\src\pyparallel\objects\funcobject.c, 639) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!method_call+F9 (c:\users\trent\home\src\pyparallel\objects\classobject.c, 353) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!PyEval_CallObjectWithKeywords+16C (c:\users\trent\home\src\pyparallel\python\ceval.c, 4011) python33!PxSocket_IOLoop+1249 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 9128) You can see that the leak is coming from SQLAllocateMemory in this case, which we trace back to pyodbc Connection_New(). In this particular case, we were continually creating new connections, but never deallocating them. Another example: + 9420800 ( 9420800 - 0) 2300 allocs BackTrace8E96BE0 + 2300 ( 2300 - 0) BackTrace8E96BE0 allocations ntdll!RtlpCallInterceptRoutine+40 ntdll!RtlAllocateHeap+79846 python33!_PyTLSHeap_Init+9E (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 3040) python33!_PyTLSHeap_Malloc+B0 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 3124) python33!_PyHeap_Malloc+55 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 3170) python33!PyUnicode_New+149 (c:\users\trent\home\src\pyparallel\objects\unicodeobject.c, 1051) python33!PyUnicode_DecodeUTF8Stateful+D2 (c:\users\trent\home\src\pyparallel\objects\unicodeobject.c, 4910) python33!PyUnicode_InternFromString+16 (c:\users\trent\home\src\pyparallel\objects\unicodeobject.c, 14505) python33!PyObject_GetAttrString+2E (c:\users\trent\home\src\pyparallel\objects\object.c, 881) pyodbc!lowercase+18 (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\pyodbcmodule.h, 69) pyodbc!execute+772 (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\cursor.cpp, 869) pyodbc!Cursor_execute+D4 (c:\users\trent\home\src\pyparallel\contrib\pyodbc\src\cursor.cpp, 945) 
python33!PyCFunction_Call+12D (c:\users\trent\home\src\pyparallel\objects\methodobject.c, 117) python33!call_function+371 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4130) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!fast_function+113 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4219) python33!call_function+529 (c:\users\trent\home\src\pyparallel\python\ceval.c, 4152) python33!PyEval_EvalFrameEx+356C (c:\users\trent\home\src\pyparallel\python\ceval.c, 2745) python33!PyEval_EvalCodeEx+B4D (c:\users\trent\home\src\pyparallel\python\ceval.c, 3500) python33!function_call+1BB (c:\users\trent\home\src\pyparallel\objects\funcobject.c, 639) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!method_call+F9 (c:\users\trent\home\src\pyparallel\objects\classobject.c, 353) python33!PyObject_Call+7C (c:\users\trent\home\src\pyparallel\objects\abstract.c, 2036) python33!PyEval_CallObjectWithKeywords+16C (c:\users\trent\home\src\pyparallel\python\ceval.c, 4011) python33!PxSocket_IOLoop+1249 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 9128) python33!PxSocket_IOCallback+295 (c:\users\trent\home\src\pyparallel\python\pyparallel.c, 10612) KERNEL32!BasepTpIoCallback+59 ntdll!TppIopExecuteCallback+182 ntdll!TppWorkerThread+8B4 We can trace this leak back to lowercase(), which was interning a string. At this point, we were intercepting string interning and making it thread local allocation, but those allocations were never being freed. 
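The same before/after diffing workflow exists at the Python level in tracemalloc, which Victor pointed to earlier in the thread. A minimal illustrative sketch (workload() is just a stand-in for the code under suspicion):

```python
import tracemalloc

def workload():
    # Stand-in for the operation suspected of leaking.
    return {i: str(i) for i in range(1000)}

tracemalloc.start()
before = tracemalloc.take_snapshot()

kept = [workload() for _ in range(5)]   # deliberately keep references alive

after = tracemalloc.take_snapshot()
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)   # file:line plus size and count deltas, biggest first
```

Note that tracemalloc only sees allocations made through Python's own allocators, so C-level leaks that bypass them will not appear in the diff.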
From: Python-Dev [mailto:python-dev-bounces+trent=snakebite.org at python.org] On Behalf Of Matthew Paulson
Sent: Wednesday, January 13, 2016 7:18 PM
To: Andrew Barnert
Cc: Python Dev ; Gary F. Desrochers
Subject: Re: [Python-Dev] Discussion related to memory leaks requested

[...]
Victor -- _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/abarnert%40yahoo.com -- [cid:image001.jpg at 01D14E51.93AED0C0] -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image001.jpg Type: image/jpeg Size: 38491 bytes Desc: image001.jpg URL: From benjamin at python.org Thu Jan 14 00:34:29 2016 From: benjamin at python.org (Benjamin Peterson) Date: Wed, 13 Jan 2016 21:34:29 -0800 Subject: [Python-Dev] Modifying the self-signed.pythontest.net certificate In-Reply-To: References: Message-ID: <1452749669.3075604.491708178.7CD07DB8@webmail.messagingengine.com> On Wed, Jan 13, 2016, at 13:15, Martin Panter wrote: > In order to fix the SSL test suite > , I would like to modify the > certificate used by https://self-signed.pythontest.net. So far I have > a patch ready for the pythontestdotnet repository, but I want to know > if I can just push to that repository, or if other steps are required. It should suffice to update that repo. I suppose the server might have to be kicked, too, as it's a cert change. Ping this thread if that doesn't work. From benjamin at python.org Thu Jan 14 00:42:24 2016 From: benjamin at python.org (Benjamin Peterson) Date: Wed, 13 Jan 2016 21:42:24 -0800 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5696A647.6090501@busiq.com> References: <5696A647.6090501@busiq.com> Message-ID: <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com> This is a "well-known" issue. Parts of the interpreter (and especially, extension modules) cheerfully stash objects in global variables with no way to clean them up. Fixing this is a large project, which probably involves implementing PEP 489. 
On Wed, Jan 13, 2016, at 11:32, Matthew Paulson wrote:
> Hi:
>
> I've spent some time performing memory leak analysis while using Python
> in an embedded configuration.
>
> The pattern is:
>
> Py_Initialize();
>
> ... run empty python source file ...
>
> Py_Finalize();
>
>
> I've identified several suspect areas including dictionary maintenance in
> import.c:~ 414
>
>     /* Clear the modules dict. */
>     PyDict_Clear(modules);
>     /* Restore the original builtins dict, to ensure that any
>        user data gets cleared. */
>     dict = PyDict_Copy(interp->builtins);
>     if (dict == NULL)
>         PyErr_Clear();
>     PyDict_Clear(interp->builtins);
>     if (PyDict_Update(interp->builtins, interp->builtins_copy))
>         PyErr_Clear();
>     Py_XDECREF(dict);
>     /* Clear module dict copies stored in the interpreter state */
>
>
> Is there someone in the group that would like to discuss this topic?
> There seems to be other leaks as well. I'm new to Python-dev, but
> willing to help or work with someone who is more familiar with these
> areas than I.
>
> Thanks,
>
> Matt
>
> --
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/benjamin%40python.org
> Email had 1 attachment:
> + MattSig.JPG
>   52k (image/jpeg)

From vadmium+py at gmail.com  Thu Jan 14 02:31:02 2016
From: vadmium+py at gmail.com (Martin Panter)
Date: Thu, 14 Jan 2016 07:31:02 +0000
Subject: [Python-Dev] Modifying the self-signed.pythontest.net certificate
In-Reply-To: <1452749669.3075604.491708178.7CD07DB8@webmail.messagingengine.com>
References: <1452749669.3075604.491708178.7CD07DB8@webmail.messagingengine.com>
Message-ID: 

On 14 January 2016 at 05:34, Benjamin Peterson wrote:
> On Wed, Jan 13, 2016, at 13:15, Martin Panter wrote:
>> In order to fix the SSL test suite
>> , I would like to modify the
>> certificate used by https://self-signed.pythontest.net.
So far I have
>> a patch ready for the pythontestdotnet repository, but I want to know
>> if I can just push to that repository, or if other steps are required.
>
> It should suffice to update that repo. I suppose the server might have
> to be kicked, too, as it's a cert change. Ping this thread if that
> doesn't work.

It seems to have worked, thanks. Not immediately, but after waiting
~20 minutes the certificate is now updated. I am working on an update
for all branches 3.2+ and 2.7; I expect test_httplib will be broken
until I fix it.

BTW is giving me a 403 Forbidden
response, but it was doing this before I pushed my change, so I think
it's not my fault :)

From benjamin at python.org  Thu Jan 14 02:46:50 2016
From: benjamin at python.org (Benjamin Peterson)
Date: Wed, 13 Jan 2016 23:46:50 -0800
Subject: [Python-Dev] Modifying the self-signed.pythontest.net certificate
In-Reply-To: 
References: <1452749669.3075604.491708178.7CD07DB8@webmail.messagingengine.com>
Message-ID: <1452757610.3100460.491767618.0B033637@webmail.messagingengine.com>

On Wed, Jan 13, 2016, at 23:31, Martin Panter wrote:
> On 14 January 2016 at 05:34, Benjamin Peterson
> wrote:
> > On Wed, Jan 13, 2016, at 13:15, Martin Panter wrote:
> >> In order to fix the SSL test suite
> >> , I would like to modify the
> >> certificate used by https://self-signed.pythontest.net. So far I have
> >> a patch ready for the pythontestdotnet repository, but I want to know
> >> if I can just push to that repository, or if other steps are required.
> >
> > It should suffice to update that repo. I suppose the server might have
> > to be kicked, too, as it's a cert change. Ping this thread if that
> > doesn't work.
>
> It seems to have worked, thanks. Not immediately, but after waiting
> ~20 minutes the certificate is now updated. I am working on an update
> for all branches 3.2+ and 2.7; I expect test_httplib will be broken
> until I fix it.

Thank you for taking care of this issue.
>
> BTW is giving me a 403 Forbidden
> response, but it was doing this before I pushed my change, so I think
> it's not my fault :)

Fixed.

From ncoghlan at gmail.com  Thu Jan 14 04:45:02 2016
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 14 Jan 2016 19:45:02 +1000
Subject: [Python-Dev] Discussion related to memory leaks requested
In-Reply-To: <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com>
References: <5696A647.6090501@busiq.com>
 <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com>
Message-ID: 

On 14 January 2016 at 15:42, Benjamin Peterson wrote:
> This is a "well-known" issue. Parts of the interpreter (and especially,
> extension modules) cheerfully stash objects in global variables with no
> way to clean them up. Fixing this is a large project, which probably
> involves implementing PEP 489.

The actual multi-phase extension module import system from 489 was
implemented for 3.5, but indeed, the modules with stashed global state
haven't been converted yet. I didn't think we loaded any of those by
default, though...

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia

From encukou at gmail.com  Thu Jan 14 05:16:34 2016
From: encukou at gmail.com (Petr Viktorin)
Date: Thu, 14 Jan 2016 11:16:34 +0100
Subject: [Python-Dev] Discussion related to memory leaks requested
In-Reply-To: 
References: <5696A647.6090501@busiq.com>
 <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com>
Message-ID: <56977582.2090109@gmail.com>

On 01/14/2016 10:45 AM, Nick Coghlan wrote:
> On 14 January 2016 at 15:42, Benjamin Peterson wrote:
>> This is a "well-known" issue. Parts of the interpreter (and especially,
>> extension modules) cheerfully stash objects in global variables with no
>> way to clean them up. Fixing this is a large project, which probably
>> involves implementing PEP 489.
>
> The actual multi-phase extension module import system from 489 was
> implemented for 3.5, but indeed, the modules with stashed global state
> haven't been converted yet.

The hairy details on why the global variables haven't yet gone away are
on import-sig [0]. Nick suggested a workable solution there that I
really need to go back to and implement.

[0] https://mail.python.org/pipermail/import-sig/2015-July/001022.html

From skrah.temporarily at gmail.com  Thu Jan 14 05:37:59 2016
From: skrah.temporarily at gmail.com (Stefan Krah)
Date: Thu, 14 Jan 2016 10:37:59 +0000 (UTC)
Subject: [Python-Dev] Discussion related to memory leaks requested
References: <5696A647.6090501@busiq.com>
 <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com>
 <56977582.2090109@gmail.com>
Message-ID: 

Petr Viktorin <encukou at gmail.com> writes:
> The hairy details on why the global variables haven't yet gone away are
> on import-sig [0]. Nick suggested a workable solution there that I
> really need to go back to and implement.
>
> [0] https://mail.python.org/pipermail/import-sig/2015-July/001022.html

I want to add here that existing schemes for eliminating global variables
are inefficient (20% speed hit for _decimal), so a complete solution would
have to address that as well.

Stefan Krah

From brett at python.org  Thu Jan 14 13:36:38 2016
From: brett at python.org (Brett Cannon)
Date: Thu, 14 Jan 2016 18:36:38 +0000
Subject: [Python-Dev] PEP 506 secrets module
In-Reply-To: <20151016005711.GC11980@ando.pearwood.info>
References: <20151016005711.GC11980@ando.pearwood.info>
Message-ID: 

I noticed an article about default rand usage in Go
from the Go Weekly newsletter and it reminded me about PEP 506 and the
secrets module. That's when I noticed that the PEP is still open. What is
the current blocker on the PEP?

On Thu, 15 Oct 2015 at 17:57 Steven D'Aprano wrote:
> Hi,
>
> As extensively discussed on Python-Ideas, the secrets module and PEP 506
> is (I hope) ready for pronouncement.
> > https://www.python.org/dev/peps/pep-0506/ > > There is code and tests here: > > https://bitbucket.org/sdaprano/secrets > > > or you can run > > hg clone https://sdaprano at bitbucket.org/sdaprano/secrets > > > The code is written for and tested on Python 2.6, 2.7, 3.1 - 3.4. > > > > -- > Steve > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Thu Jan 14 13:47:09 2016 From: guido at python.org (Guido van Rossum) Date: Thu, 14 Jan 2016 10:47:09 -0800 Subject: [Python-Dev] PEP 506 secrets module In-Reply-To: References: <20151016005711.GC11980@ando.pearwood.info> Message-ID: I think the discussion petered out and nobody asked me to approve it yet (or I lost track of it). I'm almost happy to approve it in the current state. My only quibble is with some naming -- I'm not sure that a super-generic name like 'equal' is better than the original ('compare_digest'), and I would have picked a different name for token_url -- probably token_urlsafe. But maybe Steven can convince me that the names currently in the PEP are better. (I also don't like the wishy-washy position of the PEP on the actual specs of the proposed functions. But I'm fine with the actual implementation shown as the spec.) On Thu, Jan 14, 2016 at 10:36 AM, Brett Cannon wrote: > I noticed an article about default rand usage in Go > > from the Go Weekly newsletter and it reminded me about PEP 506 and the > secrets module. That's when I noticed that the PEP is still open. What is > the current blocker on the PEP? > > On Thu, 15 Oct 2015 at 17:57 Steven D'Aprano wrote: > >> Hi, >> >> As extensively discussed on Python-Ideas, the secrets module and PEP 506 >> is (I hope) ready for pronouncement. 
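For reference against the names being debated above: the secrets module as it eventually shipped in Python 3.6 followed Guido's preferences, exposing compare_digest() and token_urlsafe(). A quick sketch of that released API:

```python
import secrets

token = secrets.token_urlsafe(16)            # URL-safe text token from 16 random bytes
hexed = secrets.token_hex(16)                # 32 hex digits
match = secrets.compare_digest("abc", "abc") # constant-time equality check

print(token, hexed, match)
```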
>> >> https://www.python.org/dev/peps/pep-0506/ >> >> There is code and tests here: >> >> https://bitbucket.org/sdaprano/secrets >> >> >> or you can run >> >> hg clone https://sdaprano at bitbucket.org/sdaprano/secrets >> >> >> The code is written for and tested on Python 2.6, 2.7, 3.1 - 3.4. >> >> >> >> -- >> Steve >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From paulson at busiq.com Thu Jan 14 14:25:19 2016 From: paulson at busiq.com (Matthew Paulson) Date: Thu, 14 Jan 2016 14:25:19 -0500 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: References: <5696A647.6090501@busiq.com> <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com> Message-ID: <5697F61F.7050004@busiq.com> Hi All:

I've created a simple program to make sure I wasn't lying to you all ;->

Here it is:

for (ii = 0; ii < 100; ii++)
{
    Py_Initialize();

    if ((code = Py_CompileString(p, "foo", Py_file_input)) == NULL)
        printf("Py_CompileString() failed\n");
    else
    {
        if (PyRun_SimpleString(p) == -1)
            printf("PyRun_SimpleString() failed\n");

        Py_CLEAR(code);
    }

    Py_Finalize();
}

This sequence causes about 10k growth per iteration and after many cycles, there's no indication that any pooling logic is helping.
Unless I've done something obviously wrong, I tend to believe Benjamin's claim that this issue is well known. Suggestion: I have had great success with similar problems in the past by using a pools implementation sitting on top of what I call a "block memory allocator". The bottom (block) allocator grabs large blocks from the heap and then doles them out to the pools layer, which in turn doles them out to the requester. When client memory is freed -- it is NOT -- rather it's added to the pool which contains like-sized blocks -- call it an "organized free list". This is a very, very fast way to handle high allocation frequency patterns. Finally, during shutdown, the pool simply vaporizes and the block allocator returns a the (fewer) large blocks back to the heap. This avoids thrashing the heap, forcing it to coalesce inefficiently and also avoids heap fragmentation, which can cause unwanted growth as well... Note that this would be a "hard-reset" of all allocated memory, and any global data in the text segment would also have to be cleared, but it would provide a fast, clean way to ensure that each invocation was 100% clean. I don't claim to understand all the intricacies of the many way python can be embedded, but as I said, this strategy has worked very well for me in the past building servers written in C that have to stay up for months at a time. Happy to discuss further, if anyone has any interest. Best, Matt On 1/14/2016 4:45 AM, Nick Coghlan wrote: > On 14 January 2016 at 15:42, Benjamin Peterson wrote: >> This is a "well-known" issue. Parts of the interpreter (and especially, >> extension modules) cheerfully stash objects in global variables with no >> way to clean them up. Fixing this is a large project, which probably >> involves implementing PEP 489. > The actual multi-phase extension module import system from 489 was > implemented for 3.5, but indeed, the modules with stashed global state > haven't been converted yet. 
> > I didn't think we loaded any of those by default, though... > > Cheers, > Nick. > -- -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MattSig.JPG Type: image/jpeg Size: 38491 bytes Desc: not available URL: From stephen at xemacs.org Thu Jan 14 22:16:00 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Fri, 15 Jan 2016 12:16:00 +0900 Subject: [Python-Dev] PEP 509 In-Reply-To: References: <56954B68.3090208@stoneleaf.us> <56957D32.6020401@stoneleaf.us> Message-ID: <22168.25712.743971.624078@turnbull.sk.tsukuba.ac.jp> Terry Reedy writes: > While I understand the rationale against __version__, it strikes me as a > better description of what it is, and easier on the brain than > __cache_token__. Maybe there is something even better, such as > __seqnum__. Or __generation__, as in "generational garbage collector"? From ncoghlan at gmail.com Thu Jan 14 23:19:25 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 15 Jan 2016 14:19:25 +1000 Subject: [Python-Dev] Discussion related to memory leaks requested In-Reply-To: <5697F61F.7050004@busiq.com> References: <5696A647.6090501@busiq.com> <1452750144.3077065.491709618.053F1E57@webmail.messagingengine.com> <5697F61F.7050004@busiq.com> Message-ID: On 15 January 2016 at 05:25, Matthew Paulson wrote: > Hi All: > > I've created a simple program to make sure I wasn't lying to you all ;-> > > Here it is: > > for (ii = 0; ii < 100; ii++) > { > Py_Initialize(); > > if ((code = Py_CompileString(p, "foo", Py_file_input)) == NULL) > printf("PyRun_SimpleString() failed\n"); > else > { > if (PyRun_SimpleString(p) == -1) > printf("PyRun_SimpleString() failed\n"); > > Py_CLEAR(code); > } > > Py_Finalize(); > } > > This sequence causes about 10k growth per iteration and after many cycles, > there's no indication that any pooling logic is helping. 
> Our "useful" example is slightly more complex, and therefore may explain
> why I was seeing about 16k per iteration.
>
> Unless I've done something obviously wrong, I tend to believe Benjamin's
> claim that this issue is well known.
>
> Suggestion: I have had great success with similar problems in the past by
> using a pools implementation sitting on top of what I call a "block memory
> allocator". The bottom (block) allocator grabs large blocks from the heap
> and then doles them out to the pools layer, which in turn doles them out to
> the requester. When client memory is freed -- it is NOT -- rather it's
> added to the pool which contains like-sized blocks -- call it an "organized
> free list". This is a very, very fast way to handle high allocation
> frequency patterns. Finally, during shutdown, the pool simply vaporizes
> and the block allocator returns the (fewer) large blocks back to the
> heap. This avoids thrashing the heap, forcing it to coalesce inefficiently
> and also avoids heap fragmentation, which can cause unwanted growth as
> well...
>
> Note that this would be a "hard-reset" of all allocated memory, and any
> global data in the text segment would also have to be cleared, but it would
> provide a fast, clean way to ensure that each invocation was 100% clean.
>

CPython does use an arena based allocator, but Py_Finalize doesn't purge it (if it did, there'd be segfaults rather than memory growth when modules keep pointers across Initialize/Finalize cycles).

Building with PYMALLOC_DEBUG and setting PYTHONMALLOCSTATS in the environment will cause it to dump debugging info during Py_Finalize. Building with Py_TRACE_REFS and setting PYTHONDUMPREFS also provides info on live Python objects during shutdown.

Cheers, Nick.

-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia -------------- next part -------------- An HTML attachment was scrubbed...
URL: From status at bugs.python.org Fri Jan 15 12:08:35 2016 From: status at bugs.python.org (Python tracker) Date: Fri, 15 Jan 2016 18:08:35 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20160115170835.81B815667D@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2016-01-08 - 2016-01-15) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 5387 (+16) closed 32496 (+60) total 37883 (+76) Open issues with patches: 2372 Issues opened (51) ================== #25668: Deadlock in logging caused by a possible race condition with " http://bugs.python.org/issue25668 reopened by fviard #26050: Add new StreamReader.readuntil() method http://bugs.python.org/issue26050 opened by mmarkk #26051: Non-data descriptors in pydoc http://bugs.python.org/issue26051 opened by Antony.Lee #26052: pydoc for __init__ with not docstring http://bugs.python.org/issue26052 opened by Antony.Lee #26053: regression in pdb output between 2.7 and 3.5 http://bugs.python.org/issue26053 opened by doughellmann #26057: Avoid nonneeded use of PyUnicode_FromObject() http://bugs.python.org/issue26057 opened by serhiy.storchaka #26058: PEP 509: Add ma_version to PyDictObject http://bugs.python.org/issue26058 opened by haypo #26059: Integer Overflow in strop.replace() http://bugs.python.org/issue26059 opened by Ramin Farajpour Cami #26060: Class __dict__ iteration order changing due to type instance k http://bugs.python.org/issue26060 opened by ncoghlan #26065: python embedded 3.5 amd64 crash when using venv http://bugs.python.org/issue26065 opened by Laurent Dufrechou #26067: test_shutil fails when gid name is missing http://bugs.python.org/issue26067 opened by Dinesh Wijekoon #26068: re.compile() repr end quote truncated http://bugs.python.org/issue26068 opened by ThiefMaster #26070: Launcher fails to find in-place built binaries from earlier Py 
http://bugs.python.org/issue26070 opened by mhammond #26071: bdist_wininst created binaries fail to start and find 32bit Py http://bugs.python.org/issue26071 opened by mhammond #26072: pdb fails to access variables closed over http://bugs.python.org/issue26072 opened by Antony.Lee #26073: Update the list of magic numbers in launcher http://bugs.python.org/issue26073 opened by serhiy.storchaka #26075: typing.Union unifies types too broadly http://bugs.python.org/issue26075 opened by alex.gronholm #26076: redundant checks in tok_get in Parser\tokenizer.c http://bugs.python.org/issue26076 opened by Oren Milman #26077: Make slicing of immutable structures return a view instead of http://bugs.python.org/issue26077 opened by Filip Haglund #26079: Build with Visual Studio 2015 using PlatformToolset=v120 http://bugs.python.org/issue26079 opened by bjoernthiel #26081: Implement asyncio Future in C to improve performance http://bugs.python.org/issue26081 opened by yselivanov #26082: functools.lru_cache user specified cachedict support http://bugs.python.org/issue26082 opened by wdv4758h #26085: Tkinter spoils the input text http://bugs.python.org/issue26085 opened by fresh_nick #26086: Bug in standardmodule os http://bugs.python.org/issue26086 opened by Johano #26089: Duplicated keyword in distutils metadata http://bugs.python.org/issue26089 opened by Augustin Laville #26090: More correct string truncating in PyUnicode_FromFormat() http://bugs.python.org/issue26090 opened by serhiy.storchaka #26092: doctest should allow custom sys.displayhook http://bugs.python.org/issue26092 opened by Sergey.Kirpichev #26093: __qualname__ different when calling generator object w/ functi http://bugs.python.org/issue26093 opened by dino.viehland #26094: ConfigParser.get() doc to be updated according to the configpa http://bugs.python.org/issue26094 opened by khyox #26095: Update porting HOWTO to special-case Python 2 code, not Python http://bugs.python.org/issue26095 opened by brett.cannon 
#26098: PEP 510: Specialize functions with guards http://bugs.python.org/issue26098 opened by haypo #26099: site ignores ImportError when running sitecustomize and usercu http://bugs.python.org/issue26099 opened by haypo #26100: Add test.support.optim_args_from_interpreter_flags() http://bugs.python.org/issue26100 opened by haypo #26101: Lib/test/test_compileall.py fails when run directly http://bugs.python.org/issue26101 opened by haypo #26102: access violation in PyErrFetch if tcur==null in PyGILState_Rel http://bugs.python.org/issue26102 opened by cberger #26103: Contradiction in definition of "data descriptor" between (dott http://bugs.python.org/issue26103 opened by Aaron Hall #26106: Move licences to literal blocks http://bugs.python.org/issue26106 opened by sizeof #26107: code.co_lnotab: use signed line number delta to support moving http://bugs.python.org/issue26107 opened by haypo #26108: Calling PyInitialize with 2.7.11 on Windows x64 terminates pro http://bugs.python.org/issue26108 opened by David Heffernan #26109: _Py_DumpTraceback should be PyAPI_FUNC http://bugs.python.org/issue26109 opened by John.Malmberg #26110: Speedup method calls 1.2x http://bugs.python.org/issue26110 opened by yselivanov #26111: On Windows, os.scandir will keep a handle on the directory unt http://bugs.python.org/issue26111 opened by remyroy #26114: Rewrite math.erf() and math.erfc() from scratch http://bugs.python.org/issue26114 opened by brett.cannon #26117: Close directory descriptor in scandir iterator on error http://bugs.python.org/issue26117 opened by serhiy.storchaka #26119: Windows Installer can sometimes silently fail pip stage http://bugs.python.org/issue26119 opened by Paul Hammant #26120: pydoc: move __future__ imports out of the DATA block http://bugs.python.org/issue26120 opened by Antony.Lee #26121: Use C99 functions in math if available http://bugs.python.org/issue26121 opened by serhiy.storchaka #26122: Isolated mode doesn't ignore PYTHONHASHSEED 
http://bugs.python.org/issue26122 opened by ncoghlan #26123: http.client status code constants incompatible with Python 3.4 http://bugs.python.org/issue26123 opened by srittau #26124: shlex.quote and pipes.quote do not quote shell keywords http://bugs.python.org/issue26124 opened by Charles Daffern #26125: Incorrect error message in the module asyncio.selector_events. http://bugs.python.org/issue26125 opened by Paradisee Most recent 15 issues with no replies (15) ========================================== #26122: Isolated mode doesn't ignore PYTHONHASHSEED http://bugs.python.org/issue26122 #26120: pydoc: move __future__ imports out of the DATA block http://bugs.python.org/issue26120 #26117: Close directory descriptor in scandir iterator on error http://bugs.python.org/issue26117 #26103: Contradiction in definition of "data descriptor" between (dott http://bugs.python.org/issue26103 #26102: access violation in PyErrFetch if tcur==null in PyGILState_Rel http://bugs.python.org/issue26102 #26094: ConfigParser.get() doc to be updated according to the configpa http://bugs.python.org/issue26094 #26093: __qualname__ different when calling generator object w/ functi http://bugs.python.org/issue26093 #26092: doctest should allow custom sys.displayhook http://bugs.python.org/issue26092 #26085: Tkinter spoils the input text http://bugs.python.org/issue26085 #26081: Implement asyncio Future in C to improve performance http://bugs.python.org/issue26081 #26073: Update the list of magic numbers in launcher http://bugs.python.org/issue26073 #26072: pdb fails to access variables closed over http://bugs.python.org/issue26072 #26051: Non-data descriptors in pydoc http://bugs.python.org/issue26051 #26040: Improve coverage and rigour of test.test_math http://bugs.python.org/issue26040 #26038: zipfile cannot handle zip files where the archive size for a f http://bugs.python.org/issue26038 Most recent 15 issues waiting for review (15) ============================================= #26125: 
Incorrect error message in the module asyncio.selector_events. http://bugs.python.org/issue26125 #26121: Use C99 functions in math if available http://bugs.python.org/issue26121 #26117: Close directory descriptor in scandir iterator on error http://bugs.python.org/issue26117 #26110: Speedup method calls 1.2x http://bugs.python.org/issue26110 #26107: code.co_lnotab: use signed line number delta to support moving http://bugs.python.org/issue26107 #26106: Move licences to literal blocks http://bugs.python.org/issue26106 #26101: Lib/test/test_compileall.py fails when run directly http://bugs.python.org/issue26101 #26100: Add test.support.optim_args_from_interpreter_flags() http://bugs.python.org/issue26100 #26099: site ignores ImportError when running sitecustomize and usercu http://bugs.python.org/issue26099 #26098: PEP 510: Specialize functions with guards http://bugs.python.org/issue26098 #26089: Duplicated keyword in distutils metadata http://bugs.python.org/issue26089 #26082: functools.lru_cache user specified cachedict support http://bugs.python.org/issue26082 #26081: Implement asyncio Future in C to improve performance http://bugs.python.org/issue26081 #26079: Build with Visual Studio 2015 using PlatformToolset=v120 http://bugs.python.org/issue26079 #26076: redundant checks in tok_get in Parser\tokenizer.c http://bugs.python.org/issue26076 Top 10 most discussed issues (10) ================================= #25887: awaiting on coroutine more than once should be an error http://bugs.python.org/issue25887 16 msgs #26059: Integer Overflow in strop.replace() http://bugs.python.org/issue26059 15 msgs #26058: PEP 509: Add ma_version to PyDictObject http://bugs.python.org/issue26058 13 msgs #26071: bdist_wininst created binaries fail to start and find 32bit Py http://bugs.python.org/issue26071 12 msgs #25596: Use scandir() to speed up the glob module http://bugs.python.org/issue25596 10 msgs #26068: re.compile() repr end quote truncated http://bugs.python.org/issue26068 
10 msgs #25940: SSL tests failed due to expired svn.python.org SSL certificate http://bugs.python.org/issue25940 9 msgs #25994: File descriptor leaks in os.scandir() http://bugs.python.org/issue25994 8 msgs #26111: On Windows, os.scandir will keep a handle on the directory unt http://bugs.python.org/issue26111 7 msgs #25995: os.walk() consumes a lot of file descriptors http://bugs.python.org/issue25995 6 msgs Issues closed (58) ================== #7944: Use the 'with' statement in conjunction with 'open' throughout http://bugs.python.org/issue7944 closed by ezio.melotti #11440: fix_callable should be dropped from lib2to3 / changed http://bugs.python.org/issue11440 closed by SilentGhost #13963: dev guide has no mention of mechanics of patch review http://bugs.python.org/issue13963 closed by ezio.melotti #15430: Improve filecmp documentation http://bugs.python.org/issue15430 closed by orsenthil #19006: UnitTest docs should have a single list of assertions http://bugs.python.org/issue19006 closed by ezio.melotti #19316: devguide: compiler - wording http://bugs.python.org/issue19316 closed by ezio.melotti #22138: patch.object doesn't restore function defaults http://bugs.python.org/issue22138 closed by orsenthil #22570: Better stdlib support for Path objects http://bugs.python.org/issue22570 closed by gvanrossum #22642: trace module: convert to argparse http://bugs.python.org/issue22642 closed by orsenthil #23675: glossary entry for 'method resolution order' links to somethin http://bugs.python.org/issue23675 closed by orsenthil #23942: Explain naming of the patch files in the bug tracker http://bugs.python.org/issue23942 closed by ezio.melotti #24649: python -mtrace --help is wrong http://bugs.python.org/issue24649 closed by SilentGhost #24752: SystemError when importing from a non-package directory http://bugs.python.org/issue24752 closed by brett.cannon #24786: Changes in the devguide repository are not published online in http://bugs.python.org/issue24786 closed by 
ezio.melotti #24789: ctypes doc string http://bugs.python.org/issue24789 closed by ezio.melotti #25347: assert_has_calls output is formatted inconsistently http://bugs.python.org/issue25347 closed by orsenthil #25348: Update pgo_build.bat to use --pgo flag for regrtest http://bugs.python.org/issue25348 closed by python-dev #25486: Resurrect inspect.getargspec() in 3.6 http://bugs.python.org/issue25486 closed by yselivanov #25517: regex howto example in "Lookahead Assertions" http://bugs.python.org/issue25517 closed by ezio.melotti #25574: 2.7 incorrectly documents objects hash as equal to id http://bugs.python.org/issue25574 closed by ezio.melotti #25730: invisible sidebar content with code snippets http://bugs.python.org/issue25730 closed by ezio.melotti #25752: asyncio.readline - add customizable line separator http://bugs.python.org/issue25752 closed by martin.panter #25802: Finish deprecating load_module() http://bugs.python.org/issue25802 closed by brett.cannon #25822: Add docstrings to fields of urllib.parse results http://bugs.python.org/issue25822 closed by orsenthil #25967: Devguide: add 2to3 to the "Changing CPython's Grammar" checkli http://bugs.python.org/issue25967 closed by ezio.melotti #25986: Collections.deque maxlen: added in 2.6 or 2.7? 
http://bugs.python.org/issue25986 closed by ezio.melotti #25991: readline example eventually consumes all memory http://bugs.python.org/issue25991 closed by ezio.melotti #26001: Tutorial: write() does not expect string in binary mode http://bugs.python.org/issue26001 closed by ezio.melotti #26004: pip install lifetimes - throwing error and unable to install p http://bugs.python.org/issue26004 closed by terry.reedy #26025: Document pathlib.Path.__truediv__() http://bugs.python.org/issue26025 closed by brett.cannon #26029: Broken sentence in extending documentation http://bugs.python.org/issue26029 closed by terry.reedy #26030: Use PEP8 in documentation examples http://bugs.python.org/issue26030 closed by ezio.melotti #26054: Unable to run scripts: idle3 -r script.py http://bugs.python.org/issue26054 closed by zach.ware #26055: sys.argv[0] is the python file, not "" http://bugs.python.org/issue26055 closed by Wei #26056: installation failure http://bugs.python.org/issue26056 closed by zach.ware #26061: logging LogRecordFactory allow kwargs http://bugs.python.org/issue26061 closed by vinay.sajip #26062: IPython4 bash magic ! with {} does not work with Python 3.5.1 http://bugs.python.org/issue26062 closed by brett.cannon #26063: Update copyright in the devguide http://bugs.python.org/issue26063 closed by ezio.melotti #26064: directory is getting separated http://bugs.python.org/issue26064 closed by SilentGhost #26066: Language on the "Cryptographic Services" documentation page is http://bugs.python.org/issue26066 closed by python-dev #26069: Remove the Deprecated API in trace module http://bugs.python.org/issue26069 closed by orsenthil #26074: Add a method to pip.. 
pip.require("package_name") http://bugs.python.org/issue26074 closed by berker.peksag #26078: Python launcher options enhancement http://bugs.python.org/issue26078 closed by eryksun #26080: "abandonned" -> "abandoned" in PEP 510's https://hg.python.org http://bugs.python.org/issue26080 closed by haypo #26083: ValueError: insecure string pickle in subprocess.Popen on Pyth http://bugs.python.org/issue26083 closed by gregory.p.smith #26084: HTMLParser mishandles last attribute in self-closing tag http://bugs.python.org/issue26084 closed by ezio.melotti #26087: PEP 0373 should be updated http://bugs.python.org/issue26087 closed by python-dev #26088: re http://bugs.python.org/issue26088 closed by ezio.melotti #26091: decimal.Decimal(0)**0 throws decimal.InvalidOperation http://bugs.python.org/issue26091 closed by skrah #26096: '*' glob string matches dot files in pathlib http://bugs.python.org/issue26096 closed by gvanrossum #26097: 2.7 documentation about TextTestRunner do not specify all the http://bugs.python.org/issue26097 closed by orsenthil #26104: Reference leak in functools.partial constructor in failure cas http://bugs.python.org/issue26104 closed by serhiy.storchaka #26105: Python JSON module doesn't actually produce JSON http://bugs.python.org/issue26105 closed by ezio.melotti #26112: Error on example using "dialect" parameter. have to be: "diale http://bugs.python.org/issue26112 closed by SilentGhost #26113: pathlib p.match('') should return False rather than raising ex http://bugs.python.org/issue26113 closed by gvanrossum #26115: pathlib.glob('**') returns only directories http://bugs.python.org/issue26115 closed by SilentGhost #26116: CSV-module. The example code don't work. 
Have to be: reader = http://bugs.python.org/issue26116 closed by SilentGhost #26118: String performance issue using single quotes http://bugs.python.org/issue26118 closed by SilentGhost From ethan at stoneleaf.us Fri Jan 15 13:22:56 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Fri, 15 Jan 2016 10:22:56 -0800 Subject: [Python-Dev] Boolean value of an Enum member Message-ID: <56993900.60105@stoneleaf.us> When Enum was being designed one of the questions considered was where to start autonumbering: zero or one. As I remember the discussion we chose not to start with zero because we didn't want an enum member to be False by default, and having a member with value 0 be True was discordant. So the functional API starts with 1 unless overridden. In fact, according to the Enum docs: The reason for defaulting to ``1`` as the starting number and not ``0`` is that ``0`` is ``False`` in a boolean sense, but enum members all evaluate to ``True``. However, if the Enum is combined with some other type (str, int, float, etc), then most behaviour is determined by that type -- including boolean evaluation. So the empty string, 0 values, etc, will cause that Enum member to evaluate as False. So the question now is: for a standard Enum (meaning no other type besides Enum is involved) should __bool__ look to the value of the Enum member to determine True/False, or should we always be True by default and make the Enum creator add their own __bool__ if they want something different? On the one hand we have backwards compatibility, which will take a version to change. On the other hand we have a pretty basic difference in how zero/empty is handled between "pure" Enums and "mixed" Enums. On the gripping hand we have . . . 
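A minimal sketch of the split described above, using only the stdlib enum module (the class names here are illustrative, not from any real code):

```python
from enum import Enum, IntEnum

class Answer(Enum):       # "pure" Enum -- no other type involved
    NO = 0
    YES = 1

class Toggle(IntEnum):    # "mixed" Enum -- int semantics take over
    OFF = 0
    ON = 1

# A pure Enum member is always true, even with a value of 0 ...
assert bool(Answer.NO) is True
# ... while a mixed Enum member with value 0 is false, like the int it is.
assert bool(Toggle.OFF) is False
```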
Please respond with your thoughts on changing pure Enums to match mixed Enums or any experience you have had with relying on the "always True" behaviour or if you have implemented your own __bool__ to match the standard True/False meanings or if you have implemented your own __bool__ to match some other scheme entirely. -- ~Ethan~ From guido at python.org Fri Jan 15 13:28:40 2016 From: guido at python.org (Guido van Rossum) Date: Fri, 15 Jan 2016 10:28:40 -0800 Subject: [Python-Dev] [Python-ideas] Boolean value of an Enum member In-Reply-To: <56993900.60105@stoneleaf.us> References: <56993900.60105@stoneleaf.us> Message-ID: Honestly I think it's too late to change. The proposal to change plain Enums to False when their value is zero (or falsey) would be a huge backward incompatibility. I don't think there's a reasonable path forward, and also don't think there's a big reason to regret the current semantics. On Fri, Jan 15, 2016 at 10:22 AM, Ethan Furman wrote: > When Enum was being designed one of the questions considered was where to > start autonumbering: zero or one. > > As I remember the discussion we chose not to start with zero because we > didn't want an enum member to be False by default, and having a member with > value 0 be True was discordant. So the functional API starts with 1 unless > overridden. In fact, according to the Enum docs: > > The reason for defaulting to ``1`` as the starting number and > not ``0`` is that ``0`` is ``False`` in a boolean sense, but > enum members all evaluate to ``True``. > > However, if the Enum is combined with some other type (str, int, float, > etc), then most behaviour is determined by that type -- including boolean > evaluation. So the empty string, 0 values, etc, will cause that Enum > member to evaluate as False. 
> > So the question now is: for a standard Enum (meaning no other type > besides Enum is involved) should __bool__ look to the value of the Enum > member to determine True/False, or should we always be True by default and > make the Enum creator add their own __bool__ if they want something > different? > > On the one hand we have backwards compatibility, which will take a version > to change. > > On the other hand we have a pretty basic difference in how zero/empty is > handled between "pure" Enums and "mixed" Enums. > > On the gripping hand we have . . . > > Please respond with your thoughts on changing pure Enums to match mixed > Enums or any experience you have had with relying on the "always True" > behaviour or if you have implemented your own __bool__ to match the > standard True/False meanings or if you have implemented your own __bool__ > to match some other scheme entirely. > > -- > ~Ethan~ > _______________________________________________ > Python-ideas mailing list > Python-ideas at python.org > https://mail.python.org/mailman/listinfo/python-ideas > Code of Conduct: http://python.org/psf/codeofconduct/ > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Fri Jan 15 13:32:58 2016 From: barry at python.org (Barry Warsaw) Date: Fri, 15 Jan 2016 13:32:58 -0500 Subject: [Python-Dev] Boolean value of an Enum member In-Reply-To: <56993900.60105@stoneleaf.us> References: <56993900.60105@stoneleaf.us> Message-ID: <20160115133258.3ca202dd@limelight.wooz.org> On Jan 15, 2016, at 10:22 AM, Ethan Furman wrote: >So the question now is: for a standard Enum (meaning no other type besides >Enum is involved) should __bool__ look to the value of the Enum member to >determine True/False, or should we always be True by default and make the >Enum creator add their own __bool__ if they want something different? The latter. 
I think in general enums are primarily a symbolic value and don't have truthiness. It's also so easy to override when you define the enum that it's not worth changing the current behavior. Cheers, -Barry From davidr at openscg.com Fri Jan 15 17:13:41 2016 From: davidr at openscg.com (Rader, David) Date: Fri, 15 Jan 2016 17:13:41 -0500 Subject: [Python-Dev] 2.7.11 Windows Installer issues on Win2008R2 Message-ID:

Problem 1: The .manifest information for the VC runtime DLLs has been changed in the recent versions of the 2.7.x 64-bit installers for Windows. Python fails to run on a clean Win2008R2 install after running the Python installer to install "Just for me". The installation succeeds if "Install for all users" is selected.

After install completes, trying to run python results in:

The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log or use the command-line sxstrace.exe tool for more detail.

The event viewer log shows:

Activation context generation failed for "C:\Python27\python.exe". Error in manifest or policy file "C:\Python27\Microsoft.VC90.CRT.MANIFEST" on line 4. Component identity found in manifest does not match the identity of the component requested. Reference is Microsoft.VC90.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.21022.8". Definition is Microsoft.VC90.CRT,processorArchitecture="amd64",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.30729.1". Please use sxstrace.exe for detailed diagnosis.

This means that the VC2008 SP1 DLL and manifest are included in the installer, but the Python.exe is compiled against VC2008 (_not_ SP1). Replacing the installed manifest and VC90 DLL with one pulled from an older distribution with the correct 9.0.21022.8 version enables Python to run.

Problem 2: The compiled DLLs in the DLLs folder incorrectly have the VC manifest included in them as well.
This breaks the side-by-side lookup, since the VC90 DLL is not in the DLLs folder. So if you try to import socket, you get an error message like:

Traceback (most recent call last): File "hub\scripts\pgc.py", line 9, in import socket File "C:\Python27\lib\socket.py", line 47, in import _socket ImportError: DLL load failed: The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log or use the command-line sxstrace.exe tool for more detail.

Previous versions of Python for Windows have had this problem but it was corrected. It looks like it has crept back in.

-- David Rader -------------- next part -------------- An HTML attachment was scrubbed... URL: From tjreedy at udel.edu Fri Jan 15 18:20:48 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Fri, 15 Jan 2016 18:20:48 -0500 Subject: [Python-Dev] 2.7.11 Windows Installer issues on Win2008R2 In-Reply-To: References: Message-ID: On 1/15/2016 5:13 PM, Rader, David wrote: [description of problems]

Please register at bugs.python.org and open a new issue for Versions 2.7, Components: Installation, with 'benjamin.peterson' and 'loewis' on the Nosy List. Copy what you wrote in the Comment: box.

-- Terry Jan Reedy

From Eddy at Quicksall.com Fri Jan 15 18:31:23 2016 From: Eddy at Quicksall.com (Eddy Quicksall) Date: Fri, 15 Jan 2016 18:31:23 -0500 Subject: [Python-Dev] C struct for Str( ) In-Reply-To: <5696F0E6.8090700@python.org> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> <005e01d14e5e$09660bb0$1c322310$@com> <5696F0E6.8090700@python.org> Message-ID: <00e201d14fec$d946ff80$8bd4fe80$@com>

I want to fill an Str() string from a C function. But I think I am using the wrong structure (PyBytesObject). I have written a C function to dump the Python class but as you can see the structure I'm using does not match the data in the class.
Can someone please tell me the correct structure:

--------- Python snip ---------
class _vendorRecord_2:
    vendorListNumber = str()
    vendorNumber = str()
    vendorName = str('x' * 20)

vendorRecord_2 = _vendorRecord_2()

print(len(vendorRecord_2.vendorName))
print(vendorRecord_2.vendorName + '|')
XBaseDump_PythonString(py_object(vendorRecord_2.vendorName))

--------- C function --------
#define MS_NO_COREDLL
#undef _DEBUG
#include 
DSI_DLL void CALL_TYPE XBaseDump_PythonString( PyBytesObject *pBytesObject )
{
    printf( "ob_size = %d, ob_shash = %X, ob_sval = %s\n",
        pBytesObject->ob_base.ob_size, pBytesObject->ob_shash,
        pBytesObject->ob_sval );
    printf( "offsetof(ob_size) = %d, offsetof(ob_sval) = %d\n",
        offsetof( PyBytesObject, ob_base.ob_size ),
        offsetof( PyBytesObject, ob_sval ) );
    printf( "sizeof(PyBytesObject) = %d\n", sizeof(PyBytesObject) );
    DsmDumpBytes( pBytesObject, 32 );
}

-------- output ------------
20
xxxxxxxxxxxxxxxxxxxx|
ob_size = 20, ob_shash = 70768D53, ob_sval = ?sci
offsetof(ob_size) = 8, offsetof(ob_sval) = 16
sizeof(PyBytesObject) = 20
0000: 03 00 00 00 60 B0 DC 1D 14 00 00 00 53 8D 76 70 ....-...........
0016: E5 73 63 69 00 00 00 00 78 78 78 78 78 78 78 78 V...............

--- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus From brett at python.org Fri Jan 15 18:58:07 2016 From: brett at python.org (Brett Cannon) Date: Fri, 15 Jan 2016 23:58:07 +0000 Subject: [Python-Dev] C struct for Str( ) In-Reply-To: <00e201d14fec$d946ff80$8bd4fe80$@com> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> <005e01d14e5e$09660bb0$1c322310$@com> <5696F0E6.8090700@python.org> <00e201d14fec$d946ff80$8bd4fe80$@com> Message-ID: I don't quite see what this has to do with the development of Python, Eddy. You can always reference the C API at https://docs.python.org/3/c-api/index.html . And `PyBytesObject` is an instance of `bytes` in Python.
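Brett's point can be checked from pure Python: in Python 3, `str` and `bytes` are distinct types, so a `str` handed to C code that casts it to `PyBytesObject *` is read through the wrong struct layout, which is why the dump above shows garbage. A minimal sketch (the variable name follows the thread; the struct names are CPython-specific):

```python
# A str is stored as a PyUnicodeObject in CPython 3, not a PyBytesObject,
# so reading vendorName through the PyBytesObject layout prints garbage.
vendorName = 'x' * 20
raw = vendorName.encode('ascii')  # an actual bytes object (PyBytesObject in CPython)

print(isinstance(vendorName, bytes))  # False: str is not bytes
print(isinstance(raw, bytes))         # True
print(len(raw))                       # 20
```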
On Fri, 15 Jan 2016 at 15:33 Eddy Quicksall wrote: > I want to fill an Str() string from a C function. But I think I am using > the > wrong structure (PyBytesObject). I have written a C function to dump the > Python class but as you can see the structure I'm using does not match the > data in the class. > > Can someone please tell me the correct structure: > > --------- Python snip --------- > class _vendorRecord_2: > vendorListNumber = str() > vendorNumber = str() > vendorName = str('x' * 20) > > vendorRecord_2 = _vendorRecord_2() > > print(len(vendorRecord_2.vendorName)) > print(vendorRecord_2.vendorName + '|') > XBaseDump_PythonString(py_object(vendorRecord_2.vendorName)) > > --------- C function -------- > #define MS_NO_COREDLL > #undef _DEBUG > #include > DSI_DLL void CALL_TYPE XBaseDump_PythonString( PyBytesObject *pBytesObject > ) > { > printf( "ob_size = %d, ob_shash = %X, ob_sval = %s\n", > pBytesObject->ob_base.ob_size, pBytesObject->ob_shash, > pBytesObject->ob_sval > ); > printf( "offsetof(ob_size) = %d, offsetof(ob_sval) = %d\n", offsetof( > PyBytesObject, ob_base.ob_size ), > offsetof( PyBytesObject, ob_sval ) ); > printf( "sizeof(PyBytesObject) = %d\n", sizeof(PyBytesObject) ); > DsmDumpBytes( pBytesObject, 32 ); > } > > -------- output ------------ > 20 > xxxxxxxxxxxxxxxxxxxx| > ob_size = 20, ob_shash = 70768D53, ob_sval = ?sci > offsetof(ob_size) = 8, offsetof(ob_sval) = 16 > sizeof(PyBytesObject) = 20 > 0000: 03 00 00 00 60 B0 DC 1D 14 00 00 00 53 8D 76 70 ....-........... > 0016: E5 73 63 69 00 00 00 00 78 78 78 78 78 78 78 78 V............... > > > > > --- > This email has been checked for viruses by Avast antivirus software. 
> https://www.avast.com/antivirus > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Eddy at Quicksall.com Fri Jan 15 19:23:39 2016 From: Eddy at Quicksall.com (Eddy Quicksall) Date: Fri, 15 Jan 2016 19:23:39 -0500 Subject: [Python-Dev] C struct for Str( ) In-Reply-To: References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> <005e01d14e5e$09660bb0$1c322310$@com> <5696F0E6.8090700@python.org> <00e201d14fec$d946ff80$8bd4fe80$@com> Message-ID: <00e601d14ff4$2679dc30$736d9490$@com> Sorry, I must be on the wrong list. Can you please give me the correct list? Eddy From: Brett Cannon [mailto:brett at python.org] Sent: Friday, January 15, 2016 6:58 PM To: Eddy Quicksall; python-dev at python.org Subject: Re: [Python-Dev] C struct for Str( ) I don't quite see what this has to do with has to do with the development of Python, Eddy. You can always reference the C API at https://docs.python.org/3/c-api/index.html . And `PyBytesObject` is an instance of `bytes` in Python. On Fri, 15 Jan 2016 at 15:33 Eddy Quicksall wrote: I want to fill an Str() string from a C function. But I think I am using the wrong structure (PyBytesObject). I have written a C function to dump the Python class but as you can see the structure I'm using does not match the data in the class. 
Can someone please tell me the correct structure: --------- Python snip --------- class _vendorRecord_2: vendorListNumber = str() vendorNumber = str() vendorName = str('x' * 20) vendorRecord_2 = _vendorRecord_2() print(len(vendorRecord_2.vendorName)) print(vendorRecord_2.vendorName + '|') XBaseDump_PythonString(py_object(vendorRecord_2.vendorName)) --------- C function -------- #define MS_NO_COREDLL #undef _DEBUG #include DSI_DLL void CALL_TYPE XBaseDump_PythonString( PyBytesObject *pBytesObject ) { printf( "ob_size = %d, ob_shash = %X, ob_sval = %s\n", pBytesObject->ob_base.ob_size, pBytesObject->ob_shash, pBytesObject->ob_sval ); printf( "offsetof(ob_size) = %d, offsetof(ob_sval) = %d\n", offsetof( PyBytesObject, ob_base.ob_size ), offsetof( PyBytesObject, ob_sval ) ); printf( "sizeof(PyBytesObject) = %d\n", sizeof(PyBytesObject) ); DsmDumpBytes( pBytesObject, 32 ); } -------- output ------------ 20 xxxxxxxxxxxxxxxxxxxx| ob_size = 20, ob_shash = 70768D53, ob_sval = ?sci offsetof(ob_size) = 8, offsetof(ob_sval) = 16 sizeof(PyBytesObject) = 20 0000: 03 00 00 00 60 B0 DC 1D 14 00 00 00 53 8D 76 70 ....-........... 0016: E5 73 63 69 00 00 00 00 78 78 78 78 78 78 78 78 V............... --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/brett%40python.org --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From victor.pantoja at gmail.com Fri Jan 15 20:08:15 2016 From: victor.pantoja at gmail.com (Victor Pantoja) Date: Fri, 15 Jan 2016 23:08:15 -0200 Subject: [Python-Dev] C struct for Str( ) In-Reply-To: <00e601d14ff4$2679dc30$736d9490$@com> References: <01f401d14dad$e984f1e0$bc8ed5a0$@com> <5695C54A.3070101@sdamon.com> <003201d14e3d$b6560f80$23022e80$@com> <005e01d14e5e$09660bb0$1c322310$@com> <5696F0E6.8090700@python.org> <00e201d14fec$d946ff80$8bd4fe80$@com> <00e601d14ff4$2679dc30$736d9490$@com> Message-ID: Hi Eddy You can try some of the lists in https://mail.python.org/mailman/listinfo. Best, victor 2016-01-15 22:23 GMT-02:00 Eddy Quicksall : > Sorry, I must be on the wrong list. Can you please give me the correct > list? > > > > Eddy > > > > *From:* Brett Cannon [mailto:brett at python.org] > *Sent:* Friday, January 15, 2016 6:58 PM > *To:* Eddy Quicksall; python-dev at python.org > *Subject:* Re: [Python-Dev] C struct for Str( ) > > > > I don't quite see what this has to do with has to do with the development > of Python, Eddy. You can always reference the C API at > https://docs.python.org/3/c-api/index.html . And `PyBytesObject` is an > instance of `bytes` in Python. > > > > On Fri, 15 Jan 2016 at 15:33 Eddy Quicksall wrote: > > I want to fill an Str() string from a C function. But I think I am using > the > wrong structure (PyBytesObject). I have written a C function to dump the > Python class but as you can see the structure I'm using does not match the > data in the class. 
> > Can someone please tell me the correct structure: > > --------- Python snip --------- > class _vendorRecord_2: > vendorListNumber = str() > vendorNumber = str() > vendorName = str('x' * 20) > > vendorRecord_2 = _vendorRecord_2() > > print(len(vendorRecord_2.vendorName)) > print(vendorRecord_2.vendorName + '|') > XBaseDump_PythonString(py_object(vendorRecord_2.vendorName)) > > --------- C function -------- > #define MS_NO_COREDLL > #undef _DEBUG > #include > DSI_DLL void CALL_TYPE XBaseDump_PythonString( PyBytesObject *pBytesObject > ) > { > printf( "ob_size = %d, ob_shash = %X, ob_sval = %s\n", > pBytesObject->ob_base.ob_size, pBytesObject->ob_shash, > pBytesObject->ob_sval > ); > printf( "offsetof(ob_size) = %d, offsetof(ob_sval) = %d\n", offsetof( > PyBytesObject, ob_base.ob_size ), > offsetof( PyBytesObject, ob_sval ) ); > printf( "sizeof(PyBytesObject) = %d\n", sizeof(PyBytesObject) ); > DsmDumpBytes( pBytesObject, 32 ); > } > > -------- output ------------ > 20 > xxxxxxxxxxxxxxxxxxxx| > ob_size = 20, ob_shash = 70768D53, ob_sval = ?sci > offsetof(ob_size) = 8, offsetof(ob_sval) = 16 > sizeof(PyBytesObject) = 20 > 0000: 03 00 00 00 60 B0 DC 1D 14 00 00 00 53 8D 76 70 ....-........... > 0016: E5 73 63 69 00 00 00 00 78 78 78 78 78 78 78 78 V............... > > > > > --- > This email has been checked for viruses by Avast antivirus software. > https://www.avast.com/antivirus > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > > > This > email has been sent from a virus-free computer protected by Avast. 
> www.avast.com > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/victor.pantoja%40gmail.com > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From avivcohn123 at yahoo.com Sat Jan 16 11:05:06 2016 From: avivcohn123 at yahoo.com (Aviv Cohn) Date: Sat, 16 Jan 2016 16:05:06 +0000 (UTC) Subject: [Python-Dev] Should inspect.getargspec take any callable? References: <375069367.4888403.1452960306976.JavaMail.yahoo.ref@mail.yahoo.com> Message-ID: <375069367.4888403.1452960306976.JavaMail.yahoo@mail.yahoo.com> Hello :) This is my first mail here, I had an idea that I'd like to propose. The `getargspec` function in the `inspect` module enforces the input parameter to be either a method or a function.

    def getargspec(func):
        """Get the names and default values of a function's arguments.

        A tuple of four things is returned: (args, varargs, varkw, defaults).
        'args' is a list of the argument names (it may contain nested lists).
        'varargs' and 'varkw' are the names of the * and ** arguments or None.
        'defaults' is an n-tuple of the default values of the last n arguments.
        """

        if ismethod(func):
            func = func.im_func
        if not isfunction(func):
            raise TypeError('{!r} is not a Python function'.format(func))
        args, varargs, varkw = getargs(func.func_code)
        return ArgSpec(args, varargs, varkw, func.func_defaults)

Passing in a callable which is not a function causes a TypeError to be raised. I think in this case any callable should be allowed, allowing classes and callable objects as well. We can switch on whether `func` is a function, a class or a callable object, and pass into `getargs` the appropriate value.
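For comparison, the modern `inspect.signature()` API recommended in the replies already accepts plain functions, classes, and callable instances alike — a short sketch, with made-up example callables:

```python
import inspect

def f(a, b=2, *args, **kwargs):
    return a

class Adder:
    def __init__(self, base=0):
        self.base = base
    def __call__(self, x, *, scale=1):
        return (self.base + x) * scale

# signature() handles all three, where the old getargspec()
# raised TypeError for anything but a plain function or method.
print(inspect.signature(f))         # (a, b=2, *args, **kwargs)
print(inspect.signature(Adder))     # (base=0) -- taken from __init__
print(inspect.signature(Adder(1)))  # (x, *, scale=1) -- taken from __call__
```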
What is your opinion?Thank you -------------- next part -------------- An HTML attachment was scrubbed... URL: From abarnert at yahoo.com Sat Jan 16 12:23:10 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Sat, 16 Jan 2016 09:23:10 -0800 Subject: [Python-Dev] Should inspect.getargspec take any callable? In-Reply-To: <375069367.4888403.1452960306976.JavaMail.yahoo@mail.yahoo.com> References: <375069367.4888403.1452960306976.JavaMail.yahoo.ref@mail.yahoo.com> <375069367.4888403.1452960306976.JavaMail.yahoo@mail.yahoo.com> Message-ID: On Jan 16, 2016, at 08:05, Aviv Cohn via Python-Dev wrote: > > The `getargspec` function in the `inspect` module enforces the input parameter to be either a method or a function. The `getargspec` already works with classes, callable objects, and some builtins. It's also deprecated, in part because its API can't handle various features (like keyword-only arguments). There is an extended version that can handle some of those features, but as of 3.5 that one is deprecated as well. The `signature` function is much easier to use, as well as being more powerful. > > def getargspec(func): > """Get the names and default values of a function's arguments. > > A tuple of four things is returned: (args, varargs, varkw, defaults). > 'args' is a list of the argument names (it may contain nested lists). > 'varargs' and 'varkw' are the names of the * and ** arguments or None. > 'defaults' is an n-tuple of the default values of the last n arguments. > """ > > if ismethod(func): > func = func.im_func > if not isfunction(func): > raise TypeError('{!r} is not a Python function'.format(func)) > args, varargs, varkw = getargs(func.func_code) > return ArgSpec(args, varargs, varkw, func.func_defaults) > > Passing in a callable which is not a function causes a TypeError to be raised. > > I think in this case any callable should be allowed, allowing classes and callable objects as well. 
> We can switch on whether `func` is a function, a class or a callable object, and pass into `getargs` the appropriate value. > > What is your opinion? > Thank you > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/abarnert%40yahoo.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Sat Jan 16 13:18:15 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Sat, 16 Jan 2016 10:18:15 -0800 Subject: [Python-Dev] [Python-ideas] Boolean value of an Enum member In-Reply-To: <5699583A.8090102@canterbury.ac.nz> References: <56993900.60105@stoneleaf.us> <5699583A.8090102@canterbury.ac.nz> Message-ID: <569A8967.8010805@stoneleaf.us> [resending to lists -- sorry, Greg] On 01/15/2016 12:36 PM, Greg Ewing wrote: > Ethan Furman wrote: >> So the question now is: for a standard Enum (meaning no other type >> besides Enum is involved) should __bool__ look to the value of the >> Enum member to determine True/False, or should we always be True by >> default and make the Enum creator add their own __bool__ if they want >> something different? > > Can't you just specify a starting value of 0 if you > want the enum to have a false value? That doesn't > seem too onerous to me. You can start with zero, but unless the Enum is mixed with a numeric type it will evaluate to True. Also, there are other falsey values that a pure Enum member could have: False, None, '', etc., to name a few. However, as Barry said, writing your own is a whopping two lines of code: def __bool__(self): return bool(self._value_) With Barry and Guido's feedback this issue is closed. Thanks everyone!
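The two-line override Ethan quotes behaves like this (a minimal sketch; `Answer` and `Plain` are made-up example enums):

```python
from enum import Enum

class Answer(Enum):
    NO = 0
    YES = 1

    # The two-line override from the thread: truthiness follows the value.
    def __bool__(self):
        return bool(self._value_)

class Plain(Enum):
    NO = 0  # no override: every pure Enum member is truthy by default

print(bool(Answer.NO), bool(Answer.YES))  # False True
print(bool(Plain.NO))                     # True
```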
-- ~Ethan~ From ncoghlan at gmail.com Sat Jan 16 22:32:55 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 17 Jan 2016 13:32:55 +1000 Subject: [Python-Dev] Should inspect.getargspec take any callable? In-Reply-To: References: <375069367.4888403.1452960306976.JavaMail.yahoo.ref@mail.yahoo.com> <375069367.4888403.1452960306976.JavaMail.yahoo@mail.yahoo.com> Message-ID: On 17 January 2016 at 03:23, Andrew Barnert via Python-Dev wrote: > On Jan 16, 2016, at 08:05, Aviv Cohn via Python-Dev > wrote: > > The `getargspec` function in the `inspect` module enforces the input > parameter to be either a method or a function. > > > The `getargspec` already works with classes, callable objects, and some > builtins. > > It's also deprecated, in part because its API can't handle various features > (like keyword-only arguments). There is an extended version that can handle > some of those features, but as of 3.5 that one is deprecated as well. > > The `signature` function is much easier to use, as well as being more > powerful. As Andrew states here, the limitations of getargspec() and getfullargspec() are why they were deprecated in favour of inspect.signature() in Python 3.3: https://docs.python.org/3/library/inspect.html#introspecting-callables-with-the-signature-object The funcsigs project provides a backport of much of that functionality to earlier Python versions: https://pypi.python.org/pypi/funcsigs Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From brett at python.org Sun Jan 17 14:10:40 2016 From: brett at python.org (Brett Cannon) Date: Sun, 17 Jan 2016 19:10:40 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C Message-ID: While doing a review of http://bugs.python.org/review/26129/ I asked to have curly braces put around all `if` statement bodies. Serhiy pointed out that PEP 7 says curly braces are optional: https://www.python.org/dev/peps/pep-0007/#id5. I would like to change that. 
My argument is to require them to prevent bugs like the one Apple made with OpenSSL about two years ago: https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the curly braces is purely an aesthetic thing while leaving them out can lead to actual bugs. Anyone object if I update PEP 7 to remove the optionality of curly braces in PEP 7? -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Sun Jan 17 16:58:52 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Sun, 17 Jan 2016 13:58:52 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569C0E9C.1000307@stoneleaf.us> On 01/17/2016 11:10 AM, Brett Cannon wrote: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > curly braces is purely an aesthetic thing while leaving them out can > lead to actual bugs. Not sure what that sentence actually says, but +1 on making them mandatory. -- ~Ethan~ From brett at python.org Sun Jan 17 17:19:52 2016 From: brett at python.org (Brett Cannon) Date: Sun, 17 Jan 2016 22:19:52 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569C0E9C.1000307@stoneleaf.us> References: <569C0E9C.1000307@stoneleaf.us> Message-ID: On Sun, 17 Jan 2016, 13:59 Ethan Furman wrote: > On 01/17/2016 11:10 AM, Brett Cannon wrote: > > > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > > curly braces is purely an aesthetic thing while leaving them out can > > lead to actual bugs. > > Not sure what that sentence actually says, but +1 on making them mandatory. > Yeah, bad phrasing on my part. What I meant to say is leaving them off is an aesthetic thing while requiring them is a bug prevention thing. When it comes to writing C code I always vote for practicality over aesthetics. 
> -- > ~Ethan~ > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertc at robertcollins.net Sun Jan 17 17:49:28 2016 From: robertc at robertcollins.net (Robert Collins) Date: Mon, 18 Jan 2016 11:49:28 +1300 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569C0E9C.1000307@stoneleaf.us> Message-ID: +1 from me on requiring them. On 18 January 2016 at 11:19, Brett Cannon wrote: > > > On Sun, 17 Jan 2016, 13:59 Ethan Furman wrote: >> >> On 01/17/2016 11:10 AM, Brett Cannon wrote: >> >> > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the >> > curly braces is purely an aesthetic thing while leaving them out can >> > lead to actual bugs. >> >> Not sure what that sentence actually says, but +1 on making them >> mandatory. > > > > Yeah, bad phrasing on my part. What I meant to say is leaving them off is an > aesthetic thing while requiring them is a bug prevention thing. When it > comes to writing C code I always vote for practicality over aesthetics. 
> >> -- >> ~Ethan~ >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/robertc%40robertcollins.net > -- Robert Collins Distinguished Technologist HP Converged Cloud From guido at python.org Sun Jan 17 21:50:51 2016 From: guido at python.org (Guido van Rossum) Date: Sun, 17 Jan 2016 18:50:51 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569C0E9C.1000307@stoneleaf.us> Message-ID: I'm +0. The editor I use is too smart to let me make this mistake, but I don't object to recommending it. As usual, though, let's not start mindless reformatting of existing code. -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Mon Jan 18 02:00:19 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 18 Jan 2016 08:00:19 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: I like if without braces when the body is only one line, especially when there is no else block. Victor Le dimanche 17 janvier 2016, Brett Cannon a écrit : > While doing a review of http://bugs.python.org/review/26129/ I asked to > have curly braces put around all `if` statement bodies. Serhiy pointed out > that PEP 7 says curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > that.
> > My argument is to require them to prevent bugs like the one Apple made > with OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > curly braces is purely an aesthetic thing while leaving them out can lead > to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly braces > in PEP 7? > -------------- next part -------------- An HTML attachment was scrubbed... URL: From g.brandl at gmx.net Mon Jan 18 02:02:20 2016 From: g.brandl at gmx.net (Georg Brandl) Date: Mon, 18 Jan 2016 08:02:20 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569C0E9C.1000307@stoneleaf.us> Message-ID: On 01/17/2016 11:19 PM, Brett Cannon wrote: > > > On Sun, 17 Jan 2016, 13:59 Ethan Furman > wrote: > > On 01/17/2016 11:10 AM, Brett Cannon wrote: > > > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > > curly braces is purely an aesthetic thing while leaving them out can > > lead to actual bugs. > > Not sure what that sentence actually says, but +1 on making them mandatory. > > > > Yeah, bad phrasing on my part. What I meant to say is leaving them off is an > aesthetic thing while requiring them is a bug prevention thing. When it comes to > writing C code I always vote for practicality over aesthetics. +1. Out of curiosity, I made a quick script to see if we had any candidates for bugs related to this. I didn't expect any bugs to be found, since with the amount of static checkers that have been run they should have been found. The only problem I found was in the S390 port of libffi (#ifdef-conditional code which wouldn't even compile). I also found (in ast.c) two instances of semantically correct code with the wrong indent level which I fixed (see rev 1ececa34b748). 
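Georg's script isn't shown, but the kind of check he describes — flag a braceless `if`/`for`/`while` whose next two lines share the same deeper indent, the shape of both the Apple bug and the ast.c indentation fix — can be sketched in a few lines of Python (illustrative only; not the actual script):

```python
import re

def suspicious_ifs(c_source):
    """Return 1-based line numbers of braceless if/for/while statements
    followed by two non-blank lines indented to the same deeper level --
    the shape of the Apple 'goto fail' bug."""
    lines = c_source.splitlines()
    hits = []
    for i, line in enumerate(lines):
        stripped = line.strip()
        # A control statement with no trailing '{' on the same line.
        if not re.match(r'(if|for|while)\s*\(.*\)\s*$', stripped):
            continue
        head_indent = len(line) - len(line.lstrip())
        body = [l for l in lines[i + 1:i + 3] if l.strip()]
        if len(body) == 2:
            ind = [len(l) - len(l.lstrip()) for l in body]
            if ind[0] == ind[1] > head_indent:
                hits.append(i + 1)
    return hits

code = """\
if (err != 0)
    goto fail;
    goto fail;
return 0;
"""
print(suspicious_ifs(code))  # [1]
```

A properly braced block, or a second line back at the outer indent, produces no hit.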
cheers, Georg From storchaka at gmail.com Mon Jan 18 03:05:46 2016 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 18 Jan 2016 10:05:46 +0200 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: On 17.01.16 21:10, Brett Cannon wrote: > While doing a review of http://bugs.python.org/review/26129/ > I asked to have curly braces put > around all `if` statement bodies. Serhiy pointed out that PEP 7 says > curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to change that. > > My argument is to require them to prevent bugs like the one Apple made > with OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > curly braces is purely an aesthetic thing while leaving them out can > lead to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly > braces in PEP 7? I'm -0. The code without braces looks clearer, especially if the body is a one-line return, break, continue or goto statement. Sometimes it is appropriate to add an empty line after it for even greater clarity. On the other hand, there is no precedent for bugs like the one Apple made in the CPython sources. Mandatory braces *might* prevent a hypothetical bug, but they certainly make a lot of correct code harder to read. From mal at egenix.com Mon Jan 18 03:24:40 2016 From: mal at egenix.com (M.-A. Lemburg) Date: Mon, 18 Jan 2016 09:24:40 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569CA148.7030404@egenix.com> On 18.01.2016 08:00, Victor Stinner wrote: > I like if without braces when the body is only one line, especially when > there is no else block. Same here.
Compilers warn about these things today, so I don't think we need to go paranoid ;-) > Victor > > > Le dimanche 17 janvier 2016, Brett Cannon a ?crit : > >> While doing a review of http://bugs.python.org/review/26129/ I asked to >> have curly braces put around all `if` statement bodies. Serhiy pointed out >> that PEP 7 says curly braces are optional: >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change >> that. >> >> My argument is to require them to prevent bugs like the one Apple made >> with OpenSSL about two years ago: >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the >> curly braces is purely an aesthetic thing while leaving them out can lead >> to actual bugs. >> >> Anyone object if I update PEP 7 to remove the optionality of curly braces >> in PEP 7? -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jan 18 2016) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ From abarnert at yahoo.com Mon Jan 18 03:39:42 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 18 Jan 2016 00:39:42 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: On Jan 17, 2016, at 11:10, Brett Cannon wrote: > > While doing a review of http://bugs.python.org/review/26129/ I asked to have curly braces put around all `if` statement bodies. 
Serhiy pointed out that PEP 7 says curly braces are optional: https://www.python.org/dev/peps/pep-0007/#id5. I would like to change that. > > My argument is to require them to prevent bugs like the one Apple made with OpenSSL about two years ago: https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the curly braces is purely an aesthetic thing while leaving them out can lead to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly braces in PEP 7? There are two ways you could do that. The first is to just change "braces may be omitted where C permits, but when present, they should be formatted as follows" to something like "braces must not be omitted, and should be formatted as follows", changing one-liner tests into this:

    if (!obj) {
        return -1;
    }

Alternatively, it could say something like "braces must not be omitted; when other C styles would use a braceless one-liner, a one-liner with braces should be used instead; otherwise, they should be formatted as follows", changing the same tests into:

    if (!obj) { return -1; }

The first one is obviously a much bigger change in the formatting of actual code, even if it's a simpler change to the PEP. Is that what was intended? -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Mon Jan 18 03:47:01 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 18 Jan 2016 18:47:01 +1000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: On 18 January 2016 at 05:10, Brett Cannon wrote: > While doing a review of http://bugs.python.org/review/26129/ I asked to have > curly braces put around all `if` statement bodies. Serhiy pointed out that > PEP 7 says curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to change that.
> > My argument is to require them to prevent bugs like the one Apple made with > OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the curly > braces is purely an aesthetic thing while leaving them out can lead to > actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly braces in > PEP 7? +1 from me, as I usually add them to code I'm editing anyway (I find it too hard to read otherwise, especially when there's a long series of them). Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From larry at hastings.org Mon Jan 18 03:59:12 2016 From: larry at hastings.org (Larry Hastings) Date: Mon, 18 Jan 2016 00:59:12 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569CA960.4030401@hastings.org> On 01/17/2016 11:10 AM, Brett Cannon wrote: > Anyone object if I update PEP 7 to remove the optionality of curly > braces in PEP 7? I'm -1. I don't like being forced to add the curly braces when the code is perfectly clear without them. If this was a frequent problem then I'd put up with it, but I can't recall ever making this particular mistake myself or seeing it in CPython source. It seems to me like a fix for a problem we don't have. //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From soltysh at gmail.com Mon Jan 18 06:42:44 2016 From: soltysh at gmail.com (Maciej Szulik) Date: Mon, 18 Jan 2016 12:42:44 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569CA960.4030401@hastings.org> References: <569CA960.4030401@hastings.org> Message-ID: On Mon, Jan 18, 2016 at 9:59 AM, Larry Hastings wrote: > > > On 01/17/2016 11:10 AM, Brett Cannon wrote: > > Anyone object if I update PEP 7 to remove the optionality of curly braces > in PEP 7? > > > I'm -1. I don't like being forced to add the curly braces when the code > is perfectly clear without them. 
If this was a frequent problem then I'd > put up with it, but I can't recall ever making this particular mistake > myself or seeing it in CPython source. It seems to me like a fix for a > problem we don't have. > > I'm +1. We don't have that problem yet and the idea Brett brought up is for future changes that will happen. We'll be soon moving to github, which should simplify the process of submitting PRs from other developers interested in making our beautiful language even more awesome. I'm quite positive that with current review process that kind of bug should not happen, but you never know. Having this as a requirement is rather to minimize the risk of potentially having such bugs. I've switched to this style myself good couple years ago and I find it very readable atm. Maciej */arry* > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/soltysh%40gmail.com > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Mon Jan 18 06:57:58 2016 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 18 Jan 2016 22:57:58 +1100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569CA960.4030401@hastings.org> Message-ID: On Mon, Jan 18, 2016 at 10:42 PM, Maciej Szulik wrote: > On Mon, Jan 18, 2016 at 9:59 AM, Larry Hastings wrote: >> >> >> >> On 01/17/2016 11:10 AM, Brett Cannon wrote: >> >> Anyone object if I update PEP 7 to remove the optionality of curly braces >> in PEP 7? >> >> >> I'm -1. I don't like being forced to add the curly braces when the code >> is perfectly clear without them. If this was a frequent problem then I'd >> put up with it, but I can't recall ever making this particular mistake >> myself or seeing it in CPython source. It seems to me like a fix for a >> problem we don't have. 
>> > > I'm +1. We don't have that problem yet and the idea Brett brought up is for > future changes that will happen. > We'll be soon moving to github, which should simplify the process of > submitting PRs from other developers > interested in making our beautiful language even more awesome. I'm quite > positive that with current review > process that kind of bug should not happen, but you never know. Having this > as a requirement is rather to > minimize the risk of potentially having such bugs. I've switched to this > style myself good couple years ago > and I find it very readable atm. Rather than forcing people to use braces, wouldn't it be easier to just add a linter to the toolchain that will detect those kinds of problems and reject the commit? Which might be as simple as telling a compiler to treat this warning as an error, and then always use that compiler at some point before committing. (I don't know how many compilers can check this.) I'm -1 on forcing people to use a particular style when a script could do the same job more reliably. ChrisA From skrah.temporarily at gmail.com Mon Jan 18 06:58:20 2016 From: skrah.temporarily at gmail.com (Stefan Krah) Date: Mon, 18 Jan 2016 11:58:20 +0000 (UTC) Subject: [Python-Dev] Update PEP 7 to require curly braces in C References: Message-ID: Brett Cannon python.org> writes: > Anyone object if I update PEP 7 to remove the optionality of curly braces in PEP 7? I strongly prefer braces everywhere, but I'm -1 on enforcing it. 
Stefan Krah From larry at hastings.org Mon Jan 18 07:02:38 2016 From: larry at hastings.org (Larry Hastings) Date: Mon, 18 Jan 2016 04:02:38 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569CA960.4030401@hastings.org> Message-ID: <569CD45E.2030703@hastings.org> On 01/18/2016 03:57 AM, Chris Angelico wrote: > Rather than forcing people to use braces, wouldn't it be easier to > just add a linter to the toolchain that will detect those kinds of > problems and reject the commit? I don't understand your suggestion. If we automatically reject commits that lack this style of braces, then surely we are "forcing people to use [them]"? //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From storchaka at gmail.com Mon Jan 18 07:04:26 2016 From: storchaka at gmail.com (Serhiy Storchaka) Date: Mon, 18 Jan 2016 14:04:26 +0200 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569CA960.4030401@hastings.org> Message-ID: On 18.01.16 13:42, Maciej Szulik wrote: > We'll be soon moving to github, which should simplify the process of > submitting PRs from other developers > interested in making our beautiful language even more awesome. I'm quite > positive that with current review > process that kind of bug should not happen, but you never know. If moving to GitHub will decrease the quality of source code, it is bad idea. 
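[The linter approach Chris Angelico suggests above can be prototyped cheaply. Below is a minimal, hypothetical sketch — not an existing CPython tool — that flags a braceless C `if` followed by two or more lines indented to the same deeper level, i.e. the goto-fail pattern:]

```python
import re

def find_suspicious_ifs(c_source):
    """Flag a braceless ``if (...)`` followed by two or more lines
    indented deeper than the ``if`` itself -- statements that are
    guarded by indentation only, as in Apple's goto-fail bug.

    A rough line-based heuristic, not a real C parser.
    """
    lines = c_source.splitlines()
    hits = []
    for i, line in enumerate(lines):
        stripped = line.strip()
        # An `if (...)` with no opening brace on the same line.
        if re.match(r'if\s*\(.*\)\s*$', stripped):
            base = len(line) - len(line.lstrip())
            body = []
            for follow in lines[i + 1:]:
                indent = len(follow) - len(follow.lstrip())
                if follow.strip() and indent > base:
                    body.append(follow)
                else:
                    break
            if len(body) >= 2:
                hits.append(i + 1)  # 1-based line number of the `if`
    return hits

apple_bug = """\
if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
    goto fail;
    goto fail;
err = sslRawVerify(...);
"""
print(find_suspicious_ifs(apple_bug))
```

[A single indented statement after an `if` is left alone, so this only rejects the case where indentation claims more than the braces (or lack of them) deliver.]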
From rosuav at gmail.com Mon Jan 18 07:09:23 2016 From: rosuav at gmail.com (Chris Angelico) Date: Mon, 18 Jan 2016 23:09:23 +1100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569CD45E.2030703@hastings.org> References: <569CA960.4030401@hastings.org> <569CD45E.2030703@hastings.org> Message-ID: On Mon, Jan 18, 2016 at 11:02 PM, Larry Hastings wrote: > On 01/18/2016 03:57 AM, Chris Angelico wrote: > > Rather than forcing people to use braces, wouldn't it be easier to > just add a linter to the toolchain that will detect those kinds of > problems and reject the commit? > > > I don't understand your suggestion. If we automatically reject commits that > lack this style of braces, then surely we are "forcing people to use > [them]"? Only in the exact situation that this is trying to prevent, where indentation implies something that braces don't stipulate. From the original Apple bug link:

    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;

Since there are two indented lines after the if and no braces, this should be flagged as an error. If there's only one indented line, braces are optional. ChrisA From barry at python.org Mon Jan 18 09:49:56 2016 From: barry at python.org (Barry Warsaw) Date: Mon, 18 Jan 2016 09:49:56 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <20160118094956.5263efe2@limelight.wooz.org> On Jan 17, 2016, at 07:10 PM, Brett Cannon wrote: >Anyone object if I update PEP 7 to remove the optionality of curly braces >in PEP 7? +1 for requiring them everywhere, -1 for a wholesale reformatting of existing code. New code and significant refactoring/rewriting can adopt braces everywhere.
Cheers, -Barry From yselivanov.ml at gmail.com Mon Jan 18 11:52:08 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Mon, 18 Jan 2016 11:52:08 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569D1838.9000108@gmail.com> I'm usually fine with code like this: if ( ... ) return But when there is a multi-line 'else' clause, I just want to use braces for both 'if' and its 'else'. So, for consistency, I'm +1 for recommending to use braces everywhere. And +1 for requiring to use braces if one of the clauses of the 'if' statement is multi-line. Yury On 2016-01-17 2:10 PM, Brett Cannon wrote: > While doing a review of http://bugs.python.org/review/26129/ I asked > to have curly braces put around all `if` statement bodies. Serhiy > pointed out that PEP 7 says curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > that. > > My argument is to require them to prevent bugs like the one Apple made > with OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > curly braces is purely an aesthetic thing while leaving them out can > lead to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly > braces in PEP 7? 
> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/yselivanov.ml%40gmail.com From brett at python.org Mon Jan 18 11:57:13 2016 From: brett at python.org (Brett Cannon) Date: Mon, 18 Jan 2016 16:57:13 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569CA960.4030401@hastings.org> Message-ID: On Mon, 18 Jan 2016 at 04:05 Serhiy Storchaka wrote: > On 18.01.16 13:42, Maciej Szulik wrote: > > We'll be soon moving to github, which should simplify the process of > > submitting PRs from other developers > > interested in making our beautiful language even more awesome. I'm quite > > positive that with current review > > process that kind of bug should not happen, but you never know. > > If moving to GitHub will decrease the quality of source code, it is bad > idea. > It's not going to decrease code quality. -------------- next part -------------- An HTML attachment was scrubbed... URL: From fijall at gmail.com Mon Jan 18 15:18:28 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 18 Jan 2016 21:18:28 +0100 Subject: [Python-Dev] _PyThreadState_Current Message-ID: Hi change in between 3.5.0 and 3.5.1 (hiding _PyThreadState_Current and pyatomic.h) broke vmprof. The problem is that as a profile, vmprof can really encounter _PyThreadState_Current being null, while crashing an interpreter is a bit not ideal in this case. Any chance, a) _PyThreadState_Current can be restored in visibility? 
b) can I get a better API to get it in case it can be NULL, but also in 3.5 (since it works in 3.5.0 and breaks in 3.5.1) Cheers, fijal From victor.stinner at gmail.com Mon Jan 18 15:25:18 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 18 Jan 2016 21:25:18 +0100 Subject: [Python-Dev] _PyThreadState_Current In-Reply-To: References: Message-ID: Hum, you can try to lie and define Py_BUILD_CORE? Victor 2016-01-18 21:18 GMT+01:00 Maciej Fijalkowski : > Hi > > change in between 3.5.0 and 3.5.1 (hiding _PyThreadState_Current and > pyatomic.h) broke vmprof. The problem is that as a profile, vmprof can > really encounter _PyThreadState_Current being null, while crashing an > interpreter is a bit not ideal in this case. > > Any chance, a) _PyThreadState_Current can be restored in visibility? > b) can I get a better API to get it in case it can be NULL, but also > in 3.5 (since it works in 3.5.0 and breaks in 3.5.1) > > Cheers, > fijal > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From fijall at gmail.com Mon Jan 18 15:31:05 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 18 Jan 2016 21:31:05 +0100 Subject: [Python-Dev] _PyThreadState_Current In-Reply-To: References: Message-ID: Good point On Mon, Jan 18, 2016 at 9:25 PM, Victor Stinner wrote: > Hum, you can try to lie and define Py_BUILD_CORE? > > Victor > > 2016-01-18 21:18 GMT+01:00 Maciej Fijalkowski : >> Hi >> >> change in between 3.5.0 and 3.5.1 (hiding _PyThreadState_Current and >> pyatomic.h) broke vmprof. The problem is that as a profile, vmprof can >> really encounter _PyThreadState_Current being null, while crashing an >> interpreter is a bit not ideal in this case. >> >> Any chance, a) _PyThreadState_Current can be restored in visibility? 
>> b) can I get a better API to get it in case it can be NULL, but also >> in 3.5 (since it works in 3.5.0 and breaks in 3.5.1) >> >> Cheers, >> fijal >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From fijall at gmail.com Mon Jan 18 15:32:28 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Mon, 18 Jan 2016 21:32:28 +0100 Subject: [Python-Dev] _PyThreadState_Current In-Reply-To: References: Message-ID: seems to work thanks. That said, I would love to have PyThreadState_Get equivalent that would let me handle the NULL. On Mon, Jan 18, 2016 at 9:31 PM, Maciej Fijalkowski wrote: > Good point > > On Mon, Jan 18, 2016 at 9:25 PM, Victor Stinner > wrote: >> Hum, you can try to lie and define Py_BUILD_CORE? >> >> Victor >> >> 2016-01-18 21:18 GMT+01:00 Maciej Fijalkowski : >>> Hi >>> >>> change in between 3.5.0 and 3.5.1 (hiding _PyThreadState_Current and >>> pyatomic.h) broke vmprof. The problem is that as a profile, vmprof can >>> really encounter _PyThreadState_Current being null, while crashing an >>> interpreter is a bit not ideal in this case. >>> >>> Any chance, a) _PyThreadState_Current can be restored in visibility? 
>>> b) can I get a better API to get it in case it can be NULL, but also >>> in 3.5 (since it works in 3.5.0 and breaks in 3.5.1) >>> >>> Cheers, >>> fijal >>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From victor.stinner at gmail.com Mon Jan 18 17:43:54 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 18 Jan 2016 23:43:54 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: Is someone opposed to this PEP 509? The main complain was the change on the public Python API, but the PEP doesn't change the Python API anymore. I'm not aware of any remaining issue on this PEP. Victor 2016-01-11 17:49 GMT+01:00 Victor Stinner : > Hi, > > After a first round on python-ideas, here is the second version of my > PEP. The main changes since the first version are that the dictionary > version is no more exposed at the Python level and the field type now > also has a size of 64-bit on 32-bit platforms. > > The PEP is part of a serie of 3 PEP adding an API to implement a > static Python optimizer specializing functions with guards. The second > PEP is currently discussed on python-ideas and I'm still working on > the third PEP. > > Thanks to Red Hat for giving me time to experiment on this. > > > HTML version: > https://www.python.org/dev/peps/pep-0509/ > > > PEP: 509 > Title: Add a private version to dict > Version: $Revision$ > Last-Modified: $Date$ > Author: Victor Stinner > Status: Draft > Type: Standards Track > Content-Type: text/x-rst > Created: 4-January-2016 > Python-Version: 3.6 > > > Abstract > ======== > > Add a new private version to builtin ``dict`` type, incremented at each > change, to implement fast guards on namespaces. 
> > Rationale > ========= > > In Python, the builtin ``dict`` type is used by many instructions. For > example, the ``LOAD_GLOBAL`` instruction searches for a variable in the > global namespace, or in the builtins namespace (two dict lookups). > Python uses ``dict`` for the builtins namespace, globals namespace, type > namespaces, instance namespaces, etc. The local namespace (namespace of > a function) is usually optimized to an array, but it can be a dict too. > > Python is hard to optimize because almost everything is mutable: builtin > functions, function code, global variables, local variables, ... can be > modified at runtime. Implementing optimizations respecting the Python > semantics requires detecting when "something changes": we will call > these checks "guards". > > The speedup of optimizations depends on the speed of guard checks. This > PEP proposes to add a version to dictionaries to implement fast guards > on namespaces. > > Dictionary lookups can be skipped if the version does not change, which > is the common case for most namespaces. The performance of a guard does > not depend on the number of watched dictionary entries, complexity of > O(1), if the dictionary version does not change. > > Example of optimization: copy the value of a global variable to function > constants. This optimization requires a guard on the global variable to > check if it was modified. If the variable is modified, the variable must > be loaded at runtime when the function is called, instead of using the > constant. > > See the `PEP 510 -- Specialized functions with guards > `_ for the concrete usage of > guards to specialize functions and for the rationale on Python static > optimizers.
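[The two-lookup behaviour of ``LOAD_GLOBAL`` described in the Rationale can be mimicked in pure Python. This is an illustrative sketch only — the real lookup happens in C inside the eval loop:]

```python
import builtins

MISSING = object()  # sentinel: distinguishes "absent" from a stored None

def load_global(name, globals_ns, builtins_ns):
    """Sketch of LOAD_GLOBAL semantics: globals first, then builtins."""
    value = globals_ns.get(name, MISSING)        # first dict lookup
    if value is MISSING:
        value = builtins_ns.get(name, MISSING)   # second dict lookup
    if value is MISSING:
        raise NameError(f"name {name!r} is not defined")
    return value

# A global binding shadows the builtin of the same name:
print(load_global("len", {"len": lambda s: -1}, vars(builtins))("abc"))  # -1
```

[It is exactly these two dict lookups per global access that a version-based guard would let an optimizer skip on the common, unchanged path.]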
> > Guard example > ============= > > Pseudo-code of a fast guard to check if a dictionary entry was modified > (created, updated or deleted) using a hypothetical > ``dict_get_version(dict)`` function::
>
>     UNSET = object()
>
>     class GuardDictKey:
>         def __init__(self, dict, key):
>             self.dict = dict
>             self.key = key
>             self.value = dict.get(key, UNSET)
>             self.version = dict_get_version(dict)
>
>         def check(self):
>             """Return True if the dictionary entry did not change."""
>
>             # read the version field of the dict structure
>             version = dict_get_version(self.dict)
>             if version == self.version:
>                 # Fast-path: dictionary lookup avoided
>                 return True
>
>             # lookup in the dictionary
>             value = self.dict.get(self.key, UNSET)
>             if value is self.value:
>                 # another key was modified:
>                 # cache the new dictionary version
>                 self.version = version
>                 return True
>
>             # the key was modified
>             return False
>
> Usage of the dict version > ========================= > > Specialized functions using guards > ---------------------------------- > > The `PEP 510 -- Specialized functions with guards > `_ proposes an API to support > specialized functions with guards. It makes it possible to implement static > optimizers for Python without breaking the Python semantics. > > Example of a static Python optimizer: the astoptimizer of the `FAT > Python `_ project > implements many optimizations which require guards on namespaces. > Examples: > > * Call pure builtins: to replace ``len("abc")`` with ``3``, guards on > ``builtins.__dict__['len']`` and ``globals()['len']`` are required > * Loop unrolling: to unroll the loop ``for i in range(...): ...``, > guards on ``builtins.__dict__['range']`` and ``globals()['range']`` > are required > > > Pyjion > ------ > > According to Brett Cannon, one of the two main developers of Pyjion, > Pyjion can also benefit from the dictionary version to implement > optimizations. > > Pyjion is a JIT compiler for Python based upon CoreCLR (Microsoft .NET > Core runtime).
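[The guard pseudo-code above can be exercised today by emulating the proposed version field with a ``dict`` subclass. This is purely illustrative — the PEP keeps ``ma_version`` in C, invisible from Python, and the subclass below only overrides the operations needed for the demo:]

```python
class VersionedDict(dict):
    """Emulate the proposed per-dict version counter in pure Python."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.version = len(self)  # one increment per initial insertion

    def __setitem__(self, key, value):
        # Mirror the PEP: only bump if the key is new or the value
        # differs by identity.
        if key not in self or self[key] is not value:
            self.version += 1
        super().__setitem__(key, value)

    def __delitem__(self, key):
        super().__delitem__(key)
        self.version += 1

UNSET = object()

class GuardDictKey:
    def __init__(self, d, key):
        self.dict = d
        self.key = key
        self.value = d.get(key, UNSET)
        self.version = d.version

    def check(self):
        if self.dict.version == self.version:
            return True          # fast path: no lookup needed
        value = self.dict.get(self.key, UNSET)
        if value is self.value:  # another key changed; re-cache version
            self.version = self.dict.version
            return True
        return False             # the watched key changed

ns = VersionedDict(x=1)
guard = GuardDictKey(ns, "x")
assert guard.check()
ns["y"] = 2          # unrelated change: slow path once, guard still holds
assert guard.check()
ns["x"] = 99         # watched key changed
assert not guard.check()
```

[Note the fast path never touches the hash table at all; that single integer comparison is where the PEP's claimed speedup comes from.]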
> > Unladen Swallow > --------------- > > Even if the dictionary version was not explicitly mentioned, optimizing > globals and builtins lookup was part of the Unladen Swallow plan: > "Implement one of the several proposed schemes for speeding lookups of > globals and builtins." Source: `Unladen Swallow ProjectPlan > `_. > > Unladen Swallow is a fork of CPython 2.6.1 adding a JIT compiler > implemented with LLVM. The project stopped in 2011: `Unladen Swallow > Retrospective > `_. > > > Changes > ======= > > Add a ``ma_version`` field to the ``PyDictObject`` structure with the C > type ``PY_INT64_T``, a 64-bit unsigned integer. New empty dictionaries are > initialized to version ``0``. The version is incremented at each change:
>
> * ``clear()`` if the dict was non-empty
> * ``pop(key)`` if the key exists
> * ``popitem()`` if the dict is non-empty
> * ``setdefault(key, value)`` if the `key` does not exist
> * ``__delitem__(key)`` if the key exists
> * ``__setitem__(key, value)`` if the `key` doesn't exist or if the value is different
> * ``update(...)`` if new values are different from existing values (the version can be incremented multiple times)
>
> Example using a hypothetical ``dict_get_version(dict)`` function::
>
>     >>> d = {}
>     >>> dict_get_version(d)
>     0
>     >>> d['key'] = 'value'
>     >>> dict_get_version(d)
>     1
>     >>> d['key'] = 'new value'
>     >>> dict_get_version(d)
>     2
>     >>> del d['key']
>     >>> dict_get_version(d)
>     3
>
> If a dictionary is created with items, the version is also incremented at each dictionary insertion. Example::
>
>     >>> d = dict(x=7, y=33)
>     >>> dict_get_version(d)
>     2
>
> The version is not incremented if an existing key is set to the same value. For efficiency, values are compared by their identity: ``new_value is old_value``, not by their content: ``new_value == old_value``. Example::
>
>     >>> d = {}
>     >>> value = object()
>     >>> d['key'] = value
>     >>> dict_get_version(d)
>     1
>     >>> d['key'] = value
>     >>> dict_get_version(d)
>     1
>
> ..
note:: > CPython uses some singletons, like integers in the range [-5; 256], > the empty tuple, empty strings, Unicode strings of a single character in > the range [U+0000; U+00FF], etc. When a key is set twice to the same > singleton, the version is not modified. > > > Implementation > ============== > > The `issue #26058: PEP 509: Add ma_version to PyDictObject > `_ contains a patch implementing > this PEP. > > On pybench and timeit microbenchmarks, the patch does not seem to add > any overhead on dictionary operations. > > When the version does not change, ``PyDict_GetItem()`` takes 14.8 ns for > a dictionary lookup, whereas a guard check only takes 3.8 ns. Moreover, > a guard can watch for multiple keys. For example, for an optimization > using 10 global variables in a function, 10 dictionary lookups cost 148 > ns, whereas the guard still only costs 3.8 ns when the version does not > change (39x as fast). > > > Integer overflow > ================ > > The implementation uses the C unsigned integer type ``PY_INT64_T`` to > store the version, a 64-bit unsigned integer. The C code uses > ``version++``. On integer overflow, the version wraps to ``0`` (and > then continues to be incremented) according to the C standard. > > After an integer overflow, a guard can succeed even though the watched > dictionary key was modified. The bug occurs if the dictionary is > modified at least ``2 ** 64`` times between two checks of the guard and > if the new version (the theoretical value with no integer overflow) is equal > to the old version modulo ``2 ** 64``. > > If a dictionary is modified each nanosecond, an overflow takes longer > than 584 years. Using a 32-bit version, the overflow occurs after only 4 > seconds. That's why a 64-bit unsigned type is also used on 32-bit > systems. A dictionary lookup at the C level takes 14.8 ns. > > A risk of a bug every 584 years is acceptable.
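[The overflow figures quoted above are easy to verify with back-of-the-envelope arithmetic: at one modification per nanosecond, a 64-bit counter wraps only after several centuries, while a 32-bit counter wraps within seconds:]

```python
NS_PER_SECOND = 10**9
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year

# Time until the version counter wraps at one change per nanosecond.
wrap_64_years = 2**64 / NS_PER_SECOND / SECONDS_PER_YEAR  # a bit over 584
wrap_32_seconds = 2**32 / NS_PER_SECOND                   # roughly 4.3

print(f"64-bit wrap: ~{wrap_64_years:.1f} years")
print(f"32-bit wrap: ~{wrap_32_seconds:.2f} seconds")
```

[This is the whole argument for paying the 64-bit cost even on 32-bit platforms: a 4-second wrap window is realistically hittable, a 584-year one is not.]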
> > Alternatives > ============ > > Expose the version at Python level as a read-only __version__ property > ---------------------------------------------------------------------- > > The first version of the PEP proposed to expose the dictionary version > as a read-only ``__version__`` property at the Python level, and also to add > the property to ``collections.UserDict`` (since this type must mimic > the ``dict`` API). > > There are multiple issues: > > * To be consistent and avoid bad surprises, the version must be added to > all mapping types. Implementing a new mapping type would require extra > work for no benefit, since the version is only required on the > ``dict`` type in practice. > * All Python implementations must implement this new property; it gives > more work to other implementations, whereas they may not use the > dictionary version at all. > * The ``__version__`` can wrap on integer overflow. It is error > prone: using ``dict.__version__ <= guard_version`` is wrong; > ``dict.__version__ == guard_version`` must be used instead to reduce > the risk of a bug on integer overflow (even if an integer overflow is > unlikely in practice). > * Exposing the dictionary version at the Python level can lead to > false assumptions about performance. Checking ``dict.__version__`` at > the Python level is not faster than a dictionary lookup. A dictionary > lookup has a cost of 48.7 ns and checking a guard has a cost of 47.5 > ns; the difference is only 1.2 ns (3%)::
>
>     $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd["33"] == 33'
>     10000000 loops, best of 3: 0.0487 usec per loop
>     $ ./python -m timeit -s 'd = {str(i):i for i in range(100)}' 'd.__version__ == 100'
>     10000000 loops, best of 3: 0.0475 usec per loop
>
> Bikeshedding on the property name: > > * ``__cache_token__``: name proposed by Nick Coghlan, name coming from > `abc.get_cache_token() > `_.
> * ``__version__`` > * ``__timestamp__`` > > > Add a version to each dict entry > -------------------------------- > > A single version per dictionary requires keeping a strong reference to > the value, which can keep the value alive longer than expected. If we also add > a version per dictionary entry, the guard can store only the entry > version to avoid the strong reference to the value (only strong > references to the dictionary and to the key are needed). > > Changes: add a ``me_version`` field to the ``PyDictKeyEntry`` structure; > the field has the C type ``PY_INT64_T``. When a key is created or > modified, the entry version is set to the dictionary version, which is > incremented at any change (create, modify, delete). > > Pseudo-code of a fast guard to check if a dictionary key was modified, > using hypothetical ``dict_get_version(dict)`` and > ``dict_get_entry_version(dict, key)`` functions::
>
>     class GuardDictKey:
>         def __init__(self, dict, key):
>             self.dict = dict
>             self.key = key
>             self.dict_version = dict_get_version(dict)
>             self.entry_version = dict_get_entry_version(dict, key)
>
>         def check(self):
>             """Return True if the dictionary entry did not change."""
>
>             # read the version field of the dict structure
>             dict_version = dict_get_version(self.dict)
>             if dict_version == self.dict_version:
>                 # Fast-path: dictionary lookup avoided
>                 return True
>
>             # lookup in the dictionary
>             entry_version = dict_get_entry_version(self.dict, self.key)
>             if entry_version == self.entry_version:
>                 # another key was modified:
>                 # cache the new dictionary version
>                 self.dict_version = dict_version
>                 return True
>
>             # the key was modified
>             return False
>
> The main drawback of this option is the impact on the memory footprint. > It increases the size of each dictionary entry, so the overhead depends > on the number of buckets (dictionary entries, used or unused yet). For > example, it increases the size of each dictionary entry by 8 bytes on > a 64-bit system.
> > In Python, the memory footprint matters and the trend is to reduce it. > Examples: > > * `PEP 393 -- Flexible String Representation > `_ > * `PEP 412 -- Key-Sharing Dictionary > `_ > > > Add a new dict subtype > ---------------------- > > Add a new ``verdict`` type, a subtype of ``dict``. When guards are needed, > use the ``verdict`` type for namespaces (module namespace, type namespace, > instance namespace, etc.) instead of ``dict``. > > Leave the ``dict`` type unchanged to not add any overhead (memory > footprint) when guards are not needed. > > Technical issue: a lot of C code in the wild, including the CPython core, > expects the exact ``dict`` type. Issues: > > * ``exec()`` requires a ``dict`` for globals and locals. A lot of code > uses ``globals={}``. It is not possible to cast the ``dict`` to a > ``dict`` subtype because the caller expects the ``globals`` parameter > to be modified (``dict`` is mutable). > * Functions directly call ``PyDict_xxx()`` functions instead of calling > ``PyObject_xxx()`` if the object is a ``dict`` subtype. > * The ``PyDict_CheckExact()`` check fails on a ``dict`` subtype, whereas some > functions require the exact ``dict`` type. > * ``Python/ceval.c`` does not completely support dict subtypes for > namespaces. > > > The ``exec()`` issue is a blocker issue. > > Other issues: > > * The garbage collector has special code to "untrack" ``dict`` > instances. If a ``dict`` subtype is used for namespaces, the garbage > collector can be unable to break some reference cycles. > * Some functions have a fast-path for ``dict`` which would not be taken > for ``dict`` subtypes, and so it would make Python a little bit > slower. > > > Prior Art > ========= > > Method cache and type version tag > --------------------------------- > > In 2007, Armin Rigo wrote a patch to implement a cache of methods. It > was merged into Python 2.6.
The patch adds a "type attribute cache > version tag" (``tp_version_tag``) and a "valid version tag" flag to > types (the ``PyTypeObject`` structure). > > The type version tag is not available at the Python level. > > The version tag has the C type ``unsigned int``. The cache is a global > hash table of 4096 entries, shared by all types. The cache is global to > "make it fast, have a deterministic and low memory footprint, and be > easy to invalidate". Each cache entry has a version tag. A global > version tag is used to create the next version tag, it also has the C > type ``unsigned int``. > > By default, a type has its "valid version tag" flag cleared to indicate > that the version tag is invalid. When the first method of the type is > cached, the version tag and the "valid version tag" flag are set. When a > type is modified, the "valid version tag" flag of the type and its > subclasses is cleared. Later, when a cache entry of these types is used, > the entry is removed because its version tag is outdated. > > On integer overflow, the whole cache is cleared and the global version > tag is reset to ``0``. > > See `Method cache (issue #1685986) > `_ and `Armin's method cache > optimization updated for Python 2.6 (issue #1700288) > `_. > > > Globals / builtins cache > ------------------------ > > In 2010, Antoine Pitrou proposed a `Globals / builtins cache (issue > #10401) `_ which adds a private > ``ma_version`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``Py_ssize_t``. > > The patch adds a "global and builtin cache" to functions and frames, and > changes ``LOAD_GLOBAL`` and ``STORE_GLOBAL`` instructions to use the > cache. > > The change on the ``PyDictObject`` structure is very similar to this > PEP. > > > Cached globals+builtins lookup > ------------------------------ > > In 2006, Andrea Griffini proposed a patch implementing a `Cached > globals+builtins lookup optimization > `_. 
The patch adds a private > ``timestamp`` field to the ``PyDictObject`` structure (``dict`` type), > the field has the C type ``size_t``. > > Thread on python-dev: `About dictionary lookup caching > `_. > > > Guard against changing dict during iteration > -------------------------------------------- > > In 2013, Serhiy Storchaka proposed `Guard against changing dict during > iteration (issue #19332) `_ which > adds a ``ma_count`` field to the ``PyDictObject`` structure (``dict`` > type), the field has the C type ``size_t``. This field is incremented > when the dictionary is modified, and so is very similar to the proposed > dictionary version. > > Sadly, the dictionary version proposed in this PEP doesn't help to > detect dictionary mutation. The dictionary version changes when values > are replaced, whereas modifying dictionary values while iterating on > dictionary keys is legit in Python. > > > PySizer > ------- > > `PySizer `_: a memory profiler for Python, > Google Summer of Code 2005 project by Nick Smallbone. > > This project has a patch for CPython 2.4 which adds ``key_time`` and > ``value_time`` fields to dictionary entries. It uses a global > process-wide counter for dictionaries, incremented each time that a > dictionary is modified. The times are used to decide when child objects > first appeared in their parent objects. > > > Discussion > ========== > > Thread on the python-ideas mailing list: `RFC: PEP: Add dict.__version__ > `_. > > > Copyright > ========= > > This document has been placed in the public domain. From barry at python.org Mon Jan 18 18:04:10 2016 From: barry at python.org (Barry Warsaw) Date: Mon, 18 Jan 2016 18:04:10 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: <20160118180410.0da0bc8f@limelight.wooz.org> On Jan 18, 2016, at 11:43 PM, Victor Stinner wrote: >Is someone opposed to this PEP 509? 
>
>The main complaint was the change on the public Python API, but the PEP
>doesn't change the Python API anymore.
>
>I'm not aware of any remaining issue on this PEP.

Have you done any performance analysis for a wide range of Python
applications?

Did you address my suggestion on python-ideas to make the new C API
optionally compiled in?  I still think this is maintenance and potential
performance overhead we don't want to commit to long term unless it
enables significant optimization.  Since you probably can't prove that
without some experimentation, this API should be provisional.

Cheers,
-Barry

From brett at python.org  Mon Jan 18 18:20:21 2016
From: brett at python.org (Brett Cannon)
Date: Mon, 18 Jan 2016 23:20:21 +0000
Subject: [Python-Dev] Update PEP 7 to require curly braces in C
In-Reply-To: 
References: 
Message-ID: 

On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote:

> While doing a review of http://bugs.python.org/review/26129/ I asked to
> have curly braces put around all `if` statement bodies. Serhiy pointed out
> that PEP 7 says curly braces are optional:
> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change
> that.
>
> My argument is to require them to prevent bugs like the one Apple made
> with OpenSSL about two years ago:
> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the
> curly braces is purely an aesthetic thing while leaving them out can lead
> to actual bugs.
>
> Anyone object if I update PEP 7 to remove the optionality of curly braces
> in PEP 7?
>

Currently this thread stands at:

+1
  Brett
  Ethan
  Robert
  Georg
  Nick
  Maciej Szulik
+0
  Guido
-0
  Serhiy
  MAL
-1
  Victor (maybe; didn't specifically vote)
  Larry
  Stefan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From brett at python.org  Mon Jan 18 18:30:06 2016
From: brett at python.org (Brett Cannon)
Date: Mon, 18 Jan 2016 23:30:06 +0000
Subject: [Python-Dev] Update PEP 7 to require curly braces in C
In-Reply-To: <569CA960.4030401@hastings.org>
References: <569CA960.4030401@hastings.org>
Message-ID: 

On Mon, 18 Jan 2016 at 00:59 Larry Hastings wrote:

> On 01/17/2016 11:10 AM, Brett Cannon wrote:
>
> Anyone object if I update PEP 7 to remove the optionality of curly braces
> in PEP 7?
>
> I'm -1.  I don't like being forced to add the curly braces when the code
> is perfectly clear without them.
>

I'm going to assume you mean it is clearer without them, else I don't hear
an argument against beyond "I don't wanna". For me, I don't see how::

  int x = do_something();
  if (x != 10)
      return NULL;
  do_some_more();

is any clearer or more readable than::

  int x = do_something();
  if (x != 10) {
      return NULL;
  }
  do_some_more();

> If this was a frequent problem then I'd put up with it, but I can't recall
> ever making this particular mistake myself or seeing it in CPython source.
> It seems to me like a fix for a problem we don't have.
>

I personally have come close to screwing up like this, but I caught it
before I put up a patch. I honestly also prefer the consistency of just
always using braces rather than the sometimes rule we have. It isn't like C
is a pretty, safe language to begin with, so I would rather be consistent
and avoid potential errors.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tjreedy at udel.edu  Mon Jan 18 19:18:08 2016
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 18 Jan 2016 19:18:08 -0500
Subject: [Python-Dev] Update PEP 7 to require curly braces in C
In-Reply-To: 
References: 
Message-ID: 

On 1/18/2016 6:20 PM, Brett Cannon wrote:
>
> On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote:
>
>     While doing a review of http://bugs.python.org/review/26129/ I asked
>     to have curly braces put around all `if` statement bodies. Serhiy
>     pointed out that PEP 7 says curly braces are optional:
>     https://www.python.org/dev/peps/pep-0007/#id5. I would like to
>     change that.
>
>     My argument is to require them to prevent bugs like the one Apple
>     made with OpenSSL about two years ago:
>     https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping
>     the curly braces is purely an aesthetic thing while leaving them out
>     can lead to actual bugs.
>
>     Anyone object if I update PEP 7 to remove the optionality of curly
>     braces in PEP 7?
>
> Currently this thread stands at:
>
> +1
>     Brett
>     Ethan
>     Robert
>     Georg
>     Nick
>     Maciej Szulik
> +0
>     Guido
> -0
>     Serhiy
>     MAL
> -1
>     Victor (maybe; didn't specifically vote)
>     Larry
>     Stefan

Though I don't write C anymore, I occasionally read our C sources.  I
dislike mixed bracketing in a multiple clause if/else statement, and
would strongly recommend against that.  On the other hand, to my
Python-trained eye, brackets for one line clauses are just noise.  +-0.

If coverity's scan does not flag the sort of misleading bug bait
formatting that at least partly prompted this thread

if (a):
    b;
    c;

then I think we should find or write something that does and run it
over existing code as well as patches.
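[Terry's suggested tool can be sketched as a naive, text-level scan. Real checkers (Coverity, or GCC's later ``-Wmisleading-indentation``) work on the parse tree; the function below is a toy and its heuristic — "an unbraced ``if`` followed by two or more deeper-indented lines" — is an assumption, not how any shipping tool works.]

```python
def find_misleading_indentation(source):
    """Report unbraced C ``if`` headers followed by more than one line
    indented as if it were part of the body.

    Only the first indented line is actually controlled by an unbraced
    ``if``, so a run of two or more is "goto fail"-style bug bait.
    Returns the 1-based line numbers of the suspicious headers.
    """
    suspicious = []
    lines = source.splitlines()
    for i, line in enumerate(lines):
        stripped = line.strip()
        # An "if (...)" header with no opening brace on the same line.
        if stripped.startswith("if (") and not stripped.endswith("{"):
            header_indent = len(line) - len(line.lstrip())
            run = 0
            for later in lines[i + 1:]:
                if not later.strip():
                    break
                indent = len(later) - len(later.lstrip())
                if indent <= header_indent:
                    break
                run += 1
            if run > 1:
                suspicious.append(i + 1)
    return suspicious
```

Run over the Apple "goto fail" pattern, it flags the header line; a single indented body line passes clean.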
-- Terry Jan Reedy From python at mrabarnett.plus.com Mon Jan 18 20:04:03 2016 From: python at mrabarnett.plus.com (MRAB) Date: Tue, 19 Jan 2016 01:04:03 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: Message-ID: On 2016-01-19 00:18:08, "Terry Reedy" wrote: >On 1/18/2016 6:20 PM, Brett Cannon wrote: >> >>On Sun, 17 Jan 2016 at 11:10 Brett Cannon >> wrote: >> >> While doing a review of http://bugs.python.org/review/26129/ I >>asked >> to have curly braces put around all `if` statement bodies. Serhiy >> pointed out that PEP 7 says curly braces are optional: >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to >> change that. >> >> My argument is to require them to prevent bugs like the one Apple >> made with OpenSSL about two years ago: >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping >> the curly braces is purely an aesthetic thing while leaving them >>out >> can lead to actual bugs. >> >> Anyone object if I update PEP 7 to remove the optionality of curly >> braces in PEP 7? >> >> >>Currently this thread stands at: >> >>+1 >> Brett >> Ethan >> Robert >> Georg >> Nick >> Maciej Szulik >>+0 >> Guido >>-0 >> Serhiy >> MAL >>-1 >> Victor (maybe; didn't specifically vote) >> Larry >> Stefan > >Though I don't write C anymore, I occasionally read our C sources. I >dislike mixed bracketing in a multiple clause if/else statement, and >would strongly recommend against that. On the other hand, to my >Python-trained eye, brackets for one line clauses are just noise. +-0. > >If coverity's scan does not flag the sort of misleading bug bait >formatting that at least partly prompted this thread > >if (a): > b; > c; > >then I think we should find or write something that does and run it >over existing code as well as patches. > I agree. After all, how hard could it be? 
:-) From greg.ewing at canterbury.ac.nz Mon Jan 18 23:27:21 2016 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 19 Jan 2016 17:27:21 +1300 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569CA960.4030401@hastings.org> Message-ID: <569DBB29.3080701@canterbury.ac.nz> Brett Cannon wrote: > For me, I don't see how:: > > if (x != 10) > return NULL; > do_some_more(); > > is any clearer or more readable than:: > > if (x != 10) { > return NULL; > } > do_some_more(); Maybe not for that piece of code on its own, but the version with braces takes up one more line. Put a few of those together, and you can't fit as much code on the screen. If it makes the difference between being able to see e.g. the whole of a loop at once vs. having to scroll up and down, it could make the code as a whole harder to read. -- Greg From tritium-list at sdamon.com Mon Jan 18 23:40:30 2016 From: tritium-list at sdamon.com (Alexander Walters) Date: Mon, 18 Jan 2016 23:40:30 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DBB29.3080701@canterbury.ac.nz> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> Message-ID: <569DBE3E.5020509@sdamon.com> On 1/18/2016 23:27, Greg Ewing wrote: > Brett Cannon wrote: >> For me, I don't see how:: >> >> if (x != 10) >> return NULL; >> do_some_more(); >> >> is any clearer or more readable than:: >> >> if (x != 10) { >> return NULL; >> } >> do_some_more(); > > Maybe not for that piece of code on its own, but the version > with braces takes up one more line. Put a few of those together, > and you can't fit as much code on the screen. If it makes the > difference between being able to see e.g. the whole of a loop > at once vs. having to scroll up and down, it could make the > code as a whole harder to read. > When someone trying to make this argument in #python for Python code... the response is newlines are free. 
Almost this entire thread has me confused - the arguments against are kind of hypocritical; You are developing a language with a built in design ethic, and ignoring those ethics while building the implementation itself. Newlines are free, use them Explicit > Implicit - Explicitly scope everything. I am not a core developer, but I just kind of feel its hypocritical to oppose always using brackets for the development of *python* From ncoghlan at gmail.com Tue Jan 19 00:09:07 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 19 Jan 2016 15:09:07 +1000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DBE3E.5020509@sdamon.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> Message-ID: On 19 January 2016 at 14:40, Alexander Walters wrote: > > > On 1/18/2016 23:27, Greg Ewing wrote: >> >> Brett Cannon wrote: >>> >>> For me, I don't see how:: >>> >>> if (x != 10) >>> return NULL; >>> do_some_more(); >>> >>> is any clearer or more readable than:: >>> >>> if (x != 10) { >>> return NULL; >>> } >>> do_some_more(); >> >> >> Maybe not for that piece of code on its own, but the version >> with braces takes up one more line. Put a few of those together, >> and you can't fit as much code on the screen. If it makes the >> difference between being able to see e.g. the whole of a loop >> at once vs. having to scroll up and down, it could make the >> code as a whole harder to read. >> > When someone trying to make this argument in #python for Python code... the > response is newlines are free. Almost this entire thread has me confused - > the arguments against are kind of hypocritical; You are developing a > language with a built in design ethic, and ignoring those ethics while > building the implementation itself. 
There are two conflicting code aesthetics at work here, and the relevant one for the folks that prefer to avoid braces in C where they can is: >>> from __future__ import braces File "", line 1 SyntaxError: not a chance So if we bring *Python* into the comparison, we can see it splits the difference between the two C variants by omitting the closing brace and replacing the opening brace with ":": x = do_something() if (x != 10): return None do_some_more() The additional "cost" of mandatory braces in C is thus more lines containing only a single "}", while the benefit is simply not having to think about the braceless variant as a possible alternative spelling. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From greg.ewing at canterbury.ac.nz Tue Jan 19 00:09:11 2016 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 19 Jan 2016 18:09:11 +1300 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DBE3E.5020509@sdamon.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> Message-ID: <569DC4F7.7050406@canterbury.ac.nz> Alexander Walters wrote: > When someone trying to make this argument in #python for Python code... > the response is newlines are free. Well, I disagree. I very rarely put blank lines in a function in any language, because it makes it hard to scan the code visually and pick out the beginnings of functions. So while they may help readability locally, they hurt it globally. If I find myself needing to put blank lines in a function in order to make it readable, I take it as a sign that the function ought to be split up into smaller functions. 
-- Greg From greg.ewing at canterbury.ac.nz Tue Jan 19 00:16:37 2016 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Tue, 19 Jan 2016 18:16:37 +1300 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DBE3E.5020509@sdamon.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> Message-ID: <569DC6B5.303@canterbury.ac.nz> Alexander Walters wrote: > I am not a core developer, but I just kind of feel its hypocritical to > oppose always using brackets for the development of *python* If we were being *really* pythonic, we would write all the C code without any braces at all, and feed it through a filter that adds them based on indentation. End of argument. :-) -- Greg From tritium-list at sdamon.com Tue Jan 19 01:05:54 2016 From: tritium-list at sdamon.com (Alexander Walters) Date: Tue, 19 Jan 2016 01:05:54 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DC4F7.7050406@canterbury.ac.nz> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> <569DC4F7.7050406@canterbury.ac.nz> Message-ID: <569DD242.1080209@sdamon.com> On 1/19/2016 00:09, Greg Ewing wrote: > Alexander Walters wrote: >> When someone trying to make this argument in #python for Python >> code... the response is newlines are free. > > Well, I disagree. I very rarely put blank lines in a function > in any language, because it makes it hard to scan the code > visually and pick out the beginnings of functions. So while > they may help readability locally, they hurt it globally. I find the opposite to be true when I work in C > If I find myself needing to put blank lines in a function in > order to make it readable, I take it as a sign that the > function ought to be split up into smaller functions. > .... Well, maybe I should change my position based on this. Any excuse to break code out into more functions... 
is usually the right idea. From v+python at g.nevcal.com Tue Jan 19 01:04:55 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Mon, 18 Jan 2016 22:04:55 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DC6B5.303@canterbury.ac.nz> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> <569DC6B5.303@canterbury.ac.nz> Message-ID: <569DD207.3070603@g.nevcal.com> On 1/18/2016 9:16 PM, Greg Ewing wrote: > Alexander Walters wrote: >> I am not a core developer, but I just kind of feel its hypocritical >> to oppose always using brackets for the development of *python* > > If we were being *really* pythonic, we would write all > the C code without any braces at all, and feed it through > a filter that adds them based on indentation. End of > argument. :-) > No argument, an extension: If we were being *really* pythonic, we would re-implement CPython in Python, with a Python-to-C option (Cython?), and then we wouldn't need to worry about using braces for block delimiters. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From tritium-list at sdamon.com Tue Jan 19 01:09:05 2016 From: tritium-list at sdamon.com (Alexander Walters) Date: Tue, 19 Jan 2016 01:09:05 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DD207.3070603@g.nevcal.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> <569DC6B5.303@canterbury.ac.nz> <569DD207.3070603@g.nevcal.com> Message-ID: <569DD301.8010509@sdamon.com> On 1/19/2016 01:04, Glenn Linderman wrote: > On 1/18/2016 9:16 PM, Greg Ewing wrote: >> Alexander Walters wrote: >>> I am not a core developer, but I just kind of feel its hypocritical >>> to oppose always using brackets for the development of *python* >> >> If we were being *really* pythonic, we would write all >> the C code without any braces at all, and feed it through >> a filter that adds them based on indentation. End of >> argument. :-) >> > > No argument, an extension: If we were being *really* pythonic, we > would re-implement CPython in Python, with a Python-to-C option > (Cython?), and then we wouldn't need to worry about using braces for > block delimiters. So what I can take away from this is pypy is the pythonic python. -------------- next part -------------- An HTML attachment was scrubbed... URL: From larry at hastings.org Tue Jan 19 02:21:09 2016 From: larry at hastings.org (Larry Hastings) Date: Mon, 18 Jan 2016 23:21:09 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DBE3E.5020509@sdamon.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> Message-ID: <569DE3E5.4070906@hastings.org> On 01/18/2016 08:40 PM, Alexander Walters wrote: > I am not a core developer, but I just kind of feel its hypocritical to > oppose always using brackets for the development of *python* CPython isn't written in Python, it's written in C. 
So we use C idioms, which naturally are different from Python idioms. //arry/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From tritium-list at sdamon.com Tue Jan 19 02:25:07 2016 From: tritium-list at sdamon.com (Alexander Walters) Date: Tue, 19 Jan 2016 02:25:07 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DE3E5.4070906@hastings.org> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> <569DE3E5.4070906@hastings.org> Message-ID: <569DE4D3.7090607@sdamon.com> On 1/19/2016 02:21, Larry Hastings wrote: > > On 01/18/2016 08:40 PM, Alexander Walters wrote: >> I am not a core developer, but I just kind of feel its hypocritical >> to oppose always using brackets for the development of *python* > > CPython isn't written in Python, it's written in C. So we use C > idioms, which naturally are different from Python idioms. > Yes, and always using brackets is an accepted C idiom (one of two, the other being only use brackets when you must). Always using them is more in line with the spirit of python. > //arry/ > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From g.brandl at gmx.net Tue Jan 19 02:32:06 2016 From: g.brandl at gmx.net (Georg Brandl) Date: Tue, 19 Jan 2016 08:32:06 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: On 01/19/2016 01:18 AM, Terry Reedy wrote: > Though I don't write C anymore, I occasionally read our C sources. I > dislike mixed bracketing in a multiple clause if/else statement, and > would strongly recommend against that. 
On the other hand, to my > Python-trained eye, brackets for one line clauses are just noise. +-0. > > If coverity's scan does not flag the sort of misleading bug bait > formatting that at least partly prompted this thread > > if (a): > b; > c; > > then I think we should find or write something that does and run it over > existing code as well as patches. I don't know if static checkers care about whitespace and indentation in C; it might be a very obvious thing to do for Python programmers, but maybe not for C static checker developers :) And probably with good reason, since whitespace isn't a consideration except for nicely readable code, which many people (not talking about CPython here) apparently don't care about, you'd have tons of spurious checker messages. In many cases (especially error handling ones like "goto fail"), I expect the checker to flag the error anyway, but for semantic reasons, not because of whitespace. Georg From pludemann at google.com Tue Jan 19 02:34:00 2016 From: pludemann at google.com (Peter Ludemann) Date: Mon, 18 Jan 2016 23:34:00 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DE4D3.7090607@sdamon.com> References: <569CA960.4030401@hastings.org> <569DBB29.3080701@canterbury.ac.nz> <569DBE3E.5020509@sdamon.com> <569DE3E5.4070906@hastings.org> <569DE4D3.7090607@sdamon.com> Message-ID: On 18 January 2016 at 23:25, Alexander Walters wrote: > > > On 1/19/2016 02:21, Larry Hastings wrote: > > > On 01/18/2016 08:40 PM, Alexander Walters wrote: > > I am not a core developer, but I just kind of feel its hypocritical to > oppose always using brackets for the development of *python* > > > CPython isn't written in Python, it's written in C. So we use C idioms, > which naturally are different from Python idioms. > > > Yes, and always using brackets is an accepted C idiom (one of two, the > other being only use brackets when you must). Always using them is more > in line with the spirit of python. 
>

And in the spirit of safety, which I think is Brett's concern. Yes,
there are tools that could catch the Apple bug, by catching incorrect
indentation. But if the rule is to *always* use braces, that just feels
safer. Even if it offends some people's ability to read everything at
one glance (my suggestion: use a smaller font or a larger screen; or
refactor the code so it isn't as big a chunk).

There's a fairly exhaustive list of styles here:
https://en.wikipedia.org/wiki/Indent_style
I'm sure that some of these have not been advocated in this thread.

> */arry*
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/tritium-list%40sdamon.com
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe: https://mail.python.org/mailman/options/python-dev/pludemann%40google.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From victor.stinner at gmail.com  Tue Jan 19 03:03:07 2016
From: victor.stinner at gmail.com (Victor Stinner)
Date: Tue, 19 Jan 2016 09:03:07 +0100
Subject: [Python-Dev] PEP 510: Specialize functions with guards
In-Reply-To: 
References: 
Message-ID: 

Oh, I think that the PEP 510 lacks two functions to:

* remove a specific specialized code
* remove all specialized code

Victor

From mal at egenix.com  Tue Jan 19 03:48:35 2016
From: mal at egenix.com (M.-A.
Lemburg)
Date: Tue, 19 Jan 2016 09:48:35 +0100
Subject: [Python-Dev] Update PEP 7 to require curly braces in C
In-Reply-To: 
References: 
Message-ID: <569DF863.3000109@egenix.com>

On 19.01.2016 00:20, Brett Cannon wrote:
> On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote:
>
>> While doing a review of http://bugs.python.org/review/26129/ I asked to
>> have curly braces put around all `if` statement bodies. Serhiy pointed out
>> that PEP 7 says curly braces are optional:
>> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change
>> that.
>>
>> My argument is to require them to prevent bugs like the one Apple made
>> with OpenSSL about two years ago:
>> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the
>> curly braces is purely an aesthetic thing while leaving them out can lead
>> to actual bugs.
>>
>> Anyone object if I update PEP 7 to remove the optionality of curly braces
>> in PEP 7?
>
> Currently this thread stands at:

Make that:

> +1
>    Brett
>    Ethan
>    Robert
>    Georg
>    Nick
>    Maciej Szulik
> +0
>    Guido
> -0
>    Serhiy
> -1
MAL
>    Victor (maybe; didn't specifically vote)
>    Larry
>    Stefan

There are plenty of other cases where typos can ruin the flow
of your code in C; the discussed case is not a very common one.

I find the whole discussion a bit strange: In Python we're
advocating not having to use curly braces, because whitespace
already provides the needed semantics for us, yet you are
now advocating that without adding extra curly braces
we'd be in danger of writing wrong code.

The Apple bug can happen in Python just as well:

if a:
    run_if_true()
else:
    run_if_false()

can turn into (say by hitting a wrong key in the editor):

if a:
    run_if_true()
    run_if_false()

(run_if_false is now run when a is true, and nothing is
done in case a is false)

So what's the correct way to address this ?

It's having a test for both branches (a is true, a is false),
not starting to write e.g.

if a:
    run_if_true()
if not a:
    run_if_false()

to feel more "secure".
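[The "test both branches" remedy can be made concrete with a toy version of the example above; the names ``classify`` and ``test_both_branches`` are invented for illustration and are not code from CPython.]

```python
def classify(a):
    # The condition under test: each branch returns a distinct marker,
    # so a dropped "else" or an indentation slip breaks an assertion.
    if a:
        return "run_if_true"
    else:
        return "run_if_false"


def test_both_branches():
    # One check per branch: this class of bug survives any brace style,
    # but not a test suite that exercises both paths.
    assert classify(True) == "run_if_true"
    assert classify(False) == "run_if_false"
```

Either mutation shown above (merging the branches, or inverting only one path) makes one of the two assertions fail, regardless of how the code is braced or indented.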
Also note that the extra braces won't necessarily help you.
If you remove "else" from:

if (a) {
    run_if_true();
}
else {
    run_if_false();
}

you get exactly the same Apple bug as before, only with more
curly braces.

This all sounds a bit like security theater to me ;-)

I'd say: people who feel better using the extra braces can use
them, while others who don't need them can go ahead as always
... and both groups should write more tests :-)

BTW: There are other things in PEP 7 which should probably be updated
instead, e.g. it still says we should use C89, but we're having more
and more C99 code (mostly new library functions) in CPython.

--
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Experts (#1, Jan 19 2016)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> Python Database Interfaces ...           http://products.egenix.com/
>>> Plone/Zope Database Interfaces ...           http://zope.egenix.com/
________________________________________________________________________

::: We implement business ideas - efficiently in both time and costs :::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/
                      http://www.malemburg.com/

From soltysh at gmail.com  Tue Jan 19 04:00:03 2016
From: soltysh at gmail.com (Maciej Szulik)
Date: Tue, 19 Jan 2016 10:00:03 +0100
Subject: [Python-Dev] Update PEP 7 to require curly braces in C
In-Reply-To: <569DF863.3000109@egenix.com>
References: <569DF863.3000109@egenix.com>
Message-ID: 

To add my additional 0.02$ to the discussion. Quite a while ago my
coworkers and I agreed to use strict rules regarding code formatting.
That idea turned into a formatter we've had committed in the project
itself. This forced everyone on the team to use it and thus producing
exactly "the same looking" code.
I found it very nice, although we've debated over how those rules should be applied for good couple months, that was at one of the previous companies I've worked for. Currently by day I'm working with Go, which go-fmt is by default required for all our projects and I found it extremely handy see [1], [2], [3]. Maciej [1] https://github.com/openshift/origin/ [2] https://github.com/openshift/source-to-image/ [3] https://github.com/kubernetes/kubernetes On Tue, Jan 19, 2016 at 9:48 AM, M.-A. Lemburg wrote: > On 19.01.2016 00:20, Brett Cannon wrote: > > On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote: > > > >> While doing a review of http://bugs.python.org/review/26129/ I asked to > >> have curly braces put around all `if` statement bodies. Serhiy pointed > out > >> that PEP 7 says curly braces are optional: > >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > >> that. > >> > >> My argument is to require them to prevent bugs like the one Apple made > >> with OpenSSL about two years ago: > >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > >> curly braces is purely an aesthetic thing while leaving them out can > lead > >> to actual bugs. > >> > >> Anyone object if I update PEP 7 to remove the optionality of curly > braces > >> in PEP 7? > >> > > > > Currently this thread stands at: > > Make that: > > > +1 > > Brett > > Ethan > > Robert > > Georg > > Nick > > Maciej Szulik > > +0 > > Guido > > -0 > > Serhiy > > -1 > MAL > > Victor (maybe; didn't specifically vote) > > Larry > > Stefan > > There are plenty other cases where typos can ruin the flow > of your code in C; the discussed case is not a very common one. > > I find the whole discussion a bit strange: In Python we're > advocating not having to use curly braces, because whitespace > already provides the needed semantics for us, yet you are > now advocating that without adding extra curly braces > we'd be in danger of writing wrong code. 
> > The Apple bug can happen in Python just as well: > > if a: > run_if_true() > else: > run_if_false() > > can turn into (say by hitting a wrong key in the editor): > > if a: > run_if_true() > run_if_false() > > (run_if_false is now run when a is true, and nothing is > done in case a is false) > > So what's the correct way to address this ? > > It's having a test for both branches (a is true, a is false), > not starting to write e.g. > > if a: > run_if_true() > if not a: > run_if false() > > to feel more "secure". > > Also note that the extra braces won't necessarily help you. > If you remove "else" from: > > if (a) { > run_if_true(); > } > else { > run_if_false(); > } > > you get exactly the same Apply bug as before, only with more > curly braces. > > This all sounds a bit like security theater to me ;-) > > I'd say: people who feel better using the extra braces can use > them, while others who don't need them can go ahead as always > ... and both groups should write more tests :-) > > BTW: There are other things in PEP 7 which should probably be updated > instead, e.g. it still says we should use C89, but we're having more > and more C99 code (mostly new library functions) in CPython. > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Experts (#1, Jan 19 2016) > >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ > >>> Python Database Interfaces ... http://products.egenix.com/ > >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ > ________________________________________________________________________ > > ::: We implement business ideas - efficiently in both time and costs ::: > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. 
Marc-Andre Lemburg
> Registered at Amtsgericht Duesseldorf: HRB 46611
> http://www.egenix.com/company/contact/
> http://www.malemburg.com/
>
> _______________________________________________
> Python-Dev mailing list
> Python-Dev at python.org
> https://mail.python.org/mailman/listinfo/python-dev
> Unsubscribe:
> https://mail.python.org/mailman/options/python-dev/soltysh%40gmail.com
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From victor.stinner at gmail.com  Tue Jan 19 04:42:11 2016
From: victor.stinner at gmail.com (Victor Stinner)
Date: Tue, 19 Jan 2016 10:42:11 +0100
Subject: [Python-Dev] Reference cycle on the module dict (globals())
Message-ID: 

Hi,

While working on my FAT Python optimizer project, I found an annoying
bug in my code. When at least one guard is created with a reference to
the global namespace (globals(), the module dictionary), objects of
the module are no longer removed at exit.

Example:
---
import sys

class MessageAtExit:
    def __del__(self):
        print('__del__ called')

# display a message at exit, when message_at_exit is removed
message_at_exit = MessageAtExit()

# create a reference cycle:
# module -> module dict -> Guard -> module dict
guard = sys.Guard(globals())
---
(the code is adapted from a test of test_gc)

Apply the attached patch to Python 3.6 to get the sys.Guard object. It's
a minimalist object that keeps a strong reference to an object.

I expected the garbage collector to break such a (simple?) reference
cycle.

The Guard object implements a traverse function, but it is never called.

Did I miss something obvious, or is it a known issue of the garbage
collector on modules?

Victor
-------------- next part --------------
A non-text attachment was scrubbed...
Name: guard.patch Type: text/x-patch Size: 3804 bytes Desc: not available URL: From skrah.temporarily at gmail.com Tue Jan 19 05:29:07 2016 From: skrah.temporarily at gmail.com (Stefan Krah) Date: Tue, 19 Jan 2016 10:29:07 +0000 (UTC) Subject: [Python-Dev] Update PEP 7 to require curly braces in C References: <569DF863.3000109@egenix.com> Message-ID: M.-A. Lemburg egenix.com> writes: > > Currently this thread stands at: > > Make that: > > > +1 > > Brett > > Ethan > > Robert > > Georg > > Nick > > Maciej Szulik > > +0 > > Guido > > -0 > > Serhiy > > -1 > MAL > > Victor (maybe; didn't specifically vote) > > Larry > > Stefan I want to clarify my position a bit: Personally, in _decimal/* I've always used braces and I prefer that. But from reading the Python sources in general, I got the impression that the default style at least for one-liner if-statements is to omit braces. So, in memoryview.c, I adapted to that style. I think enforcing braces won't do anything for security. DJB (who had a single exploit found in qmail in 20 years) even uses nested for-loops without braces. IMO secure code can only be achieved by auditing it quietly in a terminal, not being distracted by peripheral things like version control and web interfaces (green merge buttons!) and trying to do formal proofs (if time allows for it). So I would not want to enforce a style if it makes some people unhappy. Stefan Krah From encukou at gmail.com Tue Jan 19 05:39:28 2016 From: encukou at gmail.com (Petr Viktorin) Date: Tue, 19 Jan 2016 11:39:28 +0100 Subject: [Python-Dev] Reference cycle on the module dict (globals()) In-Reply-To: References: Message-ID: <569E1260.6070105@gmail.com> On 01/19/2016 10:42 AM, Victor Stinner wrote: > Hi, > > While working on my FAT Python optimizer project, I found an annoying > bug in my code. When at least one guard is created with a reference to > the global namespace (globals(), the module dictionary), objects of > the module are no more removed at exit. 
> > Example: > --- > import sys > > class MessageAtExit: > def __del__(self): > print('__del__ called') > > # display a message at exit, when message_at_exit is removed > message_at_exit = MessageAtExit() > > # create a reference cycle: > # module -> module dict -> Guard -> module dict > guard = sys.Guard(globals()) > --- > (the code is adapted from a test of test_gc) > > Apply attached patch to Python 3.6 to get the sys.Guard object. It's a > minimalist object to keep a strong reference to an object. > > I expected the garbage collector to break such (simple?) reference cycle. > > The Guard object implements a traverse module, but it is never called. > > Did I miss something obvious, or is it a known issue of the garbage > collector on modules? The default type flags are for objects that don't store references. Since you're creating a mutable container, you need to set Py_TPFLAGS_HAVE_GC. See https://docs.python.org/3/c-api/gcsupport.html for all the details. From victor.stinner at gmail.com Tue Jan 19 07:32:39 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 19 Jan 2016 13:32:39 +0100 Subject: [Python-Dev] _PyThreadState_Current In-Reply-To: References: Message-ID: Since it's a regression introduced in Python 3.5.1, I propose to introduce a new private function _PyThreadState_FastGet() to reintroduce the feature: https://bugs.python.org/issue26154 Using a function instead of using the variable directly hides how atomic variables are implemented and so avoids compiler issues. Victor 2016-01-18 21:18 GMT+01:00 Maciej Fijalkowski : > Hi > > the change between 3.5.0 and 3.5.1 (hiding _PyThreadState_Current and > pyatomic.h) broke vmprof. The problem is that as a profiler, vmprof can > really encounter _PyThreadState_Current being null, while crashing an > interpreter is not ideal in this case. > > Any chance, a) _PyThreadState_Current can be restored to visibility?
> b) can I get a better API to get it in case it can be NULL, but also > in 3.5 (since it works in 3.5.0 and breaks in 3.5.1) > > Cheers, > fijal > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com From victor.stinner at gmail.com Tue Jan 19 07:54:17 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 19 Jan 2016 13:54:17 +0100 Subject: [Python-Dev] Reference cycle on the module dict (globals()) In-Reply-To: <569E1260.6070105@gmail.com> References: <569E1260.6070105@gmail.com> Message-ID: Hi, 2016-01-19 11:39 GMT+01:00 Petr Viktorin : >> Did I miss something obvious, or is it a known issue of the garbage >> collector on modules? > > The default type flags are for objects that don't store references. > Since you're creating a mutable container, you need to set > Py_TPFLAGS_HAVE_GC. See https://docs.python.org/3/c-api/gcsupport.html > for all the details. Ok, so I missed this important flag :-) Thanks! I had to fight against the C API to fix all my bugs, but now it works well: a guard keeps a strong reference to the global namespace, but objects are still destroyed when the module is unloaded! 
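In pure Python the same shape of cycle is collectable out of the box, because the interpreter supplies GC traversal for ordinary classes; the C type only failed because it had not opted in with Py_TPFLAGS_HAVE_GC. A minimal sketch of the collected cycle (the Guard class below is a hypothetical pure-Python stand-in for the patched sys.Guard; Python 3.4+ is assumed so that __del__ in a cycle still runs, per PEP 442):

```python
import gc

class Guard:
    # Hypothetical pure-Python stand-in for the C sys.Guard from the
    # patch: it just keeps a strong reference to a namespace dict.
    def __init__(self, namespace):
        self.namespace = namespace

events = []

class MessageAtExit:
    def __del__(self):
        events.append('__del__ called')

# Same cycle as in the report: namespace dict -> Guard -> namespace dict
namespace = {'message_at_exit': MessageAtExit()}
namespace['guard'] = Guard(namespace)

del namespace   # drop the only external reference; the cycle remains
gc.collect()    # the cycle detector traverses Guard and frees everything

print(events)   # ['__del__ called']
```

A C type gets this behaviour only after it sets Py_TPFLAGS_HAVE_GC, allocates with PyObject_GC_New(), calls PyObject_GC_Track(), and implements tp_traverse (and usually tp_clear), as the gcsupport page linked above describes.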
FYI I updated my PEP 510 patch to track guards with the garbage collector, and my fat project to fix bugs related to GC: - https://bugs.python.org/issue26098 - https://github.com/haypo/fat Victor From dmalcolm at redhat.com Tue Jan 19 10:36:56 2016 From: dmalcolm at redhat.com (David Malcolm) Date: Tue, 19 Jan 2016 10:36:56 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <1453217816.12356.22.camel@surprise> On Mon, 2016-01-18 at 19:18 -0500, Terry Reedy wrote: > On 1/18/2016 6:20 PM, Brett Cannon wrote: > > > > > > On Sun, 17 Jan 2016 at 11:10 Brett Cannon > > wrote: > > > > While doing a review of http://bugs.python.org/review/26129/ I asked > > to have curly braces put around all `if` statement bodies. Serhiy > > pointed out that PEP 7 says curly braces are optional: > > https://www.python.org/dev/peps/pep-0007/#id5. I would like to > > change that. > > > > My argument is to require them to prevent bugs like the one Apple > > made with OpenSSL about two years ago: > > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping > > the curly braces is purely an aesthetic thing while leaving them out > > can lead to actual bugs. > > > > Anyone object if I update PEP 7 to remove the optionality of curly > > braces in PEP 7? > > > > > > Currently this thread stands at: > > > > +1 > > Brett > > Ethan > > Robert > > Georg > > Nick > > Maciej Szulik > > +0 > > Guido > > -0 > > Serhiy > > MAL > > -1 > > Victor (maybe; didn't specifically vote) > > Larry > > Stefan > > Though I don't write C anymore, I occasionally read our C sources. I > dislike mixed bracketing in a multiple clause if/else statement, and > would strongly recommend against that. On the other hand, to my > Python-trained eye, brackets for one line clauses are just noise. +-0. 
> > If coverity's scan does not flag the sort of misleading bug bait > formatting that at least partly prompted this thread > > if (a): > b; > c; > > then I think we should find or write something that does and run it over > existing code as well as patches. FWIW, for the forthcoming gcc 6, I've implemented a new -Wmisleading-indentation warning that catches this. It's currently enabled by -Wall: sslKeyExchange.c: In function 'SSLVerifySignedServerKeyExchange': sslKeyExchange.c:631:8: warning: statement is indented as if it were guarded by... [-Wmisleading-indentation] goto fail; ^~~~ sslKeyExchange.c:629:4: note: ...this 'if' clause, but it is not if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0) ^~ (not that I've had time for core Python development lately, but FWIW in gcc-python-plugin I mandate braces for single-statement clauses). Dave From yselivanov.ml at gmail.com Tue Jan 19 11:55:43 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Tue, 19 Jan 2016 11:55:43 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569E6A8F.60903@gmail.com> On 2016-01-18 6:20 PM, Brett Cannon wrote: > > > On Sun, 17 Jan 2016 at 11:10 Brett Cannon > wrote: > > While doing a review of http://bugs.python.org/review/26129/ I > asked to have curly braces put around all `if` statement bodies. > Serhiy pointed out that PEP 7 says curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to > change that. > > My argument is to require them to prevent bugs like the one Apple > made with OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping > the curly braces is purely an aesthetic thing while leaving them > out can lead to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly > braces in PEP 7? 
> > Currently this thread stands at: > > +1 > Brett > Ethan > Robert > Georg > Nick > Maciej Szulik I guess you forgot to count Barry and me as +1s. Yury From jimjjewett at gmail.com Tue Jan 19 11:56:41 2016 From: jimjjewett at gmail.com (Jim J. Jewett) Date: Tue, 19 Jan 2016 08:56:41 -0800 (PST) Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: Message-ID: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> > On Jan 17, 2016, at 11:10, Brett Cannon wrote: >> While doing a review of http://bugs.python.org/review/26129/ >> ... update PEP 7 to remove the optionality of curly braces On Mon Jan 18 03:39:42 EST 2016, Andrew Barnert pointed out: > There are two ways you could do that. [The one most people are talking about, which often makes an if-clause visually too heavy ... though Alexander Walters pointed out that "Any excuse to break code out into more functions... is usually the right idea."] if (!obj) { return -1; } > Alternatively, it could say something like "braces must not be omitted; > when other C styles would use a braceless one-liner, a one-liner with > braces should be used instead; otherwise, they should be formatted as follows" That "otherwise" gets a bit awkward, but I like the idea. Perhaps "braces must not be omitted, and should normally be formatted as follows. ... Where other C styles would permit a braceless one-liner, the expression and braces may be moved to a single line, as follows: " if (x > 5) { y++; } I think that is clearly better, but it may be *too* lightweight for flow control.
One reason I posted was to point to a specific example already in PEP 7 itself: if (type->tp_dictoffset != 0 && base->tp_dictoffset == 0 && type->tp_dictoffset == b_size && (size_t)t_size == b_size + sizeof(PyObject *)) return 0; /* "Forgive" adding a __dict__ only */ For me, that return is already visually lost, simply because it shares an indentation with the much larger test expression. Would that be better as either: /* "Forgive" adding a __dict__ only */ if (type->tp_dictoffset != 0 && base->tp_dictoffset == 0 && type->tp_dictoffset == b_size && (size_t)t_size == b_size + sizeof(PyObject *)) { return 0; } or: /* "Forgive" adding a __dict__ only */ if (type->tp_dictoffset != 0 && base->tp_dictoffset == 0 && type->tp_dictoffset == b_size && (size_t)t_size == b_size + sizeof(PyObject *)) { return 0; } -jJ -- If there are still threading problems with my replies, please email me with details, so that I can try to resolve them. -jJ From brett at python.org Tue Jan 19 12:07:36 2016 From: brett at python.org (Brett Cannon) Date: Tue, 19 Jan 2016 17:07:36 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569DF863.3000109@egenix.com> References: <569DF863.3000109@egenix.com> Message-ID: On Tue, 19 Jan 2016 at 00:48 M.-A. Lemburg wrote: > On 19.01.2016 00:20, Brett Cannon wrote: > > On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote: > > > >> While doing a review of http://bugs.python.org/review/26129/ I asked to > >> have curly braces put around all `if` statement bodies. Serhiy pointed > out > >> that PEP 7 says curly braces are optional: > >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > >> that. > >> > >> My argument is to require them to prevent bugs like the one Apple made > >> with OpenSSL about two years ago: > >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > >> curly braces is purely an aesthetic thing while leaving them out can > lead > >> to actual bugs. 
> >> > >> Anyone object if I update PEP 7 to remove the optionality of curly > braces > >> in PEP 7? > >> > > > > Currently this thread stands at: > > Make that: > > > +1 > > Brett > > Ethan > > Robert > > Georg > > Nick > > Maciej Szulik > > +0 > > Guido > > -0 > > Serhiy > > -1 > MAL > > Victor (maybe; didn't specifically vote) > > Larry > > Stefan > > There are plenty other cases where typos can ruin the flow > of your code in C; the discussed case is not a very common one. > > I find the whole discussion a bit strange: In Python we're > advocating not having to use curly braces, because whitespace > already provides the needed semantics for us, yet you are > now advocating that without adding extra curly braces > we'd be in danger of writing wrong code. > Yes, because in one language whitespace represents semantics while the other is just formatting; I don't have to question the meaning of when something is indented in Python, but in C I have to question whether the indentation is an accident or the missing braces is the accident. > > The Apple bug can happen in Python just as well: > > if a: > run_if_true() > else: > run_if_false() > > can turn into (say by hitting a wrong key in the editor): > > if a: > run_if_true() > run_if_false() > > (run_if_false is now run when a is true, and nothing is > done in case a is false) > > So what's the correct way to address this ? > > It's having a test for both branches (a is true, a is false), > not starting to write e.g. > > if a: > run_if_true() > if not a: > run_if false() > > to feel more "secure". > OK, but what if the block was instead: if (a) run_if_true(); Py_INCREF(a); ? Unit tests are not going to easily turn up a refcount leak, and the number of times I have needed to email python-dev when Antoine's daily refcount email has found a leak for several days shows people do not watch closely for this. It's one thing when the expressions are obviously mutually exclusive, but that's an ideal case. 
This isn't always about losing an `else` statement as it can come from inserting a new statement and not noticing that the braces are missing. > > Also note that the extra braces won't necessarily help you. > If you remove "else" from: > > if (a) { > run_if_true(); > } > else { > run_if_false(); > } > > you get exactly the same Apply bug as before, only with more > curly braces. > > This all sounds a bit like security theater to me ;-) > That's fine. I also want format consistency by always using braces. > > I'd say: people who feel better using the extra braces can use > them, while others who don't need them can go ahead as always > ... and both groups should write more tests :-) > Well, I'm dropping out of this discussion because I have enough on my plate with the GitHub migration than to keep fighting this. > > BTW: There are other things in PEP 7 which should probably be updated > instead, e.g. it still says we should use C89, but we're having more > and more C99 code (mostly new library functions) in CPython. > That's a whole other discussion (which I support, but I'm not going to lead since I'm burned out at the moment on C-related discussions). -Brett -------------- next part -------------- An HTML attachment was scrubbed... URL: From abarnert at yahoo.com Tue Jan 19 12:27:49 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Tue, 19 Jan 2016 09:27:49 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> Message-ID: <40C5A9D5-2EA9-4DC6-BFFF-FE95CCA5AD3D@yahoo.com> > On Jan 19, 2016, at 08:56, Jim J. 
Jewett wrote: > > On Mon Jan 18 03:39:42 EST 2016, Andrew Barnert pointed out: >> >> Alternatively, it could say something like "braces must not be omitted; >> when other C styles would use a braceless one-liner, a one-liner with >> braces should be used instead; otherwise, they should be formatted as follows" > > That "otherwise" gets a bit awkward, but I like the idea. Perhaps > "braces must not be omitted, and should normally be formatted as > follows. ... Where other C styles would permit a braceless one-liner, > the expression and braces may be moved to a single line, as follows: " > > if (x > 5) { y++ } > > I think that is clearly better, but it may be *too* lightweight for > flow control. > > if (!obj) > { return -1; } > > does work for me, and I think the \n{} may actually be useful for > warning that flow control takes a jump. Your wording is much better than mine. And so is your suggestion. Giving people the option of 1 or 3 lines, but not 2, seems a little silly. And, while I rarely use or see your 2-line version in C, I use it quite a bit in C++ (and related languages like D), so it doesn't look at all weird to me. But I'll leave it up to people who only do C (and Python) and/or who are more familiar with the CPython code base to judge. From ethan at stoneleaf.us Tue Jan 19 13:29:07 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Tue, 19 Jan 2016 10:29:07 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> Message-ID: <569E8073.3010106@stoneleaf.us> On 01/19/2016 08:56 AM, Jim J. Jewett wrote: > That "otherwise" gets a bit awkward, but I like the idea. Perhaps > "braces must not be omitted, and should normally be formatted as > follows. ... 
Where other C styles would permit a braceless one-liner, > the expression and braces may be moved to a single line, as follows: " > > if (x > 5) { y++ } > > I think that is clearly better, but it may be *too* lightweight for > flow control. > > if (!obj) > { return -1; } > > does work for me, and I think the \n{} may actually be useful for > warning that flow control takes a jump. Either of those two, with preference for the second on multiline ifs, seems quite readable to me. -- ~Ethan~ From guido at python.org Tue Jan 19 14:36:21 2016 From: guido at python.org (Guido van Rossum) Date: Tue, 19 Jan 2016 11:36:21 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569E8073.3010106@stoneleaf.us> References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: Let's not switch to either of those options. Visually I much prefer either of these: if (test) { blah; } or if (test) blah; over the versions with '{ blah; }' (regardless of whether it's on the same line as 'if' or on the next line). It looks like the shorter versions are mostly used inside macros, where aesthetics usually go out the door anyways in favor of robustness. Since this discussion is never going to end until someone says "enough", let me just attempt that (though technically it's Brett's call) -- let's go with the strong recommendation to prefer if (test) { blah; } and stop there. On Tue, Jan 19, 2016 at 10:29 AM, Ethan Furman wrote: > On 01/19/2016 08:56 AM, Jim J. Jewett wrote: > > That "otherwise" gets a bit awkward, but I like the idea. Perhaps >> "braces must not be omitted, and should normally be formatted as >> follows. ... Where other C styles would permit a braceless one-liner, >> the expression and braces may be moved to a single line, as follows: " >> >> if (x > 5) { y++ } >> >> I think that is clearly better, but it may be *too* lightweight for >> flow control. 
>> >> if (!obj) >> { return -1; } >> >> does work for me, and I think the \n{} may actually be useful for >> warning that flow control takes a jump. >> > > Either of those two, with preference for the second on multiline ifs, > seems quite readable to me. > > -- > ~Ethan~ > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Tue Jan 19 15:12:25 2016 From: brett at python.org (Brett Cannon) Date: Tue, 19 Jan 2016 20:12:25 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: Here is a proposed update: diff -r 633f51d10a67 pep-0007.txt --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 @@ -75,9 +75,9 @@ } * Code structure: one space between keywords like ``if``, ``for`` and - the following left paren; no spaces inside the paren; braces may be - omitted where C permits but when present, they should be formatted - as shown:: + the following left paren; no spaces inside the paren; braces are + strongly preferred but may be omitted where C permits, and they + should be formatted as shown:: if (mro != NULL) { ... On Tue, 19 Jan 2016 at 11:37 Guido van Rossum wrote: > Let's not switch to either of those options. Visually I much prefer either > of these: > > if (test) { > blah; > } > > or > > if (test) > blah; > > over the versions with '{ blah; }' (regardless of whether it's on the same > line as 'if' or on the next line). 
It looks like the shorter versions are > mostly used inside macros, where aesthetics usually go out the door anyways > in favor of robustness. > > Since this discussion is never going to end until someone says "enough", > let me just attempt that (though technically it's Brett's call) -- let's go > with the strong recommendation to prefer > > if (test) { > blah; > } > > and stop there. > > On Tue, Jan 19, 2016 at 10:29 AM, Ethan Furman wrote: > >> On 01/19/2016 08:56 AM, Jim J. Jewett wrote: >> >> That "otherwise" gets a bit awkward, but I like the idea. Perhaps >>> "braces must not be omitted, and should normally be formatted as >>> follows. ... Where other C styles would permit a braceless one-liner, >>> the expression and braces may be moved to a single line, as follows: " >>> >>> if (x > 5) { y++ } >>> >>> I think that is clearly better, but it may be *too* lightweight for >>> flow control. >>> >>> if (!obj) >>> { return -1; } >>> >>> does work for me, and I think the \n{} may actually be useful for >>> warning that flow control takes a jump. >>> >> >> Either of those two, with preference for the second on multiline ifs, >> seems quite readable to me. >> >> -- >> ~Ethan~ > > >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> > > > > -- > --Guido van Rossum (python.org/~guido) > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From francismb at email.de Tue Jan 19 15:22:27 2016 From: francismb at email.de (francismb) Date: Tue, 19 Jan 2016 21:22:27 +0100 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: <569E9B03.80200@email.de> Hi Brett, On 01/19/2016 12:20 AM, Brett Cannon wrote: > On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote: > >> While doing a review of http://bugs.python.org/review/26129/ I asked to >> have curly braces put around all `if` statement bodies. Serhiy pointed out >> that PEP 7 says curly braces are optional: >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change >> that. >> >> My argument is to require them to prevent bugs like the one Apple made >> with OpenSSL about two years ago: >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the >> curly braces is purely an aesthetic thing while leaving them out can lead >> to actual bugs. >> >> Anyone object if I update PEP 7 to remove the optionality of curly braces >> in PEP 7? >> > What about a code formatter bot? (new workflow). If one could just agree, then those reviews should just disappear (?). Regards, francis From guido at python.org Tue Jan 19 15:32:46 2016 From: guido at python.org (Guido van Rossum) Date: Tue, 19 Jan 2016 12:32:46 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: <569E9B03.80200@email.de> References: <569E9B03.80200@email.de> Message-ID: A formatter bot would be quite complicated to introduce without disruptions of everybody's workflow (remember that we have about half a million lines of C code in the Python repo). If you want to discuss that please start a new thread on python-dev.
On Tue, Jan 19, 2016 at 12:22 PM, francismb wrote: > Hi Brett, > > On 01/19/2016 12:20 AM, Brett Cannon wrote: > > On Sun, 17 Jan 2016 at 11:10 Brett Cannon wrote: > > > >> While doing a review of http://bugs.python.org/review/26129/ I asked to > >> have curly braces put around all `if` statement bodies. Serhiy pointed > out > >> that PEP 7 says curly braces are optional: > >> https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > >> that. > >> > >> My argument is to require them to prevent bugs like the one Apple made > >> with OpenSSL about two years ago: > >> https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > >> curly braces is purely an aesthetic thing while leaving them out can > lead > >> to actual bugs. > >> > >> Anyone object if I update PEP 7 to remove the optionality of curly > braces > >> in PEP 7? > >> > > > > What about about a code formatter bot ? (new workflow). If one just > could agree, then those reviews should just disappear (?). > > Regards, > francis > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg at krypto.org Tue Jan 19 15:59:13 2016 From: greg at krypto.org (Gregory P. Smith) Date: Tue, 19 Jan 2016 20:59:13 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: Message-ID: On Sun, Jan 17, 2016 at 11:12 AM Brett Cannon wrote: > While doing a review of http://bugs.python.org/review/26129/ I asked to > have curly braces put around all `if` statement bodies. Serhiy pointed out > that PEP 7 says curly braces are optional: > https://www.python.org/dev/peps/pep-0007/#id5. I would like to change > that. 
> > My argument is to require them to prevent bugs like the one Apple made > with OpenSSL about two years ago: > https://www.imperialviolet.org/2014/02/22/applebug.html. Skipping the > curly braces is purely an aesthetic thing while leaving them out can lead > to actual bugs. > > Anyone object if I update PEP 7 to remove the optionality of curly braces > in PEP 7? > +1, always using {}s is just good C style. (and, duh, of course we do *not* go modifying code for this retroactively, pep8 vs our existing python code is evidence of that) If I had _my_ way we'd require clang format for C/C++ files and yapf for all Python files before accepting a commit. Like any good modern open source project should. People who don't like defensive bug reducing coding practices should be glad I don't get my way. :P -gps -------------- next part -------------- An HTML attachment was scrubbed... URL: From francismb at email.de Tue Jan 19 15:59:18 2016 From: francismb at email.de (francismb) Date: Tue, 19 Jan 2016 21:59:18 +0100 Subject: [Python-Dev] Code formatter bot Message-ID: <569EA3A6.4010802@email.de> Dear Core-Devs, what's your opinion about a code-formatter bot for cpython. Pros, Cons, where could be applicable (new commits, new workflow, it doesn't make sense), ... - At least it should follow PEP 7 ;-) - ... Thanks in advance, francis From srkunze at mail.de Tue Jan 19 16:44:30 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Tue, 19 Jan 2016 22:44:30 +0100 Subject: [Python-Dev] Code formatter bot In-Reply-To: <569EA3A6.4010802@email.de> References: <569EA3A6.4010802@email.de> Message-ID: <569EAE3E.20509@mail.de> Not a core dev, but I would definitely recommend using them. Best, Sven On 19.01.2016 21:59, francismb wrote: > Dear Core-Devs, > what's your opinion about a code-formatter bot for cpython. > Pros, Cons, where could be applicable (new commits, new workflow, it > doesn't make sense), ... > > > - At least it should follow PEP 7 ;-) > - ... 
> > > Thanks in advance, > francis > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/srkunze%40mail.de From senthil at uthcode.com Tue Jan 19 19:29:18 2016 From: senthil at uthcode.com (Senthil Kumaran) Date: Tue, 19 Jan 2016 16:29:18 -0800 Subject: [Python-Dev] Code formatter bot In-Reply-To: <569EA3A6.4010802@email.de> References: <569EA3A6.4010802@email.de> Message-ID: On Tue, Jan 19, 2016 at 12:59 PM, francismb wrote: > Pros, Cons, where could be applicable (new commits, new workflow, it > doesn't make sense), ... > -1. Formatting should be done by humans (with the help of tools) before committing. It should not be left to a robot to make automatic changes. We already have some pre-commit hooks which do basic checks. If anything more automated is desirable then enhancements to the pre-commit hooks would be the place to look. As far as I know, none of the core devs have expressed any complaints about the pre-commit hooks. -- Senthil -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vadmium+py at gmail.com Tue Jan 19 22:33:37 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Wed, 20 Jan 2016 03:33:37 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On 19 January 2016 at 20:12, Brett Cannon wrote: > Here is a proposed update: > > diff -r 633f51d10a67 pep-0007.txt > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 > @@ -75,9 +75,9 @@ > } > > * Code structure: one space between keywords like ``if``, ``for`` and > - the following left paren; no spaces inside the paren; braces may be > - omitted where C permits but when present, they should be formatted > - as shown:: > + the following left paren; no spaces inside the paren; braces are > + strongly preferred but may be omitted where C permits, and they > + should be formatted as shown:: > > if (mro != NULL) { > ... This change seems to be accidentally smuggled in, in the guise of a PEP 512 update :) My view is I prefer always using curly brackets in my own code. It is easier to add printf() debugging without making mistakes. I support ?strongly preferring? them in the style guide, which is as much as a style guide can do anyway. From greg at krypto.org Wed Jan 20 02:29:43 2016 From: greg at krypto.org (Gregory P. Smith) Date: Wed, 20 Jan 2016 07:29:43 +0000 Subject: [Python-Dev] Code formatter bot In-Reply-To: References: <569EA3A6.4010802@email.de> Message-ID: Indeed, automated code formatting is a good thing. But a bot is the wrong approach. You want a code formatting checker as a potential pre-submit hook (like we have had for white space issues in the past), but until you have super high confidence in it you need to make sure it is not a blocker for a commit or push, just a default check that can be explicitly skipped for the infrequent cases where it is wrong. It's a workflow thing. 
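A toy sketch of the sort of check such a pre-submit gate could run, in pure Python (illustrative only: it naively looks for brace-less `if` bodies in C source, whereas a real hook would shell out to clang-format or use an actual C parser):

```python
import re

def braceless_ifs(source):
    """Return 1-based line numbers of `if (...)` statements whose body
    is not wrapped in braces. A toy version of the check a pre-commit
    hook could run; real tools parse C properly instead of using regexes.
    """
    hits = []
    lines = source.splitlines()
    for i, line in enumerate(lines):
        stripped = line.strip()
        # An if-condition line that does not end with an opening brace.
        if re.match(r'if \(.*\)$', stripped):
            nxt = lines[i + 1].strip() if i + 1 < len(lines) else ''
            if not nxt.startswith('{'):
                hits.append(i + 1)
    return hits

sample = """\
if (mro != NULL) {
    do_stuff();
}
if (!obj)
    return -1;
"""
print(braceless_ifs(sample))   # [4]
```

Wired into a VCS pre-commit hook, a check like this would only warn (and be skippable), in line with the point above that the gate must not hard-block commits until it is trusted.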
Devs should have their editor set up to auto-run the correct formatter or easily run it if not automatic. -gps On Tue, Jan 19, 2016 at 4:29 PM Senthil Kumaran wrote: > > On Tue, Jan 19, 2016 at 12:59 PM, francismb wrote: > >> Pros, Cons, where could be applicable (new commits, new workflow, it >> doesn't make sense), ... >> > > -1. formatting should be done by humans (with the help of tools) before > committing. > It should not be left to a robot to make automatic changes. > > We already some pre-commit hooks which do basic checks. If anything more > automated is desirable then enhancement to the pre-commit hooks could be > the place to look for. > As far as I know, none of the core devs have expressed any complaints the > pre-commit hooks. > > > -- > Senthil > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/greg%40krypto.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben+python at benfinney.id.au Wed Jan 20 03:35:45 2016 From: ben+python at benfinney.id.au (Ben Finney) Date: Wed, 20 Jan 2016 19:35:45 +1100 Subject: [Python-Dev] Code formatter bot References: <569EA3A6.4010802@email.de> Message-ID: <85wpr4myvy.fsf@benfinney.id.au> francismb writes: > what's your opinion about a code-formatter bot for cpython. What is the proposal? The opinions will surely depend on: * What formatting is to be applied automatically? * If you propose to enforce rigid interpretations of the style-guide PEPs as automatic rules, that will incur the wrath of those who have made clear those PEPs are not to be used that way. There is a clear opinion (from at least the BDFL) that the style guide PEPs are guidelines to be applied with human judgement. * If on the other hand you propose to enforce only those rules which are strict enough to be applied automatically (e.g.
“don't mix spaces and TABs”, “encode source using UTF-8 only”) then that's best done by editor plug-ins like EditorConfig[0]. * At which point in the workflow will the formatting be applied? * If you propose to change the code *after* the programmer sees it in their text editor, that is sure to be unpopular. The code committed to VCS should exactly match what the programmer sees when they choose to commit. * If you propose to reject the code at time of committing, there are VCS hooks that can do that; I don't know what different system you propose. * If you propose to present formatting violations as errors in the programmer's text editor so they can be corrected before using the VCS, then there are tools like EditorConfig[0] to do that. [0]: EditorConfig -- \ “Pinky, are you pondering what I'm pondering?” “I think so, | `\ Brain, but isn't a cucumber that small called a gherkin?” | _o__) —_Pinky and The Brain_ | Ben Finney From victor.stinner at gmail.com Wed Jan 20 12:40:14 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 20 Jan 2016 18:40:14 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches Message-ID: Hi, I proposed a patch for the devguide to give the current status of all Python branches: active, bugfix, security only, end-of-life, with their end-of-life dates when applicable (past date or scheduled date) http://bugs.python.org/issue26165 What do you think? Does it look correct? We will have to update this table each time that the status of a branch changes.
Hopefully, it's not a common event, so it will not require a lot of work for release managers :-) Victor From brett at python.org Wed Jan 20 12:45:11 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 17:45:11 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On Tue, 19 Jan 2016 at 19:33 Martin Panter wrote: > On 19 January 2016 at 20:12, Brett Cannon wrote: > > Here is a proposed update: > > > > diff -r 633f51d10a67 pep-0007.txt > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 > > @@ -75,9 +75,9 @@ > > } > > > > * Code structure: one space between keywords like ``if``, ``for`` and > > - the following left paren; no spaces inside the paren; braces may be > > - omitted where C permits but when present, they should be formatted > > - as shown:: > > + the following left paren; no spaces inside the paren; braces are > > + strongly preferred but may be omitted where C permits, and they > > + should be formatted as shown:: > > > > if (mro != NULL) { > > ... > > This change seems to be accidentally smuggled in, in the guise of a > PEP 512 update :) > Darn, sorry about that; forgot I had that change sitting in my peps checkout. I'll revert it when I get home (unless the change is actually acceptable to Guido). -Brett > > My view is I prefer always using curly brackets in my own code. It is > easier to add printf() debugging without making mistakes. I support > “strongly preferring” them in the style guide, which is as much as a > style guide can do anyway. > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From brett at python.org Wed Jan 20 12:53:33 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 17:53:33 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On Wed, 20 Jan 2016 at 09:41 Victor Stinner wrote: > Hi, > > I proposed a patch for the devguide to give the current status of all > Python branches: active, bugfix, security only, end-of-line, with > their end-of-life when applicable (past date or scheduled date) > http://bugs.python.org/issue26165 > > What do you think? Does it look correct? > I would update it to have a "first release" date column and also the projected EOL for Python 3.5. Otherwise LGTM. -Brett > > We will have to update this table each time that the status of a > branch change. Hopefully, it's not a common event, so it will not > require a lot of work for release managers :-) > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From abarnert at yahoo.com Wed Jan 20 12:58:12 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Wed, 20 Jan 2016 09:58:12 -0800 Subject: [Python-Dev] Code formatter bot In-Reply-To: <85wpr4myvy.fsf@benfinney.id.au> References: <569EA3A6.4010802@email.de> <85wpr4myvy.fsf@benfinney.id.au> Message-ID: <2859591B-BAF8-4D29-9509-AF2E6C48D076@yahoo.com> On Jan 20, 2016, at 00:35, Ben Finney wrote: > > francismb writes: > >> what's your opinion about a code-formatter bot for cpython. > > What is the proposal? The opinions will surely depend on: ... plus: * How does the formatter bot deal with "legacy code"?
Large parts of CPython predate PEPs 7 and 8, and the decision was made long ago not to reformat existing code unless that code is being substantially modified for some other reason. Similarly, when the PEPs are updated, the usual decision is to not reformat old code. * When code _is_ auto-reformatted, what tools do you have to help git's merge logic, Rietveld, human readers looking at diffs or blame/annotate locally or on the web, etc. look past the hundreds of trivial changes to highlight the ones that matter? * What's the argument for specifically automating code formatting instead of any of the other things a commit-triggered linter can catch just as easily? But one comment on Ben's comment: > * If on the other hand you propose to enforce only those rules which > are strict enough to be applied automatically (e.g. “don't mix > spaces and TABs”, “encode source using UTF-8 only”) then that's best > done by editor plug-ins like EditorConfig[0]. In my experience (although mostly with projects with a lot fewer contributors than CPython...), it can be helpful to have both suggested editor plugins that do the auto formatting on the dev's computer, and VCS-triggered checkers that ensure the formatting was correct. That catches those occasional cases where you do a quick "trivial" edit in nano instead of your usual editor and then forget you did so and try to check in, without the nasty side-effects you mention later (like committing code you haven't seen). (Of course writing plugins that understand "legacy code" in the exact same way as the commit filter can be tricky, but in that case, it's better to know that one or the other isn't working as intended--both so a human can decide, and so people can see the bug in the plugin or filter--than to automatically make changes that weren't wanted.)
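[Editorial note: the commit-triggered whitespace check the thread keeps referring to can be sketched in a few lines of Python. This is a hypothetical illustration, not CPython's actual hook; it covers only the two rules everyone agrees are safe to enforce automatically.]

```python
def check_whitespace(lines):
    """Return (lineno, message) pairs for mechanically-checkable
    formatting problems: tab characters and trailing whitespace."""
    problems = []
    for lineno, line in enumerate(lines, start=1):
        text = line.rstrip("\n")
        if "\t" in text:
            problems.append((lineno, "tab character"))
        if text != text.rstrip():
            problems.append((lineno, "trailing whitespace"))
    return problems

# A hook wrapper could report these problems but still allow the
# commit, as Gregory suggests, until confidence in the checker is high.
```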
From ben+python at benfinney.id.au Wed Jan 20 13:09:31 2016 From: ben+python at benfinney.id.au (Ben Finney) Date: Thu, 21 Jan 2016 05:09:31 +1100 Subject: [Python-Dev] Code formatter bot References: <569EA3A6.4010802@email.de> <85wpr4myvy.fsf@benfinney.id.au> <2859591B-BAF8-4D29-9509-AF2E6C48D076@yahoo.com> Message-ID: <85oacgm8bo.fsf@benfinney.id.au> Andrew Barnert via Python-Dev writes: > […] it can be helpful to have both suggested editor plugins that do > the auto formatting on the dev's computer, and VCS-triggered checkers > that ensure the formatting was correct. Right, I was not intending the different stages to be exclusive. I was seeking clarification from the OP on the intended stages proposed to be automated. It's also worth noting that neither of the above approaches you mention would qualify IMO to be termed a “bot”, since they still leave it to the human operator to deal with formatting violations before the edits reach the VCS. But that's another clarification I'm seeking from the OP: what “bot” is being proposed? -- \ “Nothing is more sacred than the facts.” —Sam Harris, _The End | `\ of Faith_, 2004 | _o__) | Ben Finney From yselivanov.ml at gmail.com Wed Jan 20 13:02:56 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 13:02:56 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: <569FCBD0.4040307@gmail.com> On 2016-01-18 5:43 PM, Victor Stinner wrote: > Is someone opposed to this PEP 509? > > The main complain was the change on the public Python API, but the PEP > doesn't change the Python API anymore. > > I'm not aware of any remaining issue on this PEP. Victor, I've been experimenting with the PEP to implement a per-opcode cache in the ceval loop (I'll share my progress on that in a few days). This allows us to significantly speed up the LOAD_GLOBAL and LOAD_METHOD opcodes, to the point where they don't require
Some macro-benchmarks (such as chameleon_v2) demonstrate an impressive ~10% performance boost. I rely on your dict->ma_version to implement cache invalidation. However, besides guarding against version change, I also want to guard against the dict being swapped for another dict, to avoid situations like this: def foo(): print(bar) exec(foo.__code__, {'bar': 1}, {}) exec(foo.__code__, {'bar': 2}, {}) What I propose is to add a pointer "ma_extra" (the same 64 bits), which will be set to NULL for most dict instances (instead of ma_version). "ma_extra" can then point to a struct that has a globally unique dict ID (uint64), and a version tag (uint64). A macro like PyDict_GET_ID and PyDict_GET_VERSION could then efficiently fetch the version/unique ID of the dict for guards. "ma_extra" would also make it easier for us to extend dicts in the future. Yury From fijall at gmail.com Wed Jan 20 13:15:21 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Wed, 20 Jan 2016 19:15:21 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FCBD0.4040307@gmail.com> References: <569FCBD0.4040307@gmail.com> Message-ID: The easiest version is to have global numbering (as opposed to local). Anyway, I would strongly suggest getting some benchmarks done and showing performance benefits first, because you don't want PEPs to be final when you don't exactly know the details. On Wed, Jan 20, 2016 at 7:02 PM, Yury Selivanov wrote: > On 2016-01-18 5:43 PM, Victor Stinner wrote: >> >> Is someone opposed to this PEP 509? >> >> The main complain was the change on the public Python API, but the PEP >> doesn't change the Python API anymore. >> >> I'm not aware of any remaining issue on this PEP. > > > Victor, > > I've been experimenting with the PEP to implement a per-opcode > cache in ceval loop (I'll share my progress on that in a few > days).
This allows to significantly speedup LOAD_GLOBAL and > LOAD_METHOD opcodes, to the point, where they don't require > any dict lookups at all. Some macro-benchmarks (such as > chameleon_v2) demonstrate impressive ~10% performance boost. > > I rely on your dict->ma_version to implement cache invalidation. > > However, besides guarding against version change, I also want > to guard against the dict being swapped for another dict, to > avoid situations like this: > > > def foo(): > print(bar) > > exec(foo.__code__, {'bar': 1}, {}) > exec(foo.__code__, {'bar': 2}, {}) > > > What I propose is to add a pointer "ma_extra" (same 64bits), > which will be set to NULL for most dict instances (instead of > ma_version). "ma_extra" can then point to a struct that has a > globally unique dict ID (uint64), and a version tag (unit64). > A macro like PyDict_GET_ID and PyDict_GET_VERSION could then > efficiently fetch the version/unique ID of the dict for guards. > > "ma_extra" would also make it easier for us to extend dicts > in the future. > > Yury > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com From brett at python.org Wed Jan 20 13:22:50 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 18:22:50 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FCBD0.4040307@gmail.com> References: <569FCBD0.4040307@gmail.com> Message-ID: On Wed, 20 Jan 2016 at 10:11 Yury Selivanov wrote: > On 2016-01-18 5:43 PM, Victor Stinner wrote: > > Is someone opposed to this PEP 509? > > > > The main complain was the change on the public Python API, but the PEP > > doesn't change the Python API anymore. > > > > I'm not aware of any remaining issue on this PEP. 
> > Victor, > > I've been experimenting with the PEP to implement a per-opcode > cache in ceval loop (I'll share my progress on that in a few > days). This allows to significantly speedup LOAD_GLOBAL and > LOAD_METHOD opcodes, to the point, where they don't require > any dict lookups at all. Some macro-benchmarks (such as > chameleon_v2) demonstrate impressive ~10% performance boost. > Ooh, now my brain is trying to figure out the design of the cache. :) > > I rely on your dict->ma_version to implement cache invalidation. > > However, besides guarding against version change, I also want > to guard against the dict being swapped for another dict, to > avoid situations like this: > > > def foo(): > print(bar) > > exec(foo.__code__, {'bar': 1}, {}) > exec(foo.__code__, {'bar': 2}, {}) > > > What I propose is to add a pointer "ma_extra" (same 64bits), > which will be set to NULL for most dict instances (instead of > ma_version). "ma_extra" can then point to a struct that has a > globally unique dict ID (uint64), and a version tag (unit64). > A macro like PyDict_GET_ID and PyDict_GET_VERSION could then > efficiently fetch the version/unique ID of the dict for guards. > > "ma_extra" would also make it easier for us to extend dicts > in the future. > Why can't you simply use the id of the dict object as the globally unique dict ID? It's already globally unique amongst all Python objects which makes it inherently unique amongst dicts. -------------- next part -------------- An HTML attachment was scrubbed... URL: From fijall at gmail.com Wed Jan 20 13:36:53 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Wed, 20 Jan 2016 19:36:53 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> Message-ID: On Wed, Jan 20, 2016 at 7:22 PM, Brett Cannon wrote: > > > On Wed, 20 Jan 2016 at 10:11 Yury Selivanov wrote: >> >> On 2016-01-18 5:43 PM, Victor Stinner wrote: >> > Is someone opposed to this PEP 509? 
>> > >> > The main complain was the change on the public Python API, but the PEP >> > doesn't change the Python API anymore. >> > >> > I'm not aware of any remaining issue on this PEP. >> >> Victor, >> >> I've been experimenting with the PEP to implement a per-opcode >> cache in ceval loop (I'll share my progress on that in a few >> days). This allows to significantly speedup LOAD_GLOBAL and >> LOAD_METHOD opcodes, to the point, where they don't require >> any dict lookups at all. Some macro-benchmarks (such as >> chameleon_v2) demonstrate impressive ~10% performance boost. > > > Ooh, now my brain is trying to figure out the design of the cache. :) > >> >> >> I rely on your dict->ma_version to implement cache invalidation. >> >> However, besides guarding against version change, I also want >> to guard against the dict being swapped for another dict, to >> avoid situations like this: >> >> >> def foo(): >> print(bar) >> >> exec(foo.__code__, {'bar': 1}, {}) >> exec(foo.__code__, {'bar': 2}, {}) >> >> >> What I propose is to add a pointer "ma_extra" (same 64bits), >> which will be set to NULL for most dict instances (instead of >> ma_version). "ma_extra" can then point to a struct that has a >> globally unique dict ID (uint64), and a version tag (unit64). >> A macro like PyDict_GET_ID and PyDict_GET_VERSION could then >> efficiently fetch the version/unique ID of the dict for guards. >> >> "ma_extra" would also make it easier for us to extend dicts >> in the future. > > > Why can't you simply use the id of the dict object as the globally unique > dict ID? It's already globally unique amongst all Python objects which makes > it inherently unique amongst dicts. 
> > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com > Brett, you need two things - the ID of the dict and the version tag. What we do in pypy is we have a small object (called, surprisingly, VersionTag()) and we use the ID of that. That way you can change the version id of an existing dict and have only one field. From yselivanov.ml at gmail.com Wed Jan 20 13:38:49 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 13:38:49 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> Message-ID: <569FD439.9040704@gmail.com> On 2016-01-20 1:15 PM, Maciej Fijalkowski wrote: > [..] > > Anyway, I would strongly suggest getting some benchmarks done and > showing performance benefits first, because you don't want PEPs to be > final when you don't exactly know the details. I agree 100%. Yury From tjreedy at udel.edu Wed Jan 20 13:39:58 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 20 Jan 2016 13:39:58 -0500 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On 1/20/2016 12:40 PM, Victor Stinner wrote: > Hi, > > I proposed a patch for the devguide to give the current status of all > Python branches: active, bugfix, security only, end-of-line, with > their end-of-life when applicable (past date or scheduled date) > http://bugs.python.org/issue26165 > > What do you think? Does it look correct? I thought end-of-life was 5 years after initial release, not 5 years after last bugfix. That would put eol for 3.4 in Feb 2019, I believe. > We will have to update this table each time that the status of a > branch change. 
Hopefully, it's not a common event, so it will not > require a lot of work for release managers :-) I believe there is some text describing current releases somewhere that also needs to be changed. The release pep or scripts should have a reminder in the sections about the transitions. -- Terry Jan Reedy From tjreedy at udel.edu Wed Jan 20 13:42:24 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 20 Jan 2016 13:42:24 -0500 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On 1/20/2016 12:45 PM, Brett Cannon wrote: > > > On Tue, 19 Jan 2016 at 19:33 Martin Panter > wrote: > > On 19 January 2016 at 20:12, Brett Cannon > wrote: > > Here is a proposed update: > > > > diff -r 633f51d10a67 pep-0007.txt > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 > > @@ -75,9 +75,9 @@ > > } > > > > * Code structure: one space between keywords like ``if``, > ``for`` and > > - the following left paren; no spaces inside the paren; braces > may be > > - omitted where C permits but when present, they should be formatted > > - as shown:: > > + the following left paren; no spaces inside the paren; braces are > > + strongly preferred but may be omitted where C permits, and they > > + should be formatted as shown:: > > > > if (mro != NULL) { > > ... > > This change seems to be accidentally smuggled in, in the guise of a > PEP 512 update :) > > > Darn, sorry about that; forgot I had that change sitting in my peps > checkout. I'll revert it when I get home (unless the change is actually > acceptable to Guido). I thought that the above was your intentional compromise change given the range of opinions ;-). 
-- Terry Jan Reedy From yselivanov.ml at gmail.com Wed Jan 20 13:45:51 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 13:45:51 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> Message-ID: <569FD5DF.2000700@gmail.com> Brett, On 2016-01-20 1:22 PM, Brett Cannon wrote: > > > On Wed, 20 Jan 2016 at 10:11 Yury Selivanov > wrote: > > On 2016-01-18 5:43 PM, Victor Stinner wrote: > > Is someone opposed to this PEP 509? > > > > The main complain was the change on the public Python API, but > the PEP > > doesn't change the Python API anymore. > > > > I'm not aware of any remaining issue on this PEP. > > Victor, > > I've been experimenting with the PEP to implement a per-opcode > cache in ceval loop (I'll share my progress on that in a few > days). This allows to significantly speedup LOAD_GLOBAL and > LOAD_METHOD opcodes, to the point, where they don't require > any dict lookups at all. Some macro-benchmarks (such as > chameleon_v2) demonstrate impressive ~10% performance boost. > > > Ooh, now my brain is trying to figure out the design of the cache. :) Yeah, it's tricky. I'll need some time to draft a comprehensible overview. And I want to implement a couple more optimizations and benchmark it better. BTW, I have some updates (html5lib benchmark for py3, new benchmarks for calling C methods, and I want to port some PyPy benchmarks) to the benchmarks suite. Should I just commit them, or should I use bugs.python.org? > > I rely on your dict->ma_version to implement cache invalidation.
> > However, besides guarding against version change, I also want > to guard against the dict being swapped for another dict, to > avoid situations like this: > > > def foo(): > print(bar) > > exec(foo.__code__, {'bar': 1}, {}) > exec(foo.__code__, {'bar': 2}, {}) > > > What I propose is to add a pointer "ma_extra" (same 64bits), > which will be set to NULL for most dict instances (instead of > ma_version). "ma_extra" can then point to a struct that has a > globally unique dict ID (uint64), and a version tag (unit64). > A macro like PyDict_GET_ID and PyDict_GET_VERSION could then > efficiently fetch the version/unique ID of the dict for guards. > > "ma_extra" would also make it easier for us to extend dicts > in the future. > > > Why can't you simply use the id of the dict object as the globally > unique dict ID? It's already globally unique amongst all Python > objects which makes it inherently unique amongst dicts. We have a freelist for dicts -- so if the dict dies, there could be a new dict in its place, with the same ma_version. While the probability of such hiccups is low, we still have to account for it. Yury From yselivanov.ml at gmail.com Wed Jan 20 14:00:31 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 14:00:31 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> Message-ID: <569FD94F.2000808@gmail.com> On 2016-01-20 1:36 PM, Maciej Fijalkowski wrote: > On Wed, Jan 20, 2016 at 7:22 PM, Brett Cannon wrote: >> >> On Wed, 20 Jan 2016 at 10:11 Yury Selivanov wrote: [..] >>> "ma_extra" would also make it easier for us to extend dicts >>> in the future. >> >> Why can't you simply use the id of the dict object as the globally unique >> dict ID? It's already globally unique amongst all Python objects which makes >> it inherently unique amongst dicts. >> >> > Brett, you need two things - the ID of the dict and the version tag. 
> What we do in pypy is we have a small object (called, surprisingly, > VersionTag()) and we use the ID of that. That way you can change the > version id of an existing dict and have only one field. Yeah, that's essentially what I propose with ma_extra. Yury From fijall at gmail.com Wed Jan 20 14:02:53 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Wed, 20 Jan 2016 20:02:53 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FD94F.2000808@gmail.com> References: <569FCBD0.4040307@gmail.com> <569FD94F.2000808@gmail.com> Message-ID: On Wed, Jan 20, 2016 at 8:00 PM, Yury Selivanov wrote: > > > On 2016-01-20 1:36 PM, Maciej Fijalkowski wrote: >> >> On Wed, Jan 20, 2016 at 7:22 PM, Brett Cannon wrote: >>> >>> >>> On Wed, 20 Jan 2016 at 10:11 Yury Selivanov >>> wrote: > > [..] >>>> >>>> "ma_extra" would also make it easier for us to extend dicts >>>> in the future. >>> >>> >>> Why can't you simply use the id of the dict object as the globally unique >>> dict ID? It's already globally unique amongst all Python objects which >>> makes >>> it inherently unique amongst dicts. >>> >>> >> Brett, you need two things - the ID of the dict and the version tag. >> What we do in pypy is we have a small object (called, surprisingly, >> VersionTag()) and we use the ID of that. That way you can change the >> version id of an existing dict and have only one field. > > > > Yeah, that's essentially what I propose with ma_extra. > > Yury The trick is we use only one field :-) you're proposing to have both fields - version tag and dict id. Why not just use the id of the object (without any fields)? 
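[Editorial note: the version-tag scheme under discussion can be modelled at the Python level. This is purely an illustrative sketch; both PyPy's version tags and the proposed ma_extra field live in C inside the interpreter, and the class and method names here are invented for the example.]

```python
class VersionTag:
    """Opaque marker; a fresh instance means 'the dict has changed'."""

class VersionedDict(dict):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.tag = VersionTag()

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self.tag = VersionTag()  # any mutation gets a brand-new tag

class Guard:
    """A guard stores the tag object itself (not its id), keeping it
    alive, so a plain identity check detects any later mutation."""
    def __init__(self, d):
        self.dict = d
        self.tag = d.tag

    def is_valid(self):
        return self.dict.tag is self.tag
```

Because the guard holds a reference to the tag, the tag's identity cannot be recycled while the guard exists, which is why a single field per dict suffices in this scheme.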
From yselivanov.ml at gmail.com Wed Jan 20 14:08:48 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 14:08:48 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FD94F.2000808@gmail.com> Message-ID: <569FDB40.9060906@gmail.com> On 2016-01-20 2:02 PM, Maciej Fijalkowski wrote: > On Wed, Jan 20, 2016 at 8:00 PM, Yury Selivanov wrote: > [..] >>> Brett, you need two things - the ID of the dict and the version tag. >>> What we do in pypy is we have a small object (called, surprisingly, >>> VersionTag()) and we use the ID of that. That way you can change the >>> version id of an existing dict and have only one field. >> Yeah, that's essentially what I propose with ma_extra. >> >> Yury > The trick is we use only one field :-) > > you're proposing to have both fields - version tag and dict id. Why > not just use the id of the object (without any fields)? What if your old dict is GCed, its "VersionTag()" (1) object is freed, and you have a new dict, for which a new "VersionTag()" (2) object happens to be allocated at the same address as (1)? Yury From fijall at gmail.com Wed Jan 20 14:09:54 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Wed, 20 Jan 2016 20:09:54 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FDB40.9060906@gmail.com> References: <569FCBD0.4040307@gmail.com> <569FD94F.2000808@gmail.com> <569FDB40.9060906@gmail.com> Message-ID: On Wed, Jan 20, 2016 at 8:08 PM, Yury Selivanov wrote: > > On 2016-01-20 2:02 PM, Maciej Fijalkowski wrote: >> >> On Wed, Jan 20, 2016 at 8:00 PM, Yury Selivanov >> wrote: >> > [..] >>>> >>>> Brett, you need two things - the ID of the dict and the version tag. >>>> What we do in pypy is we have a small object (called, surprisingly, >>>> VersionTag()) and we use the ID of that. That way you can change the >>>> version id of an existing dict and have only one field. 
>>> >>> Yeah, that's essentially what I propose with ma_extra. >>> >>> Yury >> >> The trick is we use only one field :-) >> >> you're proposing to have both fields - version tag and dict id. Why >> not just use the id of the object (without any fields)? > > > What if your old dict is GCed, its "VersionTag()" (1) object is > freed, and you have a new dict, for which a new "VersionTag()" (2) > object happens to be allocated at the same address as (1)? > > Yury > You don't free a version tag that's stored in the guard. You store the object and not id From yselivanov.ml at gmail.com Wed Jan 20 14:23:43 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 14:23:43 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FD94F.2000808@gmail.com> <569FDB40.9060906@gmail.com> Message-ID: <569FDEBF.2090709@gmail.com> On 2016-01-20 2:09 PM, Maciej Fijalkowski wrote: >> > > You don't free a version tag that's stored in the guard. You store the > object and not id Ah, got it. Although for my current cache design it would be more memory efficient to use the dict itself to store its own unique id and tag, hence my "ma_extra" proposal. In any case, the current "ma_version" proposal is flawed :( Yury From fijall at gmail.com Wed Jan 20 14:31:42 2016 From: fijall at gmail.com (Maciej Fijalkowski) Date: Wed, 20 Jan 2016 20:31:42 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FDEBF.2090709@gmail.com> References: <569FCBD0.4040307@gmail.com> <569FD94F.2000808@gmail.com> <569FDB40.9060906@gmail.com> <569FDEBF.2090709@gmail.com> Message-ID: there is also the problem that you don't want it on all dicts. 
So having two extra words costs more than having extra objects (also, comparison is cheaper for guards). On Wed, Jan 20, 2016 at 8:23 PM, Yury Selivanov wrote: > > > On 2016-01-20 2:09 PM, Maciej Fijalkowski wrote: >>> >>> > >> >> You don't free a version tag that's stored in the guard. You store the >> object and not id > > > Ah, got it. Although for my current cache design it would be > more memory efficient to use the dict itself to store its own > unique id and tag, hence my "ma_extra" proposal. In any case, > the current "ma_version" proposal is flawed :( > > Yury From v+python at g.nevcal.com Wed Jan 20 14:45:42 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 20 Jan 2016 11:45:42 -0800 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> Message-ID: <569FE3E6.303@g.nevcal.com> On 1/20/2016 10:36 AM, Maciej Fijalkowski wrote: >> Why can't you simply use the id of the dict object as the globally unique >> >dict ID? It's already globally unique amongst all Python objects which makes >> >it inherently unique amongst dicts. >> > >> >_______________________________________________ >> >Python-Dev mailing list >> >Python-Dev at python.org >> >https://mail.python.org/mailman/listinfo/python-dev >> >Unsubscribe: >> >https://mail.python.org/mailman/options/python-dev/fijall%40gmail.com >> > > Brett, you need two things - the ID of the dict and the version tag. > What we do in pypy is we have a small object (called, surprisingly, > VersionTag()) and we use the ID of that. That way you can change the > version id of an existing dict and have only one field. For the reuse case, can't you simply keep the ma_version "live" in dict items on the free list, rather than starting over at (presumably) 0? Then if the dict is reused, it bumps the ma_version, and the fallback code goes on with (presumably) relocating the original dict (oops, it's gone), and dealing with the fallout.
Then you can use the regular dict id as the globally unique dict id? And only need the one uint64, rather than a separately allocated extra pair of uint64? -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed Jan 20 15:18:53 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 20:18:53 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On Wed, 20 Jan 2016 at 10:45 Terry Reedy wrote: > On 1/20/2016 12:45 PM, Brett Cannon wrote: > > > > > > On Tue, 19 Jan 2016 at 19:33 Martin Panter > > wrote: > > > > On 19 January 2016 at 20:12, Brett Cannon > > wrote: > > > Here is a proposed update: > > > > > > diff -r 633f51d10a67 pep-0007.txt > > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 > > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 > > > @@ -75,9 +75,9 @@ > > > } > > > > > > * Code structure: one space between keywords like ``if``, > > ``for`` and > > > - the following left paren; no spaces inside the paren; braces > > may be > > > - omitted where C permits but when present, they should be > formatted > > > - as shown:: > > > + the following left paren; no spaces inside the paren; braces > are > > > + strongly preferred but may be omitted where C permits, and they > > > + should be formatted as shown:: > > > > > > if (mro != NULL) { > > > ... > > > > This change seems to be accidentally smuggled in, in the guise of a > > PEP 512 update :) > > > > > > Darn, sorry about that; forgot I had that change sitting in my peps > > checkout. I'll revert it when I get home (unless the change is actually > > acceptable to Guido). > > I thought that the above was your intentional compromise change given > the range of opinions ;-). > It is, but Guido is the author of PEP 7 and so I didn't want to check in the change without his approval first. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed Jan 20 15:20:02 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 20:20:02 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On Wed, 20 Jan 2016 at 10:40 Terry Reedy wrote: > On 1/20/2016 12:40 PM, Victor Stinner wrote: > > Hi, > > > > I proposed a patch for the devguide to give the current status of all > > Python branches: active, bugfix, security only, end-of-line, with > > their end-of-life when applicable (past date or scheduled date) > > http://bugs.python.org/issue26165 > > > > What do you think? Does it look correct? > > I thought end-of-life was 5 years after initial release, not 5 years > after last bugfix. It is, which is why I requested the first release date be a column. > That would put eol for 3.4 in Feb 2019, I believe. > > > We will have to update this table each time that the status of a > > branch change. Hopefully, it's not a common event, so it will not > > require a lot of work for release managers :-) > > I believe there is some text describing current releases somewhere that > also needs to be changed. The release pep or scripts should have a > reminder in the sections about the transitions. > PEP 101 would need a tweak to remind the RM to update the devguide. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brett at python.org Wed Jan 20 15:23:59 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 20:23:59 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FD5DF.2000700@gmail.com> References: <569FCBD0.4040307@gmail.com> <569FD5DF.2000700@gmail.com> Message-ID: On Wed, 20 Jan 2016 at 10:46 Yury Selivanov wrote: > Brett, > > On 2016-01-20 1:22 PM, Brett Cannon wrote: > > > > > > On Wed, 20 Jan 2016 at 10:11 Yury Selivanov > > wrote: > > > > On 2016-01-18 5:43 PM, Victor Stinner wrote: > > > Is someone opposed to this PEP 509? > > > > > > The main complain was the change on the public Python API, but > > the PEP > > > doesn't change the Python API anymore. > > > > > > I'm not aware of any remaining issue on this PEP. > > > > Victor, > > > > I've been experimenting with the PEP to implement a per-opcode > > cache in ceval loop (I'll share my progress on that in a few > > days). This allows to significantly speedup LOAD_GLOBAL and > > LOAD_METHOD opcodes, to the point, where they don't require > > any dict lookups at all. Some macro-benchmarks (such as > > chameleon_v2) demonstrate impressive ~10% performance boost. > > > > > > Ooh, now my brain is trying to figure out the design of the cache. :) > > Yeah, it's tricky. I'll need some time to draft a comprehensible > overview. And I want to implement a couple more optimizations and > benchmark it better. > > BTW, I've some updates (html5lib benchmark for py3, new benchmarks > for calling C methods, and I want to port some PyPy benchmakrs) > to the benchmarks suite. Should I just commit them, or should I > use bugs.python.org? > I actually emailed speed@ to see if people were interested in finally sitting down with all the various VM implementations at PyCon and trying to come up with a reasonable base set of benchmarks that better reflect modern Python usage, but I never heard back. 
Anyway, issues on bugs.python.org are probably best to talk about new benchmarks before adding them (fixes and updates to pre-existing benchmarks can just go in). > > > > > I rely on your dict->ma_version to implement cache invalidation. > > > > However, besides guarding against version change, I also want > > to guard against the dict being swapped for another dict, to > > avoid situations like this: > > > > > > def foo(): > > print(bar) > > > > exec(foo.__code__, {'bar': 1}, {}) > > exec(foo.__code__, {'bar': 2}, {}) > > > > > > What I propose is to add a pointer "ma_extra" (same 64bits), > > which will be set to NULL for most dict instances (instead of > > ma_version). "ma_extra" can then point to a struct that has a > > globally unique dict ID (uint64), and a version tag (unit64). > > A macro like PyDict_GET_ID and PyDict_GET_VERSION could then > > efficiently fetch the version/unique ID of the dict for guards. > > > > "ma_extra" would also make it easier for us to extend dicts > > in the future. > > > > > > Why can't you simply use the id of the dict object as the globally > > unique dict ID? It's already globally unique amongst all Python > > objects which makes it inherently unique amongst dicts. > > We have a freelist for dicts -- so if the dict dies, there > could be a new dict in its place, with the same ma_version. > Ah, I figured it would be too simple to use something we already had. > > While the probability of such hiccups is low, we still have > to account for it. > Yep. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From yselivanov.ml at gmail.com Wed Jan 20 15:27:12 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 15:27:12 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FE3E6.303@g.nevcal.com> References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> Message-ID: <569FEDA0.1040506@gmail.com> On 2016-01-20 2:45 PM, Glenn Linderman wrote: > For the reuse case, can't you simply keep the ma_version "live" in > dict items on the free list, rather than starting over at (presumably) > 0 ? Then if the dict is reused, it bumps the ma_version, and the > fallback code goes on with (presumably) relocating the original dict > (oops, it's gone), and dealing with the fallout. Not all dicts are created from a freelist, and not all dicts go to the freelist when they are GCed. You still can have this situation: - dict "A" is used as f_locals for a frame, its ma_version is set to X - dict "A" is GCed, but the freelist is full, so it's just freed - after a while, you call the code object, a new dict "B" is allocated with malloc (since now the freelist happens to be empty) for f_locals - it happens that "B" is allocated where "A" was, and its ma_version happens to be X by accident > > Then you can use the regular dict id as the globally unique dict id? > And only need the one uint64, rather than a separately allocated extra > pair of uint64? In my design only namespace dicts will have a non-NULL ma_extra, which means that only a fraction of dicts will actually have a separate pair of uint64s. I think that we should either use one global ma_version (as Maciej suggested) or ma_extra, as it gives more flexibility for us to extend dicts in the future. A global (shared between all dicts) uint64 ma_version is actually quite reliable -- if a program does 1,000,000 dict modifications per second, it would take it 600,000 years till wrap-around.
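Yury's shared-counter idea (one global modification counter whose current value is stored per dict) can be modelled in a few lines of Python. This is a sketch of the semantics only; the actual proposal is a C-level uint64 field inside PyDictObject:

```python
import itertools

_dict_version = itertools.count(1)   # one counter shared by every dict


class VersionedDict(dict):
    """Hypothetical Python model of the design: a single global
    counter, whose current value is stored per-dict on each change."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.ma_version = next(_dict_version)   # unique at birth

    def __setitem__(self, key, value):
        self.ma_version = next(_dict_version)   # unique after mutation
        super().__setitem__(key, value)


a = VersionedDict()
b = VersionedDict()
# No two dicts can ever carry the same version, so a freed dict whose
# memory is recycled cannot impersonate the dict a guard was watching.
assert a.ma_version != b.ma_version
before = b.ma_version
b["key"] = "value"
assert b.ma_version > before
```

The version thus doubles as both a change token and a uniqueness token, which is what makes the freelist-reuse scenario above harmless.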
Yury From brett at python.org Wed Jan 20 15:50:41 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 20:50:41 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FEDA0.1040506@gmail.com> References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> Message-ID: On Wed, 20 Jan 2016 at 12:27 Yury Selivanov wrote: > > > On 2016-01-20 2:45 PM, Glenn Linderman wrote: > > For the reuse case, can't you simply keep the ma_version "live" in > > dict items on the free list, rather than starting over at (presumably) > > 0 ? Then if the dict is reused, it bumps the ma_version, and the > > fallback code goes on with (presumably) relocating the original dict > > (oops, it's gone), and dealing with the fallout. > > Not all dicts are created from a freelist, and not all dicts go to the > freelist when they are GCed. > > You still can have this situation: > > - dict "A" is used as f_locals for a frame, its ma_version is set to X > - dict "A" is GCed, but the freelist is full, so it's just freed > - after a while, you call the code object, a new dict "B" is allocated > with malloc (since now the freelist happens to be empty) for f_locals > - it happens that "B" is allocated where "A" was, and its ma_version > happens to be X by an accident > > > > > Then you can use the regular dict id as the globally unique dict id? > > And only need the one uint64, rather than a separately allocated extra > > pair of uint64? > > In my design only namespace dicts will have a non-NULL ma_extra, which > means that only a fraction of dicts will actually have a separated pair > of uint64s. > > I think that we should either use one global ma_version (as Maciej > suggested) or ma_extra, as it gives more flexibility for us to extend > dicts in the future. 
> > A global (shared between all dicts) unit64 ma_version is actually quite > reliable -- if a program does 1,000,000 dict modifications per second, > it would take it 600,000 years till wrap-around. > Since you're going to need to hold the GIL for the modifications there won't be any locking or contention problems, so it sounds like the global value is the best since that's simple, uses the least amount of memory, and will be easiest to use as a modification check since that will be a simple uint64 comparison instead of comparing a GUID + version. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mark at hotpy.org Wed Jan 20 16:06:01 2016 From: mark at hotpy.org (Mark Shannon) Date: Wed, 20 Jan 2016 21:06:01 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: <569FF6B9.6000907@hotpy.org> On 11/01/16 16:49, Victor Stinner wrote: > Hi, > > After a first round on python-ideas, here is the second version of my > PEP. The main changes since the first version are that the dictionary > version is no more exposed at the Python level and the field type now > also has a size of 64-bit on 32-bit platforms. > > The PEP is part of a serie of 3 PEP adding an API to implement a > static Python optimizer specializing functions with guards. The second > PEP is currently discussed on python-ideas and I'm still working on > the third PEP. If anyone wants to experiment (at the C, not Python, level) with dict versioning to optimise load-global/builtins, then you can do so without adding a version number. A "version" can be created by splitting the dict with "make_keys_shared" and then making the keys-object immutable by setting "dk_usable" to zero. This means that any change to the keys will force a keys-object change, but changes to the values will not. For many optimisations, this is what you want.
Using this trick: To read a global, check that the keys object is the expected one and read the value straight out of the values array at the known index. To read a builtin, check that the module's keys object is the expected one (and thus cannot shadow the builtins), then read the builtin as above. I don't know how much help this will be for a static optimiser, but it could work well for a dynamic optimiser. I used this optimisation in HotPy for optimising object attribute lookups. Cheers, Mark. From victor.stinner at gmail.com Wed Jan 20 16:22:01 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 20 Jan 2016 22:22:01 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: I pushed my table, it will be online in a few hours (I don't know when the devguide is recompiled?): http://docs.python.org/devguide/triaging.html#generating-special-links-in-a-comment By the way, it would be super cool to rebuild the PEPs with a post-commit hook server-side, rather than having to wait for the crontab, which means waiting ~30 minutes (1h? I don't know exactly). 2016-01-20 21:20 GMT+01:00 Brett Cannon : > It is, which is why I requested the first release date be a column. I added a Scheduled column with a link to the Release Schedule PEP of each version. I also added a column with the date of the first release. I added 5 years to estimate the end-of-life. I used the same month and same day, with a comment above explaining that the release manager is free to adjust the end-of-life date. Thanks for the feedback. > PEP 101 would need a tweak to remind the RM to update the devguide.
Can someone please mention this table in the PEP? Victor From v+python at g.nevcal.com Wed Jan 20 16:18:25 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 20 Jan 2016 13:18:25 -0800 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> Message-ID: <569FF9A1.8010108@g.nevcal.com> On 1/20/2016 12:50 PM, Brett Cannon wrote: > > A global (shared between all dicts) unit64 ma_version is actually > quite > reliable -- if a program does 1,000,000 dict modifications per second, > it would take it 600,000 years till wrap-around. > But would invalidate everything, instead of just a fraction of things, on every update to anything that is monitored... -------------- next part -------------- An HTML attachment was scrubbed... URL: From greg.ewing at canterbury.ac.nz Wed Jan 20 17:01:03 2016 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 21 Jan 2016 11:01:03 +1300 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FCBD0.4040307@gmail.com> References: <569FCBD0.4040307@gmail.com> Message-ID: <56A0039F.6070409@canterbury.ac.nz> Yury Selivanov wrote: > What I propose is to add a pointer "ma_extra" (same 64bits), > which will be set to NULL for most dict instances (instead of > ma_version). "ma_extra" can then point to a struct that has a > globally unique dict ID (uint64), and a version tag (unit64). Why not just use a single global counter for allocating dict version tags, instead of one per dict?
-- Greg From brett at python.org Wed Jan 20 17:01:28 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 22:01:28 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On Wed, 20 Jan 2016 at 13:22 Victor Stinner wrote: > I pushed my table, it will be online in a few hours (I don't know when > the devguide is recompiled?): > > http://docs.python.org/devguide/triaging.html#generating-special-links-in-a-comment > > By the way, it would be super cool to rebuild the PEPs with a > post-commit hook server-side, rather than having to wait the crontab > which requires to wait ~30 minutes (1h? I don't know exactly). > This is a proposed optional, future feature leading from moving to GitHub: https://www.python.org/dev/peps/pep-0512/#web-hooks-for-re-generating-web-content -Brett -------------- next part -------------- An HTML attachment was scrubbed... URL: From francismb at email.de Wed Jan 20 17:18:12 2016 From: francismb at email.de (francismb) Date: Wed, 20 Jan 2016 23:18:12 +0100 Subject: [Python-Dev] Code formatter bot In-Reply-To: <569EA3A6.4010802@email.de> References: <569EA3A6.4010802@email.de> Message-ID: <56A007A4.80705@email.de> Thanks again to all persons that commented so far. > what's your opinion about a code-formatter bot for cpython. > Pros, Cons, where could be applicable (new commits, new workflow, it > doesn't make sense), ... > > > - At least it should follow PEP 7 ;-) > - ... There seems to be too much implicit information on the lines above. Sorry for that. I'll try to clarify. From my point of view (not a core-dev) the reviews seem to be one of the bottlenecks to the commit throughput, and I noticed on the other PEP 7 thread that time is taken to review such, IMHO, (and as Andrew also noted) trivialities. Thus the basic idea was to get that noise away from the reviews, somehow (better upfront, but why not after, accidental PEP 7 noise commits can happen).
Well, people should first agree on that PEP and then some automation could come. Please notice that the interaction is not just: core-dev committing to a repo or a bot committing to a repo but an external contributor who tries to minimize review iterations. I have no problem with some process (call it now a bot or script) that just changes a patch/file to reduce that cycle (but it could run on all the workflow points you mentioned, plus on the PR-site as a kind of advisor/mentor). Regards, francis From victor.stinner at gmail.com Wed Jan 20 17:28:00 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 20 Jan 2016 23:28:00 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: 2016-01-20 23:01 GMT+01:00 Brett Cannon : > This is a proposed optional, future feature leading from moving to GitHub: > https://www.python.org/dev/peps/pep-0512/#web-hooks-for-re-generating-web-content I'm using the free service ReadTheDocs.org and it's really impressive how fast it is to update the HTML page after a push. It's usually less than 10 seconds. Victor From guido at python.org Wed Jan 20 17:31:01 2016 From: guido at python.org (Guido van Rossum) Date: Wed, 20 Jan 2016 14:31:01 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: The wording is totally fine! You might still want to revert it and re-commit it so it doesn't look like a mistake when reviewing the log. BTW When can we start using git for the peps repo?
On Wed, Jan 20, 2016 at 12:18 PM, Brett Cannon wrote: > > > On Wed, 20 Jan 2016 at 10:45 Terry Reedy wrote: > >> On 1/20/2016 12:45 PM, Brett Cannon wrote: >> > >> > >> > On Tue, 19 Jan 2016 at 19:33 Martin Panter > > > wrote: >> > >> > On 19 January 2016 at 20:12, Brett Cannon > > > wrote: >> > > Here is a proposed update: >> > > >> > > diff -r 633f51d10a67 pep-0007.txt >> > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 >> > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 >> > > @@ -75,9 +75,9 @@ >> > > } >> > > >> > > * Code structure: one space between keywords like ``if``, >> > ``for`` and >> > > - the following left paren; no spaces inside the paren; braces >> > may be >> > > - omitted where C permits but when present, they should be >> formatted >> > > - as shown:: >> > > + the following left paren; no spaces inside the paren; braces >> are >> > > + strongly preferred but may be omitted where C permits, and >> they >> > > + should be formatted as shown:: >> > > >> > > if (mro != NULL) { >> > > ... >> > >> > This change seems to be accidentally smuggled in, in the guise of a >> > PEP 512 update :) >> > >> > >> > Darn, sorry about that; forgot I had that change sitting in my peps >> > checkout. I'll revert it when I get home (unless the change is actually >> > acceptable to Guido). >> >> I thought that the above was your intentional compromise change given >> the range of opinions ;-). >> > > It is, but Guido is the author of PEP 7 and so I didn't want to check in > the change without his approval first. > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brett at python.org Wed Jan 20 17:41:16 2016 From: brett at python.org (Brett Cannon) Date: Wed, 20 Jan 2016 22:41:16 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On Wed, 20 Jan 2016 at 14:31 Guido van Rossum wrote: > The wording is totally fine! You might still want to revert it and > re-commit it so it doesn't look like a mistake when reviewing the log. > Sure thing! > > BTW When can we start using git for the peps repo? > Depends on how fast the parts covered in https://www.python.org/dev/peps/pep-0512/#requirements-for-code-only-repositories and https://www.python.org/dev/peps/pep-0512/#requirements-for-web-related-repositories takes. My hope is before PyCon US (although if we choose to not enforce the CLA on PEP contributions then it could happen even faster). -Brett > > On Wed, Jan 20, 2016 at 12:18 PM, Brett Cannon wrote: > >> >> >> On Wed, 20 Jan 2016 at 10:45 Terry Reedy wrote: >> >>> On 1/20/2016 12:45 PM, Brett Cannon wrote: >>> > >>> > >>> > On Tue, 19 Jan 2016 at 19:33 Martin Panter >> > > wrote: >>> > >>> > On 19 January 2016 at 20:12, Brett Cannon >> > > wrote: >>> > > Here is a proposed update: >>> > > >>> > > diff -r 633f51d10a67 pep-0007.txt >>> > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 >>> > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 >>> > > @@ -75,9 +75,9 @@ >>> > > } >>> > > >>> > > * Code structure: one space between keywords like ``if``, >>> > ``for`` and >>> > > - the following left paren; no spaces inside the paren; braces >>> > may be >>> > > - omitted where C permits but when present, they should be >>> formatted >>> > > - as shown:: >>> > > + the following left paren; no spaces inside the paren; braces >>> are >>> > > + strongly preferred but may be omitted where C permits, and >>> they >>> > > + should be formatted as shown:: >>> > > >>> > > if (mro != NULL) { >>> > 
> ... >>> > >>> > This change seems to be accidentally smuggled in, in the guise of a >>> > PEP 512 update :) >>> > >>> > >>> > Darn, sorry about that; forgot I had that change sitting in my peps >>> > checkout. I'll revert it when I get home (unless the change is actually >>> > acceptable to Guido). >>> >>> I thought that the above was your intentional compromise change given >>> the range of opinions ;-). >>> >> >> It is, but Guido is the author of PEP 7 and so I didn't want to check in >> the change without his approval first. >> >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/guido%40python.org >> >> > > > -- > --Guido van Rossum (python.org/~guido) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Wed Jan 20 18:45:50 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 00:45:50 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <569FF9A1.8010108@g.nevcal.com> References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> <569FF9A1.8010108@g.nevcal.com> Message-ID: Hi, 2016-01-20 22:18 GMT+01:00 Glenn Linderman : > On 1/20/2016 12:50 PM, Brett Cannon wrote: >> >> A global (shared between all dicts) unit64 ma_version is actually quite >> reliable -- if a program does 1,000,000 dict modifications per second, >> it would take it 600,000 years till wrap-around. I think that Yury found a bug in FAT Python. I didn't test the case when the builtins dictionary is replaced after the definition of the function. To be more concrete: when a function is executed in a different namespace using exec(code, namespace). 
That's why I like the PEP process, it helps to find all issues before going too far :-) I like the idea of global counter for dictionary versions. It means that the dictionary constructor increases this counter instead of always starting to 0. FYI a fat.GuardDict keeps a strong reference to the dictionary. For some guards, I hesitated to store the object identifier and/or using a weak reference. An object identifier is not reliable because the object can be destroyed and a new object, completly different, or of the same type, can get the same identifier. > But would invalidate everything, instead of just a fraction of things, on > every update to anything that is monitored... I don't understand this point. In short, the guard only has to compare two 64 bit integers in the fast-path, when nothing changed. For a namespace, it means that no value was replaced in this namespace. If a different namespace is modified, the version of the watched namespace does not change, so we are still in the fast-path. If a value is replaced in the watched namespace, but not the watched variable, we have to take a slow-path, hopefully only once. The worst case is when a value different than the watched value is modified between each guard check. In this case, we always need a dict lookup. An heuristic can be chosen to decide to give up after N tries. Currently, fat.GuardDict always retries. 
Victor From brett at python.org Wed Jan 20 19:08:12 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 00:08:12 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> <569FF9A1.8010108@g.nevcal.com> Message-ID: On Wed, 20 Jan 2016 at 15:46 Victor Stinner wrote: > Hi, > > 2016-01-20 22:18 GMT+01:00 Glenn Linderman : > > On 1/20/2016 12:50 PM, Brett Cannon wrote: > >> > >> A global (shared between all dicts) unit64 ma_version is actually quite > >> reliable -- if a program does 1,000,000 dict modifications per second, > >> it would take it 600,000 years till wrap-around. > > I think that Yury found a bug in FAT Python. I didn't test the case > when the builtins dictionary is replaced after the definition of the > function. To be more concrete: when a function is executed in a > different namespace using exec(code, namespace). That's why I like the > PEP process, it helps to find all issues before going too far :-) > > I like the idea of global counter for dictionary versions. It means > that the dictionary constructor increases this counter instead of > always starting to 0. > > FYI a fat.GuardDict keeps a strong reference to the dictionary. For > some guards, I hesitated to store the object identifier and/or using a > weak reference. An object identifier is not reliable because the > object can be destroyed and a new object, completly different, or of > the same type, can get the same identifier. > > > But would invalidate everything, instead of just a fraction of things, on > > every update to anything that is monitored... > > I don't understand this point. > I think Glenn was assuming we had a single, global version # that all dicts shared *without* having a per-dict version ID. 
The key thing here is that we have a global counter that tracks the number of mutations for *all* dictionaries but whose value we store as a *per-dictionary* value. That ends up making the version ID inherently both a token representing the state of any dict but also the uniqueness of the dict since no two dictionaries will ever have the same version ID. > > In short, the guard only has to compare two 64 bit integers in the > fast-path, when nothing changed. For a namespace, it means that no > value was replaced in this namespace. > > If a different namespace is modified, the version of the watched > namespace does not change, so we are still in the fast-path. > > If a value is replaced in the watched namespace, but not the watched > variable, we have to take a slow-path, hopefully only once. > > The worst case is when a value different than the watched value is > modified between each guard check. In this case, we always need a dict > lookup. An heuristic can be chosen to decide to give up after N tries. > Currently, fat.GuardDict always retries. > Does "retries" mean "check if the value really changed, and if it hasn't then just update the version ID the guard checks"? -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed Jan 20 19:09:43 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 00:09:43 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On Wed, 20 Jan 2016 at 14:28 Victor Stinner wrote: > 2016-01-20 23:01 GMT+01:00 Brett Cannon : > > This is a proposed optional, future feature leading from moving to > GitHub: > > > https://www.python.org/dev/peps/pep-0512/#web-hooks-for-re-generating-web-content > > I'm using the free service ReadTheDocs.org and it's really impressive > how fast it is to update the HTML page after a push. It's usually less > than 10 seconds. 
> I have no idea if the way our docs are built would work on readthedocs.org, but if it could then I would definitely vote to move our docs there and have the PSF make a regular donation for the service. But this is a discussion to have on core-workflow@ and not here. -------------- next part -------------- An HTML attachment was scrubbed... URL: From v+python at g.nevcal.com Wed Jan 20 19:10:56 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 20 Jan 2016 16:10:56 -0800 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> <569FF9A1.8010108@g.nevcal.com> Message-ID: <56A02210.7030002@g.nevcal.com> On 1/20/2016 4:08 PM, Brett Cannon wrote: > > > On Wed, 20 Jan 2016 at 15:46 Victor Stinner > wrote: > > Hi, > > 2016-01-20 22:18 GMT+01:00 Glenn Linderman >: > > On 1/20/2016 12:50 PM, Brett Cannon wrote: > >> > >> A global (shared between all dicts) unit64 ma_version is > actually quite > >> reliable -- if a program does 1,000,000 dict modifications per > second, > >> it would take it 600,000 years till wrap-around. > > I think that Yury found a bug in FAT Python. I didn't test the case > when the builtins dictionary is replaced after the definition of the > function. To be more concrete: when a function is executed in a > different namespace using exec(code, namespace). That's why I like the > PEP process, it helps to find all issues before going too far :-) > > I like the idea of global counter for dictionary versions. It means > that the dictionary constructor increases this counter instead of > always starting to 0. > > FYI a fat.GuardDict keeps a strong reference to the dictionary. For > some guards, I hesitated to store the object identifier and/or using a > weak reference. 
An object identifier is not reliable because the > object can be destroyed and a new object, completely different, or of > the same type, can get the same identifier. > > > But would invalidate everything, instead of just a fraction of > things, on > > every update to anything that is monitored... > > I don't understand this point. > > > I think Glenn was assuming we had a single, global version # that all > dicts shared *without* having a per-dict version ID. The key thing > here is that we have a global counter that tracks the number of > mutations for *all* dictionaries but whose value we store as a > *per-dictionary* value. That ends up making the version ID inherently > both a token representing the state of the dict and a marker of its > uniqueness, since no two dictionaries will ever have the > same version ID. This would work. You were correct about my assumptions. -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Wed Jan 20 19:16:31 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 00:16:31 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: I just checked with Van and we should have CLAs for even PEP contributors, so it will have to go through the same steps as the other ancillary repositories. On Wed, 20 Jan 2016 at 14:41 Brett Cannon wrote: > On Wed, 20 Jan 2016 at 14:31 Guido van Rossum wrote: > >> The wording is totally fine! You might still want to revert it and >> re-commit it so it doesn't look like a mistake when reviewing the log. >> > > Sure thing! > > >> >> BTW When can we start using git for the peps repo? >> > > Depends on how fast the parts covered in > https://www.python.org/dev/peps/pep-0512/#requirements-for-code-only-repositories > and > https://www.python.org/dev/peps/pep-0512/#requirements-for-web-related-repositories take.
> My hope is before PyCon US (although if we choose to not enforce the CLA on > PEP contributions then it could happen even faster). > > -Brett > > >> >> On Wed, Jan 20, 2016 at 12:18 PM, Brett Cannon wrote: >> >>> >>> >>> On Wed, 20 Jan 2016 at 10:45 Terry Reedy wrote: >>> >>>> On 1/20/2016 12:45 PM, Brett Cannon wrote: >>>> > >>>> > >>>> > On Tue, 19 Jan 2016 at 19:33 Martin Panter >>> > > wrote: >>>> > >>>> > On 19 January 2016 at 20:12, Brett Cannon >>> > > wrote: >>>> > > Here is a proposed update: >>>> > > >>>> > > diff -r 633f51d10a67 pep-0007.txt >>>> > > --- a/pep-0007.txt Mon Jan 18 10:52:57 2016 -0800 >>>> > > +++ b/pep-0007.txt Tue Jan 19 12:11:44 2016 -0800 >>>> > > @@ -75,9 +75,9 @@ >>>> > > } >>>> > > >>>> > > * Code structure: one space between keywords like ``if``, >>>> > ``for`` and >>>> > > - the following left paren; no spaces inside the paren; braces >>>> > may be >>>> > > - omitted where C permits but when present, they should be >>>> formatted >>>> > > - as shown:: >>>> > > + the following left paren; no spaces inside the paren; >>>> braces are >>>> > > + strongly preferred but may be omitted where C permits, and >>>> they >>>> > > + should be formatted as shown:: >>>> > > >>>> > > if (mro != NULL) { >>>> > > ... >>>> > >>>> > This change seems to be accidentally smuggled in, in the guise of >>>> a >>>> > PEP 512 update :) >>>> > >>>> > >>>> > Darn, sorry about that; forgot I had that change sitting in my peps >>>> > checkout. I'll revert it when I get home (unless the change is >>>> actually >>>> > acceptable to Guido). >>>> >>>> I thought that the above was your intentional compromise change given >>>> the range of opinions ;-). >>>> >>> >>> It is, but Guido is the author of PEP 7 and so I didn't want to check in >>> the change without his approval first. 
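For readers following the PEP 7 thread, here is a small made-up C fragment (not from CPython) illustrating the wording under discussion: braces on every `if` and `for`, even where C would permit omitting them, in the same style as the `if (mro != NULL) {` example quoted above. Function and variable names are invented for illustration.

```c
#include <stddef.h>

/* Clamp a value into [low, high]. Each single-statement `if`
   body still gets braces, per the proposed PEP 7 wording. */
int
clamp_to_range(int value, int low, int high)
{
    if (value < low) {
        return low;
    }
    if (value > high) {
        return high;
    }
    return value;
}

/* Sum the first n entries of an array; the loop body is braced
   even though it is a single statement. */
int
sum_first_n(const int *values, int n)
{
    int total = 0;
    int i;

    for (i = 0; i < n; i++) {
        total += values[i];
    }
    return total;
}
```

The declaration of `i` outside the `for` header follows the C89 requirement PEP 7 imposed at the time.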
>>> _______________________________________________ >>> Python-Dev mailing list >>> Python-Dev at python.org >>> https://mail.python.org/mailman/listinfo/python-dev >>> >> Unsubscribe: >>> https://mail.python.org/mailman/options/python-dev/guido%40python.org >>> >>> >> >> >> -- >> --Guido van Rossum (python.org/~guido) >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Wed Jan 20 19:23:44 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 01:23:44 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <569FE3E6.303@g.nevcal.com> <569FEDA0.1040506@gmail.com> <569FF9A1.8010108@g.nevcal.com> Message-ID: 2016-01-21 1:08 GMT+01:00 Brett Cannon : > On Wed, 20 Jan 2016 at 15:46 Victor Stinner >> The worst case is when a value different than the watched value is >> modified between each guard check. In this case, we always need a dict >> lookup. A heuristic can be chosen to decide to give up after N tries. >> Currently, fat.GuardDict always retries. > > Does "retries" mean "check if the value really changed, and if it hasn't > then just update the version ID the guard checks"? If the dict version changes (because a value different than the watched value is modified) each time that the guard is checked, the guard always requires a dict lookup to check if the watched value changed. Victor From victor.stinner at gmail.com Wed Jan 20 19:33:03 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 01:33:03 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: 2016-01-21 1:09 GMT+01:00 Brett Cannon : > On Wed, 20 Jan 2016 at 14:28 Victor Stinner >> I'm using the free service ReadTheDocs.org and it's really impressive >> how fast it is to update the HTML page after a push. It's usually less >> than 10 seconds.
> > I have no idea if the way our docs are built would work on readthedocs.org, > but if it could then I would definitely vote to move our docs there and have > the PSF make a regular donation for the service. Oh, I was talking about the small documentation of personal projects. I didn't propose to move Python docs to readthedocs.org. I don't know if it makes sense. It's just to say that we can do better than the 30 minutes of the current system :-) Victor From abarnert at yahoo.com Wed Jan 20 20:54:41 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Thu, 21 Jan 2016 01:54:41 +0000 (UTC) Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: Message-ID: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> On Wednesday, January 20, 2016 4:10 PM, Brett Cannon wrote: >I think Glenn was assuming we had a single, global version # that all dicts shared without having a per-dict version ID. The key thing here is that we have a global counter that tracks the number of mutations for all dictionaries but whose value we store as a per-dictionary value. That ends up making the version ID inherently both a token representing the state of the dict and a marker of its uniqueness, since no two dictionaries will ever have the same version ID. This idea worries me. I'm not sure why, but I think because of threading. After all, it's pretty rare for two threads to both want to work on the same dict, but very, very common for two threads to both want to work on _any_ dict. So, imagine someone manages to remove the GIL from CPython by using STM: now most transactions are bumping that global counter, meaning most transactions fail and have to be retried, so you end up with 8 cores each running at 1/64th the speed of a single core but burning 100% CPU.
Obviously a real-life implementation wouldn't be _that_ stupid; you'd special-case the version-bumping (maybe unconditionally bump it N times before starting the transaction, and then as long as you don't bump more than N times during the transaction, you can commit without touching it), but there's still going to be a lot of contention. And that also affects something like PyPy being able to use FAT-Python-style AoT optimizations via cpyext. At first glance that sounds like a stupid idea--why would you want to run an optimizer through a slow emulator? But the optimizer only runs once and transforms the function code, which runs a zillion times, so who cares how slow the optimizer is? Of course it may still be true that many of the AoT optimizations that FAT makes don't apply very well to PyPy, in which case it doesn't matter. But I don't think we can assume that a priori. Is there a way to define this loosely enough so that the implementation _can_ be a single global counter, if that turns out to be most efficient, but can also be a counter per dictionary and a globally-unique ID per dictionary? From brett at python.org Wed Jan 20 21:04:58 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 02:04:58 +0000 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> References: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> Message-ID: On Wed, 20 Jan 2016, 17:54 Andrew Barnert wrote: > On Wednesday, January 20, 2016 4:10 PM, Brett Cannon > wrote: > > > >I think Glenn was assuming we had a single, global version # that all > dicts shared without having a per-dict version ID. The key thing here is > that we have a global counter that tracks the number of mutations for all > dictionaries but whose value we store as a per-dictionary value. 
That ends > up making the version ID inherently both a token representing the state of > any dict but also the uniqueness of the dict since no two dictionaries will > ever have the same version ID. > > This idea worries me. I'm not sure why, but I think because of threading. > After all, it's pretty rare for two threads to both want to work on the > same dict, but very, very common for two threads to both want to work on > _any_ dict. So, imagine someone manages to remove the GIL from CPython by > using STM: now most transactions are bumping that global counter, meaning > most transactions fail and have to be retried, so you end up with 8 cores > each running at 1/64th the speed of a single core but burning 100% CPU. > Obviously a real-life implementation wouldn't be _that_ stupid; you'd > special-case the version-bumping (maybe unconditionally bump it N times > before starting the transaction, and then as long as you don't bump more > than N times during the transaction, you can commit without touching it), > but there's still going to be a lot of contention. > This is all being regarded as an implementation detail of CPython, so in this hypothetical STM world we can drop all of this (or lock it). > And that also affects something like PyPy being able to use > FAT-Python-style AoT optimizations via cpyext. At first glance that sounds > like a stupid idea--why would you want to run an optimizer through a slow > emulator? But the optimizer only runs once and transforms the function > code, which runs a zillion times, so who cares how slow the optimizer is? > Of course it may still be true that many of the AoT optimizations that FAT > makes don't apply very well to PyPy, in which case it doesn't matter. But I > don't think we can assume that a priori. 
> > Is there a way to define this loosely enough so that the implementation > _can_ be a single global counter, if that turns out to be most efficient, > but can also be a counter per dictionary and a globally-unique ID per > dictionary? > There's no need to if this is all under the hood and in no way affects anyone but the eval loop and those who choose to use it. We can make sure to preface all of this with underscores so it's obvious they are private and so use at your own peril. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Jan 20 21:13:38 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 20 Jan 2016 21:13:38 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> References: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> Message-ID: <56A03ED2.3010204@gmail.com> On 2016-01-20 8:54 PM, Andrew Barnert via Python-Dev wrote: >> >I think Glenn was assuming we had a single, global version # that all dicts shared without having a per-dict version ID. The key thing here is that we have a global counter that tracks the number of mutations for all dictionaries but whose value we store as a per-dictionary value. That ends up making the version ID inherently both a token representing the state of the dict and a marker of its uniqueness, since no two dictionaries will ever have the same version ID. > This idea worries me. I'm not sure why, but I think because of threading. After all, it's pretty rare for two threads to both want to work on the same dict, but very, very common for two threads to both want to work on _any_ dict.
So, imagine someone manages to remove the GIL from CPython by using STM: now most transactions are bumping that global counter, meaning most transactions fail and have to be retried, so you end up with 8 cores each running at 1/64th the speed of a single core but burning 100% CPU. Obviously a real-life implementation wouldn't be _that_ stupid; you'd special-case the version-bumping (maybe unconditionally bump it N times before starting the transaction, and then as long as you don't bump more than N times during the transaction, you can commit without touching it), but there's still going to be a lot of contention. Well, PEP 509 proposes to add ma_version only for CPython. It's an implementation detail of CPython. Victor's FAT optimizer is also tailored specifically for CPython, and making it work on PyPy would require a completely different set of hacks. To remove the GIL or implement an efficient STM one will have to rewrite (and potentially break) so much code in CPython, that ma_version won't be a concern. For now, though, ma_version will be protected by GIL, so threading shouldn't be a problem. > > And that also affects something like PyPy being able to use FAT-Python-style AoT optimizations via cpyext. At first glance that sounds like a stupid idea--why would you want to run an optimizer through a slow emulator? But the optimizer only runs once and transforms the function code, which runs a zillion times, so who cares how slow the optimizer is? Of course it may still be true that many of the AoT optimizations that FAT makes don't apply very well to PyPy, in which case it doesn't matter. But I don't think we can assume that a priori. The idea of FAT is that it will also generate optimized code objects with guards. I doubt it would make any sense to use it under PyPy or any jitted Python implementation. JITs have a far better understanding of the code than any static optimizer.
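To make the fast-path/slow-path behaviour being debated here concrete, the following is a rough pure-Python sketch of a version-based guard. It is illustrative only: the real mechanism is a C-level field on the dict, `fat.GuardDict` is implemented in C, and all class and method names below are made up.

```python
class VersionedDict:
    """Minimal dict wrapper sketching the PEP 509 idea: every mutation
    bumps a counter shared by all instances, and the new value is
    stored per-instance, so versions are also globally unique."""

    _global_version = 0  # shared by all instances

    def __init__(self):
        self._data = {}
        self._bump()

    def _bump(self):
        VersionedDict._global_version += 1
        self.version = VersionedDict._global_version

    def __setitem__(self, key, value):
        self._data[key] = value
        self._bump()

    def __getitem__(self, key):
        return self._data[key]


class Guard:
    """Watch one key; re-check it only when the dict's version moved."""

    def __init__(self, d, key):
        self.d = d
        self.key = key
        self.cached_version = d.version
        self.cached_value = d[key]

    def check(self):
        """Return True while the watched value is unchanged."""
        if self.d.version == self.cached_version:
            return True  # fast path: one integer comparison
        value = self.d[self.key]  # slow path: a real dict lookup
        if value is self.cached_value:
            # Some other key changed; remember the new version so the
            # next check takes the fast path again (the "retries").
            self.cached_version = self.d.version
            return True
        return False
```

A guard on `ns["x"]` keeps returning True across writes to other keys (paying one slow-path lookup each time the version moved), and returns False once `x` itself is replaced.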
> > Is there a way to define this loosely enough so that the implementation _can_ be a single global counter, if that turns out to be most efficient, but can also be a counter per dictionary and a globally-unique ID per dictionary? Defining it "loosely" means that you can't trust it. I'd just explicitly say that: - ma_version is an implementation detail of CPython and may not be implemented on other platforms; - ma_version can be removed from future CPython releases; - ma_version can be used by code optimizers tailored specifically for CPython and CPython itself. Yury From yselivanov.ml at gmail.com Thu Jan 21 00:08:57 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 21 Jan 2016 00:08:57 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <56A0039F.6070409@canterbury.ac.nz> References: <569FCBD0.4040307@gmail.com> <56A0039F.6070409@canterbury.ac.nz> Message-ID: <56A067E9.8090409@gmail.com> On 2016-01-20 5:01 PM, Greg Ewing wrote: > Yury Selivanov wrote: >> What I propose is to add a pointer "ma_extra" (same 64bits), >> which will be set to NULL for most dict instances (instead of >> ma_version). "ma_extra" can then point to a struct that has a >> globally unique dict ID (uint64), and a version tag (uint64). > > Why not just use a single global counter for allocating > dict version tags, instead of one per dict? > Yeah, I think that's what we agreed on: https://mail.python.org/pipermail/python-dev/2016-January/142837.html The only advantage of the ma_extra pointer is that it allows adding more stuff to dicts in the future.
Yury From ncoghlan at gmail.com Thu Jan 21 02:07:43 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 21 Jan 2016 17:07:43 +1000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On 21 January 2016 at 10:16, Brett Cannon wrote: > I just checked with Van and we should have CLAs for even PEP contributors, > so it will have to go through the same steps as the other ancillary > repositories. We should probably mention that in PEP 1 - I wasn't aware of that requirement, so I've never explicitly checked CLA status for folks contributing packaging related PEPs. (And looking at the just-checked-in PEP 513, I apparently have a CLA to chase up...) Regards, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From victor.stinner at gmail.com Thu Jan 21 02:46:21 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 08:46:21 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: 2016-01-20 22:22 GMT+01:00 Victor Stinner : > I pushed my table, it will be online in a few hours (I don't know when > the devguide is recompiled?): > http://docs.python.org/devguide/triaging.html#generating-special-links-in-a-comment Hum ok, it takes more than a few hours in fact. 
It's still not online 10 hours after my push :-/ https://docs.python.org/devguide/ Victor From greg.ewing at canterbury.ac.nz Thu Jan 21 00:20:42 2016 From: greg.ewing at canterbury.ac.nz (Greg Ewing) Date: Thu, 21 Jan 2016 18:20:42 +1300 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> References: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> Message-ID: <56A06AAA.7020507@canterbury.ac.nz> Andrew Barnert via Python-Dev wrote: > imagine someone manages to remove the GIL from CPython by using > STM: now most transactions are bumping that global counter, meaning most > transactions fail and have to be retried, If this becomes a problem, the tag could be split into two parts of m and n bits, with m + n = 64. Use a global counter for allocating the high half, and increment the low half locally. When the low half overflows, allocate a new high half. A value of n = 16 or so ought to reduce contention for the global counter to something fairly negligible, I would think, without much risk of the high half ever wrapping around. -- Greg From victor.stinner at gmail.com Thu Jan 21 03:28:25 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 09:28:25 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> References: <126377093.6733723.1453341281211.JavaMail.yahoo@mail.yahoo.com> Message-ID: 2016-01-21 2:54 GMT+01:00 Andrew Barnert : > This idea worries me. I'm not sure why, but I think because of threading. After all, it's pretty rare for two threads to both want to work on the same dict, but very, very common for two threads to both want to work on _any_ dict. So, imagine someone manages to remove the GIL from CPython by using STM: (...) That's a huge project :-) PyPy works on this, but PyPy is not CPython. > (...)
now most transactions are bumping that global counter, meaning most transactions fail and have to be retried, Python has atomic types which work well with multiple concurrent threads. > And that also affects something like PyPy being able to use FAT-Python-style AoT optimizations via cpyext. I don't think that using a static optimizer with PyPy makes sense. Reminder that a dynamic optimizer (JIT compiler) beats a static compiler on performance ;-) PyPy has a very different design, it's a very bad idea to implement optimizations on cpyext which is emulated and known to be slow. > Is there a way to define this loosely enough so that the implementation _can_ be a single global counter, if that turns out to be most efficient, but can also be a counter per dictionary and a globally-unique ID per dictionary? I don't see how a single global counter for all dictionaries can be used to implement fast guards on namespaces. See the rationale of the PEP 509: I wrote it to implement fast guards on namespaces. Victor From victor.stinner at gmail.com Thu Jan 21 03:34:07 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 21 Jan 2016 09:34:07 +0100 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: <56A067E9.8090409@gmail.com> References: <569FCBD0.4040307@gmail.com> <56A0039F.6070409@canterbury.ac.nz> <56A067E9.8090409@gmail.com> Message-ID: 2016-01-21 6:08 GMT+01:00 Yury Selivanov : > Yeah, I think that's what we agreed on: > https://mail.python.org/pipermail/python-dev/2016-January/142837.html > > The only advantage of the ma_extra pointer is that it allows adding more stuff > to dicts in the future. I don't agree on ma_extra since I don't understand it :-) What is the advantage compared to a simple integer? If it's a pointer, it requires dereferencing the pointer? You say that it can be NULL, does it mean that we also have to first test if the pointer is NULL.
Does it mean that we have to allocate a second memory block to store a version tag object? When do you create a version tag or not? Note: PEP 509 proposes to use a 64-bit integer for the version on 32-bit systems to avoid integer overflow after a few seconds. I first proposed to use the size_t type (which has the size of a pointer) but it doesn't work. I tried to fix FAT Python to handle your use case: function defined in a namespace and run in a different namespace (especially for the builtin dictionary). It looks like I don't need the discussed change to use a global version, FAT Python guards have a different design than your cache. But if a global counter doesn't make the slow-path more complex and opens new kinds of optimization, I agree to change my PEP 509. Victor From yselivanov.ml at gmail.com Thu Jan 21 11:12:16 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Thu, 21 Jan 2016 11:12:16 -0500 Subject: [Python-Dev] PEP 509: Add a private version to dict In-Reply-To: References: <569FCBD0.4040307@gmail.com> <56A0039F.6070409@canterbury.ac.nz> <56A067E9.8090409@gmail.com> Message-ID: <56A10360.1060604@gmail.com> On 2016-01-21 3:34 AM, Victor Stinner wrote: [..] > But if a global counter doesn't make the slow-path more complex and opens > new kinds of optimization, I agree to change my PEP 509. Please. That would allow us to use ma_version to implement caches in CPython itself. Yury From brett at python.org Thu Jan 21 12:12:10 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 17:12:10 +0000 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On Wed, 20 Jan 2016 at 23:07 Nick Coghlan wrote: > On 21 January 2016 at 10:16, Brett Cannon wrote: > > I just checked with Van and we should have CLAs for even PEP > contributors, > > so it will have to go through the same steps as the other ancillary > > repositories.
> > We should probably mention that in PEP 1 Yep. > - I wasn't aware of that > requirement, so I've never explicitly checked CLA status for folks > contributing packaging related PEPs. (And looking at the > just-checked-in PEP 513, I apparently have a CLA to chase up...) > Yeah, I didn't know either until I asked Van, so I think it's new to everyone. :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Thu Jan 21 12:18:13 2016 From: brett at python.org (Brett Cannon) Date: Thu, 21 Jan 2016 17:18:13 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: It's live: https://docs.python.org/devguide/#status-of-python-branches On Wed, 20 Jan 2016 at 23:47 Victor Stinner wrote: > 2016-01-20 22:22 GMT+01:00 Victor Stinner : > > I pushed my table, it will be online in a few hours (I don't know when > > the devguide is recompiled?): > > > http://docs.python.org/devguide/triaging.html#generating-special-links-in-a-comment > > Hum ok, it takes more than a few hours in fact. It's still not online > 10 hours after my push :-/ > https://docs.python.org/devguide/ > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zachary.ware+pydev at gmail.com Thu Jan 21 12:18:13 2016 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Thu, 21 Jan 2016 11:18:13 -0600 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: On Thu, Jan 21, 2016 at 11:12 AM, Brett Cannon wrote: > On Wed, 20 Jan 2016 at 23:07 Nick Coghlan wrote: >> - I wasn't aware of that >> requirement, so I've never explicitly checked CLA status for folks >> contributing packaging related PEPs. (And looking at the >> just-checked-in PEP 513, I apparently have a CLA to chase up...) > > Yeah, I didn't know either until I asked Van, so I think it's new to > everyone. :) It's quite surprising to me, since all (as far as I know) PEPs are explicitly public domain. I am very much not a lawyer, though :) -- Zach From guido at python.org Thu Jan 21 12:40:34 2016 From: guido at python.org (Guido van Rossum) Date: Thu, 21 Jan 2016 09:40:34 -0800 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: Thanks Victor for doing this. I'm starting a campaign to tell people about it on Twitter. :-) On Thu, Jan 21, 2016 at 9:18 AM, Brett Cannon wrote: > It's live: https://docs.python.org/devguide/#status-of-python-branches > > On Wed, 20 Jan 2016 at 23:47 Victor Stinner > wrote: > >> 2016-01-20 22:22 GMT+01:00 Victor Stinner : >> > I pushed my table, it will be online in a few hours (I don't know when >> > the devguide is recompiled?): >> > >> http://docs.python.org/devguide/triaging.html#generating-special-links-in-a-comment >> >> Hum ok, it takes more than a few hours in fact. 
It's still not online >> 10 hours after my push :-/ >> https://docs.python.org/devguide/ >> >> Victor >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/brett%40python.org >> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org > > -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From p.f.moore at gmail.com Thu Jan 21 13:42:40 2016 From: p.f.moore at gmail.com (Paul Moore) Date: Thu, 21 Jan 2016 18:42:40 +0000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On 21 January 2016 at 17:18, Brett Cannon wrote: > It's live: https://docs.python.org/devguide/#status-of-python-branches Nice :-) Minor nit, the status column says "end of life", but the text below the table uses the term "end of line" (as does the comment "Versions older than 2.6 reached their end-of-line". From my experience, "end of life" is the more common term. Paul From steve.dower at python.org Thu Jan 21 14:50:37 2016 From: steve.dower at python.org (Steve Dower) Date: Thu, 21 Jan 2016 11:50:37 -0800 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: I'm still yet to meet a lawyer who trusts "public domain" statements... The CLA will ensure we have enough rights to republish the PEP on p.o or future sites, and it doesn't prevent authors from also releasing the text elsewhere under other terms. 
Top-posted from my Windows Phone -----Original Message----- From: "Zachary Ware" Sent: ?1/?21/?2016 9:23 To: "Python-Dev" Subject: Re: [Python-Dev] Update PEP 7 to require curly braces in C On Thu, Jan 21, 2016 at 11:12 AM, Brett Cannon wrote: > On Wed, 20 Jan 2016 at 23:07 Nick Coghlan wrote: >> - I wasn't aware of that >> requirement, so I've never explicitly checked CLA status for folks >> contributing packaging related PEPs. (And looking at the >> just-checked-in PEP 513, I apparently have a CLA to chase up...) > > Yeah, I didn't know either until I asked Van, so I think it's new to > everyone. :) It's quite surprising to me, since all (as far as I know) PEPs are explicitly public domain. I am very much not a lawyer, though :) -- Zach _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From emile at fenx.com Thu Jan 21 16:27:44 2016 From: emile at fenx.com (Emile van Sebille) Date: Thu, 21 Jan 2016 13:27:44 -0800 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On 1/21/2016 10:42 AM, Paul Moore wrote: > On 21 January 2016 at 17:18, Brett Cannon wrote: >> It's live: https://docs.python.org/devguide/#status-of-python-branches > > Nice :-) > > Minor nit, the status column says "end of life", but the text below > the table uses the term "end of line" (as does the comment "Versions > older than 2.6 reached their end-of-line". From my experience, "end of > life" is the more common term. I'd prefer end-of-support -- bet you can't count how many pre 2.5 installations are still live. Emile From stephen at xemacs.org Fri Jan 22 00:25:38 2016 From: stephen at xemacs.org (Stephen J. 
Turnbull) Date: Fri, 22 Jan 2016 14:25:38 +0900 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: <22177.48466.360573.863828@turnbull.sk.tsukuba.ac.jp> Emile van Sebille writes: > I'd prefer end-of-support -- bet you can't count how many pre 2.5 > installations are still live. I see your point, but (having just been thinking about CLAs and Schneier's blog) have to suggest that software that has an explicit "security support" period and is in use after that ends isn't "live". It is "undead". Hope-the-admin-is-named-Alice-ly y'rs, From stephen at xemacs.org Fri Jan 22 00:32:40 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Fri, 22 Jan 2016 14:32:40 +0900 Subject: [Python-Dev] Update PEP 7 to require curly braces in C In-Reply-To: References: <569e6ac9.0611370a.41841.ffffd93b@mx.google.com> <569E8073.3010106@stoneleaf.us> Message-ID: <22177.48888.899524.150195@turnbull.sk.tsukuba.ac.jp> Zachary Ware writes: > It's quite surprising to me, since all (as far as I know) PEPs are > explicitly public domain. I am very much not a lawyer, though :) The reason for any explicit CLA is CYA: you can't be sure that the contributor DTRTs, so the CLA shows that the contribution was received in good faith. (Some CLAs also include indemnification clauses, such as the contributor warrants that they own the copyright, etc, but all CLAs are useful to show good faith, which may matter big time if the contributor turns out to have an all-IP-yours-is-ours employment contract or the like.) You don't really need to be a lawyer to understand this stuff, you just need to think like a security person.
https://www.schneier.com/blog/archives/2008/03/the_security_mi_1.html From ncoghlan at gmail.com Fri Jan 22 02:54:05 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Fri, 22 Jan 2016 17:54:05 +1000 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On 22 January 2016 at 07:27, Emile van Sebille wrote: > On 1/21/2016 10:42 AM, Paul Moore wrote: >> >> On 21 January 2016 at 17:18, Brett Cannon wrote: >>> >>> It's live: https://docs.python.org/devguide/#status-of-python-branches >> >> Nice :-) >> >> Minor nit, the status column says "end of life", but the text below >> the table uses the term "end of line" (as does the comment "Versions >> older than 2.6 reached their end-of-line". From my experience, "end of >> life" is the more common term. > > I'd prefer end-of-support -- bet you can't count how many pre 2.5 > installations are still live. I can count the number of folks contributing changes to the upstream Python 2.5 branch: zero. Even if somebody offered a patch for it, we wouldn't accept it - that maintenance branch is dead, which is what the "End of Life" refers to. Folks are still free to run it (all past Python releases remain online, all the way back to 1.1), and downstreams may still offer support for it, but that's their call. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From victor.stinner at gmail.com Fri Jan 22 03:26:59 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 22 Jan 2016 09:26:59 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: 2016-01-21 19:42 GMT+01:00 Paul Moore : > Minor nit, the status column says "end of life", but the text below > the table uses the term "end of line" Ooops, it's a funny typo. Fixed. Thanks for the report! 
Victor From victor.stinner at gmail.com Fri Jan 22 11:03:39 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Fri, 22 Jan 2016 17:03:39 +0100 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: 2016-01-21 18:18 GMT+01:00 Brett Cannon : > It's live: https://docs.python.org/devguide/#status-of-python-branches There is a very strange bug in this website. This URL shows the table: https://docs.python.org/devguide/ This URL doesn't show the table: https://docs.python.org/devguide/index.html Outdated version of the guide? This bug can be seen without a browser, using wget: $ wget -O- https://docs.python.org/devguide/ 2>&1|grep 'Python branches'

Status of Python branches (...)
Status of Python branches
$ wget -O- https://docs.python.org/devguide/index.html 2>&1|grep 'Python branches' Victor From berker.peksag at gmail.com Fri Jan 22 11:21:48 2016 From: berker.peksag at gmail.com (=?UTF-8?Q?Berker_Peksa=C4=9F?=) Date: Fri, 22 Jan 2016 18:21:48 +0200 Subject: [Python-Dev] Devguide: Add a table summarizing status of Python branches In-Reply-To: References: Message-ID: On Fri, Jan 22, 2016 at 6:03 PM, Victor Stinner wrote: > 2016-01-21 18:18 GMT+01:00 Brett Cannon : >> It's live: https://docs.python.org/devguide/#status-of-python-branches > > There is a very strange bug in this website. > > This URL shows the table: > https://docs.python.org/devguide/ > > This URL doesn't show the table: > https://docs.python.org/devguide/index.html > > Outdated version of the guide? It looks like a cache issue. I purged the cache for /devguide/index.html: $ wget -O- https://docs.python.org/devguide/index.html 2>&1 | grep "Python branches"

    Status of Python branches

Status of Python branches
  • --Berker From status at bugs.python.org Fri Jan 22 12:08:33 2016 From: status at bugs.python.org (Python tracker) Date: Fri, 22 Jan 2016 18:08:33 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20160122170833.EF922560CE@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2016-01-15 - 2016-01-22) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. Issues counts and deltas: open 5354 (-33) closed 32585 (+89) total 37939 (+56) Open issues with patches: 2348 Issues opened (31) ================== #25791: Raise an ImportWarning when __spec__.parent/__package__ isn't http://bugs.python.org/issue25791 reopened by brett.cannon #26128: Let the subprocess.STARTUPINFO constructor take arguments http://bugs.python.org/issue26128 opened by cool-RR #26130: redundant local copy of a char pointer in classify in Parser\p http://bugs.python.org/issue26130 opened by Oren Milman #26131: Raise ImportWarning when loader.load_module() is used http://bugs.python.org/issue26131 opened by brett.cannon #26132: 2.7.11 Windows Installer issues on Win2008R2 http://bugs.python.org/issue26132 opened by David Rader #26133: asyncio: ugly error related to signal handlers at exit if the http://bugs.python.org/issue26133 opened by Alex Brandt #26134: HTTPPasswordMgrWithPriorAuth does not work with DigestAuthenti http://bugs.python.org/issue26134 opened by guesommer #26136: DeprecationWarning for PEP 479 (generator_stop) http://bugs.python.org/issue26136 opened by martin.panter #26137: [idea] use the Microsoft Antimalware Scan Interface http://bugs.python.org/issue26137 opened by Alexander Riccio #26140: inspect.iscoroutinefunction raises TypeError when checks Mock http://bugs.python.org/issue26140 opened by miyakogi #26141: typing module documentation incomplete http://bugs.python.org/issue26141 opened by Ben.Darnell #26143: Ensure that IDLE's stdlib imports are from the stdlib 
http://bugs.python.org/issue26143 opened by terry.reedy #26144: test_pkg test_4 and/or test_7 sometimes fail http://bugs.python.org/issue26144 opened by martin.panter #26145: PEP 511: Add sys.set_code_transformers() http://bugs.python.org/issue26145 opened by haypo #26146: PEP 511: Add ast.Constant to allow AST optimizer to emit const http://bugs.python.org/issue26146 opened by haypo #26148: String literals are not interned if in a tuple http://bugs.python.org/issue26148 opened by serhiy.storchaka #26149: Suggest PyCharm Community as an editor for Unix platforms http://bugs.python.org/issue26149 opened by John Hagen #26152: A non-breaking space in a source http://bugs.python.org/issue26152 opened by Drekin #26153: PyImport_GetModuleDict: no module dictionary! when `__del__` t http://bugs.python.org/issue26153 opened by minrk #26155: 3.5.1 installer issue on Win 7 32 bit http://bugs.python.org/issue26155 opened by TarotRedhand #26158: File truncate() not defaulting to current position as document http://bugs.python.org/issue26158 opened by fornax #26159: Unsafe to BaseEventLoop.set_debug(False) when PYTHONASYNCIODEB http://bugs.python.org/issue26159 opened by Bradley McLean #26160: Tutorial incorrectly claims that (explicit) relative imports d http://bugs.python.org/issue26160 opened by Kevin.Norris #26167: Improve copy.copy speed for built-in types (list/set/dict) http://bugs.python.org/issue26167 opened by josh.r #26168: Py_BuildValue may leak 'N' arguments on PyTuple_New failure http://bugs.python.org/issue26168 opened by squidevil #26173: test_ssl.bad_cert_test() exception handling http://bugs.python.org/issue26173 opened by martin.panter #26175: Fully implement IOBase abstract on SpooledTemporaryFile http://bugs.python.org/issue26175 opened by Gary Fernie #26176: EmailMessage example doesn't work http://bugs.python.org/issue26176 opened by Srujan Chaitanya #26177: tkinter: Canvas().keys returns empty strings. 
http://bugs.python.org/issue26177 opened by terry.reedy #26180: multiprocessing.util._afterfork_registry leak in threaded envi http://bugs.python.org/issue26180 opened by mzamazal #26181: argparse can't handle positional argument after list (help mes http://bugs.python.org/issue26181 opened by atpage Most recent 15 issues with no replies (15) ========================================== #26181: argparse can't handle positional argument after list (help mes http://bugs.python.org/issue26181 #26180: multiprocessing.util._afterfork_registry leak in threaded envi http://bugs.python.org/issue26180 #26176: EmailMessage example doesn't work http://bugs.python.org/issue26176 #26173: test_ssl.bad_cert_test() exception handling http://bugs.python.org/issue26173 #26168: Py_BuildValue may leak 'N' arguments on PyTuple_New failure http://bugs.python.org/issue26168 #26159: Unsafe to BaseEventLoop.set_debug(False) when PYTHONASYNCIODEB http://bugs.python.org/issue26159 #26153: PyImport_GetModuleDict: no module dictionary! when `__del__` t http://bugs.python.org/issue26153 #26141: typing module documentation incomplete http://bugs.python.org/issue26141 #26136: DeprecationWarning for PEP 479 (generator_stop) http://bugs.python.org/issue26136 #26131: Raise ImportWarning when loader.load_module() is used http://bugs.python.org/issue26131 #26128: Let the subprocess.STARTUPINFO constructor take arguments http://bugs.python.org/issue26128 #26122: Isolated mode doesn't ignore PYTHONHASHSEED http://bugs.python.org/issue26122 #26120: pydoc: move __future__ imports out of the DATA block http://bugs.python.org/issue26120 #26117: Close directory descriptor in scandir iterator on error http://bugs.python.org/issue26117 #26103: Contradiction in definition of "data descriptor" between (dott http://bugs.python.org/issue26103 Most recent 15 issues waiting for review (15) ============================================= #26177: tkinter: Canvas().keys returns empty strings. 
http://bugs.python.org/issue26177 #26175: Fully implement IOBase abstract on SpooledTemporaryFile http://bugs.python.org/issue26175 #26167: Improve copy.copy speed for built-in types (list/set/dict) http://bugs.python.org/issue26167 #26146: PEP 511: Add ast.Constant to allow AST optimizer to emit const http://bugs.python.org/issue26146 #26145: PEP 511: Add sys.set_code_transformers() http://bugs.python.org/issue26145 #26144: test_pkg test_4 and/or test_7 sometimes fail http://bugs.python.org/issue26144 #26140: inspect.iscoroutinefunction raises TypeError when checks Mock http://bugs.python.org/issue26140 #26130: redundant local copy of a char pointer in classify in Parser\p http://bugs.python.org/issue26130 #26125: Incorrect error message in the module asyncio.selector_events. http://bugs.python.org/issue26125 #26121: Use C99 functions in math if available http://bugs.python.org/issue26121 #26117: Close directory descriptor in scandir iterator on error http://bugs.python.org/issue26117 #26110: Speedup method calls 1.2x http://bugs.python.org/issue26110 #26098: PEP 510: Specialize functions with guards http://bugs.python.org/issue26098 #26089: Duplicated keyword in distutils metadata http://bugs.python.org/issue26089 #26082: functools.lru_cache user specified cachedict support http://bugs.python.org/issue26082 Top 10 most discussed issues (10) ================================= #26158: File truncate() not defaulting to current position as document http://bugs.python.org/issue26158 14 msgs #25702: Link Time Optimizations support for GCC and CLANG http://bugs.python.org/issue25702 12 msgs #19475: Add timespec optional flag to datetime isoformat() to choose t http://bugs.python.org/issue19475 9 msgs #25878: CPython on Windows builds with /W3, not /W4 http://bugs.python.org/issue25878 8 msgs #25907: Documentation i18n: Added trans tags in sphinx templates http://bugs.python.org/issue25907 8 msgs #26145: PEP 511: Add sys.set_code_transformers() 
http://bugs.python.org/issue26145 8 msgs #26152: A non-breaking space in a source http://bugs.python.org/issue26152 7 msgs #23883: __all__ lists are incomplete http://bugs.python.org/issue23883 6 msgs #25934: ICC compiler: ICC treats denormal floating point numbers as 0. http://bugs.python.org/issue25934 6 msgs #26146: PEP 511: Add ast.Constant to allow AST optimizer to emit const http://bugs.python.org/issue26146 6 msgs Issues closed (84) ================== #5626: misleading comment in socket.gethostname() documentation http://bugs.python.org/issue5626 closed by berker.peksag #8604: Adding an atomic FS write API http://bugs.python.org/issue8604 closed by haypo #9006: xml-rpc Server object does not propagate the encoding to Unmar http://bugs.python.org/issue9006 closed by serhiy.storchaka #12869: PyOS_StdioReadline is printing the prompt on stderr http://bugs.python.org/issue12869 closed by martin.panter #14046: argparse: assertion failure if optional argument has square/ro http://bugs.python.org/issue14046 closed by martin.panter #15809: 2.7 IDLE console uses incorrect encoding. 
http://bugs.python.org/issue15809 closed by serhiy.storchaka #16620: Avoid using private function glob.glob1() in msi module and to http://bugs.python.org/issue16620 closed by serhiy.storchaka #16907: Distutils fails to build extension in path with spaces http://bugs.python.org/issue16907 closed by zach.ware #16956: Allow signed line number deltas in the code object's line num http://bugs.python.org/issue16956 closed by haypo #17633: zipimport's handling of namespace packages is incorrect http://bugs.python.org/issue17633 closed by brett.cannon #18620: multiprocessing page leaves out important part of Pool example http://bugs.python.org/issue18620 closed by berker.peksag #21385: in debug mode, compile(ast) fails with an assertion error if a http://bugs.python.org/issue21385 closed by haypo #21847: Fix xmlrpc in unicodeless build http://bugs.python.org/issue21847 closed by serhiy.storchaka #21949: Document the Py_SIZE() macro. http://bugs.python.org/issue21949 closed by berker.peksag #23795: argparse -- incorrect usage for mutually exclusive http://bugs.python.org/issue23795 closed by martin.panter #23962: Incorrect TimeoutError referenced in concurrent.futures docume http://bugs.python.org/issue23962 closed by orsenthil #23965: test_ssl failure on Fedora 22 http://bugs.python.org/issue23965 closed by ncoghlan #24520: Stop using deprecated floating-point environment functions on http://bugs.python.org/issue24520 closed by haypo #24761: ERROR: test_dh_params (test.test_ssl.ThreadedTests) http://bugs.python.org/issue24761 closed by martin.panter #24832: Issue building viewable docs with newer sphinx (default theme http://bugs.python.org/issue24832 closed by benjamin.peterson #24985: Python install test fails - OpenSSL - "dh key too small" http://bugs.python.org/issue24985 closed by martin.panter #25058: Right square bracket argparse metavar http://bugs.python.org/issue25058 closed by martin.panter #25089: Can't run Python Launcher on Windows 
http://bugs.python.org/issue25089 closed by steve.dower #25366: test_venv fails with --without-threads http://bugs.python.org/issue25366 closed by berker.peksag #25613: fix ssl tests with sslv3 disabled http://bugs.python.org/issue25613 closed by martin.panter #25644: Unable to open IDLE on Windows 10 with Python 2.7.10 http://bugs.python.org/issue25644 closed by steve.dower #25694: test.libregrtest not installed http://bugs.python.org/issue25694 closed by steve.dower #25704: Update the devguide to 3.5 http://bugs.python.org/issue25704 closed by berker.peksag #25731: Assigning and deleting __new__ attr on the class does not allo http://bugs.python.org/issue25731 closed by python-dev #25759: Python 2.7.11rc1 not building with Visual Studio 2015 http://bugs.python.org/issue25759 closed by steve.dower #25765: Installation error http://bugs.python.org/issue25765 closed by steve.dower #25799: 2.7.11rc1 not added to Win10 app list (start menu) http://bugs.python.org/issue25799 closed by terry.reedy #25824: 32-bit 2.7.11 installer creates registry keys that are incompa http://bugs.python.org/issue25824 closed by steve.dower #25843: code_richcompare() don't use constant type when comparing code http://bugs.python.org/issue25843 closed by haypo #25850: Building extensions with MSVC 2015 Express fails http://bugs.python.org/issue25850 closed by steve.dower #25859: EOFError in test_nntplib.NetworkedNNTPTests.test_starttls() http://bugs.python.org/issue25859 closed by martin.panter #25876: test_gdb: use subprocess._args_from_interpreter_flags() to tes http://bugs.python.org/issue25876 closed by haypo #25909: Incorrect documentation for PyMapping_Items and like http://bugs.python.org/issue25909 closed by orsenthil #25925: Coverage support for CPython 2 http://bugs.python.org/issue25925 closed by zach.ware #25935: OrderedDict prevents garbage collection if a circulary referen http://bugs.python.org/issue25935 closed by serhiy.storchaka #25982: multiprocessing docs for Namespace 
lacks class definition http://bugs.python.org/issue25982 closed by orsenthil #25993: Crashed when call time.time() after using _mm_xor_si64 http://bugs.python.org/issue25993 closed by steve.dower #26013: Pickle protocol 2.0 not loading in python 3.5 http://bugs.python.org/issue26013 closed by serhiy.storchaka #26017: Update https://docs.python.org/3/installing/index.html to alwa http://bugs.python.org/issue26017 closed by orsenthil #26035: traceback.print_tb() takes `tb`, not `traceback` as a keyword http://bugs.python.org/issue26035 closed by orsenthil #26059: Integer Overflow in strop.replace() http://bugs.python.org/issue26059 closed by serhiy.storchaka #26065: python embedded 3.5 amd64 crash when using venv http://bugs.python.org/issue26065 closed by steve.dower #26070: Launcher fails to find in-place built binaries from earlier Py http://bugs.python.org/issue26070 closed by steve.dower #26071: bdist_wininst created binaries fail to start and find 32bit Py http://bugs.python.org/issue26071 closed by steve.dower #26073: Update the list of magic numbers in launcher http://bugs.python.org/issue26073 closed by steve.dower #26077: Make slicing of immutable structures return a view instead of http://bugs.python.org/issue26077 closed by serhiy.storchaka #26099: site ignores ImportError when running sitecustomize and usercu http://bugs.python.org/issue26099 closed by haypo #26100: PEP 511: Add test.support.optim_args_from_interpreter_flags() http://bugs.python.org/issue26100 closed by haypo #26101: Lib/test/test_compileall.py fails when run directly http://bugs.python.org/issue26101 closed by haypo #26106: Move licences to literal blocks http://bugs.python.org/issue26106 closed by haypo #26107: PEP 511: code.co_lnotab: use signed line number delta to suppo http://bugs.python.org/issue26107 closed by haypo #26108: Calling PyInitialize with 2.7.11 on Windows x64 terminates pro http://bugs.python.org/issue26108 closed by steve.dower #26114: Rewrite math.erf() and 
math.erfc() from scratch http://bugs.python.org/issue26114 closed by brett.cannon #26126: Possible subtle bug when normalizing and str.translate()ing http://bugs.python.org/issue26126 closed by SilentGhost #26127: Broken link in docs for tokenize http://bugs.python.org/issue26127 closed by martin.panter #26129: Difference in behaviour with grp.getgrgid and pwd.getpwuid http://bugs.python.org/issue26129 closed by larry #26135: Documentation Recommends Deprecated `imp` Module http://bugs.python.org/issue26135 closed by orsenthil #26138: Disable /W4 warning (non-standard dllimport behavior) http://bugs.python.org/issue26138 closed by skrah #26139: libmpdec: disable /W4 warning (non-standard dllimport behavior http://bugs.python.org/issue26139 closed by skrah #26142: Formatting bug on https://docs.python.org/2.7/c-api/intro.html http://bugs.python.org/issue26142 closed by orsenthil #26147: Encoding errors in xmlrpc http://bugs.python.org/issue26147 closed by serhiy.storchaka #26150: SequenceMatcher's algorithm is not correct http://bugs.python.org/issue26150 closed by tim.peters #26151: str(bytes) does __repr__() instead of __str__() http://bugs.python.org/issue26151 closed by haypo #26154: Add private _PyThreadState_UncheckedGet() to get the current t http://bugs.python.org/issue26154 closed by haypo #26156: Bad name into power operator syntax http://bugs.python.org/issue26156 closed by yselivanov #26157: Typo in asyncio documentation http://bugs.python.org/issue26157 closed by berker.peksag #26161: Use Py_uintptr_t instead of void* for atomic pointers http://bugs.python.org/issue26161 closed by haypo #26162: thread error http://bugs.python.org/issue26162 closed by eryksun #26163: FAIL: test_hash_effectiveness (test.test_set.TestFrozenSet) http://bugs.python.org/issue26163 closed by haypo #26164: test_with_pip() of test_venv fails on Windows buildbots http://bugs.python.org/issue26164 closed by haypo #26165: devguide: table summarizing status of Python branches 
http://bugs.python.org/issue26165 closed by haypo #26166: zlib compressor/decompressor objects should support copy proto http://bugs.python.org/issue26166 closed by serhiy.storchaka #26169: Pasting 900000 chars into a tk Entry widget fails http://bugs.python.org/issue26169 closed by serhiy.storchaka #26170: pip Crash on Unpacking in get_platform() line 119 http://bugs.python.org/issue26170 closed by dstufft #26171: heap overflow in zipimporter module http://bugs.python.org/issue26171 closed by python-dev #26172: iBook can't open ePub http://bugs.python.org/issue26172 closed by python-dev #26174: Exception alias cause destruction of existing variable http://bugs.python.org/issue26174 closed by brett.cannon #26178: Python C-API: __all__ Creator http://bugs.python.org/issue26178 closed by Devyn Johnson #26179: Python C-API "unused-parameter" warnings http://bugs.python.org/issue26179 closed by haypo From robertpancoast77 at gmail.com Fri Jan 22 14:56:37 2016 From: robertpancoast77 at gmail.com (=?UTF-8?Q?=C6=A6OB_COASTN?=) Date: Fri, 22 Jan 2016 14:56:37 -0500 Subject: [Python-Dev] python3 k1om dissociation permanence Message-ID: Hello, Enabling the build system for Intel MIC k1om is non-trivial using Python-3.4.4 Using Python2 for the k1om is very popular, but I require Python3 support on k1om. The first requirement to complete this build involved the download and extraction of pre-built MPSS RPM's. Then built required host python bins using GCC. Lastly, build MIC bins using ICC. The exacts are appended to the end of this message. I would like to discuss a few change requirements that trial and error has revealed. 1.) libffi requires the University OF Cantabria patch because the k1om is not binary compatible with x86_64. [attached] These libffi changes could be implemented using the __MIC__ or __KNC__ macros. *see https://software.intel.com/en-us/node/514528 2.) ./configure script halts during failure to locate readelf for the host. 
I simply commented out these lines in the ./configure file: #if test "$cross_compiling" = yes; then #case "$READELF" in #readelf|:) #as_fn_error $? "readelf for the host is required for cross builds" "$LINENO" 5 #;; #esac #fi Ideally, ./configure would support ICC and k1om. Am I missing something in the configure/make commands below? Is it possible to bypass the readelf requirement when cross-compiling for k1om? Additionally, are any of the command line parameters below unnecessary? PKG_CONFIG_LIBDIR PKG_CONFIG_PATH PYTHON_FOR_BUILD _PYTHON_HOST_PLATFORM HOSTPGEN HOSTARCH BUILDARCH Thanks, Rob #copy/unpack the k1om bins tarball cd /home/ wget mpss-3.4.6-k1om.tar tar xvf mpss-3.4.6-k1om.tar cd /home/mpss-3.4.6/k1om/ for rpm in *.rpm; do rpm2cpio $rpm | cpio -idm; done #vars PythonVersion=Python-3.4.4 k1om_rpm=/home/mpss-3.4.6/k1om/ INSTALLPREFIX=/home/Python/release/$PythonVersion-mic SRC=/home/Python/$PythonVersion echo "Compiling host Python" cd $SRC && make distclean cd $SRC && ./configure cd $SRC && make python Parser/pgen rm -f $SRC/hostpython mv $SRC/python $SRC/hostpython rm -f $SRC/Parser/hostpgen mv $SRC/Parser/pgen $SRC/Parser/hostpgen cd $SRC && make distclean echo "Configuring Python for MIC..." cd $SRC && CONFIG_SITE=config.site \ ./configure \ CC="icc -mmic" \ CFLAGS="-I$k1om_rpm/include -I$k1om_rpm/usr/include -wd10006" \ CXX="icpc -mmic" \ CPPFLAGS="-I$k1om_rpm/include -I$k1om_rpm/usr/include -wd10006" \ PKG_CONFIG_LIBDIR="$k1om_rpm/usr/lib64/pkgconfig" \ PKG_CONFIG_PATH="$k1om_rpm/usr/lib64/pkgconfig" \ --host=x86_64-k1om-linux \ --build=x86_64-linux-gnu \ --with-cxx-main="icpc -mmic" \ --disable-ipv6 echo "done" echo "Compiling Python for MIC..." 
cd $SRC && make \ PYTHON_FOR_BUILD=./hostpython \ _PYTHON_HOST_PLATFORM=x86_64-k1om-linux \ HOSTPGEN=./Parser/hostpgen \ HOSTARCH=x86_64-k1om-linux \ BUILDARCH=x86_64-linux-gnu \ EXTRA_CFLAGS="-fp-model precise -shared -fPIC" \ LDFLAGS="-L$k1om_rpm/lib64 -L$k1om_rpm/usr/lib64" echo "done" echo "Installing Python for MIC" mkdir -p $INSTALLPREFIX cd $SRC && make install \ PYTHON_FOR_BUILD=./hostpython \ _PYTHON_HOST_PLATFORM=x86_64-k1om-linux \ prefix=$INSTALLPREFIX echo "done" -------------- next part -------------- A non-text attachment was scrubbed... Name: 0001-k1om-libffi.patch Type: application/octet-stream Size: 18705 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 0002-READELF.patch Type: application/octet-stream Size: 790 bytes Desc: not available URL: From brett at python.org Fri Jan 22 16:50:04 2016 From: brett at python.org (Brett Cannon) Date: Fri, 22 Jan 2016 21:50:04 +0000 Subject: [Python-Dev] python3 k1om dissociation permanence In-Reply-To: References: Message-ID: If you could, ?OB, can you open issues on bugs.python.org for each of these problems/changes? Otherwise we will lose track of this. On Fri, 22 Jan 2016 at 11:57 ?OB COASTN wrote: > Hello, > > Enabling the build system for Intel MIC k1om is non-trivial using > Python-3.4.4 > Using Python2 for the k1om is very popular, but I require Python3 > support on k1om. > > The first requirement to complete this build involved the download and > extraction of pre-built MPSS RPM's. > Then built required host python bins using GCC. > Lastly, build MIC bins using ICC. > The exacts are appended to the end of this message. > > I would like to discuss a few change requirements that trial and error > has revealed. > > 1.) libffi requires the University OF Cantabria patch because the k1om > is not binary compatible with x86_64. [attached] > > These libffi changes could be implemented using the __MIC__ or __KNC__ > macros. 
> *see https://software.intel.com/en-us/node/514528 > > 2.) ./configure script halts during failure to locate readelf for the host. > > I simply commented out these lines in the ./configure file: > #if test "$cross_compiling" = yes; then > #case "$READELF" in > #readelf|:) > #as_fn_error $? "readelf for the host is required for cross > builds" "$LINENO" 5 > #;; > #esac > #fi > > Ideally, ./configure would support ICC and k1om. > Am I missing something in the configure/make commands below? > Is it possible to bypass the readelf requirement when cross-compiling for > k1om? > > Additionally, are any of the command line parameters below unnecessary? > PKG_CONFIG_LIBDIR > PKG_CONFIG_PATH > PYTHON_FOR_BUILD > _PYTHON_HOST_PLATFORM > HOSTPGEN > HOSTARCH > BUILDARCH > > > Thanks, > Rob > > > > > #copy/unpack the k1om bins tarball > cd /home/ > wget mpss-3.4.6-k1om.tar > tar xvf mpss-3.4.6-k1om.tar > cd /home/mpss-3.4.6/k1om/ > for rpm in *.rpm; do rpm2cpio $rpm | cpio -idm; done > > #vars > PythonVersion=Python-3.4.4 > k1om_rpm=/home/mpss-3.4.6/k1om/ > INSTALLPREFIX=/home/Python/release/$PythonVersion-mic > SRC=/home/Python/$PythonVersion > > echo "Compiling host Python" > cd $SRC && make distclean > cd $SRC && ./configure > cd $SRC && make python Parser/pgen > rm -f $SRC/hostpython > mv $SRC/python $SRC/hostpython > rm -f $SRC/Parser/hostpgen > mv $SRC/Parser/pgen $SRC/Parser/hostpgen > cd $SRC && make distclean > > echo "Configuring Python for MIC..." > cd $SRC && CONFIG_SITE=config.site \ > ./configure \ > CC="icc -mmic" \ > CFLAGS="-I$k1om_rpm/include -I$k1om_rpm/usr/include -wd10006" \ > CXX="icpc -mmic" \ > CPPFLAGS="-I$k1om_rpm/include -I$k1om_rpm/usr/include -wd10006" \ > PKG_CONFIG_LIBDIR="$k1om_rpm/usr/lib64/pkgconfig" \ > PKG_CONFIG_PATH="$k1om_rpm/usr/lib64/pkgconfig" \ > --host=x86_64-k1om-linux \ > --build=x86_64-linux-gnu \ > --with-cxx-main="icpc -mmic" \ > --disable-ipv6 > echo "done" > > echo "Compiling Python for MIC..." 
> cd $SRC && make \ > PYTHON_FOR_BUILD=./hostpython \ > _PYTHON_HOST_PLATFORM=x86_64-k1om-linux \ > HOSTPGEN=./Parser/hostpgen \ > HOSTARCH=x86_64-k1om-linux \ > BUILDARCH=x86_64-linux-gnu \ > EXTRA_CFLAGS="-fp-model precise -shared -fPIC" \ > LDFLAGS="-L$k1om_rpm/lib64 -L$k1om_rpm/usr/lib64" > echo "done" > > echo "Installing Python for MIC" > mkdir -p $INSTALLPREFIX > cd $SRC && make install \ > PYTHON_FOR_BUILD=./hostpython \ > _PYTHON_HOST_PLATFORM=x86_64-k1om-linux \ > prefix=$INSTALLPREFIX > echo "done" > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Fri Jan 22 18:44:57 2016 From: brett at python.org (Brett Cannon) Date: Fri, 22 Jan 2016 23:44:57 +0000 Subject: [Python-Dev] do people use sys._mercurial? Message-ID: Since we are going to be switching over to Git, sys._mercurial is going to be made to return a dummy value of `('CPython', '', '')` once we switch to Git. But my question is do we bother to replace it with sys._git? I wanted to make sure that the effort is worth it to keep changing these VCS-specific attributes every time we change our VCS (and no, we are not going to adopt a generic one; already had that debate). So do please speak up if you actually have found value from sys._mercurial. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Fri Jan 22 21:47:14 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sat, 23 Jan 2016 12:47:14 +1000 Subject: [Python-Dev] do people use sys._mercurial? 
In-Reply-To: References: Message-ID: On 23 January 2016 at 09:44, Brett Cannon wrote: > Since we are going to be switching over to Git, sys._mercurial is going to > be made to return a dummy value of `('CPython', '', '')` once we switch to > Git. But my question is do we bother to replace it with sys._git? I wanted > to make sure that the effort is worth it to keep changing these VCS-specific > attributes every time we change our VCS (and no, we are not going to adopt a > generic one; already had that debate). So do please speak up if you actually > have found value from sys._mercurial. It's incorporated into the output of "platform.python_(branch|revision|build)()", so I assume most people would use those if they needed to report exact build information, rather than accessing the attribute directly. We also use it ourselves in printing the appropriate banner when running an interactive prompt (note the first line of the banner): Python 3.6.0a0 (default:32a4e7b337c9, Jan 23 2016, 12:30:00) [GCC 5.3.1 20151207 (Red Hat 5.3.1-2)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> sys._mercurial ('CPython', 'default', '32a4e7b337c9') And the regression test suite: $ ./python -m test == CPython 3.6.0a0 (default:32a4e7b337c9, Jan 23 2016, 12:30:00) [GCC 5.3.1 20151207 (Red Hat 5.3.1-2)] == Linux-4.3.3-301.fc23.x86_64-x86_64-with-fedora-23-Twenty_Three little-endian == hash algorithm: siphash24 64bit == /home/ncoghlan/devel/cpython/build/test_python_13167 Testing with flags: sys.flags(debug=0, inspect=0, interactive=0, optimize=0, dont_write_bytecode=0, no_user_site=0, no_site=0, ignore_environment=0, verbose=0, bytes_warning=0, quiet=0, hash_randomization=1, isolated=0) [ 1/401] test_grammar ... I guess that means "Update platform.py, and test the interactive prompt and regression test banners still work as expected when built from git" needs to be added to the CPython migration part of the PEP. 
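For anyone wanting to check what those helpers report on a given interpreter, a minimal probe is below; the values depend entirely on how the local build was made, so none of the outputs are guaranteed, and the `sys._mercurial` attribute is CPython-specific:

```python
import platform
import sys

# These helpers parse the VCS metadata baked into sys.version at build
# time; on a build without VCS info they simply return empty strings.
print(platform.python_branch())    # branch name, or ''
print(platform.python_revision())  # revision hash, or ''
print(platform.python_build())     # (buildno, builddate) tuple

# sys._mercurial only exists on CPython, so probe it defensively rather
# than accessing it directly.
print(getattr(sys, "_mercurial", None))
```

Code that needs the build information portably should prefer the `platform` functions over the underscore-prefixed attribute.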
(In looking into this, I found several of the docstrings in platform.py still referred to Subversion, even though the platform._sys_version() helping had been updated to handle Mercurial) Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From nad at python.org Fri Jan 22 22:33:39 2016 From: nad at python.org (Ned Deily) Date: Fri, 22 Jan 2016 22:33:39 -0500 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: Message-ID: On Jan 22, 2016, at 18:44, Brett Cannon wrote: > Since we are going to be switching over to Git, sys._mercurial is going to be made to return a dummy value of `('CPython', '', '')` once we switch to Git. But my question is do we bother to replace it with sys._git? I wanted to make sure that the effort is worth it to keep changing these VCS-specific attributes every time we change our VCS (and no, we are not going to adopt a generic one; already had that debate). So do please speak up if you actually have found value from sys._mercurial. As long as the git revision tag (if any) and hash show up in sys.version and the interpreter interactive (REPL) mode header as they do today with hg and previously with svn (that is, when the interpreter is built from a vcs checkout), I'm happy: $ python3 Python 3.5.1 (v3.5.1:37a07cee5969, Dec 5 2015, 21:12:44) [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> -- Ned Deily nad at python.org -- [] From guido at python.org Sat Jan 23 00:50:03 2016 From: guido at python.org (Guido van Rossum) Date: Fri, 22 Jan 2016 21:50:03 -0800 Subject: [Python-Dev] Typehinting repo moved to python/typing Message-ID: This is just a note that with Benjamin's help we've moved the ambv/typehinting repo on GitHub into the python org, so its URL is now https://github.com/python/typing . This repo was used most intensely for discussions during PEP 484's drafting period. 
It also contains the code for typing.py, repackaged for earlier releases on PyPI. The issue tracker is still open for proposals to change PEP 484, which is not unheard of given its provisional status. If you find a pointer to the original location of this repo in a file you can update, please go ahead (though GitHub is pretty good at forwarding URLs from renamed repos). -- --Guido van Rossum (python.org/~guido) -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Sat Jan 23 01:03:12 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 23 Jan 2016 17:03:12 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? Message-ID: I just had a major crash on the system that hosts the angelico-debian-amd64 buildbot, and as usual, checked it carefully after bringing everything up. It seems now to be timing out after an hour of operation: http://buildbot.python.org/all/builders/AMD64%20Debian%20root%203.x/builds/3132/steps/test/logs/stdio This is happening across all the 3.* branches on my buildbot, but NOT on 2.7, and not on any other buildbots. That makes it look like some kind of config problem at my end. In seeking to diagnose it, I duplicated the 3.x build directory and manually ran the commands to run a test, and it stalled out (I didn't let it go to the whole hour but it sat there for some minutes) at the same place: in test_socket. Running just that test file: $ ./python Lib/test/test_socket.py ... chomp lots of lines ... testRecvmsgPeek (__main__.RecvmsgUDP6Test) ... seems to indicate that the stall is due to IPv6 and UDP. The VM should have full IPv6 support, although my ISPs don't carry IPv6 traffic, so it won't be able to reach the internet proper; but it should be able to do all manner of local tests. The test runs just fine on my main system, so it's only failing in the buildbot VM. Any ideas as to what's going on or how to diagnose? 
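One quick way to check whether UDP over IPv6 works locally at all, independent of the test suite: send one datagram to ourselves over the loopback address. This is roughly the plumbing the stalled RecvmsgUDP6Test relies on (an illustrative sketch, not the actual test code; the helper name is made up):

```python
import socket

def check_udp6_loopback(timeout=5.0):
    """Bind a UDP/IPv6 receiver on ::1 and send one datagram to it.
    Returns True if the datagram arrives, False if local IPv6 is unusable."""
    recv = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
    send = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
    try:
        recv.bind(('::1', 0))      # port 0: let the kernel pick a free port
        recv.settimeout(timeout)
        send.sendto(b'ping', recv.getsockname())
        data, _addr = recv.recvfrom(1024)
        return data == b'ping'
    except OSError:
        return False               # no usable local IPv6
    finally:
        recv.close()
        send.close()

print(check_udp6_loopback())
```

If this hangs or returns False inside the VM but works on the host, the problem is the VM's IPv6 configuration rather than CPython.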
By the way, this looks odd: make buildbottest TESTOPTS= TESTPYTHONOPTS= TESTTIMEOUT=3600 in dir /root/buildarea/3.x.angelico-debian-amd64/build (timeout 3900 secs) The parameter says 3600 (which corresponds to the error message at the end), but it echoes back that the timeout is 3900 seconds. ChrisA From zachary.ware+pydev at gmail.com Sat Jan 23 01:39:07 2016 From: zachary.ware+pydev at gmail.com (Zachary Ware) Date: Sat, 23 Jan 2016 00:39:07 -0600 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: On Sat, Jan 23, 2016 at 12:03 AM, Chris Angelico wrote: > By the way, this looks odd: > > make buildbottest TESTOPTS= TESTPYTHONOPTS= TESTTIMEOUT=3600 > in dir /root/buildarea/3.x.angelico-debian-amd64/build (timeout 3900 secs) > > The parameter says 3600 (which corresponds to the error message at the > end), but it echoes back that the timeout is 3900 seconds. I'm no help on the main issue, but to explain the timeout difference: TESTTIMEOUT is a makefile variable that sets the faulthandler timeout that tries to make Python bail out with a stack trace instead of letting buildbot kill Python silently. The 3900 second timeout is buildbot's "there's been no output in this long, assume it's hung and kill it" timeout. The difference between the two is to give faulthandler a chance to do its thing. -- Zach From rosuav at gmail.com Sat Jan 23 02:47:14 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sat, 23 Jan 2016 18:47:14 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? 
In-Reply-To: References: Message-ID: On Sat, Jan 23, 2016 at 5:39 PM, Zachary Ware wrote: > On Sat, Jan 23, 2016 at 12:03 AM, Chris Angelico wrote: >> By the way, this looks odd: >> >> make buildbottest TESTOPTS= TESTPYTHONOPTS= TESTTIMEOUT=3600 >> in dir /root/buildarea/3.x.angelico-debian-amd64/build (timeout 3900 secs) >> >> The parameter says 3600 (which corresponds to the error message at the >> end), but it echoes back that the timeout is 3900 seconds. > > I'm no help on the main issue, but to explain the timeout difference: > TESTTIMEOUT is a makefile variable that sets the faulthandler timeout > that tries to make Python bail out with a stack trace instead of > letting buildbot kill Python silently. The 3900 second timeout is > buildbot's "there's been no output in this long, assume it's hung and > kill it" timeout. The difference between the two is to give > faulthandler a chance to do its thing. Ah, cool. That's one mystery cleared up, at least. ChrisA From victor.stinner at gmail.com Sat Jan 23 09:37:36 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Sat, 23 Jan 2016 15:37:36 +0100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: 3600 seconds is the maximum duration of a single test file. We may reduce it since a single test file should not take longer than 30 min. Maybe we can do better and put the timeout on a single test function. Victor -------------- next part -------------- An HTML attachment was scrubbed... URL: From rosuav at gmail.com Sat Jan 23 09:55:58 2016 From: rosuav at gmail.com (Chris Angelico) Date: Sun, 24 Jan 2016 01:55:58 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: On Sun, Jan 24, 2016 at 1:37 AM, Victor Stinner wrote: > 3600 seconds is the maximum duration of a single test file. 
We may reduce it > since a single test file should not take longer than 30 min. Maybe we can do > better and put the timeout on a single test function. I'd be inclined to put some strong timeouts on test_socket.py (at some level or other of granularity). Most of those tests should be finished in a fraction of a second; a few might take a few seconds, maybe. None should take a whole minute. But they might easily sit around that long. But I'd rather know what I messed up in my recreation of the VM's configs. ChrisA From brett at python.org Sat Jan 23 14:30:32 2016 From: brett at python.org (Brett Cannon) Date: Sat, 23 Jan 2016 19:30:32 +0000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: Message-ID: It seems that people do use the values so there will be a sys._git attribute. On Fri, 22 Jan 2016 at 19:57 Ned Deily wrote: > On Jan 22, 2016, at 18:44, Brett Cannon wrote: > > Since we are going to be switching over to Git, sys._mercurial is going > to be made to return a dummy value of `('CPython', '', '')` once we switch > to Git. But my question is do we bother to replace it with sys._git? I > wanted to make sure that the effort is worth it to keep changing these > VCS-specific attributes every time we change our VCS (and no, we are not > going to adopt a generic one; already had that debate). So do please speak > up if you actually have found value from sys._mercurial. > > As long as the git revision tag (if any) and hash show up in sys.version > and the interpreter interactive (REPL) mode header as they do today with hg > and previously with svn (that is, when the interpreter is built from a vcs > checkout), I'm happy: > > $ python3 > Python 3.5.1 (v3.5.1:37a07cee5969, Dec 5 2015, 21:12:44) > [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin > Type "help", "copyright", "credits" or "license" for more information. 
> >>> > > -- > Ned Deily > nad at python.org -- [] > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From francismb at email.de Sat Jan 23 14:45:19 2016 From: francismb at email.de (francismb) Date: Sat, 23 Jan 2016 20:45:19 +0100 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: Message-ID: <56A3D84F.70803@email.de> Hi, On 01/23/2016 12:44 AM, Brett Cannon wrote: > Since we are going to be switching over to Git, sys._mercurial is going to > be made to return a dummy value of `('CPython', '', '')` once we switch to > Git. for me, sys._mercurial is already returning that (?): what should it return now? (is it a bug?) $ ipython Python 2.7.11 (default, Dec 9 2015, 00:29:25) Type "copyright", "credits" or "license" for more information. IPython 2.4.1 -- An enhanced Interactive Python. ? -> Introduction and overview of IPython's features. %quickref -> Quick reference. help -> Python's own help system. object? -> Details about 'object', use 'object??' for extra details. In [1]: import sys In [2]: sys._mercurial Out[2]: ('CPython', '', '') In [3]: Regards, francis From brett at python.org Sat Jan 23 14:48:56 2016 From: brett at python.org (Brett Cannon) Date: Sat, 23 Jan 2016 19:48:56 +0000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: <56A3D84F.70803@email.de> References: <56A3D84F.70803@email.de> Message-ID: On Sat, 23 Jan 2016 at 11:45 francismb wrote: > Hi, > > On 01/23/2016 12:44 AM, Brett Cannon wrote: > > Since we are going to be switching over to Git, sys._mercurial is going > to > > be made to return a dummy value of `('CPython', '', '')` once we switch > to > > Git. > > for me sys._mercurial it's already returning that (?)
: what should > return now? (it's a bug?) > Depends on your OS and how CPython was built whether it returns that value or something more useful. IOW it's not a bug. -------------- next part -------------- An HTML attachment was scrubbed... URL: From random832 at fastmail.com Sat Jan 23 19:09:41 2016 From: random832 at fastmail.com (Random832) Date: Sat, 23 Jan 2016 19:09:41 -0500 Subject: [Python-Dev] do people use sys._mercurial? References: Message-ID: Brett Cannon writes: > (and no, we are not going to adopt a generic one; already had that > debate). Do you have a link to the relevant discussions? From brett at python.org Sat Jan 23 19:29:12 2016 From: brett at python.org (Brett Cannon) Date: Sun, 24 Jan 2016 00:29:12 +0000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: Message-ID: Some discussion happened on core-workflow@, otherwise you can look through the python-dev archives for when we added sys._mercurial. On Sat, 23 Jan 2016, 16:25 Random832 wrote: > Brett Cannon writes: > > (and no, we are not going to adopt a generic one; already had that > > debate). > > Do you have a link to the relevant discussions? > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ncoghlan at gmail.com Sun Jan 24 07:17:32 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jan 2016 22:17:32 +1000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: <56A3D84F.70803@email.de> Message-ID: On 24 January 2016 at 05:48, Brett Cannon wrote: > On Sat, 23 Jan 2016 at 11:45 francismb wrote: >> for me sys._mercurial it's already returning that (?) : what should >> return now? (it's a bug?) 
> > Depends on your OS and how CPython was built whether it returns that value > or something more useful. IOW it's not a bug. Linux distros tend to build Python from a tarball rather than a source checkout, for example, which means the build directory doesn't include any VCS details: $ python3 Python 3.4.3 (default, Jun 29 2015, 12:16:01) [GCC 5.1.1 20150618 (Red Hat 5.1.1-4)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> sys._mercurial ('CPython', '', '') While my local checkout does have those details: $ ./python Python 3.6.0a0 (default:32a4e7b337c9, Jan 23 2016, 12:30:00) [GCC 5.3.1 20151207 (Red Hat 5.3.1-2)] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import sys >>> sys._mercurial ('CPython', 'default', '32a4e7b337c9') Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From ncoghlan at gmail.com Sun Jan 24 07:13:32 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Sun, 24 Jan 2016 22:13:32 +1000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: References: Message-ID: On 24 January 2016 at 10:29, Brett Cannon wrote: > Some discussion happened on core-workflow@, otherwise you can look through > the python-dev archives for when we added sys._mercurial. We actually forgot one relevant point in those discussions: there's already a generic API for accessing this information in the platform module. At the sys module level, though, the key point is that the precise semantic interpretation is VCS dependent, so changing the variable name conveys immediately to users both that the semantics have changed, and which VCS is now being used to define them. Cheers, Nick. 
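In other words, a consumer that cares can tell the two cases apart by checking for an empty revision field. A small sketch (the getattr dance is only there because the attribute name changes with the VCS, and may be absent entirely):

```python
import sys

# On a VCS checkout the tuple carries (project, branch, revision);
# on a tarball build it is ('CPython', '', '').
project, branch, revision = getattr(
    sys, '_mercurial', getattr(sys, '_git', ('CPython', '', '')))
if revision:
    print('built from %s checkout %s:%s' % (project, branch, revision))
else:
    print('built from a release tarball (no VCS info)')
```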
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From francismb at email.de Sun Jan 24 10:48:07 2016 From: francismb at email.de (francismb) Date: Sun, 24 Jan 2016 16:48:07 +0100 Subject: [Python-Dev] Code formatter bot In-Reply-To: <569EA3A6.4010802@email.de> References: <569EA3A6.4010802@email.de> Message-ID: <56A4F237.6060709@email.de> Hi, from your valuable feedback, here is what I think could be a preliminary requirements list (let's call it the autopep7 script for now): - It should follow PEP 7 :-) - It should check PEP 7 compliance on a per-file basis (for simplicity) - It should be embeddable in the test suite, returning PASS or FAILURE - It should be usable as a pre-commit hook - It could return a patch to apply to the non-compliant file, making it compliant - It could return the lines (and the reason why) that aren't compliant Some other details on using some autopep7 on the infrastructure (as part of a workflow, by wrapping or enhancing it): - It could be run on a regular basis against the cpython repo to be able to see how many files don't follow PEP 7. The first run shows the legacy ones (plus the current failures). - It should be used in tandem with a "skip" list to avoid checking the legacy code mentioned (or a list of pairs file:reason to skip). - It could be used on the CI side to flag PEP 7 compliance (core devs then don't need to point that out in reviews). Or it could be used as part of a series of checks that generate a "ready-to-review" flag. - It could be used to continually check and propose patches (on bug.python.org). - It could be used to continually check, log the issue on the tracker and commit the patch to the cpython repository. Regards, francis From francismb at email.de Sun Jan 24 12:40:14 2016 From: francismb at email.de (francismb) Date: Sun, 24 Jan 2016 18:40:14 +0100 Subject: [Python-Dev] do people use sys._mercurial?
In-Reply-To: References: <56A3D84F.70803@email.de> Message-ID: <56A50C7E.3090209@email.de> Hi, On 01/24/2016 01:17 PM, Nick Coghlan wrote: > > Linux distros tend to build Python from a tarball rather than a source > checkout, for example, which means the build directory doesn't include > any VCS details: > Does that help traceability (reproducibility)? If distros use (?) the tarball from the release, why doesn't it have, at least, the information about where that tarball was generated from (the checkout point)? Regards, francis From raymond.hettinger at gmail.com Sun Jan 24 19:28:36 2016 From: raymond.hettinger at gmail.com (Raymond Hettinger) Date: Sun, 24 Jan 2016 16:28:36 -0800 Subject: [Python-Dev] Code formatter bot In-Reply-To: <569EA3A6.4010802@email.de> References: <569EA3A6.4010802@email.de> Message-ID: > On Jan 19, 2016, at 12:59 PM, francismb wrote: > > Dear Core-Devs, > what's your opinion about a code-formatter bot for cpython. > Pros, Cons, where could be applicable (new commits, new workflow, it > doesn't make sense), ... > > > - At least it should follow PEP 7 ;-) Please don't do this. It misses the spirit of how the style-guides are intended to be used. "I personally hate with a vengeance that there are tools named after style guide PEPs that claim to enforce the guidelines from those PEPs. The tools' rigidity and simplicity reflects badly on the PEPs, which try hard not to be rigid or simplistic." -- GvR https://mail.python.org/pipermail/python-dev/2016-January/142643.html "PEP 8 unto thyself, not onto others" -- Me https://www.youtube.com/watch?v=wf-BqAjZb8M (the most popular talk from last year's Pycon) Almost nothing that is wrong with CPython is stylistic, the real issues are more substantive. That is where you should devote your talents. Raymond Hettinger From stephen at xemacs.org Sun Jan 24 21:07:17 2016 From: stephen at xemacs.org (Stephen J.
Turnbull) Date: Mon, 25 Jan 2016 11:07:17 +0900 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: <56A50C7E.3090209@email.de> References: <56A3D84F.70803@email.de> <56A50C7E.3090209@email.de> Message-ID: <22181.33621.40740.909756@turnbull.sk.tsukuba.ac.jp> francismb writes: > Does that helps traceability (reproducibility)? If distros use (?) > the tarball from the release why it doesn't have, at least, the > information from where that tarball was generated from (the check > out point) ? The pointer goes in the other direction: there will be a tag in the repo named to indicate the version. From ncoghlan at gmail.com Sun Jan 24 21:24:34 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Mon, 25 Jan 2016 12:24:34 +1000 Subject: [Python-Dev] do people use sys._mercurial? In-Reply-To: <56A50C7E.3090209@email.de> References: <56A3D84F.70803@email.de> <56A50C7E.3090209@email.de> Message-ID: On 25 January 2016 at 03:40, francismb wrote: > On 01/24/2016 01:17 PM, Nick Coghlan wrote: >> Linux distros tend to build Python from a tarball rather than a source >> checkout, for example, which means the build directory doesn't include >> any VCS details: > Does that helps traceability (reproducibility)? If distros use (?) the > tarball from the release why it doesn't have, at least, the information > from where that tarball was generated from (the check out point) ? The main reason is that distro packaging processes long predate Subversion's popularisation of atomic commits in open source version control tools, and are designed to cope with release processes that involve uploading a source tarball to a web server, so they don't assume VCS tags or revision IDs will be available. However, distro processes also capture the source code itself, and often apply additional distro-specific patches, at which point claiming to correspond directly to any given upstream commit would be inaccurate. Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From victor.stinner at gmail.com Mon Jan 25 13:16:29 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 25 Jan 2016 19:16:29 +0100 Subject: [Python-Dev] FAT Python (lack of) performance Message-ID: Hi, Summary: FAT Python is not faster, but it will be ;-) -- When I started the FAT Python as a fork of CPython 3.6, I put everything in the same repository. Over the last few weeks, I focused on splitting my giant patch (10k lines) into small reviewable patches. I wrote 3 PEPs (509 dict version, 510 function specialization, 511 code transformers) and I enhanced the API to make it usable for more use cases than just FAT Python. I also created fatoptimizer (the AST optimizer) and fat (runtime dependency of the optimizer) projects on GitHub to separate clearly what should be outside Python core. For all links, see: http://faster-cpython.readthedocs.org/fat_python.html For the fatoptimizer project, my constraint is to be able to run the full Python test suite unmodified. In practice, I have to disable some optimizations by putting a "__fatoptimizer__= {...}" configuration in some test files. For example, I have to disable constant folding on test_bool because it tests that False+2 gives 2 at runtime, whereas the optimizer replaces False+2 directly with 2 during compilation. Well, test_bool.py is not the best example because all tests pass with the constant folding optimization (if I comment out my "__fatoptimizer__={...}" change). This constraint ensures that the optimizer "works" and doesn't break (too much ;-)) the Python semantics, but it's more difficult to implement powerful optimizations. I also found and fixed various kinds of bugs. In my code obviously, but also in the Python core, in various places. Some bugs only concern AST transformers, which are a new feature, but I had to fix them.
For example, Python didn't support negative line number deltas in co_lnotab of code objects, and so line numbers were all wrong on optimized code. I merged my enhancement in the default branch of CPython (issue #26107). In short, I focused on having something working (respecting the Python semantics), rather than spending time on writing optimizations. -- When I asked explicitly "Is someone opposed to this PEP 509 [dict version]?", Barry Warsaw answered that a performance analysis is required. Extract of his mail: "I still think this is maintenance and potential performance overhead we don't want to commit to long term unless it enables significant optimization. Since you probably can't prove that without some experimentation, this API should be provisional." Last week, I ran some benchmarks and I have to admit that I was disappointed. Not only does fatoptimizer not make Python faster, it makes it much slower on some tests! http://fatoptimizer.readthedocs.org/en/latest/benchmarks.html Quickly, I identified a major performance issue when nested functions are specialized, especially in Lib/json/encoder.py (tested by the bm_json_v2.py benchmark). I fixed my optimizer to not specialize nested functions anymore. This simple change fixed the main performance issue. Reminder: in performance-critical code, don't use nested functions! I may propose patches for Lib/json/encoder.py to stop using nested functions. I only ran benchmarks with the optimizer enabled. I now have to measure the overhead of my patches (PEP 509, 510 and 511) adding the API for AST optimizers. The overhead must be negligible. For me, it's a requirement of the whole project. Changes must not make Python slower when the optimizer is not used.
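The nested-function cost described above is easy to see with a micro-benchmark (an illustrative sketch, not one of the linked benchmarks; the function names are made up):

```python
import timeit

def make_adder():
    # A nested function object is re-created on every call of make_adder(),
    # which is exactly the overhead a specializer must not multiply.
    def add(a, b):
        return a + b
    return add

def add(a, b):
    return a + b

# The nested variant pays for function creation on top of the call itself,
# so it is typically noticeably slower than calling the flat function:
t_nested = timeit.timeit('make_adder()(1, 2)', globals=globals(), number=100000)
t_flat = timeit.timeit('add(1, 2)', globals=globals(), number=100000)
print('nested: %.4fs  flat: %.4fs' % (t_nested, t_flat))
```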
fatoptimizer is faster on microbenchmarks, but I had to write some optimizations manually: http://fatoptimizer.readthedocs.org/en/latest/microbenchmarks.html IMHO fatoptimizer is not faster on macro benchmarks because it is not smart enough (yet) to generate the most interesting optimizations, like function inlining and specialization for argument types. You can estimate the speedup if you specialize your functions manually. -- Barry also wrote: "Did you address my suggestion on python-ideas to make the new C API optionally compiled in?" Well, it is an option, but I would prefer to have the API for AST optimizers directly built in Python. The first beta version of Python 3.6 is scheduled in September 2016 (deadline for new features in Python 3.6), so I still have a few months to implement more powerful optimizations and prove that it can be faster ;-) Victor From francismb at email.de Mon Jan 25 14:55:38 2016 From: francismb at email.de (francismb) Date: Mon, 25 Jan 2016 20:55:38 +0100 Subject: [Python-Dev] Code formatter bot In-Reply-To: References: <569EA3A6.4010802@email.de> Message-ID: <56A67DBA.8010005@email.de> Hi, On 01/25/2016 01:28 AM, Raymond Hettinger wrote: > >> - At least it should follow PEP 7 ;-) > > Please don't do this. It misses the spirit of how the style-guides are intended to be used. > > "I personally hate with a vengeance that there are tools named after style guide PEPs that claim to enforce the guidelines from those PEPs. The tools' rigidity and simplicity reflects badly on the PEPs, which try hard not to be rigid or simplistic." -- GvR > https://mail.python.org/pipermail/python-dev/2016-January/142643.html > Good point, tools need to get better (more context sensitive); maybe they should work only as advisors and just have funny names (and not claim the spirit).
> "PEP 8 unto thyself, not onto others" -- Me > https://www.youtube.com/watch?v=wf-BqAjZb8M > (the most popular talk from last year's Pycon) > > Almost nothing that is wrong with CPython is stylistic, the real issues are more substantive. That is where you should devote your talents. > Good advice and refactoring in your talk (and I love the gorilla part, but I think that the "color_distraction" trick doesn't work for "non-native" speakers like me ;-)) and indeed, that was meant to be my point: Let's review what the code does and not 007-Things. It was really about the workflow, not style, but I'll follow your advice. Regards, francis PS: It is also interesting how/why people introduce bugs by reformatting or trying to beautify things (and how those can be avoided, by allowing change). From gmludo at gmail.com Mon Jan 25 16:20:24 2016 From: gmludo at gmail.com (Ludovic Gasc) Date: Mon, 25 Jan 2016 22:20:24 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: Hi, Just thanks for this big contribution. And maybe this project could give new ideas to optimize Python, who knows? At least, you've won a beer for the FOSDEM event this week-end ;-) Have a nice week. -- Ludovic Gasc (GMLudo) http://www.gmludo.eu/ 2016-01-25 19:16 GMT+01:00 Victor Stinner : > Hi, > > Summary: FAT Python is not faster, but it will be ;-) > > -- > > When I started the FAT Python as a fork of CPython 3.6, I put > everything in the same repository. Last weeks, I focused on splitting > my giant patch (10k lines) into small reviewable patches. I wrote 3 > PEP (509 dict version, 510 function specialziation, 511 code > tranformers) and I enhanced the API to make it usable for more use > cases than just FAT Python. I also created fatoptimizer (the AST > optimizer) and fat (runtime dependency of the optimizer) projects on > GitHub to separate clearly what should be outside Python core.
For all > links, see: > > http://faster-cpython.readthedocs.org/fat_python.html > > For the fatoptimizer project, my constraint is to be able to run the > full Python test suite unmodified. In practice, I have to disable some > optimizations by putting a "__fatoptimizer__= {...}" configuration to > some test files. For example, I have to disable constant folding on > test_bool because it tests that False+2 gives 2 at runtime, whereas > the optimizer replaces directly False+2 with 2 during the compilation. > Well, test_bool.py is not the best example because all tests pass with > the constant folding optimization (if I comment my > "__fatoptimizer__={...}" change). > > This constraint ensures that the optimizer "works" and doesn't break > (too much ;-)) the Python semantics, but it's more difficult to > implement powerful optimizations. > > I also found and fixed various kinds of bugs. In my code obviously, > but also in the Python core, in various places. Some bugs only concern > AST transformers which is a new feature, but I had to fix them. For > example, Python didn't support negative line number delta in > co_lntotab of code objects, and so line number were all wrong on > optimized code. I merged my enhancement in the default branch of > CPython (issue #26107). > > In short, I focused on having something working (respecting the Python > semantics), rather than spending time on writing optimizations. > > -- > > When I asked explicitly "Is someone opposed to this PEP 509 [dict > verion] ?", Barry Warsaw answered that a performance analysis is > required. Extract of his mail: > > "I still think this is maintenance and potential performance > overhead we don't want to commit to long term unless it enables > significant optimization. Since you probably can't prove that without > some experimentation, this API should be provisional." > > Last week, I ran some benchmarks and I have to admin that I was > disappointed. 
Not only fatoptimizer doesn't make Python faster, but it > makes it much slower on some tests! > > http://fatoptimizer.readthedocs.org/en/latest/benchmarks.html > > Quickly, I identified a major performance issue when nested functions > are specialized, especially in Lib/json/encoder.py (tested by > bm_json_v2.py benchmark). I fixed my optimizer to not specialize > nested functions anymore. This simple change fixed the main > performance issue. Reminder: in performance critical code, don't use > nested functions! I will maybe propose patches for Lib/json/encoder.py > to stop using nested functions. > > I only ran benchmarks with the optimizer enabled. I now have to > measure the overhead of my patches (PEP 509, 510 and 511) adding the > API fat AST optimizers. The overhead must be negligible. For me, it's > a requirement of the whole project. Changes must not make Python > slower when the optimizer is not used. > > fatoptimizer is faster on microbenchmarks, but I had to write manually > some optimizations: > > http://fatoptimizer.readthedocs.org/en/latest/microbenchmarks.html > > IMHO fatoptimizer is not faster on macro benchmarks because it is not > smart enough (yet) to generate the most interesting optimizations, > like function inlining and specialization for argument types. You can > estimate the speedup if you specialize manually your functions. > > -- > > Barry also wrote: "Did you address my suggestion on python-ideas to > make the new C API optionally compiled in?" > > Well, it is an option, but I would prefer to have the API for AST > optimizer directly built in Python. 
> > The first beta version of Python 3.6 is scheduled in September 2016 > (deadline for new features in Python 3.6), so I still have a few > months to implement more powerful optimizations and prove that it can > be faster ;-) > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/gmludo%40gmail.com > -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Mon Jan 25 16:43:16 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 25 Jan 2016 22:43:16 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: Hi, 2016-01-25 22:20 GMT+01:00 Ludovic Gasc : > Just thanks for this big contribution. > And maybe this project could give new ideas to optimize Python, who knows ? Sorry for my long email. I should try to summarize next time :-) In short: FAT Python is not fast today, but it will be faster if you give me a few more months to implement optimizations which will unlock the real power of AST optimizers ;-) I have a looooong list of ideas of optimizations: https://fatoptimizer.readthedocs.org/en/latest/todo.html According to microbenchmarks, the most promising optimizations are function inlining (Python function calls are slow :-/) and specializing the code for the argument types. And I agree to wait until fatoptimizer is proven to be faster than regular CPython before taking a decision on my 3 PEPs (509, 510, 511). Victor From srkunze at mail.de Mon Jan 25 16:51:11 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Mon, 25 Jan 2016 22:51:11 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: <56A698CF.3080402@mail.de> Hi Victor, I encourage you to proceed here. I would love to see your PEPs (509-511) incorporated into CPython.
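The manual type specialization Victor mentions can be sketched at the pure-Python level with an explicit guard and a fast path (an illustrative toy only, with made-up names; PEP 510's point is, roughly, to attach such guarded specialized variants to the function object itself so the guard is cheap):

```python
def total_generic(items):
    # Generic path: works for any iterable of addable values.
    result = 0
    for x in items:
        result = result + x
    return result

def total_int_list(items):
    # Specialized path: only valid under the guard below.
    return sum(items)

def total(items):
    # The guard encodes the assumption the specialization relies on.
    if type(items) is list and all(type(x) is int for x in items):
        return total_int_list(items)
    return total_generic(items)

print(total([1, 2, 3]), total((1.5, 2.5)))
```

Here the guard is so expensive that it eats the speedup, which is exactly why doing this at the interpreter level (where the guard can be a cheap version check) is interesting.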
It's not that I consider Python slow (although some folks claim so), but performance improvements are always welcome; especially when I glance over diagrams like those: http://blog.carlesmateo.com/wp-content/uploads/2014/10/blog-carlesmateo-com-performance-several-languages-php7-phantomjs-nodejs-java-bash-go-perl-luajit-hhvm3_9-scale_mod5.png So, I join Barry when he says, we want more benchmarking and definite results, however, I might be less strict than he is and say: - even if FAT might not optimize significantly (whatever definition we apply), PEP 509 to 511 are a great win for CPython - they provide a great infrastructure for optimizing CPython AND extending/experimenting Python as an ecosystem - FAT provides interesting insights into the field of optimizing a dynamic language So, keep up the good work. I am eager to see where this goes. If there's anything I can do, let me know. :) Best, Sven On 25.01.2016 19:16, Victor Stinner wrote: > Hi, > > Summary: FAT Python is not faster, but it will be ;-) > > -- > > When I started the FAT Python as a fork of CPython 3.6, I put > everything in the same repository. Last weeks, I focused on splitting > my giant patch (10k lines) into small reviewable patches. I wrote 3 > PEP (509 dict version, 510 function specialziation, 511 code > tranformers) and I enhanced the API to make it usable for more use > cases than just FAT Python. I also created fatoptimizer (the AST > optimizer) and fat (runtime dependency of the optimizer) projects on > GitHub to separate clearly what should be outside Python core. For all > links, see: > > http://faster-cpython.readthedocs.org/fat_python.html > > For the fatoptimizer project, my constraint is to be able to run the > full Python test suite unmodified. In practice, I have to disable some > optimizations by putting a "__fatoptimizer__= {...}" configuration to > some test files. 
For example, I have to disable constant folding on > test_bool because it tests that False+2 gives 2 at runtime, whereas > the optimizer directly replaces False+2 with 2 during the compilation. > Well, test_bool.py is not the best example because all tests pass with > the constant folding optimization (if I comment my > "__fatoptimizer__={...}" change). > > This constraint ensures that the optimizer "works" and doesn't break > (too much ;-)) the Python semantics, but it's more difficult to > implement powerful optimizations. > > I also found and fixed various kinds of bugs. In my code obviously, > but also in the Python core, in various places. Some bugs only concern > AST transformers, which are a new feature, but I had to fix them. For > example, Python didn't support negative line number deltas in > co_lnotab of code objects, and so line numbers were all wrong on > optimized code. I merged my enhancement in the default branch of > CPython (issue #26107). > > In short, I focused on having something working (respecting the Python > semantics), rather than spending time on writing optimizations. > > -- > > When I asked explicitly "Is someone opposed to this PEP 509 [dict > version] ?", Barry Warsaw answered that a performance analysis is > required. Extract of his mail: > > "I still think this is maintenance and potential performance > overhead we don't want to commit to long term unless it enables > significant optimization. Since you probably can't prove that without > some experimentation, this API should be provisional." > > Last week, I ran some benchmarks and I have to admit that I was > disappointed. Not only does fatoptimizer not make Python faster, but it > makes it much slower on some tests! > > http://fatoptimizer.readthedocs.org/en/latest/benchmarks.html > > Quickly, I identified a major performance issue when nested functions > are specialized, especially in Lib/json/encoder.py (tested by > bm_json_v2.py benchmark).
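The constant folding Victor describes above can be sketched in a few lines with the ast module. This is a minimal illustration of the idea only, not the fatoptimizer implementation:

```python
import ast
import operator

# Minimal constant-folding sketch: fold BinOps whose operands are literal
# constants. A real pass (like fatoptimizer's) is far more careful.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub, ast.Mult: operator.mul}

class ConstantFolder(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold the children first
        if (isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)
                and type(node.op) in OPS):
            value = OPS[type(node.op)](node.left.value, node.right.value)
            return ast.copy_location(ast.Constant(value), node)
        return node

tree = ConstantFolder().visit(ast.parse("x = False + 2"))
ast.fix_missing_locations(tree)
ns = {}
exec(compile(tree, "<folded>", "exec"), ns)
print(ns["x"])  # -> 2: the addition happened at compile time, as in test_bool
```

The folded module no longer contains a BinOp node at all, which is exactly why a test that checks the runtime behaviour of False+2 can no longer observe the addition.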
I fixed my optimizer to not specialize > nested functions anymore. This simple change fixed the main > performance issue. Reminder: in performance critical code, don't use > nested functions! I will maybe propose patches for Lib/json/encoder.py > to stop using nested functions. > > I only ran benchmarks with the optimizer enabled. I now have to > measure the overhead of my patches (PEP 509, 510 and 511) adding the > API for AST optimizers. The overhead must be negligible. For me, it's > a requirement of the whole project. Changes must not make Python > slower when the optimizer is not used. > > fatoptimizer is faster on microbenchmarks, but I had to write > some optimizations manually: > > http://fatoptimizer.readthedocs.org/en/latest/microbenchmarks.html > > IMHO fatoptimizer is not faster on macro benchmarks because it is not > smart enough (yet) to generate the most interesting optimizations, > like function inlining and specialization for argument types. You can > estimate the speedup if you specialize your functions manually. > > -- > > Barry also wrote: "Did you address my suggestion on python-ideas to > make the new C API optionally compiled in?" > > Well, it is an option, but I would prefer to have the API for AST > optimizers built directly into Python.
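The "specialization for argument types" Victor mentions can be mimicked in pure Python. The decorator below is a hypothetical sketch of the guard-and-fallback mechanism behind PEP 510; all names here are illustrative, not the actual fat module API:

```python
def specialize(arg_types, fast):
    """Dispatch to `fast` while the type guard holds, else to the original."""
    def decorator(generic):
        def wrapper(*args):
            if (len(args) == len(arg_types)
                    and all(type(a) is t for a, t in zip(args, arg_types))):
                return fast(*args)   # guard held: take the specialized path
            return generic(*args)    # guard failed: fall back to generic code
        return wrapper
    return decorator

def add_ints(a, b):
    # Specialized version, only valid for exact ints; a real optimizer
    # could register compiled C code or a builtin here instead.
    return a + b

@specialize((int, int), add_ints)
def add(a, b):
    return a + b

print(add(1, 2))      # -> 3, via the specialized path
print(add("a", "b"))  # -> 'ab', via the generic fallback
```

The real PEP 510 design does the guard check in C before the bytecode is entered, so the check itself stays cheap; this Python sketch only shows the control flow.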
> > The first beta version of Python 3.6 is scheduled in September 2016 > (deadline for new features in Python 3.6), so I still have a few > months to implement more powerful optimizations and prove that it can > be faster ;-) > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/srkunze%40mail.de From barry at python.org Mon Jan 25 17:14:51 2016 From: barry at python.org (Barry Warsaw) Date: Mon, 25 Jan 2016 17:14:51 -0500 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: <20160125171451.49a43f0a@subdivisions.wooz.org> On Jan 25, 2016, at 07:16 PM, Victor Stinner wrote: >Barry also wrote: "Did you address my suggestion on python-ideas to >make the new C API optionally compiled in?" > >Well, it is an option, but I would prefer to have the API for AST >optimizer directly built in Python. In my plan, it would be, but it would have to be enabled with a configure switch, and provisionally protected by an #ifdef. >And I agree to wait until fatoptimizer is proven to be faster than the >regular CPython before taking a decision of my 3 PEPs (509, 510, 511). +1 - and just to be clear, I hope you succeed beyond your wildest imagination. :) Cheers, -Barry From abarnert at yahoo.com Mon Jan 25 17:28:57 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 25 Jan 2016 14:28:57 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: On Jan 25, 2016, at 13:43, Victor Stinner wrote: > > According to microbenchmarks, the most promising optimizations are > functions inlining (Python function calls are slow :-/) and specialize > the code for the type of arguments. Can you specialize a function with a C API function, or only with bytecode? I'm not sure how much benefit you'd get out of specializing list vs. 
generic iterable or int vs. whatever from an AST transform, but substituting raw C code, on the other hand... From brett at python.org Mon Jan 25 17:34:20 2016 From: brett at python.org (Brett Cannon) Date: Mon, 25 Jan 2016 22:34:20 +0000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: On Mon, 25 Jan 2016 at 14:30 Andrew Barnert via Python-Dev < python-dev at python.org> wrote: > On Jan 25, 2016, at 13:43, Victor Stinner > wrote: > > > > According to microbenchmarks, the most promising optimizations are > > functions inlining (Python function calls are slow :-/) and specialize > > the code for the type of arguments. > > Can you specialize a function with a C API function, or only with > bytecode? I'm not sure how much benefit you'd get out of specializing list > vs. generic iterable or int vs. whatever from an AST transform, but > substituting raw C code, on the other hand... > Victor's work is only manipulation of ASTs and bytecode. If you want something that low-level you need to either reach for Cython or hope a project like Pyjion pays off. -------------- next part -------------- An HTML attachment was scrubbed... URL: From victor.stinner at gmail.com Mon Jan 25 17:46:45 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 25 Jan 2016 23:46:45 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: Hi, 2016-01-25 23:28 GMT+01:00 Andrew Barnert : > On Jan 25, 2016, at 13:43, Victor Stinner wrote: >> >> According to microbenchmarks, the most promising optimizations are >> functions inlining (Python function calls are slow :-/) and specialize >> the code for the type of arguments. > > Can you specialize a function with a C API function, or only with bytecode? I'm not sure how much benefit you'd get out of specializing list vs. generic iterable or int vs. whatever from an AST transform, but substituting raw C code, on the other hand... 
As I wrote in the first part of my email, I redesigned the API to make it more generic. One of my changes was to make PyFunction_Specialize() accept not only code objects, but any callable object. PEP 510 even contains an example using a builtin function as the specialized code: https://www.python.org/dev/peps/pep-0510/#using-builtin-function "On a microbenchmark, calling the C builtin takes 95 ns, whereas the original bytecode takes 155 ns (+60 ns): 1.6 times as fast. Calling directly chr(65) takes 76 ns." You can design an AST optimizer to compile some functions to C and then register them as specialized code at runtime. I have a side project to use Cython and/or pythran to specialize some functions using type annotations on parameters. Victor From abarnert at yahoo.com Mon Jan 25 17:50:31 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 25 Jan 2016 14:50:31 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: Message-ID: <505C9ADE-804D-4EA9-97D2-A67800A6F5FD@yahoo.com> On Jan 25, 2016, at 14:46, Victor Stinner wrote: > > You can design an AST optimizer to compile some functions to C and > then register them as specialized code at runtime. I have a side > project to use Cython and/or pythran to specialize some functions > using type annotation on parameters. That last part is exactly what I was thinking of. One way in which cythonizing your code isn't 100% compatible is that if you, say, shadow or replace int or range, the cython code is now wrong. Which is exactly the kind of thing FAT can guard against. Which is very cool. Glad to see you already thought of that before me. :) From victor.stinner at gmail.com Mon Jan 25 17:58:12 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Mon, 25 Jan 2016 23:58:12 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <56A698CF.3080402@mail.de> References: <56A698CF.3080402@mail.de> Message-ID: 2016-01-25 22:51 GMT+01:00 Sven R.
Kunze : > - they provide a great infrastructure for optimizing CPython AND > extending/experimenting Python as an ecosystem I hope that these APIs will create more optimizer projects than just fatoptimizer. For example, I expect more specialized optimizers like numba or pythran which are very efficient but more specific (ex: numeric computations) than fatoptimizer. Maybe not new optimizers, but just glue to existing static compilers (numba, pythran, cython, etc.). > If there's anything I can do, let me know. :) Oh, there are a lot of things to do! My patches for PEP 509, 510 and 511 still need some love (reviews): http://bugs.python.org/issue26058 http://bugs.python.org/issue26098 http://bugs.python.org/issue26145 I'm finishing my patch adding ast.Constant. This one is less controversial; it has no impact on performance nor the Python semantics: http://bugs.python.org/issue26146 But these patches are boring C code. You may prefer to work on the funny fatoptimizer project which is written in pure Python: https://fatoptimizer.readthedocs.org/en/latest/ Victor From songofacandy at gmail.com Mon Jan 25 21:21:45 2016 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jan 2016 11:21:45 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: I'm very interested in it. Ruby 2.2 and PHP 7 are faster than Python 2. Python 3 is slower than Python 2. Performance is an attractive feature. Python 3 lacks it. How can I help your work? On Tue, Jan 26, 2016 at 7:58 AM, Victor Stinner wrote: > 2016-01-25 22:51 GMT+01:00 Sven R. Kunze : > > - they provide a great infrastructure for optimizing CPython AND > > extending/experimenting Python as an ecosystem > > I hope that these API will create more optimizer projects than just > fatoptimizer. > > For example, I expect more specialized optimizers like numba or > pythran which are very efficient but more specific (ex: numeric > computations) than fatoptimizer.
Maybe not new optimizers, but just > glue to existing static compilers (numba, pythran, cython, etc.). > > > > If there's anything I can do, let me know. :) > > Oh, they are a lot of things to do! My patches for PEP 509, 510 and > 511 still need some love (reviews): > > http://bugs.python.org/issue26058 > http://bugs.python.org/issue26098 > http://bugs.python.org/issue26145 > > I'm finishing my patch adding ast.Constant. This one is less > controversal, it has no impact on performance nor the Python > semantics: > > http://bugs.python.org/issue26146 > > > But these patches are boring C code. You may prefer to work on the > funny fatoptimizer project which is written in pure Python: > > https://fatoptimizer.readthedocs.org/en/latest/ > > Victor > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com > -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From brett at python.org Mon Jan 25 21:34:43 2016 From: brett at python.org (Brett Cannon) Date: Tue, 26 Jan 2016 02:34:43 +0000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On Mon, 25 Jan 2016 at 18:22 INADA Naoki wrote: > I'm very interested in it. > > Ruby 2.2 and PHP 7 are faster than Python 2. > Python 3 is slower than Python 2. > Performance is a attractive feature. Python 3 lacks it. > That is not a fair statement to make about Python 3. It entirely depends on your workload whether it is faster or slower. https://gist.github.com/brettcannon/9d19cc184ea45b3e7ca0 -Brett > > How can I help your work? > > On Tue, Jan 26, 2016 at 7:58 AM, Victor Stinner > wrote: > >> 2016-01-25 22:51 GMT+01:00 Sven R. 
Kunze : >> > - they provide a great infrastructure for optimizing CPython AND >> > extending/experimenting Python as an ecosystem >> >> I hope that these API will create more optimizer projects than just >> fatoptimizer. >> >> For example, I expect more specialized optimizers like numba or >> pythran which are very efficient but more specific (ex: numeric >> computations) than fatoptimizer. Maybe not new optimizers, but just >> glue to existing static compilers (numba, pythran, cython, etc.). >> >> >> > If there's anything I can do, let me know. :) >> >> Oh, they are a lot of things to do! My patches for PEP 509, 510 and >> 511 still need some love (reviews): >> >> http://bugs.python.org/issue26058 >> http://bugs.python.org/issue26098 >> http://bugs.python.org/issue26145 >> >> I'm finishing my patch adding ast.Constant. This one is less >> controversal, it has no impact on performance nor the Python >> semantics: >> >> http://bugs.python.org/issue26146 >> >> >> But these patches are boring C code. You may prefer to work on the >> funny fatoptimizer project which is written in pure Python: >> >> https://fatoptimizer.readthedocs.org/en/latest/ >> >> Victor >> _______________________________________________ >> Python-Dev mailing list >> Python-Dev at python.org >> https://mail.python.org/mailman/listinfo/python-dev >> > Unsubscribe: >> https://mail.python.org/mailman/options/python-dev/songofacandy%40gmail.com >> > > > > -- > INADA Naoki > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From abarnert at yahoo.com Mon Jan 25 22:02:52 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 25 Jan 2016 19:02:52 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On Jan 25, 2016, at 18:21, INADA Naoki wrote: > > I'm very interested in it. > > Ruby 2.2 and PHP 7 are faster than Python 2. > Python 3 is slower than Python 2. Says who? That was certainly true in the 3.2 days, but nowadays, most things that differ seem to be faster in 3.x. Maybe it's just the kinds of programs I write, but speedup in decoding UTF-8 that's usually ASCII (and then processing the decoded unicode when it's usually 1/4th the size), faster listcomps, and faster datetime seem to matter more than slower logging or slower imports. And that's just when running the same code; when you actually use new features, yield from is much faster than looping over yield; scandir blows away listdir; asyncio blows away asyncore or threading even harder; etc. Maybe if you do different things, you have a different experience. But if you have a specific problem, you'd do a lot better to file specific bugs for that problem than to just hope that everything magically gets so much faster that your bottleneck no longer matters. > Performance is a attractive feature. Python 3 lacks it. When performance matters, people don't use Python 2, Ruby, or PHP, any more than they use Python 3. Or, rather, they use _any_ of those languages for the 95% of their code that doesn't matter, and C (often through existing libraries like NumPy--and try to find a good equivalent of that for Ruby or PHP) for the 5% that does. 
From songofacandy at gmail.com Mon Jan 25 22:32:07 2016 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jan 2016 12:32:07 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On Tue, Jan 26, 2016 at 12:02 PM, Andrew Barnert wrote: > On Jan 25, 2016, at 18:21, INADA Naoki wrote: > > > > I'm very interested in it. > > > > Ruby 2.2 and PHP 7 are faster than Python 2. > > Python 3 is slower than Python 2. > > Says who? > For example, http://benchmarksgame.alioth.debian.org/u64q/php.html In Japanese, many people compare language performance by microbenchmarks like Fibonacci. > > That was certainly true in the 3.2 days, but nowadays, most things that > differ seem to be faster in 3.x. Python is only a little faster than it was in these years. But PHP and Ruby are much faster than they were.
But I'd like to base language performance like function call more faster. > > > Performance is a attractive feature. Python 3 lacks it. > > When performance matters, people don't use Python 2, Ruby, or PHP, any > more than they use Python 3. Or, rather, they use _any_ of those languages > for the 95% of their code that doesn't matter, and C (often through > existing libraries like NumPy--and try to find a good equivalent of that > for Ruby or PHP) for the 5% that does. In the case of Web devs, many people choose main language from PHP, Ruby and Python. When peformance matters, they choose sub language from node.js, Go and Scala. While performance is not a matter when choosing first language, slowest of three makes bad impression and people feel less attractive about Python. -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From rymg19 at gmail.com Mon Jan 25 22:34:35 2016 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Mon, 25 Jan 2016 21:34:35 -0600 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <8E5CCA97-D191-408A-8925-6A598397C622@gmail.com> On January 25, 2016 9:32:07 PM CST, INADA Naoki wrote: >On Tue, Jan 26, 2016 at 12:02 PM, Andrew Barnert >wrote: > >> On Jan 25, 2016, at 18:21, INADA Naoki >wrote: >> > >> > I'm very interested in it. >> > >> > Ruby 2.2 and PHP 7 are faster than Python 2. >> > Python 3 is slower than Python 2. >> >> Says who? >> > >For example, http://benchmarksgame.alioth.debian.org/u64q/php.html >In Japanese, many people compares language performance by microbench >like >fibbonacci. > ...does writing Fibonacci in a foreign language make a performance difference? Or did you mean "In Japan?" > >> >> That was certainly true in the 3.2 days, but nowadays, most things >that >> differ seem to be faster in 3.x. > > >Python is little faster than ever in these years. >But PHP and Ruby are much more faster than these years. 
> >Matz announced Ruby 3x3. Ruby hackers will make more effort to optimize >ruby. >http://engineering.appfolio.com/appfolio-engineering/2015/11/18/ruby-3x3 > > > >> Maybe it's just the kinds of programs I write, but speedup in >decoding >> UTF-8 that's usually ASCII (and then processing the decoded unicode >when >> it's usually 1/4th the size), faster listcomps, and faster datetime >seem to >> matter more than slower logging or slower imports. And that's just >when >> running the same code; when you actually use new features, yield from >is >> much faster than looping over yield; scandir blows away listdir; >asyncio >> blows away asyncore or threading even harder; etc. >> > >I know. >But people compares language speed by simple microbench like >fibbonacci. >They doesn't use listcomp or libraries to compare *language* speed. > > >> Maybe if you do different things, you have a different experience. >But if >> you have a specific problem, you'd do a lot better to file specific >bugs >> for that problem than to just hope that everything magically gets so >much >> faster that your bottleneck no longer matters. >> > >I did it sometimes. >But I'd like to base language performance like function call more >faster. > > >> >> > Performance is a attractive feature. Python 3 lacks it. >> >> When performance matters, people don't use Python 2, Ruby, or PHP, >any >> more than they use Python 3. Or, rather, they use _any_ of those >languages >> for the 95% of their code that doesn't matter, and C (often through >> existing libraries like NumPy--and try to find a good equivalent of >that >> for Ruby or PHP) for the 5% that does. > > >In the case of Web devs, many people choose main language from PHP, >Ruby >and Python. >When peformance matters, they choose sub language from node.js, Go and >Scala. > >While performance is not a matter when choosing first language, slowest >of >three makes bad impression >and people feel less attractive about Python. 
-- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. From rosuav at gmail.com Mon Jan 25 22:59:36 2016 From: rosuav at gmail.com (Chris Angelico) Date: Tue, 26 Jan 2016 14:59:36 +1100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On Tue, Jan 26, 2016 at 2:32 PM, INADA Naoki wrote: > > I know. > But people compares language speed by simple microbench like fibbonacci. > They doesn't use listcomp or libraries to compare *language* speed. > Well, that's a stupid way to decide on a language. Here, look: Python is faster than C. Proof! rosuav at sikorsky:~$ time python3 fib.py 2880067194370816120 real 0m0.033s user 0m0.032s sys 0m0.000s rosuav at sikorsky:~$ cat fib.py import functools @functools.lru_cache() def fib(n): if n < 2: return n return fib(n-2) + fib(n-1) print(fib(90)) rosuav at sikorsky:~$ gcc fib.c && time ./a.out 1134903170 real 0m9.104s user 0m9.064s sys 0m0.000s rosuav at sikorsky:~$ cat fib.c #include <stdio.h> unsigned long fib(unsigned long n) { if (n < 2) return n; return fib(n-2) + fib(n-1); } int main() { printf("%lu\n",fib(45)); } Algorithmic differences - even subtle ones - can easily outdo choice of language for run-time performance. And if you try to write a true C equivalent of that Python code, good luck - I'll have the Python one written and running while you're still trying to figure out how to write a cache, much less how to keep the syntax clean as you add a cache to an existing function. Of course, rewriting the whole thing to work iteratively instead of double-recursively will make a dramatic difference to both programs. That's an unsubtle algorithmic difference, and if you're coding like that, you probably can't see the difference in performance between any two languages at anything up to a machine word (about the 90th or so Fibonacci number, on a 64-bit system) - all you'll see is the startup performance.
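The iterative rewrite Chris alludes to is only a few lines and makes the computation effectively instant in any language; a Python version, for reference:

```python
def fib(n):
    # O(n) additions instead of O(fib(n)) recursive calls; Python's
    # bignum ints keep the result exact past the 64-bit limit.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(90))   # -> 2880067194370816120, matching the lru_cache run above
print(fib(200))  # still exact, long past where C's unsigned long overflows
```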
As soon as you go beyond a machine word, Python massively trumps C, because its default integer type is a bignum. Going beyond a machine word in C is a hassle. Going beyond a machine word in Python 2 is almost insignificant - hey look, now your repr has an 'L' on the end, and performance is immeasurably worse. In Python 3, there's no machine word whatsoever. So, yeah... Python beats C for Fibonacci calculation, too. You just have to not be stupid with your benchmarking. ChrisA From songofacandy at gmail.com Tue Jan 26 00:02:31 2016 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jan 2016 14:02:31 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: Are you saying that I and many other people are fools? People use the same algorithm in every language when comparing base language performance [1]. [1] There is no solid definition of "Base language performance". But it includes function call, method lookup, GC. It may include basic string and arithmetic operations. See here for example: http://d.hatena.ne.jp/satosystems/20121228/1356655565 This article was written in 2012. In this article, php 5.3 takes 85sec, Python 2.7 takes 53sec and CRuby 1.8 takes 213sec. (!!) For now: $ python2 -V Python 2.7.11 $ time python2 -S fib.py 39088169 real 0m17.133s user 0m16.970s sys 0m0.055s $ python3 -V Python 3.5.1 $ time python3 -S fib.py 39088169 real 0m21.380s user 0m21.337s sys 0m0.028s $ php -v PHP 7.0.2 (cli) (built: Jan 7 2016 10:40:21) ( NTS ) Copyright (c) 1997-2015 The PHP Group Zend Engine v3.0.0, Copyright (c) 1998-2015 Zend Technologies $ time php fib.php 39088169 real 0m7.706s user 0m7.654s sys 0m0.027s $ ruby -v ruby 2.3.0p0 (2015-12-25 revision 53290) [x86_64-darwin14] $ time ruby fib.rb 39088169 real 0m6.195s user 0m6.124s sys 0m0.032s Fibonacci microbench measures performance of function call.
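The function-call overhead INADA is pointing at can be isolated with the timeit module. A rough sketch (absolute numbers vary by machine and interpreter, so no expected output is claimed):

```python
import timeit

def noop():
    pass

N = 1_000_000
# Loop overhead alone vs. the same loop performing an empty call:
# the difference approximates the per-call cost (frame setup, etc.).
baseline = timeit.timeit("pass", number=N)
calls = timeit.timeit("noop()", globals=globals(), number=N)
print(f"approx. call overhead: {(calls - baseline) / N * 1e9:.0f} ns/call")
```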
When I said "Base language performance", I meant performance of function call, attribute lookup, GC, etc... PHP and Ruby made a great effort to improve base language performance. While I'm a fan of Python, I respect the people who made PHP and Ruby faster. Of course, I respect people making Python faster too. But I wonder if CPython could be faster, especially at global lookups and function calls. -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From abarnert at yahoo.com Tue Jan 26 00:44:07 2016 From: abarnert at yahoo.com (Andrew Barnert) Date: Mon, 25 Jan 2016 21:44:07 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <9A44AFB5-CE53-4C5B-9A49-6E85F9C148A2@yahoo.com> On Jan 25, 2016, at 19:32, INADA Naoki wrote: > >> On Tue, Jan 26, 2016 at 12:02 PM, Andrew Barnert wrote: >> On Jan 25, 2016, at 18:21, INADA Naoki wrote: >> > >> > I'm very interested in it. >> > >> > Ruby 2.2 and PHP 7 are faster than Python 2. >> > Python 3 is slower than Python 2. >> >> Says who? > > For example, http://benchmarksgame.alioth.debian.org/u64q/php.html > In Japanese, many people compares language performance by microbench like fibbonacci. "In Japan, the hand is sharper than a knife [man splits board with karate chop], but the same doesn't work with a tomato [man splatters tomato all over himself with karate chop]." A cheap knife really is better than a karate master at chopping tomatoes. And Python 2 really is better than Python 3 at doing integer arithmetic on the edge of what can fit into a machine word. But so what?
Without seeing any of your Japanese web code, much less running a profiler, I'm willing to bet that your code is rarely CPU-bound, and, when it is, it spends a lot more time doing things like processing Unicode strings that are almost always UCS-2 (about 110% slower on Python 2) than doing this kind of arithmetic (9% faster on Python 2), or cutting tomatoes (TypeError on both versions). -------------- next part -------------- An HTML attachment was scrubbed... URL: From songofacandy at gmail.com Tue Jan 26 01:13:19 2016 From: songofacandy at gmail.com (INADA Naoki) Date: Tue, 26 Jan 2016 15:13:19 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <9A44AFB5-CE53-4C5B-9A49-6E85F9C148A2@yahoo.com> References: <56A698CF.3080402@mail.de> <9A44AFB5-CE53-4C5B-9A49-6E85F9C148A2@yahoo.com> Message-ID: On Tue, Jan 26, 2016 at 2:44 PM, Andrew Barnert wrote: > On Jan 25, 2016, at 19:32, INADA Naoki wrote: > > On Tue, Jan 26, 2016 at 12:02 PM, Andrew Barnert > wrote: > >> On Jan 25, 2016, at 18:21, INADA Naoki wrote: >> > >> > I'm very interested in it. >> > >> > Ruby 2.2 and PHP 7 are faster than Python 2. >> > Python 3 is slower than Python 2. >> >> Says who? >> > > For example, http://benchmarksgame.alioth.debian.org/u64q/php.html > In Japanese, many people compares language performance by microbench like > fibbonacci. > > > "In Japan, the hand is sharper than a knife [man splits board with karate > chop], but the same doesn't work with a tomato [man splatters tomato all > over himself with karate chop]." > > A cheap knife really is better than a karate master at chopping tomatoes. > And Python 2 really is better than Python 3 at doing integer arithmetic on > the edge of what can fit into a machine word. But so what? 
Without seeing > any of your Japanese web code, much less running a profiler, I'm willing to > bet that your code is rarely CPU-bound, and, when it is, it spends a lot > more time doing things like processing Unicode strings that are almost > always UCS-2 (about 110% slower on Python 2) than doing this kind of > arithmetic (9% faster on Python 2), or cutting tomatoes (TypeError on both > versions). > > Calm down, please. I didn't say "microbench is more important than macrobench". While the editor is not a main problem of software development, people like comparing vim and emacs. In the same way, Japanese dev people like comparing speed. While it's not a real problem for a typical application, new people have to choose a first (and probably main) editor and language. Being slowest on such a basic microbench gives a bad impression to them. Additionally, some applications (e.g. traversing a DOM) make many function calls. Faster function calls may make some *real* applications faster. -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephen at xemacs.org Tue Jan 26 02:16:15 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Tue, 26 Jan 2016 16:16:15 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <22183.7487.315172.752001@turnbull.sk.tsukuba.ac.jp> INADA Naoki writes: > For example, http://benchmarksgame.alioth.debian.org/u64q/php.html > In Japanese, many people compares language performance by > microbench like fibbonacci. True enough. But as a teacher in a Japanese engineering school, I am ashamed to see that posted to a public list. We Japanese ;-) need to get over that tendency (and not just in programming, my own biggest battle is against those who worship goodness-of-fit statistics). Our universities are doing an awful job at getting "big picture thinking" across to our students.
I see that you respond that you're talking about "base language performance" which is a big improvement over blind faith in micro- benchmarks, but still it's a matter of pulling your nose back from the bark so you can see the tree. At least from the point of view of the computations the people around me are doing with these languages, you still miss the forest. > While performance is not a matter when choosing first language, > slowest of three makes bad impression and people feel less > attractive about Python. My school is the one that Matz graduated from (different department), and as far as I can see (which is pretty local, I admit), Python is a clear first and growing as a teaching language and as a glue language for research computations. Java is second for both[1], Ruby is rare, and PHP just isn't on the map. I see some lab home pages using Ruby-on-Rails, and a couple of Mediawiki installations and the like based on PHP. For me personally, PHP and Ruby are both unusable for *performance* reasons: I need something like Pandas, which is based on NumPy, and at least the last time I checked a couple years ago Ruby had no equivalent to NumPy, let alone Pandas. (For PHP I asked the nearest PHP fanboy web programmer, so I could be way wrong about no efficient computation for PHP.) At least as used at this university, I just don't see a performance issue that Python needs to be specifically worried about. That said, optimization is important, and IMO anything that allows 1% of people currently writing (and debugging ...) modules in C for performance reasons to write them in pure Python instead is a win. But to me, relieving programmer pain, not microbenchmarks, should be the performance measure we care about (even if others do care about those benchmarks). Measure them, yes, but focus on them only when we're pretty sure that poor benchmark numbers are associated with programmer pain. 
Footnotes: [1] Java was clearly first as a teaching language until 2010 or so, and since most of our grad students and assistant profs were students here in that era, I suspect its persistence is just the inertia that goes with academic nepotism. From ncoghlan at gmail.com Tue Jan 26 05:06:03 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Tue, 26 Jan 2016 20:06:03 +1000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <22183.7487.315172.752001@turnbull.sk.tsukuba.ac.jp> References: <56A698CF.3080402@mail.de> <22183.7487.315172.752001@turnbull.sk.tsukuba.ac.jp> Message-ID: On 26 January 2016 at 17:16, Stephen J. Turnbull wrote: > Our universities are doing an awful job at getting "big picture thinking" > across to our students. That problem isn't specific to Japan - I'm not aware of *anywhere* that does a particularly good job of teaching developers not to get tribal about their programming language choice, and instead evaluate their options based on their specific problem, the team that will be working on it, and the pre-existing problem solving tools available in that ecosystem. As a result, folks making programming language choices based on criteria that aren't actually relevant to the problem they're trying to solve is going to be a fact of life. While improving those kinds of metrics isn't a *reason* to do anything, it does count as an added bonus when it comes as a beneficial side effect of working on something else (such as the function specialisation infrastructure underpinning Victor's optimiser project). Cheers, Nick. 
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From tjreedy at udel.edu Tue Jan 26 08:38:13 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Tue, 26 Jan 2016 08:38:13 -0500 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On 1/26/2016 12:02 AM, INADA Naoki wrote: > People use same algorithm on every language when compares base language > performance [1]. The python code is NOT using the same algorithm. The proof is that the Python function will return the correct value for, say fib(50) while most if not all the other versions will not. The domain of an algorithm is part of what characterizes an algorithm. -- Terry Jan Reedy From rymg19 at gmail.com Tue Jan 26 10:28:42 2016 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Tue, 26 Jan 2016 09:28:42 -0600 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On January 25, 2016 9:59:36 PM CST, Chris Angelico wrote: >On Tue, Jan 26, 2016 at 2:32 PM, INADA Naoki >wrote: >> >> I know. >> But people compares language speed by simple microbench like >fibbonacci. >> They doesn't use listcomp or libraries to compare *language* speed. >> > >Well, that's a stupid way to decide on a language. Here, look: Python >is faster than C. Proof! 
> >rosuav at sikorsky:~$ time python3 fib.py >2880067194370816120 > >real 0m0.033s >user 0m0.032s >sys 0m0.000s >rosuav at sikorsky:~$ cat fib.py >import functools > >@functools.lru_cache() >def fib(n): > if n < 2: return n > return fib(n-2) + fib(n-1) > >print(fib(90)) > >rosuav at sikorsky:~$ gcc fib.c && time ./a.out >1134903170 > >real 0m9.104s >user 0m9.064s >sys 0m0.000s >rosuav at sikorsky:~$ cat fib.c >#include <stdio.h> > >unsigned long fib(unsigned long n) >{ > if (n < 2) return n; > return fib(n-2) + fib(n-1); >} > >int main() >{ > printf("%lu\n",fib(45)); >} > *cough* -O3 *cough* > >Algorithmic differences - even subtle ones - can easily outdo choice >of language for run-time performance. And if you try to write a true C >equivalent of that Python code, good luck - I'll have the Python one >written and running while you're still trying to figure out how to >write a cache, much less how to keep the syntax clean as you add a >cache to an existing function. > >Of course, rewriting the whole thing to work iteratively instead of >double-recursively will make a dramatic difference to both programs. >That's an unsubtle algorithmic difference, and if you're coding like >that, you probably can't see the difference in performance between any >two languages at anything up to a machine word (about the 90th or so >Fibonacci number, on a 64-bit system) - all you'll see is the startup >performance. As soon as you go beyond a machine word, Python massively >trumps C, because its default integer type is a bignum. Going beyond a >machine word in C is a hassle. Going beyond a machine word in Python 2 >is almost insignificant - hey look, now your repr has an 'L' on the >end, and performance is immeasurably worse. In Python 3, there's no >machine word whatsoever. > >So, yeah... Python beats C for Fibonacci calculation, too. You just >have to not be stupid with your benchmarking.
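For reference, the "unsubtle" iterative rewrite alluded to above can be sketched in a few lines of Python; thanks to bignums it keeps working far past a machine word:

```python
def fib_iter(n):
    # Iterative Fibonacci: n additions instead of exponentially many calls.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_iter(90))   # 2880067194370816120, same as the cached version above
print(fib_iter(45))   # 1134903170, same as the C program
```

The same loop also runs unchanged for fib_iter(500) or beyond, which is the point about Python's default integer type.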
> >ChrisA >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. From victor.stinner at gmail.com Tue Jan 26 10:32:45 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Tue, 26 Jan 2016 16:32:45 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: Hi, 2016-01-26 3:21 GMT+01:00 INADA Naoki : > How can I help your work? I don't know exactly yet, but I started to write a documentation to explain how to contribute: http://faster-cpython.readthedocs.org/fat_python.html#how-can-you-contribute You may contact me directly ;-) Victor From rosuav at gmail.com Tue Jan 26 10:40:08 2016 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 27 Jan 2016 02:40:08 +1100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: On Wed, Jan 27, 2016 at 2:28 AM, Ryan Gonzalez wrote: >>rosuav at sikorsky:~$ gcc fib.c && time ./a.out >>1134903170 >> >>real 0m9.104s >>user 0m9.064s >>sys 0m0.000s >>rosuav at sikorsky:~$ cat fib.c >>#include <stdio.h> >> >>unsigned long fib(unsigned long n) >>{ >> if (n < 2) return n; >> return fib(n-2) + fib(n-1); >>} >> >>int main() >>{ >> printf("%lu\n",fib(45)); >>} >> > > *cough* -O3 *cough* > Oh, I'm sorry. Let's try it again. rosuav at sikorsky:~$ gcc -O3 fib.c && time ./a.out 1134903170 real 0m3.153s user 0m3.088s sys 0m0.052s Cool! Seems to be linear. From which we can deduce that Python on my system was compiled at about -O275. ChrisA From srkunze at mail.de Tue Jan 26 12:10:59 2016 From: srkunze at mail.de (Sven R.
Kunze) Date: Tue, 26 Jan 2016 18:10:59 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <56A7A8A3.5040705@mail.de> Hi, will look into it soon. :) Best, Sven On 26.01.2016 16:32, Victor Stinner wrote: > Hi, > > 2016-01-26 3:21 GMT+01:00 INADA Naoki : >> How can I help your work? > I don't know exactly yet, but I started to write a documentation to > explain how to contribute: > http://faster-cpython.readthedocs.org/fat_python.html#how-can-you-contribute > > You may contact me directly ;-) > > Victor From srkunze at mail.de Tue Jan 26 12:35:56 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Tue, 26 Jan 2016 18:35:56 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <56A7AE7C.3080806@mail.de> I completely agree with INADA. It's like saying, because a specific crossroad features a higher accident rate, *people need to change their driving behavior*. *No!* People won't change and it's not necessary either. The crossroad needs to be changed to be safer. Same goes for Python. If it's slow using the very same piece of code (even superficially), you better make the language faster. Developers won't change and they won't change their code either. Just not necessary. Btw. it would be a great feature for Python 3 to be faster than Python 2. I've heard a lot of complaints from the scientific community that Python is slow. If Python 3 were significantly faster than Python 2, that would be a huge reason to upgrade (and would create pressure to upgrade libs as well). They are satisfied with Python so far, but if there were a language equally readable/maintainable and 10x faster (of course proven by some weird micro benchmarks - incomprehensible to most nervous subscribers to this list), they would readily switch over. I for one hope that *Python itself will be that language* in the foreseeable future.
This is some sort of marketing but also requires hard facts indeed. Best, Sven -------------- next part -------------- An HTML attachment was scrubbed... URL: From stephen at xemacs.org Tue Jan 26 12:50:13 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Wed, 27 Jan 2016 02:50:13 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <22183.7487.315172.752001@turnbull.sk.tsukuba.ac.jp> Message-ID: <22183.45525.814657.897955@turnbull.sk.tsukuba.ac.jp> Nick Coghlan writes: > On 26 January 2016 at 17:16, Stephen J. Turnbull wrote: > > Our universities are doing an awful job at getting "big picture > > thinking" across to our students. > > That problem isn't specific to Japan - I'm not aware of *anywhere* > that does a particularly good job of teaching developers not to get > tribal about their programming language choice, But that's a different issue. The approach that Naoki describes isn't "tribal", in fact it's the exact opposite: it's an attempt to base such decisions on strictly objective measures. > and instead evaluate their options based on their specific problem, > the team that will be working on it, and the pre-existing problem > solving tools available in that ecosystem. One of which is the language itself, and the team's experience with it. "We're a C++/Java/Python/Ruby/Brainf!ck/assembler/COBOL shop" isn't a bad heuristic in most cases -- outside of research and/or "we also walk dogs" generalist consultancies -- especially when you're under pressure from the bean-counting department to reduce costs. That heuristic is hard to distinguish from "tribalism", though. AFAICT, in fact, the generalists (including massive entities like IBM and Google, as well as companies like Red Hat and cooperatives like Debian) are quite good at the kind of evaluation you describe. 
And Python has been quite good at targeting the kinds of improvements that make it appealing to people who can and do do that kind of evaluation, in more and more areas. > While improving those kinds of metrics isn't a *reason* to do > anything, it does count as an added bonus when it comes as a > beneficial side effect of working on something else (such as the > function specialisation infrastructure underpinning Victor's > optimiser project). Sure, but that's a "have your cake and eat it too" situation. Nobody's going to complain about that! If Naoki -- or anybody else -- wants to work on optimizations enabled by FAT Python, more power to them, I say. I just would like to see them reviewed with the goal of making Python a better language for solving a wide variety of problems, rather than strictly focusing on benchmarks. If the benchmarks can be improved without closing off potential syntax improvements or restricting the domain of algorithms (cf Terry's comment), wonderful! I thought Chris's point about efficient algorithms that would be hard to implement correctly in other languages but are easy to do in Python *because* of Python's carefully crafted, but not always maximally speedy, semantics was especially apropos here. Of course his claim that Python is faster than C is tongue in cheek, and a caching version of fibonacci wouldn't be that hard to write in C, and an iterative version even easier. But others have pointed out many other syntaxes (comprehensions, generators, yield from, and so on) that put together often make efficient computation TOOWTDI. That, to me, is The Python Advantage. From stephen at xemacs.org Tue Jan 26 12:51:54 2016 From: stephen at xemacs.org (Stephen J. 
Turnbull) Date: Wed, 27 Jan 2016 02:51:54 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <22183.45626.891297.272536@turnbull.sk.tsukuba.ac.jp> Terry Reedy writes: > On 1/26/2016 12:02 AM, INADA Naoki wrote: > > > People use same algorithm on every language when compares base language > > performance [1]. > > The python code is NOT using the same algorithm. The proof is that the > Python function will return the correct value for, say fib(50) while > most if not all the other versions will not. True, but that's not a reasonable criterion for "same algorithm" in this context. Naoki's application ("base language performance" benchmarking) requires fib(n) only for n < 40, and run it in a loop 100 times if you want 2 more decimal places of precision ("40" is appropriate for an implementation with 32-bit ints). On that restricted domain the algorithm *is* the same. If you want to argue that the bigger domain is a better one to use for evaluating programming languages, be our guest. But then you're comparing apples (speed) to oranges (domain), and Naoki (or the Japanese benchmarkers) can argue that a smaller, more risky, domain is covered by "consenting adults" -- if you know there's a risk, you need to write code to deal with it, but if you know there isn't, you shouldn't have to accept lower performance. Obviously, I don't think that's an appropriate tradeoff myself, but that's based on "IMHO" not "comparison is invalid because algorithms differ". From truong_nguyen8 at yahoo.com Tue Jan 26 18:18:08 2016 From: truong_nguyen8 at yahoo.com (Truong Nguyen) Date: Tue, 26 Jan 2016 23:18:08 +0000 (UTC) Subject: [Python-Dev] Hoping to Find a Mentor References: <970197771.523029.1453850288839.JavaMail.yahoo.ref@mail.yahoo.com> Message-ID: <970197771.523029.1453850288839.JavaMail.yahoo@mail.yahoo.com> Dear Everyone, My name is Truong Nguyen and I like web development. 
I'm writing in the hope of finding a mentor (who likes to teach) and an opportunity to contribute to the python.org website to gain skills in full-stack development. The Python Software Foundation interests me because the language is used in a wide variety of fields. My background is not in computer science, but I've earned a B.S. in Applied Mathematics and have an equivalent minor in computer science. I have done front-end projects with HTML, CSS3 & Media Queries, and JavaScript on my own, but hope to work with someone I can learn from. I'm a hard-working, responsible person and will do my best to contribute to the organization. Have a nice day, Truong Nguyen From brett at snarky.ca Tue Jan 26 12:31:16 2016 From: brett at snarky.ca (Brett Cannon) Date: Tue, 26 Jan 2016 17:31:16 +0000 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (cbd4a6a2657e): sum=134 In-Reply-To: <20160126084734.122727.74008@psf.io> References: <20160126084734.122727.74008@psf.io> Message-ID: Looks like Victor's ast.Constant change introduced a refleak. On Tue, 26 Jan 2016 at 00:47 wrote: > results for cbd4a6a2657e on branch "default" > -------------------------------------------- > > test_ast leaked [39, 39, 39] references, sum=117 > test_ast leaked [5, 5, 5] memory blocks, sum=15 > test_collections leaked [-2, 0, 0] references, sum=-2 > test_functools leaked [0, 2, 2] memory blocks, sum=4 > > > Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', > '3:3:/home/psf-users/antoine/refleaks/reflogqIZGVY', '--timeout', '7200'] > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins >
From sjoerdjob at sjec.nl Wed Jan 27 00:23:41 2016 From: sjoerdjob at sjec.nl (Sjoerd Job Postmus) Date: Wed, 27 Jan 2016 06:23:41 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> Message-ID: <20160127052341.GA14190@sjoerdjob.com> On Mon, Jan 25, 2016 at 11:58:12PM +0100, Victor Stinner wrote: > ... > Oh, they are a lot of things to do! ... Just wondering, do you also need a set of (abusive) test-cases which check 100% conformity to the CPython semantics? I'm sure many of us would be able to whip up some ideas of things that are possible with Python and are of the kind "but you should not do that! That's bad programming!" which may or may not break the optimizations (especially specialized functions). I'm thinking of things like def override_length_function_test(): global len value = len("abc") len = lambda obj: ord(obj[0]) value += len("abc") assert value == 3 + 97, "Outdated `len` used." And also cases where `len` was changed not in the function itself, but in another function that was called indirectly (maybe 4 functions down, one of which was monkey-patched in after the compilation step): module_a.py def test_func(callback): value = len("abc") callback() value += len("abc") assert value == 3 + 97, "Outdated `len` used." module_b.py import module_a def override_length_function_test(): def callback(): module_a.len = lambda obj: ord(obj[0]) module_a.test_func(callback) (I'm sure some of the other readers of this list can be even more creative in trying to show that making optimizations like this can break semantics.) Other fun tricks I'd like to try are overriding the `len` function from a signal handler, seeing what happens when you monkey-patch a dependent method, and having `__getitem__` and `__getattr__` on some value override `len`.
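For what it's worth, a self-contained, runnable version of the first test sketched above could look like this (the override is dropped at the end so later code sees the real builtin again):

```python
def override_length_function_test():
    global len
    value = len("abc")             # builtin len: 3
    len = lambda obj: ord(obj[0])  # shadow len in this module's globals
    value += len("abc")            # override: ord('a') == 97
    del len                        # drop the shadow; the builtin is visible again
    return value

assert override_length_function_test() == 3 + 97, "Outdated `len` used."
print(len("abc"))  # 3: the real builtin is back
```

Any optimizer that caches the builtin lookup across the rebinding would trip the assert, which is exactly the kind of conformity check being proposed.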
Basically: trying things that I normally should not try during my working hours, on account of wanting to still have a job the next day. Kind regards, Sjoerd Job From victor.stinner at gmail.com Wed Jan 27 03:09:36 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 27 Jan 2016 09:09:36 +0100 Subject: [Python-Dev] [Python-checkins] Daily reference leaks (cbd4a6a2657e): sum=134 In-Reply-To: References: <20160126084734.122727.74008@psf.io> Message-ID: Hi, I pushed a fix before you sent your message. At least test_ast should be fixed. https://hg.python.org/cpython/rev/c5df914e73ad FYI I'm unable to reproduce the test_collections leak. Victor On Tuesday 26 January 2016, Brett Cannon wrote: > Looks like Victor's ast.Constant change introduced a refleak. > > On Tue, 26 Jan 2016 at 00:47 > wrote: > >> results for cbd4a6a2657e on branch "default" >> -------------------------------------------- >> >> test_ast leaked [39, 39, 39] references, sum=117 >> test_ast leaked [5, 5, 5] memory blocks, sum=15 >> test_collections leaked [-2, 0, 0] references, sum=-2 >> test_functools leaked [0, 2, 2] memory blocks, sum=4 >> >> >> Command line was: ['./python', '-m', 'test.regrtest', '-uall', '-R', >> '3:3:/home/psf-users/antoine/refleaks/reflogqIZGVY', '--timeout', '7200'] >> _______________________________________________ >> Python-checkins mailing list >> Python-checkins at python.org >> >> https://mail.python.org/mailman/listinfo/python-checkins >> > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From p.f.moore at gmail.com Wed Jan 27 04:28:15 2016 From: p.f.moore at gmail.com (Paul Moore) Date: Wed, 27 Jan 2016 09:28:15 +0000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <20160127052341.GA14190@sjoerdjob.com> References: <56A698CF.3080402@mail.de> <20160127052341.GA14190@sjoerdjob.com> Message-ID: On 27 January 2016 at 05:23, Sjoerd Job Postmus wrote: > On Mon, Jan 25, 2016 at 11:58:12PM +0100, Victor Stinner wrote: >> ... >> Oh, they are a lot of things to do! ... > > Just wondering, do you also need a set of (abusive) test-cases which > check 100% conformity to the CPython semantics? I'm sure many of us > would be able to whip up some ideas of things that are possible with > Python and are of the kind "but you should not do that! That's bad > programming!" which may or may not break the optimizations (especially > specialized functions). > > I'm thinking of things like > > def override_length_function_test(): > global len > value = len("abc") > len = lambda obj: ord(obj[0])) > value += len("abc") > assert value == 3 + 97, "Outdated `len` used." > > And also cases where `len` was changed not in the function itself, but > in another function that was called indirectly (maybe 4 functions down, > one of which was monkey-patched in after the compilation step): > > module_a.py > def test_func(callback): > value = len("abc") > callback() > value += len("abc") > assert value == 3 + 97, "Outdated `len` used." > > module_b.py > import module_a > def override_length_function_test(): > def callback(): > module_a.len = lambda obj: ord(obj[0]) > assert module_a.test_func(callback) == 3 + 97, "Outdated `len` used." > > (I'm sure some of the other readers of this list can be even more > creative in trying to show that making optimizations like this can break > semantics.) 
> > Other fun tricks I'd like to try is overriding the `len` method from a > signal handler, what happens when you monkey-patch a dependent method, > having `__getitem__` and `__getattr__` on some value override `len`. > > Basically: trying things that I normally should not try during my > working hours, on account of wanting to still have a job the next day. > > Kind regards, > Sjoerd Job Maybe I'm just nasty, but IMO those kinds of "torture tests" are just as valuable in general, so I'd encourage people to submit patches to the main Python test suite to add them. Paul From victor.stinner at gmail.com Wed Jan 27 04:31:54 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 27 Jan 2016 10:31:54 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <20160127052341.GA14190@sjoerdjob.com> Message-ID: Python has test_dynamic which tests such corner cases. For example, test_modify_builtins_while_generator_active(): "Modify the builtins out from under a live generator." https://hg.python.org/cpython/file/58266f5101cc/Lib/test/test_dynamic.py#l49 Victor 2016-01-27 10:28 GMT+01:00 Paul Moore : > On 27 January 2016 at 05:23, Sjoerd Job Postmus wrote: >> On Mon, Jan 25, 2016 at 11:58:12PM +0100, Victor Stinner wrote: >>> ... >>> Oh, they are a lot of things to do! ... >> >> Just wondering, do you also need a set of (abusive) test-cases which >> check 100% conformity to the CPython semantics? I'm sure many of us >> would be able to whip up some ideas of things that are possible with >> Python and are of the kind "but you should not do that! That's bad >> programming!" which may or may not break the optimizations (especially >> specialized functions). >> >> I'm thinking of things like >> >> def override_length_function_test(): >> global len >> value = len("abc") >> len = lambda obj: ord(obj[0])) >> value += len("abc") >> assert value == 3 + 97, "Outdated `len` used." 
>> >> And also cases where `len` was changed not in the function itself, but >> in another function that was called indirectly (maybe 4 functions down, >> one of which was monkey-patched in after the compilation step): >> >> module_a.py >> def test_func(callback): >> value = len("abc") >> callback() >> value += len("abc") >> assert value == 3 + 97, "Outdated `len` used." >> >> module_b.py >> import module_a >> def override_length_function_test(): >> def callback(): >> module_a.len = lambda obj: ord(obj[0]) >> assert module_a.test_func(callback) == 3 + 97, "Outdated `len` used." >> >> (I'm sure some of the other readers of this list can be even more >> creative in trying to show that making optimizations like this can break >> semantics.) >> >> Other fun tricks I'd like to try is overriding the `len` method from a >> signal handler, what happens when you monkey-patch a dependent method, >> having `__getitem__` and `__getattr__` on some value override `len`. >> >> Basically: trying things that I normally should not try during my >> working hours, on account of wanting to still have a job the next day. >> >> Kind regards, >> Sjoerd Job > > Maybe I'm just nasty, but IMO those kinds of "torture tests" are just > as valuable in general, so I'd encourage people to submit patches to > the main Python test suite to add them. > > Paul From victor.stinner at gmail.com Wed Jan 27 04:39:43 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Wed, 27 Jan 2016 10:39:43 +0100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: 2016-01-23 7:03 GMT+01:00 Chris Angelico : > I just had a major crash on the system that hosts the > angelico-debian-amd64 buildbot, and as usual, checked it carefully > after bringing everything up. 
It seems now to be timing out after an hour of operation: > > http://buildbot.python.org/all/builders/AMD64%20Debian%20root%203.x/builds/3132/steps/test/logs/stdio I opened http://bugs.python.org/issue26206 to track this issue. > Running just that test file: > > $ ./python Lib/test/test_socket.py > ... chomp lots of lines ... > testRecvmsgPeek (__main__.RecvmsgUDP6Test) ... > > seems to indicate that the stall is due to IPv6 and UDP. The VM should > have full IPv6 support, although my ISPs don't carry IPv6 traffic, so > it won't be able to reach the internet proper; but it should be able > to do all manner of local tests. Try to apply attached patch and run: $ ./python -m test -v -m testRecvmsgPeek test_socket (...) testRecvmsgPeek (test.test_socket.RecvmsgUDP6Test) ... CLI SOCK <socket.socket fd=..., family=AddressFamily.AF_INET6, type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 44347, 0, 0)> SERV SOCK <socket.socket fd=..., family=AddressFamily.AF_INET6, type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 40488, 0, 0)> CLI SOCK ('::1', 44347, 0, 0) SERV SOCK ('::1', 40488, 0, 0) ok testRecvmsgPeek (test.test_socket.RecvmsgIntoUDP6Test) ... CLI SOCK <socket.socket fd=..., family=AddressFamily.AF_INET6, type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 52721, 0, 0)> SERV SOCK <socket.socket fd=..., family=AddressFamily.AF_INET6, type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 43967, 0, 0)> CLI SOCK ('::1', 52721, 0, 0) SERV SOCK ('::1', 43967, 0, 0) ok (...) As you can see: the test uses the local loopback interface. Inet6TestBase.host is "::1". You can try to run a UDP server using netcat: "nc -l -u ::1 12345". Keep the command running in a terminal, and then run in a different terminal: "echo abc | nc -u ::1 12345". You should receive abc in the server. Victor -------------- next part -------------- A non-text attachment was scrubbed... Name: debug.patch Type: text/x-patch Size: 928 bytes Desc: not available URL: From tjreedy at udel.edu Wed Jan 27 05:59:37 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 27 Jan 2016 05:59:37 -0500 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <56A7AE7C.3080806@mail.de> References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> Message-ID: On 1/26/2016 12:35 PM, Sven R. Kunze wrote: > I completely agree with INADA. I am not sure you do.
> It's like saying, because a specific crossroad features a higher > accident rate, *people need to change their driving behavior*. > *No!* People won't change and it's not necessary either. The crossroad > needs to be changed to be safer. Safer crossroads tend to be slower unless one switched to alternate designs that eliminate crossing streams of traffic. Python is safer, in everyday use as well as in artificial benchmarks, and is slower as a result. Languages that don't have integers but use residue classes (with wraparound) or finite integer classes (with overflow) as a (faster) substitute have, in practice, lots of accidents (bugs) when used by non-experts. Guido noticed this, gave up on changing coder behavior, and put the expert behavior of checking for wraparound/overflow and switching to real integers (longs) into the language. (I forget when this was added.) The purpose of the artificially low input to fib() is to hide and avoid the bugginess of most languages. The analogous trick with testing crossroads would be to artificially restrict the density of cars to mask the accident-proneness of a 'fast, consenting-adults' crossroads with no stop signs and no stop lights. > Same goes for Python. If it's slow using the very same piece of code > (even superficially), you better make the language faster. > Developers won't change and they won't change their code either. Just > not necessary. Instead of making people rewrite fib to dramatically increase speed, we added the lru-cache decorator to get most of the benefit without a rewrite. But Inada rejected this Python speedup. An ast optimizer could potentially do the same speedup without the explicit decorator. (No side-effects? Multiple recursive calls? Add a cache!) > Btw. it would be a great feature for Python 3 to be faster than Python > 2. We all agree on that. 
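The lru_cache route mentioned above doesn't even require editing the function definition; rebinding the module-level name after the fact is enough, since the recursive calls resolve `fib` through the globals. A sketch:

```python
import functools

def fib(n):
    if n < 2:
        return n
    return fib(n - 2) + fib(n - 1)

# Rebind the module-level name: the recursive calls inside fib() look up
# "fib" in the globals on each call, so they now go through the cache too.
fib = functools.lru_cache(maxsize=None)(fib)

print(fib(90))  # 2880067194370816120, near-instant thanks to the cache
```

This is the same late-binding behavior that an AST optimizer would have to prove absent (or guard against) before caching such calls automatically.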
One way for this to happen is to add optimizers that would make Python 'cheat' on microbenchmarks. -- Terry Jan Reedy From ncoghlan at gmail.com Wed Jan 27 06:16:19 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Wed, 27 Jan 2016 21:16:19 +1000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <56A7AE7C.3080806@mail.de> References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> Message-ID: On 27 January 2016 at 03:35, Sven R. Kunze wrote: > I completely agree with INADA. > > It's like saying, because a specific crossroad features a higher accident > rate, people need to change their driving behavior. > No! People won't change and it's not necessary either. The crossroad needs > to be changed to be safer. Umm, no, that's not how this works - developers contribute to community driven projects for their *own* reasons. Nobody gets to tell them what to do unless they're paying them. Micro-optimising a poor algorithm won't deliver macro level improvements because macro level code uses things like space-speed trade-offs to improve the algorithmic efficiency (as in the example of applying functools.lru_cache to a naive recursive fibonacci implementation). Victor's work on the FAT optimiser is interesting because it offers opportunities to speed up even code that is already algorithmically efficient, as well as making CPython a better platform for experimenting with those kinds of changes. More generally though, much larger opportunities for improvement lie in persuading people to *stop writing code*, and instead spending more of their time on finding and assembling code other people have *already written* into solutions to interesting problems. *That's* the kind of improvement that turns enormously complex problems like facial recognition into 25 line Python scripts: https://realpython.com/blog/python/face-recognition-with-python/ Cheers, Nick.
-- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From tjreedy at udel.edu Wed Jan 27 07:16:45 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Wed, 27 Jan 2016 07:16:45 -0500 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <22183.45626.891297.272536@turnbull.sk.tsukuba.ac.jp> References: <56A698CF.3080402@mail.de> <22183.45626.891297.272536@turnbull.sk.tsukuba.ac.jp> Message-ID: On 1/26/2016 12:51 PM, Stephen J. Turnbull wrote: > Terry Reedy writes: > > On 1/26/2016 12:02 AM, INADA Naoki wrote: > > > > > People use same algorithm on every language when compares base language > > > performance [1]. > > > > The python code is NOT using the same algorithm. The proof is that the > > Python function will return the correct value for, say fib(50) while > > most if not all the other versions will not. Let me try to be clearer. 1. Like everyone else, I would like Python function calls to be faster either in general or in special cases detected during compilation. This will require a micro-benchmark for function calls that does just that. First time an empty loop, then time a loop with a call to an empty function. Do the same for various signatures, and maybe other special cases. 2. Cross-language micro-benchmarks aimed at timing specific operations are tough. To run on multiple languages, they must be restricted to a lowest-common-denominator of features. It is impossible to make every implementation perform exactly the same set of operations. Some languages may bundle features together. Some languages and implementations have optimizations that avoid unneeded operations. Not all optimizations can be turned off. 3. While there are trends as to the speed of implementations of a language, benchmarks time particular implementations. Shedskin would compile fib to a C function that runs much faster than fib with the restricted subset of CPython allowed for the benchmark.
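The call-overhead micro-benchmark described in point 1 could be sketched with timeit; the loop counts and names here are illustrative, not a proposed benchmark suite:

```python
import timeit

def empty():
    pass

# Time a bare loop, then the same loop with a call to an empty function;
# the difference approximates the per-call overhead.
loop = timeit.timeit("for _ in range(1000): pass", number=2000)
call = timeit.timeit("for _ in range(1000): empty()",
                     globals={"empty": empty}, number=2000)
print("approx. ns per call:", (call - loop) / (1000 * 2000) * 1e9)
```

Variants with positional, keyword, and *args signatures would cover the "various signatures" cases in the same way.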
> True, but that's not a reasonable criterion for "same algorithm" in > this context. Naoki's application ("base language performance" > benchmarking) requires fib(n) only for n < 40, and run it in a loop > 100 times if you want 2 more decimal places of precision ("40" is > appropriate for an implementation with 32-bit ints). So you agree that the limit of 39 is not intrinsic to the fib function or its uses, but is an after-the-fact limit imposed to mask the bug proneness of using substitutes for integers. To my mind, a fairer and more useful benchmark of 'base language performance' based on fib would use a wider domain. The report would say that CPython (with lru_cache disallowed) is slow but works over a wide range of inputs, while some other implementations of other languages run faster for small inputs but fail catastrophically for larger inputs. Users could then make a more informed pick. Also see my answer to Sven Kunze. -- Terry Jan Reedy From rosuav at gmail.com Wed Jan 27 07:47:33 2016 From: rosuav at gmail.com (Chris Angelico) Date: Wed, 27 Jan 2016 23:47:33 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: On Wed, Jan 27, 2016 at 8:39 PM, Victor Stinner wrote: > 2016-01-23 7:03 GMT+01:00 Chris Angelico : >> Running just that test file: >> >> $ ./python Lib/test/test_socket.py >> ... chomp lots of lines ... >> testRecvmsgPeek (__main__.RecvmsgUDP6Test) ... >> >> seems to indicate that the stall is due to IPv6 and UDP. The VM should >> have full IPv6 support, although my ISPs don't carry IPv6 traffic, so >> it won't be able to reach the internet proper; but it should be able >> to do all manner of local tests. > > Try to apply attached patch and run: > > $ ./python -m test -v -m testRecvmsgPeek test_socket > (...) > testRecvmsgPeek (test.test_socket.RecvmsgUDP6Test) ... 
CLI SOCK > type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 44347, 0, 0)> > SERV SOCK type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 40488, 0, 0)> > CLI SOCK ('::1', 44347, 0, 0) > SERV SOCK ('::1', 40488, 0, 0) > ok > testRecvmsgPeek (test.test_socket.RecvmsgIntoUDP6Test) ... CLI SOCK > type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 52721, 0, 0)> > SERV SOCK type=SocketKind.SOCK_DGRAM, proto=0, laddr=('::1', 43967, 0, 0)> > CLI SOCK ('::1', 52721, 0, 0) > SERV SOCK ('::1', 43967, 0, 0) > ok > (...) > > As you can see: the test uses the local loopback interface. > Inet6TestBase.host is "::1". Confirmed. It does two tests of IPv4 which run just fine, and then: testRecvmsgPeek (test.test_socket.RecvmsgUDP6Test) ... CLI SOCK CLI SOCK ('::1', 47518, 0, 0) SERV SOCK SERV SOCK ('::1', 42421, 0, 0) and hangs until I interrupt it. > You can try to run a UDP server using netcat: "nc -l -u ::1 12345". > Keep the command running in a terminal, and then run in a different > terminal: "echo abc | nc -u ::1 12345". You should receive abc in the > server. (Oddly, the default netcat-traditional doesn't seem to support this, but installing netcat-openbsd adds IPv6 support.) Yep, and that works flawlessly. It's nothing weird about that particular port, either - nc can use 42421 without a problem. After digging through test_socket.py for over an hour (the MRO for RecvmsgUDP6Test is enormous!!), I've boiled the issue down to this:

import socket
MSG = b'asdf qwer zxcv'
serv = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
serv.bind(("::1", 0))
cli = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
cli.bind(("::1", 0))
cli.sendto(MSG, serv.getsockname())
print(serv.recvmsg(len(MSG) - 3, 0, socket.MSG_PEEK))
print(serv.recvmsg(len(MSG), 0, socket.MSG_PEEK))
print(serv.recvmsg(len(MSG)))

On my main system, this produces three lines of output: the first has truncated text, the second has full text, and the third also has full text. 
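A more defensive variant of that reproducer (a sketch, not part of the original report: it uses IPv4 loopback for portability, relies on recvmsg() which is POSIX-only, and adds a timeout so a stall raises socket.timeout instead of hanging the run) might look like:

```python
import socket

MSG = b'asdf qwer zxcv'

def peek_demo(family=socket.AF_INET, host='127.0.0.1'):
    """Peek at a datagram twice (once truncated), then consume it.

    Returns (truncated_peek, full_peek, final_read)."""
    serv = socket.socket(family, socket.SOCK_DGRAM)
    cli = socket.socket(family, socket.SOCK_DGRAM)
    try:
        serv.bind((host, 0))
        cli.bind((host, 0))
        # A timeout turns the buildbot's indefinite hang into an exception.
        serv.settimeout(5)
        cli.sendto(MSG, serv.getsockname())
        truncated = serv.recvmsg(len(MSG) - 3, 0, socket.MSG_PEEK)[0]
        full = serv.recvmsg(len(MSG), 0, socket.MSG_PEEK)[0]
        final = serv.recvmsg(len(MSG))[0]
        return truncated, full, final
    finally:
        serv.close()
        cli.close()

if __name__ == '__main__':
    print(peek_demo())
```

On a healthy stack the truncated peek returns the first 11 bytes, and both later reads return the whole datagram, since MSG_PEEK leaves it queued.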
This proves that MSG_PEEK is working correctly. On the buildbot, though, the first one stalls out. Commenting that line out produces correct results - peek the full data, then read it, and all is well. Any idea why partial read on a datagram socket would sometimes stall? ChrisA From stephen at xemacs.org Wed Jan 27 12:12:16 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Thu, 28 Jan 2016 02:12:16 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <22183.45626.891297.272536@turnbull.sk.tsukuba.ac.jp> Message-ID: <22184.64112.8831.955609@turnbull.sk.tsukuba.ac.jp> Terry Reedy writes: > So you agree that the limit of 39 is not intrinsic to the fib function > or its uses, but is an after-the-fact limit imposed to mask the bug > proneness of using substitutes for integers. I don't know what the limit used in the benchmark is, but it must be quite a bit lower than 50 for 32-bit integers and could be greater than 90 for 64-bit integers. And it's not "masking bugs", it's "respecting the domain of valid input", s'il vous plait. > To my mind, a fairer and more useful benchmark of 'base language > performance' based on fib would use a wider domain. "Fair", maybe. But why play this game at all? These benchmarks are simply not useful to users choosing languages, unless they already know the difficulties of interpreting benchmarks and are willing to expend the effort to account for them. Without that knowledge and effort, choosing a programming language based on microbenchmarks is like choosing a car based on the leg-length of the model sitting on the hood in the TV commercial. 
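Stephen's 32-bit and 64-bit limits are easy to check with a quick sketch (plain iterative fib; the helper names here are my own, not from the benchmark):

```python
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def overflow_index(bits):
    """First index whose Fibonacci number no longer fits a signed int of
    the given width - i.e. the edge of the benchmark's valid domain."""
    n = 0
    while fib(n) <= 2**(bits - 1) - 1:
        n += 1
    return n

print(overflow_index(32), overflow_index(64))  # 47 93
```

So a signed 32-bit fib is correct only up to n = 46 (hence benchmark limits around 40), while 64 bits reaches n = 92, matching the "quite a bit lower than 50" and "greater than 90" estimates above.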
> The report would say that CPython (with lru_cache disallowed) is > slow but works over a wide range of inputs, No, the report would say "Use of this benchmark for cross-language comparison of function call speed is more or less inaccurate due to differences in representation of integers and in handling the possibility of exceptions in 'integer' arithmetic." You are picking one tiny difference, but there are potentially many, some quite a bit larger on the tested domain (for example, some languages may be able to optimize fib() to unboxed integers, in which case they'll blow away all those that don't). > Users could then make a more informed pick. My point in my reply to Nick is that users aren't making informed picks. If they were, we wouldn't even be thinking about having this conversation. I'm not sure what they are doing (maybe, as Nick suggests, justifying their "tribal" prejudices?), but it's not that. ;-) Sure, other things being equal, better benchmarks will improve runtime performance, but other things are so far from being equal even an economist can't say "ceteris paribus" here. To expand that point: I don't really see a point in users (ie, developers in Python and other such languages) looking at these benchmarks except for the fun of feeling like implementers, to be honest. Even the implementers shouldn't much care about cross- language benchmarks, except that when a "similar"[1] language does significantly better on a particular benchmark, it's often useful to wonder "how dey do dat?!" Typically the answer is "they 'cheat'" == "fail one of the properties we consider required", but sometimes it's "ooh, that's cute, and I bet we could make Python work the same way" or "urkh, we can't do *that* (yuck!) but we could FATten up Python with similar effect". 
(Let me take this opportunity to say "Thank you, Victor!") Of course in the case of a controlled experiment like "configure in Victor's changes and run the benchmarks to make sure they're not detectably slower", they're invaluable regression tests, and more or less valuable (ie, YMMV) as measures of improvement to compare to costs they may impose in other features or (even more fuzzy) in developer time. Footnotes: [1] Whatever that means.... From srkunze at mail.de Wed Jan 27 13:11:27 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Wed, 27 Jan 2016 19:11:27 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> Message-ID: <56A9084F.4090200@mail.de> On 27.01.2016 11:59, Terry Reedy wrote: > On 1/26/2016 12:35 PM, Sven R. Kunze wrote: >> I completely agree with INADA. > > I an not sure you do. > I am sure I am. He wants to solve a problem the way that is natural to him as a unique human being. >> It's like saying, because a specific crossroad features a higher >> accident rate, *people need to change their driving behavior*. >> *No!* People won't change and it's not necessary either. The crossroad >> needs to be changed to be safer. > > Safer crossroads tend to be slower unless one switched to alternate > designs that eliminate crossing streams of traffic. So Python can be safer AND faster ( = different design) if we try hard enough. > Languages that don't have integers but use residue classes (with > wraparound) or finite integer classes (with overflow) as a (faster) > substitute have, in practice, lots of accidents (bugs) when used by > non-experts. Guido noticed this, gave up on changing coder behavior, > and put the expert behavior of checking for wraparound/overflow and > switching to real integers (longs) into the language. (I forget when > this was added.) I am glad he did because it helps humans solve their problems in a natural way without artificial boundaries. 
:) > The purpose of the artificially low input to fib() is to hide and > avoid the bugginess of most languages. The analogous trick with > testing crossroads would be to artificially restrict the density of > cars to mask the accident-proneness of a 'fast, consenting-adults' > crossroads with no stop signs and no stop lights. > I am completely with you here, however I disagree about the suspected hiding/avoiding mentality. You say: Python -> *no problem with big integers but slow at small integers* Other Language -> *faster but breaks at big integers* Yes. That's it. We haven't solved the human side, however. A human AGAIN would need to compromise on either speed or safety. My point is: it would be insanely great if Python could be more like "*fast AND no problem with big integers*". No compromise here (at least not noticeably). So, people could entirely *concentrate on their problem domain* without ever worrying about such tiny little, nitty-gritty computer science details. I love computer science but people of other domains don't have the time nor the knowledge to decide properly. That's the reason why they might decide by using some weird micro-benchmarks. Just humans. >> Same goes for Python. If it's slow using the very same piece of code >> (even superficially), you better make the language faster. >> Developers won't change and they won't change their code either. Just >> not necessary. > > Instead of making people rewrite fib to dramatically increase speed, > we added the lru-cache decorator to get most of the benefit without a > rewrite. But Inada rejected this Python speedup. An ast optimizer > could potentially do the same speedup without the explicit decorator. > (No side-effects? Multiple recursive calls? Add a cache!) > Bingo! That's the spirit. Why that decorator in the first place? Hey, I mean, if I ever want to write some cryptic-looking source code with three-letter abbreviations (LRU), I'll use Assembler again. 
But I discovered and love Python and I never want to go back when my problem domain does not require me to. So, when a machine can detect such an optimization, hell, do it, please. It's more likely that I apply it at the wrong function AND only in 10% of the correct cases: missing 90% and introducing some wild errors. Again human stuff. >> Btw. it would be a great feature for Python 3 to be faster than Python >> 2. > > We all agree on that. One way for this to happen is to add optimizers > that would make Python 'cheat' on microbenchmarks Then, we are all set. :) Best, Sven -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Jan 27 13:25:27 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 13:25:27 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% Message-ID: <56A90B97.7090001@gmail.com> Hi, tl;dr The summary is that I have a patch that improves CPython performance up to 5-10% on macro benchmarks. Benchmark results on Macbook Pro/Mac OS X, desktop CPU/Linux, server CPU/Linux are available at [1]. There are no slowdowns that I could reproduce consistently. There are two different optimizations that yield this speedup: LOAD_METHOD/CALL_METHOD opcodes and per-opcode cache in ceval loop. LOAD_METHOD & CALL_METHOD ------------------------- We had a lot of conversations with Victor about his PEP 509, and he sent me a link to his amazing compilation of notes about CPython performance [2]. One optimization that he pointed out to me was LOAD/CALL_METHOD opcodes, an idea that first originated in PyPy. There is a patch that implements this optimization, it's tracked here: [3]. There are some low level details that I explained in the issue, but I'll go over the high level design in this email as well. Every time you access a method attribute on an object, a BoundMethod object is created. 
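The BoundMethod allocation described here is easy to observe from pure Python (a small sketch, independent of Yury's patch):

```python
class Greeter:
    def hello(self):
        return 'hi'

g = Greeter()
# Each attribute access allocates a fresh bound-method object: the two
# methods compare equal (same function, same instance) but are not the
# same object.
m1, m2 = g.hello, g.hello
print(m1 == m2, m1 is m2)  # True False
```

Avoiding that intermediate object for the common `obj.method(...)` pattern is what the LOAD_METHOD/CALL_METHOD pair is for.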
It is a fairly expensive operation, despite a freelist of BoundMethods (so that memory allocation is generally avoided). The idea is to detect what looks like a method call in the compiler, and emit a pair of specialized bytecodes for that. So instead of LOAD_GLOBAL/LOAD_ATTR/CALL_FUNCTION we will have LOAD_GLOBAL/LOAD_METHOD/CALL_METHOD. LOAD_METHOD looks at the object on top of the stack, and checks if the name resolves to a method or to a regular attribute. If it's a method, then we push the unbound method object and the object to the stack. If it's an attribute, we push the resolved attribute and NULL. When CALL_METHOD looks at the stack it knows how to call the unbound method properly (pushing the object as a first arg), or how to call a regular callable. This idea does make CPython faster around 2-4%. And it surely doesn't make it slower. I think it's a safe bet to at least implement this optimization in CPython 3.6. So far, the patch only optimizes positional-only method calls. It's possible to optimize all kind of calls, but this will necessitate 3 more opcodes (explained in the issue). We'll need to do some careful benchmarking to see if it's really needed. Per-opcode cache in ceval ------------------------- While reading PEP 509, I was thinking about how we can use dict->ma_version in ceval to speed up globals lookups. One of the key assumptions (and this is what makes JITs possible) is that real-life programs don't modify globals and rebind builtins (often), and that most code paths operate on objects of the same type. In CPython, all pure Python functions have code objects. When you call a function, ceval executes its code object in a frame. Frames contain contextual information, including pointers to the globals and builtins dict. The key observation here is that almost all code objects always have same pointers to the globals (the module they were defined in) and to the builtins. 
And it's not a good programming practice to mutate globals or rebind builtins. Let's look at this function:

def spam():
    print(ham)

Here are its opcodes:

  2           0 LOAD_GLOBAL              0 (print)
              3 LOAD_GLOBAL              1 (ham)
              6 CALL_FUNCTION            1 (1 positional, 0 keyword pair)
              9 POP_TOP
             10 LOAD_CONST               0 (None)
             13 RETURN_VALUE

The opcodes we want to optimize are the two LOAD_GLOBALs, at offsets 0 and 3. Let's look at the first one, which loads the 'print' function from builtins. The opcode knows the following bits of information: - its offset (0), - its argument (0 -> 'print'), - its type (LOAD_GLOBAL). And these bits of information will *never* change. So if this opcode could resolve the 'print' name (from globals or builtins, likely the latter) and save the pointer to it somewhere, along with globals->ma_version and builtins->ma_version, it could, on its second call, just load this cached info back, check that the globals and builtins dicts haven't changed, and push the cached ref to the stack. That would save it from doing two dict lookups. We can also optimize LOAD_METHOD. There is a high chance that 'obj' in 'obj.method()' will be of the same type every time we execute the code object. So if we had an opcode cache, LOAD_METHOD could then cache a pointer to the resolved unbound method, a pointer to obj.__class__, and tp_version_tag of obj.__class__. Then it would only need to check that the cached object type is the same (and that it wasn't modified) and that obj.__dict__ doesn't override 'method'. Long story short, this caching really speeds up method calls on types implemented in C. list.append becomes very fast, because list doesn't have a __dict__, so the check is very cheap (with cache). A straightforward implementation of such a cache is simple but consumes a lot of memory that would just be wasted, since we only need such a cache for LOAD_GLOBAL and LOAD_METHOD opcodes. So we have to be creative about the cache design. Here's what I came up with: 1. We add a few fields to the code object. 2. 
ceval will count how many times each code object is executed.

3. When the code object is executed over ~900 times, we mark it as "hot". We also create an 'unsigned char' array "MAPPING", with its length set to match the length of the code object, so we have a 1-to-1 mapping between opcodes and the MAPPING array.

4. For the next ~100 calls, while the code object is "hot", LOAD_GLOBAL and LOAD_METHOD do "MAPPING[opcode_offset()]++".

5. After 1024 calls to the code object, the ceval loop will iterate through the MAPPING, counting all opcodes that were executed more than 50 times.

6. We then create an array of cache structs "CACHE" (here's a link to the updated code.h file: [6]). We update MAPPING to be a mapping between opcode position and position in the CACHE. The code object is now "optimized".

7. When the code object is "optimized", LOAD_METHOD and LOAD_GLOBAL use the CACHE array for the fast path.

8. When there is a cache miss, i.e. the builtins/globals/obj.__dict__ were mutated, the opcode marks its entry in CACHE as deoptimized, and it will never try to use the cache again.

Here's a link to the issue tracker with the first version of the patch: [5]. I'm working on the patch in a github repo here: [4]. Summary ------- There are many things about this algorithm that we can improve/tweak. Perhaps we should profile code objects longer, or account for the time they were executed. Maybe we shouldn't deoptimize opcodes on their first cache miss. Maybe we can come up with better data structures. We also need to profile the memory use and see how much more this cache will require. One thing I'm certain about is that we can get a 5-10% speedup of CPython with relatively low memory impact. And I think it's worth exploring that! If you're interested in these kinds of optimizations, please help with code reviews, ideas, profiling and benchmarks. The latter is especially important - I'd never have imagined how hard it is to come up with a good macro benchmark. 
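The version-tag caching scheme above can be modelled in a few lines of pure Python. All names here (VersionedNamespace, CachedGlobalLoad) are invented for illustration; the real cache lives in C structs inside the ceval loop.

```python
class VersionedNamespace:
    """Dict-like namespace with a monotonic version, like PEP 509's ma_version."""
    def __init__(self, **items):
        self._data = dict(items)
        self.version = 0
    def __getitem__(self, key):
        return self._data[key]
    def __setitem__(self, key, value):
        self._data[key] = value
        self.version += 1          # any mutation invalidates caches

class CachedGlobalLoad:
    """Models a single LOAD_GLOBAL site with an inline cache."""
    def __init__(self, name):
        self.name = name
        self._value = None
        self._seen_version = None
        self.misses = 0
    def __call__(self, ns):
        if ns.version != self._seen_version:   # miss: namespace changed
            self.misses += 1
            self._value = ns[self.name]
            self._seen_version = ns.version
        return self._value                     # hit: no dict lookup at all

ns = VersionedNamespace(print=print)
load_print = CachedGlobalLoad('print')
load_print(ns); load_print(ns)   # first call misses, second hits the cache
ns['ham'] = 'eggs'               # unrelated mutation still bumps the version
load_print(ns)                   # forced re-lookup
print(load_print.misses)  # 2
```

Note the toy model keeps retrying after a miss, whereas step 8 of the real design permanently deoptimizes the cache entry on its first miss.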
I also want to thank my company MagicStack (magic.io) for sponsoring this work. Thanks, Yury [1] https://gist.github.com/1st1/aed69d63a2ff4de4c7be [2] http://faster-cpython.readthedocs.org/index.html [3] http://bugs.python.org/issue26110 [4] https://github.com/1st1/cpython/tree/opcache2 [5] http://bugs.python.org/issue26219 [6] https://github.com/python/cpython/compare/master...1st1:opcache2?expand=1#diff-b253e61c56dfa646a6b1b9e7aaad418aR18 From brett at python.org Wed Jan 27 13:33:38 2016 From: brett at python.org (Brett Cannon) Date: Wed, 27 Jan 2016 18:33:38 +0000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <56A9084F.4090200@mail.de> References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A9084F.4090200@mail.de> Message-ID: On Wed, 27 Jan 2016 at 10:12 Sven R. Kunze wrote: > On 27.01.2016 11:59, Terry Reedy wrote: > > On 1/26/2016 12:35 PM, Sven R. Kunze wrote: > > I completely agree with INADA. > > > I an not sure you do. > > > I am sure I am. He wants to solve a problem the way that is natural to him > as a unique human being. > > > It's like saying, because a specific crossroad features a higher > accident rate, *people need to change their driving behavior*. > *No!* People won't change and it's not necessary either. The crossroad > needs to be changed to be safer. > > > Safer crossroads tend to be slower unless one switched to alternate > designs that eliminate crossing streams of traffic. > > > So Python can be safer AND faster ( = different design) if we try hard > enough. > > > Languages that don't have integers but use residue classes (with > wraparound) or finite integer classes (with overflow) as a (faster) > substitute have, in practice, lots of accidents (bugs) when used by > non-experts. Guido noticed this, gave up on changing coder behavior, and > put the expert behavior of checking for wraparound/overflow and switching > to real integers (longs) into the language. (I forget when this was > added.) 
> > > I am glad he did because it helps humans solve their problems in a natural > way without artificial boundaries. :) > > > The purpose of the artificially low input to fib() is to hide and avoid > the bugginess of most languages. The analogous trick with testing > crossroads would be to artificially restrict the density of cars to mask > the accident-proneness of a 'fast, consenting-adults' crossroads with no > stop signs and no stop lights. > > > I am completely with you here, however I disagree about suspected > hiding/avoiding mentality. You say: > > Python -> *no problem with big integers but slow at small integers* > Other Language -> *faster but breaks at big integers* > > Yes. That's it. > > We haven't solved the human side, however. A human AGAIN would need to > compromise on either speed or safety. > > > My point is: it would be insanely great if Python could be more like "*fast > AND no problem with big integers*". No compromise here (at least no > noticeable). > And this is why this entire email thread has devolved into a conversation that isn't really going anywhere. This whole thread has completely lost track of the point of Victor's earlier email saying "I'm still working on my FAT work and don't take any notice of the performance numbers until more stuff gets finished". And this discussion of what benchmarks to care about is rather pointless since the core team has an implicit understanding that any performance improvement is taken into consideration in terms of balancing complexity in CPython with how much improvement it gets us. So if someone wants to speed up Fibonacci then they are welcome to try, but the solution must be maintainable in proportion to the speed increase it buys Python as a whole. -Brett > > So, people could entirely *concentrate on their problem domain* without > every worrying about such tiny little, nitty-gritty computer science > details. 
I love computer science but people of other domains don't have the > time nor the knowledge to decide properly. That's the reason why they might > decide by using some weird micro-benchmarks. Just humans. > > > Same goes for Python. If it's slow using the very same piece of code > (even superficially), you better make the language faster. > Developers won't change and they won't change their code either. Just > not necessary. > > > Instead of making people rewrite fib to dramatically increase speed, we > added the lru-cache decorator to get most of the benefit without a > rewrite. But Inada rejected this Python speedup. An ast optimizer could > potentially do the same speedup without the explicit decorator. (No > side-effects? Multiple recursive calls? Add a cache!) > > > Bingo! That's the spirit. > > Why that decorator in the first place? Hey, I mean, if I ever want to > write some cryptic-looking source code with 3-letters abbreviations (LRU), > I use Assembler again. But I discovered and love Python and I never want to > go back when my problem domain does not require me to. So, when a machine > can detect such an optimization, hell, do it, please. It's more likely that > I apply it at the wrong function AND only in 10% of the correct cases: > missing 90% and introducing some wild errors. > > Again human stuff. > > > Btw. it would be a great feature for Python 3 to be faster than Python > 2. > > > We all agree on that. One way for this to happen is to add optimizers > that would make Python 'cheat' on micrebenchmarks > > > Then, we are all set. :) > > Best, > Sven > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From srkunze at mail.de Wed Jan 27 13:40:44 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Wed, 27 Jan 2016 19:40:44 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> Message-ID: <56A90F2C.9000401@mail.de> On 27.01.2016 12:16, Nick Coghlan wrote: > On 27 January 2016 at 03:35, Sven R. Kunze wrote: >> I completely agree with INADA. >> >> It's like saying, because a specific crossroad features a higher accident >> rate, people need to change their driving behavior. >> No! People won't change and it's not necessary either. The crossroad needs >> to be changed to be safer. > Umm, no, that's not how this works That's exactly how it works, Nick. INADA uses Python as I use crossroads each day. Daily human business. If you read his post carefully, you can discover that he just presented to you his perspective of the world. Moreover, I can assure you that he's not alone. As usual with humans it's not about facts or mathematically proven theorems but *perception*. It's more about marketing, little important details (or unimportant ones depending on whom you ask) and so on. Stating that he has a wrong perspective will not change anything. Believing that Python is treated unfair will not change that either. Most people believe what they see. When they see a "FUNCTION CALL", it's the same in every language. Why? Because it looks like a function call ( name + parentheses ), it's called "function call" even if it's implemented completely differently. It even doesn't matter if we use commas, 'def', return types, etc. Because people understand the bigger concept, so that is what people want to compare. Average Joe doesn't care and does not understand. He looks at the benchmarks. That is something he can understand. "While performance is not a matter when choosing first language, slowest of three makes bad impression and people feel less attractive about Python." 
<< just like that Not saying that INADA is an Average Joe, but I think you get the idea. > - developers contribute to > community driven projects for their *own* reasons. Nobody gets to tell > them what to do unless they're paying them. Bit off-topic. > Micro-optimising a poor algorithm won't deliver macro level > improvements because macro level code uses things like space-speed > trade-offs to improve the algorithmic efficiency (as in the example of > applying functools.lru_cache to a naive recursive fibonacci > implementation). I completely agree, Nick. :) But that isn't the issue here. > Victor's work on FAT optimiser is interesting because it offers > opportunities to speed up even code that is already algorithmically > efficient, as well as making CPython a better platform for > experimenting with those kinds of changes. Exactly. :) > More generally though, much larger opportunities for improvement lie > in persuading people to *stop writing code*, and instead spending more > of their time on finding and assembling code other people have > *already written* into solutions to interesting problems. *That's* the > kind of improvement that turns enormously complex problems like facial > recognition into 25 line Python scripts: > https://realpython.com/blog/python/face-recognition-with-python/ Interesting post. :) Thanks. Btw. I completely agree with you on the "improve programming education", but not everybody can do it; and not everybody wants to learn and to practice it properly. Best, Sven -------------- next part -------------- An HTML attachment was scrubbed... URL: From srkunze at mail.de Wed Jan 27 13:43:31 2016 From: srkunze at mail.de (Sven R. 
Kunze) Date: Wed, 27 Jan 2016 19:43:31 +0100 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A9084F.4090200@mail.de> Message-ID: <56A90FD3.80505@mail.de> On 27.01.2016 19:33, Brett Cannon wrote: > And this is why this entire email thread has devolved into a > conversation that isn't really going anywhere. This whole thread has > completely lost track of the point of Victor's earlier email saying > "I'm still working on my FAT work and don't take any notice of the > performance numbers until more stuff gets finished". And this > discussion of what benchmarks to care about is rather pointless since > the core team has an implicit understanding that any performance > improvement is taken into consideration in terms of balancing > complexity in CPython with how much improvement it gets us. So if > someone wants to speed up Fibonacci then they are welcome to try, but > the solution must be maintainable in proportion to the speed increase > it buys Python as a whole. +1 From brett at python.org Wed Jan 27 15:01:15 2016 From: brett at python.org (Brett Cannon) Date: Wed, 27 Jan 2016 20:01:15 +0000 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A90B97.7090001@gmail.com> References: <56A90B97.7090001@gmail.com> Message-ID: On Wed, 27 Jan 2016 at 10:26 Yury Selivanov wrote: > Hi, > > > tl;dr The summary is that I have a patch that improves CPython > performance up to 5-10% on macro benchmarks. Benchmarks results on > Macbook Pro/Mac OS X, desktop CPU/Linux, server CPU/Linux are available > at [1]. There are no slowdowns that I could reproduce consistently. > > There are twodifferent optimizations that yield this speedup: > LOAD_METHOD/CALL_METHOD opcodes and per-opcode cache in ceval loop. 
> > > LOAD_METHOD & CALL_METHOD > ------------------------- > > We had a lot of conversations with Victor about his PEP 509, and he sent > me a link to his amazing compilation of notes about CPython performance > [2]. One optimization that he pointed out to me was LOAD/CALL_METHOD > opcodes, an idea first originated in PyPy. > > There is a patch that implements this optimization, it's tracked here: > [3]. There are some low level details that I explained in the issue, > but I'll go over the high level design in this email as well. > > Every time you access a method attribute on an object, a BoundMethod > object is created. It is a fairly expensive operation, despite a > freelist of BoundMethods (so that memory allocation is generally > avoided). The idea is to detect what looks like a method call in the > compiler, and emit a pair of specialized bytecodes for that. > > So instead of LOAD_GLOBAL/LOAD_ATTR/CALL_FUNCTION we will have > LOAD_GLOBAL/LOAD_METHOD/CALL_METHOD. > > LOAD_METHOD looks at the object on top of the stack, and checks if the > name resolves to a method or to a regular attribute. If it's a method, > then we push the unbound method object and the object to the stack. If > it's an attribute, we push the resolved attribute and NULL. > > When CALL_METHOD looks at the stack it knows how to call the unbound > method properly (pushing the object as a first arg), or how to call a > regular callable. > > This idea does make CPython faster around 2-4%. And it surely doesn't > make it slower. I think it's a safe bet to at least implement this > optimization in CPython 3.6. > > So far, the patch only optimizes positional-only method calls. It's > possible to optimize all kind of calls, but this will necessitate 3 more > opcodes (explained in the issue). We'll need to do some careful > benchmarking to see if it's really needed. 
> > > Per-opcode cache in ceval > ------------------------- > > While reading PEP 509, I was thinking about how we can use > dict->ma_version in ceval to speed up globals lookups. One of the key > assumptions (and this is what makes JITs possible) is that real-life > programs don't modify globals and rebind builtins (often), and that most > code paths operate on objects of the same type. > > In CPython, all pure Python functions have code objects. When you call > a function, ceval executes its code object in a frame. Frames contain > contextual information, including pointers to the globals and builtins > dict. The key observation here is that almost all code objects always > have the same pointers to the globals (the module they were defined in) and > to the builtins. And it's not a good programming practice to mutate > globals or rebind builtins. > > Let's look at this function: > > def spam(): > print(ham) > > Here are its opcodes: > > 2 0 LOAD_GLOBAL 0 (print) > 3 LOAD_GLOBAL 1 (ham) > 6 CALL_FUNCTION 1 (1 positional, 0 keyword pair) > 9 POP_TOP > 10 LOAD_CONST 0 (None) > 13 RETURN_VALUE > > The opcodes we want to optimize are the LOAD_GLOBALs at offsets 0 and 3. Let's look at > the first one, which loads the 'print' function from builtins. The > opcode knows the following bits of information: > > - its offset (0), > - its argument (0 -> 'print'), > - its type (LOAD_GLOBAL). > > And these bits of information will *never* change. So if this opcode > could resolve the 'print' name (from globals or builtins, likely the > latter) and save the pointer to it somewhere, along with > globals->ma_version and builtins->ma_version, it could, on its second > call, just load this cached info back, check that the globals and > builtins dict haven't changed and push the cached ref to the stack. > That would save it from doing two dict lookups. > > We can also optimize LOAD_METHOD.
There is a high chance that 'obj' in > 'obj.method()' will be of the same type every time we execute the code > object. So if we had an opcode cache, LOAD_METHOD could then cache > a pointer to the resolved unbound method, a pointer to obj.__class__, > and tp_version_tag of obj.__class__. Then it would only need to check > if the cached object type is the same (and that it wasn't modified) and > that obj.__dict__ doesn't override 'method'. Long story short, this > caching really speeds up method calls on types implemented in C. > list.append becomes very fast, because list doesn't have a __dict__, so > the check is very cheap (with cache). > What would it take to make this work with Python-defined classes? I guess that would require knowing the version of the instance's __dict__, the instance's __class__ version, the MRO, and where the method object was found in the MRO and any intermediary classes to know if it was suddenly shadowed? I think that's everything. :) Obviously that's a lot, but I wonder how many classes have a deep inheritance model vs. inheriting only from `object`? In that case you only have to check self.__dict__.ma_version, self.__class__, self.__class__.__dict__.ma_version, and self.__class__.__class__ == `type`. I guess another way to look at this is to get an idea of how complex the checks have to get before caching something like this is not worth it (probably also depends on how often you mutate self.__dict__ thanks to mutating attributes, but you could in that instance just decide to always look at self.__dict__ for the method's key and then do the ma_version cache check for everything coming from the class). Otherwise we can consider looking at the caching strategies that Self helped pioneer (http://bibliography.selflanguage.org/) that all of the various JS engines lifted and consider caching all method lookups.
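The version-guarded global lookup described in the thread can be sketched in pure Python. Note that real CPython dicts do not expose ma_version to Python code, so this sketch simulates it with a counter; the attribute and function names are illustrative only:

```python
class VersionedDict(dict):
    # Simulates dict->ma_version: bump a counter on every mutation.
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.ma_version = 0

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self.ma_version += 1

def cached_load_global(cache, globals_, builtins_, name):
    entry = cache.get(name)
    if entry is not None:
        value, gver, bver = entry
        if gver == globals_.ma_version and bver == builtins_.ma_version:
            return value                    # hit: zero dict lookups
    # Miss (or first call): resolve normally, then remember the result
    # together with both dicts' current versions.
    value = globals_[name] if name in globals_ else builtins_[name]
    cache[name] = (value, globals_.ma_version, builtins_.ma_version)
    return value
```

Mutating either dict bumps its version, so the next lookup falls back to the slow path and refreshes the cache, which is exactly the invalidation behaviour the ma_version guard buys.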
> > A straightforward way to implement such a cache is simple, but it consumes > a lot of memory that would just be wasted, since we only need such a > cache for LOAD_GLOBAL and LOAD_METHOD opcodes. So we have to be creative > about the cache design. Here's what I came up with: > > 1. We add a few fields to the code object. > > 2. ceval will count how many times each code object is executed. > > 3. When the code object is executed over ~900 times, we mark it as > "hot". What happens if you simply consider all code as hot? Is the overhead of building the mapping such that you really need this, or is this simply to avoid some memory/startup cost? > We also create an 'unsigned char' array "MAPPING", with length > set to match the length of the code object. So we have a 1-to-1 mapping > between opcodes and the MAPPING array. > > 4. For the next ~100 calls, while the code object is "hot", LOAD_GLOBAL and > LOAD_METHOD do "MAPPING[opcode_offset()]++". > > 5. After 1024 calls to the code object, the ceval loop will iterate through > the MAPPING, counting all opcodes that were executed more than 50 times. > Where did the "50 times" boundary come from? Was this measured somehow or did you just guess at a number? > > 6. We then create an array of cache structs "CACHE" (here's a link to > the updated code.h file: [6]). We update MAPPING to be a mapping > between opcode position and position in the CACHE. The code object is > now "optimized". > > 7. When the code object is "optimized", LOAD_METHOD and LOAD_GLOBAL use > the CACHE array for the fast path. > > 8. When there is a cache miss, i.e. the builtins/global/obj.__dict__ > were mutated, the opcode marks its entry in 'CACHE' as deoptimized, and > it will never try to use the cache again. > > Here's a link to the issue tracker with the first version of the patch: > [5]. I'm working on the patch in a github repo here: [4]. > > > Summary > ------- > > There are many things about this algorithm that we can improve/tweak.
> Perhaps we should profile code objects longer, or account for the time they > were executed. Maybe we shouldn't deoptimize opcodes on their first > cache miss. Maybe we can come up with better data structures. We also > need to profile the memory and see how much more this cache will require. > > One thing I'm certain about is that we can get a 5-10% speedup of > CPython with relatively low memory impact. And I think it's worth > exploring that! > Great! > > If you're interested in these kinds of optimizations, please help with > code reviews, ideas, profiling and benchmarks. The latter is especially > important; I'd never have imagined how hard it is to come up with a good macro > benchmark. > Have you tried hg.python.org/benchmarks? Or are you looking for new benchmarks? If the latter then we should probably strike up a discussion on speed@ and start considering a new, unified benchmark suite that CPython, PyPy, Pyston, Jython, and IronPython can all agree on. > > I also want to thank my company MagicStack (magic.io) for sponsoring > this work. > Yep, thanks to all the companies sponsoring people doing work lately to try and speed things up! -------------- next part -------------- An HTML attachment was scrubbed... URL: From damien.p.george at gmail.com Wed Jan 27 15:10:03 2016 From: damien.p.george at gmail.com (Damien George) Date: Wed, 27 Jan 2016 20:10:03 +0000 Subject: [Python-Dev] Speeding up CPython 5-10% Message-ID: Hi Yuri, I think these are great ideas to speed up CPython. They are probably the simplest yet most effective ways to get performance improvements in the VM. MicroPython has had LOAD_METHOD/CALL_METHOD from the start (inspired by PyPy, and the main reason to have it is that you don't need to allocate on the heap when doing a simple method call).
The specific opcodes are: LOAD_METHOD # same behaviour as you propose CALL_METHOD # for calls with positional and/or keyword args CALL_METHOD_VAR_KW # for calls with one or both of */** We also have LOAD_ATTR, CALL_FUNCTION and CALL_FUNCTION_VAR_KW for non-method calls. MicroPython also has dictionary lookup caching, but it's a bit different to your proposal. We do something much simpler: each opcode that has a cache ability (eg LOAD_GLOBAL, STORE_GLOBAL, LOAD_ATTR, etc) includes a single byte in the opcode which is an offset-guess into the dictionary to find the desired element. Eg for LOAD_GLOBAL we have (pseudo code): CASE(LOAD_GLOBAL): key = DECODE_KEY; offset_guess = DECODE_BYTE; if (global_dict[offset_guess].key == key) { // found the element straight away } else { // not found, do a full lookup and save the offset offset_guess = dict_lookup(global_dict, key); UPDATE_BYTECODE(offset_guess); } PUSH(global_dict[offset_guess].elem); We have found that such caching gives a massive performance increase, on the order of 20%. The issue (for us) is that it increases bytecode size by a considerable amount, requires writeable bytecode, and can be non-deterministic in terms of lookup time. Those things are important in the embedded world, but not so much on the desktop. Good luck with it! Regards, Damien. From yselivanov.ml at gmail.com Wed Jan 27 15:26:35 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 15:26:35 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: References: <56A90B97.7090001@gmail.com> Message-ID: <56A927FB.8080307@gmail.com> On 2016-01-27 3:01 PM, Brett Cannon wrote: > > > [..] > > > We can also optimize LOAD_METHOD. There is a high chance that > 'obj' in > 'obj.method()' will be of the same type every time we execute the code
object. So if we had an opcode cache, LOAD_METHOD could then > cache > a pointer to the resolved unbound method, a pointer to obj.__class__, > and tp_version_tag of obj.__class__. Then it would only need to check > if the cached object type is the same (and that it wasn't > modified) and > that obj.__dict__ doesn't override 'method'. Long story short, this > caching really speeds up method calls on types implemented in C. > list.append becomes very fast, because list doesn't have a > __dict__, so > the check is very cheap (with cache). > > > What would it take to make this work with Python-defined classes? It already works for Python-defined classes. But it's a bit more expensive because you still have to check the object's __dict__. Still, there is a very noticeable performance increase (see the results of benchmark runs). > I guess that would require knowing the version of the instance's > __dict__, the instance's __class__ version, the MRO, and where the > method object was found in the MRO and any intermediary classes to > know if it was suddenly shadowed? I think that's everything. :) No, unfortunately we can't use the version of the instance's __dict__ as it is very volatile. The current implementation of the opcode cache works because types are much more stable. Remember, the cache is per *code object*, so it should work every time the code object is executed. class F: def spam(self): self.ham() # <- version of self.__dict__ is unstable # so we'll end up invalidating the cache # too often __class__ version, MRO changes, etc. are covered by tp_version_tag, which I use as one of the guards. > > Obviously that's a lot, but I wonder how many classes have a deep > inheritance model vs. inheriting only from `object`? In that case you > only have to check self.__dict__.ma_version, self.__class__, > self.__class__.__dict__.ma_version, and self.__class__.__class__ == > `type`.
I guess another way to look at this is to get an idea of how > complex the checks have to get before caching something like this > is not worth it (probably also depends on how often you mutate > self.__dict__ thanks to mutating attributes, but you could in that > instance just decide to always look at self.__dict__ for the method's > key and then do the ma_version cache check for everything coming from > the class). > > Otherwise we can consider looking at the caching strategies that > Self helped pioneer (http://bibliography.selflanguage.org/) that all > of the various JS engines lifted and consider caching all method lookups. Yeah, hidden classes are great. But the infrastructure to support them properly is huge. I think that to make them work you'll need a JIT -- to trace, deoptimize, optimize, and do it all with a reasonable memory footprint. My patch is much smaller and simpler, something we can realistically tune and ship in 3.6. > > A straightforward way to implement such a cache is simple, but it > consumes > a lot of memory that would just be wasted, since we only need such a > cache for LOAD_GLOBAL and LOAD_METHOD opcodes. So we have to be > creative > about the cache design. Here's what I came up with: > > 1. We add a few fields to the code object. > > 2. ceval will count how many times each code object is executed. > > 3. When the code object is executed over ~900 times, we mark it as > "hot". > > > What happens if you simply consider all code as hot? Is the overhead > of building the mapping such that you really need this, or is this > simply to avoid some memory/startup cost? That's the first step for this patch. I think we need to profile several big applications (I'll do it later for some of my code bases) and see how big the memory impact is if we optimize everything. In any case, I expect it to be noticeable (which may be acceptable), so we'll probably try to optimize it.
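The counting heuristic discussed here (steps 2-5 of the plan quoted in this thread) can be sketched as follows. The 900/100/50 thresholds come straight from the description; the class itself is an illustrative stand-in for the fields added to the real code object:

```python
HOT_CALLS = 900    # calls before a code object is considered "hot"
TRACE_CALLS = 100  # calls during which per-opcode hits are counted
MIN_HITS = 50      # hits an opcode needs to earn a CACHE entry

class CodeObjectStats:
    def __init__(self, n_opcodes):
        self.calls = 0
        self.mapping = [0] * n_opcodes  # one counter per opcode offset

    def on_call(self):
        self.calls += 1

    def is_hot(self):
        return HOT_CALLS < self.calls <= HOT_CALLS + TRACE_CALLS

    def record(self, offset):
        # Done by LOAD_GLOBAL/LOAD_METHOD while the code object is hot.
        if self.is_hot():
            self.mapping[offset] += 1

    def opcodes_to_cache(self):
        # After ~1024 calls, pick the offsets worth a CACHE slot.
        return [i for i, hits in enumerate(self.mapping) if hits > MIN_HITS]
```

An opcode on a rarely taken branch accumulates few hits during the trace window, so it never earns a cache slot; that is exactly the effect the 50-hit boundary is meant to have.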
> We also create an 'unsigned char' array "MAPPING", with length > set to match the length of the code object. So we have a 1-to-1 > mapping > between opcodes and the MAPPING array. > > 4. For the next ~100 calls, while the code object is "hot", LOAD_GLOBAL and > LOAD_METHOD do "MAPPING[opcode_offset()]++". > > 5. After 1024 calls to the code object, the ceval loop will iterate > through > the MAPPING, counting all opcodes that were executed more than 50 > times. > > > Where did the "50 times" boundary come from? Was this measured somehow > or did you just guess at a number? If the number is too low, then you'll optimize code in branches that are rarely executed. So I picked 50, because I only trace opcodes for 100 calls. All of those numbers can be (should be?) changed, and I think we should experiment with different heuristics. > [..] > > > If you're interested in these kinds of optimizations, please help with > code reviews, ideas, profiling and benchmarks. The latter is > especially > important; I'd never have imagined how hard it is to come up with a good > macro > benchmark. > > > Have you tried hg.python.org/benchmarks? Yes: https://gist.github.com/1st1/aed69d63a2ff4de4c7be > Or are you looking for new benchmarks? If the latter then we should > probably strike up a discussion on speed@ and start considering a new, > unified benchmark suite that CPython, PyPy, Pyston, Jython, and > IronPython can all agree on. Yes, IMHO we need better benchmarks. Some of the existing ones are very unstable -- I can run them three times and get three completely different results. Benchmarking is hard :) I'll create a few issues on bugs.python.org with new/updated benchmarks, and will join the speed@ mailing list.
Yury From v+python at g.nevcal.com Wed Jan 27 15:32:41 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 27 Jan 2016 12:32:41 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <22184.64112.8831.955609@turnbull.sk.tsukuba.ac.jp> References: <56A698CF.3080402@mail.de> <22183.45626.891297.272536@turnbull.sk.tsukuba.ac.jp> <22184.64112.8831.955609@turnbull.sk.tsukuba.ac.jp> Message-ID: <56A92969.2030600@g.nevcal.com> On 1/27/2016 9:12 AM, Stephen J. Turnbull wrote: > Without that knowledge and effort, choosing a programming language > based on microbenchmarks is like choosing a car based on the > leg-length of the model sitting on the hood in the TV commercial. +1 QOTD -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Jan 27 15:37:49 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 15:37:49 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: References: Message-ID: <56A92A9D.70508@gmail.com> On 2016-01-27 3:10 PM, Damien George wrote: > Hi Yuri, > > I think these are great ideas to speed up CPython. They are probably > the simplest yet most effective ways to get performance improvements > in the VM. Thanks! > > MicroPython has had LOAD_METHOD/CALL_METHOD from the start (inspired > by PyPy, and the main reason to have it is because you don't need to > allocate on the heap when doing a simple method call). The specific > opcodes are: > > LOAD_METHOD # same behaviour as you propose > CALL_METHOD # for calls with positional and/or keyword args > CALL_METHOD_VAR_KW # for calls with one or both of */** > > We also have LOAD_ATTR, CALL_FUNCTION and CALL_FUNCTION_VAR_KW for > non-method calls. Yes, we'll need to add CALL_METHOD{_VAR|_KW|etc} opcodes to optimize all kinds of method calls. However, I'm not sure how big the impact will be, need to do more benchmarking. BTW, how do you benchmark MicroPython?
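For reference, the offset-guess scheme Damien describes (quoted again below) can be approximated in pure Python. The dict's internal slot array is simulated as a list of key/value pairs and the writable inline cache byte as a one-element list; both are assumptions of this sketch, not MicroPython's actual data structures:

```python
def load_global(slots, guess, key):
    # Fast path: trust the offset cached by the previous lookup.
    i = guess[0]
    if i < len(slots) and slots[i][0] == key:
        return slots[i][1]          # guess was right: no full lookup
    # Slow path: a full scan stands in for the real hash-table probe,
    # then "UPDATE_BYTECODE" by remembering the offset we found.
    for i, (k, v) in enumerate(slots):
        if k == key:
            guess[0] = i
            return v
    raise NameError(key)
```

Because the guess is stored next to the opcode rather than keyed by name, a hit costs a single comparison, which is where the reported ~20% win comes from; the trade-off is the larger, writable bytecode Damien mentions.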
> > MicroPython also has dictionary lookup caching, but it's a bit > different to your proposal. We do something much simpler: each opcode > that has a cache ability (eg LOAD_GLOBAL, STORE_GLOBAL, LOAD_ATTR, > etc) includes a single byte in the opcode which is an offset-guess > into the dictionary to find the desired element. Eg for LOAD_GLOBAL > we have (pseudo code): > > CASE(LOAD_GLOBAL): > key = DECODE_KEY; > offset_guess = DECODE_BYTE; > if (global_dict[offset_guess].key == key) { > // found the element straight away > } else { > // not found, do a full lookup and save the offset > offset_guess = dict_lookup(global_dict, key); > UPDATE_BYTECODE(offset_guess); > } > PUSH(global_dict[offset_guess].elem); > > We have found that such caching gives a massive performance increase, > on the order of 20%. The issue (for us) is that it increases bytecode > size by a considerable amount, requires writeable bytecode, and can be > non-deterministic in terms of lookup time. Those things are important > in the embedded world, but not so much on the desktop. That's a neat idea! You're right, it does require bytecode to become writeable. I considered implementing a similar strategy, but this would be a big change for CPython. So I decided to minimize the impact of the patch and leave the opcodes untouched. Thanks! Yury From damien.p.george at gmail.com Wed Jan 27 16:20:27 2016 From: damien.p.george at gmail.com (Damien George) Date: Wed, 27 Jan 2016 21:20:27 +0000 Subject: [Python-Dev] Speeding up CPython 5-10% Message-ID: Hi Yury, (Sorry for misspelling your name previously!) > Yes, we'll need to add CALL_METHOD{_VAR|_KW|etc} opcodes to optimize all > kind of method calls. However, I'm not sure how big the impact will be, > need to do more benchmarking. I never did such fine grained analysis with MicroPython. I don't think there are many uses of * and ** that it'd be worth it, but definitely there are lots of uses of plain keywords. 
Also, you'd want to consider how simple/complex it is to treat all these different opcodes in the compiler. For us, it's simpler to treat everything the same. Otherwise your LOAD_METHOD part of the compiler will need to peek deep into the AST to see what kind of call it is. > BTW, how do you benchmark MicroPython? Haha, good question! Well, we use Pystone 1.2 (unmodified) to do basic benchmarking, and find it to be quite good. We track our code live at: http://micropython.org/resources/code-dashboard/ You can see there the red line, which is the Pystone result. There was a big jump around Jan 2015 which is when we introduced opcode dictionary caching. And since then it's been very gradually increasing due to small optimisations here and there. Pystone is actually a great benchmark for embedded systems because it gives very reliable results there (almost zero variation across runs) and if we can squeeze 5 more Pystones out with some change then we know that it's a good optimisation (for efficiency at least). For us, low RAM usage and small code size are the most important factors, and we track those meticulously. But in fact, smaller code size quite often correlates with more efficient code because there's less to execute and it fits in the CPU cache (at least on the desktop). We do have some other benchmarks, but they are highly specialised for us. For example, how fast can you bit bang a GPIO pin using pure Python code. Currently we get around 200kHz on a 168MHz MCU, which shows that pure (Micro)Python code is about 100 times slower than C. > That's a neat idea! You're right, it does require bytecode to become > writeable. I considered implementing a similar strategy, but this would > be a big change for CPython. So I decided to minimize the impact of the > patch and leave the opcodes untouched. I think you need to consider "big" changes, especially ones like this that can have a great (and good) impact. 
But really, this is a behind-the-scenes change that *should not* affect end users, and so you should not have any second thoughts about doing it. One problem I see with CPython is that it exposes way too much to the user (both Python programmer and C extension writer) and this hurts both language evolution (you constantly need to provide backwards compatibility) and ability to optimise. Cheers, Damien. From yselivanov.ml at gmail.com Wed Jan 27 16:53:55 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 16:53:55 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: References: Message-ID: <56A93C73.6030606@gmail.com> Damien, On 2016-01-27 4:20 PM, Damien George wrote: > Hi Yury, > > (Sorry for misspelling your name previously!) NP. As long as the first letter is "y" I don't care ;) > >> Yes, we'll need to add CALL_METHOD{_VAR|_KW|etc} opcodes to optimize all >> kinds of method calls. However, I'm not sure how big the impact will be, >> need to do more benchmarking. > I never did such fine grained analysis with MicroPython. I don't > think there are many uses of * and ** that it'd be worth it, but > definitely there are lots of uses of plain keywords. Also, you'd want > to consider how simple/complex it is to treat all these different > opcodes in the compiler. For us, it's simpler to treat everything the > same. Otherwise your LOAD_METHOD part of the compiler will need to > peek deep into the AST to see what kind of call it is. > >> BTW, how do you benchmark MicroPython? > Haha, good question! Well, we use Pystone 1.2 (unmodified) to do > basic benchmarking, and find it to be quite good. We track our code > live at: > > http://micropython.org/resources/code-dashboard/ The dashboard is cool! Off-topic: have you ever tried hg.python.org/benchmarks or compared MicroPython vs CPython? I'm curious if MicroPython is faster -- in that case we'll try to copy some optimization ideas.
> You can see there the red line, which is the Pystone result. There > was a big jump around Jan 2015 which is when we introduced opcode > dictionary caching. And since then it's been very gradually > increasing due to small optimisations here and there. Do you use opcode dictionary caching only for LOAD_GLOBAL-like opcodes? Do you have an equivalent of LOAD_FAST, or do you use dicts to store local variables? >> That's a neat idea! You're right, it does require bytecode to become >> writeable. I considered implementing a similar strategy, but this would >> be a big change for CPython. So I decided to minimize the impact of the >> patch and leave the opcodes untouched. > I think you need to consider "big" changes, especially ones like this > that can have a great (and good) impact. But really, this is a > behind-the-scenes change that *should not* affect end users, and so > you should not have any second thoughts about doing it. If we change the opcode size, it will probably affect libraries that compose or modify code objects. Modules like "dis" will also need to be updated. And that's probably just the tip of the iceberg. We can still implement your approach if we add a separate private 'unsigned char' array to each code object, so that LOAD_GLOBAL can store the key offsets. It should be a bit faster than my current patch, since it has one less level of indirection. But this way we lose the ability to optimize LOAD_METHOD, simply because it requires more memory for its cache. In any case, I'll experiment! > One problem I > see with CPython is that it exposes way too much to the user (both > Python programmer and C extension writer) and this hurts both language > evolution (you constantly need to provide backwards compatibility) and > ability to optimise. Right. Even though CPython explicitly states that opcodes and code objects might change in the future, we still have to be careful about changing them.
Yury From yselivanov.ml at gmail.com Wed Jan 27 17:01:17 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 17:01:17 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A90B97.7090001@gmail.com> References: <56A90B97.7090001@gmail.com> Message-ID: <56A93E2D.5060707@gmail.com> As Brett suggested, I've just run the benchmark suite with memory tracking on. The results are here: https://gist.github.com/1st1/1851afb2773526fd7c58 Looks like the memory increase is around 1%. One synthetic micro-benchmark, unpack_sequence, which contains hundreds of lines that load a global variable and do nothing else, consumes 5% more. Yury From v+python at g.nevcal.com Wed Jan 27 15:46:15 2016 From: v+python at g.nevcal.com (Glenn Linderman) Date: Wed, 27 Jan 2016 12:46:15 -0800 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A92A9D.70508@gmail.com> References: <56A92A9D.70508@gmail.com> Message-ID: <56A92C97.30806@g.nevcal.com> On 1/27/2016 12:37 PM, Yury Selivanov wrote: > >> >> MicroPython also has dictionary lookup caching, but it's a bit >> different to your proposal. We do something much simpler: each opcode >> that has a cache ability (eg LOAD_GLOBAL, STORE_GLOBAL, LOAD_ATTR, >> etc) includes a single byte in the opcode which is an offset-guess >> into the dictionary to find the desired element. Eg for LOAD_GLOBAL >> we have (pseudo code): >> >> CASE(LOAD_GLOBAL): >> key = DECODE_KEY; >> offset_guess = DECODE_BYTE; >> if (global_dict[offset_guess].key == key) { >> // found the element straight away >> } else { >> // not found, do a full lookup and save the offset >> offset_guess = dict_lookup(global_dict, key); >> UPDATE_BYTECODE(offset_guess); >> } >> PUSH(global_dict[offset_guess].elem); >> >> We have found that such caching gives a massive performance increase, >> on the order of 20%.
The issue (for us) is that it increases bytecode >> size by a considerable amount, requires writeable bytecode, and can be >> non-deterministic in terms of lookup time. Those things are important >> in the embedded world, but not so much on the desktop. > > That's a neat idea! You're right, it does require bytecode to become > writeable. Would it? Remember "fixup lists"? Maybe they still exist for loading function addresses from one DLL into the code of another at load time? So the equivalent for bytecode requires a static table of offset_guess, and the offsets into that table are allocated by the byte-code loader at byte-code load time, and the byte-code is "fixed up" at load time to use the correct offsets into the offset_guess table. It takes one more indirection to find the guess, but if the result is a 20% improvement, maybe you'd still get 19%... -------------- next part -------------- An HTML attachment was scrubbed... URL: From yselivanov.ml at gmail.com Wed Jan 27 17:40:26 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 17:40:26 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A90B97.7090001@gmail.com> References: <56A90B97.7090001@gmail.com> Message-ID: <56A9475A.3040300@gmail.com> BTW, this optimization also makes some old optimization tricks obsolete. 1. No need to write 'def func(len=len)'. Globals lookups will be fast. 2. No need to save bound methods: obj = [] obj_append = obj.append for _ in range(10**6): obj_append(something) This hand-optimized code would only be marginally faster, because of LOAD_METHOD and how it's cached. 
Yury From yselivanov.ml at gmail.com Wed Jan 27 17:58:11 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Wed, 27 Jan 2016 17:58:11 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A92C97.30806@g.nevcal.com> References: <56A92A9D.70508@gmail.com> <56A92C97.30806@g.nevcal.com> Message-ID: <56A94B83.5020705@gmail.com> On 2016-01-27 3:46 PM, Glenn Linderman wrote: > On 1/27/2016 12:37 PM, Yury Selivanov wrote: >> >>> >>> MicroPython also has dictionary lookup caching, but it's a bit >>> different to your proposal. We do something much simpler: each opcode >>> that has a cache ability (eg LOAD_GLOBAL, STORE_GLOBAL, LOAD_ATTR, >>> etc) includes a single byte in the opcode which is an offset-guess >>> into the dictionary to find the desired element. Eg for LOAD_GLOBAL >>> we have (pseudo code): >>> >>> CASE(LOAD_GLOBAL): >>> key = DECODE_KEY; >>> offset_guess = DECODE_BYTE; >>> if (global_dict[offset_guess].key == key) { >>> // found the element straight away >>> } else { >>> // not found, do a full lookup and save the offset >>> offset_guess = dict_lookup(global_dict, key); >>> UPDATE_BYTECODE(offset_guess); >>> } >>> PUSH(global_dict[offset_guess].elem); >>> >>> We have found that such caching gives a massive performance increase, >>> on the order of 20%. The issue (for us) is that it increases bytecode >>> size by a considerable amount, requires writeable bytecode, and can be >>> non-deterministic in terms of lookup time. Those things are important >>> in the embedded world, but not so much on the desktop. >> >> That's a neat idea! You're right, it does require bytecode to become >> writeable. > > Would it? > > Remember "fixup lists"? Maybe they still exist for loading function > addresses from one DLL into the code of another at load time? 
> > So the equivalent for bytecode requires a static table of > offset_guess, and the offsets into that table are allocated by the > byte-code loader at byte-code load time, and the byte-code is "fixed > up" at load time to use the correct offsets into the offset_guess > table. It takes one more indirection to find the guess, but if the > result is a 20% improvement, maybe you'd still get 19%... Right, in my current patch I have an offset table per code object. Essentially, this offset table adds 8 bits per opcode. It also means that only the first 255 LOAD_GLOBAL/LOAD_METHOD opcodes *per-code-object* are optimized (because the offset table can only store 8-bit offsets), which is usually enough (I think you'd need a function of more than 500 lines to reach that limit). Yury From stephen at xemacs.org Wed Jan 27 19:53:51 2016 From: stephen at xemacs.org (Stephen J. Turnbull) Date: Thu, 28 Jan 2016 09:53:51 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A9084F.4090200@mail.de> Message-ID: <22185.26271.894312.852637@turnbull.sk.tsukuba.ac.jp> Brett Cannon writes: > the core team has an implicit understanding that any performance > improvement is taken into consideration in terms of balancing > complexity in CPython with how much improvement it gets us. EIBTI. I can shut up now. Thank you! From ncoghlan at gmail.com Thu Jan 28 02:58:07 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 28 Jan 2016 17:58:07 +1000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: <56A90F2C.9000401@mail.de> References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A90F2C.9000401@mail.de> Message-ID: On 28 January 2016 at 04:40, Sven R. Kunze wrote: > On 27.01.2016 12:16, Nick Coghlan wrote: >> Umm, no, that's not how this works > That's exactly how it works, Nick. > > INADA uses Python as I use crossroads each day. Daily human business.
> > If you read his post carefully, you can discover that he just presented to > you his perspective of the world. Moreover, I can assure you that he's not > alone. As usual with humans it's not about facts or mathematically proven > theorems but perception. It's more about marketing, little important details > (or unimportant ones depending on whom you ask) and so on. Stating that he > has a wrong perspective will not change anything. The only part I disagree with is requesting that *other people* care about marketing numbers if that's not something they're already inclined to care about. I'm not in any way disputing that folks make decisions based on inappropriate metrics, nor that it bothers some folks that there are dozens of perfectly viable programming languages people may choose to use instead of Python. The fact remains that contributors to open source projects work on what they want to work on or on what their employers pay them to work on (for a lucky few, those are the same thing), so telling other contributors that they're working on the "wrong thing" because their priorities differ from our priorities is almost always going to be irritating rather than helpful. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From songofacandy at gmail.com Thu Jan 28 03:30:43 2016 From: songofacandy at gmail.com (INADA Naoki) Date: Thu, 28 Jan 2016 17:30:43 +0900 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A90F2C.9000401@mail.de> Message-ID: Please stop. I'm sorry about messing up this thread. I just wanted to represent why I'm very interested in Victor's efforts. Regards. On Thu, Jan 28, 2016 at 4:58 PM, Nick Coghlan wrote: > On 28 January 2016 at 04:40, Sven R. Kunze wrote: > > On 27.01.2016 12:16, Nick Coghlan wrote: > >> Umm, no, that's not how this works > > That's exactly how it works, Nick. 
> > > > INADA uses Python as I use crossroads each day. Daily human business. > > > > If you read his post carefully, you can discover that he just presented > to > > you his perspective of the world. Moreover, I can assure you that he's > not > > alone. As usual with humans it's not about facts or mathematically proven > > theorems but perception. It's more about marketing, little important > details > > (or unimportant ones depending on whom you ask) and so on. Stating that > he > > has a wrong perspective will not change anything. > > The only part I disagree with is requesting that *other people* care > about marketing numbers if that's not something they're already > inclined to care about. I'm not in any way disputing that folks make > decisions based on inappropriate metrics, nor that it bothers some > folks that there are dozens of perfectly viable programming languages > people may choose to use instead of Python. > > The fact remains that contributors to open source projects work on > what they want to work on or on what their employers pay them to work > on (for a lucky few, those are the same thing), so telling other > contributors that they're working on the "wrong thing" because their > priorities differ from our priorities is almost always going to be > irritating rather than helpful. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > -- INADA Naoki -------------- next part -------------- An HTML attachment was scrubbed... URL: From vadmium+py at gmail.com Thu Jan 28 03:35:59 2016 From: vadmium+py at gmail.com (Martin Panter) Date: Thu, 28 Jan 2016 08:35:59 +0000 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? 
In-Reply-To: References: Message-ID: > After digging through test_socket.py for over an hour (the MRO for > RecvmsgUDP6Test is enormous!!), I've boiled the issue down to this: > > import socket > MSG = b'asdf qwer zxcv' > serv = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) > serv.bind(("::1", 0)) > cli = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) > cli.bind(("::1", 0)) > cli.sendto(MSG, serv.getsockname()) > print(serv.recvmsg(len(MSG) - 3, 0, socket.MSG_PEEK)) > print(serv.recvmsg(len(MSG), 0, socket.MSG_PEEK)) > print(serv.recvmsg(len(MSG))) > > On my main system, this produces three lines of output: the first has > truncated text, the second has full text, and the third also has full > text. This proves that MSG_PEEK is working correctly. On the buildbot, > though, the first one stalls out. Commenting that line out produces > correct results - peek the full data, then read it, and all is well. > > Any idea why partial read on a datagram socket would sometimes stall? I think it would stall if there is no data to receive. Maybe check the return value of sendto(), to ensure it is sending the whole message. Attached is a C program which should do the equivalent of your boiled-down Python script, in case that helps: $ gcc -Wall peek-udp6.c -o peek-udp6 $ ./peek-udp6 Bytes sent: 14 Received [asdf qwer z] Received [asdf qwer zxcv] Received [asdf qwer zxcv] Other things that come to mind are to see if there is anything odd about the buildbot's Linux kernel and glibc versions. Maybe run the Python script under "strace" to see if anything strange is going on. -------------- next part -------------- A non-text attachment was scrubbed...
Name: peek-udp6.c Type: text/x-csrc Size: 2404 bytes Desc: not available URL: From ncoghlan at gmail.com Thu Jan 28 04:39:52 2016 From: ncoghlan at gmail.com (Nick Coghlan) Date: Thu, 28 Jan 2016 19:39:52 +1000 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A90F2C.9000401@mail.de> Message-ID: On 28 January 2016 at 18:30, INADA Naoki wrote: > Please stop. > > I'm sorry about messing up this thread. Not your fault at all! This is just a particular bugbear of mine, since software architecture design (including appropriate programming language selection) is an even more poorly understood discipline than software development in general :) > I just wanted to represent why I'm very interested in Victor's efforts. And thanks for posting that, as it is indeed cool that the optimisation efforts currently being discussed may result in performance improvements on some of the simplified micro-benchmarks popular in programming language shootouts. There's no way you could have anticipated the subsequent tangential discussion on motives for contributing to open source projects, and the impact that has on what we can reasonably expect from fellow contributors. Cheers, Nick. -- Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia From rosuav at gmail.com Thu Jan 28 05:41:08 2016 From: rosuav at gmail.com (Chris Angelico) Date: Thu, 28 Jan 2016 21:41:08 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: On Thu, Jan 28, 2016 at 7:35 PM, Martin Panter wrote: > Other things that come to mind are to see if there is anything odd > about the buildbot?s Linux kernel and glibc versions. Maybe run the > Python script under ?strace? to see if anything strange is going on. > I did that, and a few other things. 
Most notably, commenting out the partial-read resulted in a flawless run, and strace showed it stalling in the partial read too. However, as I was doing so (and I just discarded a draft message where I'd been typing up notes), my entire system went kerblooie, and I've spent the last day rebuilding stuff from scratch. When I get around to it, I'll rebuild the buildbot VM - and it'll be Debian Jessie (current stable), because there's no particular reason to use Wheezy (oldstable). So the problem will almost certainly disappear. ChrisA From stefan at bytereef.org Thu Jan 28 08:49:50 2016 From: stefan at bytereef.org (Stefan Krah) Date: Thu, 28 Jan 2016 14:49:50 +0100 Subject: [Python-Dev] [Python-checkins] BAD Benchmark Results for Python Default 2016-01-26 In-Reply-To: References: <2acd60eb-a5d1-4b31-853d-7c7f3f0841fa@irsmsx101.ger.corp.intel.com> Message-ID: <20160128134950.GA3196@bytereef.org> IMO the timings of the benchmark suite are a bit unstable -- this is not the fault of Intel's setup, I noticed it also when running the suite myself. On Tue, Jan 26, 2016 at 06:48:54PM +0000, Stewart, David C wrote: > Wow, what happened to Python default to cause such a regression? 
> > > > > On 1/26/16, 7:31 AM, "lp_benchmark_robot" wrote: > > >Results for project Python default, build date 2016-01-26 03:07:40 +0000 > >commit: cbd4a6a2657e > >previous commit: f700bc0412bc > >revision date: 2016-01-26 02:54:37 +0000 > >environment: Haswell-EP > > cpu: Intel(R) Xeon(R) CPU E5-2699 v3 @ 2.30GHz 2x18 cores, stepping 2, LLC 45 MB > > mem: 128 GB > > os: CentOS 7.1 > > kernel: Linux 3.10.0-229.4.2.el7.x86_64 > > > >Baseline results were generated using release v3.4.3, with hash b4cbecbc0781 > >from 2015-02-25 12:15:33+00:00 > > > >---------------------------------------------------------------------------------- > > benchmark relative change since change since current rev run > > std_dev* last run baseline with PGO > >---------------------------------------------------------------------------------- > >:-) django_v2 0.21% -2.93% 8.95% 16.19% > >:-| pybench 0.10% 0.05% -1.87% 5.40% > >:-( regex_v8 2.72% -0.02% -4.67% 4.57% > >:-| nbody 0.13% -0.92% -1.33% 7.40% > >:-| json_dump_v2 0.20% 0.87% -1.59% 11.48% > >:-| normal_startup 0.90% -0.57% 0.10% 5.35% > >---------------------------------------------------------------------------------- > >* Relative Standard Deviation (Standard Deviation/Average) > > > >If this is not displayed properly please visit our results page here: http://languagesperformance.intel.com/bad-benchmark-results-for-python-default-2016-01-26/ > > > >Note: Benchmark results are measured in seconds. > > > >Subject Label Legend: > >Attributes are determined based on the performance evolution of the workloads > >compared to the previous measurement iteration. 
> >NEUTRAL: performance did not change by more than 1% for any workload > >GOOD: performance improved by more than 1% for at least one workload and there > >is no regression greater than 1% > >BAD: performance dropped by more than 1% for at least one workload and there is > >no improvement greater than 1% > >UGLY: performance improved by more than 1% for at least one workload and also > >dropped by more than 1% for at least one workload > > > > > >Our lab does a nightly source pull and build of the Python project and measures > >performance changes against the previous stable version and the previous nightly > >measurement. This is provided as a service to the community so that quality > >issues with current hardware can be identified quickly. > > > >Intel technologies' features and benefits depend on system configuration and may > >require enabled hardware, software or service activation. Performance varies > >depending on system configuration. > _______________________________________________ > Python-checkins mailing list > Python-checkins at python.org > https://mail.python.org/mailman/listinfo/python-checkins From larry at hastings.org Thu Jan 28 09:57:20 2016 From: larry at hastings.org (Larry Hastings) Date: Thu, 28 Jan 2016 06:57:20 -0800 Subject: [Python-Dev] Fun with ancient unsupported platforms Message-ID: <56AA2C50.7050700@hastings.org> Check out and cd into Python trunk. % grep -Ri win16 * | wc 10 66 625 % grep -Ri nextstep | wc 23 119 1328 % grep -Ri rhapsody * | wc 47 269 3390 % grep -Ri msdos * | wc 56 381 3895 % grep -Ri ms-dos * | wc 20 180 1425 win16! *laughs* *wipes tear from eye* It's currently 2016. Perhaps it's time to remove all vestiges of these unsupported operating systems nobody's cared about since a year that started with a '1'? //arry/ p.s. I suspect some of the uses of "rhapsody" are actually live code used for OS X. So this isn't necessarily dead code, some of it is merely long-out-of-date comments. p.p.s. 
At least there were no references to "taligent", "copland", or "gershwin"...! -------------- next part -------------- An HTML attachment was scrubbed... URL: From christian at python.org Thu Jan 28 10:40:48 2016 From: christian at python.org (Christian Heimes) Date: Thu, 28 Jan 2016 16:40:48 +0100 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: <56AA2C50.7050700@hastings.org> References: <56AA2C50.7050700@hastings.org> Message-ID: <56AA3680.8060602@python.org> On 2016-01-28 15:57, Larry Hastings wrote: > > > Check out and cd into Python trunk. > > % grep -Ri win16 * | wc > 10 66 625 > > % grep -Ri nextstep | wc > 23 119 1328 > > % grep -Ri rhapsody * | wc > 47 269 3390 > > % grep -Ri msdos * | wc > 56 381 3895 > % grep -Ri ms-dos * | wc > 20 180 1425 > > > win16! *laughs* *wipes tear from eye* > > It's currently 2016. Perhaps it's time to remove all vestiges of these > unsupported operating systems nobody's cared about since a year that > started with a '1'? The platform module has more hilarious comments: Still needed: * more support for WinCE * support for MS-DOS (PythonDX ?) * support for Amiga and other still unsupported platforms running Python Christian From rymg19 at gmail.com Thu Jan 28 10:44:32 2016 From: rymg19 at gmail.com (Ryan Gonzalez) Date: Thu, 28 Jan 2016 09:44:32 -0600 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: <56AA2C50.7050700@hastings.org> References: <56AA2C50.7050700@hastings.org> Message-ID: win16 doesn't seem to have important stuff: https://github.com/python/cpython/search?utf8=?&q="win16" On January 28, 2016 8:57:20 AM CST, Larry Hastings wrote: > > >Check out and cd into Python trunk. > >% grep -Ri win16 * | wc > 10 66 625 > >% grep -Ri nextstep | wc > 23 119 1328 > >% grep -Ri rhapsody * | wc > 47 269 3390 > >% grep -Ri msdos * | wc > 56 381 3895 > % grep -Ri ms-dos * | wc > 20 180 1425 > > >win16! *laughs* *wipes tear from eye* > >It's currently 2016. 
Perhaps it's time to remove all vestiges of these > >unsupported operating systems nobody's cared about since a year that >started with a '1'? > > >//arry/ > >p.s. I suspect some of the uses of "rhapsody" are actually live code >used for OS X. So this isn't necessarily dead code, some of it is >merely long-out-of-date comments. > >p.p.s. At least there were no references to "taligent", "copland", or >"gershwin"...! > > >------------------------------------------------------------------------ > >_______________________________________________ >Python-Dev mailing list >Python-Dev at python.org >https://mail.python.org/mailman/listinfo/python-dev >Unsubscribe: >https://mail.python.org/mailman/options/python-dev/rymg19%40gmail.com -- Sent from my Nexus 5 with K-9 Mail. Please excuse my brevity. -------------- next part -------------- An HTML attachment was scrubbed... URL: From barry at python.org Thu Jan 28 11:00:31 2016 From: barry at python.org (Barry Warsaw) Date: Thu, 28 Jan 2016 11:00:31 -0500 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: References: <56AA2C50.7050700@hastings.org> Message-ID: <20160128110031.7e7e2295@subdivisions.wooz.org> Just as long as you can still build and run Python on Guido's ancient SGI machine . -Barry From victor.stinner at gmail.com Thu Jan 28 11:29:41 2016 From: victor.stinner at gmail.com (Victor Stinner) Date: Thu, 28 Jan 2016 17:29:41 +0100 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: <56AA2C50.7050700@hastings.org> References: <56AA2C50.7050700@hastings.org> Message-ID: We slowly remove old platforms, but only if the code specific to these old platforms is annoying to maintain. For example, I wrote the change: https://hg.python.org/cpython/rev/a1605d2508af """ Issue #22591: Drop support of MS-DOS Drop support of MS-DOS, especially of the DJGPP compiler (MS-DOS port of GCC). Today is a sad day. 
Good bye MS-DOS, good bye my friend :'-( """ => http://bugs.python.org/issue22591 Victor 2016-01-28 15:57 GMT+01:00 Larry Hastings : > > > Check out and cd into Python trunk. > > % grep -Ri win16 * | wc > 10 66 625 > > % grep -Ri nextstep | wc > 23 119 1328 > > % grep -Ri rhapsody * | wc > 47 269 3390 > > % grep -Ri msdos * | wc > 56 381 3895 > % grep -Ri ms-dos * | wc > 20 180 1425 > > > win16! *laughs* *wipes tear from eye* > > It's currently 2016. Perhaps it's time to remove all vestiges of these > unsupported operating systems nobody's cared about since a year that started > with a '1'? > > > /arry > > p.s. I suspect some of the uses of "rhapsody" are actually live code used > for OS X. So this isn't necessarily dead code, some of it is merely > long-out-of-date comments. > > p.p.s. At least there were no references to "taligent", "copland", or > "gershwin"...! > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/victor.stinner%40gmail.com > From pmiscml at gmail.com Thu Jan 28 13:38:36 2016 From: pmiscml at gmail.com (Paul Sokolovsky) Date: Thu, 28 Jan 2016 20:38:36 +0200 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: References: <56AA2C50.7050700@hastings.org> Message-ID: <20160128203836.7e65ed2d@x230> Hello, On Thu, 28 Jan 2016 17:29:41 +0100 Victor Stinner wrote: > We slowly remove old platforms, but only if the code specific to these > old platforms is annoying to maintain. For example, I wrote the > change: > > https://hg.python.org/cpython/rev/a1605d2508af > """ > Issue #22591: Drop support of MS-DOS > > Drop support of MS-DOS, especially of the DJGPP compiler (MS-DOS port > of GCC). > > Today is a sad day. 
Good bye MS-DOS, good bye my friend :'-( > """ > => http://bugs.python.org/issue22591 Well, MicroPython is ready to take the baton - FreeDOS (that's how they call MS-DOS now) support was added recently: https://github.com/micropython/micropython/commit/64a909ef5113925adef19f275f62473de8ee68c5 > > Victor > [] -- Best regards, Paul mailto:pmiscml at gmail.com From rosuav at gmail.com Thu Jan 28 22:47:03 2016 From: rosuav at gmail.com (Chris Angelico) Date: Fri, 29 Jan 2016 14:47:03 +1100 Subject: [Python-Dev] Buildbot timing out - test suite failure - test_socket issue with UDP6? In-Reply-To: References: Message-ID: On Thu, Jan 28, 2016 at 9:41 PM, Chris Angelico wrote: > However, as I was doing so (and I just discarded a draft message where > I'd been typing up notes), my entire system went kerblooie, and I've > spent the last day rebuilding stuff from scratch. When I get around to > it, I'll rebuild the buildbot VM - and it'll be Debian Jessie (current > stable), because there's no particular reason to use Wheezy > (oldstable). So the problem will almost certainly disappear. Sure enough, the problem is no longer reproducible. Sorry folks. Hopefully it wasn't actually a bug anywhere, but was some bizarre piece of VM setup weirdness. The "AMD64 Debian root" buildbot (angelico-debian-amd64) is now running Debian Jessie, rather than the Wheezy it was before. ChrisA From pludemann at google.com Fri Jan 29 03:05:09 2016 From: pludemann at google.com (Peter Ludemann) Date: Fri, 29 Jan 2016 00:05:09 -0800 Subject: [Python-Dev] FAT Python (lack of) performance In-Reply-To: References: <56A698CF.3080402@mail.de> <56A7AE7C.3080806@mail.de> <56A90F2C.9000401@mail.de> Message-ID: About benchmarks ... I've been there and it's not benchmarks that decide whether something succeeds or fails. (I found this old discussion which mentions FIB (also TAK, which is rather more brutal) ...
do you recognize the language that got an implementation that was competitive with C in performance, was vastly more expressive, yet failed to catch on?) OTOH, good performance is never a bad thing and sometimes is a necessity; so I applaud this work. On 28 January 2016 at 01:39, Nick Coghlan wrote: > On 28 January 2016 at 18:30, INADA Naoki wrote: > > Please stop. > > > > I'm sorry about messing up this thread. > > Not your fault at all! This is just a particular bugbear of mine, > since software architecture design (including appropriate programming > language selection) is an even more poorly understood discipline than > software development in general :) > > > I just wanted to represent why I'm very interested in Victor's efforts. > > And thanks for posting that, as it is indeed cool that the > optimisation efforts currently being discussed may result in > performance improvements on some of the simplified micro-benchmarks > popular in programming language shootouts. > > There's no way you could have anticipated the subsequent tangential > discussion on motives for contributing to open source projects, and > the impact that has on what we can reasonably expect from fellow > contributors. > > Cheers, > Nick. > > -- > Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/pludemann%40google.com > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From stefan_ml at behnel.de Fri Jan 29 05:00:02 2016 From: stefan_ml at behnel.de (Stefan Behnel) Date: Fri, 29 Jan 2016 11:00:02 +0100 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A90B97.7090001@gmail.com> References: <56A90B97.7090001@gmail.com> Message-ID: Yury Selivanov schrieb am 27.01.2016 um 19:25: > tl;dr The summary is that I have a patch that improves CPython performance > up to 5-10% on macro benchmarks. Benchmarks results on Macbook Pro/Mac OS > X, desktop CPU/Linux, server CPU/Linux are available at [1]. There are no > slowdowns that I could reproduce consistently. > > There are two different optimizations that yield this speedup: > LOAD_METHOD/CALL_METHOD opcodes and per-opcode cache in ceval loop. > > LOAD_METHOD & CALL_METHOD > ------------------------- > > We had a lot of conversations with Victor about his PEP 509, and he sent me > a link to his amazing compilation of notes about CPython performance [2]. > One optimization that he pointed out to me was LOAD/CALL_METHOD opcodes, an > idea first originated in PyPy. > > There is a patch that implements this optimization, it's tracked here: > [3]. There are some low level details that I explained in the issue, but > I'll go over the high level design in this email as well. > > Every time you access a method attribute on an object, a BoundMethod object > is created. It is a fairly expensive operation, despite a freelist of > BoundMethods (so that memory allocation is generally avoided). The idea is > to detect what looks like a method call in the compiler, and emit a pair of > specialized bytecodes for that. > > So instead of LOAD_GLOBAL/LOAD_ATTR/CALL_FUNCTION we will have > LOAD_GLOBAL/LOAD_METHOD/CALL_METHOD. > > LOAD_METHOD looks at the object on top of the stack, and checks if the name > resolves to a method or to a regular attribute. If it's a method, then we > push the unbound method object and the object to the stack. 
If it's an > attribute, we push the resolved attribute and NULL. > > When CALL_METHOD looks at the stack it knows how to call the unbound method > properly (pushing the object as a first arg), or how to call a regular > callable. > > This idea does make CPython faster around 2-4%. And it surely doesn't make > it slower. I think it's a safe bet to at least implement this optimization > in CPython 3.6. > > So far, the patch only optimizes positional-only method calls. It's > possible to optimize all kind of calls, but this will necessitate 3 more > opcodes (explained in the issue). We'll need to do some careful > benchmarking to see if it's really needed. I implemented a similar but simpler optimisation in Cython a while back: http://blog.behnel.de/posts/faster-python-calls-in-cython-021.html Instead of avoiding the creation of method objects, as you proposed, it just normally calls getattr and if that returns a bound method object, it uses inlined calling code that avoids re-packing the argument tuple. Interestingly, I got speedups of 5-15% for some of the Python benchmarks, but I don't quite remember which ones (at least raytrace and richards, I think), nor do I recall the overall gain, which (I assume) is what you are referring to with your 2-4% above. Might have been in the same order. Stefan From damien.p.george at gmail.com Fri Jan 29 07:38:53 2016 From: damien.p.george at gmail.com (Damien George) Date: Fri, 29 Jan 2016 12:38:53 +0000 Subject: [Python-Dev] Speeding up CPython 5-10% Message-ID: Hi Yury, > An off-topic: have you ever tried hg.python.org/benchmarks > or compare MicroPython vs CPython? I'm curious if MicroPython > is faster -- in that case we'll try to copy some optimization > ideas. I've tried a small number of those benchmarks, but not in any rigorous way, and not enough to compare properly with CPython. 
Maybe one day I (or someone) will get to it and report results :) One thing that makes MP fast is the use of pointer tagging and stuffing of small integers within object pointers. Thus integer arithmetic below 2**30 (on 32-bit arch) requires no heap. > Do you use opcode dictionary caching only for LOAD_GLOBAL-like > opcodes? Do you have an equivalent of LOAD_FAST, or you use > dicts to store local variables? The opcodes that have dict caching are: LOAD_NAME LOAD_GLOBAL LOAD_ATTR STORE_ATTR LOAD_METHOD (not implemented yet in mainline repo) For local variables we use LOAD_FAST and STORE_FAST (and DELETE_FAST). Actually, there are 16 dedicated opcodes for loading from positions 0-15, and 16 for storing to these positions. Eg: LOAD_FAST_0 LOAD_FAST_1 ... Mostly this is done to save RAM, since LOAD_FAST_0 is 1 byte. > If we change the opcode size, it will probably affect libraries > that compose or modify code objects. Modules like "dis" will > also need to be updated. And that's probably just the tip of the > iceberg. > > We can still implement your approach if we add a separate > private 'unsigned char' array to each code object, so that > LOAD_GLOBAL can store the key offsets. It should be a bit > faster than my current patch, since it has one less level > of indirection. But this way we lose the ability to > optimize LOAD_METHOD, simply because it requires more memory > for its cache. In any case, I'll experiment! Problem with that approach (having a separate array for offset_guess) is: how do you know where to look into that array for a given LOAD_GLOBAL opcode? The second LOAD_GLOBAL in your bytecode should look into the second entry in the array, but how does it know? I'd love to experiment implementing my original caching idea with CPython, but no time! Cheers, Damien.
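The pointer-tagging trick Damien describes can be modelled in a few lines. This is an illustrative sketch of the general technique for a 32-bit word; the helper names and exact layout are invented for the example, not MicroPython's actual representation:

```python
# Model of a 32-bit tagged word: heap pointers are word-aligned, so their
# low bit is always 0, which frees that bit to mark inline "small ints".
# A small int is stored shifted left by 1 with the low bit set, leaving
# 31 bits of signed payload -- hence arithmetic below 2**30 needs no heap.

def make_small_int(n):
    assert -(2 ** 30) <= n < 2 ** 30, "out of range: would need a heap object"
    return ((n << 1) | 1) & 0xFFFFFFFF  # truncate to a 32-bit word

def is_small_int(word):
    return word & 1 == 1  # low bit set => inline int; clear => heap pointer

def small_int_value(word):
    value = word >> 1
    # Restore the sign for negative payloads (word is an unsigned 32-bit view).
    return value - (1 << 31) if word & 0x80000000 else value

def add_small_ints(a, b):
    # Untag, add, retag: no allocation anywhere on this path.
    return make_small_int(small_int_value(a) + small_int_value(b))

word = add_small_ints(make_small_int(20), make_small_int(22))
assert is_small_int(word) and small_int_value(word) == 42
assert small_int_value(make_small_int(-7)) == -7
```

Because object pointers in CPython are also aligned, the same low-bit trick is available in principle; the cost is an extra tag check on every operation that touches an object word.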
From yselivanov.ml at gmail.com Fri Jan 29 09:49:11 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 29 Jan 2016 09:49:11 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: References: <56A90B97.7090001@gmail.com> Message-ID: <56AB7BE7.9060902@gmail.com> On 2016-01-29 5:00 AM, Stefan Behnel wrote: > Yury Selivanov schrieb am 27.01.2016 um 19:25: >> [..] >> >> LOAD_METHOD looks at the object on top of the stack, and checks if the name >> resolves to a method or to a regular attribute. If it's a method, then we >> push the unbound method object and the object to the stack. If it's an >> attribute, we push the resolved attribute and NULL. >> >> When CALL_METHOD looks at the stack it knows how to call the unbound method >> properly (pushing the object as a first arg), or how to call a regular >> callable. >> >> This idea does make CPython faster around 2-4%. And it surely doesn't make >> it slower. I think it's a safe bet to at least implement this optimization >> in CPython 3.6. >> >> So far, the patch only optimizes positional-only method calls. It's >> possible to optimize all kind of calls, but this will necessitate 3 more >> opcodes (explained in the issue). We'll need to do some careful >> benchmarking to see if it's really needed. > I implemented a similar but simpler optimisation in Cython a while back: > > http://blog.behnel.de/posts/faster-python-calls-in-cython-021.html > > Instead of avoiding the creation of method objects, as you proposed, it > just normally calls getattr and if that returns a bound method object, it > uses inlined calling code that avoids re-packing the argument tuple. > Interestingly, I got speedups of 5-15% for some of the Python benchmarks, > but I don't quite remember which ones (at least raytrace and richards, I > think), nor do I recall the overall gain, which (I assume) is what you are > referring to with your 2-4% above. Might have been in the same order. > That's great! 
I'm still working on the patch, but so far it looks like adding just LOAD_METHOD/CALL_METHOD (that avoid instantiating BoundMethods) gives us 10-15% faster method calls. Combining them with my opcode cache makes them 30-35% faster. Yury From yselivanov.ml at gmail.com Fri Jan 29 10:06:38 2016 From: yselivanov.ml at gmail.com (Yury Selivanov) Date: Fri, 29 Jan 2016 10:06:38 -0500 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: References: Message-ID: <56AB7FFE.7010004@gmail.com> Hi Damien, BTW I just saw (and backed!) your new Kickstarter campaign to port MicroPython to ESP8266, good stuff! On 2016-01-29 7:38 AM, Damien George wrote: > Hi Yury, > > [..] >> Do you use opcode dictionary caching only for LOAD_GLOBAL-like >> opcodes? Do you have an equivalent of LOAD_FAST, or you use >> dicts to store local variables? > The opcodes that have dict caching are: > > LOAD_NAME > LOAD_GLOBAL > LOAD_ATTR > STORE_ATTR > LOAD_METHOD (not implemented yet in mainline repo) > > For local variables we use LOAD_FAST and STORE_FAST (and DELETE_FAST). > Actually, there are 16 dedicated opcodes for loading from positions > 0-15, and 16 for storing to these positions. Eg: > > LOAD_FAST_0 > LOAD_FAST_1 > ... > > Mostly this is done to save RAM, since LOAD_FAST_0 is 1 byte. Interesting. This might actually make CPython slightly faster too. Worth trying. > >> If we change the opcode size, it will probably affect libraries >> that compose or modify code objects. Modules like "dis" will >> also need to be updated. And that's probably just the tip of the >> iceberg. >> >> We can still implement your approach if we add a separate >> private 'unsigned char' array to each code object, so that >> LOAD_GLOBAL can store the key offsets. It should be a bit >> faster than my current patch, since it has one less level >> of indirection. But this way we lose the ability to >> optimize LOAD_METHOD, simply because it requires more memory >> for its cache. In any case, I'll experiment!
> Problem with that approach (having a separate array for offset_guess) > is: how do you know where to look into that array for a given > LOAD_GLOBAL opcode? The second LOAD_GLOBAL in your bytecode should > look into the second entry in the array, but how does it know? > > I've changed my approach a little bit. Now I have a simple function [1] to initialize the cache for code objects that are called frequently enough. It walks through the code object's opcodes and creates the appropriate offset/cache tables. Then, in the ceval loop I have a couple of convenient macros to work with the cache [2]. They use the INSTR_OFFSET() macro to locate the cache entry via the offset table initialized by [1]. Thanks, Yury [1] https://github.com/1st1/cpython/blob/opcache4/Objects/codeobject.c#L167 [2] https://github.com/1st1/cpython/blob/opcache4/Python/ceval.c#L1164 From steve.dower at python.org Fri Jan 29 12:05:18 2016 From: steve.dower at python.org (Steve Dower) Date: Fri, 29 Jan 2016 09:05:18 -0800 Subject: [Python-Dev] More optimisation ideas Message-ID: <56AB9BCE.2080000@python.org> Since we're all talking about making Python faster, I thought I'd drop some previous ideas I've had here in case (1) someone wants to actually do them, and (2) they really are new ideas that haven't failed in the past. Mostly I was thinking about startup time.
Here is the list of modules imported on clean startup on my Windows, US-English machine (from -v and cleaned up a bit):

import _frozen_importlib
import _imp
import sys
import '_warnings'
import '_thread'
import '_weakref'
import '_frozen_importlib_external'
import '_io'
import 'marshal'
import 'nt'
import '_thread'
import '_weakref'
import 'winreg'
import 'zipimport'
import '_codecs'
import 'codecs'
import 'encodings.aliases'
import 'encodings'
import 'encodings.mbcs'
import '_signal'
import 'encodings.utf_8'
import 'encodings.latin_1'
import '_weakrefset'
import 'abc'
import 'io'
import 'encodings.cp437'
import 'errno'
import '_stat'
import 'stat'
import 'genericpath'
import 'ntpath'
import '_collections_abc'
import 'os'
import '_sitebuiltins'
import 'sysconfig'
import '_locale'
import '_bootlocale'
import 'encodings.cp1252'
import 'site'

Obviously the easiest first thing is to remove or delay unnecessary imports. But a while ago I used a native profiler to trace through this and the most impactful modules were the encodings:

import 'encodings.mbcs'
import 'encodings.utf_8'
import 'encodings.latin_1'
import 'encodings.cp437'
import 'encodings.cp1252'

While I don't doubt that we need all of these for *some* reason, aliases, cp437 and cp1252 are relatively expensive modules to import, mostly due to having large static dictionaries or data structures generated on startup. Given this is static and mostly read-only information[1], I see no reason why we couldn't either generate completely static versions of them, or better yet compile the resulting data structures into the core binary. ([1]: If being able to write to some of the encoding data is used by some people, I vote for breaking that for 3.6 and making it read-only.)
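To make the import-time cost concrete: each charmap codec module rebuilds its encode mapping from a 256-character decoding table when it is imported. Here is a minimal sketch of that work with a toy identity table (a real codec's table differs in the 0x80-0xFF range); codecs.charmap_build is the same helper the generated encodings modules call:

```python
import codecs

# Toy decoding table in the style of Lib/encodings/cp437.py: a 256-character
# string mapping each byte value to its decoded code point.  For brevity this
# one is the identity (Latin-1-like) mapping; a real table is different in
# the upper half.
decoding_table = ''.join(map(chr, range(256)))

# This runs at import time in every charmap codec module: it derives the
# reverse (encode) mapping from the 256-entry decode table.
encoding_table = codecs.charmap_build(decoding_table)

# Round trip through the two tables, as the codec's encode()/decode() do:
data, consumed = codecs.charmap_encode('abc', 'strict', encoding_table)
text, _ = codecs.charmap_decode(data, 'strict', decoding_table)
assert data == b'abc' and text == 'abc'
```

Since the decoding table is a compile-time constant in each module, both it and the derived encode mapping could in principle be emitted as static data instead of being rebuilt on every interpreter start.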
This is probably the code snippet that bothered me the most: ### Encoding table encoding_table=codecs.charmap_build(decoding_table) It shows up in many of the encodings modules, and while it is not a bad function in itself, we are obviously generating a known data structure on every startup. Storing these in static data is a tradeoff between disk space and startup performance, and one I think it likely to be worthwhile. Anyway, just an idea if someone wants to try it and see what improvements we can get. I'd love to do it myself, but when it actually comes to finding time I keep coming up short. Cheers, Steve P.S. If you just want to discuss optimisation techniques or benchmarking in general, without specific application to CPython 3.6, there's a whole internet out there. Please don't make me the cause of a pointless centithread. :) From status at bugs.python.org Fri Jan 29 12:08:30 2016 From: status at bugs.python.org (Python tracker) Date: Fri, 29 Jan 2016 18:08:30 +0100 (CET) Subject: [Python-Dev] Summary of Python tracker Issues Message-ID: <20160129170830.60F3156645@psf.upfronthosting.co.za> ACTIVITY SUMMARY (2016-01-22 - 2016-01-29) Python tracker at http://bugs.python.org/ To view or respond to any of the issues listed below, click on the issue. Do NOT respond to this message. 
Issues counts and deltas: open 5381 (+27) closed 32615 (+30) total 37996 (+57) Open issues with patches: 2359 Issues opened (39) ================== #26182: Deprecation warnings for the future async and await keywords http://bugs.python.org/issue26182 opened by marco.buttu #26184: raise an error when create_module() is not defined by exec_mod http://bugs.python.org/issue26184 opened by brett.cannon #26185: zipfile.ZipInfo slots can raise unexpected AttributeError http://bugs.python.org/issue26185 opened by Matthew Zipay #26186: LazyLoader rejecting use of SourceFileLoader http://bugs.python.org/issue26186 opened by brett.cannon #26187: sqlite3 trace callback prints duplicate line http://bugs.python.org/issue26187 opened by palaviv #26188: Provide more helpful error message when `await` is called insi http://bugs.python.org/issue26188 opened by Nicholas Chammas #26192: python3 k1om dissociation permanence: libffi http://bugs.python.org/issue26192 opened by mancoast #26193: python3 k1om dissociation permanence: readelf http://bugs.python.org/issue26193 opened by mancoast #26194: Undefined behavior for deque.insert() when len(d) == maxlen http://bugs.python.org/issue26194 opened by rhettinger #26195: Windows frozen .exe multiprocessing.Queue access is denied exc http://bugs.python.org/issue26195 opened by alex_python_org #26198: PyArg_ParseTuple with format "et#" and "es#" detects overflow http://bugs.python.org/issue26198 opened by hniksic #26200: SETREF adds unnecessary work in some cases http://bugs.python.org/issue26200 opened by rhettinger #26204: compiler: ignore constants used as statements? (don't emit LOA http://bugs.python.org/issue26204 opened by haypo #26205: Inconsistency concerning nested scopes http://bugs.python.org/issue26205 opened by Roscoe R. 
Higgins #26207: distutils msvccompiler fails due to mspdb140.dll error on debu http://bugs.python.org/issue26207 opened by haypo #26208: decimal C module's exceptions don't match the Python version http://bugs.python.org/issue26208 opened by encukou #26209: TypeError in smtpd module with string arguments http://bugs.python.org/issue26209 opened by lorenzo.ancora #26210: `HTMLParser.handle_data` may be invoked although `HTMLParser.r http://bugs.python.org/issue26210 opened by Hibou57 #26212: Python with ncurses6.0 will not load _curses module on Solaris http://bugs.python.org/issue26212 opened by jonesrw #26213: Document BUILD_*_UNPACK opcodes http://bugs.python.org/issue26213 opened by brett.cannon #26214: textwrap should minimize number of breaks in extra long words http://bugs.python.org/issue26214 opened by Tuomas Salo #26215: Make GC_Head a compile-time option http://bugs.python.org/issue26215 opened by yuriy_levchenko #26216: run runtktests.py error when test tkinter http://bugs.python.org/issue26216 opened by allensll #26218: Set PrependPath default to true http://bugs.python.org/issue26218 opened by Wallison Resende Santos #26219: implement per-opcode cache in ceval http://bugs.python.org/issue26219 opened by yselivanov #26221: asynco run_in_executor swallows StopIteration http://bugs.python.org/issue26221 opened by ikelly #26222: Missing code in linux_distribution python 2.7.11 http://bugs.python.org/issue26222 opened by Rasmus Rynning Rasmussen #26223: decimal.to_eng_string() does not implement engineering notatio http://bugs.python.org/issue26223 opened by serge.stroobandt #26224: Add "version added" for documentation of asyncio.timeout for d http://bugs.python.org/issue26224 opened by Udi Oron #26225: New misleading wording in execution model documenation http://bugs.python.org/issue26225 opened by abarnert #26226: Various test suite failures on Windows http://bugs.python.org/issue26226 opened by ebarry #26228: pty.spawn hangs on FreeBSD 9.3, 10.x 
http://bugs.python.org/issue26228 opened by chris.torek #26229: Make number serialization ES6/V8 compatible http://bugs.python.org/issue26229 opened by anders.rundgren.net at gmail.com #26231: HTTPResponse.close() should consume all remaining data in body http://bugs.python.org/issue26231 opened by Jacky #26233: select.epoll.poll() should avoid calling malloc() each time http://bugs.python.org/issue26233 opened by haypo #26234: The typing module includes 're' and 'io' in __all__ http://bugs.python.org/issue26234 opened by gvanrossum #26235: argparse docs: Positional * argument in mutually exclusive gro http://bugs.python.org/issue26235 opened by paul.j3 #26236: urllib2 initiate irregular call to gethostbyaddr http://bugs.python.org/issue26236 opened by juliadolgova #26238: httplib use wrong hostname in https request with SNI support http://bugs.python.org/issue26238 opened by lvhancy Most recent 15 issues with no replies (15) ========================================== #26238: httplib use wrong hostname in https request with SNI support http://bugs.python.org/issue26238 #26236: urllib2 initiate irregular call to gethostbyaddr http://bugs.python.org/issue26236 #26235: argparse docs: Positional * argument in mutually exclusive gro http://bugs.python.org/issue26235 #26224: Add "version added" for documentation of asyncio.timeout for d http://bugs.python.org/issue26224 #26216: run runtktests.py error when test tkinter http://bugs.python.org/issue26216 #26214: textwrap should minimize number of breaks in extra long words http://bugs.python.org/issue26214 #26209: TypeError in smtpd module with string arguments http://bugs.python.org/issue26209 #26200: SETREF adds unnecessary work in some cases http://bugs.python.org/issue26200 #26195: Windows frozen .exe multiprocessing.Queue access is denied exc http://bugs.python.org/issue26195 #26193: python3 k1om dissociation permanence: readelf http://bugs.python.org/issue26193 #26192: python3 k1om dissociation permanence: libffi 
http://bugs.python.org/issue26192 #26187: sqlite3 trace callback prints duplicate line http://bugs.python.org/issue26187 #26185: zipfile.ZipInfo slots can raise unexpected AttributeError http://bugs.python.org/issue26185 #26184: raise an error when create_module() is not defined by exec_mod http://bugs.python.org/issue26184 #26176: EmailMessage example doesn't work http://bugs.python.org/issue26176 Most recent 15 issues waiting for review (15) ============================================= #26233: select.epoll.poll() should avoid calling malloc() each time http://bugs.python.org/issue26233 #26228: pty.spawn hangs on FreeBSD 9.3, 10.x http://bugs.python.org/issue26228 #26224: Add "version added" for documentation of asyncio.timeout for d http://bugs.python.org/issue26224 #26219: implement per-opcode cache in ceval http://bugs.python.org/issue26219 #26204: compiler: ignore constants used as statements? (don't emit LOA http://bugs.python.org/issue26204 #26198: PyArg_ParseTuple with format "et#" and "es#" detects overflow http://bugs.python.org/issue26198 #26194: Undefined behavior for deque.insert() when len(d) == maxlen http://bugs.python.org/issue26194 #26192: python3 k1om dissociation permanence: libffi http://bugs.python.org/issue26192 #26185: zipfile.ZipInfo slots can raise unexpected AttributeError http://bugs.python.org/issue26185 #26177: tkinter: Canvas().keys returns empty strings. 
http://bugs.python.org/issue26177 #26175: Fully implement IOBase abstract on SpooledTemporaryFile http://bugs.python.org/issue26175 #26173: test_ssl.bad_cert_test() exception handling http://bugs.python.org/issue26173 #26168: Py_BuildValue may leak 'N' arguments on PyTuple_New failure http://bugs.python.org/issue26168 #26167: Improve copy.copy speed for built-in types (list/set/dict) http://bugs.python.org/issue26167 #26145: PEP 511: Add sys.set_code_transformers() http://bugs.python.org/issue26145 Top 10 most discussed issues (10) ================================= #26194: Undefined behavior for deque.insert() when len(d) == maxlen http://bugs.python.org/issue26194 17 msgs #26207: distutils msvccompiler fails due to mspdb140.dll error on debu http://bugs.python.org/issue26207 12 msgs #26198: PyArg_ParseTuple with format "et#" and "es#" detects overflow http://bugs.python.org/issue26198 11 msgs #26039: More flexibility in zipfile interface http://bugs.python.org/issue26039 7 msgs #26219: implement per-opcode cache in ceval http://bugs.python.org/issue26219 7 msgs #26223: decimal.to_eng_string() does not implement engineering notatio http://bugs.python.org/issue26223 7 msgs #19883: Integer overflow in zipimport.c http://bugs.python.org/issue19883 6 msgs #20598: argparse docs: '7'.split() is confusing magic http://bugs.python.org/issue20598 6 msgs #25314: Documentation: argparse's actions store_{true,false} default t http://bugs.python.org/issue25314 6 msgs #26204: compiler: ignore constants used as statements? 
(don't emit LOA http://bugs.python.org/issue26204 6 msgs Issues closed (27) ================== #18018: SystemError: Parent module '' not loaded, cannot perform relat http://bugs.python.org/issue18018 closed by brett.cannon #18898: Apply the setobject optimizations to dictionaries http://bugs.python.org/issue18898 closed by rhettinger #19023: ctypes docs: Unimplemented and undocumented features http://bugs.python.org/issue19023 closed by martin.panter #22363: argparse AssertionError with add_mutually_exclusive_group and http://bugs.python.org/issue22363 closed by martin.panter #24705: sysconfig._parse_makefile doesn't expand ${} vars appearing be http://bugs.python.org/issue24705 closed by berker.peksag #25296: Simple End-of-life guide covering all unsupported versions http://bugs.python.org/issue25296 closed by berker.peksag #26034: venv documentation out of date http://bugs.python.org/issue26034 closed by berker.peksag #26146: PEP 511: Add ast.Constant to allow AST optimizer to emit const http://bugs.python.org/issue26146 closed by haypo #26181: argparse can't handle positional argument after list (help mes http://bugs.python.org/issue26181 closed by martin.panter #26183: 2.7.11 won't clean install on Windows 10 x64 http://bugs.python.org/issue26183 closed by Roger Cook #26189: Interpreter returns control to cmd.exe early http://bugs.python.org/issue26189 closed by Ivan.Pozdeev #26190: GC memory leak using weak and cyclic references http://bugs.python.org/issue26190 closed by koehlma #26191: pip on Windows doesn't honor Case http://bugs.python.org/issue26191 closed by SilentGhost #26196: Argparse breaks when a switch is given an argument beginning w http://bugs.python.org/issue26196 closed by eric.smith #26197: arange from numpy function has some limits....I propose a pyth http://bugs.python.org/issue26197 closed by ebarry #26199: fix broken link to hamcrest.library.integration.match_equality http://bugs.python.org/issue26199 closed by berker.peksag #26201: Faster 
type checking in listobject.c http://bugs.python.org/issue26201 closed by rhettinger #26202: The range() object is deepcopied as atomic http://bugs.python.org/issue26202 closed by serhiy.storchaka #26203: nesting venv in virtualenv segfaults http://bugs.python.org/issue26203 closed by brett.cannon #26206: test_socket.testRecvmsgPeek() timeout on "AMD64 Debian root 3. http://bugs.python.org/issue26206 closed by martin.panter #26211: HTMLParser: “AssertionError: we should not get here!” http://bugs.python.org/issue26211 closed by Hibou57 #26217: Fatal error when importing ``test.test_os`` in debug mode on W http://bugs.python.org/issue26217 closed by ebarry #26220: Unicode HOWTO references a question mark that isn't in snippet http://bugs.python.org/issue26220 closed by martin.panter #26227: Windows: socket.gethostbyaddr(name) fails for non-ASCII hostna http://bugs.python.org/issue26227 closed by haypo #26230: Cookies do not correct if cookiename includes [ or ] http://bugs.python.org/issue26230 closed by tom_lt #26232: Mock(spec=spec) has no effect http://bugs.python.org/issue26232 closed by michael.foord #26237: UnboundLocalError error while handling exception http://bugs.python.org/issue26237 closed by haypo From storchaka at gmail.com Fri Jan 29 13:14:00 2016 From: storchaka at gmail.com (Serhiy Storchaka) Date: Fri, 29 Jan 2016 20:14:00 +0200 Subject: [Python-Dev] How to resolve distinguishing between documentation and implementation Message-ID: How to resolve distinguishing between documentation and implementation if current implementation is incorrect, but third-party code can implicitly depends on it? For example see issue26198. Currently buffer overflow of predefined buffer for "es#" and "et#" format units causes TypeError (with misleading message, but this is other story). The correct and *documented* exception is ValueError. 
User code can depend on current behavior, because TypeError is what is raised now for this type of errors, and this is what is raised for other types of errors. Unlikely authors of such code read the documentation, otherwise this issue would be reported earlier. On other hand, looks these format units are rarely used with predefined buffer (never in the stdlib since 3.5). I think it is obvious that the code in the development branch should be changed to produce documented and more logical exception. But what about bugfix releases? Changing the documentation would be misleading, changing the code can break existing code (unlikely, but). From brett at python.org Fri Jan 29 13:26:32 2016 From: brett at python.org (Brett Cannon) Date: Fri, 29 Jan 2016 18:26:32 +0000 Subject: [Python-Dev] How to resolve distinguishing between documentation and implementation In-Reply-To: References: Message-ID: On Fri, 29 Jan 2016 at 10:14 Serhiy Storchaka wrote: > How to resolve distinguishing between documentation and implementation > if current implementation is incorrect, but third-party code can > implicitly depends on it? > > For example see issue26198. Currently buffer overflow of predefined > buffer for "es#" and "et#" format units causes TypeError (with > misleading message, but this is other story). The correct and > *documented* exception is ValueError. User code can depend on current > behavior, because TypeError is what is raised now for this type of > errors, and this is what is raised for other types of errors. Unlikely > authors of such code read the documentation, otherwise this issue would > be reported earlier. On other hand, looks these format units are rarely > used with predefined buffer (never in the stdlib since 3.5). > > I think it is obvious that the code in the development branch should be > changed to produce documented and more logical exception. But what about > bugfix releases? 
Changing the documentation would be misleading, > changing the code can break existing code (unlikely, but). > When the potential breakage is low, I would move to the more reasonable exception and add the appropriate note to the docs and What's New about how to port pre-existing code by either changing the exception caught or catching both exceptions until support < 3.6 can be dropped by the user. -------------- next part -------------- An HTML attachment was scrubbed... URL: From guido at python.org Fri Jan 29 13:24:48 2016 From: guido at python.org (Guido van Rossum) Date: Fri, 29 Jan 2016 10:24:48 -0800 Subject: [Python-Dev] How to resolve distinguishing between documentation and implementation In-Reply-To: References: Message-ID: Typically we fix this in the next feature release but not in bugfix releases, and that's what I recommend in this case. But deciding remains an art, not an exact science. People who catch specific lists of exceptions based on experience are bound to run into trouble occasionally, so I have little pity on them when they upgrade to the next feature release (especially when the new behavior is more logical and follows an established standard). For bugfix releases I like to be much more conservative, since people often don't have control over which bugfix release is installed. On Fri, Jan 29, 2016 at 10:14 AM, Serhiy Storchaka wrote: > How to resolve distinguishing between documentation and implementation if > current implementation is incorrect, but third-party code can implicitly > depends on it? > > For example see issue26198. Currently buffer overflow of predefined buffer > for "es#" and "et#" format units causes TypeError (with misleading message, > but this is other story). The correct and *documented* exception is > ValueError. User code can depend on current behavior, because TypeError is > what is raised now for this type of errors, and this is what is raised for > other types of errors. 
Unlikely authors of such code read the documentation, > otherwise this issue would be reported earlier. On other hand, looks these > format units are rarely used with predefined buffer (never in the stdlib > since 3.5). > > I think it is obvious that the code in the development branch should be > changed to produce documented and more logical exception. But what about > bugfix releases? Changing the documentation would be misleading, changing > the code can break existing code (unlikely, but). > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/guido%40python.org -- --Guido van Rossum (python.org/~guido) From francismb at email.de Fri Jan 29 16:54:35 2016 From: francismb at email.de (francismb) Date: Fri, 29 Jan 2016 22:54:35 +0100 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56AB9BCE.2080000@python.org> References: <56AB9BCE.2080000@python.org> Message-ID: <56ABDF9B.9040502@email.de> Hi, > > Storing these in static data is a tradeoff between > disk space and startup performance, and one I think it likely to be > worthwhile. it's really an important trade off? As far a I understand from your email those modules are always being loaded and the final data created. won't the space be there (on mem or disk)? Thanks in advance! francis From steve.dower at python.org Fri Jan 29 22:48:41 2016 From: steve.dower at python.org (Steve Dower) Date: Fri, 29 Jan 2016 19:48:41 -0800 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56ABDF9B.9040502@email.de> References: <56AB9BCE.2080000@python.org> <56ABDF9B.9040502@email.de> Message-ID: It doesn't currently end up on disk. Some tables are partially or completely stored on disk as Python source code (some are partially generated from simple rules), but others are generated by inverting those. 
That process takes time that could be avoided by storing the generated tables, and storing all of it in a format that doesn't require parsing, compiling and executing (such as a native array). Potentially it could be a win all around if we stopped including the (larger) source files, but that doesn't seem like a good idea for maintaining portability to other implementations. The main thought is making the compiler binary bigger to avoid generating encoding tables at startup. Top-posted from my Windows Phone -----Original Message----- From: "francismb" Sent: ?1/?29/?2016 13:56 To: "python-dev at python.org" Subject: Re: [Python-Dev] More optimisation ideas Hi, > > Storing these in static data is a tradeoff between > disk space and startup performance, and one I think it likely to be > worthwhile. it's really an important trade off? As far a I understand from your email those modules are always being loaded and the final data created. won't the space be there (on mem or disk)? Thanks in advance! francis _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org -------------- next part -------------- An HTML attachment was scrubbed... URL: From steve at pearwood.info Fri Jan 29 23:28:35 2016 From: steve at pearwood.info (Steven D'Aprano) Date: Sat, 30 Jan 2016 15:28:35 +1100 Subject: [Python-Dev] Speeding up CPython 5-10% In-Reply-To: <56A90B97.7090001@gmail.com> References: <56A90B97.7090001@gmail.com> Message-ID: <20160130042835.GJ4619@ando.pearwood.info> On Wed, Jan 27, 2016 at 01:25:27PM -0500, Yury Selivanov wrote: > Hi, > > > tl;dr The summary is that I have a patch that improves CPython > performance up to 5-10% on macro benchmarks. Benchmarks results on > Macbook Pro/Mac OS X, desktop CPU/Linux, server CPU/Linux are available > at [1]. 
There are no slowdowns that I could reproduce consistently. Have you looked at Cesare Di Mauro's wpython? As far as I know, it's now unmaintained, and the project repo on Google Code appears to be dead (I get a 404), but I understand that it was significantly faster than CPython back in the 2.6 days. https://wpython.googlecode.com/files/Beyond%20Bytecode%20-%20A%20Wordcode-based%20Python.pdf -- Steve From oscar.j.benjamin at gmail.com Sat Jan 30 08:54:54 2016 From: oscar.j.benjamin at gmail.com (Oscar Benjamin) Date: Sat, 30 Jan 2016 13:54:54 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ABDF9B.9040502@email.de> Message-ID: On 30 January 2016 at 03:48, Steve Dower wrote: > > It doesn't currently end up on disk. Some tables are partially or completely > stored on disk as Python source code (some are partially generated from > simple rules), but others are generated by inverting those. That process > takes time that could be avoided by storing the generated tables, and > storing all of it in a format that doesn't require parsing, compiling and > executing (such as a native array). > > Potentially it could be a win all around if we stopped including the > (larger) source files, but that doesn't seem like a good idea for > maintaining portability to other implementations. The main thought is making > the compiler binary bigger to avoid generating encoding tables at startup. When I last tried to profile startup on Windows (I haven't used Windows for some time now) it seemed that the time was totally dominated by file system access. Essentially the limiting factor was the inordinate number of stat calls and small file accesses. Although this was probably Python 2.x which may not import those particular modules and maybe it depends on virus scanner software etc. Things may have changed now but I concluded that substantive gains could only come from improving FS access. 
Perhaps something like zipping up the standard library would see a big improvement. -- Oscar From storchaka at gmail.com Sat Jan 30 09:45:44 2016 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 30 Jan 2016 16:45:44 +0200 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56AB9BCE.2080000@python.org> References: <56AB9BCE.2080000@python.org> Message-ID: On 29.01.16 19:05, Steve Dower wrote: > This is probably the code snippet that bothered me the most: > > ### Encoding table > encoding_table=codecs.charmap_build(decoding_table) > > It shows up in many of the encodings modules, and while it is not a bad > function in itself, we are obviously generating a known data structure > on every startup. Storing these in static data is a tradeoff between > disk space and startup performance, and one I think it likely to be > worthwhile. $ ./python -m timeit -s "import codecs; from encodings.cp437 import decoding_table" -- "codecs.charmap_build(decoding_table)" 100000 loops, best of 3: 4.36 usec per loop Getting rid from charmap_build() would save you at most 4.4 microseconds per encoding. 0.0005 seconds if you have imported *all* standard encodings! And how you expected to store encoding_table in more efficient way? From steve.dower at python.org Sat Jan 30 11:31:32 2016 From: steve.dower at python.org (Steve Dower) Date: Sat, 30 Jan 2016 08:31:32 -0800 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: <56ACE564.7080107@python.org> On 30Jan2016 0645, Serhiy Storchaka wrote: > $ ./python -m timeit -s "import codecs; from encodings.cp437 import > decoding_table" -- "codecs.charmap_build(decoding_table)" > 100000 loops, best of 3: 4.36 usec per loop > > Getting rid from charmap_build() would save you at most 4.4 microseconds > per encoding. 0.0005 seconds if you have imported *all* standard encodings! Just as happy to be proven wrong. 
Perhaps I misinterpreted my original profiling and then, embarrassingly, ran with the result for a long time without retesting. > And how you expected to store encoding_table in more efficient way? There's nothing inefficient about its storage, but as it does not change it would be trivial to store it statically. Then "building" the map is simply obtaining a pointer into an already loaded memory page. Much faster than building it on load, but both are clearly insignificant compared to other factors. Cheers, Steve From storchaka at gmail.com Sat Jan 30 13:20:08 2016 From: storchaka at gmail.com (Serhiy Storchaka) Date: Sat, 30 Jan 2016 20:20:08 +0200 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56ACE564.7080107@python.org> References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> Message-ID: On 30.01.16 18:31, Steve Dower wrote: > On 30Jan2016 0645, Serhiy Storchaka wrote: >> $ ./python -m timeit -s "import codecs; from encodings.cp437 import >> decoding_table" -- "codecs.charmap_build(decoding_table)" >> 100000 loops, best of 3: 4.36 usec per loop >> >> Getting rid from charmap_build() would save you at most 4.4 microseconds >> per encoding. 0.0005 seconds if you have imported *all* standard >> encodings! > > Just as happy to be proven wrong. Perhaps I misinterpreted my original > profiling and then, embarrassingly, ran with the result for a long time > without retesting. AFAIK the most time is spent in system calls like stat or open. Archiving the stdlib into the ZIP file and using zipimport can decrease Python startup time (perhaps there is an open issue about this). 
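[Editor's note: Serhiy's suggestion is easy to try on a small scale, since the built-in zipimport hook already handles zip archives placed on `sys.path`. A sketch; the module name `demo_zipped` is invented for illustration.]

```python
import os
import sys
import tempfile
import zipfile

# Pack a module into a zip archive, then import it through zipimport
# simply by putting the archive on sys.path -- the same mechanism that
# would serve a zipped stdlib.
tmpdir = tempfile.mkdtemp()
archive = os.path.join(tmpdir, "lib.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("demo_zipped.py", "ANSWER = 42\n")

sys.path.insert(0, archive)
import demo_zipped
```

The win comes from replacing many per-module stat/open calls with one open of the archive plus in-memory lookups of its directory.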
From brett at python.org Sat Jan 30 14:09:43 2016 From: brett at python.org (Brett Cannon) Date: Sat, 30 Jan 2016 19:09:43 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> Message-ID: On Sat, 30 Jan 2016 at 10:21 Serhiy Storchaka wrote: > On 30.01.16 18:31, Steve Dower wrote: > > On 30Jan2016 0645, Serhiy Storchaka wrote: > >> $ ./python -m timeit -s "import codecs; from encodings.cp437 import > >> decoding_table" -- "codecs.charmap_build(decoding_table)" > >> 100000 loops, best of 3: 4.36 usec per loop > >> > >> Getting rid from charmap_build() would save you at most 4.4 microseconds > >> per encoding. 0.0005 seconds if you have imported *all* standard > >> encodings! > > > > Just as happy to be proven wrong. Perhaps I misinterpreted my original > > profiling and then, embarrassingly, ran with the result for a long time > > without retesting. > > AFAIK the most time is spent in system calls like stat or open. > Archiving the stdlib into the ZIP file and using zipimport can decrease > Python startup time (perhaps there is an open issue about this). > Check the archives, but I did trying freezing the entire stdlib and it didn't really make a difference in startup, so I don't know if this still holds true anymore. At this point I think all of our knowledge of what takes the most amount of time during startup is outdated and someone should try to really profile the whole thing to see where the hotspots are (e.g., is it stat calls from imports, is it actually some specific function, is it just so many little things adding up to a big thing, etc.). -------------- next part -------------- An HTML attachment was scrubbed... 
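[Editor's note: a starting point for the profiling Brett asks for, using only stock tools: run a fresh interpreter under cProfile, sorted by cumulative time, so import-machinery hotspots float to the top. `encodings.cp437` is just an example import, and on some platforms it is already loaded during interpreter startup.]

```python
import os
import subprocess
import sys
import tempfile

# Write a one-line script, then profile everything a fresh interpreter
# does to run it; the cumulative sort surfaces the import machinery.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("import encodings.cp437\n")
    script = f.name

profile = subprocess.run(
    [sys.executable, "-m", "cProfile", "-s", "cumulative", script],
    stdout=subprocess.PIPE, check=True,
).stdout.decode()
os.unlink(script)
```

This only profiles work done after the interpreter is up; time spent in C before `Py_Main` still needs a native profiler, as Steve used originally.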
URL: From steve.dower at python.org Sat Jan 30 14:15:26 2016 From: steve.dower at python.org (Steve Dower) Date: Sat, 30 Jan 2016 11:15:26 -0800 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> Message-ID: Brett tried freezing the entire stdlib at one point (as we do for parts of importlib) and reported no significant improvement. Since that rules out code compilation as well as the OS calls, it'd seem the priority is to execute less code on startup. Details of that work were posted to python-dev about twelve months ago, IIRC. Maybe a little longer. Top-posted from my Windows Phone -----Original Message----- From: "Serhiy Storchaka" Sent: ?1/?30/?2016 10:22 To: "python-dev at python.org" Subject: Re: [Python-Dev] More optimisation ideas On 30.01.16 18:31, Steve Dower wrote: > On 30Jan2016 0645, Serhiy Storchaka wrote: >> $ ./python -m timeit -s "import codecs; from encodings.cp437 import >> decoding_table" -- "codecs.charmap_build(decoding_table)" >> 100000 loops, best of 3: 4.36 usec per loop >> >> Getting rid from charmap_build() would save you at most 4.4 microseconds >> per encoding. 0.0005 seconds if you have imported *all* standard >> encodings! > > Just as happy to be proven wrong. Perhaps I misinterpreted my original > profiling and then, embarrassingly, ran with the result for a long time > without retesting. AFAIK the most time is spent in system calls like stat or open. Archiving the stdlib into the ZIP file and using zipimport can decrease Python startup time (perhaps there is an open issue about this). _______________________________________________ Python-Dev mailing list Python-Dev at python.org https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From srkunze at mail.de Sat Jan 30 15:29:20 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Sat, 30 Jan 2016 21:29:20 +0100 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> Message-ID: <56AD1D20.5090205@mail.de> On 30.01.2016 19:20, Serhiy Storchaka wrote: > AFAIK the most time is spent in system calls like stat or open. > Archiving the stdlib into the ZIP file and using zipimport can > decrease Python startup time (perhaps there is an open issue about this). Oh, please don't. One thing I love about Python is the ease of access. I personally think that startup time is not really a big issue; even when it comes to microbenchmarks. Best, Sven From brett at python.org Sat Jan 30 15:32:55 2016 From: brett at python.org (Brett Cannon) Date: Sat, 30 Jan 2016 20:32:55 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56AD1D20.5090205@mail.de> References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> <56AD1D20.5090205@mail.de> Message-ID: On Sat, Jan 30, 2016, 12:30 Sven R. Kunze wrote: > On 30.01.2016 19:20, Serhiy Storchaka wrote: > > AFAIK the most time is spent in system calls like stat or open. > > Archiving the stdlib into the ZIP file and using zipimport can > > decrease Python startup time (perhaps there is an open issue about this). > > Oh, please don't. One thing I love about Python is the ease of access. > It wouldn't be a requirement, just a notion > I personally think that startup time is not really a big issue; even > when it comes to microbenchmarks. > You might not, but just about every command-line app does. 
-brett > Best, > Sven > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From srkunze at mail.de Sat Jan 30 16:50:13 2016 From: srkunze at mail.de (Sven R. Kunze) Date: Sat, 30 Jan 2016 22:50:13 +0100 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> <56AD1D20.5090205@mail.de> Message-ID: <56AD3015.6010003@mail.de> On 30.01.2016 21:32, Brett Cannon wrote: > On Sat, Jan 30, 2016, 12:30 Sven R. Kunze > wrote: > > On 30.01.2016 19:20, Serhiy Storchaka wrote: > > AFAIK the most time is spent in system calls like stat or open. > > Archiving the stdlib into the ZIP file and using zipimport can > > decrease Python startup time (perhaps there is an open issue > about this). > > Oh, please don't. One thing I love about Python is the ease of access. > > > It wouldn't be a requirement, just a notion > That's good. :) -------------- next part -------------- An HTML attachment was scrubbed... URL: From larry at hastings.org Sun Jan 31 10:11:06 2016 From: larry at hastings.org (Larry Hastings) Date: Sun, 31 Jan 2016 07:11:06 -0800 Subject: [Python-Dev] Fun with ancient unsupported platforms In-Reply-To: <56AA2C50.7050700@hastings.org> References: <56AA2C50.7050700@hastings.org> Message-ID: <56AE240A.1040807@hastings.org> On 01/28/2016 06:57 AM, Larry Hastings wrote: > It's currently 2016. Perhaps it's time to remove all vestiges of > these unsupported operating systems nobody's cared about since a year > that started with a '1'? We dropped support for Irix in 2.3. We dropped support for Irix threads in 3.2. All our supported platforms have Thread Local Storage (TLS) support. 
Maybe we can drop our 250-line portable TLS library from Python/thread.c? //arry/ p.s. Derpy code in Python/thread_nt.h. It literally looks like this: /* use native Windows TLS functions */ #define Py_HAVE_NATIVE_TLS #ifdef Py_HAVE_NATIVE_TLS It seems this developer had the short-term memory of a goldfish. -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoine at python.org Sun Jan 31 11:56:21 2016 From: antoine at python.org (Antoine Pitrou) Date: Sun, 31 Jan 2016 16:56:21 +0000 (UTC) Subject: [Python-Dev] More optimisation ideas References: <56AB9BCE.2080000@python.org> Message-ID: Hi, If you want to make startup time faster for a broad range of applications, please consider adding a lazy import facility in the stdlib. I recently tried to write a lazy import mechanism using import hooks (to make it portable from 2.6 to 3.5), it seems nearly impossible to do so (or, at least, for an average Python programmer like me). This would be much more useful (for actual users, not for architecture astronauts) than refactoring the importlib APIs in each feature version... Thanks in advance Antoine. From brett at python.org Sun Jan 31 12:02:30 2016 From: brett at python.org (Brett Cannon) Date: Sun, 31 Jan 2016 17:02:30 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: A lazy importer was added in Python 3.5 and it was not possible without the module spec refactoring. On Sun, 31 Jan 2016, 08:57 Antoine Pitrou wrote: > > Hi, > > If you want to make startup time faster for a broad range of applications, > please consider adding a lazy import facility in the stdlib. > I recently tried to write a lazy import mechanism using import hooks > (to make it portable from 2.6 to 3.5), it seems nearly impossible to do > so (or, at least, for an average Python programmer like me). 
> > This would be much more useful (for actual users, not for architecture > astronauts) than refactoring the importlib APIs in each feature version... > > Thanks in advance > > Antoine. > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoine at python.org Sun Jan 31 12:09:49 2016 From: antoine at python.org (Antoine Pitrou) Date: Sun, 31 Jan 2016 17:09:49 +0000 (UTC) Subject: [Python-Dev] More optimisation ideas References: <56AB9BCE.2080000@python.org> Message-ID: Brett Cannon python.org> writes: > > > A lazy importer was added in Python 3.5 and it was not possible > without the module spec refactoring. Wow... Thank you, I didn't know about that. Now for the next question: how am I supposed to use it? The following documentation leaves me absolutely clueless: """This class only works with loaders that define exec_module() as control over what module type is used for the module is required. For those same reasons, the loader's create_module() method will be ignored (i.e., the loader's method should only return None). Finally, modules which substitute the object placed into sys.modules will not work as there is no way to properly replace the module references throughout the interpreter safely; ValueError is raised if such a substitution is detected.""" (reference: https://docs.python.org/3/library/importlib.html#importlib.util.LazyLoader) I want to import lazily the modules from package "foobar.*", but not other modules as other libraries may depend on import side effects. How do I do that? The quoted snippet doesn't really help. Regards Antoine. 
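A minimal sketch of per-module LazyLoader usage looks like the following. The helper name `lazy_import` is invented for illustration; note this only defers one explicitly named module, and filtering a whole package like "foobar.*" as Antoine asks would additionally require a custom finder on sys.path_hooks or sys.meta_path.

```python
import importlib.util
import sys

def lazy_import(name):
    # Resolve the module's spec eagerly, but wrap its loader so that
    # executing the module body is deferred.
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    # With LazyLoader, exec_module() does NOT run the module's code yet;
    # the first attribute access on `module` triggers the real execution.
    loader.exec_module(module)
    return module

json = lazy_import("json")   # no json code has executed at this point
print(json.dumps([1, 2]))    # attribute access forces the actual import
```

This relies only on importlib.util APIs that exist in 3.5 (find_spec, LazyLoader, module_from_spec), and on the wrapped loader defining exec_module(), which the caveat in the quoted documentation is about.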
From donald at stufft.io Sun Jan 31 12:11:23 2016 From: donald at stufft.io (Donald Stufft) Date: Sun, 31 Jan 2016 12:11:23 -0500 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: > On Jan 31, 2016, at 12:02 PM, Brett Cannon wrote: > > A lazy importer was added in Python 3.5 Is there any docs on how to actually use the LazyLoader in 3.5? I can't seem to find any but I don't really know the import system that well. ----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 842 bytes Desc: Message signed with OpenPGP using GPGMail URL: From brett at python.org Sun Jan 31 12:26:19 2016 From: brett at python.org (Brett Cannon) Date: Sun, 31 Jan 2016 17:26:19 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: There are no example docs for it yet, but enough people have asked this week about how to set up a custom importer that I will write up a generic example case which will make sense for a lazy loader (need to file the issue before I forget). On Sun, 31 Jan 2016, 09:11 Donald Stufft wrote: > > On Jan 31, 2016, at 12:02 PM, Brett Cannon wrote: > > A lazy importer was added in Python 3.5 > > > Is there any docs on how to actually use the LazyLoader in 3.5? I can't > seem to find any but I don't really know the import system that well. > > ----------------- > Donald Stufft > PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 > DCFA > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From brett at python.org Sun Jan 31 12:57:12 2016 From: brett at python.org (Brett Cannon) Date: Sun, 31 Jan 2016 17:57:12 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: I have opened http://bugs.python.org/issue26252 to track writing the example (and before ppl go playing with the lazy loader, be aware of http://bugs.python.org/issue26186). On Sun, 31 Jan 2016 at 09:26 Brett Cannon wrote: > There are no example docs for it yet, but enough people have asked this > week about how to set up a custom importer that I will write up a generic > example case which will make sense for a lazy loader (need to file the > issue before I forget). > > On Sun, 31 Jan 2016, 09:11 Donald Stufft wrote: > >> >> On Jan 31, 2016, at 12:02 PM, Brett Cannon wrote: >> >> A lazy importer was added in Python 3.5 >> >> >> Is there any docs on how to actually use the LazyLoader in 3.5? I can't >> seem to find any but I don't really know the import system that well. >> >> ----------------- >> Donald Stufft >> PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 >> DCFA >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mal at egenix.com Sun Jan 31 13:43:20 2016 From: mal at egenix.com (M.-A. Lemburg) Date: Sun, 31 Jan 2016 19:43:20 +0100 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> Message-ID: <56AE55C8.8000807@egenix.com> On 30.01.2016 20:15, Steve Dower wrote: > Brett tried freezing the entire stdlib at one point (as we do for parts of importlib) and reported no significant improvement. Since that rules out code compilation as well as the OS calls, it'd seem the priority is to execute less code on startup. > > Details of that work were posted to python-dev about twelve months ago, IIRC. Maybe a little longer. 
Freezing the entire stdlib does improve the startup time, simply because it removes stat calls, which dominate the startup time at least on Unix. It also allows sharing the stdlib byte code in memory, since it gets stored in static C structs which the OS will happily mmap into multiple processes for you without any additional effort. Our eGenix PyRun does exactly that. Even though the original motivation is a different one, the gained improvement in startup time is a nice side effect: http://www.egenix.com/products/python/PyRun/ Aside: The encodings don't really make much difference here. The dictionaries aren't all that big, so generating them on the fly doesn't really create much overhead. The trade off in terms of maintainability/speed definitely leans toward maintainability. For the larger encoding tables we already have C implementations with appropriate data structures to make lookup speed vs. storage needs efficient. -- Marc-Andre Lemburg eGenix.com Professional Python Services directly from the Experts (#1, Jan 31 2016) >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ >>> Python Database Interfaces ... http://products.egenix.com/ >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ ________________________________________________________________________ ::: We implement business ideas - efficiently in both time and costs ::: eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 D-40764 Langenfeld, Germany. CEO Dipl.-Math. 
Marc-Andre Lemburg Registered at Amtsgericht Duesseldorf: HRB 46611 http://www.egenix.com/company/contact/ http://www.malemburg.com/ > Top-posted from my Windows Phone > > -----Original Message----- > From: "Serhiy Storchaka" > Sent: 1/30/2016 10:22 > To: "python-dev at python.org" > Subject: Re: [Python-Dev] More optimisation ideas > > On 30.01.16 18:31, Steve Dower wrote: >> On 30Jan2016 0645, Serhiy Storchaka wrote: >>> $ ./python -m timeit -s "import codecs; from encodings.cp437 import >>> decoding_table" -- "codecs.charmap_build(decoding_table)" >>> 100000 loops, best of 3: 4.36 usec per loop >>> >>> Getting rid from charmap_build() would save you at most 4.4 microseconds >>> per encoding. 0.0005 seconds if you have imported *all* standard >>> encodings! >> >> Just as happy to be proven wrong. Perhaps I misinterpreted my original >> profiling and then, embarrassingly, ran with the result for a long time >> without retesting. > > AFAIK the most time is spent in system calls like stat or open. > Archiving the stdlib into the ZIP file and using zipimport can decrease > Python startup time (perhaps there is an open issue about this). 
> > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > From brett at python.org Sun Jan 31 15:23:00 2016 From: brett at python.org (Brett Cannon) Date: Sun, 31 Jan 2016 20:23:00 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: <56AE55C8.8000807@egenix.com> References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> <56AE55C8.8000807@egenix.com> Message-ID: So freezing the stdlib helps on UNIX and not on OS X (if my old testing is still accurate). I guess the next question is what it does on Windows and if we would want to ever consider freezing the stdlib as part of the build process (and if we would want to change the order of importers on sys.meta_path so frozen modules came after file-based ones). On Sun, 31 Jan 2016, 10:43 M.-A. Lemburg wrote: > On 30.01.2016 20:15, Steve Dower wrote: > > Brett tried freezing the entire stdlib at one point (as we do for parts > of importlib) and reported no significant improvement. Since that rules out > code compilation as well as the OS calls, it'd seem the priority is to > execute less code on startup. > > > > Details of that work were posted to python-dev about twelve months ago, > IIRC. Maybe a little longer. > > Freezing the entire stdlib does improve the startup time, > simply because it removes stat calls, which dominate the startup > time at least on Unix. > > It also allows sharing the stdlib byte code in memory, since it gets > stored in static C structs which the OS will happily mmap into > multiple processes for you without any additional effort. 
> > Our eGenix PyRun does exactly that. Even though the original > motivation is a different one, the gained improvement in > startup time is a nice side effect: > > http://www.egenix.com/products/python/PyRun/ > > Aside: The encodings don't really make much difference here. The > dictionaries aren't all that big, so generating them on the fly doesn't > really create much overhead. The trade off in terms of > maintainability/speed > definitely leans toward maintainability. For the larger encoding > tables we already have C implementations with appropriate data > structures to make lookup speed vs. storage needs efficient. > > -- > Marc-Andre Lemburg > eGenix.com > > Professional Python Services directly from the Experts (#1, Jan 31 2016) > >>> Python Projects, Coaching and Consulting ... http://www.egenix.com/ > >>> Python Database Interfaces ... http://products.egenix.com/ > >>> Plone/Zope Database Interfaces ... http://zope.egenix.com/ > ________________________________________________________________________ > > ::: We implement business ideas - efficiently in both time and costs ::: > > eGenix.com Software, Skills and Services GmbH Pastor-Loeh-Str.48 > D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg > Registered at Amtsgericht Duesseldorf: HRB 46611 > http://www.egenix.com/company/contact/ > http://www.malemburg.com/ > > > > Top-posted from my Windows Phone > > > > -----Original Message----- > > From: "Serhiy Storchaka" > > Sent: 1/30/2016 10:22 > > To: "python-dev at python.org" > > Subject: Re: [Python-Dev] More optimisation ideas > > > > On 30.01.16 18:31, Steve Dower wrote: > >> On 30Jan2016 0645, Serhiy Storchaka wrote: > >>> $ ./python -m timeit -s "import codecs; from encodings.cp437 import > >>> decoding_table" -- "codecs.charmap_build(decoding_table)" > >>> 100000 loops, best of 3: 4.36 usec per loop > >>> > >>> Getting rid from charmap_build() would save you at most 4.4 > microseconds > >>> per encoding. 
0.0005 seconds if you have imported *all* standard > >>> encodings! > >> > >> Just as happy to be proven wrong. Perhaps I misinterpreted my original > >> profiling and then, embarrassingly, ran with the result for a long time > >> without retesting. > > > > AFAIK the most time is spent in system calls like stat or open. > > Archiving the stdlib into the ZIP file and using zipimport can decrease > > Python startup time (perhaps there is an open issue about this). > > > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/steve.dower%40python.org > > > > > > > > _______________________________________________ > > Python-Dev mailing list > > Python-Dev at python.org > > https://mail.python.org/mailman/listinfo/python-dev > > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/mal%40egenix.com > > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... URL: From aarmour at cipmail.org Sun Jan 31 15:16:27 2016 From: aarmour at cipmail.org (ty armour) Date: Sun, 31 Jan 2016 15:16:27 -0500 Subject: [Python-Dev] just a friendly developer Message-ID: I am looking for tutorials on basically how to write the python language. As well I want to know how to write wrappers for things like alsa and directx and coreaudio and jackd and pulseaudio. I would be looking to write wrappers for coreaudio in mac and coreaudio for windows as well as alsa. i think it would benefit you to have this information. 
I am looking to aid in development of computers and languages and tutorials on how to write the actual python language would be super useful. but yeah im looking to write applications like guitarix rakarrack blender ardour lmms. I am also going to build my own computers so these tutorials would help a lot. anyhow thanks for your time -------------- next part -------------- An HTML attachment was scrubbed... URL: From ethan at stoneleaf.us Sun Jan 31 17:43:22 2016 From: ethan at stoneleaf.us (Ethan Furman) Date: Sun, 31 Jan 2016 14:43:22 -0800 Subject: [Python-Dev] just a friendly developer In-Reply-To: References: Message-ID: <56AE8E0A.2010906@stoneleaf.us> On 01/31/2016 12:16 PM, ty armour wrote: > I am looking for tutorials on basically how to write the python > language. Try asking on the python-list [1] mailing list, as that is for general discussion of Python. This list is for developing Python itself. -- ~Ethan~ [1] https://mail.python.org/mailman/listinfo/python-list From tjreedy at udel.edu Sun Jan 31 18:35:44 2016 From: tjreedy at udel.edu (Terry Reedy) Date: Sun, 31 Jan 2016 18:35:44 -0500 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: On 1/31/2016 12:09 PM, Antoine Pitrou wrote: > The following documentation leaves me absolutely clueless: > > """This class only works with loaders that define exec_module() as control > over what module type is used for the module is required. No wonder. I cannot parse it as an English sentence. It needs rewriting. > For those same > reasons, the loader's create_module() method will be ignored (i.e., the > loader's method should only return None). 
Finally, modules which substitute > the object placed into sys.modules will not work as there is no way to > properly replace the module references throughout the interpreter safely; > ValueError is raised if such a substitution is detected.""" > > (reference: > https://docs.python.org/3/library/importlib.html#importlib.util.LazyLoader) -- Terry Jan Reedy From brett at python.org Sun Jan 31 20:09:27 2016 From: brett at python.org (Brett Cannon) Date: Mon, 01 Feb 2016 01:09:27 +0000 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> Message-ID: On Sun, 31 Jan 2016, 15:36 Terry Reedy wrote: > On 1/31/2016 12:09 PM, Antoine Pitrou wrote: > > > The following documentation leaves me absolutely clueless: > > > > """This class only works with loaders that define exec_module() as > control > > over what module type is used for the module is required. > > No wonder. I cannot parse it as an English sentence. It needs rewriting. > Feel free to open an issue to clarify the wording. -Brett > > For those same > > reasons, the loader's create_module() method will be ignored (i.e., the > > loader's method should only return None). Finally, modules which > substitute > > the object placed into sys.modules will not work as there is no way to > > properly replace the module references throughout the interpreter safely; > > ValueError is raised if such a substitution is detected.""" > > > > (reference: > > > https://docs.python.org/3/library/importlib.html#importlib.util.LazyLoader > ) > > -- > Terry Jan Reedy > > > _______________________________________________ > Python-Dev mailing list > Python-Dev at python.org > https://mail.python.org/mailman/listinfo/python-dev > Unsubscribe: > https://mail.python.org/mailman/options/python-dev/brett%40python.org > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ben+python at benfinney.id.au Sun Jan 31 21:27:35 2016 From: ben+python at benfinney.id.au (Ben Finney) Date: Mon, 01 Feb 2016 13:27:35 +1100 Subject: [Python-Dev] just a friendly developer References: Message-ID: <857fipdv1k.fsf@benfinney.id.au> ty armour writes: > I am looking for tutorials on basically how to write the python > language. Newcomers to Python are especially invited to the Python "tutor" forum , for collaborative tutoring in a friendly environment. -- \ "The internet's completely over." Anyway, all these computers | `\ and digital gadgets are no good. They just fill your head with | _o__) numbers and that can't be good for you." --Prince, 2010-07-05 | Ben Finney From steve at pearwood.info Sun Jan 31 22:12:27 2016 From: steve at pearwood.info (Steven D'Aprano) Date: Mon, 1 Feb 2016 14:12:27 +1100 Subject: [Python-Dev] More optimisation ideas In-Reply-To: References: <56AB9BCE.2080000@python.org> <56ACE564.7080107@python.org> <56AE55C8.8000807@egenix.com> Message-ID: <20160201031226.GF31806@ando.pearwood.info> On Sun, Jan 31, 2016 at 08:23:00PM +0000, Brett Cannon wrote: > So freezing the stdlib helps on UNIX and not on OS X (if my old testing is > still accurate). I guess the next question is what it does on Windows and > if we would want to ever consider freezing the stdlib as part of the build > process (and if we would want to change the order of importers on > sys.meta_path so frozen modules came after file-based ones). I find that being able to easily open stdlib .py files in a text editor to read the source is extremely valuable. I've learned much more from reading the source than from (e.g.) StackOverflow. Likewise, it's often handy to do a grep over the stdlib. When you talk about freezing the stdlib, what exactly does that mean? - will the source files still be there? - how will this affect people writing patches for bugs? -- Steve
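The ease of access Steven describes can also be exercised programmatically: any pure-Python stdlib module points back at its own source file, which is exactly what a frozen or zipped stdlib would need to keep shipping for this workflow to survive. A small sketch (using statistics as an arbitrary pure-Python stdlib module):

```python
import inspect
import statistics

# Pure-Python stdlib modules record where their source lives.
path = inspect.getsourcefile(statistics)
print(path)

# The full source text is one call away, so reading or grepping the
# stdlib works as long as the .py files are actually on disk.
source = inspect.getsource(statistics)
print(len(source.splitlines()))
```

If the .py files stopped shipping alongside frozen bytecode, both calls above would fail, which is one concrete way to frame the "will the source files still be there?" question.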